Today, federal safety investigators opened a new investigation into Tesla's electric vehicles. It is the 14th National Highway Traffic Safety Administration investigation into the automaker and one of several currently open. This time, it's the automaker's highly controversial "full self-driving" feature that's in the crosshairs—NHTSA says it now has four reports of Teslas using FSD and then crashing after the camera-only system encountered fog, sun glare, or airborne dust.
Of the four crashes that sparked this investigation, one killed a pedestrian, who was struck by a Model Y in Rimrock, Arizona, in November 2023.
NHTSA has a standing general order that requires it to be told if a car crashes while operating under partial or full automation. Fully automated or autonomous cars are the ones that might be termed "actually self-driving," such as the Waymos and Zooxes that clutter up the streets of San Francisco. Festooned with dozens of exterior sensors, these four-wheel testbeds drive around—mostly empty of passengers—gathering data to train themselves with later, with no human supervision. (This is also known as SAE level 4 automation.)
But the systems that come in cars that you or I could buy are far less sophisticated. Sometimes called "level 2+," these systems (which include Tesla Autopilot, Tesla FSD, GM's Super Cruise, BMW Highway Assistant, and Ford BlueCruise, among others) are partially automated, not autonomous. They will steer, accelerate, and brake for the driver, and they may even change lanes without explicit instruction, but the human behind the wheel is always meant to be in charge, even if the car is operating in a hands-free mode.