The National Highway Traffic Safety Administration (NHTSA) is investigating Tesla’s Full Self-Driving (FSD) software and has raised concerns about the automaker’s social media posts, which appear to suggest that FSD can operate as a robotaxi without driver attention. The investigation, opened in October, covers 2.4 million Tesla vehicles equipped with FSD and follows reports of four crashes, including a 2023 fatality, in which reduced visibility from sun glare, fog, or airborne dust may have contributed to the incidents.
NHTSA has raised concerns about Tesla’s promotional materials, including posts on the social media platform X, formerly known as Twitter, that appear to encourage drivers to use FSD without maintaining full attention on the road.
One post featured a driver using FSD to travel 13 miles to an emergency room while suffering a heart attack, while another described a 50-minute drive home from a sporting event. NHTSA argued that these posts conflict with Tesla’s own messaging that FSD requires driver oversight and intervention.
In a letter sent to Tesla in May, NHTSA expressed its concerns about the posts and asked the automaker to reconsider how it communicates the capabilities of its driver-assistance systems. Tesla responded by reiterating that its owner’s manual and other materials clearly state that FSD is not autonomous and requires driver vigilance.
As part of its ongoing investigation, NHTSA has set a deadline of December 18 for Tesla to answer questions related to the system’s performance, especially in conditions with reduced roadway visibility. The agency is examining whether Tesla’s system provides adequate feedback to drivers, enabling them to make real-time decisions about when the system’s capabilities have been exceeded.
The investigation includes a fatal incident in Rimrock, Arizona, in which a Tesla operating in FSD mode struck and killed a 71-year-old woman who had exited her vehicle following a rear-end collision. The Tesla’s driver reported struggling with sun glare at the time and has not faced charges in the incident.
In December 2023, under pressure from NHTSA, Tesla agreed to recall more than 2 million vehicles in the U.S. to install new safeguards in its Autopilot system. The agency is still evaluating whether those safeguards are sufficient to address the ongoing safety concerns associated with Tesla’s advanced driver-assistance systems.