
NHTSA probes Tesla’s FSD software amid safety concerns over misleading social media posts

As part of its ongoing investigation, the agency has set a December 18 deadline for Tesla to answer questions about the system’s performance.

The National Highway Traffic Safety Administration (NHTSA) is investigating Tesla’s Full Self-Driving (FSD) software over concerns raised by the automaker’s social media posts, which appear to suggest that FSD can operate as a robotaxi without requiring driver attention.

The investigation, opened in October, covers 2.4 million Tesla vehicles equipped with FSD and follows reports of four crashes, including a fatality in 2023, in which reduced-visibility conditions such as sun glare, fog, and airborne dust may have contributed.

NHTSA has raised concerns about Tesla’s promotional materials, including posts on the social media platform X, formerly known as Twitter, that appear to encourage drivers to use FSD without maintaining full attention on the road.

One post featured a driver using FSD to travel 13 miles to an emergency room while suffering a heart attack; another described a 50-minute FSD drive home from a sporting event. NHTSA argued that these posts conflict with Tesla’s stated messaging that FSD requires driver oversight and intervention.

In a letter sent to Tesla in May, NHTSA expressed its concerns about the posts and asked the automaker to reconsider how it communicates the capabilities of its driver-assistance systems. Tesla responded by reiterating that its owner’s manual and other materials clearly state that FSD is not autonomous and requires driver vigilance.

As part of its ongoing investigation, NHTSA has set a deadline of December 18 for Tesla to answer questions related to the system’s performance, especially in conditions with reduced roadway visibility. The agency is examining whether Tesla’s system provides adequate feedback to drivers, enabling them to make real-time decisions about when the system’s capabilities have been exceeded.

The investigation also examines a fatal incident in Rimrock, Arizona, in which a Tesla operating in FSD mode struck and killed a 71-year-old woman who had exited her vehicle following a rear-end collision. The Tesla’s driver, who reported struggling with sun glare at the time, has not faced charges in the incident.

In December 2023, under pressure from NHTSA, Tesla agreed to recall more than 2 million vehicles in the U.S. to install new safeguards in its Autopilot system. The agency is still evaluating whether those safeguards are sufficient to address the ongoing safety concerns associated with Tesla’s advanced driver-assistance systems.


Jaelyn Campbell
Jaelyn Campbell is a staff writer and reporter for CBT News. A recent honors graduate of Valdosta State University, she earned a BFA in Mass Media, graduating cum laude, and has more than four years of experience in corporate communications, editing, broadcasting, and writing. Her articles in The Spectator, her hometown newspaper, changed how people perceive virtual reality. She connects readers to the facts while giving them a voice to understand the challenges of being an entrepreneur in the digital world.
