The National Highway Traffic Safety Administration (NHTSA) has taken aim at Tesla, raising concerns about the company’s promotion of its Full Self-Driving (FSD) driver assistance system on social media platform X (formerly Twitter). In a letter to the electric vehicle giant dated May 14, the agency pointed out that Tesla’s X account has been sharing or endorsing posts that portray FSD as more capable than it actually is. This, according to the NHTSA, conflicts with Tesla’s own statements that FSD requires active driver supervision at all times.
The regulator highlighted that Tesla’s social media posts, which often depict drivers engaging in activities other than driving while FSD is active, could mislead consumers into believing that the system is capable of fully autonomous driving. “We believe that Tesla’s postings conflict with its stated messaging that the driver is to maintain continued control over the dynamic driving task. We similarly observe that these postings may encourage viewers to see FSD-Supervised as a Chauffeur or ‘Robotaxi’ rather than a partial automation / driver assist system that requires persistent attention and intermittent intervention by the driver,” the NHTSA stated in the letter, which was made public on Friday.
The NHTSA’s letter also emphasizes the outsized influence social media has on public perception compared with vehicle owner’s manuals. The regulator’s concerns are not limited to social media messaging. In October 2024, the NHTSA launched a comprehensive investigation into nearly 2.4 million Tesla vehicles equipped with FSD, following reports of several crashes involving the technology. The agency has identified four incidents in which Tesla vehicles crashed while FSD was engaged in areas of reduced visibility, such as conditions with sun glare, fog, or airborne dust. One of these incidents resulted in the death of a pedestrian.
The NHTSA has now given Tesla until December 18 to respond to its questions concerning the crashes and provide detailed information about any other incidents the company is aware of. This isn’t the first time the NHTSA has scrutinized Tesla’s driver assistance systems. In December 2023, Tesla issued a recall for over 2 million vehicles in the US equipped with Autopilot, a suite of driver assistance features that includes Autosteer. The recall was initiated after the NHTSA deemed the feature’s controls inadequate to prevent misuse. While Tesla pledged to enhance its safeguards, the NHTSA launched a separate investigation in April 2024 to assess the effectiveness of the company’s fix.
The NHTSA’s scrutiny of Tesla’s FSD technology and its social media communications underscores the growing concerns surrounding the development and deployment of autonomous driving systems. As the technology continues to evolve, regulatory bodies are increasingly focused on ensuring that drivers are adequately informed about the capabilities and limitations of these systems and that they are used safely and responsibly.