The National Highway Traffic Safety Administration (NHTSA) has raised concerns over Tesla’s social media posts suggesting its Full Self-Driving (FSD) software may operate autonomously as a “robotaxi” without driver attention, warning that such messaging contradicts the system’s design as a driver-assist feature.
In October, NHTSA opened an investigation into 2.4 million Tesla vehicles equipped with FSD after four reported crashes, including a fatal 2023 collision, that occurred in reduced-visibility conditions such as sun glare and fog.
In a May email, NHTSA advised Tesla to revisit its messaging on social media platforms such as X, citing posts that described FSD driving a user to a hospital during an emergency and taking another driver home from a sporting event.
The agency stated, “We believe that Tesla’s postings conflict with its stated messaging that the driver is to maintain continued control.” Tesla responded that its owner’s manual and other resources stress the need for driver vigilance.
NHTSA’s ongoing probe asks Tesla to respond by Dec. 18 on whether FSD adequately alerts drivers when the system reaches its operational limits.
This follows a December 2023 recall of more than 2 million U.S. vehicles to strengthen driver-attention safeguards in Tesla’s Autopilot system, a fix the agency is still evaluating.
Source: Reuters