NHTSA has escalated its investigation into Tesla’s “Full Self-Driving” system’s inability to handle reduced visibility conditions, upgrading the probe to an Engineering Analysis covering an estimated 3,203,754 vehicles — the step that typically precedes a recall.
The agency found that FSD’s degradation detection system fails to warn drivers when cameras are blinded by common road conditions like sun glare and fog, and that Tesla may be under-reporting related crashes.
Third concurrent FSD investigation
The new Engineering Analysis (EA26002), opened yesterday, upgrades the preliminary evaluation (PE24031) that NHTSA launched in October 2024 after identifying four crashes in reduced visibility conditions, including one in which the vehicle fatally struck a pedestrian.
The scope has since expanded to nine incidents in total, including one fatality and one injury, and NHTSA is now examining six additional potentially related incidents on top of those.
This makes it the third concurrent federal investigation into FSD. NHTSA is already running a separate probe (PE25012) into 58 incidents involving traffic violations like running red lights and crossing into opposing lanes, plus a separate inquiry into Tesla’s crash reporting practices.
The upgrade from a Preliminary Evaluation to an Engineering Analysis is significant. NHTSA typically completes an EA within 18 months, and this phase involves deeper technical testing, additional information requests, and peer manufacturer comparisons. Historically, an EA is the final investigative step before the agency either closes a case or pushes for a recall.
The core problem: FSD can’t tell when it’s blind
The central finding is damning for Tesla’s camera-only approach to autonomous driving. According to NHTSA, FSD’s degradation detection system — the software designed to recognize when cameras can’t see properly and alert the driver — fails under common roadway conditions.
NHTSA states that in the crashes it reviewed, the system “did not detect common roadway conditions that impaired camera visibility and/or provide alerts when camera performance had deteriorated until immediately before the crash occurred.”
Worse, the vehicles either lost track of or completely missed other cars directly ahead of them before impact. The system essentially went blind and didn’t know it — or told the driver too late to do anything about it.
The conditions involved are not exotic edge cases. We’re talking about sun glare, fog, and airborne dust — things any driver encounters regularly. Tesla’s system relies exclusively on cameras after the company removed radar in mid-2021 against the advice of its own engineers who warned that cameras alone would be vulnerable to environmental interference.
Tesla only started fixing the problem after reporting a fatal crash
One of the most revealing details in the NHTSA document is the timeline of Tesla’s response. A fatal crash involving FSD and reduced visibility occurred on November 28, 2023. Tesla submitted the required Standing General Order (SGO) report for that crash on June 27, 2024 — nearly seven months later.
The very next day, June 28, 2024, Tesla began developing an update to the degradation detection system. NHTSA notes that it still does not know when that update was actually deployed or which vehicles have received it.
When NHTSA discussed the incidents with Tesla during the preliminary evaluation phase, Tesla’s own analysis conceded that its updated degradation detection system, had it been installed at the time, “may have affected” only three of the nine identified crashes. In other words, even Tesla acknowledges its fix wouldn’t have helped in the majority of incidents.
Under-reporting concerns
NHTSA also flagged a concerning data gap. Tesla told the agency that internal “data and labeling limitations” prevented it from uniformly identifying and analyzing crashes that occurred while the degradation detection system was engaged.
NHTSA believes this limitation “could have led to under-reporting of subject crashes over portions of the defined time-period.” This echoes the separate investigation into Tesla’s crash reporting practices and the ongoing struggle to get Tesla to turn over FSD traffic violation data, where the company has received multiple deadline extensions.
The pattern is consistent: NHTSA keeps finding that Tesla either can’t or won’t provide clear data about FSD-related crashes.
Electrek’s Take
This investigation escalation is the most significant regulatory threat to Tesla’s “Full Self-Driving” deployment we’ve seen, and it cuts to the heart of a problem we’ve been flagging for years: a camera-only system has inherent vulnerability to visibility degradation, and Tesla’s software doesn’t adequately compensate for it.
What makes this particularly concerning is a failure mode that doesn’t get enough attention — camera fogging inside the housing. Tesla’s cameras can develop condensation between the lens and the outer cover, particularly in cold weather or humid conditions, and the system sometimes doesn’t detect it. When that happens, FSD continues operating with impaired vision and the driver has no idea unless they are looking directly at the camera feeds. That’s exactly the type of “degraded state” NHTSA is investigating, and it’s a hardware design issue that no software update can fully solve.
The broader picture is even worse for Tesla. We now have three concurrent NHTSA investigations into FSD, covering visibility failures, traffic violations, and crash reporting gaps. Tesla is simultaneously fighting to hand over data in one probe while this new one escalates. And all of this unfolds while Tesla continues to expand FSD’s availability and CEO Elon Musk continues to promise unsupervised “Full Self-Driving” is imminent.
The gap between Tesla’s autonomous driving claims and the regulatory reality has never been wider. An Engineering Analysis covering 3.2 million vehicles, with a fatal crash in the record and evidence of systemic visibility detection failures, is exactly the kind of probe that ends in a recall. Tesla needs to take this seriously — and so do the drivers relying on a system that can’t tell when it’s blind.