All Tesla vehicles with Full Self-Driving (Supervised) in the US are now under NHTSA investigation after a fatal crash raised doubts about the system’s handling of low-visibility situations.
Automakers must report crashes involving their advanced driver-assistance systems (ADAS), such as Tesla’s Autopilot and FSD, when they become aware of them. These reports are filed under NHTSA’s Standing General Order (SGO).
When reviewing Tesla’s SGO reports, NHTSA identified what it believes is a concerning pattern: several crashes, including a fatal one, occurred in reduced-visibility conditions.
NHTSA wrote in its report:
The Office of Defects Investigation (ODI) has identified four Standing General Order (SGO) reports in which a Tesla vehicle experienced a crash after entering an area of reduced roadway visibility conditions with FSD-Beta or FSD-Supervised (collectively, FSD) engaged. In these crashes, the reduced roadway visibility arose from conditions such as sun glare, fog, or airborne dust. In one of the crashes, the Tesla vehicle fatally struck a pedestrian. One additional crash in these conditions involved a reported injury.
This prompted the agency’s Office of Defects Investigation (ODI) to open a Preliminary Evaluation of Tesla’s FSD, which covers all Tesla vehicles built since 2016.
Here’s what the investigation is trying to assess:
- The ability of FSD’s engineering controls to detect and respond appropriately to reduced roadway visibility conditions;
- Whether any other similar FSD crashes have occurred in reduced roadway visibility conditions and, if so, the contributing circumstances for those crashes; and
- Any updates or modifications from Tesla to the FSD system that may affect the performance of FSD in reduced roadway visibility conditions. In particular, this review will assess the timing, purpose, and capabilities of any such updates, as well as Tesla’s assessment of their safety impact.
An ODI Preliminary Evaluation is one of the first steps toward a recall, but Tesla has been through this process several times, and more often than not, the automaker has been able to resolve the issue with a simple over-the-air software update rather than a more significant recall.
Electrek’s Take
The low-visibility issue is certainly not new. Even simple sun glare can sometimes completely debilitate Tesla’s FSD. Fog is also an issue I’ve experienced several times, though most of the time FSD does display a bad-weather alert warning of degraded performance.
Generally, on those occasions, I assess the weather, and if it’s something that I feel as a human driver is easy to handle, I’ll give FSD a chance, but if it’s not, I don’t take any risk.
But it’s fairly clear that with the current hardware, both HW3 and HW4, FSD is not equipped to handle many weather conditions, which makes Level 5 autonomy impossible despite Elon Musk’s claims to the contrary.
Level 4 autonomy, which allows for exceptions under certain road conditions, is the ceiling, and even that is in doubt.