A new report has revealed damning footage showing several Tesla vehicles on Autopilot crashing into police vehicles on the highway.
Tesla has been under federal investigation over Autopilot's potential role in crashes into emergency vehicles stopped on the side of the road.
The US National Highway Traffic Safety Administration (NHTSA) first opened an investigation into Tesla Autopilot over its possible involvement in 11 crashes involving emergency and first responder vehicles back in 2021.
It has since ramped up the investigation to include 16 crashes.
Shortly after the announcement, Tesla updated its owner’s manual to note that Autopilot now “detects and slows down for emergency vehicles’ lights at night,” but more such crashes have happened since.
Now the investigation is back in the news because the Wall Street Journal has obtained and released in-car footage from some of these crashes.
Here’s the video it released. (Warning: some of the content is graphic.)
Electrek’s Take
First off, let’s be clear: what this video shows is user error, plain and simple. For the love of god, in the main video they feature, the Journal even says the driver was intoxicated. We can also see that there was plenty of time to react if the driver had been paying attention.
If the drivers had used Autopilot as intended, those accidents wouldn’t have happened, so it’s hard to blame Tesla for any of this.
However, it’s still a bad look for Tesla, because these crashes happened while Elon Musk was claiming that Tesla would achieve full self-driving capability any day now.
I still think Autopilot and Full Self-Driving package users should use the features as directed in the owner’s manual, which means paying attention at all times and being ready to take control. But at the same time, it doesn’t take a big stretch of the imagination to understand why people would expect Autopilot to detect those vehicles.
Autopilot should be able to detect and avoid these crashes.