
New damning footage shows several Tesla vehicles on Autopilot crashing into police

A new report has revealed damning footage showing several Tesla vehicles on Autopilot crashing into police vehicles on the highway.

Tesla has been under federal investigation over whether Autopilot has a problem with crashes into emergency vehicles stopped on the side of the road.

The US National Highway Traffic Safety Administration (NHTSA) first opened an investigation into Tesla Autopilot in 2021 over its possible involvement in 11 crashes with emergency and first-responder vehicles.

It has since ramped up the investigation to include 16 crashes.

Shortly after the announcement, Tesla updated its owner’s manual to note that Autopilot now “detects and slows down for emergency vehicles’ lights at night,” but additional similar crashes have happened since then.

Now the investigation is back in the news because the Wall Street Journal has obtained and released in-car footage from some of these crashes.

Here’s the video the Journal released featuring the new footage. (Warning: some of the content is graphic.)

Electrek’s Take

First off, let’s be clear: what this video shows is user error, plain and simple. For the love of god, in the main crash featured in the video, the Journal even notes that the driver was intoxicated. We can also see that there was more than enough time to react if the driver had been paying attention.


If the drivers had used Autopilot as intended, those accidents wouldn’t have happened, so it’s hard to blame Tesla for any of this.

However, it is still a bad look for Tesla because these crashes happened while Elon Musk was claiming that Tesla would achieve full self-driving capability any day now.

I still think that Autopilot and Full Self-Driving package users should use the features as directed in the owner’s manual, which means paying attention at all times and being ready to take control. But at the same time, it doesn’t take a big stretch of the imagination to understand why people would expect Autopilot to detect those vehicles.

Autopilot should be able to detect and avoid these crashes.




