Tesla Autopilot crash caught on dashcam shows how not to use the system

Earlier this week, a Tesla Model S hit a barrier on the highway near Dallas, Texas. The driver, who fortunately wasn’t injured, first blamed Tesla’s Autopilot for the crash.

We now have footage of the accident, and it actually shows a situation that Autopilot probably shouldn't be expected to handle, at least not yet. Ultimately, it serves as a reminder not to trust the system without paying attention.

Following our articles on a series of accidents last year in which Autopilot was activated during or right before the crash, some readers were confused about whether the driver or Autopilot should be considered at fault.

Since, in its current form, Tesla's Autopilot is only a "driver assist" system and drivers are asked to keep their hands on the steering wheel, the responsibility falls on the driver. Of course, that's unless Autopilot malfunctions and steers out of the lane and into the side of the road on its own, which is almost what we were led to believe happened in this latest accident. But as far as we know, that has never happened.

The driver described the accident in a Reddit post on Monday:

“I was driving in the left lane of a two lane highway. The car is AP1 (first generation Autopilot) and I’ve never had any problems until today. Autopilot was on and didn’t give me a warning. It misread the road and hit the barrier. After the airbags deployed there was a bunch of smoke and my car rolled to a grinding stop. Thankfully no one was hurt and I walked away with only bruises.”

He attached pictures of the aftermath:

Fast forward three days: another Redditor on the Tesla Motors subreddit found footage of the accident, taken from the dashcam of a vehicle that was following the Tesla at the time.

The footage shows that the Tesla needed to merge or change lanes in order to avoid the barrier – something Autopilot should never be left to do without the driver intervening.

What is also clear from the footage is that the design of the road here is quite awful, since even the driver of the vehicle with the dashcam almost hit the barrier, and there presumably wasn't any driver assist at play in that case.

As for Autopilot's Autosteer feature, it did its job, which is to keep the vehicle in its lane – and the lane was still marked on the road leading right into the barrier.

What potentially didn't work is the 'Forward Collision Warning' feature, since the driver claims there was no warning. Some would assume that Automatic Emergency Braking (AEB) should have kicked in, but it's actually not designed to engage if an alternative exists. In this case, the vehicle wasn't supposed to brake in order to avoid the barrier – braking could have been even more dangerous, considering a vehicle was close behind and there was traffic to the right.

Tesla explains what the feature does:

“AEB does not engage when an alternative collision avoidance strategy (e.g., driver steering) remains viable. Instead, when a collision threat is detected, forward collision warning alerts the driver to encourage them to take appropriate evasive action.”
Of course, collision warning is no substitute for paying attention when driving.
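To make the distinction between the two features concrete, here is a minimal sketch in Python of the decision rule as Tesla describes it. This is purely illustrative – not Tesla's actual code – and the inputs (`collision_threat`, `steering_escape_viable`) are invented for clarity: the warning fires as soon as a threat is detected, while AEB engages only when no steering escape remains viable.

```python
# Illustrative sketch of the FCW/AEB decision rule Tesla describes.
# NOT Tesla's code; the threat model and field names are assumptions
# made for clarity.

from dataclasses import dataclass

@dataclass
class ThreatAssessment:
    collision_threat: bool        # obstacle detected on the current path
    steering_escape_viable: bool  # an adjacent lane or shoulder is clear

def driver_alerts(threat: ThreatAssessment) -> list[str]:
    """Return the actions the system would take for a given threat."""
    actions = []
    if threat.collision_threat:
        # Forward Collision Warning fires as soon as a threat is
        # detected, prompting the driver to take evasive action.
        actions.append("forward_collision_warning")
        # AEB engages only when no alternative avoidance strategy
        # (e.g., driver steering) remains viable.
        if not threat.steering_escape_viable:
            actions.append("automatic_emergency_braking")
    return actions

# In the Dallas scenario: a barrier ahead, but traffic to the right,
# so the system may have judged that a steering escape still existed.
# Only a warning (which the driver says never came) would be expected.
print(driver_alerts(ThreatAssessment(collision_threat=True,
                                     steering_escape_viable=True)))
# -> ['forward_collision_warning']
```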
Basically, three things went wrong here, and if any one of them hadn't, this collision wouldn't have happened.
  1. The road construction was poorly implemented. Looking at the footage, you can see that the construction comes up relatively quickly and the lane markers lead right into the wall. Usually there are cones and signs leading up to the cutover. That's confusing to humans as well as vehicles.
  2. The driver should have been awake at the wheel and looking forward. Had he been alert, he could have easily taken over in time to steer the car to the right. Clearly he wasn't alert in this instance, nor did he have his hands on the wheel per Tesla's instructions.
  3. The vehicle theoretically should have detected the upcoming wall and either stopped or alerted the driver in some way. We don't know if it started beeping inside the vehicle ahead of the collision, since the driver's story is already suspect. In traffic, the car could have decided it was less risky to sideswipe the wall than to cut over into what it perceived as another lane of traffic. More importantly, Tesla doesn't advertise that its cars should be considered Level 4 or 5 autonomous, but to get there, it will need to make them smart enough to handle this type of situation.
Until Tesla's Autopilot reaches Level 4 or 5 autonomy, which shouldn't happen until at least the end of the year, the driver remains responsible for monitoring the vehicle. That means keep your eyes on the road, and be safe out there.
