
Tesla Autopilot ‘functioned as designed’ but ‘played a role’ in 2016 fatal crash, says NTSB

As expected following a report yesterday, the U.S. National Transportation Safety Board (NTSB) is placing part of the blame on Tesla’s Autopilot system for the fatal May 7, 2016, crash of a Tesla near Williston, Florida.

The board determined that Autopilot’s “operational limitations played a role” in the crash even though the system “functioned as designed.”

As previously reported, a 2015 Tesla Model S 70D, traveling eastbound on US Highway 27A (US-27A) west of Williston, Florida, struck and passed beneath a refrigerated semitrailer pulled by a 2014 Freightliner Cascadia truck-tractor. At the time of the collision, the truck was making a left turn from westbound US-27A across the two eastbound travel lanes. Impact with the right side of the semitrailer sheared off the roof of the Tesla. The driver and sole occupant of the Tesla died in the crash; the commercial truck driver was not injured.

You can read more about the circumstances of the crash here.

During a press conference today, the NTSB sought to clear up some misconceptions about the event, which has been wrongly painted by the media as “the first self-driving car death.”

Of course, the Tesla Model S involved in the accident was not a “self-driving car”. It was operating with Tesla’s first-generation Autopilot system, which is a level 2 semi-autonomous driver assist system.

NTSB made that clear this morning in a series of statements.

Instead, the board described the event as “the 1st known case of a highway fatality in an auto operating with this level, or higher levels, of automated control systems.”

They also described the purpose of the investigation.

The board said that Autopilot functioned as designed during the accident since it wasn’t meant to prevent this type of crash.

Where Autopilot did play a role was through its “operational limitations,” said the chairman. Reuters reported:

“The chairman of the U.S. National Transportation Safety Board (NTSB) said Tuesday “operational limitations” in the Tesla Model S played a “major role” in the May 2016 crash that killed a driver using the vehicle’s semi-autonomous “Autopilot” system.”

NTSB says that “humans are very poor at monitoring automated systems” and that such systems need to ensure that drivers stay vigilant and keep monitoring their vehicles, something Tesla already asks of drivers when using Autopilot.

Since the accident, Tesla has introduced more alerts to ensure that drivers keep their hands on the steering wheel when using Autopilot, and as we recently reported, it also added a driver-facing camera to its latest Autopilot hardware suite, which could be used to monitor the driver’s attention.





