
Tesla Autopilot is under federal scrutiny again after Model S crash

After coming under scrutiny over the past two years following a fatal accident in a Model S operating on Autopilot, Tesla’s driver-assist system seemed to have caught a break.

But now another accident has prompted a new investigation.

Earlier this week, we reported on a Tesla Model S rear-ending a fire truck at a reported 65 mph while, according to the driver, the vehicle was on Autopilot.

There were no injuries, but the possibility that Autopilot was active prompted the U.S. National Transportation Safety Board (NTSB) to launch an investigation.

The federal agency announced yesterday that it is sending two investigators to conduct a “field investigation” of the crash.

In its previous investigation of the fatal 2016 crash, the NTSB concluded that Tesla’s Autopilot ‘functioned as designed’ but ‘played a role’ in the accident.

The board found that Autopilot functioned as designed because it wasn’t meant to prevent this particular type of crash, but that it still played a role through its “operational limitations.”

It noted that “humans are very poor at monitoring automated systems” and that such systems need to ensure drivers stay vigilant and keep monitoring their vehicle, something Tesla asks of drivers using Autopilot.

It took the NTSB over a year to share its findings from the 2016 accident.

Electrek’s Take

Of course, this investigation is going to be significantly different since the driver is alive and should be able to answer questions.

We don’t have all the facts right now, but it certainly looks like a case of the driver not paying attention while Autopilot was driving on the highway. If I had to guess, I’d say that Autopilot failed to detect the stationary truck, and since the driver wasn’t paying attention, or assumed that Autopilot would slow down and brake, he didn’t react.

We have seen the same situation on several occasions, and ultimately the responsibility falls on the driver, since Autopilot in its current version is still only a driver-assist system, not a self-driving system.

Now, it would be great if Autopilot were able to prevent those situations, and we have seen the system do it on occasion, but we can’t assume it will do so 100% of the time, which is why the driver always needs to stay alert and take action if necessary.

What do you think? Let us know in the comment section below.




Author

Fred Lambert

Fred is the Editor in Chief and Main Writer at Electrek.

You can send tips on Twitter (DMs open) or via email: fred@9to5mac.com

Through Zalkon.com, you can check out Fred’s portfolio and get monthly green stock investment ideas.