After Tesla’s Autopilot came under scrutiny over the past two years following a fatal accident in a Model S that was operating on Autopilot, the driver-assist system seemed to have caught a break.
But now a new accident has prompted another investigation.
Earlier this week, we reported on a Tesla Model S rear-ending a fire truck at a reported speed of 65 mph while the vehicle was on Autopilot, according to the driver.
There were no injuries, but the fact that Autopilot may have been active prompted the U.S. National Transportation Safety Board (NTSB) to launch an investigation.
The federal agency announced yesterday that it is sending two investigators to conduct a “field investigation” of the crash:
Two NTSB investigators from HQ to conduct field investigation of Jan. 22, crash involving a Tesla and fire truck, near Culver City, CA. Focus of field investigation is driver and vehicle factors.
— NTSB Newsroom (@NTSB_Newsroom) January 23, 2018
In its previous investigation of the fatal 2016 crash, the NTSB concluded that Tesla’s Autopilot ‘functioned as designed’ but ‘played a role’ in the accident.
The board said that Autopilot functioned as designed during the accident since it wasn’t meant to prevent this particular type of crash, but that it still played a role through its “operational limitations”.
They said that “humans are very poor at monitoring automated systems” and that such systems need to ensure that drivers stay vigilant and keep monitoring their vehicle, something Tesla asks of drivers when using Autopilot.
It took the NTSB over a year to share its findings from that 2016 investigation.
Electrek’s Take
Of course, this investigation is going to be significantly different since the driver is alive and should be able to answer questions.
We don’t have all the facts right now, but it certainly looks like a case of the driver not paying attention while Autopilot was driving on the highway. If I had to guess, I’d say that Autopilot failed to detect the stationary truck, and the driver, either not paying attention or assuming that Autopilot would slow down and brake, did nothing.
We have seen the same situation on several occasions, and ultimately the responsibility falls on the driver since Autopilot, in its current version, is still only a driver-assist system and not a self-driving system.
Now, it would be great if Autopilot were able to prevent these situations, and we have seen the system do it on occasion, but we can’t assume that it will do so 100% of the time, which is why the driver always needs to monitor the vehicle and take action if necessary.
What do you think? Let us know in the comment section below.