The safety of Tesla Autopilot came back into focus after it was confirmed that the driver assist system was on during the fatal accident that killed a Model X owner in Mountain View last month.
Now the family of the deceased says it is prepared to sue Tesla following a media interview.
As we previously reported, the Model X was driving on Autopilot when it entered the median of a highway ramp as if it were a lane and hit a barrier about 150 meters after entering the median.
The impact was particularly severe because there was no crash attenuator; it had already been destroyed in a previous crash. The driver was rushed to the hospital but unfortunately died of his injuries.
Sevonne Huang, the wife of the driver, Walter Huang, gave an interview to ABC7 yesterday.
During the interview, she said that her husband had previously complained about Autopilot’s behavior at that exact location:
Sevonne Huang: “And he want to show me, but a lot of time it doesn’t happen.” Dan Noyes: “He told you that the car would drive to that same barrier?” Sevonne: “Yes.” Noyes: “The same barrier that he finally hit?” Sevonne: “Yeah, that’s why I saw the news. I knew that’s him.”
The family hired attorney Mark Fong and say that they are prepared to sue Tesla.
“Unfortunately, it appears that Tesla has tried to blame the victim here. It took him out of the lane that he was driving in, then it failed to brake, then it drove him into this fixed concrete barrier. We believe this would’ve never happened had this Autopilot never been turned on.”
Tesla responded to the interview in a statement:
“We are very sorry for the family’s loss.
According to the family, Mr. Huang was well aware that Autopilot was not perfect and, specifically, he told them it was not reliable in that exact location, yet he nonetheless engaged Autopilot at that location. The crash happened on a clear day with several hundred feet of visibility ahead, which means that the only way for this accident to have occurred is if Mr. Huang was not paying attention to the road, despite the car providing multiple warnings to do so.
The fundamental premise of both moral and legal liability is a broken promise, and there was none here. Tesla is extremely clear that Autopilot requires the driver to be alert and have hands on the wheel. This reminder is made every single time Autopilot is engaged. If the system detects that hands are not on, it provides visual and auditory alerts. This happened several times on Mr. Huang’s drive that day.
We empathize with Mr. Huang’s family, who are understandably facing loss and grief, but the false impression that Autopilot is unsafe will cause harm to others on the road. NHTSA found that even the early version of Tesla Autopilot resulted in 40% fewer crashes and it has improved substantially since then. The reason that other families are not on TV is because their loved ones are still alive.”
I can’t blame the family for having this reaction, or any kind of reaction, after such a tragic loss, but when it comes to the lawsuit, it looks like they are undermining their own case.
As Tesla said in its statement, and as other Tesla owners confirmed by recreating the circumstances of the crash, an attentive driver would have had plenty of time to steer back into the correct lane after the car entered the median, which suggests that Huang was most likely not paying attention.
On top of that, his wife says he was aware that Autopilot had difficulty handling this specific situation, yet he decided to activate it anyway and apparently did not pay close attention.
In my opinion, as long as Tesla is being clear about drivers needing to stay attentive and keep their hands on the steering wheel, there’s not much of a case here that Tesla is responsible for the accident.
But with this said, as Tesla Autopilot improves it seems that some drivers are growing more confident with the driver assist system and are putting too much trust in it.
I think it’s important to remind everyone that as long as Tesla doesn’t claim Autopilot is anything more than a Level 2 driver assist system, and it hasn’t been thoroughly tested as anything more than that, drivers should always stay vigilant and be ready to take control.