
Another fatal Tesla crash on Autopilot goes to trial, but with new evidence this time

Another fatal crash on Tesla Autopilot is going to trial, and while Tesla has won every such trial in the past, this one involves new evidence that could help the plaintiffs.

Over the years, there have been a handful of fatal crashes involving Tesla’s advanced driver-assistance system (ADAS) features, more commonly referred to by their brand names: Autopilot and the Full Self-Driving (FSD) package.

The families of the victims have at times taken these accidents to trial with wrongful death lawsuits, but Tesla has always won.

That’s because, in virtually all cases, Tesla was able to show that the driver was not paying attention at the moment of the accident or leading up to it. When using Autopilot or FSD Beta, Tesla tells drivers that they need to pay attention and be ready to take control at all times.

If drivers are not doing that, they are misusing the system.

However, some have argued that Tesla should take more responsibility for creating an exaggerated level of confidence in its ADAS systems, and that it should limit misuse by better ensuring that drivers are paying attention.

Now, a new trial is about to take place, and the lawyers of the family of the deceased Tesla driver have uncovered evidence that they claim shows Tesla knew it was too easy to abuse Autopilot.

The trial is about one of the most publicized Tesla Autopilot accidents. We reported on it extensively when it first happened and in follow-ups regarding several investigations of the crash.

The Tesla Autopilot Crash

The crash occurred in March 2018 and involved Apple engineer Walter Huang.

Huang was driving his Model X on Autopilot when it entered the median of a ramp on the highway as if it were a lane, a common problem with Tesla’s Autopilot at the time. About 150 meters after entering the median, it hit a barrier.

The impact was quite severe because the crash attenuator had already been destroyed in a previous crash and had not been replaced. The driver was rushed to the hospital, but he died of his injuries.

The NTSB investigated the accident and confirmed that the vehicle was on Autopilot at the time of the crash. However, based on phone data showing that the driver was playing a video game on his phone, it assigned blame to the driver, as well as to the missing crash attenuator, which worsened the severity of the crash.

The Trial

The family has sued Tesla for wrongful death, and it is going to be quite an uphill battle for them because it looks like Huang was using his phone while driving, which is both a traffic violation and against Tesla’s guidance on how to use Autopilot.

That said, the family’s lawyers benefit from the lessons of previous similar trials, and they are taking a different approach. They are not denying Huang’s misuse of Autopilot, but they are focusing on Tesla’s communications, which they claim led the driver to misuse Autopilot.

As we previously reported, as part of the discovery process for the trial, the family’s lawyers have focused on several statements made by Tesla, and specifically Elon Musk, about Tesla’s Autopilot and Full Self-Driving efforts that could lead drivers to be overconfident in the systems.

In its defense, Tesla went so far as to bizarrely claim that some of Musk’s comments on the matter might have been deepfakes, without specifying which ones.

The trial is now set to start next week in a San Jose court, and more pieces of evidence are coming out as the court determines what the lawyers will be able to show the jury.

Reuters reported on an email that Jon McNeill, Tesla’s president at the time, sent to CEO Elon Musk and Sterling Anderson, then Tesla’s head of Autopilot, in which McNeill admitted to reading emails while using Autopilot:

“I got so comfortable under Autopilot, that I ended up blowing by exits because I was immersed in emails or calls (I know, I know, not a recommended use).”

The lawyers are also arguing, based on Tesla witnesses and experts, that Tesla never “studied how quickly and effectively drivers could take control if Autopilot accidentally steers towards an obstacle.”

It sounds like the trial is going to revolve around what Tesla communicated to owners and what it has done internally to ensure owners use its systems safely.

But it will also focus on the fact that, after the accident, Tesla took more steps to ensure driver attention, including introducing driver monitoring through its cabin-facing camera and, more recently, issuing a recall to increase alerts for drivers to pay attention when using Autopilot and FSD Beta.

The fact that Tesla recently had a recall over this issue could play a big role in the trial, since that wasn’t the case in the previous trials won by the automaker.

Electrek’s Take

When cases involve a death, it’s always a sensitive matter, and the Tesla community is quick to put all the blame on the drivers.

That’s especially easy to do when the driver was using his phone at the moment of the crash, which is illegal, and he seemingly had more than a few seconds to react when Autopilot made a mistake and went into the median.

That said, I think it is reasonable to explore, at least, the possibility that Tesla has contributed to the misuse of its own ADAS system.


We can’t deny that misuse was fairly common among Tesla owners at the time of this crash, and maybe it still is. We even reported on Elon Musk’s own wife at the time posting videos of herself misusing Autopilot on Instagram.

Now, on a legal basis, I don’t know how valuable this argument is, but it sounds like some experts think there’s a case.

Matthew Wansley, an associate professor at Cardozo School of Law, argues that Tesla had an obligation to prevent “foreseeable misuse”:

“If it was reasonably foreseeable to Tesla that someone would misuse the system, Tesla had an obligation to design the system in a way that prevented foreseeable misuse.”

Either way, I think it can’t hurt to debate the issue, especially if it helps publicize the fact that Tesla drivers need to pay attention at all times when using Tesla’s ADAS systems.



