
Tesla worker died in horrible crash on Full Self-Driving Beta, but he was also drunk

The fatal crash of a Tesla employee who was using Full Self-Driving Beta has been reported in detail for the first time, highlighting the question of responsibility in these accidents.

The Crash

The Washington Post released a new report today on the crash, which happened back in 2022.

Hans von Ohain, a recruiter at Tesla, and his friend Erik Rossiter set out from outside Denver, Colorado, in von Ohain’s Tesla Model 3 to go golfing.

During the drive there, Rossiter says that von Ohain was using FSD Beta, Tesla’s driver-assist system that takes over all the driving controls but requires the driver to keep their hands on the steering wheel and be ready to take control at all times.

Rossiter said that FSD Beta swerved several times along the way and that von Ohain had to take control.

They played 21 holes and drank alcohol during the day. Rossiter said von Ohain seemed composed and “by no means intoxicated” when getting into the car for the drive back.

The Washington Post described the crash:

Hours later, on the way home, the Tesla Model 3 barreled into a tree and exploded in flames, killing von Ohain, a Tesla employee and devoted fan of CEO Elon Musk. Rossiter, who survived the crash, told emergency responders that von Ohain was using an “auto-drive feature on the Tesla” that “just ran straight off the road,” according to a 911 dispatch recording obtained by The Washington Post. In a recent interview, Rossiter said he believes that von Ohain was using Full Self-Driving, which — if true — would make his death the first known fatality involving Tesla’s most advanced driver-assistance technology.

While Rossiter admits he doesn’t have a great recollection of what happened, he did say he remembers getting out of the car, seeing a big orange glow, and then trying to get his friend out as he screamed inside the burning vehicle. A fallen tree was blocking the driver’s door.

An autopsy found that von Ohain died with a blood alcohol level of 0.26, more than three times the 0.08 legal limit.

The Colorado State Patrol determined that intoxication was the main factor behind the accident, but it also investigated the possible role of Tesla’s Full Self-Driving Beta.

The Responsibility

Von Ohain’s widow Nora Bass wants Tesla to take responsibility for her husband’s death:

“Regardless of how drunk Hans was, Musk has claimed that this car can drive itself and is essentially better than a human. We were sold a false sense of security.”

She hasn’t been able to find a lawyer to take the case because her husband was intoxicated.

Colorado State Patrol Sgt. Robert Madden, who led the investigation, said he found rolling tire marks at the site of the crash, which means that the motor kept sending power to the wheels at the time of impact.

There were also no skid marks found.

Madden said:

“Given the crash dynamics and how the vehicle drove off the road with no evidence of a sudden maneuver, that fits with the [driver-assistance] feature.”

We don’t have access to the vehicle’s logs. The police were not able to recover them after the fire, and Tesla reportedly told the police that it didn’t receive the logs over the air. Therefore, Tesla couldn’t confirm whether any driver-assist features were active at the time of the crash.

Electrek’s Take

That’s horrible. I can’t imagine trying to drag your screaming friend out of a burning car. I am sorry for von Ohain’s loved ones.

Based on the information we have here, it does seem like von Ohain was intoxicated and overconfident in FSD Beta. The feature appears to have failed, and he couldn’t take control in time to avoid the fatal crash.

They are both at fault. Von Ohain, rest in peace, had no excuse for getting behind the wheel intoxicated, and it sounds like Tesla’s FSD Beta failed badly.


But if we dig a little bit deeper, it is an interesting situation.

To be honest, the fact that he was a Tesla employee makes this whole situation a lot more complicated. It means he should have known very well that you need to pay attention when using FSD Beta and be ready to take control at all times.

Now, it might be that his intoxication led him to decide that using FSD Beta on winding mountain roads was a good idea, or he might have been taking chances with FSD Beta even when sober, which is what his wife is pointing to with her comment about a “false sense of security.”

This is definitely an area where Tesla can improve: managing expectations around FSD Beta, which is not easy to do when you literally call it “Full Self-Driving.”


