Looking at the picture above, you would think it's the aftermath of a fatal accident, but the Tesla Model X driver actually walked away with "no injuries aside from a stiff neck". He credited the vehicle's safety features for saving his life, but he also blamed Autopilot for what he claims was "driving full speed into the back of a semi".

It’s actually a little more complicated than that.

After the accident, which happened last Thursday in California, the owner of a Model X P90D with first generation Autopilot posted his account of the crash on Facebook:

“There was a pickup truck that was out of gas in the right lane (lights were either dim or off, and give the night, was hard to see). A semi was pulling up onto it, saw it, braked and swerved into my middle lane. Autopilot did not disengage, but did the emergency beep about 1 second before impact. I was looking off to the side, and impacted the truck immediately after I heard the beep and looked forward.”

To be clear, the driver is always responsible for monitoring the vehicle when Autopilot is engaged and should be ready to take control at any time. In this case, the driver admitted that he wasn't looking forward at the time of the crash.

The purpose of Autopilot's main feature, Autosteer, is to keep the vehicle in its lane, which is exactly what it did during the accident. What could have prevented or reduced the force of the impact are the Automatic Emergency Braking or Emergency Steering features, but they only engage when there is no other option. In this case, avoiding the truck would likely have required swerving into the left lane, which could itself have been dangerous.

The “emergency beep” he referenced was the collision warning.

He shared the following picture with the account of the accident:

And continued with his account:

“The bottom of the semi went straight into my passenger seat (luckily no one was there) and broke off the headrest as well as the entire roof. Miraculously, I came out of the accident with no injuries aside from a stiff neck.”

Miraculously indeed. It looks like the bulk of the damage stopped inches away from the driver's seat, which fortunately was the only occupied seat in the vehicle.

He ended his account of the event with a warning to Autopilot users:

“The autopilot caused me to drive full speed into the back of a semi, but the amazing safety ratings saved my life. While I’m grateful that I’m alive, I just want to put this on notice to not get overly comfortable with the autopilot and that there are still many flaws and unaccountable situations.”

That's an interesting statement. He is right that users should be careful not to get too comfortable. Tesla CEO Elon Musk has disclosed that the company has more problems with "expert users" getting too comfortable with the system than with new users who are just learning how to use it.

Where things get more complicated is when he calls it a "flaw" and blames Autopilot for crashing into the semi. That would only be valid if Autopilot were a Level 3+ autonomous driving system, but it's not. It's a Level 2 driver-assist system, and its purpose is not to avoid those kinds of accidents on its own. It can sometimes help avoid them, and some data shows that it reduces the crash rate by roughly 40%, but you cannot rely on it to avoid them altogether.

Nonetheless, it serves as a good reminder not to get too comfortable. We can start talking about placing responsibility on the system, and getting comfortable with it, when the next generation of autonomous driving systems arrives, perhaps with upcoming software updates to Tesla's next-generation Autopilot. In the meantime, stay safe.