
Tesla says FSD was off before Cybertruck crash — but the video tells a different story

A viral dashcam video shows a Tesla Cybertruck slamming into a concrete overpass barrier on a Houston highway while allegedly using “Full Self-Driving.” Elon Musk responded by saying Tesla’s logs show the driver disengaged the system 4 seconds before impact.

Tesla fans jumped on this as yet another example of the media spreading FUD about the company. But watching the actual video tells a more nuanced — and more concerning — story about the real problem with FSD.

The crash

The incident happened on August 18, 2025, on Houston’s 69 Eastex Freeway. Justine Saint Amour was driving her Cybertruck with “Full Self-Driving” engaged as the vehicle approached a Y-shaped overpass split. The road required the vehicle to follow a right-hand curve.

Instead, the dashcam footage, released by the plaintiff’s law firm Hilliard Law and aired by Fox News, shows the Cybertruck barreling straight ahead at what appears to be highway speed, plowing through traffic cones separating the lanes and slamming into the concrete barrier at the edge of the overpass. Parts flew off the vehicle on impact.


Saint Amour had her 1-year-old child in the backseat. The child was unharmed, but Saint Amour suffered two herniated discs in her lower back, one in her neck, sprained tendons in her wrist, and numbness and weakness in her right hand. She is suing Tesla for over $1 million, with the lawsuit including 16 allegations of negligent conduct, and the unusual claim that Tesla was negligent in retaining Elon Musk as CEO.

Musk’s response and the 4-second defense

After the video went viral, Musk posted on X that “Logs show driver disengaged Autopilot four seconds before crashing.” Tesla propagandists ran with this, framing the incident as proof that the media unfairly blames FSD for crashes that happen under manual driving.

But this framing misses the point entirely.

Saint Amour’s attorney Bob Hilliard acknowledged that his client disengaged the system before impact. The reason she disengaged it is the critical detail: the system was already failing. As Hilliard put it, she tried to take control, but it was too late to correct the vehicle’s trajectory.

Watch the video closely. The Cybertruck is approaching the overpass curve at full highway speed with no sign of slowing down or turning. The system appears to have completely missed the curve. By the time the driver realized FSD was not going to navigate the turn correctly at that speed and grabbed the wheel, the vehicle was already committed to a straight-line path into the barrier. Four seconds is not enough time to correct a multi-ton truck traveling at highway speed toward a concrete wall, no matter how skilled the driver is.
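Some rough math makes the point concrete. The footage doesn't give an exact speed, so call it 65 mph with hard braking at about 0.7 g; both are back-of-envelope assumptions, not figures from the case:

65 mph ≈ 29 m/s, so 4 seconds of travel covers v·t ≈ 116 m (about 380 ft)
a typical 1 to 1.5 second human reaction consumes 29 to 44 m of that before any input happens
full braking from 29 m/s at 0.7 g (≈ 6.9 m/s²) needs v²/(2a) ≈ 61 m more

Under those assumptions, even an instant, maximal response eats roughly 90 to 105 of those 116 meters, and that assumes the driver recognizes the failure immediately rather than over the 5 to 8 seconds the vigilance research discussed below suggests.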

The real problem: overconfidence and the supervision trap

This incident is a textbook example of what we’ve been documenting at Electrek for years: the fundamental flaw in Tesla’s approach to “supervised” autonomy.

Just yesterday, we covered how former Uber self-driving chief Raffi Krikorian crashed his Tesla on FSD and wrote a damning essay in The Atlantic about the “vigilance decrement” problem. The research is clear: drivers need 5 to 8 seconds to mentally reengage after an automated system fails. Emergencies unfold faster than that.

Tesla’s “Full Self-Driving” works well enough, most of the time, that drivers trust it. That trust is the problem. When the system suddenly fails, as it did approaching this overpass curve, the driver is caught in the worst possible position: not fully engaged, forced to process what’s happening, make a decision, and execute a correction in a vehicle that’s already heading toward disaster.

The fact that the driver disengaged FSD 4 seconds before the crash doesn’t prove the system was blameless. It proves the exact opposite: the driver recognized FSD was failing and desperately tried to correct it, but the system had already put her in an unrecoverable situation.

This is the same pattern we documented in November when we wrote about Tesla’s “Schrödinger’s FSD” problem: when FSD works, Tesla takes credit. When it crashes, the driver gets blamed.

A mounting pattern

This is far from an isolated incident. Earlier this month, a viral video showed FSD driving a Tesla through railroad crossing barriers near an approaching train. NHTSA’s ongoing investigation into FSD has documented 80 reported traffic violations, including 14 crashes that resulted in 23 injuries. And Tesla just had to pay a $243 million judgment over an Autopilot crash — a number that signals where courts are heading on these cases.

Meanwhile, Waymo’s fully autonomous vehicles are operating without any driver supervision at all, in multiple cities, with a strong safety record. That’s the benchmark. Not a system that works 99% of the time and leaves a human to handle the 1% that can kill them.

Electrek’s Take

Tesla fans using the “4-second disengagement” as a gotcha are missing the forest for the trees. Yes, the driver was technically in control of the vehicle at the moment of impact. But she was in control only because FSD was already failing: it entered the approach to this sharp curve far too fast and was heading straight into a concrete barrier at highway speed with no sign of correcting.

Anyone who has used FSD or Autopilot extensively and still paints this 4-second disengagement as a “gotcha” moment is being disingenuous, and that includes Elon Musk.


I have tens of thousands of miles on FSD, and I’ve experienced the system coming into a turn too fast at least half a dozen times.

We’ve said this before and we’ll keep saying it: the problem with FSD isn’t what happens when the driver is paying attention and the system works. The problem is what happens when the system gives you every reason to trust it, and then suddenly doesn’t work. The driver has to recognize the failure, assess the situation, decide on a correction, and physically execute it, all in less time than the system needs to create the danger.

Musk and Tesla’s propagandists can point to the logs all they want. The video shows what actually matters: FSD approaching a standard highway curve at full speed with zero indication it was going to navigate it. That’s the failure. Everything that happened after, including the panicked disengagement, is a consequence of that failure.

The framing that this was “manual driving, not FSD” is technically true for the final 4 seconds and deeply dishonest about the full sequence of events. It’s exactly the kind of liability shell game that courts are increasingly rejecting, as that $243 million verdict makes clear. Tesla created the system, sold it as “Full Self-Driving,” and profits from the ambiguity. At some point, it has to own the consequences.


