Elon Musk claimed this week that Tesla’s “Full Self-Driving” is so much safer than human drivers that it could save 90% of the roughly one million lives lost in car crashes globally each year — a 10X improvement in safety.
The problem is that Tesla has never released the data that would support anything close to that claim, and Musk is already using it to pre-frame the lawsuits Tesla is facing over FSD crashes as an unavoidable cost of progress.
In a post on X, Musk wrote that “Tesla self-driving saves a lot of lives – the statistics are unequivocal,” adding that even when Tesla improves safety 10X and saves “90% of the million lives lost in auto accidents every year, Tesla will still get sued for the 10% who did die.”
He was quote-tweeting a viral anecdote about a Model 3 on FSD swerving around a pedestrian who walked into highway traffic.
The “10X safer” claim doesn’t match the data Tesla has released
Tesla’s only public safety data on FSD is its quarterly Vehicle Safety Report, which compares miles driven per crash with Autopilot or FSD engaged to the US national average of miles per crash reported by NHTSA.
That comparison has been criticized for years by independent researchers because it stacks the deck in Tesla’s favor in at least four ways:
Road-type mismatch. Autopilot and FSD are used overwhelmingly on highways, which are already the safest roads per mile driven. The NHTSA baseline mixes highways with city streets, rural roads, and parking lots, where crashes are far more frequent.
Vehicle-age mismatch. Teslas are, on average, among the newest cars on US roads. New vehicles with modern passive safety, AEB, and lane-keeping systems crash less than the 12-year-old average car in the NHTSA fleet — with or without FSD engaged.
Driver mismatch. Tesla owners skew older, wealthier, and more urban than the overall US driving population — a demographic that already crashes less than average.
Crash-definition mismatch. Tesla only counts a crash when an airbag or other pyrotechnic restraint deploys. NHTSA’s crash baselines are built from police-reported crashes, which include vast numbers of lower-severity collisions where no airbag ever fires. Tesla is effectively comparing “crashes severe enough to set off an airbag” against “any crash a cop wrote up” — different denominators, by a wide margin. Safety researcher Phil Koopman and others have flagged this as one of the biggest distortions in Tesla’s numbers.
Control for any one of those factors and Tesla’s “safer than human” gap shrinks. Control for all four and independent analyses have repeatedly found that FSD’s safety advantage is, at best, unproven.
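To see why these mismatches matter so much, here is a minimal back-of-the-envelope sketch in Python. Every number in it is a hypothetical placeholder (Tesla does not publish the real figures); the point is only to show how sensitive a naive “miles per crash” ratio is to the baseline it’s divided by.

```python
# Hypothetical illustration of how baseline mismatches inflate a naive
# "X times safer" ratio. Every number below is a made-up placeholder;
# Tesla does not publish the real figures.

# Naive comparison (the Vehicle Safety Report approach):
tesla_miles_per_crash = 5_000_000   # airbag-deployment crashes, mostly highway miles
us_avg_miles_per_crash = 500_000    # police-reported crashes, all road types

naive_ratio = tesla_miles_per_crash / us_avg_miles_per_crash
print(f"Naive ratio: {naive_ratio:.1f}x 'safer'")   # 10.0x

# Adjustment 1: road-type mix. Highways see far fewer crashes per mile
# than surface streets, so a highway-heavy fleet should be compared to
# a highway-weighted baseline. Assume highways are ~3x safer per mile.
highway_adjustment = 3.0
adjusted_baseline = us_avg_miles_per_crash * highway_adjustment

# Adjustment 2: crash definition. Only a fraction of police-reported
# crashes are severe enough to deploy an airbag. Assume ~1 in 3.
airbag_fraction = 1 / 3
adjusted_baseline /= airbag_fraction   # miles per airbag-level crash

adjusted_ratio = tesla_miles_per_crash / adjusted_baseline
print(f"Adjusted ratio: {adjusted_ratio:.1f}x")      # ~1.1x
```

Under those made-up adjustments alone, a headline “10x safer” collapses to roughly parity, and that sketch doesn’t even touch the vehicle-age or driver-demographic mismatches. The real effect sizes could land anywhere, which is exactly why the raw denominators matter.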
Tesla does not release disengagement data, crash-by-severity data, miles driven by road type, or the denominator it uses to calculate its own crash rate. Waymo, by contrast, publishes peer-reviewed safety comparisons with matched human-driver baselines on the same roads, and has been transparent enough that insurers like Swiss Re have run their own analyses on its fleet.
There is no version of Tesla’s current public data that gets you to “10X safer,” let alone “saves 900,000 lives a year.” Musk is making up a number.
The “10% who did die” — and what Tesla is actually being sued for
The more revealing part of Musk’s post is the framing of the lawsuits. He’s presenting them as an inevitability — Tesla saves the 90%, gets sued by the families of the 10%, and that’s just the price of doing the right thing.
That’s not what the lawsuits are about.
Tesla is being sued — and in some cases has already lost or settled — over crashes where the plaintiffs argue that Autopilot or FSD actively contributed to the crash. Not that the system failed to save a driver who would have died anyway, but that the system made a mistake a reasonable human driver wouldn’t have made, or that Tesla’s marketing led drivers to trust the system in situations where it wasn’t capable.
Those are two very different categories:
1. A crash FSD couldn’t prevent (driver error, pedestrian darting out, another car running a red light).
2. A crash FSD caused — either by making a perception or planning error, or by lulling an over-confident driver into not paying attention to a situation the system couldn’t handle.
Musk is conflating the two. He’s rhetorically moving every FSD crash into category one so that every lawsuit looks like an ungrateful family suing the company that tried to save their loved one.
That’s not a fair read of the Autopilot and FSD litigation record. In the 2023 Banner case, the 2024 Huang family settlement, and the ongoing wrongful-death cases tied to highway crashes on Autopilot, the central allegation is that Tesla’s driver-assistance system did something wrong — phantom braking, failing to recognize a stationary object, steering into a barrier, disengaging a fraction of a second before impact — and that Tesla’s marketing of “Full Self-Driving” encouraged drivers to rely on it beyond its actual capability.
You can believe Tesla’s driver-assistance systems prevent some crashes and believe Tesla is legally and morally responsible when those systems cause other crashes. Those aren’t contradictory positions. Musk’s framing only works if you pretend the second category doesn’t exist.
Electrek’s Take
We’ve been covering Tesla’s “safer than humans” talking point for nearly a decade, and it has never been backed by data that would survive peer review. The company has the mileage, the telemetry, and the engineering talent to produce a Waymo-style safety analysis with matched baselines. It has chosen not to.
That’s a choice, and it’s a telling one. If FSD were genuinely 10X safer than a human driver on comparable roads, Tesla would publish the numbers tomorrow — it would be the single most valuable marketing asset in the history of the car industry. Instead, we get a tweet with a made-up “90% of a million lives” figure and a pre-emptive complaint about lawsuits.
The “10% who did die” framing is the part that should bother people. It treats every family suing Tesla as collateral damage in a utilitarian trade they never agreed to, when in reality many of those lawsuits are about specific failures of a specific system that was specifically marketed as being more capable than it is. Tesla can’t sell “Full Self-Driving” for nearly a decade, collect billions in deferred revenue on the promise, and then recast every crash as the unavoidable cost of saving everyone else.
If Musk wants the 10X number to mean anything, Tesla needs to show the work. Until then, it’s just a tweet.