
Elon Musk claims zero crashes in Tesla’s Full Self-Driving Beta over a year into the program

Elon Musk claims that Tesla has not had a single crash in its Full Self-Driving Beta program since it started over a year ago, but that's just as much proof that the testers are being careful as it is proof that the system is safe.

“Full Self-Driving Beta” (FSD Beta) is an early version of Tesla’s self-driving software that is currently being tested by a fleet of Tesla owners selected by the company and through its “safety test score.”

The software enables the vehicle to drive autonomously to a destination entered in the car’s navigation system, but the driver needs to remain vigilant and ready to take control at all times. Tesla started the program in October 2020, and it has now pushed the software to several thousands of customers.

The test program has been criticized for putting advanced autonomous features in the hands of customers while leaving the responsibility with them by calling it a level two autonomous system in beta testing. Tesla has defended itself by saying that it has been careful, slowly rolling out the features to customers it deems “safer drivers.”

In response to a comment by Tesla shareholder Ross Gerber on Twitter, CEO Elon Musk confirmed yesterday that Tesla believes there still hasn’t been a single accident in the Full Self-Driving Beta program over a year after the launch:

If true, it would mean that he is disputing a previous crash report to the National Highway Traffic Safety Administration (NHTSA). A Model Y owner in the FSD Beta claimed in a complaint to NHTSA that the system caused a crash, but the complaint couldn’t be confirmed.

If the report was indeed inaccurate, it is impressive that Tesla hasn’t had an accident in what is likely millions of miles on FSD Beta. NHTSA says that on average there’s an accident every 500,000 miles for human drivers (i.e., all drivers).

Electrek’s Take

While impressive, it is probably more proof that Tesla owners in the FSD Beta program are being careful than that the system itself is safe, because we have seen plenty of videos where FSD Beta would have caused an accident had the driver not taken control.

It’s a “so far so good” situation, but we know that accidents are inevitable. Once one happens, I expect to see a significant ramp-up in criticism of Tesla’s approach to testing its self-driving system. In the meantime, Tesla is enjoying a lot of data from a test fleet that is not only free, but made up of customers who paid a lot of money to test the system.

We can argue about whether this is right or not, but you can’t argue that, as a business move, this is one hell of a play.




Author

Fred Lambert

Fred is the Editor in Chief and Main Writer at Electrek.

You can send tips on Twitter (DMs open) or via email: fred@9to5mac.com

Through Zalkon.com, you can check out Fred’s portfolio and get monthly green stock investment ideas.
