
Tesla self-driving test driver: ‘you’re running on adrenaline the entire eight-hour shift’

A new report based on interviews with former test drivers who were part of Tesla’s internal self-driving team reveals the dangerous extremes the automaker is willing to go to in testing its autonomous driving technologies.

While you could argue that Tesla’s customers are already self-driving test drivers, since the automaker deploys what it calls its “supervised self-driving” (FSD) system to them, the company also operates an internal fleet of testers.

We previously reported on Tesla hiring drivers all over the country to test its latest ‘FSD’ software updates.

Now, Business Insider is out with a new report after interviewing nine of those test drivers, who worked on a specific project called ‘Rodeo’. The report describes the project:

Test drivers said they sometimes navigated perilous scenarios, particularly those drivers on Project Rodeo’s “critical intervention” team, who say they’re trained to wait as long as possible before taking over the car’s controls. Tesla engineers say there’s a reason for this: The longer the car continues to drive itself, the more data they have to work with. Experts in self-driving tech and safety say this type of approach could speed up the software’s development but risks the safety of the test drivers and people on public roads.

One of those former test drivers described it as “a cowboy on a bull and you’re just trying to hang on as long as you can” – hence the program’s name.

Other than sometimes using a version of Tesla FSD that hasn’t been released to customers, the test drivers generally use FSD much like customers do, with the main difference being that they more often try to push it to its limits.

Business Insider explains the “critical intervention” team within Project Rodeo in more detail:

Critical-intervention test drivers, who are among Project Rodeo’s most experienced, let the software continue driving even after it makes a mistake. They’re trained to stage “interventions” — taking manual control of the car — only to prevent a crash, said the three critical-intervention drivers and five other drivers familiar with the team’s mission. Drivers on the team and internal documents say that cars rolled through red lights, swerved into other lanes, or failed to follow posted speed limits while FSD was engaged. The drivers said they allowed FSD to remain in control during these incidents because supervisors encouraged them to try to avoid taking over.

These are behaviors FSD is known to exhibit in customer vehicles, but drivers generally take over before things go too far.

The goal of this team is to go too far.

One of the test drivers said:

“You’re pretty much running on adrenaline the entire eight-hour shift. There’s this feeling that you’re on the edge of something going seriously wrong.”

Another test driver described how Tesla FSD came within a couple of feet of hitting a cyclist:

“I vividly remember this guy jumping off his bike. He was terrified. The car lunged at him, and all I could do was stomp on the brakes.”

The driver’s supervisor was reportedly pleased with the incident. “He told me, ‘That was perfect.’ That was exactly what they wanted me to do,” said the driver.

You can read the full Business Insider report for many more examples of the team doing very dangerous things around unsuspecting members of the public, including pedestrians and cyclists.

How does this compare to other companies developing self-driving technology?


Market leader Waymo reportedly has a team doing similar work to Tesla’s Rodeo “critical intervention” team, but the difference is that it does its testing in closed environments with dummies.

Electrek’s Take

This appears to be a symptom of Tesla’s start-up approach of “move fast, break things,” but I don’t think that approach is appropriate on public roads.

To be fair, none of the nine test drivers interviewed by BI said that they were in an accident, but they all described some very dangerous situations in which outsiders were dragged into the testing without their knowledge.

I think that’s a bad idea and ethically wrong. Elon Musk claims that Tesla is about “safety first”, but the examples in this report sound anything but safe.




