A misleading article titled 'The customer is always wrong: Tesla lets out self-driving car data – when it suits' in the Guardian today sparked criticism over Tesla's policy on releasing Autopilot data following a few crashes in which the drivers blamed Tesla's driver assist system. I say that it's misleading because the author, Sam Thielman, claims that we are talking about "self-driving" vehicles, which is obviously not the case.
The only “self-driving data” that Tesla ever released was through the California DMV’s disengagement report for just over 500 miles driven by a test fleet in California last year.
What he is instead talking about are the logs that Tesla has sometimes released in the media after drivers have claimed that some of the company’s Autopilot features have caused crashes. Thielman asserts that Tesla conveniently only releases the data when it serves them – hence the “customer is always wrong” comment.
He wrote:
“And while the company has handed data to media following crashes, it won’t provide its customers’ data logs to the drivers themselves, according to interviews conducted by the Guardian.”
And continued:
“The Guardian could not find a single case in which Tesla had sought the permission of a customer who had been involved in an accident before sharing detailed information from the customer’s car with the press when its self-driving software was called into question. Tesla declined to provide any such examples and disputed the description of its automation software, called Autopilot, as “self-driving”.”
Ironically, the article is mostly based on a particular case where Tesla didn’t even release any data to the media. We are talking about the case of a Model S driver in Switzerland who rear-ended a van while using Autopilot on the highway.
The driver posted a video of the accident and claimed that Autopilot's active cruise control feature caused it. Tesla didn't share any data logs with the media – we reported on the accident when it happened – instead, the company pointed to the fact that it is the driver's responsibility to stay vigilant and to take control if needed.
Tesla's owner's manual has a warning specifically for the kind of situation that led to the accident:
“Warning: Traffic-Aware Cruise Control can not detect all objects and may not brake/decelerate for stationary vehicles, especially in situations when you are driving over 50 mph (80 km/h) and a vehicle you are following moves out of your driving path and a stationary vehicle or object, bicycle, or pedestrian is in front of you instead. Always pay attention to the road ahead and stay prepared to take immediate corrective action. Depending on Traffic-Aware Cruise Control to avoid a collision can result in serious injury or death. In addition, Traffic-Aware Cruise Control may react to vehicles or objects that either do not exist or are not in the lane of travel, causing Model S to slow down unnecessarily or inappropriately.”
Since Tesla didn’t even share the logs in this case, it’s a bad example. But to be fair, the article also cites other examples.
In every example, however, Tesla only provided data after the driver either went to the media or blamed Autopilot when talking to the local authorities. We reported on each case individually when it happened – except for one:
- Tesla Model S driver claims his car crashed into a trailer on its own, Tesla says ‘Summon’ was activated
- Tesla responds to ‘cover-up’ claims in ‘Montana Autopilot Accident’, offers more details on investigation
- Autopilot wasn’t on during Model X crash in PA and Musk says it would have prevented accident
As for the fact that Tesla doesn't share the data with the drivers themselves, it's important to note that what we are calling "data logs" are not an actual plain-text record of what happens in the car. Tesla logs data from sensors inside its vehicles in a proprietary format. We previously reported on a hacker who managed to decipher some of it to corroborate Tesla's statements in our story about Model X drivers claiming that their vehicles accelerated and crashed on their own, but it is still something that needs to be interpreted.
Tesla states that it “discloses only the minimum amount of information necessary”. Here’s an official statement from a Tesla spokesperson in response to the claims made in the Guardian story:
“Autopilot has been shown to save lives and reduce accident rates, and we believe it is important that the public have a factual understanding of our technology. In unusual cases in which claims have already been made publicly about our vehicles by customers, authorities or other individuals, we have released information based on the data to either corroborate or disprove these claims. The privacy of our customers is extremely important and something we take very seriously, and in such cases, Tesla discloses only the minimum amount of information necessary.”