Tesla’s latest Autopilot feature, Navigate on Autopilot, has been trashed in a review by Consumer Reports, which said the system is “less competent than a human driver,” makes “poor decisions,” and creates “potential safety risks.”
Navigate on Autopilot enables on-ramp to off-ramp driving on the highway with the system doing its own lane changes based on speed and the destination entered in the navigation system.
It is supposed to handle interchanges and off-ramps.
In their new review of Navigate on Autopilot, Consumer Reports found the system to be poor:
“In practice, we found that Navigate on Autopilot lagged far behind a human driver’s skill set: The feature cut off cars without leaving enough space and even passed other cars in ways that violate state laws, according to several law enforcement representatives CR interviewed for this report. As a result, the driver often had to prevent the system from making poor decisions.”
Jake Fisher, Consumer Reports’ senior director of auto testing, added:
“The system’s role should be to help the driver, but the way this technology is deployed, it’s the other way around. It’s incredibly nearsighted. It doesn’t appear to react to brake lights or turn signals, it can’t anticipate what other drivers will do, and as a result, you constantly have to be one step ahead of it.”
The magazine argues that the state of the system points to Tesla being late on its goal to deliver a full self-driving system by the end of the year.
That was a pretty harsh review.
Based on my several hundred miles of experience with Navigate on Autopilot (NoA), I agree with many of the weaknesses highlighted in the review, but I think they miss the point by calling it “less competent than a human driver” and a “safety risk.”
They make it sound like it is useless at the moment and I disagree.
It’s still very much a driver assist system, and therefore the fact that it is less competent than humans is not really the point. And if it’s used as a driver assist system, then it doesn’t really create a safety risk.
Of course, that’s aside from clear bugs like suggesting a lane change into oncoming traffic.
In my own experience, I rarely have to intervene when using the system. My most common intervention is to move back into the right lane sooner after passing, because I feel NoA is a bit too cautious about getting far ahead of the passed vehicle before merging back.
But I always pay attention and try to understand the system’s intentions.
It always seems to be trying to do the right thing, but if I don’t like the hesitation, I take over.
For the most part, I found that it reduces my driving workload and I remain safe as long as I pay attention. It even prevented me from missing my exit on two occasions.
I think there’s plenty of room for improvement, as highlighted by CR, but I disagree with portraying the feature as dangerous. It simply isn’t if used as intended.