Let me preface this by noting that there is currently no fully self-driving car on the roads. The prototypes in test programs today are far from perfect, and therefore don't represent the potential of a true Level 5 autonomous driving system.
Now with this out of the way, it seems like people just can’t stop crashing into GM’s ‘self-driving’ Bolt EV prototypes.
California's DMV keeps track of the companies testing autonomous vehicles in the state. It issues permits for the prototypes and requires companies to submit reports on their test programs, as well as reports on accidents involving self-driving test cars.
Most of them are quite benign and uneventful low-speed fender-benders, but after reading through all of them, I found a few somewhat interesting trends.
First off, GM's Cruise Automation prototypes, based on the Chevy Bolt EV, are involved in accidents more often than vehicles in any other test program.
Out of the 13 accidents reported to the DMV so far in 2017, 9 involved GM’s Cruise Automation prototypes.
To be fair, GM operates one of the biggest fleets of autonomous test vehicles, and it is adding hundreds of prototypes to that fleet, so its cars are statistically more likely to be involved in accidents.
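To see why fleet size matters, here's a minimal sketch of the normalization argument. The report totals (13 accidents, 9 involving Cruise) are from the DMV figures above; the fleet sizes are purely hypothetical placeholders, since the actual counts of test vehicles per program aren't in the reports.

```python
# Raw 2017 DMV accident-report counts from the article.
total_reports = 13
cruise_reports = 9

# Cruise's share of all reports.
share = cruise_reports / total_reports
print(f"Cruise share of 2017 DMV reports: {share:.0%}")

# Hypothetical fleet sizes (NOT real figures) to illustrate that
# a bigger fleet can have more raw accidents but a similar per-vehicle rate.
fleet_sizes = {"Cruise": 100, "Other programs combined": 50}
reports = {
    "Cruise": cruise_reports,
    "Other programs combined": total_reports - cruise_reports,
}

for name, size in fleet_sizes.items():
    rate = reports[name] / size
    print(f"{name}: {rate:.2f} reports per test vehicle")
```

Under those made-up fleet sizes, the per-vehicle accident rates come out nearly identical, which is exactly why the raw count alone doesn't tell us whether Cruise's cars are any worse than the rest.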
Another, more interesting trend is that, if the reports are to be believed, the self-driving prototypes are rarely, if ever, responsible for the accidents.
Instead, it looks like people have a weird tendency to crash into GM’s cars.
For example, here’s GM Cruise’s report for an accident that happened earlier this month:
“A Cruise autonomous vehicle (*Cruise AV*), initially operating in autonomous mode, was involved in an incident while traveling northbound as Potrero Avenue turns into Brannan Street. The Cruise AV was proceeding straight through the intersection, which bends to the right, when a black Dodge Charger came up quickly from behind. The Charger was in a left-turn-only lane immediately to the left of the Cruise AV; […]”
Here’s the satellite map of that intersection for visualization:
GM Cruise continued in the report:
“[…] but instead of turning left, the Charger tried to overtake the Cruise AV and to proceed straight as well. At this point, the driver of the Cruise AV took over manual control. As the Charger cut off the Cruise AV, it scraped the Cruise AV’s front left sensor. At the time of the collision, the Cruise AV was traveling at 4 mph, while the Charger was traveling at approximately 12 mph. The Charger fled the scene without exchanging information. The driver of the Cruise AV called the police to report the incident as a hit-and-run, but the police were not dispatched and no report was filed.”
If the account of the event is true, that's a clear example of the driver of the non-autonomous vehicle being responsible. Whether or not the Cruise AV was driving correctly, the driver of the Charger had no right of way if he was in a left-turn-only lane.
And again, it’s actually one of many examples of the drivers of the non-autonomous vehicles being at fault.
A more common type of accident involving autonomous test cars is the rear-end collision, even when the test car is being driven manually. Here's an example from another report, of an accident that happened just a week before the one with the Charger:
“A Cruise AV, operating in conventional mode, was involved in a collision while preparing to turn left from Folsom Street onto 6th Street. The driver of the Cruise AV decelerated and stopped to let a pedestrian clear the crosswalk. A Ford Explorer behind the Cruise AV then impacted the rear passenger-side corner of the Cruise AV. The police were called, but declined to respond citing the lack of any reported injury.”
That's a type of accident more open to interpretation, but generally speaking, the driver hitting from behind is most often found responsible, since drivers need to leave enough space to decelerate and stop in time.
There are several more examples of this type of accident in which the Cruise AV was in autonomous mode.
As I mentioned in the preface to this article, current self-driving prototypes are far from perfect, and they can therefore have seemingly odd driving habits that could have something to do with those accidents.
A part of driving on busy city streets is anticipating the actions of other drivers, and that could prove more difficult for some people when the other vehicle is a self-driving car.
It highlights the problem that we will have during the inevitable transition of the fleet to self-driving and the period during which drivers will have to share the roads with those autonomous driving systems.
People should try to act around self-driving prototypes as they would around any other vehicle: safely and with caution. That's how self-driving cars will learn, and they can't come soon enough, because, as pretty much every statistic shows, humans suck at driving.