Self-Driving Cars Have Twice The Accidents, But Only Because Humans Aren't Used To Vehicles Following The Rules
from the I'm-sorry-you-hit-me,-Dave dept
When Google discusses its latest self-driving car statistics (provided monthly at the company’s website), the company is quick to highlight that across two million miles of autonomous and manual driving combined, its self-driving cars have been involved in only 17 minor accidents, none of them technically Google’s fault. More specifically, these accidents almost always involve Google’s cars being rear-ended by human drivers. But what Google’s updates usually don’t discuss is the fact that quite often, self-driving cars are being rear-ended because they’re being too cautious and not human enough.
And that’s proven to be one of the key obstacles in programming self-driving cars: getting them to drive more like flawed humans. That is, occasionally aggressive when necessary, and sometimes flexible when it comes to the rules. That’s at least been the finding of the General Motors-Carnegie Mellon Autonomous Driving Collaborative Research Lab, which says getting self-driving cars onto the highway can still be a challenge:
“Last year, Rajkumar offered test drives to members of Congress in his lab’s self-driving Cadillac SRX sport utility vehicle. The Caddy performed perfectly, except when it had to merge onto I-395 South and swing across three lanes of traffic in 150 yards (137 meters) to head toward the Pentagon. The car’s cameras and laser sensors detected traffic in a 360-degree view but didn’t know how to trust that drivers would make room in the ceaseless flow, so the human minder had to take control to complete the maneuver.”
And while Google may crow that none of the accidents its cars get into are technically Google’s fault, accident rates for self-driving cars are still twice those of traditional vehicles, thanks in part to humans not being used to a vehicle that fully adheres to the rules:
“Turns out, though, their accident rates are twice as high as for regular cars, according to a study by the University of Michigan’s Transportation Research Institute in Ann Arbor, Michigan. Driverless vehicles have never been at fault, the study found: They’re usually hit from behind in slow-speed crashes by inattentive or aggressive humans unaccustomed to machine motorists that always follow the rules and proceed with caution.”
But with a sometimes-technophobic public quick to cry foul over the slightest self-driving car mishap, car programmers are proceeding cautiously when it comes to programming in an extra dose of rush-hour aggression. And regulators are being even more cautious still. California last week proposed new regulations that would require all self-driving cars to have fully working human controls and a driver in the driver’s seat at all times, ready to take control (which should ultimately do a wonderful job of pushing the self-driving car industry to other states like Texas).
The self-driving car future is coming up quickly, whether or not the philosophical dilemmas of car AI (should a car be programmed to kill the driver if it will save a dozen schoolchildren?) are ever settled. Google and Ford will announce a new joint venture at CES that may accelerate self-driving vehicle construction. And with 33,000 annual fatalities caused by human drivers on our highways, it still seems likely that, overly-cautious rear-enders aside, an automated auto industry will save a significant number of lives over the long haul.