Google's Self-Driving Car Causes First Accident, As Programmers Try To Balance Human Simulacrum And Perfection

from the get-out-of-my-lane,-Dave dept

Google’s self-driving cars have driven millions of miles with only a dozen or so accidents, all of them the fault of human drivers rear-ending Google vehicles. In most of those cases, the drivers either weren’t paying attention or weren’t prepared for a vehicle that was actually following traffic rules. But this week, an incident report filed with the California Department of Motor Vehicles (pdf) indicated that a Google automated vehicle was at fault in an accident for what’s believed to be the first time.

According to the report, Google’s vehicle was in the right-hand turn lane of a busy thoroughfare in Google’s hometown of Mountain View, California, last month, when it was blocked by some sandbags. It attempted to move left to get around the sandbags and, moving slowly, struck a city bus that the car’s human observer had assumed would slow down, but didn’t. All in all, it’s the kind of accident any human being might be involved in on any day of the week. But given the press and the public’s tendency toward occasional hysteria when self-driving technology proves fallible, Google is busy trying to get out ahead of the report.

Google compiles monthly reports for its self-driving car project, and while its February report addressing this specific accident hasn’t been made public yet, Google has been giving the media an early look. In the report, Google notes that, just like humans, its cars can’t always successfully predict another driver’s behavior on the road:

“Our test driver, who had been watching the bus in the mirror, also expected the bus to slow or stop. And we can imagine the bus driver assumed we were going to stay put. Unfortunately, all these assumptions led us to the same spot in the lane at the same time. This type of misunderstanding happens between human drivers on the road every day.

This is a classic example of the negotiation that’s a normal part of driving; we’re all trying to predict each other’s movements. In this case, we clearly bear some responsibility, because if our car hadn’t moved there wouldn’t have been a collision. That said, our test driver believed the bus was going to slow or stop to allow us to merge into the traffic, and that there would be sufficient space to do that.

We’ve now reviewed this incident (and thousands of variations on it) in our simulator in detail and made refinements to our software. From now on, our cars will more deeply understand that buses (and other large vehicles) are less likely to yield to us than other types of vehicles, and we hope to handle situations like this more gracefully in the future.”

Live and learn. Or compute and learn. Whatever. If automated vehicles were going to cause an accident, it’s at least good that this appears to be an experience (don’t give city bus drivers the benefit of the doubt) programmers will learn from. The larger problem is that, like so many new technologies, self-driving cars scare people. As such, automated vehicles can’t just be as good as human drivers; they’ll have to be better than human drivers before people become comfortable with widespread adoption of the technology.
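
Google’s stated fix, teaching the car that large vehicles are less likely to yield, suggests a simple mental model: a per-class prior on yielding, folded into the merge decision. The sketch below is purely illustrative (the classes, probabilities, and function names are invented here, not taken from Google’s software), but it captures the shape of the change:

```python
# Illustrative sketch only: a per-class prior on how likely another
# vehicle is to yield during a merge. Classes and numbers are invented;
# a real planner would use learned models, not constants.

YIELD_PRIOR = {
    "car": 0.70,
    "truck": 0.35,
    "bus": 0.30,  # the post-accident refinement: expect buses not to yield
}

def should_merge(vehicle_class: str, gap_seconds: float,
                 min_gap: float = 2.0, threshold: float = 0.6) -> bool:
    """Merge if the gap is already safe, or if the other vehicle is
    judged likely enough to yield and open one up."""
    if gap_seconds >= min_gap:
        return True
    return YIELD_PRIOR.get(vehicle_class, 0.5) >= threshold

# With a tight gap, this planner still counts on a car yielding,
# but no longer counts on a bus doing so.
print(should_merge("car", 1.0))
print(should_merge("bus", 1.0))
```

The point is only that the same geometry can produce a different decision once the class-conditioned expectation changes.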

By any measure self-driving cars have been notably safer than most people imagined, but obviously there’s still work to do. Recent data from the General Motors-Carnegie Mellon Autonomous Driving Collaborative Research Lab suggests that self-driving cars have twice the accidents of human-driven vehicles — but again, largely because people aren’t used to drivers that aren’t willing to bend the rules a little bit. Striking an acceptable balance between having an automated driver be perfect and having it be more human-like is going to be a work in progress for some time.

Companies: google


Comments on “Google's Self-Driving Car Causes First Accident, As Programmers Try To Balance Human Simulacrum And Perfection”

85 Comments
Paul Renault (profile) says:

Re: Re: Re: Re:

Uh, there’s such a law in the ‘States?

The car/driver changing lanes is supposed to ensure that a lane change can be performed safely, including making sure that there’s enough room. The lane-changer cannot assume that the other driver has seen them; that driver could be keeping an eye on some other concern that the lane-changer can’t see, say in the other lane.

Not to mention that in some jurisdictions (Québec, to name one), buses always have priority, even when they’re pulling into your lane. Vous avez été avertis (you have been warned).

That One Other Not So Random Guy says:

Re: Re: Re:3 Re:

It’s called lane splitting, and while you’re all sitting in bumper-to-bumper traffic, I get to glide by and actually get somewhere. (Try the 4th on the parkway 🙂 ) Lemme guess… you’re that guy that sits in the middle lane doing 55 while traffic passes you on the left and right. Or… that guy that does 54 in the right lane until I try to pass, then speeds up to 80. Or you’re that tool that pulls closer to the line while some “idiot” on a bike tries to pass. Yeah buddy. I’ve seen your type… usually in a blur when I pass you.

Mason Wheeler (profile) says:

Re: Re: Re:4 Re:

No, I’m the guy who tries to drive as safely as possible even when surrounded by idiots who think they own the road.

Consider three things:

1) Lane splitting puts three motorists (the biker and the two cars he’s going between) in a position where they have a greatly-reduced margin of error.
2) To err is human.
3) When the inevitable does eventually happen, (see first two points,) you’re the only one of the three who is not protected by a couple tons of metal armor.

If that realization does not put you off lane splitting forever, then yes, you are an idiot.

Anonymous Coward says:

Re: Re: Re:3 Re:

Some motorcyclists claim that lane splitting is actually safer because it lets motorcycles get away from cars more easily by passing them, as opposed to pacing alongside them. Having cars all around a motorcycle seems unsafe, especially if the motorcycle is in a blind spot; being smaller, motorcycles are harder to see, so they don’t want to sit in a blind spot. Passing the cars and pulling ahead of them is arguably safer: riders want to be as far away from other cars as possible, and forcing them to drive by the same rules as a car keeps them packed in with the other cars, which is arguably less safe for them.

Anonymous Coward says:

Re: Re: Re:5 Re:

I agree that lane splitting under the wrong conditions makes things more dangerous. What I’ve noticed motorcyclists do to minimize the risk is lane split between the two cars that give them the most space: for instance, if you’re too close to the car on your right, they will split to your left (though not all motorcyclists make that effort equally). If traffic isn’t moving, that reduces the chance you’ll sideswipe the motorcycle as it passes you, since you probably aren’t changing lanes. Perhaps a motorcycle could honk its horn as it passes cars (though their horns aren’t very loud, I don’t think). But still, I agree that it increases the danger in your scenario. There’s a balance between efficiency and risk. OTOH, the less time a motorcycle spends on the road, the less chance it has of getting in an accident: getting to the destination faster means less time on the road. A motorcycle stuck in one place in traffic may be more at risk in some regards as well.

I hate having a motorcycle stuck in a lane around me when there is stop-and-go traffic and cars everywhere. It seems like the potential for hitting a motorcycle from behind, when it’s directly in your path, is greater than when it’s alongside you. There are a lot of arguments to be made on both sides; it might be useful to look at some data on how most motorcycle accidents occur and what the laws are in the areas where they occur.

But as with car drivers, there are good and bad motorcycle riders. I don’t think you can uniformly say that lane splitting is always bad, just like you can’t uniformly say that passing another car is bad. In some situations it may be more dangerous than in others, and it’s up to the rider to make those decisions. Some people are more careful than others when they do it.

John Fenderson (profile) says:

Re: Re: Re:6 Re:

My problem with “lane splitting” being legal (fortunately, I live in a state where it isn’t) has more to do with predictability. The most important thing that traffic laws give us is predictability: you have a very good idea what the other traffic is going to do.

Lane-splitting is inherently a surprising thing: it means that every motorcycle on a multilane road now becomes a random factor that other drivers have to pay extra attention to, just in case they start driving where there are no lanes.

It violates the #1 driver safety rule: never do anything surprising.

Michael (profile) says:

Re: Re: Re:3 Re:

It is a failure to yield violation any time you are required to yield the right-of-way but fail to.

An example of this would be two drivers arriving at a 4 way stop at the same time, the person to the left is required to yield to the person to their right. Also, hitting someone while driving through a “yield” sign would easily qualify.

I am sure there are a number of other situations, but in the case of the Google car, the Google car failed to yield to the bus, not the other way around.

Mason Wheeler (profile) says:

Re: Re: Re:4 Re:

Also, hitting someone while driving through a “yield” sign would easily qualify.

Yield signs are a bit of a sore spot for me at the moment, because in Pennsylvania I see them all over the place in places where they should not be: at the end of on-ramps.

If you’ve ever taken a driver’s ed course, you’ll remember that the purpose of an on-ramp is specifically to give you space to get up to speed and merge onto the highway safely. But around here, the civil engineers appear to have failed to understand that: instead of continuing for a reasonable distance (i.e., at least half a mile), the lanes provided by most on-ramps vanish right after they meet the main highway, with a big YIELD sign there. That’s dangerous (it’s only safe to merge if you’re going approximately the same speed as traffic in the lane you’re merging into, and yielding can potentially mean coming to a complete stop with no more room to accelerate!) and defeats the entire purpose of having the on-ramp in the first place.

Mind you, I’ve got nothing against Yield signs used well. They have a legitimate purpose. I just don’t see very many of them used right anymore.

Mason Wheeler (profile) says:

According to the report, Google’s vehicle was in the right-hand turn lane of a busy thoroughfare in Google’s hometown of Mountain View, California, last month, when it was blocked by some sandbags. It attempted to move left to get around the sandbags and, moving slowly, struck a city bus that the car’s human observer had assumed would slow down, but didn’t. All in all, it’s the kind of accident any human being might be involved in on any day of the week.

Where in the world are you from, where that looks like an everyday occurrence? I’ve lived all over the US and also outside it, and I don’t believe I’ve ever seen sandbags in the road obstructing traffic, particularly in the middle of “a busy thoroughfare”!

Gwiz (profile) says:

Re: Re:

Where in the world are you from, where that looks like an everyday occurrence? I’ve lived all over the US and also outside it, and I don’t believe I’ve ever seen sandbags in the road obstructing traffic, particularly in the middle of “a busy thoroughfare”!

I’ve seen sandbags in the middle of the road where I live more than once. They are used to weigh down temporary construction signs and sometimes do not get picked up with the sign.

Anonymous Coward says:

Re: Re: Re: Re:

Moral laws are not necessarily part of either of those definitions, since they often operate on a cultural basis. As for laws, they mostly evolve through branching: you start with a simple law as a basis and then keep piling on branches to make the simple law work in every aspect of reality. Most of today’s lawmaking is so complex that it requires a large department of economists and lawyers to evaluate if/how something can be made into law.

If something is enforced to make money, it has to be extremely clear where the lines are drawn, and thought has to be given to where the money goes, since unclear enforcement and conflicts of interest are primary recipes for corruption down the line.

Lawrence D’Oliveiro says:

Re: Sure, program the car to DELIBERATELY break the law. That will go over well.

That would lead to an interesting situation indeed. Which do you think the regulatory authorities would look more kindly on: programming the car to

  1. Break the law occasionally for the sake of safety, or
  2. Stick scrupulously to the official rules, compromising safety for other road users?

Frankly, I can see this becoming an ideological issue, where one side stubbornly insists on strict enforcement of certain laws, refusing to acknowledge accumulating evidence of the wrongness of that course. Does that sound familiar?

Mason Wheeler (profile) says:

Our test driver, who had been watching the bus in the mirror, also expected the bus to slow or stop. And we can imagine the bus driver assumed we were going to stay put. Unfortunately, all these assumptions led us to the same spot in the lane at the same time. This type of misunderstanding happens between human drivers on the road every day.

Sorry, but no. I like Google, but I’ve gotta fault them on this. Any competent driver knows to never assume another vehicle will not do something stupid.

When I’m about to change lanes and there’s any possibility of another vehicle entering the same space, I put my turn signal on and turn the wheel just a tiny bit, drifting over slowly so the other driver can get the message. Then I watch the other car, and if they don’t clearly defer to me and make room for me to move in, I abort, pull back to the center of my lane, cancel the turn signal… and then pull in behind them and honk at them for being a jerk who doesn’t know to defer to someone with a turn signal on.

But I never just go and assume they’ll make room for me before they actually make room for me. That’s just asking to get in a wreck, as we see here.

jupiterkansas (profile) says:

Re: Re:

Self-driving cars work by making assumptions about the world around them. This time an assumption was in error, and the human observer, making the same assumption, didn’t stop the vehicle to prevent an accident.

But the really important thing here is that they could analyze the crash (and thousands of variations on it) to prevent a similar accident from occurring. Unlike human drivers, who will cause the same accidents over and over, self-driving cars can learn not to repeat past mistakes.
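
That “thousands of variations” workflow can be pictured as a parameter sweep over a simulated incident. The toy sketch below is my own illustration (the collision rule is a stand-in for a real physics simulator, and the numbers are invented): it replays one merge scenario across a grid of bus speeds and gap sizes and tallies the outcomes.

```python
import itertools

def collides(bus_speed_mph: float, gap_m: float) -> bool:
    # Toy stand-in for a simulator: the merge fails when the gap is
    # small and the bus keeps moving instead of slowing down.
    return gap_m < 3.0 and bus_speed_mph > 5.0

# Sweep the scenario parameters, the way a regression suite would.
bus_speeds = [0, 5, 10, 15, 20]
gaps_m = [1.0, 2.0, 3.0, 4.0]

crashes = sum(collides(s, g) for s, g in itertools.product(bus_speeds, gaps_m))
print(f"{crashes} collisions out of {len(bus_speeds) * len(gaps_m)} variations")
```

Each variation that ends in a collision becomes a test case the refined software has to pass before it ships.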

Mason Wheeler (profile) says:

Re: Re: Re:

True enough. I just find this a particularly stupid assumption to make, and it boggles the mind a little to think that both the programmer and the driver made the same bad assumption. You never assume someone’s going to make room for you before they actually make room for you. (Especially when they’re bigger than you. I could go on a rant here about human psychology and bullying, but… do I even have to?)

Gwiz (profile) says:

Re: Re:

…and then pull in behind them and honk at them for being a jerk who doesn’t know to defer to someone with a turn signal on.

Defer? In my state it is the responsibility of the person changing lanes to make sure the lane is clear. There is nothing in our laws that says other motorists have to “defer” to your turn signal, adjust their speed, or even make room for you.

Mason Wheeler (profile) says:

Re: Re: Re:

It’s not about the law; it’s about courtesy. If someone who’s already ahead of you needs to pull in, civilized drivers drop back a little and let them. Jerks continue forward as fast as they can, and evil California troll drivers see the signal and pull forward just far enough to be in their blind spot and then stay there, endangering the lives of everyone involved.

Gwiz (profile) says:

Re: Re: Re:3 Re:

Anybody changing lanes is an idiot?

No. People who drive as if their destination and their time were infinitely more important than anyone else’s on the road are a pet peeve of mine. Using your turn signal as if it were a command that others should defer to, rather than a signal of your intentions, is indicative of that sort of driver to me. Those are the idiots I was referring to.

nasch (profile) says:

Re: Re: Re:4 Re:

Sure, but in heavy traffic a signal can also be used to ask someone to let you move over. You don’t have to, but it’s a dick move not to if you can do it easily. Everyone would be better off if we were a little more ready to yield (even when we don’t have to) and help those around us have a nicer experience in traffic.

Gwiz (profile) says:

Re: Re: Re:5 Re:

Sure, but in heavy traffic a signal can also be used to ask someone to let you move over. You don’t have to, but it’s a dick move not to if you can do it easily.

I agree and I do exactly that when it feels like I was asked. It’s when I feel like I was commanded or just automatically expected to yield that I tend to be a bit of an asshole. Courtesy is a two-way street.

It might be that I really am an asshole, since I often have to resist the urge to park in handicapped spaces just to watch handicapped people make handicapped faces.

Anonymous Coward says:

Re: Re: Google changes rules

This. I wish pedestrians in Boston understood this. The law may say that vehicles MUST yield to pedestrians crossing a street (whether in a legal crosswalk or not is another story)… but most people seem to forget that Newton trumps the MA State legislature every time.

JoeCool (profile) says:

Re: Re: Google changes rules

There are more levels to this rule:
1 – Bigger vehicles always have the right of way.
2 – More expensive cars always have the right of way.
3 – Piece of crap cars always have the right of way.

Why? Number 1 is obvious: they don’t let folks in because they don’t have to; they’ll win any kind of collision. Number 2 is because all expensive cars come with a certificate that says the owner can break any laws they feel like because they have money. And finally, number 3 is because the guy in the piece of crap just doesn’t care: he paid jack for the car, and jack is what you’ll get from him in any kind of accident.

Anon says:

Re: Google changes rules

Yes, bigger is better, might makes right, etc. You have to drive as if everyone else is an asshole, because so many of them are. When there’s a merge, you’re assuming someone will be courteous. They don’t have to be, I’ve seen many instances where dickweeds seem to deliberately attempt to block you from the on-ramp merge, for example.

nasch (profile) says:

Re: Re: Google changes rules

They don’t have to be, I’ve seen many instances where dickweeds seem to deliberately attempt to block you from the on-ramp merge, for example.

Not that that doesn’t happen, but don’t be too quick to assume malice. The person could be completely oblivious to your presence. It’s amazing how often drivers are unaware of the vehicles around them.

JoeCool (profile) says:

Re: Re: Re: Google changes rules

Google needs to give particular emphasis to city buses, as in many cities they’re exempt from many of the laws the rest of us follow. Example: the city of Houston, TX. City buses don’t need insurance, always have the right of way, and can park in the street when the driver feels it’s time to eat, or is ahead of schedule, or for whatever reason the driver feels like. I took the bus there for nearly eight years, and saw things that would make your hair curl.

Anonymous Coward says:

Needs of the many or needs of the few?

So in my processing, I would compute that the larger object always wins and therefore err on the side of caution. Also, living in a large city I would never assume the other vehicle will allow me to get over. In fact, I would assume they would not let me over, especially if I signal first. It seems in big cities that a turn signal is an indicator to the guy in the other lane to close ranks and block you from getting over.

Anonymous Anonymous Coward says:

Not to be trusted

It will be at least a decade or four before I trust anything Carnegie Mellon has to say. And that is only if they stop dissembling. Statistics put out without context seem like part of a trend.

Their behavior around the Tor debacle was duplicitous and disgraceful. Not that they did the research, but that they denied it. They should have done the research while keeping Tor informed of their findings in near real time. If that lost them the government money, so be it.

Anonymous Coward says:

The unspoken bargain.

Imagine yourself driving along a busy freeway when the car beside you signals to enter your lane. Normally you might make room for them, but today the road is too thickly packed to maintain a safe distance while letting the car in. Ignoring your predicament, the signaling car squeezes in front of you, then quickly exits at the upcoming off-ramp. Being cut off is never fun, or safe, or even legal, but you can understand this driver’s problem and forgive the minor risk once you see they merely needed to make the exit.

These sorts of minor incidents occur every day, and most of us are more than capable of staying safe through them. We all understand that traffic laws exist for our safety, but we also know these laws do not cover every conceivable situation we encounter. Even if a human can drive lawfully 99% of the time, occasionally we all need to break a rule to stay safe or prevent an accident. I do not believe society could function if every citizen blindly obeyed every rule and law to the last letter.

In order for autonomous vehicles to safely share the road with humans, these computers will have to be programmed to break a traffic law from time to time. I imagine this will be the hardest obstacle to overcome. How can we teach an AI to take “safe” risks?

Anonymous Coward says:

The tension between expected behavior and legally proper driving shows just how disconnected our traffic laws have become from their ostensible purpose (promoting safety).

I actually have a lot more sympathy for the self-driving car in this case (bus should have yielded but did not)* than I do when it gets rear-ended while driving 25 when the speed limit is 45.

* – I formed my opinion on this primarily from the Ars Technica article, which described the lane positions somewhat differently.

Anonymous Coward says:

Re: Re:

Why form your opinion from the Ars article when you have the accident report right here?

According to the report, at one point the car “moved to the right-hand side of the lane to pass traffic in the same lane”. Also, the bus was doing all of 15 MPH and was struck in the side. Perhaps that will inform your opinion as to what the car was doing vs what the bus was doing, and who was at fault.

Anonymous Coward says:

While driving people also communicate by hand gestures, facial expressions, and horn honking among other methods. They communicate both with other drivers and pedestrians. These are things that automated vehicles may not be able to pick up on or interpret very well whereas humans are generally good at understanding each other.

Also, if someone is on their cell phone, surrounding drivers may see that and compensate by being extra careful. If someone sees another car swerving, a human may conclude either that the car ahead is swerving around something, and so watch for a potential obstacle ahead, or that the other driver is perhaps intoxicated, and compensate accordingly. How well do automated cars do this?

Also, humans tend to avoid sitting in the blind spots of other vehicles for too long, especially large ones. They’re pretty good at knowing whether they’re in a blind spot (if you can’t see the other driver’s face or its reflection in a mirror, they can’t see you). How well do automated cars do this?

And if a car in an adjacent lane stops it could be that someone is crossing the street so you should be more careful of pedestrians. People tend to know this. Do automated cars?

Things like accident avoidance are very important things to consider when driving and not just how well a car can follow a set of rules to avoid being legally at fault.

Ninja (profile) says:

Re: Re:

These are things that automated vehicles may not be able to pick up on or interpret very well whereas humans are generally good at understanding each other.

Hmmm. I must disagree. Besides, autonomous cars will probably be able to talk to each other in milliseconds and notice the “honks/expressions” much faster among themselves.

How well do automated cars do this?

As they learn to identify distracted drivers this will most likely improve. The factor that makes driving unsafe and causes accidents is actually the human factor. We shouldn’t be making predictions; we should just follow the rules, drive defensively, and give preference to those ahead of us.

Blind spots are not an issue for automated cars; their blind spots are only as large as their sensors’ limits. As for dealing with human drivers’ blind spots, the cars can probably evaluate those much better than humans can. Maybe the cars just haven’t learned, or haven’t been taught, this yet.

And if a car in an adjacent lane stops it could be that someone is crossing the street so you should be more careful of pedestrians. People tend to know this. Do automated cars?

Actually, people are incredibly bad at this. I try to signal with my arm when I’m in the right lane to help, but I’ve seen accidents nearly happen because other drivers simply couldn’t care less (or were distracted enough). I now think it’s better to just keep the traffic flowing until there’s a gap where drivers can see from afar that pedestrians are trying to cross.

Things like accident avoidance are very important things to consider when driving and not just how well a car can follow a set of rules to avoid being legally at fault.

That’s what they are doing here. Note that both the car’s automated system AND the person inside, who could have taken the controls if needed, thought the bus would slow down. The vehicle was moving slowly, so it shouldn’t have been a problem for the human inside to take control. It’s not the AI’s fault; it’s just the human factor.

Kode (profile) says:

What’s missing from this story is the fact that the google car *didn’t* change lanes, it tried moving back to the centre of its own lane.

Apparently the lanes are wide enough to allow 2 cars if the car turning right hugs the right of the lane, which the google car was programmed to do as a courtesy to other road users (not blocking traffic for people who aren’t turning right).

Ninja (profile) says:

I’m assuming the autonomous car signaled its intention with the proper lights, and that the system judged the distance sufficient for the bus driver to notice the signal and reduce his speed to allow a smooth merge. The real question is whether the driver noticed the signal or was distracted (paying attention to other elements of the traffic, for instance; I’m not saying the driver is at fault), and if he did notice it, why he didn’t allow the car to merge. Those of us who drive know there are plenty of assholes who simply ignore you, and actually speed up to prevent you from merging.

The cars need to account for this assholish behavior to avoid collisions: maybe merge slowly and give full priority to human drivers, or something like that. You can only program them to act strictly within the law if there aren’t humans driving in the vicinity. Obviously, if there were only autonomous cars on the road, they could communicate among themselves, which would make things even easier.

Mason Wheeler (profile) says:

Re: Re:

Assuming the cars have a way of communicating with other autonomous cars, the problem you’re describing is equivalent to a programming problem known as a “livelock”. There are several well-understood solutions for resolving a livelock, which basically boil down to (much more formalized versions of) “have them flip a coin to decide which one gets to go first.”

This is not actually a real problem.
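
The coin-flip resolution can be sketched in a few lines: two agents contending for the same gap each draw a random number, and the higher draw proceeds. This is a toy illustration of randomized tie-breaking, not any real vehicle protocol, and the agent names are invented:

```python
import random

def resolve_conflict(rng: random.Random) -> str:
    """Randomized tie-breaking between two agents contending for the
    same gap: each draws a number; the higher draw goes first."""
    while True:
        a, b = rng.random(), rng.random()
        if a != b:
            return "car_a" if a > b else "car_b"
        # Equal draws (vanishingly rare with floats): redraw.

print(resolve_conflict(random.Random(42)))
```

Ethernet’s exponential backoff rests on the same idea: injected randomness guarantees that a symmetric “after you… no, after you” loop terminates quickly.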
