Engineers Say If Automated Cars Experience 'The Trolley Problem,' They've Already Screwed Up

from the I'm-sorry-I-can't-do-that,-dave dept

As self-driving cars inch closer to the mainstream, a common debate has surfaced: should your car be programmed to kill you if it means saving the lives of dozens of other people? This so-called "trolley problem" has been debated at universities for years, and while most consumers say they support automated vehicles that prioritize the lives of others on principle, they don't want to buy or ride in one, raising a number of thorny questions.

Should regulations and regulators focus on a utilitarian model where the vehicle is programmed to prioritize the good of the overall public above the individual? Or should self-driving cars be programmed to prioritize the welfare of the owner (the "self protective" model)? Would companies like Google, Volvo and others prioritize worries of liability over human lives when choosing the former or latter?

Fortunately for everybody, engineers at Alphabet's X division this week suggested that people should stop worrying about the scenario, arguing that if an automated vehicle has run into the trolley problem, somebody has already screwed up. According to X engineer Andrew Chatham, they've yet to run into anything close to that scenario despite millions of automated miles now logged:
"The main thing to keep in mind is that we have yet to encounter one of these problems,” he said. “In all of our journeys, we have never been in a situation where you have to pick between the baby stroller or the grandmother. Even if we did see a scenario like that, usually that would mean you made a mistake a couple of seconds earlier. And so as a moral software engineer coming into work in the office, if I want to save lives, my goal is to prevent us from getting in that situation, because that implies that we screwed up."
That automated cars will never bump into such a scenario seems unlikely, but Chatham strongly implies that the entire trolley problem scenario has a relatively simple solution: don't hit things, period.
"It takes some of the intellectual intrigue out of the problem, but the answer is almost always ‘slam on the brakes’,” he added. “You’re much more confident about things directly in front of you, just because of how the system works, but also your control is much more precise by slamming on the brakes than trying to swerve into anything. So it would need to be a pretty extreme situation before that becomes anything other than the correct answer."
It's still a question that needs asking, but with no obvious solution on the horizon, engineers appear to be focused on notably more mundane problems. For example, one study suggests that while self-driving cars get into twice as many accidents as manually controlled vehicles, those accidents usually occur because the automated car was too careful -- and didn't bend the rules a little like a normal driver would (getting rear-ended for being too cautious at a right on red, for example). As such, the current problem du jour isn't some fantastical scenario involving an on-board AI killing you to save a busload of crying toddlers, but how to get self-driving cars to drive more like the inconsistent, sometimes downright goofy, and error-prone human beings they hope to someday replace.

Reader Comments


  • Anonymous Coward, 24 Aug 2016 @ 10:49pm

    I sure hope they're doing some of their testing on wet roads and icy roads, you know, the ones where slamming the brakes is NOT always a viable option. But I'll bet ya they aren't.

    • David (profile), 25 Aug 2016 @ 3:39pm

      Re: Dumb question.

      Are you aware of anti-lock brakes or auto-correcting a slide? Both of those already exist in manually driven cars. There are fewer requirements today to do that in software as the vehicle already does it.

      Thinking is a great idea, but I'll bet ya you aren't doing it.

    • JMT (profile), 25 Aug 2016 @ 5:18pm

      Re:

      "I sure hope they're doing some of their testing on wet roads and icy roads, you know, the ones where slamming the brakes is NOT always a viable option."

      Actually with modern ABS and stability control systems that's exactly what you should do. These systems are much better than most drivers.

      "But I'll bet ya they aren't."

      Yes, you're much smarter than they are, which is why you thought of it and they didn't...

  • JoeCool (profile), 24 Aug 2016 @ 11:08pm

    He's right.

    I taught driver's ed and defensive driving for many years, and things like the Trolley Problem stem from errors people made long before, errors that are easily avoided if you think ahead. An example I loved to give classes: You're driving to work in the morning and run into a bank of thick fog. Do you a) slow down and get rear-ended by someone behind who isn't going to slow down, or do you b) maintain your speed and hope there's nothing ahead of you? The correct answer is c) don't drive through the fog. Get off the freeway onto a surface street where you can go slow through the fog without getting rear-ended, or d) pull off the road (maybe at a Dunkin Donuts) and wait out the fog, or e) realize the fog occurs every damn morning this time of year and leave early before it forms. See, being caught in the fog and having to make a life-or-death decision means you blew it LONG before you ever got to the fog, and that there are many more answers than kill yourself or kill others.

    • Anonymous Coward, 24 Aug 2016 @ 11:52pm

      Re: He's right.

      He is right, assuming that the car is working properly. However, if, for example, that stone you just went over has taken out a brake line, the car can no longer stop in the distance known to be clear.
      Also, when you run into that bank of fog without being able to leave the freeway before you reach it, and you cannot predict it, you now have to decide on your speed until you can get to safety. (Hint: people drive on roads they have never traveled before, where they do not know the local weather and other anomalies.)

    • Anonymous Coward, 25 Aug 2016 @ 6:48am

      Re: He's right.

      "Never drive through fog" is an entirely unrealistic answer.

      • JoeCool (profile), 25 Aug 2016 @ 9:47am

        Re: Re: He's right.

        No it's not. You ALWAYS have the pull-off-the-road option. ALWAYS. You just choose not to use it, and blindly drive through the fog hoping for the best. And most of the time you're right! And once in a while, you're part of a sixty-car pile-up with a dozen people dead.

        • Anonymous Coward, 25 Aug 2016 @ 10:57am

          Re: Re: Re: He's right.

          When you hit fog on a strange road, you do not have that option until you can find a lay-by or other pull-off. When you are struggling to keep going at 30mph because of lack of visibility, and idiots who knew the road are coming up and overtaking you at 60+, and it's only a two-lane road, the five miles or so it took to find a lay-by were nerve-wracking to say the least. Due to hedges, pulling off and stopping without finding a lay-by was not an option, and slowing enough to use a field gate was much too risky, especially as there was no guarantee that using one would clear the road, and stopping to find out was much too risky.
          The visibility was such that someone coming up on me at 60+ could see my tail light in time to overtake, but not in time to slow down to my speed, and not knowing the road, I did not know where, if anywhere, there were bends that required slower speeds.

          • Anonymous Coward, 25 Aug 2016 @ 12:01pm

            Re: Re: Re: Re: He's right.

            There will be no self driving cars allowed in the UK because there are just too many issues with it. Got it.

            • Anonymous Coward, 25 Aug 2016 @ 12:27pm

              Re: Re: Re: Re: Re: He's right.

              My comment was not against self-driving cars, but rather against the statement that you can avoid driving in bad fog by pulling off the road. In fog like that, self-driving cars, so long as all cars are self-driving, could be much safer, as they could all slow down to a sensible speed, like 15-20 mph, since self-driving cars are still limited by what they can see.

          • JoeCool (profile), 25 Aug 2016 @ 12:49pm

            Re: Re: Re: Re: He's right.

            If you did find yourself in such a situation, clearly the best thing to do is keep your speed at a level where you can see, but not so low as to get rear-ended, and to pull off at the VERY FIRST place to do so. It's not perfect, but like you point out, you sometimes find yourself in imperfect situations.

          • Anonymous Coward, 25 Aug 2016 @ 1:27pm

            Re: Re: Re: Re: He's right.

            Turn around. That's also an option. Exposure time to someone running up on you, when the turn is properly executed, is quite short. Certainly shorter than the time to cover 5 miles @ 30 mph.

            • Anonymous Coward, 25 Aug 2016 @ 2:17pm

              Re: Re: Re: Re: Re: He's right.

              Two lanes, between hedges, with potential traffic coming up at 50-60 miles an hour unable to see you in time to start braking: just stopping without being able to get off the road was suicide, never mind attempting a three-point turn.

              • JoeCool (profile), 25 Aug 2016 @ 5:54pm

                Re: Re: Re: Re: Re: Re: He's right.

                Um - iffen I were on a road like that, I'd NOT get on a road like that. I'd find another route, even iffen it were twice as long. Half of safe driving is planning routes that aren't virtual suicide. Need to make a left turn at an intersection where a dozen people are killed turning left every year? DON'T! Go a couple blocks further down where you have a protected left turn and they've only killed one person in the last decade.

          • Andrew D. Todd, 25 Aug 2016 @ 6:56pm

            Re: Re: Re: Re: He's right.

            The railroads have a similar problem. The stopping distance of a train is so great that an engineer quite frequently cannot see to his stopping distance. So they have track signals, which automatically detect the presence of a train, and report it down the line a couple of miles, to activate warning lights mounted beside the track. On the whole, the system works.

    • jupiterkansas (profile), 25 Aug 2016 @ 8:37am

      Re: He's right.

      I always choose option D whether there's fog or not.

  • ECA (profile), 24 Aug 2016 @ 11:19pm

    dIDNT FIGURE THIS UNTIL now

    Ok..
    Its taken me awhile to figure liability out from an automated car..
    See..IF' Im not driving, WHO pay insurance, who is responsible for this car? And IF' Im paying a small fortune for a car, WHY not just get a driver, for LESS..

    The Thing that will happen, is UPON buying said type of vehicle, you will be introduced to a LIST.
    This list will BE' the programming the car drive on..

    Drive threw the GROUP or kill the passenger?
    Speed if the traffic is slow?
    Speed if it is allowed?
    Drive CLOSE to large vehicles?
    Maintain speed only in CITIES?

    That is the only way they can transfer Liability, and responsibility..

    I will wait and record, the FIRST good rainy night, on a back road, where the Street lines Arnt really there...

    • Anonymous Coward, 25 Aug 2016 @ 6:50am

      Re: dIDNT FIGURE THIS UNTIL now

      I'm sure the google engineers developing the fucking self-driving car will be absolutely shocked to find out about the existence of backroads and rain.

    • Anonymous Coward, 25 Aug 2016 @ 7:33am

      Re: dIDNT FIGURE THIS UNTIL now

      Perhaps you should muse on the correct usage of the shift button before you put your brain power towards more difficult ideas?

  • oliver, 24 Aug 2016 @ 11:21pm

    FUZZY LOGIC

    Hey

    Fuzzy Logic to the rescue!
    There is nothing that fuzzy logic can't fix.
    :-)

  • Anonymous Coward, 24 Aug 2016 @ 11:47pm

    "It takes some of the intellectual intrigue out of the problem, but the answer is almost always ‘slam on the brakes’,” he added.

    Better keep those cars in California then, because that's the LAST thing you do with snow or ice!

    • Anonymous Coward, 24 Aug 2016 @ 11:59pm

      Re:

      "Better keep those cars in California then, because that's the LAST thing you do with snow or ice!"

      Newsflash, buddy - cars have ABS now. Braking on snow and ice is fine. As is steering around obstacles...

      • Anonymous Coward, 25 Aug 2016 @ 12:27am

        Re: Re:

        Newsflash to you too, buddy... anything that man has ever created, or ever will create, is guaranteed to fail at some point. Where I live, I have 4-6 months of snow and ice a year. Over 30 years of driving and I've never had an accident. I will ALWAYS trust myself more than ABS or a self-driving car. You can have them both.

        • Anonymous Coward, 25 Aug 2016 @ 7:19am

          Re: Re: Re:

          Yeah, you not crashing in 30 years is not really statistically significant. About 37,000 people die in car accidents each year in the US. According to a report to Congress, 6.8% of road accidents have some kind of vehicle deficiency as a cause (mostly bad tires, i.e. poor maintenance by the owner); 88.2% had no defects at all contributing to the crash. Yes, your ABS might fail, but you are far, far more likely to fail yourself. Given the statistics above, a tripling of technical failures and a halving of the "human errors" as a consequence of autonomous cars would save in the ballpark of 15,000 lives each year in the US alone.

          Humans are incredibly bad at understanding risks, and we generally accept orders of magnitude higher risks if we feel a sense of control. Just face it: a computer will be able to focus on the entire surroundings, all the time. It can optimize brake torque on each wheel before you even register that something is on the road, and it never gets tired, annoyed or distracted. It will absolutely not be perfect, but if the goal is to save lives, you only need to beat humans, who are terrible drivers! I really don't get the "must be perfect" argument.

          Report: https://crashstats.nhtsa.dot.gov/Api/Public/ViewPublication/811059
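A back-of-the-envelope sketch of that arithmetic (the 37,000 and 6.8% figures come from the comment above; the tripling and halving are its hypothetical scenario, not data, and the result lands near 12,000, in the same rough ballpark as the claimed 15,000):

```python
# Rough check of the scenario above: ~37,000 US road deaths per year,
# 6.8% involving a vehicle deficiency, the rest attributed to other
# (mostly human) causes. Hypothetically triple the technical-failure
# share and halve the human-error share under autonomy.
deaths_per_year = 37_000
technical_share = 0.068

technical = deaths_per_year * technical_share            # ~2,516
human_and_other = deaths_per_year - technical            # ~34,484

after_autonomy = 3 * technical + 0.5 * human_and_other   # ~24,790
lives_saved = deaths_per_year - after_autonomy           # ~12,210

print(round(lives_saved))
```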

        • Anonymous Coward, 25 Aug 2016 @ 7:34am

          Re: Re: Re:

          You've completely missed his point. If you have to brake in an emergency and have ABS, you push down hard on the brakes, the same as you would on dry pavement. You do not "pump" them. Again, we are talking about an EMERGENCY, not something you'd do in normal driving, regardless of road conditions.

        • JMT (profile), 25 Aug 2016 @ 5:23pm

          Re: Re: Re:

          "I will ALWAYS trust myself more than ABS..."

          Unless you are a very experienced driver who regularly practices extreme braking (e.g. a race driver), your trust is misplaced.

      • Quiet Lurcker, 25 Aug 2016 @ 5:12am

        Re: Re:

        Sorry, but traction control and ABS do not confer immunity to the laws of physics or to human stupidity. They're a nice emergency backstop (no pun intended) but my take is, if I'm driving so poorly that I need the assistance, I have no business behind the wheel for that trip.

      • Anonymous Coward, 25 Aug 2016 @ 7:52am

        Re: Re:

        And I've seen several cars in the ditch along the side of the road with ABS every winter. We add safety features but with every feature people think they can take that extra risk. People think they can just kick into 4wd during a monsoon and take corners as if it's a sunny day.

  • Lalo Martins (profile), 25 Aug 2016 @ 12:07am

    Nobody wants the car to drive poorly

    "how to get self-driving cars to drive more like the inconsistent, sometimes downright goofy, and error-prone human beings"


    This is, IMO, inaccurate. What we want instead is for them to get better at allowing for the goofy humans (like seeing that the human behind you is being a bit too eager in the right-on-red situation).

  • Anonymous Coward, 25 Aug 2016 @ 12:07am

    "one study suggests that while self-driving cars do get into twice the number of accidents of manually controlled vehicles, those accidents usually occur because the automated car was too careful -- and didn't bend the rules a little like a normal driver would (rear ended for being too cautious at a right on red, for example). As such, the current problem du jour isn't some fantastical scenario involving an on-board AI killing you to save a busload of crying toddlers, but how to get self-driving cars to drive more like the inconsistent, sometimes downright goofy, and error-prone human beings they hope to someday replace."

    No, let the cars be careful. The humans will eventually adjust, especially as the self drivers become more common.

  • James, 25 Aug 2016 @ 12:19am

    Fixed: "those accidents usually occur because the driver of the non-automated car was too reckless"

    • Anonymous Coward, 25 Aug 2016 @ 7:33am

      Re:

      "Fixed: 'those accidents usually occur because the driver of the non-automated car was too reckless' "

      No, I like it as originally stated. They are too cautious compared to normal drivers, not your 85-year-old father, or you.

  • Pat, 25 Aug 2016 @ 12:41am

    no good answer

    It's really hard to answer. The trolley problem has been known for years, and I think the answer depends on how you perceive morality. What's good, what's better, etc. In fact, there is no good answer. Apart from "don't hit things." That's for sure.

    And I agree completely with comment number 11. I'm the same. I will always trust myself more than any equipment in my car. And honestly, I can't imagine that an automated car will be able to adjust to every kind of situation, weather, etc.

  • Mark Wing, 25 Aug 2016 @ 2:45am

    The real danger of automated cars is that they'll get hacked by Russians and drive you straight to a GOP rally.

  • Anonymous Coward, 25 Aug 2016 @ 2:58am

    Me/Not-Me

    "Should regulations and regulators focus on a utilitarian model where the vehicle is programmed to prioritize the good of the overall public above the individual?"

    Not if they want ME to buy or ride in one. At the very least, the choice needs to be represented by a toggle switch on the dashboard labeled "Me/Not-Me."

    • Anonymous Coward, 25 Aug 2016 @ 7:27am

      Re: Me/Not-Me

      Would you feel better if they just programmed the car not to hit anything, but included your switch that has a green light that comes on when you switch it to "Me", but otherwise does nothing?

  • Anonymous Coward, 25 Aug 2016 @ 3:22am

    all mood

    The question is totally mood. The car will never know if the obstacle is a bus full of toddlers or a wall of cardboard boxes. The answer is to brake and hit the obstacle with the least amount of speed, and hence the least damage. Pedestrian recognition might actually work, but anything else is a purely philosophical exercise with no basis in reality.

    • Anonymous Coward, 25 Aug 2016 @ 4:09am

      Re: all mood

      You mean moot, and you are wrong. A woman pushing a baby stroller into a street too close to stop in time will cause the car to veer to avoid them both. Your car will crash you into a tree or parked car and trust that your airbags and crumple zones will save your life as well.

      • Anonymous Coward, 25 Aug 2016 @ 6:04am

        Re: Re: all mood

        Sounds good. How about the baby stroller thing combined with crowded sidewalks on either side? What happens when the vehicle has to decide in a no-win situation? You can program every contingency known to man, and just when you think you've got it all figured out, man will do something completely irrational/unpredictable. Ever been driving down the road, seen someone not paying attention, and as a result of that observation/gut feeling avoided an accident? You could argue that if ALL cars were driverless, you wouldn't need that intuition... but what then of malfunctions? Hacks? Unpredictable weather? Geological events? Countless other things can/will come into play. I'm not saying it can't be done, I'm just saying that it's going to be a long road... :)

        • JoeCool (profile), 25 Aug 2016 @ 9:55am

          Re: Re: Re: all mood

          If your choices are plow into a baby carriage "suddenly" appearing in the road or plow into a crowded sidewalk, you were GOING TOO FAST FOR CONDITIONS. Always drive at a speed where you can deal with "sudden" changes in the roadway. Example: DON'T TAILGATE A PICKUP WITH POORLY TIED-DOWN FURNITURE. Stay back far enough that you can safely change lanes when something falls out. "But then five cars will pull in ahead of me!" Good! Let them run into the furniture while you stay safe. God! People have the stupidest responses to safe driving practices... that's why we kill more people with cars than any other way.

          • Anonymous Coward, 25 Aug 2016 @ 11:40am

            Re: Re: Re: Re: all mood

            You completely missed the forest for the trees, JoeCool. The entire "too fast for conditions" theory breaks down completely with one mechanical failure, geological event, animal running into the road, bird strike, or any other random event. Unpredictable shit happens on the road, and you may be driving just fine for current conditions when a strap breaks and a lamp from a poorly strapped-down load comes bouncing down the road.

            "Always drive at a speed where you can deal with "sudden" changes in the roadway."

            You mean at a speed where you can dodge an airplane randomly landing on a road?
            http://www.startribune.com/small-plane-lands-on-i-35-near-wyoming-closes-highway/385284761/

            .. Or how about a medical emergency that causes a massive pile up...

            https://www.washingtonpost.com/news/dr-gridlock/wp/2016/07/12/crash-involving-about-20-vehicles-along-i-395-jams-traffic-in-arlington-county/

            How about a random mechanical failure?

            http://www.aa1car.com/library/auto_accident.htm

            You can't cover all the bases with "too fast for conditions". That shit doesn't work in real life.

            Adding to, or in some cases counteracting, the "random event" part of driving is the human aspect. You don't even realize how many decisions you make when you drive, how many intuitive control responses you send to the wheel/pedals... you are not only driving your car, you are assessing everything around you, both factually and emotionally. You can see the idiot on the cell phone or texting and decide if he's a threat before he gets into your comfort zone... You may see a flash, hear an odd sound, get a funny feeling, see something that "bugs" you... etc.

            Now, you may be able to "program" some of these things into a computer, great. If only computers are driving, then it may work. But one emotional, uncontrollable, irrational, irresponsible person gets behind the wheel... Or one random event or mechanical failure and the entire logic and statistic based system comes crashing down into chaos.

            • JoeCool (profile), 25 Aug 2016 @ 12:57pm

              Re: Re: Re: Re: Re: all mood

              You're talking about one-in-a-million events while I am talking about everyday events. You're totally missing the point.

              • Anonymous Coward, 25 Aug 2016 @ 1:58pm

                Re: Re: Re: Re: Re: Re: all mood

                So a quick google search and review of a few different web sites and it looks like around 3% of all accidents are from mechanical failure. Hardly one in a million.

                • JoeCool (profile), 25 Aug 2016 @ 5:57pm

                  Re: Re: Re: Re: Re: Re: Re: all mood

                  But all the REST of the conditions you state change that to one in a million. That 3% covers EVERYTHING, including the 99% of that 3% that don't involve any injuries at all, just bumping fenders.

                  • Anonymous Coward, 26 Aug 2016 @ 4:39am

                    Re: Re: Re: Re: Re: Re: Re: Re: all mood

                    You can spin it any way you like; automobiles have too many parts and variables, and are subject to too many external conditions, to assume chaos won't apply once you automate them. The only way you could limit it is if you completely redesigned our transportation and redefined how we move around. IMO, if we're going to go to that level, it would be more efficient to put the energy into mass transportation and get rid of personal transportation altogether. But it seems you guys have all the answers, good luck with that. Hopefully they will give us the choice to turn it off or on... that, or we'll have to wait until the first major life-taking accident, when a politician up for re-election turns it into a "think of the children" issue.

                    • Anonymous Coward, 26 Aug 2016 @ 5:33am

                      Re: Re: Re: Re: Re: Re: Re: Re: Re: all mood

                      it would be more efficient to put the energy in mass transportation and get rid of personal transportation altogether.

                      Here's the catch-22 with that: you cannot go more than a few hundred yards without either your journey being recorded or being arrested for criminal behavior, namely hiding where you want to go.

            • Anonymous Coward, 25 Aug 2016 @ 1:39pm

              Re: Re: Re: Re: Re: all mood

              Shrug. Your first two examples are trivial. A properly set up sensor should determine velocity vectors. If you can see an object at least twice (and the lidar turns at speeds sufficient to do this many times a second), you can get a velocity vector; the plane just appears as another object traveling at a certain speed, even if it just "appears." Given that landing speeds for a plane are around freeway traffic speeds, gentle braking would be all that is needed. If the plane lands opposite the direction of traffic, then it may not be possible to evade, but that necessarily occurs at a speed faster than any human can handle as well.
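The two-sightings idea can be sketched like this (the positions, timestamps, and ~10 Hz spin rate are illustrative assumptions, not anyone's actual sensor spec):

```python
# Finite-difference velocity from two successive sightings of the
# same object: v = (p2 - p1) / dt. With a velocity vector in hand,
# the planner can predict where the object will be next and decide
# whether gentle braking suffices.

def velocity(p1, p2, dt):
    """Velocity (vx, vy) from two (x, y) positions dt seconds apart."""
    return ((p2[0] - p1[0]) / dt, (p2[1] - p1[1]) / dt)

# A lidar spinning at ~10 Hz sees the same object every 0.1 s.
# An object 50 m ahead that closes 3 m between sweeps is doing ~30 m/s.
v = velocity((0.0, 50.0), (0.0, 47.0), 0.1)
print(tuple(round(c, 6) for c in v))  # (0.0, -30.0): closing at ~30 m/s
```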

              With the suddenly stopped traffic, just point the sensor further down the road and make sure it can handle the extra info.

              Sudden brake failure occurs exceedingly rarely. Cars have sensors; autonomous cars necessarily monitor how much braking occurs when brakes are used versus the expected brake force. When those do not match, you can enter a failure condition: reduce speed in line with the actual brake force available, pull over, alert for service, refuse to move, etc. So you have to have, at a minimum, a cascading failure where the brakes fail AND the automation needs to respond to an emergency without having used the brakes post-failure. Even then, an autonomous car would be able to detect the failure and determine that it is unable to stop in time far faster than a human would without panicking. It could then decide to pull the emergency brake or simply increase brake pressure. Advanced systems may even redistribute brake force to route around the failed brake. These are not hard cases to design for.
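A toy version of that expected-versus-actual brake check (the thresholds and fallback actions here are invented for illustration; a real system would be far more conservative and redundant):

```python
# Toy brake-health monitor: compare commanded (expected) deceleration
# against measured deceleration; on a large mismatch, enter a degraded
# mode instead of waiting for a full emergency to discover the fault.

def brake_status(expected_decel, measured_decel, tolerance=0.2):
    """Return a status string given decelerations in m/s^2."""
    if expected_decel <= 0:
        return "ok"  # not braking; nothing to compare
    shortfall = (expected_decel - measured_decel) / expected_decel
    if shortfall <= tolerance:
        return "ok"
    elif shortfall <= 0.5:
        return "degraded: limit speed, alert for service"
    else:
        return "failed: pull over, refuse to move"

print(brake_status(6.0, 5.5))  # ok
print(brake_status(6.0, 3.5))  # degraded: limit speed, alert for service
print(brake_status(6.0, 1.0))  # failed: pull over, refuse to move
```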

            • Anonymous Coward, 25 Aug 2016 @ 1:44pm

              Re: Re: Re: Re: Re: all mood

              As to an irrational person ramming your autonomous car, you can get the Texas Edition (TM) which automatically deploys a gun into your lap on contact.

          • Mason Wheeler (profile), 25 Aug 2016 @ 12:44pm

            Re: Re: Re: Re: all mood

            Example, DON'T TAILGATE A PICKUP WITH POORLY TIED DOWN FURNITURE.

            Better example: Don't tailgate anyone, anywhere, under any circumstances.

            Around here, the single most effective thing the police could do to make roads safer is to start treating tailgaters exactly the same way as drunk drivers. It's really that bad.

  • Crcb, 25 Aug 2016 @ 4:18am

    The war on general purpose driving

    Let the owner decide? Don't talk nonsense! Why should we have more control over our cars than over any other modern device? You already can't fix the damn thing without breaking the CFAA.

    Speaking of which, how long before the feds and local cops have backdoors into our self-driving cars so they can take them over or disable them? My guess is they'll be able to turn whole neighborhoods into no-drive zones.

    You know, to protect the children. Never to keep people from peaceably assembling or to keep journalists away from something they want to hide. Our government doesn't do those kinds of things.

  • Anonymous Coward, 25 Aug 2016 @ 5:08am

    The car vendor should be liable for any accident that's caused by the autonomous car. Period. If that would be the case, it would incentivize them to "get it right" WAY MORE than any other regulation.

  • ThatFatMan (profile), 25 Aug 2016 @ 5:14am

    Link

    A friend from work shared this with me recently, and it seemed relevant to the discussion here. It's basically a short quiz (about 13 random questions) on different scenarios where the car must choose between killing pedestrians or passengers.

    http://moralmachine.mit.edu

    • Anonymous Coward, 25 Aug 2016 @ 8:59am

      Re: Link

      That link was already posted here recently, either in an article or in one of the comments. It's kind of a dumb quiz, because it assumes that the car knows whether the person it is about to run over is a criminal or a scientist. What is it going to do, use facial recognition on someone who is presumably not even looking in the car's direction (they wouldn't remain in the intersection if they saw a car coming at them, right?), connect to some network, and download details on the person's life - in an emergency situation? I don't want my car spending its processing power on that.

      The quiz also assumes the brakes are out - well, has the car tried the parking brake? If that's somehow out too, has it tried engine braking? (If you tell me all brakes are out and the transmission is also out, I'm frankly going to question why you assume you still have steering and a working computer.) Has it tried the horn? If the car can't avoid everyone on its own, reducing the speed of impact and giving pedestrians slightly more time to get *themselves* to safety should be the course of action, not trying to decide who should die.
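The commenter's checklist amounts to an ordered fallback list, which can be sketched trivially (the option names are purely illustrative, not a real vehicle API):

```python
# Ordered fallbacks for slowing the car, tried until one is available.
FALLBACKS = [
    "service brakes",
    "parking brake",
    "engine braking",
    "horn plus whatever deceleration remains",
]

def next_fallback(known_failed):
    """Return the first option not already known to have failed."""
    for option in FALLBACKS:
        if option not in known_failed:
            return option
    return None  # nothing left; minimize impact speed instead
```

The point of the ordering is the commenter's: "who should die" only becomes a question after every item on the list has failed.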

  • Heard in the Herd, 25 Aug 2016 @ 5:20am

    IRL?

    Are there any examples of this problem in real life? I can't find one (in my admittedly limited googling), but this is an ethics question, a thought experiment. Does anyone have a link to even one time it's actually happened?

  • Groaker (profile), 25 Aug 2016 @ 5:56am

    Thought problems in ethics have little to nothing to do with real life. They certainly have no application to a vehicle run by either a human or a computer. If there is sufficient time for either to make a decision about what to hit, then there are multiple other decisions available to avoid an accident.

    But of course accidents occur because people or CPUs put themselves in conditions that allow no time to make a decision of value. Most people would simply freeze, while a CPU would more likely be able to keep searching for an answer.

    Any number of "what if" strawmen can be invented. Beyond discussion in ethics seminars, they have no utility in the real world.

  • Anonymous Coward, 25 Aug 2016 @ 6:07am

    No one knows what will happen when there are thousands of self-driving cars on the road. On some roads many people drive over the speed limit; will a self-driving car exceed the speed limit in order to reduce the chance of an accident? Human drivers know that driving slowly in certain situations can cause accidents. Self-driving cars could have red lights or be painted yellow like taxis to let other drivers know they will not react as a human driver does in an emergency. What'll happen when 30-50 percent of cars in a city are self-driving?

  • Mason Wheeler (profile), 25 Aug 2016 @ 6:46am

    It's still a question that needs asking

    No, it's really not, for two reasons.

    1) Chatham's right. There's a reason the Trolley Problem is a thought experiment, not a case study.
    2) In a world of imperfect computer security, there's only one possible right answer: always protect the people inside the car, period. If you build functionality into the car to kill the people inside the car, that becomes an attack vector that hackers will end up using to kill far more people (even if that number is never more than 1) than a legitimate Trolley Problem dilemma ever will. (See point #1.)

    • TimothyAWiseman (profile), 25 Aug 2016 @ 8:47am

      Re:

      I never thought of your #2, but you are entirely right.

      I will add another reason, one closely related to Chatham's but not quite the same. You will never face the choice with certainty; the real world is too unpredictable for that. Instead, you will face something more like a choice between increasing the chance of killing 1 person and decreasing the chance of killing 2, or vice versa. In that case, choose the option with the best chance of no fatalities at all. But, as Chatham points out, even getting to that probabilistic point implies something already went wrong a few seconds earlier.
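The probabilistic framing in this comment can be made concrete with a toy expected-outcome comparison. All probabilities and fatality counts below are invented purely for illustration:

```python
# Each maneuver carries a list of (probability, fatalities) risks.
# Prefer the maneuver with the best chance of zero fatalities,
# breaking ties on expected fatalities.
def score(risks):
    p_no_fatalities = 1.0
    expected_fatalities = 0.0
    for probability, deaths in risks:
        p_no_fatalities *= (1.0 - probability)
        expected_fatalities += probability * deaths
    return (-p_no_fatalities, expected_fatalities)  # lower tuple = better

def choose(maneuvers):
    return min(maneuvers, key=lambda name: score(maneuvers[name]))

options = {
    "brake hard": [(0.05, 1)],             # 5% chance of 1 fatality
    "swerve":     [(0.02, 1), (0.02, 2)],  # two smaller independent risks
}
```

With these made-up numbers, "swerve" wins: its chance of no fatalities (0.98 × 0.98 ≈ 96%) beats braking's 95%, even though its expected fatality count is slightly higher.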

  • Andrew D. Todd, 25 Aug 2016 @ 7:08am

    The Trolley Problem Would Never Happen on a Real Railroad.

    These ethical dilemmas are very, very contrived. The overriding fact is that the safest thing to do with a land vehicle, safest for all parties, is almost always to stop it. As a general principle of physics, swerving tends to reduce braking effectiveness (and steering effectiveness). The forces acting on the vehicle are the combination of braking force and sideways centrifugal force. The load gets concentrated on one of the front wheels, and even on the outer edge of that wheel, with the possibility of the vehicle spinning around that wheel (*). People badly want to believe that they can get out of a jam by going faster, but it is not true, and it contradicts the laws of physics.

    (*) I once knew a Doberman bitch who chased after balls in a uniquely stylish way. As she approached the ball, she stuck out a front foot, and did a "four-footed pirouette," with the other three feet in the air, reached out her mouth to snatch the ball, and, landing, dashed back the way she had come, with absolutely no wasted motion. It was a purely ballerina move.

    The Trolley Problem, in particular, was contrived by someone who knows very little about trains. It is incoherent in its own terms, and does not make a distinction between trains and trolleys. Trolleys are designed to run on the public street, and they have good brakes which apply directly to the rail, rather than to the wheel. Trolleys don't operate at very high speed, because the whole point of trolleys is to pick up passengers at short intervals.

    Railroad switches have speed limits on the diverging branch, often as low as fifteen miles per hour, and a train which goes over them faster will derail. A switch which works at high speed has to be correspondingly long, and correspondingly expensive. Taking a typical railroad curvature, a radius of a mile or so (much the same as the interstate highways), a full-speed crossover switch of the type described might have to be several hundred feet long, and take something on the order of ten seconds for the train to traverse. Most switches are designed on the assumption that the train will first slow down. I've got a magazine picture somewhere of the aftermath when a Toronto Go Transit train tried to take a 15 MPH switch at 60 MPH. The usual reason for there to be a switch splitting one track into two parallel tracks is to create a siding. Sidings are generally located where a train can stop without blocking traffic, in short, somewhere other than a traffic crossing point. The railroad is not going to spend large sums of money to build vast numbers of switches to create ethical conjectures. If the railroad has the money to spend, it will build overpasses and underpasses instead, seeking to isolate itself from road traffic, not to make moral conjectures about which car to hit.
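Those speed limits follow directly from centripetal acceleration, which grows as v²/r. A quick sanity check (the radius below is arbitrary; the point is the ratio, which is independent of radius):

```python
# Sideways (centripetal) acceleration on a curve of radius r: a = v^2 / r.
def lateral_accel(speed, radius):
    return speed ** 2 / radius

# Taking a turnout at 60 when it is rated for 15 means 4x the speed,
# hence 16x the sideways force -- more than enough to derail.
ratio = lateral_accel(60.0, 100.0) / lateral_accel(15.0, 100.0)
```

Sixteen times the rated sideways load is consistent with the Go Transit derailment described above.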

    Short of that level of opulence, there is a market for "bulletproof" crossing gates, strong enough to resist if some fool attempts to simply drive through them. Amtrak has installed at least one in Western Michigan. These gates are designed on the tennis-court-net principle, whereby their flexibility is their strength, and they decelerate an errant car much less violently than colliding with a fixed object would. Grade crossings can be fitted with Lidar detectors, which confirm that the grade crossing is in fact empty, and if not, they trigger an alarm which causes trains miles away to start braking. Railroad accidents tend to happen because equipment is old, and has not been brought up to "best practices."

    The single worst railroad accident in North America in many years was the Lac Megantic accident in Canada. It involved a "dummy company" railroad, operated with a view to extreme cheapness. One of its oil trains ran away on a long grade and reached about 70 mph under the influence of gravity. It rolled into a small town and derailed. Due to the speed of the derailment, many of the tank cars broke open, spilling thousands of tons of oil and producing a huge fire, which destroyed much of the town and killed forty-seven people. During the investigation, it emerged that the railroad was running its trains "with string and chewing gum," and that this was the cause of the accident. That is the most basic hazard which the railroads present: they haul around Hiroshima-loads of fuel and chemicals.

    • Anonymous Coward, 25 Aug 2016 @ 7:46am

      Re: The Trolley Problem Would Never Happen on a Real Railroad.

      And most of the contrived situations for the cars all involve "what if the brakes give out?". Brake failure is incredibly rare, and when it does happen, it's almost always a result of a very slow degradation that a smart car would be able to detect months ahead of time before it was a problem. The probability of all of your brake lines and your parking brake all suddenly and catastrophically failing on a car that is properly maintained is so low that I would be thoroughly surprised if it has ever happened.
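The intuition about simultaneous catastrophic failure can be quantified: for independent failure modes, the probabilities multiply. The per-trip rates below are invented solely to show the arithmetic, not measured reliability data:

```python
# Hypothetical, purely illustrative per-trip failure probabilities.
p_service_brakes = 1e-6   # sudden total service-brake failure
p_parking_brake = 1e-4    # parking brake unavailable

# Assuming the two failure modes are independent, the chance of
# losing both on the same trip is their product.
p_both = p_service_brakes * p_parking_brake  # on the order of 1e-10
```

Even with generous individual failure rates, the joint probability collapses to something that may plausibly never occur across a fleet, which is the commenter's point.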

      We don't need the car to decide who to hit, we just need the car to refuse to go in the first place when it detects that your brake system is in critical need of maintenance.

      • Mason Wheeler (profile), 25 Aug 2016 @ 7:49am

        Re: Re: The Trolley Problem Would Never Happen on a Real Railroad.

        Agreed. The one time I've actually been in a rear-end collision while in the vehicle that did the rear-ending, my friend (who was driving) was unaware that his brake pads had rotted away, and he was distracted by the music on the radio and hit the brakes too late. Had either factor been different, we wouldn't have hit them.

    • Mason Wheeler (profile), 25 Aug 2016 @ 7:46am

      Re: The Trolley Problem Would Never Happen on a Real Railroad.

      The over-riding fact is that the safest thing to do with a land vehicle, safest for all parties, is almost always to stop it.

      Almost. Assuming the threat is directly ahead of you.

      A few months ago, I was speeding up an on-ramp, which of course is the whole point of having an on-ramp, when some stupid teenage kid with a bicycle comes out of nowhere and makes like he's about to cross right in front of me. (This was at least 100 feet beyond the point where there are supposed to be no pedestrians, so I wasn't really paying attention to the side of the road when I had more important concerns to focus on in front of me and in the other lane.)

      In this scenario, if I had braked and he'd stepped out, I'd have run him down and probably killed him, because there wasn't space to decelerate very far. If I had sped up, on the other hand, and he'd stepped out, he'd have hit my car from the side, which would have injured him a whole lot less.

      Instead, I hit the horn and swerved to make a collision less likely, and he checked himself right at the last second and didn't step out into traffic after all. But this is one case where braking would have been the worst possible result.

      • Andrew D. Todd, 25 Aug 2016 @ 9:00am

        Re: Re: The Trolley Problem Would Never Happen on a Real Railroad.

        Well, I don't suppose you had a chance to chew out the kid, but, kids being kids, there is at least a fifty percent chance that he was there on a dare. As my old Human Factors Engineering professor used to say: "You can make something foolproof, but you can't make it damm-fool-proof!"

        Anyway, I don't suppose you could have swerved more than five or ten feet sideways, and that is no distance for a bicycle to cover. I'd say it was probably the horn that averted an accident.

        Here's something I came across. It seems there's this Argentine ballet dancer, Lucila Munaretto, trained in a Russian Bolshoi school in Brazil, who got a modestly paid job, dancing in Vancouver, Canada, with a small semi-professional company which puts on about two shows a year, and does programs in the schools. She was making ends meet by working in a small bakery. Well, she went roller-blading in the street (without a helmet, what's more), collided with a minivan, and sustained head injuries. She seems to have mostly recovered, and they've got her dancing again.

        The Canadian national health insurance paid for her medical care per se, but it provides only limited coverage for things like physical therapy, not for something on the order of stroke recovery. The dance company started a funding drive, presumably among its audience, and raised $40,000, and they got a $150,000 line of coverage from the mini-van's accident insurance.

        http://www.cbc.ca/news/canada/british-columbia/ballet-lucila-munaretto-returns-to-stage-over-horrific-accident-1.3560275

      • Anonymous Coward, 25 Aug 2016 @ 1:23pm

        Re: Re: The Trolley Problem Would Never Happen on a Real Railroad.

        Not likely. If you had enough space to increase your velocity by even 1-2 mph, you likely had enough space to come very nearly to a complete stop. Engine power pales in comparison to braking power. http://www.popsci.com/cars/article/2003-04/my-1500-horsepower-brake-job
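Ballpark numbers make the point. A typical car might manage about 2 m/s² of acceleration at speed but 8 m/s² of deceleration under hard braking (both figures are rough assumptions for illustration):

```python
MPH_TO_MS = 0.44704  # metres per second in one mile per hour

accel = 2.0  # m/s^2, rough acceleration available at speed
decel = 8.0  # m/s^2, rough hard-braking deceleration

# Time needed to gain 2 mph by accelerating...
t = (2 * MPH_TO_MS) / accel
# ...and the speed (in mph) that hard braking sheds in that same time.
mph_shed_braking = decel * t / MPH_TO_MS
```

In the time it takes to gain 2 mph, hard braking sheds about 8 mph; the ratio is simply decel/accel, which is why brakes dominate engines in an emergency.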

    • Eldakka (profile), 25 Aug 2016 @ 10:24pm

      Re: The Trolley Problem Would Never Happen on a Real Railroad.

      You are getting caught up in the technicalities of the implementation of a thought experiment, as opposed to what it is asking.

      The thought experiment has set up an analogy to try to explain the experiment that is, admittedly, not fully applicable to the experiment.

      Forgetting the analogy, the thought experiment is asking this:

      If you had 2 exclusive choices, i.e. you could only do ONE of the two choices, which would you choose out of the 2 following options:
      1) take an action that would save the lives of a number of people (usually 5 or more) but result in the death of 1 person, or
      2) take NO action and allow the number of people (5+) to die, while saving the life of that 1 person?

      Which choice would you make?
      1) take the action, save 5, kill 1, or
      2) take no action, let 5 die, let 1 live.

      Variations on this assign a personal relationship to the single life that could live or die (making a personal link to the decision), reverse the action vs. no-action results (no action: 5 live, 1 dies), or adjust the size of the group of people who will be saved or killed.

      • Mason Wheeler (profile), 26 Aug 2016 @ 6:51am

        Re: Re: The Trolley Problem Would Never Happen on a Real Railroad.

        As I said above, there's a reason it's a thought experiment (as you acknowledge) and not a case study: because it's a ridiculously contrived problem that does not occur in real life, due to being ridiculously contrived.

      • Andrew D. Todd, 26 Aug 2016 @ 8:22am

        Re: Re: The Trolley Problem Would Never Happen on a Real Railroad.

        What you are saying is, in effect, "If hardware were software..." But hardware is not software. Saying so doesn't make it so. Hardware costs real money, and it is subject to real physical laws-- such as the law of centrifugal force. A mechanical engineer spends a lot of time negotiating his way past these actual physical constraints. The consensus rule, applying to a wide variety of vehicles, from automobiles to trains to ships, is that, whenever possible, you slow down first, and then turn. If you need to turn at speed, and within a limited space, it is going to get expensive. If the ethical dilemma were cast frankly in the form of dragons and orcs, it would be laughed off.

        The facts of a bad accident, such as the Lac Megantic accident in Canada, are usually such that everyone loses. The town is burned down, all these people are dead, the train has been smashed up, future use of the track is mired in the ultimate NIMBY case, the railroad is bankrupt, the locomotive engineer and railroad dispatcher are going to prison for many years for many counts of the Canadian equivalent of manslaughter, and even the big boss, nominally protected by dummy corporation cut-outs, has been disgraced, and will experience difficulty getting a new job. The company which bought the bankrupt railroad also loses, because it underestimated the depth of the NIMBY opposition.

        In railroading, there is a device called a "derail," a clamp which you bolt to the track to cause a train to derail at low speed, instead of running away. I don't know what a derail would cost-- it's a very simple device, but made in very small quantities. Five hundred bucks might be about right. It is prudent to put a derail in every locomotive cab, as cheap insurance, and I should think you could clamp it onto the track in less than five minutes. That would probably have been enough to prevent the Lac Megantic accident. However, the railroad had a culture of compulsive cost-cutting-- it was the kind of place where the boss rules by terror, and does his nut if you buy a box of pencils.

        "For the want of a horse-shoe nail, the horse-shoe was lost, and for want of the horse-shoe, the horse was lost, and for want of the horse, the rider was lost, and for want of the rider, the battle was lost, and for want of the battle, the kingdom was lost, and all for the want of a horse-shoe nail."

        It costs a lot of money to design and build equipment so that the risks balance out in a way that creates an ethical dilemma. The only kind of apparatus with fine enough control over who gets killed to construct such dilemmas is executioners' apparatus: gas chambers, electric chairs, gallows, guillotines, etc.

  • Anonymous Coward, 25 Aug 2016 @ 7:50am

    related questions.

    We all notice the differences when we ride in cars less or more expensive than our own. Even the sounds the turn signals and seat belt warnings make are tinny and annoying in the lower-priced cars. Will automated control systems in cheaper cars be cheaper and less reliable, or will all cars be mandated to have the best controls? Will low-paid people be able to afford cars at all if high-quality control systems are mandated? If there are levels of quality in control systems, will there be different routing for different levels? Will poor people have to drive way out of the way to get where they need to go on Flint-quality roads while the elites drive on the well-maintained ones?

    If we segregate the poor, how do we explain to them that we're all in this together when the next world war comes around? That's going to be some cagey rhetoric. Or maybe we assume robots and other bots fight the next one, huh?

  • Dan, 25 Aug 2016 @ 8:05am

    i, robot

    And you guys thought I, Robot was just a fiction movie :)

  • wimpheling (profile), 25 Aug 2016 @ 8:16am

    obvious

    Seems obvious to me that once automated cars hit the market, human driving should be outlawed, as it will make the whole system safer.

    And I'm not just talking about accidents: the recent truck attack in France would have been impossible, or at least far more difficult (it would require skilled engineers), if all cars and trucks were automated.

    • jupiterkansas (profile), 25 Aug 2016 @ 8:44am

      Re: obvious

      That's like saying the moment we had automobiles we should have outlawed horses and buggies, when in fact they co-existed for a number of years until enough people had cars to mostly eliminate horses from roads.

      If we could shut down the current system and replace it with something else, we would have had self-driving cars long ago, but you can't simply stop the world and change everything overnight.

      • Michael, 25 Aug 2016 @ 10:25am

        Re: Re: obvious

        That's like saying the moment we had automobiles we should have outlawed horses and buggies

        While unrealistic for a number of reasons, we SHOULD have done this if the only concern we have to deal with was safety. Having horses on the road was (and still is) dangerous because they behave in a less-predictable way than cars.

  • wimpheling (profile), 25 Aug 2016 @ 9:02am

    jupiterkansas, I agree with you that the laws don't evolve that fast, but I still think my proposal should be promoted as fast as possible, as it could surely save human lives.

  • JustMe (profile), 25 Aug 2016 @ 9:37am

    Bah

    "millions" Their sample size sounds impressive but is vanishingly small. Do they know how many miles Americans drive every day, to say nothing of the whole world?

  • David (profile), 25 Aug 2016 @ 3:42pm

    Trolley problem eclipsed by modern cars.

    Remember the anti-lock braking rules?

    Stomp on the brakes.
    Stay on the brakes.
    Steer away from danger.

    The trolley problem only arises when none of the above are done. Add to that the level of awareness an automated car has over a mediocre driver: it is seconds ahead in recognizing potential problems.


