Pedestrian Deaths By Car In Phoenix Area Last Week: 11. But One Was By A Self-Driving Uber

from the I-can't-do-that,-Dave dept

Despite worries about the reliability and safety of self-driving vehicles, the millions of test miles driven so far have repeatedly shown self-driving cars to be significantly safer than their human-piloted counterparts. Yet whenever accidents (or near accidents) occur, they tend to be blown completely out of proportion by those terrified of (or financially disrupted by) an automated future.

So it will be interesting to watch the reaction to news that a self-driving Uber vehicle was, unfortunately, the first to be involved in a fatality over the weekend in Tempe, Arizona:

A self-driving Uber SUV struck and killed a pedestrian in Tempe, Arizona, Sunday night, according to the Tempe police. The department is investigating the crash. A driver was behind the wheel at the time, the police said.

"The vehicle involved is one of Uber's self-driving vehicles," the Tempe police said in a statement. "It was in autonomous mode at the time of the collision, with a vehicle operator behind the wheel."

Uber, for its part, says it's working with Tempe law enforcement to understand what went wrong in this instance.

Bloomberg also notes that Uber has suspended its self-driving car program nationwide until it can identify what exactly went wrong. The National Transportation Safety Board is also opening an investigation into the death and is sending a small team of investigators to Tempe.

We've noted for years now how despite a lot of breathless hand-wringing, self-driving car technology (even in its beta form) has proven to be remarkably safe. Millions of AI driver miles have been logged already by Google, Volvo, Uber and others with only a few major accidents. When accidents do occur, they most frequently involve human beings getting confused when a robot-driven vehicle actually follows the law. Google has noted repeatedly that the most common accidents it sees are when drivers rear end its AI-vehicles because they actually stopped before turning right on red.

And while there are some caveats to this data (such as the fact that many of these miles are logged with drivers grabbing the wheel when needed), self-driving cars have so far proven to be far safer than even many advocates projected. We've not even gotten close to the well-hyped "trolley problem," and engineers have argued that if we do, somebody has already screwed up in the design and development process.

It's also worth reiterating that early data continues to strongly indicate that self-driving cars will be notably safer than their human-piloted counterparts, who cause 33,000 fatalities annually (usually because they were drunk or distracted by their phones). It's also worth noting that 10 pedestrians have been killed by human drivers in the Phoenix area (including Tempe) in the last week alone, and Arizona had the highest rate of pedestrian fatalities in the country last year. And it's getting worse, with Arizona pedestrian deaths rising from 197 in 2016 to 224 in 2017.

We'll have to see what the investigation reveals, but hopefully the tech press will view Arizona's problem in context before writing up their inevitable hyperventilating hot takes. Ditto for lawmakers eager to justify over-regulating the emerging self-driving car industry at the behest of taxi unions or other disrupted legacy sectors. If we are going to worry about something, those calories might be better spent on shoring up the abysmal security and privacy standards in the auto industry before automating everything under the sun.


Reader Comments


  • Gary (profile), 19 Mar 2018 @ 1:09pm

    Autonomous Accidents

    Now I really don't feel safe driving in cities that are testing self-driving cars - because I always stop for right-on-red and I'm afraid I'll get rear-ended by the human drivers who don't!

  • Anonymous Coward, 19 Mar 2018 @ 1:52pm

    Human drivers outnumber autonomous by, say, 10,000 to 1...

    Therefore you'd expect 10,000 pedestrians to be killed by human drivers. Yet they aren't. Reason: humans know how other humans will behave, when they are distracted, and so on.

    Just a matter of time until unfixable flaws show up. Just like running Windows, a certain number of crashes per unit time is guaranteed.

    Answer this: would you trust a car-control system if designed by Microsoft?

    • Anonymous Coward, 19 Mar 2018 @ 2:01pm

      Re: Human drivers outnumber autonomous by, say, 10,000 to 1...

      Are you sure you want to exit the freeway?

      • Berenerd (profile), 20 Mar 2018 @ 3:57am

        Re: Re: Human drivers outnumber autonomous by, say, 10,000 to 1...

        "Hi! I am Steery the helpful paper..err..steering wheel. It looks like you wish to slam on your brakes to stop..."

    • Ninja (profile), 19 Mar 2018 @ 2:05pm

      Re: Human drivers outnumber autonomous by, say, 10,000 to 1...

      I would. They wouldn't take this lightly. Most companies are putting tremendous effort and care into the development of autonomous cars.

      And you ignore that almost all of the crashes, if not all, were not caused by the autonomous vehicles.

      "Just a matter of time until unfixable flaws show up."

      No such thing. There will be flaws; they are always in the path of development. I do hope most of the people developing them and making laws aren't like you.

      I'd rather have autonomous cars all around. Their failure rates will be much lower than humans, that's guaranteed.

      • Anonymous Coward, 19 Mar 2018 @ 3:09pm

        Very Limited Testing so far

        "... millions of test miles driven so far have repeatedly shown self-driving cars to be significantly more safe than their human-piloted counterparts."


        Those "test miles" have all been under benign, restricted conditions ... and are not comparable to the normal driving conditions faced by American drivers every day.

        That Uber automated car in Arizona was also operating under very restricted conditions, with a safety driver in the driver's seat.

        Gazillions of development testing miles do not tell you how the vehicles will operate in full, real-world conditions.
        "Development" testing and "operational" testing have different purposes and designs. Apples and oranges.

        • Anonymous Coward, 20 Mar 2018 @ 5:21am

          Re: Very Limited Testing so far

          > Those "test miles" have all been under benign, restricted conditions

          That's the first step, but of course automated cars are also being used in real life. Now, certain misguided lawmakers think they need to limit automated cars to the benign, restricted conditions, so that people like you can crawl out and say that they've only been tested under benign, restricted conditions. It's a self-serving, useless process. Perish, thou!

      • Richard (profile), 20 Mar 2018 @ 7:42am

        Re: Re: Human drivers outnumber autonomous by, say, 10,000 to 1...

        I'd rather have autonomous cars all around. Their failure rates will be much lower than humans, that's guaranteed.

        In the end game maybe - but we are not there yet. In the meantime the hubris of Google, Uber etc is driving the technology in exactly the wrong direction.

        At present the idea is that the car drives autonomously and the human supervises it in case anything goes wrong. This gives the human an absolutely terrible job: zero interest, huge responsibility, and total attention required. It isn't surprising that in the latest incident the human driver didn't effectively intervene.

        If we want to go to self driving cars then the correct route (for now, whilst a human is still involved) is for the computer to monitor the human driver, not the other way around.

        It's much less sexy for the computer but much better for the human driver. In fact the current situation is just repeating the history of autopilot systems - which have now been revised to be much more "computer monitoring human" than the other way around.

        I'm afraid that Google, Uber etc are basically doing a publicity stunt with an immature technology at present - whereas with proper application of the technology we could save many lives every year.

        If the technology for "monitoring the human" was pushed ahead it would provide hard evidence of the safety value of the computer. At that point we could move to a "computer monitoring computer" system, with the "safety" computer being a proven system.

        • DB (profile), 20 Mar 2018 @ 9:03am

          Re: Re: Re: Human drivers outnumber autonomous by, say, 10,000 to 1...

          You might have missed it, but that is already happening.

          Essentially all high-end cars have sophisticated driver assist systems. They provide lane-keeping assist, distance-keeping cruise control, automatic braking, automatic parking, and lane-change warnings.

          These systems are modestly expensive options on mid-range cars, and occasionally available on the low-end models.

          I've been using a system for about two years. I quickly changed my opinion of it from being a luxury, to being a safety system more valuable than ABS.

          Several freeways around here have the typical California 70MPH-to-stopped for no apparent reason. That's reinforced because people learn to panic stop when they see the first brake light flash. The radar is much better than I am at tracking multiple cars and deciding if this will be a cascading panic stop, or just drivers cautiously having their foot ready on the brake.

          I expect in other regions that functionality will just silently exist, protecting the driver without them ever realizing it.

          • Richard (profile), 20 Mar 2018 @ 10:22am

            Re: Re: Re: Re: Human drivers outnumber autonomous by, say, 10,000 to 1...

            You might have missed it, but that is already happening.

            No - I knew that lots of vehicle manufacturers were doing this. My point was that that type of approach is really the way forward. What Google, Uber etc are doing is probably a dead end.

            Several freeways around here have the typical California 70MPH-to-stopped for no apparent reason.

            Yes - I calculated once that the effect travels backwards up the carriageway at about 1500 mph!

    • Anonymous Coward, 19 Mar 2018 @ 2:18pm

      Re: Human drivers outnumber autonomous by, say, 10,000 to 1...

      I think the numbers are far higher than that. More like a million to 1. 10,000 is way, way too low. In fact, just doing a quick Google search, I get this link:

      https://www.quora.com/How-many-drivers-are-on-the-road-at-any-given-time-in-the-US

      That's around 5 million cars on the road in just the LA area per day. I'm seeing other numbers of over 250 million cars on the road in the U.S.

      So these self-driving cars are a fraction of a fraction of the cars on the road. Can someone do the math? Compare the number of normal cars on the road to people killed, versus self-driving cars and people killed. Does it basically end up close, or worse for self-driving cars? How many of them are there in total? Is it even more than 100 of them? I don't know.

      This person that got hit wasn't in a crosswalk. Not that it's an excuse to get hit, but how did the person get hit? Was the person running out into the street from behind something where even a human would never have been able to stop? Seems to be a lot of dumb people getting hit by cars in that state!!!! Must be a lot of jaywalkers.

      Until we get the whole story, who knows. One thing is for sure: until you get the human element out of the way, you're going to have this transitional phase of humans doing dumb things and crashing into self-driving cars that did nothing wrong. People running into the street is natural selection in action.

      • Roger Strong (profile), 19 Mar 2018 @ 3:00pm

        Re: Re: Human drivers outnumber autonomous by, say, 10,000 to 1...

        This person that got hit wasn't in a crosswalk. Not that it's an excuse to get hit, but how did the person get hit? Was the person running out into the street from behind something where even a human would never have been able to stop?

        And that's an important question, because in most cases crossing in the middle of the block is legal. (There was a news story a few weeks back about police wrongly charging people with jaywalking, which is illegal only where there are traffic lights at each end of the block.)

        Walking from the bus stop to my office, there's a stretch with no sidewalk. In the winter you must walk on the very busy road. It's not jaywalking, but I've nearly been hit a few times.

        So if the software is giving pedestrian detection a lower priority away from crosswalks, it needs to stop doing that where there are no sidewalks, and where mid-block crossing is allowed. That's more data that needs to be in the car's internal map.

        • Richard (profile), 20 Mar 2018 @ 7:46am

          Re: Re: Re: Human drivers outnumber autonomous by, say, 10,000 to 1...

          So if the software is giving pedestrian detection a lower priority away from crosswalks, it needs to stop doing that

          Actually it should NEVER give pedestrian detection a lower priority.

          The problem, as I know from bitter experience, is that with hindsight it is always possible to see how the system could have been coded in such a way as to avoid a particular incident - but then when you do that something else breaks....

          • The Wanderer (profile), 20 Mar 2018 @ 9:06am

            Re: Re: Re: Re: Human drivers outnumber autonomous by, say, 10,000 to 1...

            Er... lower than what?

            From context, I read the bit you quoted as "the priority it gives to detecting pedestrians when away from crosswalks is lower than the priority it gives to detecting them when at crosswalks", not "when away from crosswalks, the priority it gives to detecting pedestrians is lower than the priority it gives to something else".

            • Richard (profile), 20 Mar 2018 @ 10:25am

              Re: Re: Re: Re: Re: Human drivers outnumber autonomous by, say, 10,000 to 1...

              How does the phrase "lower priority" have a meaning unless it is "lower than something else"?

              • The Wanderer (profile), 20 Mar 2018 @ 4:05pm

                Re: Re: Re: Re: Re: Re: Human drivers outnumber autonomous by, say, 10,000 to 1...

                Yes. But what are you putting lower than what?

                As I said, I read that as "the priority on X in Y situation is lower than the priority on X in other situations".

                To respond to that by saying "the priority on X should never be lower" seems nonsensical. The only way I can think of to make sense out of it, without assuming that you misunderstood the original statement, is as a confusing way of saying "the priority of X should always be maintained at the same high level".

                The most natural way to read "the priority on X should never be lower", to my eye, is as being based on the assumption that the original statement was equivalent to "in Y situation, the priority on X is lower than the priority on something else". Since I don't read the original statement as saying that, I find this confusing, so I asked for clarification - although I may have done so in a less-than-ideally-clear way, myself.

    • Anonymous Coward, 20 Mar 2018 @ 6:58am

      Re: Human drivers outnumber autonomous by, say, 10,000 to 1...

      The accident rate will definitely increase with more of these cars on the road. And as they age, the sensors will fail or not work as well when dirt builds up. I would be concerned if I lived in AZ that the govt is willing to gamble on revenues for early adoption.

  • Anonymous Coward, 19 Mar 2018 @ 1:57pm

    33,000 deaths a year????
    Ban cars, go thru multiple checks to see that you're a responsible person.
    Beat your husband, wife, kid? No car for you.
    Ever been convicted of a crime with a jail time of over a year, even if you got no time served? No car for you.
    Feel depressed? No car for you.
    Ever think of driving angry?
    Well, we have an app for that, that makes you wait three days to drive if you already have a car (to cool down, you know).

    So hey KIDS, get behind something that kills more people
    each year than the 2nd ever did.
    I mean hey, just think of yourselves for once.

    • Anonymous Coward, 19 Mar 2018 @ 2:09pm

      Re:

      This is a hilariously bad argument that is saying the exact opposite of what you think it is.

      • Anonymous Coward, 19 Mar 2018 @ 2:19pm

        Re: Re:

        Your reply does the opposite of winning an argument, contrary to what you think it does.

      • Christenson, 19 Mar 2018 @ 2:25pm

        Re: Re: Broken Sarcasm meter

        Our OP is clearly not serious....

        Except he's seriously wrong on one point: we (USA) run about 30,000 firearms deaths per year, with 2/3 of them suicides. And yeah, you can argue how many of the remaining 10K involve drugs and drug dealers; if you wanna solve that problem, rationalize drug policy!

        • Anonymous Coward, 19 Mar 2018 @ 4:04pm

          Re: Re: Re: Broken Sarcasm meter

          I don't think you're understanding his poor attempt at an anti-gun control screed.

          • Anonymous Coward, 19 Mar 2018 @ 7:01pm

            Re: Re: Re: Re: Broken Sarcasm meter

            An analysis of the FBI crime statistics found that states that adopted concealed carry laws reduced:

            - Murders by 8.5%
            - Rapes by 5%
            - Aggravated assaults by 7%
            - Robberies by 3%

            As for gun-free zones: with just one exception, every public mass shooting in the USA since 1950 has taken place where citizens are banned from carrying guns. Despite strict gun regulations, Europe has had 3 of the worst 6 school shootings.

            But still more people are killed with cars every year

            • Talmyr (profile), 20 Mar 2018 @ 6:58am

              Re: Re: Re: Re: Re: Broken Sarcasm meter

              Stats/citations please. And don't count Scotland in 1996.

              Certainly gun control has meant no mass murders in the UK like you have regularly. Nor Australia.

            • Richard (profile), 20 Mar 2018 @ 7:50am

              Re: Re: Re: Re: Re: Broken Sarcasm meter

              Despite strict gun regulations, Europe has had 3 of the worst 6 school shootings.

              And after each one the rules were tightened and there were no more incidents of that type in that place.

              Whereas in the US - after each incident, much handwringing and "never againing" ... aaaand it happens again ....

          • Anonymous Coward, 20 Mar 2018 @ 9:33am

            Re: Re: Re: Re: Broken Sarcasm meter

            The only way to stop a bad guy with a self-driving car is a good guy with a self-driving car.

            I have already submitted my application for a concealed autonomous vehicle. Bad guys watch out.

  • Ninja (profile), 19 Mar 2018 @ 1:59pm

    Now, it had to be Uber. I hope they deal with it well and don't screw everybody with shady practices leading to some catastrophic failure.

    Gotta keep an eye on how this will develop. I'm hoping it was some sort of negligence by the pedestrian. I'm not trying to blame the victim or take the death lightly but if it's something with the tech it's gonna be a blow to autonomous cars.

    • Roger Strong (profile), 19 Mar 2018 @ 2:14pm

      Re:

      I just hope that Uber and Google don't use their vast experience in southern California and Arizona to declare self-driving cars "ready".... for sale to folks in northern states with winter conditions.

      • Thad, 19 Mar 2018 @ 2:37pm

        Re: Re:

        I've never seen one in a monsoon.

        But I suppose if it pulls over and waits until the monsoon is over, then it's smarter than 90% of Phoenix drivers.

      • Anonymous Anonymous Coward (profile), 19 Mar 2018 @ 2:47pm

        Re: Re:

        I would not be surprised if the self-driving cars do better than drivers from states that get little snow when they encounter snowstorms. I learned to drive in New England, lots of snow, but I lived in the Washington DC area for some time, and a 1/4 inch of snow caused an awful lot of havoc that just does not take place in the north.

        At least the self-driving cars will be programmed with some sensible snow related driving actions. People from southern states, not so much.

        • Anonymous Coward, 20 Mar 2018 @ 9:37am

          Re: Re: Re:

          I can imagine the autonomous vehicle reply to a request to go somewhere during a blizzard. I will have selected the Rapsta linguistic package.

          "You be trippin' yo"

    • Richard (profile), 20 Mar 2018 @ 7:54am

      Re:

      Gotta keep an eye on how this will develop. I'm hoping it was some sort of negligence by the pedestrian.

      In this situation it is ALWAYS the machine's fault.

      That is how the public will view it.

  • Mark Wing, 19 Mar 2018 @ 2:02pm

    I'm still waiting for all those cars to get hacked by the Russians and start auto-driving everyone to GOP rallies.

    • Anonymous Coward, 19 Mar 2018 @ 2:16pm

      Re:

      They can't be hacked remotely. That part of the vehicle is on a closed system; you would have to be physically connected to it in order to hack it. At least, that's how it used to be. It's possible someone decided that they should allow remote connection to the system that controls the vehicle. It would probably be a career-ending move if it were compromised.

      • Travis, 19 Mar 2018 @ 4:34pm

        Re: Re:

        You mean like the V2V protocols mandated by the government?

      • Anonymous Coward, 19 Mar 2018 @ 4:45pm

        Re: Re:

        It's possible someone decided that they should allow remote connection to the system that controls the vehicle. It would probably be a career-ending move if it were compromised.

        That has already happened.

      • Rich Kulawiec, 19 Mar 2018 @ 4:48pm

        Re: Re:

        "They can't be hacked remotely."

        1. or so the vendors claim
        2. that you know of
        3. today
        4. and they don't need to be

        There's a lot more to be said on this, but for the moment: a person is dead, and that's a tragedy. The debate can wait.

      • Anonymous Coward, 20 Mar 2018 @ 7:31am

        Re: Re:

        "They can't be hacked remotely. "

        So you are claiming there is no wireless interface?
        I remember reading there is such an interface, why do you think it is impervious?

    • Thad, 19 Mar 2018 @ 2:42pm

      Re:

      I know you're joking, but I am concerned about deliberate attacks, from simple griefing (painting weird shit on roads or signs to confuse the cars' image recognition) to complex attacks on the networks by organized crime and hostile nations.

      I don't know how robust these cars are against external attacks, and I'm afraid the only way we're going to find out is the hard way.

  • That Anonymous Coward (profile), 19 Mar 2018 @ 2:03pm

    And the larger terror looming is that even with a human in the car able to step in if needed (something to pacify the AI-overlord fearmongers), it still happened!!!!!!

    People will be using this as a "reason" to stop the self-driving future. They will discount the fact that humans killed more people with cars than AI did as comparing apples to oranges. While they aren't the same, AI seems to be less murderous than humans.

    We are well on our way to the media-hyped future where this brand of AI is deemed more likely to kill, and to rules banning those AIs, rather than anyone noticing that the human overrode the AI in each of the cases.

    The simple thing we have learned is that even a driver who follows every rule is still no match for the human ability to ignore the world around them.

    • Toom1275 (profile), 19 Mar 2018 @ 3:12pm

      Re:

      "AI seem to be less murderous than humans"

      https://xkcd.com/1958/

      • Anonymous Coward, 20 Mar 2018 @ 7:33am

        Re: Re:

        So much this. Also the fact that to be a great thing, AI cars only need to have fewer accidents than human driven ones. If statistics went from "37,461 people a year killed in car accidents with human drivers" to "20,000 a year killed in car accidents with AI drivers. There are zero human drivers." That would be a really great thing.

        But I get the feeling, if 20,000 people were killed per year in AI driven cars, all the headlines would read "KILLER AI! It's Skynet! Hide yo wife! Hide yo kids!"

  • Anonymous Coward, 19 Mar 2018 @ 2:06pm

    My experience, which includes missing pedestrians by the thickness of a coat of paint, is that they can suddenly and unexpectedly step out in front of a vehicle. That includes a mother pushing a pushchair into the road from between two furniture vans so close together that the pushchair barely fit through the gap; she was blind to any approaching vehicle, and drivers could not see her until the pushchair crossed the line of the gap.

  • AnonCow, 19 Mar 2018 @ 2:12pm

    There is one number that matters: Pedestrian deaths per million miles driven.

    I'll put autonomous vehicles up against human-driven vehicles any day of the week.

    And even the pedestrian deaths by AV will be found to be almost 100% the pedestrian's fault, versus human-driven vehicles.

    • Scote, 19 Mar 2018 @ 2:18pm

      The goal is higher than that

      "And even the pedestrian deaths by AV will be found to be almost 100% pedestrian fault v. self-driven vehicle"

      That doesn't mean they were unavoidable, though, even if the vehicle didn't actively chase and kill pedestrians.

      AVs have some serious potential, but based on the relative crudeness of even today's computer vision/LIDAR systems, pedestrians are still a major problem for autonomous vehicles.

      • Anonymous Coward, 19 Mar 2018 @ 2:43pm

        Re: The goal is higher than that

        Pedestrians can be a severe problem for human drivers too; they can be totally unpredictable. Just because they are looking in a shop window does not mean they won't dash out into the road when a friend calls out to them from the other side.

        • Anonymous Coward, 20 Mar 2018 @ 9:41am

          Re: Re: The goal is higher than that

          I wonder how well these autonomous vehicles will do when encountering a moose.

          • Anonymous Coward, 20 Mar 2018 @ 10:38am

            Re: Re: Re: The goal is higher than that

            Possibly better than humans, as they can still steer and brake, while a human has just had an airbag explode in their face, knocking their hands off the wheel and stunning them enough that they take time to attempt to brake.

  • Anonymous Anonymous Coward (profile), 19 Mar 2018 @ 2:16pm

    Compare and contrast

    Pedestrian Deaths By Car In Phoenix Area Last Week: 11. But One Was By A Self-Driving Uber

    I would like to see not only the police investigation reports on these 11 accidents, but also the insurance companies' reports on their investigations into these 11 accidents. It's not like I don't trust the police reports, 'cause I don't, but I want to see the differences between the two.

    In addition, I don't think we will have a complete understanding of the 'egregiousness' of the Uber car until we understand the nature of all the accidents. While no-fault might make the 'driver' liable, we all know that they are not always at fault.

    • Christenson, 19 Mar 2018 @ 2:33pm

      Dangerous Deer....

      When fast-moving objects approach animals, including the human animal, it's entirely possible for the animal to deliberately or accidentally force an accident.

      Whether it's the kid running into traffic between two cars (or out of a crowd on the sidewalk) or the deer deciding to jump in front of a car, or even into a stationary car, it's one of those things that every driver should be aware is a potential accident situation.

      Now, I do hope Uber isn't going to try to game its way out of this. Hopefully there's video footage to be examined; that's certainly a reasonable expectation for an autonomous vehicle.


  • identicon
    Anonymous Hero, 19 Mar 2018 @ 2:22pm

    > Yet whenever accidents (or near accidents) occur, they tend to be blown completely out of proportion by those terrified of (or financially disrupted by) an automated future.

    Someone was killed. It doesn't matter whether the driver was a human, software, or software that could be overridden by a human.

    (If anything, perhaps collisions and deaths involving cars with only human operators present are blown completely under proportion.)

    Part of the problem is accountability. With a human driver, it's generally straightforward to blame the human (of course, if the car accelerates on a whim without human intervention or whatever, that's a different story).

    With software-driven cars, do you blame the technician in the driver's seat who may be able to override the software pilot? Do you blame the software developers? It can become very difficult to place blame. This is not just a matter of playing the "blame game". The concept of "accountability" has a positive reason for existing, which is to identify the problem so it can be fixed. It becomes more difficult to solve a problem if the source of the problem is difficult to determine.


    • identicon
      Thad, 19 Mar 2018 @ 2:36pm

      Re:

      With software-driven cars, do you blame the technician in the driver's seat who may be able to override the software pilot? Do you blame the software developers?

      Yes, if only there were some sort of legal entity whose purpose was to represent both the technician and the developers and assume liability for any accidents they caused.


      • identicon
        Anonymous Hero, 19 Mar 2018 @ 4:00pm

        Re: Re:

        Sorry I'm missing this. Please educate me.


        • identicon
          Thad, 19 Mar 2018 @ 4:31pm

          Re: Re: Re:

          I'm talking about corporations.

          It doesn't matter whether the accident was caused by the person behind the wheel or a defect in the software. In either one of those cases, the legally liable party is Uber, the company.


          • identicon
            Anonymous Coward, 19 Mar 2018 @ 4:38pm

            Re: Re: Re: Re:

            What about if the accident was caused by the pedestrian?


            • identicon
              Anonymous Coward, 19 Mar 2018 @ 11:36pm

              Re: Re: Re: Re: Re:

              I can only speak for my own country, but here the fault would be on the driver by default, as they are controlling the larger vehicle. It can then be determined that the accident was unavoidable, and fault removed from the driver.
              I must admit that I do not know the code well enough to say for certain, but I don't think there is any way here that fault or blame can ever be placed on the pedestrian.


      • identicon
        Anonymous Coward, 20 Mar 2018 @ 9:42am

        Re: Re:

        How's my programming?

        Dial 1-800-eat-shit


    • identicon
      Anonymous Coward, 19 Mar 2018 @ 4:40pm

      Re:

      With software-driven cars, do you blame the technician in the driver's seat who may be able to override the software pilot?

      You mean the official "safety driver", who had one job?

      Accountability will be difficult when the tech becomes generally available and doesn't need to be monitored by humans. But at present, we have a person who is there for the purpose of being accountable.


    • identicon
      Anonymous Coward, 19 Mar 2018 @ 6:48pm

      Re:

      With a fully implemented system, I guess it would fall to the car owner, barring technical issues with the software, which would then be grounds for a massive lawsuit or fines from the government.

      In short, self-driving cars would be treated like a tool, with the car owner being the one responsible for it.

      As an analogy: if your brakes are broken and you kill someone because you can't stop your car, you have to show that the car was broken and that you weren't negligent with the car's maintenance.

      At that point, depending on the legal system, the car maker or shop would be held responsible either directly (by being sued) or indirectly, with you (or your insurance company) paying for the damages and then suing the shop or maker.

      I mean, responsibility is the last thing they will leave behind.


  • identicon
    Thad, 19 Mar 2018 @ 2:33pm

    I live in Tempe. I see a few self-driving Ubers each day on my commute.

    They're a little slow at intersections, I've never seen one on the freeway, and my wife says she saw one run a red light once. But on the whole, I trust them more than I trust Phoenix drivers.

    This is a tragedy. I'm no fan of Uber as a company, but they appear to be responding correctly so far. We don't know who was at fault yet; I'll wait to hear what they find out in the investigation.

    But, crass as it is to reduce a human life to a statistic, self-driving cars have killed fewer people than human drivers have in a comparable number of hours in the same area.


  • This comment has been flagged by the community.
    identicon
    Anonymous Coward, 19 Mar 2018 @ 2:45pm

    Gee whiz, what a surprise... I wondered what the odds of someone at TechDirt would use that forum to try sweep a fatality under the rug because that fatality affects google's bottom godless dollar line. And yep - sure as shit, and right on cue, someone at TechDirt is doing exactly that... trying to sweep a fatality under the rug as if it means "nothing". Perhaps an aluminum baseball bat to the side of your cranium would mean nothing to certain people. I know if I saw you go down in that fashion it would literally mean nothing to me.


    • icon
      Mike Masnick (profile), 19 Mar 2018 @ 3:02pm

      Re:

      someone at TechDirt is doing exactly that... trying to sweep a fatality under the rug as if it means "nothing".

      Nothing in this post does that. What it does is put it into context -- and notes that if you really do care about lives, we should be pushing for making the technology better, faster, so that it can save more lives of all the people killed by cars in other instances.

      But, of course, you'd have to read the post, and not just be looking for some bullshit fake way to attack us to understand that... and I guess that's too hard for some people.


    • identicon
      Anonymous Coward, 19 Mar 2018 @ 3:40pm

      Re:

      “It would mean literally nothing to me.”
      And yet you took the time out of your busy day of sucking dicks to tell us that...


  • identicon
    @b, 19 Mar 2018 @ 2:47pm

    Mathematical Error: Apples and Oranges

    1. Human-drivers in Pheonix weren't 10 times more dangerous--unless there was the same number of self-drive vehicles.

    2. The "merely one death" and "only rear-ended at a red light" defense is a absolutely inhumane argument.

    People are not scared of the road-death statistics. They are scared of the robotic approach to trolley problems.

    They don't care whether the AI is doing the moral calculation or the engineer.

    They want to do the moral calculation. Like what they do now. When they drink and drive and text whilst driving.

    That is the part that worries me. The two sides have no care for me and my family. They only fear, self-servingly.


    • identicon
      Thad, 19 Mar 2018 @ 4:18pm

      Re: Mathematical Error: Apples and Oranges

      1. Human-drivers in Pheonix

      It's Phoenix.

      weren't 10 times more dangerous--unless there was the same number of self-drive vehicles.

      The phrase "10 times more dangerous" does not appear anywhere in the article.

      1. The "merely one death" and "only rear-ended at a red light" defense is a absolutely inhumane argument.

      "Merely one death" may be a callous way of putting it, but the goal is to reduce the number of deaths. We should always strive to make self-driving cars as safe as possible, but the question of whether or not they're ready for wide use is not "Are they perfectly safe?", it's "Are they at least as safe as human drivers?"

      I'm not sure what you mean by "only rear-ended at a red light defense" but presumably you're referring to this part of the article:

      Google has noted repeatedly that the most common accidents it sees are when drivers rear end its AI-vehicles because they actually stopped before turning right on red.

      I don't see what's inhumane about this. What they're saying is that most collisions involving autonomous vehicles are the fault of human drivers running into them.

      I would add that, in my experience, "actually stopped before turning right on red" isn't quite a fair way of describing the situation. I don't see a lot of Waymo cars (they're more concentrated in the Chandler area; I live in Tempe and work in Phoenix), but the Uber cars I've seen really do behave unexpectedly in intersections: it's not just that they stop at red lights, it's that they slow down earlier than human drivers do (and not just for red lights) and proceed very slowly through intersections. This is probably safer for pedestrians and cyclists, but I can see how it would increase the likelihood of rear-end collisions; I don't like being behind them.

      People are not scared of the road-death statistics. They are scared of the robotic approach to trolley problems.

      Nonsense.

      Most people don't even know what the trolley problem is.

      They don't care whether the AI is doing the moral calculation or the engineer.

      They want to do the moral calculation. Like what they do now. When they drink and drive and text whilst driving.

      That is the part that worries me. The two sides have no care for me and my family. They only fear, self-servingly.

      I don't think most people consider the trolley problem or look at this from a moral perspective at all. I think what people are concerned with is:

      • do I enjoy driving?
      • are self-driving cars safe?
      • are self-driving cars convenient?
      • will it save me money to use a self-driving car?
      • will self-driving cars put me out of a job?


  • identicon
    automaus, 19 Mar 2018 @ 3:02pm

    They'll be safe, just give us your infrastructure

    Autonomous vehicles will work eventually - as soon as we hand over our tax-funded highways and roads to private corporations. ; )

    In truth, I don't see these working without human operators somewhere in the mix, and therefore they aren't really autonomous*. Call me when they can operate somewhere other than a sunny, dry state under almost ideal conditions.

    * (This car had a driver. They just weren't driving.)

    ...........

    I'm waiting for my flying autonomous car. (My autonomous bike is so convenient. And those autonomous roller skates... excellent!)


    • icon
      Roger Strong (profile), 19 Mar 2018 @ 3:11pm

      Re: They'll be safe, just give us your infrastructure

      I'm waiting for a Roomba autonomous car. No RADAR or LIDAR. Just bump sensors all 'round.

      Given the local driving style, perhaps they already have.


    • identicon
      Thad, 19 Mar 2018 @ 4:27pm

      Re: They'll be safe, just give us your infrastructure

      Call me when they can operate somewhere other than a sunny, dry state under almost ideal conditions.

      That's sunny, dry city. Most of Arizona is not actually a desert.

      And...you get that the reason they're going for locations with predictable weather, flat land, and simple block layouts is that this is early testing, yes? You have to start somewhere. It would be foolish for the initial test market to be rural Appalachia, San Francisco, or even northern Arizona.

      Some five million people, however, do live in the Phoenix area. I'm one of them. I already share the road with these vehicles; for me, it's not a question of "call me when they're ready to operate in my area," because they already are operating in my area.

      And, as I've said elsewhere, from what I've seen of them I trust them more than human drivers.


  • identicon
    Anonymous Coward, 19 Mar 2018 @ 4:20pm

    Yeah, but what's the ratio of kills? Given the rarity of self-driving cars, even a single death is likely to skew the data into projecting them as more dangerous.


  • identicon
    Anonymous Coward, 19 Mar 2018 @ 4:29pm

    What Was the Human Backup Doing?

    Wasn't the point of someone sitting in that seat to use "their superior decision-making power" and make everything better?


    • identicon
      Anonymous Coward, 19 Mar 2018 @ 5:38pm

      Re: What Was the Human Backup Doing?

      I can think of three possibilities:

      1. The collision was one that a human driver couldn't have avoided. For example, someone walking into the street from between two cars.

      2. The collision could have been avoided by a human, and the driver was attentive, but the extra time it took to realize that a collision needed to be avoided and take control didn't leave enough time to actually avoid it. For example: the pedestrian steps out onto the street at T-0:05, the driver realizes that the car isn't stopping at T-0:03, takes control at T-0:01, and cannot swerve in time.

      3. The collision could have been avoided by a human, but the driver was lulled into complacency. Plainly put, the human driver just wasn't paying attention.

      Any of the three seems like a viable possibility at this point; we'll have to wait until further information comes out to know which.


  • identicon
    Anonymous Coward, 19 Mar 2018 @ 6:54pm

    I'm more worried about security for self-driving cars once full ITSs (intelligent transportation systems) are implemented, since they require car connectivity, which might be a weak link for hacking a car or a whole network.


    It's true that human drivers don't always follow the laws, are less careful, and are, in general, worse drivers than self-driving ones.

    But that's when things work properly.

    Blind them (or rather, mess up the information they get, and I'm not talking just about some silly painting) and they are worse than humans.


    You can't fool a human into thinking that he can see when he actually cannot (well, you can, but in politics and copyright). There have been cases where humans blindly went wherever their GPS told them, to the point of having an accident, but my guess is that those are the exception and not the norm.

    Fooling a computer into thinking that "everything is fine" is easier; and my guess is that such will be the base of ITS network hacks.

    Make the network believe that everything is right even if cars are being crashed left and right, keep that for a minute in a big city, and the death count will go up. Fast.

    That's if cars aren't given erroneous orders (like, you know, making them think they're on a highway and should be doing 120 kph in the middle of a city).


  • identicon
    Andrew D. Todd, 19 Mar 2018 @ 10:23pm

    The Safe Way To Cross Streets.

    I have a walking stick, four feet of steel. After twenty years, the stick has become more or less an extension of my arm, like a musketeer's sword, and I can do all kinds of ornamental flourishes without effort. When I cross streets, I punch the stick out to its full length across each lane, and I insist that each driver come to a full stop, before entering the lane myself. A lot of drivers, if not already stopped, come to a stop twenty or thirty feet from me. Naturally, they do not wish me to think that they are attempting to run me over.

    I have no experience of Uber self-driving cars, but I presume that the stick's motion would be sufficient to trigger the car's sensors.

    My major concerns have to do with cars coming around blind angles, where the driver literally cannot see more than thirty feet ahead of him, and is nonetheless driving fairly fast. There is one location where I once saw a five-way fender-bender develop in the space of two hundred feet from a standing start, when the traffic light turned green. Based on stopping distances and traffic density, a reasonable speed limit would be five miles per hour, but the posted speed limit is forty-five miles an hour. There are also places where I can sometimes amuse myself by out-walking traffic, to the humiliation of the drivers.


  • identicon
    Stephen, 20 Mar 2018 @ 2:22am

    "Millions of AI driver miles have been logged already by Google, Volvo, Uber and others with only a few major accidents."

    Um, that is potentially a misleading statement. Wired has also reported on this incident:

    https://www.wired.com/story/uber-self-driving-car-crash-arizona-pedestrian/

    but they offer a rather different version of that statistic.

    "Nearly 40,000 people died on American roads last year. Almost 6,000 of them were pedestrians—that’s more than 16 per day. ... But human drivers kill just 1.16 people for every 100 million miles driven. Waymo and Uber and all the rest combined are nowhere near covering that kind of distance, and they’ve already killed one."

    I notice Techdirt offered no actual numbers for its own claim.

    For the record, TheVerge in a November 2017 article claimed Waymo had logged 4 million miles:

    https://www.theverge.com/2017/11/28/16709104/waymo-self-driving-autonomous-cars-public-roads-milestone

    BTW, according to Arstechnica just this February:

    https://arstechnica.com/cars/2018/02/waymo-now-has-a-serious-driverless-car-rival-gms-cruise/

    Waymo is way ahead of the competition (e.g. Google) in terms of actual mileage racked up by its automated cars.

    Bearing that in mind, we come to another statistic: the one used in the title of this Techdirt article ("Pedestrian Deaths By Car In Phoenix Area Last Week: 11. But One Was By A Self-Driving Uber"), which is also potentially misleading.

    There may have been more deaths from human-driven cars in Phoenix over the week in question, but there are also far more human-driven cars in Phoenix over that same period than automated ones, and their mileage is far greater, whether one considers all the manufacturers in one lump or just Uber by itself.

    Which raises a question: just how many millions of miles HAD Uber's automated cars racked up prior to the accident?

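Stephen's point about mileage denominators can be made concrete with some quick arithmetic. This is an illustrative sketch only: the 1.16 baseline is the human-driver rate quoted from Wired above, and the fleet mileages are assumptions for illustration (only the roughly 4-million-mile Waymo figure is sourced; combined industry totals weren't reported).

```python
# Naive fatality-rate comparison, scaled to deaths per 100 million miles.
# The human baseline is the Wired/NHTSA figure quoted in the comment above;
# the candidate fleet mileages are assumptions, not reported totals.

HUMAN_DEATHS_PER_100M_MILES = 1.16

def deaths_per_100m_miles(deaths, miles):
    """Scale an observed death count to a rate per 100 million miles."""
    return deaths / miles * 100_000_000

# One pedestrian death against a few assumed autonomous-fleet mileages:
for fleet_miles in (4_000_000, 10_000_000, 20_000_000):
    rate = deaths_per_100m_miles(1, fleet_miles)
    print(f"{fleet_miles:>12,} miles -> {rate:5.1f} deaths per 100M miles "
          f"({rate / HUMAN_DEATHS_PER_100M_MILES:.1f}x the human rate)")
```

Even at the largest assumed mileage, the single death dominates the naive rate, which is the sense in which headline per-mile comparisons are premature either way.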

  • icon
    Berenerd (profile), 20 Mar 2018 @ 4:03am

    Just a note...

    FYI, according to the police report, there was a human override in the car: someone there specifically to stop the car and prevent this. They didn't. It will be interesting to see the dash-cam footage and what the human driver saw.


    • identicon
      Anonymous Coward, 20 Mar 2018 @ 4:12am

      Re: Just a note...

      I heard earlier that the human driver didn't see the pedestrian until the collision alert went off. Thus, IMO, this indicates human error: the pedestrian was distracted and walked out in front of the car before it (or the human driver) had time to react.


      • identicon
        Anonymous Coward, 20 Mar 2018 @ 4:19am

        Re: Re: Just a note...

        The police have already said that the car and its driver were likely not at fault. See the link in the thread above.


      • identicon
        Stephen, 20 Mar 2018 @ 5:17am

        Re: Re: Just a note...

        Anonymous Coward: "Thus, IMO, this indicates that this is human error; the pedestrian was distracted and walked out in front of the car before it (or the human driver) had time to react."

        So now it's the victim's fault?

        Your theory only works if the pedestrian was so close to the car when she stepped off the kerb that the car had no time to react; and none of the accounts I have read claim that. Instead they point out that the car did not slow, which in turn implies that the pedestrian was a fair distance from the car when she stepped off the kerb.


        • identicon
          Anonymous Coward, 20 Mar 2018 @ 5:33am

          Re: Re: Re: Just a note...

          Follow the links in the comment above yours; the police are indicating that neither the software nor the driver was at fault, but rather that the pedestrian stepped out when the car was close, and was invisible until then due to shadows.


        • identicon
          Andrew D. Todd, 20 Mar 2018 @ 7:33am

          Re: Re: Re: Just a note...

          The safety driver turns out to be a convicted armed robber, who has already served time in prison. The victim turns out to be a mentally disturbed homeless person, living in a homeless camp, apparently having run away from her husband. The bicycle was being used in lieu of a shopping cart, with various bags suspended from it.

          It is quite probable that the victim was behaving erratically, but it is also quite likely that the safety driver was making no effort whatsoever to observe the margins of the road. The safety driver would not be a very good witness in court, obviously. Uber would be well advised to settle, and quickly.

          The Tempe police seem to be taking the view that "what's a homeless person, more or less." The locale appears to be more or less on the Arizona State University campus, and, no doubt, the police chase homeless people away on an ongoing basis.

          Self-driving Uber vehicle strikes, kills 49-year-old woman in Tempe
          Ryan Randazzo, Bree Burkitt and Uriel J. Garcia. Published 10:13 a.m. MT March 19, 2018 | Updated 8:26 p.m. MT March 19, 2018
          https://www.azcentral.com/story/news/local/tempe-breaking/2018/03/19/woman-dies-fatal-hit-strikes-self-driving-uber-crossing-road-tempe/438256002/


          • identicon
            Thad, 20 Mar 2018 @ 8:04am

            Re: Re: Re: Re: Just a note...

            Thanks for the link.

            I wouldn't describe Mill and Curry as "more or less on the ASU campus", but it's not far from campus. There are some ASU-owned buildings up the road, but the edge of the campus proper is about a mile and a half away.


  • icon
    Ed (profile), 20 Mar 2018 @ 8:57am

    The pedestrian in this case was a woman in dark clothing on a dark night pushing a bicycle into the street 60 feet from a crosswalk, 60 feet from the street lighting at that crosswalk. I dare say no human driver would have missed her, either. In fact, a human driver likely would have hit her much harder as they'd have been speeding, as most people on that stretch do. The Uber car obeyed the speed limit. All this hand-wringing and drama over it being an Uber car is disingenuous. Where's the attention on the other pedestrians hit/killed daily by human drivers?


  • icon
    John85851 (profile), 20 Mar 2018 @ 10:03am

    News from 100 years ago

    This just in: an automobile has killed a pedestrian. How can we let automobiles on the road when horse-drawn carriages are so much safer? A horse knows when a person is crossing the street and can stop, but those "horseless carriages" can not.

    I just don't trust this new technology. Won't someone please stop Mr Ford before his "automobile company" gets too much bigger?


  • icon
    araybold (profile), 20 Mar 2018 @ 11:34am

    If You Want to Make a Rational Argument

    It is somewhat ironic that you should complain of things being blown out of proportion when you did not include the appropriate factor of proportionality for the argument started in your title: in the context established by what you have said so far, that would be the ratio of miles driven manually versus autonomously in that timespan and in that geographical region.

    I am strongly in favor of autonomous vehicle development advancing as fast as is safely possible, but I am averse to bogus and sloppy arguments.


    • identicon
      Thad, 20 Mar 2018 @ 2:46pm

      Re: If You Want to Make a Rational Argument

      Except that, as I noted above, 1 fatality out of *n* miles driven is not enough to build a reliable statistical model. We simply don't have enough data to compare the safety of autonomous vehicles to human drivers based on fatalities per mile driven.


      • identicon
        Christenson, 20 Mar 2018 @ 5:34pm

        Re: Re: If You Want to Make a Rational Argument

        Thad:
        1 fatality out of "N" miles driven is, unfortunately, the best estimate going for autonomous cars. Yes, the variance is huge, and, more importantly, it's very unlikely that Uber or Waymo will leave their systems alone, so next year's statistics will be different.

        Now, if the Phoenix area is killing 11 pedestrians a week, it should be possible to get a reasonably good estimate of the hazard rates in the Phoenix area.

        Finally, "per mile driven" is also really crude. Accidents typically happen at intersections, and when certain other opportunities present themselves...like pedestrians in the road, or other cars to run into, or late at night when there's a bunch of impaired drivers.


        • identicon
          Thad, 21 Mar 2018 @ 11:07am

          Re: Re: Re: If You Want to Make a Rational Argument

          1 fatality out of "N" miles driven is, unfortunately, the best estimate going for autonomous cars.

          I don't think that it is.

          I'll reserve final judgement until after the final report, but early indications are that the car was not at fault. Given that we have exactly one case to draw conclusions from, it makes a whole lot more sense to accept the findings in that single specific case than to try and extrapolate a trend.

          There's no trend. A thing happening one time is not a trend. You can't draw a line from a dot.

          Now, if the Phoenix area is killing 11 pedestrians a week, it should be possible to get a reasonably good estimate of the hazard rates in the Phoenix area.

          Certainly. We can draw conclusions about how dangerous human drivers are in general, how dangerous certain intersections are, etc. We've got plenty of data on those things. But we've still got a pretty small dataset for self-driving cars in general, and a smaller one for Tempe-area Ubers in particular.

          Finally, "per mile driven" is also really crude. Accidents typically happen at intersections, and when certain other opportunities present themselves...like pedestrians in the road, or other cars to run into, or late at night when there's a bunch of impaired drivers.

          Right. There are a lot of variables to control for. (And I think I already noted somewhere in the thread that I've never seen a self-driving Uber on a freeway or in a storm.) Distance driven is a limited and crude one, but it's useful for illustrating just how small our data size is at this time.

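Thad's one-data-point argument can be quantified with an exact (Garwood) Poisson confidence interval. A minimal sketch, purely illustrative and not from the thread, assuming deaths are Poisson-distributed over miles driven:

```python
# Exact (Garwood) 95% confidence interval for a Poisson mean after
# observing k events, using plain bisection so no SciPy is needed.
import math

def poisson_cdf(lam, n):
    """P(X <= n) for X ~ Poisson(lam)."""
    return sum(math.exp(-lam) * lam**i / math.factorial(i) for i in range(n + 1))

def poisson_ci(k, alpha=0.05, cap=50.0):
    """Exact two-sided (1 - alpha) interval for the Poisson mean given k events."""
    def bisect(too_small, lo, hi):
        for _ in range(100):
            mid = (lo + hi) / 2
            lo, hi = (mid, hi) if too_small(mid) else (lo, mid)
        return (lo + hi) / 2
    # Lower bound: the lam at which P(X >= k) rises to alpha/2 (0 when k == 0).
    lower = 0.0 if k == 0 else bisect(lambda l: 1 - poisson_cdf(l, k - 1) < alpha / 2, 0.0, cap)
    # Upper bound: the lam at which P(X <= k) drops to alpha/2.
    upper = bisect(lambda l: poisson_cdf(l, k) > alpha / 2, 0.0, cap)
    return lower, upper

low, high = poisson_ci(1)
print(f"after 1 observed death, 95% CI for expected deaths: {low:.3f} to {high:.3f}")
```

After observing one event, the 95% interval for the expected count runs from roughly 0.025 to 5.57. Spread over an assumed 4 million fleet miles, that is anywhere from about 0.6 to 139 deaths per 100 million miles, a range that easily contains the 1.16 human rate quoted elsewhere in the thread, so a single fatality genuinely cannot distinguish the two hypotheses.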

          • identicon
            christenson, 21 Mar 2018 @ 11:49am

            Re: Re: Re: Re: If You Want to Make a Rational Argument

            When I said *best*, I meant in a mathematical sense. In this particular accident, I actually think Uber did about as well as an average human driver... but color me unsurprised if it eventually turns out that Phoenix pedestrians and deer are *safer* with Uber behind the wheel.


            • identicon
              Thad, 21 Mar 2018 @ 11:57am

              Re: tl;dr

              but color me unsurprised if eventually it turns out that Phoenix pedestrians and deer get to being safer with Uber behind the wheel.

              You and me both.

              About 8 hours before this accident occurred, and about two miles east of where it happened, a car cut me off and I had to slam my brakes so hard that my dog fell off the backseat and onto the floor.

              That wasn't an autonomous car; it was some asshole.


      • icon
        araybold (profile), 21 Mar 2018 @ 7:18am

        Re: Re: If You Want to Make a Rational Argument

        If that is so, then the author should not have started down that road.


        • identicon
          Thad, 21 Mar 2018 @ 10:57am

          Re: Re: Re: If You Want to Make a Rational Argument

          I don't think he did.

          I read the headline as "the media are focusing on the dangers of self-driving cars while accepting the dangers of manually-driven cars as a given." I don't see anywhere in the headline or the article where Karl (or whoever wrote the headline; I know headlines and articles aren't always written by the same person) says anything resembling "autonomous cars are 10 times as safe as manually-driven ones."


          • icon
            araybold (profile), 22 Mar 2018 @ 4:29am

            Re: Re: Re: Re: If You Want to Make a Rational Argument

            In order to make this claim, you have to read the headline as something other than what was actually written.


            • identicon
              Thad, 22 Mar 2018 @ 10:03am

              Re: tl;dr

              You're right. How silly of me to have read it as "Pedestrian Deaths By Car In Phoenix Area Last Week: 11. But One Was By A Self-Driving Uber" instead of "Pedestrian Deaths By Car In Phoenix Area Last Week: 11. But One Was By A Self-Driving Uber, Which Means Human-Driven Cars are 10 Times as Dangerous as Autonomous Ones."

                araybold (profile), 22 Mar 2018 @ 11:34am

                Re: Re: tl;dr

                What you claim to have read it as seems to change with every post:

                'I read the headline as "the media are focusing on the dangers of self-driving cars while accepting the dangers of manually-driven cars as a given."'

                Why don't you re-read my original post with the actual title firmly in mind, and see if you have anything relevant to say.

                  Thad, 22 Mar 2018 @ 12:52pm

                  Re: Re: Re: tl;dr

                  What you claim to have read it as seems to change with every post:

                  'I read the headline as "the media are focusing on the dangers of self-driving cars while accepting the dangers of manually-driven cars as a given."'

                  Oh, now I get it. You're being disingenuous.

                  I read the headline and I interpreted it.

                  You read the headline and you interpreted it.

                  When you interpret the headline to mean "autonomous cars are 10 times as safe as manually-driven ones," your interpretation is objectively correct, even though those words do not appear in the headline.

                  When I interpret the headline to mean "the media are focusing on the dangers of self-driving cars while accepting the dangers of manually-driven cars as a given," I am "read[ing] the headline as something other than what was actually written."

                  Of course. How silly of me.

                  Why don't you re-read my original post with the actual title firmly in mind, and see if you have anything relevant to say.

                  You'll find, if you scroll up just a hair, that Christenson and I had a discussion about statistically meaningful data and other relevant variables besides the number of miles driven. I think it was productive and informative. And I think the reason that it was productive and informative is that when Christenson responded to me, he actually engaged with the points I made instead of just doubling down on nitpicking about perceived flaws in the headline.

                  But sure, why not, I'll go back and reread your first post in light of the conversation we've had since.

                  It is somewhat ironic that you should complain of things being blown out of proportion when you did not include the appropriate factor of proportionality for the argument started in your title, which would be, in the context established by what you have said so far, the ratio of miles driven manually versus autonomously in that timespan and in that geographical region.

                  Hm, okay then.

                  It is somewhat ironic that you should lecture somebody else about proportionality and then spend four posts arguing about a headline without ever engaging the substance of the article that appears underneath that headline.

                  It is considerably more ironic that you should close out the fourth such post with a crack about how I don't have anything relevant to say.

    charliebrown (profile), 21 Mar 2018 @ 12:44pm

    XKCD

    They covered this already a couple of weeks ago: https://xkcd.com/1958/

