Should Police Have The Right To Take Control Of Self-Driving Cars?

from the kill-switch dept

As Google, Tesla, Volvo, and other companies make great strides with their self-driving car technology, we've started moving past questions about whether the technology will work, and started digging into the ethics of how it should work. For example, we recently discussed whether or not cars should be programmed to sacrifice their own driver if it means saving the lives of countless others (like a number of children on a school bus). Programmers are also wrestling with how to make vehicles obey all the rules -- yet still account for highway safety's biggest threat: shitty human drivers.

But another key question recently reared its head in discussions of what this brave new self-driving world will look like: just how much power should law enforcement have over your self-driving vehicle? Should law enforcement be able to stop a self-driving vehicle if you refuse to? That question was buried in an otherwise routine RAND report (pdf), which posits a number of theoretical situations in which law enforcement might find the need for some kind of automobile kill switch:
"The police officer directing traffic in the intersection could see the car barreling toward him and the occupant looking down at his smartphone. Officer Rodriguez gestured for the car to stop, and the self-driving vehicle rolled to a halt behind the crosswalk."
Commissioned by the National Institute of Justice, the RAND report is filled with benign theoreticals like this, and while it briefly discusses some of the obvious problems created by giving law enforcement (and by proxy intelligence agencies) this type of power over vehicle systems and data, it doesn't offer many solutions. As parts of the report make clear, having immediate access to driver and vehicle history and data is an incredibly sexy concept for law enforcement:
"Imagine a law enforcement officer interacting with a vehicle that has sensors connected to the Internet. With the appropriate judicial clearances, an officer could ask the vehicle to identify its occupants and location histories. … Or, if the vehicle is unmanned but capable of autonomous movement and in an undesirable location (for example, parked illegally or in the immediate vicinity of an emergency), an officer could direct the vehicle to move to a new location (with the vehicle’s intelligent agents recognizing “officer” and “directions to move”) and automatically notify its owner and occupants."
Yes, because if the history of intelligence and law enforcement is any indication, the "appropriate judicial clearances" are of the utmost importance. Thanks to what will inevitably be a push for backdoors to this data, we'll obviously be creating delicious new targets for hackers -- who've already been poking holes in the papier-mâché-grade security currently "protecting" vehicle electronics. The report does briefly acknowledge this "risk to the public’s civil rights, privacy rights, and security," but as we've seen time and time again, such concerns are a footnote in the expansion of surveillance authority.

We already live in an age where the consumer doesn't have the ability to control or modify their own vehicle's electronics courtesy of DRM and copyright, and self-driving cars are already going to be a tough sell for many people from a liberty and personal freedom perspective. Giving law enforcement the ability not only to snoop on vehicle data but to take direct control of your vehicle is a conversation we should start having sooner rather than later.

Thank you for reading this Techdirt post. With so many things competing for everyone’s attention these days, we really appreciate you giving us your time. We work hard every day to put quality content out there for our community.

Techdirt is one of the few remaining truly independent media outlets. We do not have a giant corporation behind us, and we rely heavily on our community to support us, in an age when advertisers are increasingly uninterested in sponsoring small, independent sites — especially a site like ours that is unwilling to pull punches in its reporting and analysis.

While other websites have resorted to paywalls, registration requirements, and increasingly annoying/intrusive advertising, we have always kept Techdirt open and available to anyone. But in order to continue doing so, we need your support. We offer a variety of ways for our readers to support us, from direct donations to special subscriptions and cool merchandise — and every little bit helps. Thank you.

–The Techdirt Team

Filed Under: autonomous vehicles, law enforcement, police, remote control, self-driving cars, surveillance


Reader Comments



  • Mason Wheeler (profile), 9 Sep 2015 @ 2:15pm

    The police officer directing traffic in the intersection could see the car barreling toward him and the occupant looking down at his smartphone. Officer Rodriguez gestured for the car to stop, and the self-driving vehicle rolled to a halt behind the crosswalk.

    Why is this even worthy of special consideration? Any self-driving vehicle worth actually being sold on the consumer market would detect and stop for a pedestrian in the crosswalk anyway, so why would it have to do anything different for a traffic cop?

    • Sheogorath (profile), 9 Sep 2015 @ 3:27pm

      Re:

      Well, given the vehicle's choice between ploughing into a bus full of children or the armed man who has the power to hurt people, wouldn't you rather the car hit the prick in the uniform even if he's your son? ;)

    • JoeCool (profile), 9 Sep 2015 @ 4:16pm

      Re:

      That example plays on the fear that self-driving cars cannot deal with anything out of the ordinary - in this case, an intersection that, for whatever reason, has a cop directing traffic instead of traffic lights. It's basically ignorance (willful in many cases) of how much work goes into these self-driving cars to deal with exactly the situation described.

      They describe a fear (that's probably already handled by the car) and then state that this fear will become reality unless [insert pet legislation here]. They wish to create a link between the two in the minds of the ignorant masses.

      • Anonymous Coward, 9 Sep 2015 @ 5:10pm

        Re: Re:

        This pretty much. They don't want the ability to stop self driving cars for the mundane circumstances such as police directing traffic. Any self-driving car worth using should be able to account for pedestrians in the street, or a human in the street performing traffic control duties.

        What they really want is to be able to stop self-driving capable cars in all circumstances, even if a human is driving it instead of the car. Even if they have no right to detain you. Suspect fleeing in an area? Just stop all the cars and make sure they have to flee on foot. Decide you don't care to go through a police checkpoint and turn around to go a different way? Well they can't have that, stop the car and see why they don't want to go through a check point. Etc., etc.

        Basically, just programming a self-driving car to obey existing laws should mean a car that's driving itself stops if a cop signals it to pull over. The only reason to provide some sort of override is for when a human is driving, or for circumstances where cops currently aren't allowed to stop everyone.
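The point made above -- that "obey existing laws" already covers police stops inside the car's own planner, with no remote channel at all -- could be sketched roughly like this. This is purely illustrative; all names are hypothetical and not taken from any real self-driving stack:

```python
# A car that simply obeys existing traffic law already stops for police
# signals, using only its own sensors. Note there is no remote
# kill-switch input anywhere in this design.
from dataclasses import dataclass

@dataclass
class Perception:
    officer_signaling_stop: bool  # traffic officer waving this lane to stop
    siren_behind: bool            # emergency lights/siren detected behind us

def plan_next_action(p: Perception) -> str:
    """Apply the same rules an attentive, law-abiding human driver follows."""
    if p.officer_signaling_stop:
        return "stop_before_crosswalk"
    if p.siren_behind:
        return "pull_over_when_safe"
    return "continue_route"
```

The design choice here is the whole argument: the stop behavior is driven by what the car perceives, not by a command anyone can send it.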

    • nasch (profile), 9 Sep 2015 @ 8:49pm

      Re:

      Any self-driving vehicle worth actually being sold on the consumer market would detect and stop for a pedestrian in the crosswalk anyway, so why would it have to do anything different for a traffic cop?

      Stopping when there's someone standing in the intersection is not too bad. Recognizing that that person is a police officer directing traffic and is signaling your lane to go sounds a lot harder. I wonder if anyone has solved that one.

      • Thrudd, 9 Sep 2015 @ 9:20pm

        Re: Re:

        I think it was either stop signs or traffic lights or that one flashing amber light.
        The real question, though, is why is an LEO in the middle of an intersection in the first place? Part of self-driving vehicles is being able to communicate and coordinate with all the others nearby.
        I am fairly certain that even the Skoda will rank much higher in intelligence and reaction time than some meat sack standing on a paint bucket.

        • nasch (profile), 10 Sep 2015 @ 6:44am

          Re: Re: Re:

          The real question though is why is a LEO in the middle of an intersection in the first place? Part of self driving vehicles is being able to communicate and coordinate with all the others nearby.

          When 100% of vehicles are self-driving, yes, but there will be a very long transition period when there are both autonomous and human-driven cars on the road together.

      • Anonymous Coward, 10 Sep 2015 @ 12:15am

        Re: Re:

        Stopping when there's someone standing in the intersection is not too bad. Recognizing that that person is a police officer directing traffic and is signaling your lane to go sounds a lot harder. I wonder if anyone has solved that one.


        Right now Google's self-driving car can't handle driving in rain, or anywhere other than the heavily mapped streets it's tested on. And last I heard, pedestrians were basically just moving pillars to it. It can't pick out a police officer from other people, much less tell what gestures they're making. The technology is simply much farther from being practical than its proponents like to admit.

        That's why arguments for a kill switch like this one can sound plausible. Self-driving cars currently can't look at a cop and obey gestures like a law-abiding human would, so "obviously" cops need some other sort of kill switch to stop the car. That the real obvious solution is "don't allow mass production and sale of self-driving cars until they are capable of obeying all signals an attentive, law-abiding human would obey" is something they'd prefer not to discuss, as it doesn't let them get their foot in the door.

        • Anonymous Coward, 10 Sep 2015 @ 2:01am

          Re: Re: Re:

          It can't pick out a police officer from other people, much less tell what gestures they're making.


          Not true.
          https://youtu.be/dk3oc1Hr62g?t=62

          • nasch (profile), 10 Sep 2015 @ 6:49am

            Re: Re: Re: Re:

            That's pretty cool. That's probably easier than detecting a signaling cop, but it's impressive.

          • Anonymous Coward, 10 Sep 2015 @ 9:25am

            Re: Re: Re: Re:

            I'm going by articles like this one:

            http://www.technologyreview.com/news/530276/hidden-obstacles-for-googles-self-driving-cars/

            which includes this paragraph:

            "Pedestrians are detected simply as moving, column-shaped blurs of pixels—meaning, Urmson agrees, that the car wouldn’t be able to spot a police officer at the side of the road frantically waving for traffic to stop."

            Urmson being Chris Urmson, director of the Google car team. So as of early last fall, the guy in charge of Google's self-driving cars acknowledged that no, the cars couldn't pick out a police officer, much less tell that they were waving for the car to stop.

            • nasch (profile), 10 Sep 2015 @ 11:56am

              Re: Re: Re: Re: Re:

              Clearly there's some conflict between "Pedestrians are detected simply as moving, column-shaped blurs of pixels" and the video demonstrating recognition of a bicyclist making an arm signal. Unless the car has different algorithms to handle bikes.

              I'm a little concerned about their approach with these cars after reading that article. "If a new stop light appeared overnight, for example, the car wouldn’t know to obey it." So if you own a Google car, you're not only trusting the car's programming, you're also trusting the map database to be up to date. I would rather have a car that can recognize a traffic light (and lots of other things) without knowing it's there ahead of time.

              • Anonymous Coward, 10 Sep 2015 @ 12:23pm

                Re: Re: Re: Re: Re: Re:

                It says right in the video you linked to "Our cars treat cyclists as a special category of moving object." So yes, it does handle bikes differently than people walking.

                Your concern would be an example of why I said the technology is farther from being practical than its proponents like to acknowledge. The sort of tests Google's doing demonstrate that it's possible. Real-world practicality, under the conditions cars are currently used in, is a much different matter. Especially without re-engineering the roads to specifically accommodate self-driving cars.

                The bottom line is that the technology is young, there are still tons of issues to solve, and dealing with some of them will likely break previously existing solutions to others. That's not a quick process.

      • Sheogorath (profile), 10 Sep 2015 @ 12:47pm

        Re: Re:

        People generally don't stand in the middle of the road, wearing large white gauntlets and waving white batons in a series of pre-set signals. Given that description, I don't think it's too hard to teach a self-driving vehicle the difference between ordinary pedestrians and traffic cops.

        • nasch (profile), 10 Sep 2015 @ 6:00pm

          Re: Re: Re:

          People generally don't stand in the middle of the road, wearing large white gauntlets and waving white batons in a series of pre-set signals. Given that description, I don't think it's too hard to teach a self-driving vehicle the difference between ordinary pedestrians and traffic cops.

          Traffic cops aren't always in the middle of the road, don't always wear gloves, and don't always have a baton.

    • Anonymous Coward, 10 Sep 2015 @ 6:25am

      Re:

      That's the problem with a lot of the scare ideas about self-driving cars: they aren't realistic at all. Same with the car sacrificing its passengers to save other people; that idea is even stupider.

      Also, a properly programmed self-driving car should pull over if a cop is after it with sirens blaring. A cop shouldn't need to hack into the car to make it pull over. Allowing cops to do that, even with a warrant, would make the cars much less safe.

      The real concern, if any, should be people who illegally purchase police equipment to make their car look kind of like a cop car, and self-driving cars behaving as if they were cops.

  • Anonymous Coward, 9 Sep 2015 @ 2:16pm

    I would rather give up driving altogether

  • Anonymous Coward, 9 Sep 2015 @ 2:20pm

    The problem is that if the police have the ability to take remote control of a car, then the bad guys will also have that capability, along with secret services. Therefore the car should normally avoid collisions with anything, and obey hand signals and visible signal lights.
    More important is giving the occupants the ability to override the automatic system, so that, for example, they can force the car to drive away from, or even through, a hostile crowd. Without that option, it becomes easy for gangs to hijack vehicles: spread out across the road to force the car to stop, and then move in behind it.

    • pixelpusher220 (profile), 9 Sep 2015 @ 4:48pm

      Re:

      OnStar. They already have the ability to turn off the engine of a car with this technology, and I believe they have done so.

    • Ninja (profile), 10 Sep 2015 @ 4:50am

      Re:

      That. The occupants should be the only ones who operate the car. A network of self-driving cars is only meant to let the cars talk and act accordingly (e.g. car A wants to change lanes to take the next exit, broadcasts its intention, and other cars slow down to make space for the maneuver). Nothing should interfere with the inner systems except those inside the car. It can be limited (e.g. if we 'ban' human drivers) and only allow manual changes to the route rather than actual human driving, but it must be available only from the inside.
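The lane-change coordination described above could be sketched like this. The message format and all names are invented purely for illustration; this is not any real V2V standard:

```python
# Illustrative only: cars broadcast *intentions*, and nearby cars decide
# on their own how to react. No outside party has a command channel
# into any vehicle.
import json

def lane_change_intent(car_id: str, from_lane: int, to_lane: int) -> str:
    """Car A announces a maneuver; the message commands nobody."""
    return json.dumps({"car": car_id, "type": "lane_change",
                       "from": from_lane, "to": to_lane})

def react(own_lane: int, msg: str) -> str:
    """A nearby car yields only if the announced maneuver enters its lane."""
    intent = json.loads(msg)
    if intent["type"] == "lane_change" and intent["to"] == own_lane:
        return "slow_to_make_space"
    return "no_change"
```

The point of the design is in the split: the broadcaster only announces, and each receiver applies its own local rule.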

  • Anonymous Coward, 9 Sep 2015 @ 2:24pm

    Yes, and if someone shady got access to the aforementioned vehicle via that backdoor/killswitch, he could kill the occupants while leaving traces that are easy to clean up, if done properly.

    People forget that just because the military/police have it doesn't mean nobody else is going to have access to it.

    Yeah, that's why there are no terrorist and criminal organizations with access to automatic weapons and explosives.

    • Anonymous Coward, 9 Sep 2015 @ 4:53pm

      Re:

      But don't autonomous vehicles already have to pull off and stop in the presence of an emergency vehicle with sirens and lights on? Isn't this already an effective killswitch?

      If you've got a police car, and you're driving down the road, you should already be able to stop ALL autonomous vehicles just by turning on your siren and lights. Any autonomous vehicle that doesn't stop in this situation would be breaking current laws in most places.

      So no other killswitch is needed. A situation that would merit stopping an autonomous vehicle should also merit the current prescribed ER activities. No stopping cars without drawing due attention. Period.

    • PaultheCabDriver, 14 Sep 2015 @ 12:17am

      Re: like the government is more trustworthy

      Like the government is more trustworthy than the individual. Government is made up of fallible humans, and giving people a badge and a gun does not make them angels.

  • Anonymous Coward, 9 Sep 2015 @ 2:26pm

    First the government wanted a magic backdoor to all encryption. Now they want a golden hatchback to all self-driving cars.

    • Anonymous Coward, 9 Sep 2015 @ 2:32pm

      Re:

      I'm assuming the next step is Explosive Restraint Collars. I hope they're thoughtful enough to make 'em waterproof, otherwise we're gonna have one helluva pungent police state.

  • Lord Binky, 9 Sep 2015 @ 2:34pm

    Ah, so this is how we are going to get to the flying car scene in the Fifth Element.

    No. If the car obeys all standard traffic laws, such as pulling over for ambulances and police cars with lights and sirens on, then this is unnecessary and comes at the expense of citizens' safety.

    Seriously, are they getting so lazy that they want us to make their job entirely push-button? All the standard deterrents for a non-cooperative human operating a vehicle will still work on a self-driving vehicle, so this is little more than testing the waters for exerting more control over civilians.

    Thinking on it more, though: yes, they can have control of my car on the condition that I get control of all of theirs with no restrictions.

  • Roger Strong (profile), 9 Sep 2015 @ 2:36pm

    Police have long used OnStar to remotely cut power to stolen vehicles - when they're located and in motion.

    Granted, that was coordinating not just with OnStar, but with the real owner of the car.

    • Mason Wheeler (profile), 10 Sep 2015 @ 7:31am

      Re:

      I find that odd. If my car was stolen, and they were able to locate it, no way no how would I want them to kill power while the vehicle was in motion, because I have no idea what's behind the car or how fast it's going, and I'd prefer to get my car back in the same condition it was in when it got stolen!

      I would, however, be just fine with saying "kill the engine the next time it stops at a red light."

      • Anonymous Coward, 10 Sep 2015 @ 9:03am

        Re: Re:

        I would, however, be just fine with saying "kill the engine the next time it stops at a red light."

        That is only acceptable if whoever sends the kill command is absolutely sure that the car is in a safe place to stop, and not in the middle of a junction, blocking a pedestrian crossing, or on a level crossing. That requires that they can see the car, and that implies an Orwellian surveillance system.

        • Mason Wheeler (profile), 10 Sep 2015 @ 12:13pm

          Re: Re: Re:

          ...or a simple GPS and a computer to plot the coordinates on a street map.

          • Anonymous Coward, 10 Sep 2015 @ 12:37pm

            Re: Re: Re: Re:

            GPS is not always accurate enough to detect whether you are stopped before or over a stop line. Also it cannot tell whether stopping the car where it is will block traffic because road works for example have squeezed everyone into a single lane.

  • Anonymous Coward, 9 Sep 2015 @ 2:39pm

    Holy crap ... are we really considering an actual "blue screen of death"?

  • tom (profile), 9 Sep 2015 @ 2:39pm

    If some aspect of government can take control of a self driving car, that government must also take the responsibility for the outcome, good or bad.

    In the scenario where the traffic-directing cop halts the speeding car, the cop must also assume responsibility for the outcome. What if the car was carrying a panicked parent and his small child, who managed to mostly cut off his leg with a circular saw and is now rapidly bleeding to death? The parent was letting the car drive to the nearby hospital rather than waiting for EMTs to show up. The smartphone usage was the parent calling the hospital so they could be prepping for the incoming emergency. If the delay caused by the cop causes the small child to die, the cop should own the death and any criminal or civil penalties.

    • Anonymous Coward, 9 Sep 2015 @ 2:55pm

      Re:

      Yeah, well, cops don't have to own the deaths caused by their bullets. What makes you think they'll own deaths caused by cars?

  • Anonymous Coward, 9 Sep 2015 @ 2:42pm

    Hmm, come to think of it, why would these cars even need an occupant? Let's say I need a ride home from work; why should, say, my girlfriend drive there to pick me up? Just send the car by itself. Or say there's a house fire on a street: command all the cars on it to move out of the way of oncoming fire engines...

  • DannyB (profile), 9 Sep 2015 @ 2:45pm

    The wonderful future of self driving cars!

    Suppose law enforcement (at every level) had a golden key that would allow them to send your car simple messages.
    * lock the doors
    * pull over
    * take occupants to certain location (eg, police station, or secret unofficial police torture location)

    Naturally the Federal government will want this. Even branches such as the IRS. Or National Park Service.

    All state governments will need access to this facility.

    All local governments will need this. (How many is that?)

    And of course, all of these are trustworthy. That is, all of them are the 'good guys'. Defending Truth Justice and the Corporate Way.

    And all these golden key holders will naturally use security best practices. Even Po Dunk Sticksville.

    Control of cars will never fall into the wrong hands. Nosiree.

    And this certainly would not get abused. Just as Stingray would never be abused.


    And finally this brings me to... the next ones in line for this will be the corporations. Why can't they order your car to lock the doors and drive you to their bill payment collection center?

    And I suppose copyright owners must have access to control your car because . . . PIRACY! And artists must be protected! They lose TRILLIONS of dollars a year! This is killing Hollywood. Etc.

    What if a local business could make sure your car drove you past their big sign to be sure you could see it? Maybe add a fifteen-second unskippable pause before continuing on to your destination (or the next business's unskippable advertisement).


    Self driving cars promise a wonderful future.

    • JoeCool (profile), 9 Sep 2015 @ 4:25pm

      Re: The wonderful future of self driving cars!

      Great ideas! But you aren't taking it far enough. Hollywood will (without an actual law) require your car to drive you to the theater every time a new movie comes out, then refuse to leave until you buy a ticket (don't have to watch the movie, just get a ticket).

      On the local front, the current high-bidder to the car company will get all cars using their service whether you want it or not. If the check clears, the next time out your car will automatically go to Jiffy-Lube, then the local car wash, and then the McDonald's drive thru.

    • Anonymous Coward, 10 Sep 2015 @ 2:16am

      Re: The wonderful future of self driving cars!

      And I suppose copyright owners must have access to control your car because . . . PIRACY! And artists must be protected! They lose TRILLIONS of dollars a year! This is killing Hollywood. Etc.


      Actually they lose one googol per day! That's why they hate Google so much; they're reminded of how much money they could be making.

  • Kennon, 9 Sep 2015 @ 2:49pm

    if police then....

    How much longer until "we" learn that if police can do it, the criminals can too?

  • Roger Strong (profile), 9 Sep 2015 @ 2:52pm

    Once traffic signal pre-emption - manipulating traffic signals to give emergency vehicles the right of way - was in use, illicit devices to let others do it soon followed. The same will happen with this idea.

    Still, XKCD makes the case that the ability to remotely stop your self-driving car might be a very, very good idea.

    • MikeVx (profile), 9 Sep 2015 @ 8:06pm

      Re: XKCD ref.

      A small boulder won't be able to attempt to control the vehicle if external overrides are used. The operator of the vehicle should be able to ignore any external order, regardless of the "originating authority", otherwise an autonomous vehicle becomes the most effective murder weapon/kidnap aid in the history of history.

      AVs are one place where the use of closed-source proprietary systems should be an automatic life sentence.

  • Pronounce (profile), 9 Sep 2015 @ 2:56pm

    What Could Go Wrong

    What could go wrong with giving more power to those who've already proven they'll abuse the power they have?

    • That One Guy (profile), 9 Sep 2015 @ 11:37pm

      Re: What Could Go Wrong

      Yeah, before I could even begin to think that they could be trusted with the ability they're talking about here, they'd have to show that they can be trusted with what they currently have, and so far they have failed abysmally at that, to the point that I wouldn't trust most police with a freakin' cap gun at this point.

  • Wyrm (profile), 9 Sep 2015 @ 3:09pm

    Lack of ability? really?

    We already live in an age where the consumer doesn't have the ability to control or modify their own vehicle's electronics courtesy of DRM and copyright,

    That's slightly wrong. DRM and copyright never removed anyone's "ability" to do anything. DRM makes things more difficult and copyright makes it illegal, but people still have the ability to control and modify their vehicles' (and other appliances') electronics.

    What has really been removed from the public is the right to do so... or maybe only the right to do so without risking more or less justified legal threats and actions.

  • That One Other Not So Random Guy, 9 Sep 2015 @ 3:11pm

    from the fairies and unicorns department

    Self-Driving Cars...
    This will never happen. Sorry folks.

    • Anonymous Coward, 9 Sep 2015 @ 11:16pm

      Re: from the fairies and unicorns department

      You have not been paying very much attention to the news about this, then. They passed a new law in California specifically for self-driving cars, covering how they are not to be used in the carpool lanes.

      It's here, at least in California and probably several other states.

    • PRMan, 10 Sep 2015 @ 10:30am

      Re: from the fairies and unicorns department

      In a 2014 car, I already don't have to touch the pedals at all when cruise control is on, just the steering wheel. It also parallel parks itself.

      It was already more than halfway here 2 years ago.

      I think someone doesn't get out much.

      • nasch (profile), 10 Sep 2015 @ 11:58am

        Re: Re: from the fairies and unicorns department

        It was already more than halfway here 2 years ago.

        I don't think maintaining a steady speed and/or following the car in front is more than half of what's needed to make an autonomous car. That's really the tip of the iceberg.


  • icon
    Sheogorath (profile), 9 Sep 2015 @ 3:20pm

    "The police officer directing traffic in the intersection could see the car barreling toward him and the occupant looking down at his smartphone. Officer Rodriguez gestured for the car to stop, and the self-driving vehicle rolled to a halt behind the crosswalk."
    That hypothetical is quite clearly bullshit. After all, if a vehicle is self-driving, then it's going to be travelling at a steady speed and not 'barrelling towards' anyone. Furthermore, checking your texts whilst behind the wheel of a self-driving car doesn't have the dangers inherent in doing so whilst behind the wheel of a traditional vehicle, although I wouldn't advise doing so because of the dangers of getting caught by a gun-toting cop who can't tell the difference between a Google car and a Ford Ka.


    • icon
      Uriel-238 (profile), 9 Sep 2015 @ 3:26pm

      I'd think that anyone could order a self-driving vehicle to stop

      ...By merely standing in front of it. I do suspect that all piloting software would seek to acknowledge, identify and safely circumvent obstacles, living or otherwise.

      Eventually, vehicles will be self-driving by default, and may require hazard lights in the rare event that a human has taken direct control of the vehicle.

      For now, though, yes, police are going to be confused by passengers in the driver's seat.


  • icon
    Uriel-238 (profile), 9 Sep 2015 @ 3:20pm

    On the condition that we have a working police system with proper oversight and due process

    Then yeah, it seems like it might be a good idea to allow a car to be controlled by a police officer, especially if they are able to use this power to avert hazards and citations.

    Given the police we have in the US, given the abuses of power we see already?

    I can't say Hell no! emphatically enough.


  • icon
    Padpaw (profile), 9 Sep 2015 @ 3:22pm

    I would not trust the police with a dull butter knife, let alone the military hardware they wield, which they seem to constantly decide is never enough to wage war against the citizenry, be they criminals or just in the way.


  • icon
    sigalrm (profile), 9 Sep 2015 @ 3:25pm

    Let's reframe the conversation...

    Replace every reference to "police officer", "Law Enforcement", and similar in the article above with "arbitrary third party".

    Because that's the situation you'll have in reality.


  • identicon
    Anonymous Coward, 9 Sep 2015 @ 3:27pm

    And remember that all those electronic devices are asking for an EMP/maser/whatever toy to get them fried (as if someone wouldn't come up with one). And forget about safety measures if that happens.

    At 120 km/h it will be an interesting show to watch, particularly the marks the brains of the occupants leave.


  • identicon
    Anonymous Coward, 9 Sep 2015 @ 3:36pm

    Allowing the police to sacrifice the driver is a horrible idea, ripe for abuse. Just imagine this particular scenario:

    Police officer is being sued by "John Doe" over civil rights violations. Aforementioned police officer, in an effort to get out of the lawsuit, takes control of driver's vehicle and drives it off a cliff, bridge or chasm, effectively stopping the lawsuit in its tracks.

    Last thing we need is a police officer having the ability to control our vehicle. It's a system ripe for all kinds of abuses and trust me, the system being abused is exactly what will happen.

    Every program that has been created in this country has always ended up getting abused by law enforcement and by the government.


  • identicon
    DCL, 9 Sep 2015 @ 3:55pm

    Needs a morality switch

    Answering the story's lead-in question regarding morality is easy:

    Next to the "on switch", put a "morality setting" control: Selfless vs. Selfish (for you Mass Effect fans: Paragon vs. Renegade). You have to set it before you start the car.

    In the event of a serious incident, the car uses that setting as its final decision rule:
    Selfless = Kill me instead of others.
    Selfish = Protect me at all costs.

    It should be right next to the Emergency Stop button (I am assuming there is one of those), but separate from the navigation/environment/entertainment controls.

    But no thanks for the remote kill switch. That is all kinds of bad news.
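    As a sketch only: the proposed switch amounts to a single value, fixed before the trip starts, that the crash logic consults as a last resort. All names here (`Morality`, `crash_priority`) are hypothetical illustrations, not any real vehicle API.

```python
from enum import Enum

class Morality(Enum):
    """Hypothetical pre-trip setting; not changeable while driving."""
    SELFLESS = "kill me instead of others"
    SELFISH = "protect me at all costs"

def crash_priority(setting: Morality) -> str:
    # Consulted only as the tie-breaker in an unavoidable serious incident;
    # normal driving logic never reads this value.
    if setting is Morality.SELFLESS:
        return "protect others"
    return "protect occupant"

print(crash_priority(Morality.SELFLESS))  # protect others
```

    The point of making it a separate, pre-set control is that the choice is deliberate and auditable, rather than buried in navigation or entertainment menus.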


    • identicon
      Anonymous Coward, 9 Sep 2015 @ 4:29pm

      Re: Needs a morality switch

      The ones who would really need that "morality switch" would be the policemen.

      But set it to "morality always on", please.


    • icon
      JoeCool (profile), 9 Sep 2015 @ 4:35pm

      Re: Needs a morality switch

      Courtesy of The Simpsons: Yep, here's your problem. Someone set this car to "Evil."


    • icon
      Khaim (profile), 10 Sep 2015 @ 11:02am

      Re: Needs a morality switch

      Anyone with half a brain will set this switch to the third setting:
      Smart: Avoid serious incidents.

      Of course, this option is why the people making these cars don't bother putting in the switch in the first place.


  • identicon
    Anonymous Coward, 9 Sep 2015 @ 4:03pm

    If this happens, look forward to a lot of people of interest having car accidents.


  • identicon
    Nonya, 9 Sep 2015 @ 4:30pm

    No

    What about a cop who spots a pretty girl, then uses this to rape her? Cops have been convicted of rape; it happens. They can use this to plant evidence. I have no trust in the system. It is too corrupt for me to trust.
    To the cops: if you wouldn't mess with kids as if you were a pedophile, things would be different. My friend's grandma had to yell at the cops and tell them not to pick on a child. I have witnessed some sick stuff cops do, just sick in the head. Get some professional help before you piss off the rest of law-abiding society.


  • identicon
    Anonymous Coward, 9 Sep 2015 @ 4:38pm

    I don't know what you guys are worrying about.

    These cars are going to be so expensive there probably won't be ten of them in the whole nation at a given time. And repairing (ha! replacing) those systems would break the bank.

    We shouldn't worry about them hitting each other. The odds of that would be incredibly low.


    • identicon
      Anonymous Coward, 9 Sep 2015 @ 4:49pm

      Re:

      For now, yes, but along a long enough timeline things get cheaper. Televisions used to break the bank, but nowadays I can go to Wal-Mart and buy a new one with HD, Netflix, and browsing capability for less than $300.


    • icon
      That One Guy (profile), 9 Sep 2015 @ 5:49pm

      Today is not tomorrow

      Computers used to be so incredibly expensive that only large, well funded groups could have even 'simple' ones to work with. Now even kids carry around phones with more computing power than those original computer designers could have ever dreamed of.

      Just because they're not likely to be very numerous for a while doesn't mean they won't likely end up being the primary vehicles on the road, given a decade or two.


  • identicon
    Anonymous Coward, 9 Sep 2015 @ 4:47pm

    How about no? I already don't trust them with the disproportionate tech and power they have, and now they want MORE?!?

    How about they prove to us they can be trusted before we bestow this highly abusable tech on them, hmmm?


  • icon
    ECA (profile), 9 Sep 2015 @ 5:33pm

    OPINION

    The #1 problem here is copyrights..
    WHO sets the abilities and HOW my car drives..
    They are customizing things to the point that your CAR isn't YOUR CAR.. modding, adjusting, and anything TECH is not something the consumer is given access to do on these cars..

    So, who is responsible FOR ITS DRIVING? Not you.
    Since you have little or no responsibility for HOW the car drives... WHO gets the ticket? WHO gets to go to jail if it kills someone?

    NOT YOU..

    If they want to OWN all of the control of the car... aren't the makers responsible for its working?
    This could mean they are responsible for the INSURANCE ALSO... as you are no longer the driver.

    I would LOVE this.. for a few good reasons..
    UPDATES?? They find a flaw in the coding, and you don't have access to UPDATE IT.. who is responsible?
    The computer FAILS and you are stranded... who is responsible?
    There are too many things about driving that make the CAR responsible, and NOT the driver..


  • identicon
    Stephen, 9 Sep 2015 @ 6:31pm

    Minority Report Deja Vu

    Sounds like the scene in "Minority Report" where Tom Cruise's self-driving car is taken over by the cops after he goes on the run and starts to drive him to the nearest police station.

    Can the day be far off when car chase scenes in movies will become outmoded because the hero's/villain's own machinery will refuse to obey them and instead willingly hand them over, be it to the cops or to the bad guys hacking in and taking it over?


  • icon
    Uriel-238 (profile), 9 Sep 2015 @ 11:01pm

    The question that is raised is: what's in it for me?

    If I own an autonomous vehicle, for what reason would I want to relinquish control to someone else?

    Because functions that serve the owner of the vehicle are what's going to drive the technology to have features such as remote control.

    If it allows me to park in places for cheaper, by temporarily authorizing a cyber-valet service to move my car around, that may be a thing.

    If someone is joyriding my car and I want it to come home, I may want to be able to take control of it remotely.

    If I want my car to drop my kids off at school (and not at the mall) without supervision, I may want to limit the degree to which my kids can control the vehicle's functions.

    If it's parked in a place that is interfering with responders (fire, police, ambulance), and to avoid citations, damage or towing, I may want to be able to move the car remotely or temporarily authorize the responders to move it for me.

    If I'm running errands, I may want a parking app to drive the vehicle in circles until a convenient parking place opens up.

    These are the sorts of things that are going to take us in the direction of designing transferable remote access into the driving software.

    And as much as law-enforcement might want universal backdoors, that's not going to happen without a lot of mischief as an added side effect. Besides which, the US is learning as a nation to distrust law enforcement and administrators generally.

    Technology that gives the police an option before citing or arresting the owner may be accepted by owners if it actually saves on arrests, tows or citations. If the police are going to cite someone anyway (as per the revenue-enhancement efforts of some precincts), there's no cause to facilitate their job.


  • identicon
    Andrew D. Todd, 10 Sep 2015 @ 3:08am

    Suppose Smart Automobiles Work Like Trains.

    Trains are routinely directed from control centers hundreds of miles away. The system has whatever cameras and other sensors it needs; their feeds are transmitted to the control center and fed into the computer, and the train dispatcher only makes policy decisions, like which train to give the right-of-way. The more advanced railroads have Automatic Train Stop. Additional communication with the locomotive engineer, above and beyond the standard signals, is by radio-telephone, not by stopping the train.

    A few years ago, along a road in rural West Virginia, I saw a road-repair crew using a pair of remote-controlled portable traffic lights to feed a two-lane road into one lane of traffic. They were of course much cheaper than human flaggers. I can imagine an improved traffic light, which not only signals turns, but lane changes as well.

    People kicked up a fuss about traffic light cameras, because the traffic light cameras were issuing fines. I don't think there would be so much uproar about a system which merely causes approaching cars to slow down slightly over a quarter-mile or so, so that they only arrive at the light after it has turned green.

    The railroads are working on new kinds of crossing gates for road/rail crossings. Tracks over which high-speed trains will pass need to be more positively locked-down than has previously been the case. One of the more advanced models involves rows of plastic pylons which rise out of slots in the road, and which can fit in places where there isn't room for a conventional gate. Naturally, one can imagine these kinds of gates being applied to school crossings and suchlike.

    Taking all of these things together, what I see is policemen being gradually written out of traffic control. The central computer will be making plans to manage traffic over tens of miles, and if the policeman on the ground tries to interfere, he will merely get in the way. Policemen will only become involved when there's a scofflaw, someone who deliberately hot-wires the controls on his car, or something like that.

    To Anonymous Coward, #52:

    Parenthetically, a realistic system of automatic control would mean equipping the roads with large numbers of electronic devices, such as RFID tags. Tesla doesn't want to talk about this, because Elon Musk does not own the public roads, and doesn't have any authority to rebuild them. What will practically happen in the foreseeable short run will be things like ferry trains. You drive a car onto a train, park it, and the train goes somewhere, and you drive off the train at the other end. And your car doesn't have to be a Tesla to use the system. The ferry train will probably be built like an advanced subway, with lots of automatic controls, but that is another story. You can start operating ferry trains where the demand exists, without having to replace all the cars and all the roads in the country.

    When you drive a car onto a ferry-train and park it, it ceases to be a moving vehicle. It becomes baggage, or cargo. It is no longer capable of committing a moving violation, and it is not on the public roads.


  • identicon
    me@me.net, 10 Sep 2015 @ 4:05am

    Just say no


  • identicon
    Rekrul, 10 Sep 2015 @ 8:12am

    Quite some time ago, in response to an article about self-driving cars, I stated my belief that companies would never be allowed to manufacture and sell them to the public without them containing some provision for the police to shut them down remotely.

    Of course, making the controls of the vehicle remotely accessible poses all kinds of security risks. The most a car should ever do is download updates to its GPS and data about traffic patterns. It shouldn't even be able to automatically download firmware updates. The owner should have to do that manually from a trusted source.

    It's like with web browsers: you make a browser that only displays text and pictures and, barring any serious bugs, there's no way it can be used against you. Start adding ways for it to execute code and scripts and you open up a ton of vulnerabilities.


  • icon
    DannyB (profile), 10 Sep 2015 @ 8:15am

    An ethical question about self driving cars

    I have a different question than the typical one which goes like: should a self driving car hit a pedestrian / baby carriage / animal instead of hitting a school bus / rich person / politician?


    My question is this: should a self driving car run over a person who has a gun pointed at you?

    What if they have a badge?


    • icon
      Khaim (profile), 10 Sep 2015 @ 11:06am

      Re: An ethical question about self driving cars

      My question is this: should a self driving car run over a person [...]?


      No.

      What if [...]?


      Still no.


      • identicon
        Anonymous Coward, 10 Sep 2015 @ 11:36am

        Re: Re: An ethical question about self driving cars

        It should run over a person only on the explicit command of the occupants; otherwise criminal gangs can simply block it in and make whatever demands, or do whatever they want, to the occupants. Note that when you are surrounded by a mob, a gun is nowhere near as likely to save your life as simply being able to drive away, over some of them if necessary. If you cannot control your car to the extent of using it as a weapon, then others can take control of it to their advantage.


      • icon
        DannyB (profile), 10 Sep 2015 @ 11:47am

        Re: Re: An ethical question about self driving cars

        Not arguing. Not angry. Just pointing out: so you should not be able to control your car in some extreme situations (defending your life), but others should be able to control your car in any situation whenever they want to (police wanting that hot dude/babe).


        • icon
          Khaim (profile), 10 Sep 2015 @ 12:29pm

          Re: Re: Re: An ethical question about self driving cars

          If I'm making a self-driving car, I want it to not run people over. Designing any mechanism for it to intentionally run someone over is a terrible idea.

          Are you familiar with statistics? There's an unintuitive result where a small error rate for a very rare condition produces far more false-positives than you would expect. In particular, the backwards probability of "the test returned positive, what is the probability of the condition?" can be very low.

          In our example, if the car is 99% accurate on "do I run this person over", and 99% of the time it is in a situation where it shouldn't, then how many people will the car run over, and what percentage of them were innocent?

          That's not a rhetorical question. We can do the math. Out of 10,000 encounters:
          - 99 times the car should run over the person, and it does
          - 1 time the car should run over the person, but it doesn't
          - 9,801 times the car shouldn't run over the person, and it doesn't
          - 99 times the car shouldn't run over the person, but it does
          So of the dead pedestrians, half of them were innocent.

          That's given the 99%/1% ratios I made up on the spot. I think a 1% chance of justifiable homicide is incredibly high; as that chance goes down, the car-accidental-murder ratio goes up.
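          The tally above can be checked in a few lines; the 99%/1% figures are the made-up rates from this comment, not real data:

```python
# Out of 10,000 encounters, with a 1% base rate for "should run the
# person over" and a decision that is correct 99% of the time:
should_and_does = 99        # true positives:  10,000 * 0.01 * 0.99
should_but_doesnt = 1       # false negatives: 10,000 * 0.01 * 0.01
shouldnt_and_doesnt = 9801  # true negatives:  10,000 * 0.99 * 0.99
shouldnt_but_does = 99      # false positives: 10,000 * 0.99 * 0.01

dead = should_and_does + shouldnt_but_does
innocent_fraction = shouldnt_but_does / dead
print(dead, innocent_fraction)  # 198 0.5
```

          Because the "should" condition is as rare as the error rate, true positives and false positives come out equal, which is exactly the base-rate effect described above.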

          So to answer your larger point: Yes, anyone can stop a self-driving car by stepping out in front of it. Assuming you have the doors locked, this is mostly an annoyance. They could try to threaten you with a gun - but what kind of idiot criminal is going to wave a gun in front of a half-dozen cameras connected to a full-size computer with an internet connection?


          • identicon
            Anonymous Coward, 10 Sep 2015 @ 12:51pm

            Re: Re: Re: Re: An ethical question about self driving cars

            A dozen gang members in hoodies, armed with crowbars, do not need to worry about cameras, and can get into any car with no trouble whatsoever unless it can drive away. Unless the occupants can command the car to move, they are toast before the cops can respond. Allowing the occupants to deliberately take manual control and drive on does not absolve them of the possible legal consequences of doing so, but if they needed to, they would have a very strong self-defence option.
            The other case I saw was a car that broke down on a level crossing just as the lights started to flash. The guy behind drove up to the breakdown and pushed it clear of the crossing. Minor damage to both cars, but the track was cleared.


            • icon
              Uriel-238 (profile), 10 Sep 2015 @ 1:33pm

              A very common urban hazard.

              I take it a dozen gang members with crowbars acting in coordination is a common occurrence in urban areas?

              Sounds like a rare instance in which a human-driven car might be particularly useful.

              Also, zombie outbreaks.


              • identicon
                Anonymous Coward, 10 Sep 2015 @ 3:05pm

                Re: A very common urban hazard.

                If the car cannot get away, and the gangs know it, how rare would it be? If people have the ability, it is likely to be used only in extremely rare circumstances, but if the gangs know they cannot be run over, they will take advantage of the fact.


                • icon
                  Uriel-238 (profile), 10 Sep 2015 @ 3:48pm

                  Re: Re: A very common urban hazard.

                  You're still speculating. Do we have statistics on crowbar gangs going after motorists in packs of dozens? How are these hordes manifesting while they're waiting for driverless-car protocols to slow down their preferred prey?

                  I'm kinda curious how they live and eat now. Maybe if we started feeding these creatures (and not beating them up with truncheons) we could domesticate them in time for automated cars.


          • icon
            DannyB (profile), 11 Sep 2015 @ 1:43pm

            Re: Re: Re: Re: An ethical question about self driving cars

            I did not mean to suggest that the self driving car should conclude that it should run someone over. It should not.

            But I should have the capability to OVERRIDE the self driving car and run someone over.


            • icon
              Uriel-238 (profile), 11 Sep 2015 @ 2:08pm

              I'm not sure of that.

              We have airplanes that will override a pilot's controls to avoid crashing into the ground (or a cliff face or whatever). The presumption is to avoid pilot error, but sometimes humans get funny-in-the-head, especially when they're being stretched and compressed and mashed during high-G turns.

              Unless there are other uses for overriding the vehicle's driving functions (e.g. getting out of mud when the computer can't do it automatically) the uses for running someone over are so few that the feature may not come standard.


            • identicon
              Andrew D. Todd, 12 Sep 2015 @ 8:40am

              Re: Re: Re: Re: Re: An ethical question about self driving cars

              One thing I note is that DannyB seems to be equating pedestrians with criminals. Criminals are not Luddites-- they use automobiles just about as much as anyone else in a given area. Criminals in Los Angeles do things like "drive-by" shootings. If they want to stop a car, they will do it with another car.

              DannyB is saying that anyone who doesn't drive must be a criminal.

              As a pedestrian, I've had motorists get aggressive. They weren't crooks in the ordinary sense of the word, just ordinary working dudes who had had a bad day at work, didn't dare to talk back to their bosses, and took out their aggressions on the nearest pedestrian. I got hurt one time, when one of these characters decided to accelerate across a parking lot. I managed to leap out of the way, but I took a spill. The motorist kept going, of course. Nothing broken, but a lot of muscles pulled, and the next couple of months were fairly painful. You understand that this is the mental framework in which I listen to DannyB.


          • identicon
            Anonymous Coward, 14 Sep 2015 @ 4:58am

            Re: Re: Re: Re: An ethical question about self driving cars

            "what kind of idiot criminal is going to wave a gun in front of a half-dozen cameras connected to a full-size computer with an internet connection?"
            One wearing a ski-mask or a helmet, or dressed as a clown, etc.


  • icon
    Uriel-238 (profile), 10 Sep 2015 @ 12:33pm

    Even our killbots...

    ...tasked with killing specific people have a bad habit of killing lots of unspecific people. It's a problem.

    I think it may be a bad idea to make robots that sometimes kill people intentionally.


  • identicon
    car salesbot, 14 Sep 2015 @ 5:02am

    optional

    Would you like an "EVASION & override" package on top of that "sport" package?
    That's 5,000 Ameros per package.


  • identicon
    Dave Mowers, 14 Sep 2015 @ 10:25am

    Communism = Technology

    You love technology because it will take away all of your personal freedom.


    Enjoy it fools.


    • icon
      Uriel-238 (profile), 14 Sep 2015 @ 12:47pm

      Re: Communism = Technology

      Technology can be used to take away personal freedoms.

      Technology can be used to give the individual new freedoms he never had.

      For me, seeing a street-level view of Iran is pretty keen.


    • icon
      Per Inge Oestmoen (profile), 7 Oct 2016 @ 11:19am

      Re: Communism = Technology

      Those who want to give away their freedom to obtain a little more safety, will in the end lose both their freedom and their safety.

      It is astonishing to see that people never learn that simple fact, which has been proven since time immemorial.


  • identicon
    Robert, 17 Sep 2015 @ 11:02am

    Only if there are protections to ensure that no one else could use this for their own purposes.

    But I'm not sure this is needed at all. I suspect that the vehicle would be programmed to pull over if a police car flashes its light at it. That is, the police wouldn't directly hack the vehicle, but order it to pull over, the same way they can order a vehicle to pull over today. Of course, someone could pose as a police officer to pull people over, but they could do that today.

    Even if the police did have a backdoor to control the vehicle, they wouldn't be able to drive it off a cliff. The car would reject it. They could order the car to pull over and stop.


  • identicon
    Anonymous Coward, 24 Sep 2015 @ 9:52am

    Maybe that's the ultimate goal.


  • identicon
    Per Inge Oestmoen, 27 May 2016 @ 12:29pm

    Why not just say NO to self-driving cars?

    The problems of privacy, freedom and choice connected to self-driving cars are easily overcome.

    The only thing we need to do is say a resounding NO to self-driving cars.


    • icon
      nasch (profile), 27 May 2016 @ 2:27pm

      Re: Why not just say NO to self-driving cars?

      The only thing we need to do, is to say a resounding NO to self-driving cars

      As long as you acknowledge that thousands of lives will be sacrificed every year to do so.


      • icon
        Per Inge Oestmoen (profile), 7 Oct 2016 @ 11:11am

        Re: Re: Why not just say NO to self-driving cars?

        We need to put this into proper perspective.

        In the US, there are more than 322 million people today.

        In the US, 2,626,418 people died in 2014 from all causes. Of these, 32,675 died in traffic accidents.

        To put the numbers in perspective, let it be mentioned that in 2014 all the automobiles in the US together travelled 3,026 billion miles. That works out to about one traffic fatality per 92.6 million vehicle miles travelled, or roughly 1.08 deaths per 100 million miles.
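        Those figures can be checked directly (a quick sketch using the numbers quoted above):

```python
# US 2014 figures quoted in the comment above.
deaths = 32_675                # traffic deaths
miles = 3_026_000_000_000      # vehicle miles travelled (3,026 billion)

miles_per_death = miles / deaths
deaths_per_100m_miles = deaths / (miles / 100_000_000)
print(round(miles_per_death / 1e6, 1))  # 92.6 (million miles per fatality)
print(round(deaths_per_100m_miles, 2))  # 1.08 (deaths per 100M miles)
```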

        Frankly, it bodes no good for our society if these numbers lead society to outlaw manual driving.

        Do we really want to give up our freedom, bit by bit, for a little more "safety"?


        • icon
          nasch (profile), 7 Oct 2016 @ 12:04pm

          Re: Re: Re: Why not just say NO to self-driving cars?

          Frankly, it bodes no good for our society if these numbers lead society to outlaw manual driving.

          About a tenth of that number died on September 11th, and we basically completely changed the way we thought about travel, Muslims, and to a large extent the world. I'm pleased the response to auto deaths has been measured. Second, IMO there's no need to ban it; given enough time, hardly anyone will be interested in manual driving. It will be like banning churning butter by hand.

