Should Police Have The Right To Take Control Of Self-Driving Cars?

from the kill-switch dept

As Google, Tesla, Volvo, and other companies make great strides with their self-driving car technology, we’ve started moving past questions about whether the technology will work, and started digging into the ethics of how it should work. For example, we recently discussed whether or not cars should be programmed to sacrifice their own driver if it means saving the lives of countless others (like a number of children on a school bus). Programmers are also battling with how to program vehicles to obey all rules — yet still account for highway safety’s biggest threat: shitty human drivers.

But another key question recently reared its head in discussions of what this brave new self-driving world will look like. Just how much power should law enforcement have over your self-driving vehicle? Should law enforcement be able to stop a self-driving vehicle if you refuse to? That question was buried in this otherwise routine RAND report (pdf), which posits a number of theoretical situations in which law enforcement might find the need for some kind of automobile kill switch:

“The police officer directing traffic in the intersection could see the car barreling toward him and the occupant looking down at his smartphone. Officer Rodriguez gestured for the car to stop, and the self-driving vehicle rolled to a halt behind the crosswalk.”

Commissioned by the National Institute of Justice, the RAND report is filled with benign theoreticals like this, and while it briefly discusses some of the obvious problems created by giving law enforcement (and by proxy intelligence agencies) this type of power over vehicle systems and data, it doesn’t offer many solutions. As parts of the report make clear, having immediate access to driver and vehicle history and data is an incredibly sexy concept for law enforcement:

“Imagine a law enforcement officer interacting with a vehicle that has sensors connected to the Internet. With the appropriate judicial clearances, an officer could ask the vehicle to identify its occupants and location histories. Or, if the vehicle is unmanned but capable of autonomous movement and in an undesirable location (for example, parked illegally or in the immediate vicinity of an emergency), an officer could direct the vehicle to move to a new location (with the vehicle’s intelligent agents recognizing ‘officer’ and ‘directions to move’) and automatically notify its owner and occupants.”

Yes, because if the history of intelligence and law enforcement is any indication, the “appropriate judicial clearances” are of the utmost importance. Thanks to what will inevitably be a push for backdoors to this data, we’ll obviously be creating entirely new delicious targets for hackers — who’ve already been poking holes in the papier-mâché-grade security currently “protecting” vehicle electronics. The report does briefly acknowledge this “risk to the public’s civil rights, privacy rights, and security,” but as we’ve seen time and time again, these concerns are a footnote in the expansion of surveillance authority.

We already live in an age where the consumer doesn’t have the ability to control or modify their own vehicle’s electronics courtesy of DRM and copyright, and self-driving cars are already going to be a tough sell for many people from a liberty and personal freedom perspective. Whether law enforcement should be able not only to snoop on vehicle data but also to take direct control of your vehicle is a conversation we should start having sooner rather than later.



Comments on “Should Police Have The Right To Take Control Of Self-Driving Cars?”

100 Comments
Mason Wheeler (profile) says:

The police officer directing traffic in the intersection could see the car barreling toward him and the occupant looking down at his smartphone. Officer Rodriguez gestured for the car to stop, and the self-driving vehicle rolled to a halt behind the crosswalk.

Why is this even worthy of special consideration? Any self-driving vehicle worth actually being sold on the consumer market would detect and stop for a pedestrian in the crosswalk anyway, so why would it have to do anything different for a traffic cop?

JoeCool (profile) says:

Re: Re:

That example plays on the fear that self-driving cars cannot deal with anything out of the ordinary – in this case, an intersection that, for whatever reason, has a cop directing traffic instead of traffic lights. It’s basically ignorance (willful in many cases) of how much work goes into these self-driving cars to deal with exactly the situation described.

They describe a fear (that’s probably already handled by the car) and then state that this fear will become reality unless [insert pet legislation here]. They wish to create a link between the two in the minds of the ignorant masses.

Anonymous Coward says:

Re: Re: Re:

This, pretty much. They don’t want the ability to stop self-driving cars for mundane circumstances such as police directing traffic. Any self-driving car worth using should be able to account for pedestrians in the street, or a human in the street performing traffic-control duties.

What they really want is to be able to stop self-driving capable cars in all circumstances, even if a human is driving it instead of the car. Even if they have no right to detain you. Suspect fleeing in an area? Just stop all the cars and make sure they have to flee on foot. Decide you don’t care to go through a police checkpoint and turn around to go a different way? Well, they can’t have that: stop the car and find out why the occupants don’t want to go through a checkpoint. Etc., etc.

Basically, just programming a self-driving car to obey existing laws would already make a car that’s driving itself stop if a cop signals it to pull over. The only reason to provide some sort of override is for when a human is driving, or for circumstances where cops currently aren’t allowed to stop everyone.
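A toy sketch of that point: a car that already brakes for any person inside its stopping distance needs no extra mechanism for a cop standing in an intersection. Everything here (names, numbers, the `Detection` type) is invented for illustration, not drawn from any real system:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    kind: str         # "person", "vehicle", "debris", ...
    distance_m: float

def should_brake(detections, speed_mps, max_decel_mps2=6.0, margin_m=2.0):
    """Brake whenever any person -- traffic cop or not -- is inside
    the car's stopping distance plus a safety margin."""
    stopping_m = speed_mps ** 2 / (2 * max_decel_mps2) + margin_m
    return any(d.kind == "person" and d.distance_m <= stopping_m
               for d in detections)
```

Under that rule, a cop 10 m ahead of a car doing 15 m/s triggers the brakes like any other pedestrian would, with no override channel involved.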

nasch (profile) says:

Re: Re:

Any self-driving vehicle worth actually being sold on the consumer market would detect and stop for a pedestrian in the crosswalk anyway, so why would it have to do anything different for a traffic cop?

Stopping when there’s someone standing in the intersection is not too bad. Recognizing that that person is a police officer directing traffic and is signaling your lane to go sounds a lot harder. I wonder if anyone has solved that one.

Thrudd (profile) says:

Re: Re: Re:

I think it was either stop signs or traffic lights or that one flashing amber light.
The real question though is why is a LEO in the middle of an intersection in the first place? Part of self driving vehicles is being able to communicate and coordinate with all the others nearby.
I am fairly certain that even a Skoda will rank much higher on intelligence and reaction time than some meat sack standing on a paint bucket.

nasch (profile) says:

Re: Re: Re: Re:

The real question though is why is a LEO in the middle of an intersection in the first place? Part of self driving vehicles is being able to communicate and coordinate with all the others nearby.

When 100% of vehicles are self-driving, yes, but there will be a very long transition period when there are both autonomous and human-driven cars on the road together.

Anonymous Coward says:

Re: Re: Re:

Stopping when there’s someone standing in the intersection is not too bad. Recognizing that that person is a police officer directing traffic and is signaling your lane to go sounds a lot harder. I wonder if anyone has solved that one.

Right now Google’s self-driving car can’t handle driving in rain, or anywhere other than the heavily mapped streets it’s tested on. And last I heard, pedestrians were basically just moving pillars to it. It can’t pick out a police officer from other people, much less tell what gestures they’re making. The technology is simply much farther from being practical than its proponents like to admit.

That’s why arguments for a kill switch like this one can sound plausible. Self driving cars currently can’t look at a cop and obey gestures like a law abiding human would, so “obviously” cops need some other sort of kill switch to stop the car. That the real obvious solution is “don’t allow mass production and sale of self driving cars until they are capable of obeying all signals an attentive, law abiding human would obey” is something they’d prefer not to discuss, as it doesn’t let them get their foot in the door.

Anonymous Coward says:

Re: Re: Re:2 Re:

I’m going by articles like this one:

http://www.technologyreview.com/news/530276/hidden-obstacles-for-googles-self-driving-cars/

which includes this paragraph:

“Pedestrians are detected simply as moving, column-shaped blurs of pixels—meaning, Urmson agrees, that the car wouldn’t be able to spot a police officer at the side of the road frantically waving for traffic to stop.”

Urmson being Chris Urmson, director of the Google car team. So as of early last fall, the guy in charge of Google’s self-driving cars acknowledged that no, the cars couldn’t pick out a police officer, much less tell that they were waving for the car to stop.

nasch (profile) says:

Re: Re: Re:3 Re:

Clearly there’s some conflict between “Pedestrians are detected simply as moving, column-shaped blurs of pixels” and the video demonstrating recognition of a bicyclist making an arm signal. Unless the car has different algorithms to handle bikes.

I’m a little concerned about their approach with these cars after reading that article. “If a new stop light appeared overnight, for example, the car wouldn’t know to obey it.” So if you own a Google car, you’re not only trusting the car’s programming, you’re also trusting the map database to be up to date. I would rather have a car that can recognize a traffic light (and lots of other things) without knowing it’s there ahead of time.
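One way to express the design preference above is to treat the map as a prior and let live perception add to it, so a brand-new light is still obeyed. A hypothetical sketch (nothing like Google's actual code), with lights represented simply as positions in metres along the route:

```python
def traffic_lights_to_obey(map_lights, perceived_lights, match_radius_m=15.0):
    """Treat the map as a prior: obey every mapped light, plus any light
    the sensors see that the map doesn't know about yet (e.g. one
    installed overnight). Positions are metres along the route."""
    lights = list(map_lights)
    for seen in perceived_lights:
        if not any(abs(seen - known) <= match_radius_m for known in lights):
            lights.append(seen)  # new light: trust the sensors over the map
    return lights
```

A perceived light near a mapped one is assumed to be the same light; anything genuinely new gets added rather than ignored.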

Anonymous Coward says:

Re: Re: Re:4 Re:

It says right in the video you linked to “Our cars treat cyclists as a special category of moving object.” So yes, it does handle bikes differently than people walking.

Your concern would be an example of why I said the technology is farther from being practical than its proponents like to acknowledge. The sort of tests Google’s doing demonstrate that it’s possible. Real-world practicality, under the conditions cars are currently used in, is a much different matter. Especially without re-engineering the roads to specifically accommodate self-driving cars.

The bottom line is that the technology is young, there’s still tons of issues with it to solve, and dealing with some of the issues will likely break previously existing solutions to other issues. That’s not a quick process.

nasch (profile) says:

Re: Re: Re: Re:

People generally don’t stand in the middle of the road, wearing large white gauntlets and waving white batons in a series of pre-set signals. Given that description, I don’t think it’s too hard to teach a self-driving vehicle the difference between ordinary pedestrians and traffic cops.

Traffic cops aren’t always in the middle of the road, don’t always wear gloves, and don’t always have a baton.

Anonymous Coward says:

Re: Re:

That’s the problem with a lot of the scare ideas about self-driving cars: they aren’t realistic at all. Same with the car sacrificing its passengers to save other people; that idea is even stupider.

Also, a properly programmed self-driving car should be programmed to pull over if a cop is after it with sirens blaring. A cop shouldn’t need to hack into the car to make it pull over. Allowing cops to do that, even with a warrant, would make the cars much less safe.

The real concern, if any, should be people who illegally purchase police equipment to make their car look kind of like a cop car, and the self-driving car treating them as if they were cops.

Anonymous Coward says:

The problem is that if the police have the ability to take remote control of a car, then the bad guys will also have that capability, along with secret services. Therefore the car should normally avoid collisions with anything, and obey hand signals and visible signal lights.
More important is giving the occupants the ability to override the automatic system, so that for example they can force the car to drive away from, or even through, a hostile crowd. Without that option, it becomes easy for gangs to hijack vehicles: spread out across the road to force the car to stop, and then move in behind the car.

Ninja (profile) says:

Re: Re:

That. The occupants should be the only ones that operate the car. A network of self-driving cars is only meant to let the cars talk and act accordingly (ie: car A wants to change lanes to take next exit, broadcasts intention, other cars slow down to make space for the maneuver). Nothing should interfere with the inner systems except for those inside the car. It can be limited (ie: if we ‘ban’ human drivers) and only allow manual changes to the route and not actual human driving but it must be available only from the inside.
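The broadcast-intent idea described above can be sketched as each car reacting locally to messages from its neighbors, with no hook through which an outside party could command the car. The message format and names here are invented purely for illustration:

```python
from dataclasses import dataclass

@dataclass
class Intent:
    car_id: str   # broadcasting car
    action: str   # e.g. "lane_change"
    lane: int     # lane it wants to enter

def react_to_intent(my_lane, gap_ahead_m, intent, min_gap_m=30.0):
    """Each car decides locally how to react to a broadcast intent.
    Nothing here lets an outside party command the car directly."""
    if intent.lane == my_lane and gap_ahead_m < min_gap_m:
        return "slow_down"       # open space for the merging car
    return "maintain_speed"
```

The key design point is that the network only carries announcements; every decision stays inside the receiving car.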

Anonymous Coward says:

Yes, and if someone shady happened to get access via that backdoor/killswitch to the aforementioned vehicle, he could kill the occupants of that vehicle, leaving easy-to-clean traces if done properly.

People forget that just because the military/police have it doesn’t mean nobody else is going to have access to it.

Yeah, that’s why there are no terrorist and criminal organizations with access to automatic weapons and explosives.

Anonymous Coward says:

Re: Re:

But don’t autonomous vehicles already have to pull off and stop in the presence of an emergency vehicle with sirens and lights on? Isn’t this already an effective killswitch?

If you’ve got a police car, and you’re driving down the road, you should already be able to stop ALL autonomous vehicles just by turning on your siren and lights. Any autonomous vehicle that doesn’t stop in this situation would be breaking current laws in most places.

So no other killswitch is needed. A situation that would merit stopping an autonomous vehicle should also merit the current prescribed ER activities. No stopping cars without drawing due attention. Period.

Lord Binky says:

Ah, so this is how we are going to get to the flying car scene in the Fifth Element.

No. If the car obeys all standard traffic laws, such as pulling over for ambulances and police cars with lights and sirens on, then this is unnecessary and comes at the expense of citizens’ safety.

Seriously, are they getting so lazy that they want to make their job entirely push-button? All the standard deterrents for a non-cooperative human operating a vehicle will still work on a self-driving vehicle, so this is nothing besides testing the waters for exerting more control over civilians.

Thinking on it more, though: yes, they can have control of my car, on the condition that I get control of all of theirs with no restrictions.

Mason Wheeler (profile) says:

Re: Re:

I find that odd. If my car was stolen, and they were able to locate it, no way no how would I want them to kill power while the vehicle was in motion, because I have no idea what’s behind the car or how fast it’s going, and I’d prefer to get my car back in the same condition it was in when it got stolen!

I would, however, be just fine with saying “kill the engine the next time it stops at a red light.”

Anonymous Coward says:

Re: Re: Re:

I would, however, be just fine with saying “kill the engine the next time it stops at a red light.”

That is only acceptable if whoever sends the kill command is absolutely sure that the car is in a safe place to stop, and not in the middle of a junction, blocking a pedestrian crossing, or on a level crossing. That requires that they can see the car, and that implies an Orwellian surveillance system.

tom (profile) says:

If some aspect of government can take control of a self driving car, that government must also take the responsibility for the outcome, good or bad.

In the scenario where the traffic-directing cop halts the speeding car, the cop must also assume the responsibility for the outcome. What if the car was carrying a panicked parent and his small child, who managed to mostly cut off his leg with a circular saw and is now rapidly bleeding to death? The parent was letting the car drive to the nearby hospital rather than waiting for EMTs to show up. The smartphone usage was the parent calling the hospital so they could be prepping for the incoming emergency. If the delay caused by the cop causes the small child to die, the cop should own the death and any criminal or civil penalties.

DannyB (profile) says:

The wonderful future of self driving cars!

Suppose law enforcement (at every level) had a golden key that would allow them to send your car simple messages.
* lock the doors
* pull over
* take occupants to certain location (eg, police station, or secret unofficial police torture location)

Naturally the Federal government will want this. Even branches such as the IRS. Or National Park Service.

All state governments will need access to this facility.

All local governments will need this. (How many is that?)

And of course, all of these are trustworthy. That is, all of them are the ‘good guys’. Defending Truth Justice and the Corporate Way.

And all these golden key holders will naturally use security best practices. Even Po Dunk Sticksville.

Control of cars will never fall into the wrong hands. Nosiree.

And this certainly would not get abused. Just as Stingray would never be abused.

And finally this brings me to. . . the next one to get in line for this will be the corporations. Why can’t they order your car to lock the doors and drive you to their bill payment collection center?

And I suppose copyright owners must have access to control your car because . . . PIRACY! And artists must be protected! They lose TRILLIONS of dollars a year! This is killing Hollywood. Etc.

What if a local business could make sure your car drove you by their big sign to be sure you could see it? Maybe add a fifteen-second unskippable pause before continuing on to your destination (or the next business’s unskippable advertisement).

Self driving cars promise a wonderful future.

JoeCool (profile) says:

Re: The wonderful future of self driving cars!

Great ideas! But you aren’t taking it far enough. Hollywood will (without an actual law) require your car to drive you to the theater every time a new movie comes out, then refuse to leave until you buy a ticket (don’t have to watch the movie, just get a ticket).

On the local front, the current high-bidder to the car company will get all cars using their service whether you want it or not. If the check clears, the next time out your car will automatically go to Jiffy-Lube, then the local car wash, and then the McDonald’s drive thru.

Anonymous Coward says:

Re: The wonderful future of self driving cars!

And I suppose copyright owners must have access to control your car because . . . PIRACY! And artists must be protected! They lose TRILLIONS of dollars a year! This is killing Hollywood. Etc.

Actually they lose one googol per day ! That’s why they hate Google so much, they’re reminded of how much money they could be making.

MikeVx (profile) says:

Re: XKCD ref.

A small boulder won’t be able to attempt to control the vehicle if external overrides are used. The operator of the vehicle should be able to ignore any external order, regardless of the “originating authority”, otherwise an autonomous vehicle becomes the most effective murder weapon/kidnap aid in the history of history.

AVs are one place where the use of closed-source proprietary systems should be an automatic life sentence.

That One Guy (profile) says:

Re: What Could Go Wrong

Yeah, before I could even begin to think that they could be trusted with the ability they’re talking about here, they’d have to show that they can be trusted with what they currently have, and so far they have failed abysmally at that, to the point that I wouldn’t trust most police with a freakin’ cap gun at this point.

Wyrm (profile) says:

Lack of ability? really?

We already live in an age where the consumer doesn’t have the ability to control or modify their own vehicle’s electronics courtesy of DRM and copyright,

That’s slightly wrong. DRM and copyright never removed from people the “ability” to do anything. DRM makes things more difficult and copyright makes it illegal, but people still have the ability to control and modify their vehicles’ (and other appliances’) electronics.

What has really been removed from the public is the right to do so… or maybe only the right to do so without risking more or less justified legal threats and actions.

Sheogorath (profile) says:

“The police officer directing traffic in the intersection could see the car barreling toward him and the occupant looking down at his smartphone. Officer Rodriguez gestured for the car to stop, and the self-driving vehicle rolled to a halt behind the crosswalk.”
That hypothesis is quite clearly bullshit. After all, if a vehicle is self-driving, then it’s going to be travelling at a steady speed and not ‘barrelling towards’ anyone. Furthermore, checking your texts whilst behind the wheel of a self-driving car doesn’t have the dangers inherent in doing so whilst behind the wheel of a traditional vehicle, although I wouldn’t advise doing so because of the dangers of getting caught by a gun-toting cop who can’t tell the difference between a Google car and a Ford Ka.

Uriel-238 (profile) says:

Re: I'd think that anyone could order a self-driving vehicle to stop

…By merely standing in front of it. I do suspect that all piloting software would seek to acknowledge, identify and safely circumvent obstacles, living or otherwise.

Eventually, vehicles will be self-driving by default, and may require hazard lights in the rare event that a human has taken direct control of the vehicle.

For now, though, yes, police are going to be confused by passengers in the driver’s seat.

Uriel-238 (profile) says:

On the condition that we have a working police system with proper oversight and due process

Then yeah, it seems like it might be a good idea to allow a car to be controlled by a police officer, especially if they are able to use this power to avert hazards and citations.

Given the police we have in the US, given the abuses of power we see already?

I can’t say Hell no! emphatically enough.

Anonymous Coward says:

Allowing the police to sacrifice the driver is a horrible idea, ripe for abuse. Just imagine this particular scenario:

Police officer is being sued by “John Doe” over civil rights violations. Aforementioned police officer, in an effort to get out of the lawsuit, takes control of driver’s vehicle and drives it off a cliff, bridge or chasm, effectively stopping the lawsuit in its tracks.

Last thing we need is a police officer having the ability to control our vehicle. It’s a system ripe for all kinds of abuses and trust me, the system being abused is exactly what will happen.

Every program that has been created in this country has always ended up getting abused by law enforcement and by the government.

DCL says:

Needs a morality switch

The answer to the story’s lead-in regarding morality is easy:

Next to the “on switch” put a “morality setting” control: Selfless vs Selfish (for you Mass Effect fans: Paragon vs Renegade). You have to set it before you start the car.

In the event of a serious incident it uses the different setting as the last decision:
Selfless = Kill me instead of others.
Selfish = Protect me at all costs.

It should be right next to the Emergency stop button (I am assuming there is one of those), but it should be separate from the Navigation/environment/entertainment controls.

But no thanks for the remote kill switch. That is all kinds of bad news.
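A sketch of how that dashboard setting might wire into the car's decision logic. The names and risk numbers are invented for illustration; real risk estimation is, of course, the hard part:

```python
from enum import Enum

class Morality(Enum):
    SELFLESS = "selfless"  # kill me instead of others
    SELFISH = "selfish"    # protect me at all costs

def choose_maneuver(setting, options):
    """options: (occupant_risk, bystander_risk) pairs, one per maneuver.
    Minimize whichever risk the owner chose to prioritize, breaking
    ties on the other."""
    if setting is Morality.SELFLESS:
        return min(options, key=lambda o: (o[1], o[0]))
    return min(options, key=lambda o: (o[0], o[1]))
```

Given two maneuvers, one risky for the occupant and one risky for bystanders, the selfless setting picks the former and the selfish setting the latter.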

Nonya says:

No

What about a cop who spots a pretty girl, then uses this to rape her? Cops have been convicted of rape; it happens. They can use this to plant evidence. I have no trust in the system. It is too corrupt for me to trust.
To the cops: if you wouldn’t mess with kids like you were a pedophile, things would be different. My friend’s grandma had to yell at the cops and tell them not to pick on a child. I have witnessed some sick stuff cops do, just sick in the head. Get some professional help before you piss off the rest of law-abiding society.

Anonymous Coward says:

i don’t know what you guys are worrying about.

these cars are going to be so expensive there probably won’t be ten of them in the whole nation at a given time. and repairing – ha! replacing – those systems would break the bank.

we shouldn’t worry about them hitting each other.  odds of that would be incredibly low.

That One Guy (profile) says:

Re: Today is not tomorrow

Computers used to be so incredibly expensive that only large, well funded groups could have even ‘simple’ ones to work with. Now even kids carry around phones with more computing power than those original computer designers could have ever dreamed of.

Just because they’re not likely to be very numerous for a while doesn’t mean they won’t likely end up being the primary things on the road, given a decade or two.

ECA (profile) says:

OPINION

The #1 problem here is copyrights..
WHO sets the abilities and HOW my car drives..
They are customizing things to the point that your CAR isn’t YOUR CAR.. modding and adjusting and anything TECH isn’t giving the consumer access to do anything to these cars..

So, who is responsible FOR ITS DRIVING? Not you.
Since you have no or little responsibility for HOW the car drives… WHO gets the ticket? WHO gets to go to jail if it kills someone?

NOT YOU..

If they want to OWN all of the control of the car… aren’t the makers responsible for its working?
This could mean they are responsible for the INSURANCE ALSO… as you are no longer the driver.

I would LOVE this.. for a few good reasons..
UPDATES?? They find a flaw in the coding, and you don’t have access to UPDATE IT.. who is responsible?
The computer FAILS and you are stranded… who is responsible?
There are too many things about driving that make the CAR responsible, and NOT the driver..

Stephen says:

Minority Report Deja Vu

Sounds like the scene in “Minority Report” where Tom Cruise’s self-driving car is taken over by the cops after he goes on the run and starts to drive him to the nearest police station.

Can the day be far off when car chase scenes in movies will become outmoded because the hero’s/villain’s own machinery will refuse to obey them and instead willingly hand them over, be it to the cops or to the bad guys hacking in and taking it over?

Uriel-238 (profile) says:

The question that is raised is: what's in it for me?

If I own an autonomous vehicle, for what reason would I want to relinquish control to someone else?

Because functions that serve the owner of the vehicle are what’s going to drive the technology to have features such as remote control.

If it allows me to park in places for cheaper, by temporarily authorizing a cyber-valet service to move my car around, that may be a thing.

If someone is joyriding my car and I want it to come home, I may want to be able to take control of it remotely.

If I want my car to drop my kids off at school (and not at the mall) without supervision, I may want to limit the degree to which my kids can control the vehicle’s functions.

If it’s parked in a place that is interfering with responders (fire, police, ambulance), and to avoid citations, damage or towing, I may want to be able to move the car remotely or temporarily authorize the responders to move it for me.

If I’m running errands, I may want a parking app to drive the vehicle in circles until a convenient parking place opens up.

These are the sorts of things that are going to take us in the direction of designing transferable remote access into the driving software.

And as much as law-enforcement might want universal backdoors, that’s not going to happen without a lot of mischief as an added side effect. Besides which, the US is learning as a nation to distrust law enforcement and administrators generally.

Technology that gives the police an option before citing or arresting the owner may be accepted by owners if it actually saves on arrests, tows or citations. If the police are going to cite someone anyway (as per the revenue-enhancement efforts of some precincts), there’s no cause to facilitate their job.
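The "temporarily authorize a cyber-valet" idea above could, for instance, be built as owner-granted, short-lived tokens rather than any universal backdoor. A minimal sketch under that assumption; all names are hypothetical, and a real system would use asymmetric signatures rather than a shared key:

```python
import hashlib
import hmac
import time

OWNER_KEY = b"owner-secret"  # provisioned in the car; never leaves it

def grant(grantee_id, expires_at):
    """Owner signs a short-lived grant for one specific party."""
    msg = f"{grantee_id}:{expires_at}".encode()
    return hmac.new(OWNER_KEY, msg, hashlib.sha256).hexdigest()

def car_accepts(grantee_id, expires_at, token, now=None):
    """The car honors only unexpired grants signed with the owner's key;
    there is no master key for it to check against."""
    now = time.time() if now is None else now
    if now > expires_at:
        return False
    msg = f"{grantee_id}:{expires_at}".encode()
    expected = hmac.new(OWNER_KEY, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(token, expected)
```

Because the owner mints every token and each one expires, a responder or valet gets exactly the access the owner chose to hand out, and nothing more.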

Andrew D. Todd (user link) says:

Suppose Smart Automobiles Work Like Trains.

Trains are routinely directed from control centers hundreds of miles away. The system has what cameras, and other sensors, it needs, and their feeds are transmitted to the control center, and fed into the computer, and the train dispatcher only makes policy decisions, like which train to give right-of-way to. The more advanced railroads have Automatic Train Stop. Additional communication with the locomotive engineer, above and beyond the standard signals, is by radio-telephone, not by stopping the train.

A few years ago, along a road in rural West Virginia, I saw a road-repair crew using a pair of remote-controlled portable traffic lights to feed a two-lane road into one lane of traffic. They were of course much cheaper than human flaggers. I can imagine an improved traffic light, which not only signals turns, but lane changes as well.

People kicked up a fuss about traffic light cameras, because the traffic light cameras were issuing fines. I don’t think there would be so much uproar about a system which merely causes approaching cars to slow down slightly over a quarter-mile or so, so that they only arrive at the light after it has turned green.
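That arrive-after-green idea is simple arithmetic: given the distance to the light and the seconds until it turns green, cap the advised speed so the car reaches the light just after the change. A sketch with an invented function name:

```python
def advised_speed_mps(distance_m, seconds_until_green, limit_mps):
    """Slow an approaching car just enough that it reaches the light
    after the green, instead of stopping and waiting at the red."""
    if seconds_until_green <= 0:
        return limit_mps  # light is already green: proceed normally
    return min(limit_mps, distance_m / seconds_until_green)
```

For example, 400 m out with 20 s to go, the advisory is 20 m/s (about 45 mph); with only 10 s to go the speed limit itself is the binding constraint.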

The railroads are working on new kinds of crossing gates for road/rail crossings. Tracks over which high-speed trains will pass need to be more positively locked-down than has previously been the case. One of the more advanced models involves rows of plastic pylons which rise out of slots in the road, and which can fit in places where there isn’t room for a conventional gate. Naturally, one can imagine these kinds of gates being applied to school crossings and suchlike.

Taking all of these things together, what I see is policemen being gradually written out of traffic control. The central computer will be making plans to manage traffic over tens of miles, and if the policeman on the ground tries to interfere, he will merely get in the way. Policemen will only become involved when there’s a scofflaw, someone who deliberately hot-wires the controls on his car, or something like that.

To Anonymous Coward, #52:

Parenthetically, a realistic system of automatic control would mean equipping the roads with large numbers of electronic devices, such as RFID tags. Tesla doesn’t want to talk about this, because Elon Musk does not own the public roads, and doesn’t have any authority to rebuild them. What will practically happen in the foreseeable short run will be things like ferry trains. You drive a car onto a train, park it, and the train goes somewhere, and you drive off the train at the other end. And your car doesn’t have to be a Tesla to use the system. The ferry train will probably be built like an advanced subway, with lots of automatic controls, but that is another story. You can start operating ferry trains where the demand exists, without having to replace all the cars and all the roads in the country.

When you drive a car onto a ferry-train and park it, it ceases to be a moving vehicle. It becomes baggage, or cargo. It is no longer capable of committing a moving violation, and it is not on the public roads.

Rekrul says:

Quite some time ago, in response to an article about self-driving cars, I stated my belief that companies would never be allowed to manufacture and sell them to the public without them containing some provision for the police to shut them down remotely.

Of course, making the controls of the vehicle remotely accessible poses all kinds of security risks. The most a car should ever do is download updates to its GPS and data about traffic patterns. It shouldn’t even be able to automatically download firmware updates. The owner should have to do that manually from a trusted source.

It’s like with web browsers: make a browser that only displays text and pictures and, barring any serious bugs, there’s no way it can be used against you. Start adding ways for it to execute code and scripts, and you open up a ton of vulnerabilities.

DannyB (profile) says:

An ethical question about self driving cars

I have a different question than the typical one which goes like: should a self driving car hit a pedestrian / baby carriage / animal instead of hitting a school bus / rich person / politician?

My question is this: should your self driving car run over a person who has a gun pointed at you?

What if they have a badge?

Anonymous Coward says:

Re: Re: An ethical question about self driving cars

It should run over a person on the explicit command of the occupants; otherwise criminal gangs can simply block it in and make whatever demands, or do whatever they want to the occupants. Note that when surrounded by a mob, a gun is nowhere near as likely to save your life as simply being able to drive away, over some of them if necessary. If you cannot control your car to the extent of using it as a weapon, then others can take control of it to their advantage.

Khaim (profile) says:

Re: Re: Re: An ethical question about self driving cars

If I’m making a self-driving car, I want it to not run people over. Designing any mechanism for it to intentionally run someone over is a terrible idea.

Are you familiar with statistics? There’s an unintuitive result where a small error rate on a very rare condition produces far more false positives than you would expect. In particular, the backwards probability ("the test returned positive; what is the probability of the condition?") can be very low.

In our example, if the car is 99% accurate on “do I run this person over”, and 99% of the time it is in a situation where it shouldn’t, then how many people will the car run over, and what percentage of them were innocent?

That’s not a rhetorical question. We can do the math. Out of 10,000 encounters:
– 99 times the car should run over the person, and it does
– 1 time the car should run over the person, but it doesn’t
– 9,801 times the car shouldn’t run over the person, and it doesn’t
– 99 times the car shouldn’t run over the person, but it does
So of the dead pedestrians, half of them were innocent.

That’s given the 99%/1% ratios I made up on the spot. I think a 1% chance of justifiable homicide is incredibly high; as that chance goes down, the car-accidental-murder ratio goes up.
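The arithmetic above is easy to verify. Here is a minimal Python sketch using the same made-up 99%/1% figures from the comment (both numbers are the commenter's invented inputs, not real data):

```python
# Base-rate arithmetic from the comment: 1% of encounters "justify"
# running the person over, and the car decides correctly 99% of the time.
encounters = 10_000
base_rate = 0.01   # made-up: fraction of encounters that justify it
accuracy = 0.99    # made-up: probability the car decides correctly

should = round(encounters * base_rate)    # 100 justified cases
should_not = encounters - should          # 9,900 unjustified cases

true_positives = round(should * accuracy)       # 99: should, and does
false_negatives = should - true_positives       # 1: should, but doesn't
true_negatives = round(should_not * accuracy)   # 9,801: shouldn't, and doesn't
false_positives = should_not - true_negatives   # 99: shouldn't, but does

total_killed = true_positives + false_positives  # 198 people run over
innocent_share = false_positives / total_killed  # 0.5: half were innocent

print(total_killed, innocent_share)
```

Even a 99%-accurate decision kills as many innocents as "justified" targets, simply because unjustified encounters outnumber justified ones 99 to 1.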

So to answer your larger point: Yes, anyone can stop a self-driving car by stepping out in front of it. Assuming you have the doors locked, this is mostly an annoyance. They could try to threaten you with a gun – but what kind of idiot criminal is going to wave a gun in front of a half-dozen cameras connected to a full-size computer with an internet connection?

Anonymous Coward says:

Re: Re: Re:2 An ethical question about self driving cars

A dozen gang members, in hoodies and armed with crowbars, do not need to worry about cameras, and can get into any car with no trouble whatsoever unless it can drive away. Unless the occupants can command the car to move, they are toast before the cops can respond. Allowing the occupants to deliberately take manual control and drive on does not absolve them from possible legal consequences of doing so, but if they needed to, they would have a very strong self-defence argument.
The other case I saw was a car that broke down on a level crossing just as the lights started to flash. The guy behind drove up to the breakdown and pushed it clear of the crossing. Minor damage to both cars, but the track was cleared.

Uriel-238 (profile) says:

Re: Re: Re:5 A very common urban hazard.

You’re still speculating. Do we have statistics of crowbar gangs going after pedestrians in packs of dozens? How are these hordes manifesting while they’re waiting for driverless-car protocols to slow down their preferred prey?

I’m kinda curious how they live and eat now. Maybe if we started feeding these creatures (and not beating them up with truncheons) we could domesticate them in time for automated cars.

Uriel-238 (profile) says:

Re: Re: Re:3 I'm not sure of that.

We have airplanes that will override a pilot’s controls to avoid crashing into the ground (or a cliff face or whatever). The presumption is to avoid pilot error, but sometimes humans get funny-in-the-head, especially when they’re being stretched and compressed and mashed during high-G turns.

Unless there are other uses for overriding the vehicle’s driving functions (e.g. getting out of mud when the computer can’t do it automatically) the uses for running someone over are so few that the feature may not come standard.

Andrew D. Todd (user link) says:

Re: Re: Re:3 An ethical question about self driving cars

One thing I note is that DannyB seems to be equating pedestrians with criminals. Criminals are not Luddites; they use automobiles just about as much as anyone else in a given area. Criminals in Los Angeles do things like "drive-by" shootings. If they want to stop a car, they will do it with another car.

DannyB is saying that anyone who doesn’t drive must be a criminal.

As a pedestrian, I’ve had motorists get aggressive. They weren’t crooks in the ordinary sense of the word, just ordinary working dudes who had had a bad day at work, didn’t dare to talk back to their bosses, and took out their aggressions on the nearest pedestrian. I got hurt one time, when one of these characters decided to accelerate across a parking lot. I managed to leap out of the way, but I took a spill. The motorist kept going, of course. Nothing broken, but a lot of muscles pulled, and the next couple of months were fairly painful. You understand that this is the mental framework in which I listen to DannyB.

Robert (profile) says:

Only if there are protections to ensure that no one else could use this for their own purposes.

But I’m not sure this is needed at all. I suspect that the vehicle would be programmed to pull over if a police car flashes its lights at it. That is, the police wouldn’t directly hack the vehicle, but order it to pull over, the same way they can order a vehicle to pull over today. Of course, someone could pose as a police officer to pull people over, but they can do that today too.

Even if the police did have a backdoor to control the vehicle, they wouldn’t be able to drive it off a cliff; the car would reject such a command. They could only order the car to pull over and stop.

Per Inge Oestmoen (profile) says:

Re: Re: Why not just say NO to self-driving cars?

We need to put this into proper perspective.

In the US, there are more than 322 million people today.

In the US, 2,626,418 people died in 2014 from all causes. Of these, 32,675 died in traffic accidents.

To put the numbers in perspective, let it be mentioned that in 2014 all the automobiles in the US together travelled 3,026 billion miles. That works out to roughly one traffic fatality per 92.6 million vehicle miles travelled, or about 1.08 deaths per 100 million vehicle miles.
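As a quick sanity check of those figures, in Python:

```python
# 2014 US figures quoted above.
deaths = 32_675    # traffic fatalities
miles = 3_026e9    # total vehicle miles travelled (3,026 billion)

miles_per_death = miles / deaths                # roughly 92.6 million miles per fatality
deaths_per_100m_miles = deaths / miles * 100e6  # roughly 1.08

print(round(miles_per_death / 1e6, 1), round(deaths_per_100m_miles, 2))
```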

Frankly, it bodes no good for our society if these numbers lead society to outlaw manual driving.

Do we really want to give up our freedom, bit by bit, for a little more “safety”?

nasch (profile) says:

Re: Re: Re: Why not just say NO to self-driving cars?

Frankly, it bodes no good for our society if these numbers lead society to outlaw manual driving.

About a tenth of that number died on September 11th, and we completely changed the way we think about travel, Muslims, and to a large extent the world; I’m pleased the response to auto deaths has been more measured. Second, IMO there’s no need to ban manual driving: given enough time, hardly anyone will be interested in it. It will be like banning churning butter by hand.
