If We're Not Careful, Self-Driving Cars Will Be The Cornerstone Of The DRM'd, Surveillance Dystopias Of Tomorrow

from the who-controls-the-code dept

We’ve talked a lot about the ethical and programming problems currently facing those designing self-driving cars. Some are less complicated, such as how to program cars to bend the rules slightly and behave more like human drivers. Others get more complex, including whether or not cars should be programmed to kill the occupant — if it means saving a school bus full of children (aka the trolley problem). And once automated cars are commonplace, should law enforcement be given access to the car’s code so it can automatically pull a driver over? There’s an ocean of questions we’re not really ready to answer.

But as we accelerate down the evolutionary highway of self-driving technology, the biggest question of all becomes: who gets to control this code? Will the automotive update process be transparent? Will the driver retain the ability to modify their car’s code? Will automakers adapt and stop implementing the kind of papier-mâché-grade security that has resulted in an endless parade of stories about hacked automobiles with flaws that take automakers five years to patch?

Trying to force the issue before there’s a hacker-induced automotive mass fatality, Ford, GM and Toyota were hit by a class-action lawsuit earlier this year claiming the car companies were failing to adequately disclose the problems caused by abysmal auto security:

“Among other things, the lawsuit alleges Toyota, Ford and GM concealed or suppressed material facts concerning the safety, quality and functionality of vehicles equipped with these systems. It charges the companies with fraud, false advertising and violation of consumer protections statutes. Stanley continued, “We shouldn’t need to wait for a hacker or terrorist to prove exactly how dangerous this is before requiring car makers to fix the defect. Just as Honda has been forced to recall cars to repair potentially deadly airbags, Toyota, Ford and GM should be required to recall cars with these dangerous electronic systems.”

This month a court ruled that yes, we will probably have to wait for someone to die before automakers are held liable for lagging automotive security. The case was ultimately dismissed (pdf), the court ruling that the plaintiffs had yet to prove sufficiently concrete harms, and that the potential damage (to the driver and to others) remains speculative. At the pace self-driving and smart car technology is advancing, one gets the sneaking suspicion we won’t have long to wait before those harms become notably more concrete.

But however complicated these legal, ethical, and technical questions are, they become immeasurably more complicated once you realize that smart cars will ultimately form the backbone of the smart cities of tomorrow, working in concert with city infrastructure to build a living urban organism designed to be as efficient as mathematically possible. As Cory Doctorow noted last week, this makes ensuring code transparency and consumer power more important than ever:

“The major attraction of autonomous vehicles for city planners is the possibility that they’ll reduce the number of cars on the road, by changing the norm from private ownership to a kind of driverless Uber. Uber can even be seen as a dry-run for autonomous, ever-circling, point-to-point fleet vehicles in which humans stand in for the robots to come – just as globalism and competition paved the way for exploitative overseas labour arrangements that in turn led to greater automation and the elimination of workers from many industrial processes.

If Uber is a morally ambiguous proposition now that it’s in the business of exploiting its workforce, that ambiguity will not vanish when the workers go. Your relationship to the car you ride in, but do not own, makes all the problems mentioned even harder. You won’t have the right to change (or even monitor, or certify) the software in an Autonom-uber. It will be designed to let third parties (the fleet’s owner) override it. It may have a user override (Tube trains have passenger-operated emergency brakes), possibly mandated by the insurer, but you can just as easily see how an insurer would prohibit such a thing altogether.”

You’d hate to wander too casually into the hyperbole territory traditionally reserved for hysterical Luddites, but there’s a laundry list of reasons to be worried about the trajectory of the lowly automobile. If we don’t demand code transparency and consumer empowerment in automotive standards now, your car may find itself the cornerstone of a future in which DRM, encryption backdoors, lax security standards, eroded consumer legal rights, insurance companies and government power combine to create a supernova of dystopian dysfunction.



Comments on “If We're Not Careful, Self-Driving Cars Will Be The Cornerstone Of The DRM'd, Surveillance Dystopias Of Tomorrow”

53 Comments
Machin Shin (profile) says:

The good old classics

I have told many people that, having looked at the price tags, I would buy a restored classic car long before I would buy a new one. Every day I see new stories about modern cars that just reinforce that feeling. Why would I spend $55k on a new Corvette when I can get a beautiful 1969 Corvette for $30k? To me the older cars look much better than the new ones, and I know I can fix that ’69 Corvette with a good set of wrenches. No worries about DRM on that thing.

JMT says:

Re: Re: Re: The good old classics

While I share your enthusiasm for older cars, you have to be truly ignorant of modern automotive engineering to think a few basic bolt-ons to an old car can make it as safe as a new one. What you’re doing is simply accepting greater risk of injury in an accident. Nothing wrong with honestly admitting that.

Also note that driving in a caged car without a helmet makes serious injury more likely than if you had no cage. Side impacts are far more common than rollovers, and smacking your head against a cage during a crash is never pretty…

Mason Wheeler (profile) says:

Re: Re: Re: The good old classics

“I will take that solid steel frame and body vs your fiberglass crumple zones any day.”

Then have fun dying when you hit something. Those crumple zones are there to protect you–not whatever you hit–by helping to absorb the impact.

Since deceleration trauma is deceleration trauma no matter which direction it occurs in, think about it vertically. If you do the math, you find that getting in a crash at 60 MPH is almost exactly equivalent to falling off of a 14-story building. If that happened, would you rather land directly on the sidewalk, or on a pile of cushions?
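A quick sanity check of that comparison, sketched under the assumption that a storey is roughly 2.7–3 m tall: the fall height that produces a given impact speed is h = v²/(2g).

v = 60 mph ≈ 26.8 m/s
h = v²/(2g) ≈ (26.8 m/s)² / (2 × 9.81 m/s²) ≈ 37 m
37 m ÷ 2.7–3 m per storey ≈ 12–14 storeys

So the equivalence holds to within a storey or two, depending on the storey height you assume.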

Machin Shin (profile) says:

Re: Re: Re:2 The good old classics

I realize the point of the crumple zones. I also realize that most wrecks will be car vs. car, not car vs. solid object. Even more to the point, a very large part of the time I’m driving in town at 35 mph or less.

Either way, I’m taking on certain risks by driving older vehicles. They are risks I’m willing to accept though. I do not base my decision to buy a car from the standpoint of planning to crash it. I do base it on things like my ability to control the car and avoid a crash. Things like not having a computer inserted between my controls and the car.

There is something nice about knowing that I have control and no computer glitch can crash the car for me.

Anonymous Coward says:

Re: Re: Re:3 The good old classics

False sense of security.

Yes computers glitch, but there would be far fewer overall crashes once our evil self driving cars take over.

Do you do anything financial online? That’s done by computers.
Do you fly commercial? Most of the work is done by computers.
Do you trust your traffic cameras? Those are run by computers too!

People falsely assume that a malfunction is an instant death sentence, and it is hardly the case at all. Crumple zones, computers, and updated physical mechanics are all deliberated upon by professionals who dedicate more time to these things than you could possibly even understand.

You are a Darwin Awardee in waiting!

Machin Shin (profile) says:

Re: Re: Re:4 The good old classics

You assume a lot just by my stated opinion about cars. Such as assuming I dislike technology. In fact, I work in IT and love technology. That still does not mean I like the idea of trusting a computer to handle things like acceleration, steering and braking.

I especially don’t like the idea of trusting a computer when some moron insists on connecting that same computer up to the in dash entertainment system that has Bluetooth and WiFi enabled.

Even more so when the code is locked away and I’m not allowed to look at it. This is a big concern. Knowing what I know about computers I would rather not blindly trust some programmer without having the right to check his work.

As for your questions.

Yes I do financial stuff online, and by doing so I risk someone stealing my money, but no physical harm is done.

I don’t fly if I can avoid it. I would also like to point out that those computer systems have been hacked before.

Do I trust traffic cameras? What does that have to do with anything? And no, I don’t really trust them, I mean come on. Most of them are not secure and are open for anyone to watch if they like. They are also wide open for abusive use in tracking people’s movements.

I do not fear all technology. In fact I love technology and I am very excited about the advancements that I am seeing in technology. What I fear is the fact that time after time after time people have shown that technology will be abused. I do not fear the tech, I fear the people who are already drooling over the new ways they can abuse it.

JMT says:

Re: Re: Re:3 The good old classics

“I do base it on things like my ability to control the car and avoid a crash. Things like not having a computer inserted between my controls and the car.”

Modern stability controls can do a much better job of allowing you to control the car and avoid a crash than most drivers are able to. You say you’re in IT, not a professional driver, so it’s safe to assume that includes you. I’m not criticising your car choices, just your rationalisation for them. Claiming you can do a better job of avoiding a crash on your own implies a skill level that’s probably higher than the reality.

Machin Shin (profile) says:

Re: Re: Re:4 The good old classics

There is a difference between things like stability control, anti-lock brakes, and similar systems and a full drive-by-wire car. If those systems fail, you still have control of the car.

I’m not saying that I don’t see the benefit of a lot of these advancements. In fact I find ABS systems to be pretty awesome, especially in the rain. I just don’t like the direction things are going, where there is no redundancy and the computer has far too much control.

Also, my dislike of these computer systems is far from the only reason I like older cars. One of the biggest reasons is that most modern cars look like shit.

Anonymous Coward says:

Re: The good old classics

Indeed, those older cars are easier to repair. However, there are more than a few disadvantages…..

1. When was the last time you heard of a car built in the 60s or 70s going over 100,000 miles? It wasn’t too often. With a modern car, however, it’s totally routine.

2. Remember the yearly tuneups you had to do in order to get ready for winter? Doesn’t seem to be all that common with modern cars.

JoeCool (profile) says:

Re: Re: The good old classics

Our old 1963 Valiant station wagon went 362K miles before the odometer broke. By that time there were no parts to fix it, so we drove it maybe another 200K miles before selling it to a collector. It’s just the opposite of what you state: old cars went FOREVER on one or two oil changes, while a modern one is lucky to make 100K without maintenance every 5K miles.

rebrad (profile) says:

Time to Unplug and Go Dark

I’m going to buy myself a ’67 Chevy with a tube AM/FM radio, throw my phone in the garbage disposal and hit the back roads. I mean, this stuff scares the shit out of me, especially since I can’t find any redeeming value in the technological revolution worth much more than the simple remote control. Stay if you will. I’ll give a big wave on your trip to hell.

Anonymous Coward says:

Re: Mr. N

he proclaimed that touch tone phones were a rip-off for charging people more

It’s basically true, but it’s touch-tone dialling and not phones that are the ripoff. “The phone company” used to charge approx. $3/month for tone dialling service, and unless you had a modem constantly redialling it wasn’t all that useful. Many tone-capable phones, when set to pulse mode, would temporarily switch to tone mode if you pressed * so you could use IVR menus. And most vertical service codes have pulse equivalents, e.g. 1167 instead of *67.

(Actually, telcos were lucky not to be sued, mostly. They advertised DTMF as being useful for phone menus, but tone dialling service had nothing to do with that. The only thing they got into trouble for IIRC was adding DTMF service to pulse-grandfathered lines without consent.)

W says:

Stuff like this is why I hate so many in the free speech tech community.

I grew up on the Internet while constantly hearing promises of an amazing future where freedom and user empowerment were paramount.

Instead I find the freedoms I routinely enjoyed while growing up are being taken away from me and the future is moving towards a dystopian nightmare. The present ubiquitous surveillance already is a nightmare in and of itself.

I hate that the world that was promised by so many tech luminaries is being perverted into the exact opposite. I hate that I had so much hope for the future, only to find that hope continually being dashed and kicked while it’s on the ground.

Why do you keep giving hope only to take it away? Over and over and over and over again. This is cruelty.

Anonymous Coward says:

Re: Re: Re:

I can’t make a future free of ubiquitous surveillance. That is not within my power to do anything about.

I no longer have intellectual privacy online, which has replaced much of what I enjoyed about the Internet with anxiety. I do not know what to do about that short of no longer using the Internet, which is rapidly becoming a less feasible option.

I can’t make a future where I’m free to create what comes to mind without worrying about threats to my freedom via law enforcement misinterpreting my words (look up what happened to Justin Carter). This is not something I can do anything about.

I can’t enjoy many cultural works anymore because of arcane, arbitrary, and vexatious copyright law enforcement on the Internet. This is not something I can do anything about.

Worst of all, I am seeing many tech luminaries encourage censorship when once upon a time the concept was akin to heresy! It’s gotten so bad that I double-check everything I type to ensure that it can’t easily be taken out of context to ruin me later in life. I don’t want my career to be the victim of the next Twitter mob egged on by Silicon Valley activists and CEOs! I’ve been trying to do something about that for a year and a half and I’ve not been able to make any change.

The future that I want is not one that is in my power to have. Once upon a time I wanted to be like the hackers of olde, creating more software to underpin more of the internet and creating awesome shit. Now? The future that I am getting is so frequently causing pain that I am seriously considering dropping my career in tech.

Anonymous Coward says:

Re: Re: Re: Re:

There’s a strange parallel between those who fear terrorism and those who fear government overreach. At least for the moment, neither is a rational thing for the average citizen of a developed nation to fear, or to alter their behavior trying to avoid. For those who are not average citizens and are targeted by such groups, that is all the more reason for the rest of us to use the liberties we have to prevent things from getting worse.

We live in an unprecedented age of human freedom, access to knowledge and communication. To deny this reality and victimize ourselves as being helpless does a disservice to our ancestors who struggled and died through much worse to make a society that though not perfect, is the best it has ever been.

W says:

Re: Re: Re:2 Re:

And what am I supposed to do? I can’t research freely without worrying about what the consequences may be! I can’t research how rockets work out of curiosity without risking being put on some kind of terrorist watchlist! The same goes for researching how hacking works out of curiosity, or wanting to help work on Internet freedom.

I don’t know the potential consequences of any given action online anymore due to not trusting that the watchmen know the difference between right and wrong in the online world.

JMT says:

Re: Re:

Your criticism is very poorly aimed. Those in the “free speech tech community” are not the ones breaking the promises you grew up hearing; in fact, they’re often the ones fighting hardest for them. It’s governments and big business that are entirely to blame for the ills of the internet. The same ones you probably vote for and buy from.

W says:

Re: Re: Re:

I hate that they gave me so much hope for so long that I gave them my trust, only to find they weren’t even a notable public force. They aren’t enough of a political power to warrant recognition!

The people who I trusted deeply to keep the Internet safe were incapable of doing so. The incapability has been so severe that random non-tech activist groups are making more headway changing tech away from freedom of speech than tech is towards promoting online freedom!

I hate that the people who promised so many great things turned out to be so incredibly weak and powerless. I pray that I have to eat these words eventually. But for now, I feel little more than pain at what has happened to the open Internet and hacker culture.

JMT says:

Re: Re: Re: Re:

I don’t want to offend, but you sound a bit naive, as if you didn’t think government and big business would push back to defend their firmly entrenched interests. Tell me who else has the power, funds and network to effectively challenge them? The only group that has a chance is voters/purchasers, and they’re doing a pretty average job…

SparkyDan (profile) says:

Why is it always all or nothing?

You know, it’s really annoying how people will say self-driving cars will take over. No one is taking over anything; all the self-driving car is doing is creating an alternative for people with disabilities or for people who don’t want to drive. You’re all looking at this like they are coming for your cars; if anything, you have a brand new field of cars to look into. What if you have a blind or disabled friend who is also a car junkie? Quit listening to Elon Musk, he’s not as perfect as he thinks he is. When you make these types of fear-based, unrealistic articles, you are just feeding Elon Musk’s ego.

Do you ever get afraid that the motor on an elevator, many of which are computerized nowadays, will be hacked? Maybe ISIS will hack your local elevator and not allow you to leave the elevator for good! The Google database is one of the most secure databases on the planet; not even the best hackers in the world could crack their code, and usually most terrorist hackers are amateurs who get into a website or two. Big deal, ISIS hacks into the little league baseball website.

Law enforcement will clearly have access to the cars if needed; if the car is doing something illegal, then yes, the officer will intervene. They can write up a citation requiring the car to be recertified by the manufacturer and the DMV as once again operating legally, and responsibility for failing to do so should rest on the owner of the car.

Anonymous Coward says:

Re: Why is it always all or nothing?

Problem is, it will be all or nothing. One day they will quit making drivable cars. The insurance industry will make sure of it, just like helmets became mandatory in many states for motorcycles. When that day comes, some day further down the road, that same industry will be quoting stats that say driving a car yourself makes you a hazard to others on the road.

As far as finding out about software problems goes, it will be the same as finding out about badly engineered designs in autos. Think of the ignition switch or Toyota’s unwanted acceleration. With the DRM and arbitration, proving it’s a problem with a particular make will take deaths, and people who refuse to sign a settlement with a non-disclosure clause in it. So it won’t be one death that leads you to find out about it, but a lot of deaths, before it can’t be covered up any more. You can be sure that, just like today, these big automakers will do anything in their power to keep it under wraps for as long as possible that there is anything wrong with their products.

Today I’m quite happy to be driving the same vehicle I bought 20 years ago. It’s not nickel-and-diming me to death, and pretty much all it takes is gas, oil, tires, and rarely anything else. Best part about it is it doesn’t have all the geolocation crap in it.

SparkyDan (profile) says:

Re: Re: Why is it always all or nothing?

Well, that is more of a bureaucracy issue; these insurance companies don’t have the right to start making everyone switch over to self-driving cars. If anything, I believe the insurance companies will do the same thing as the DMV and just create a new branch of rules.

Nobody wants a future where people are forced to do something; in fact, many automakers building self-driving cars will offer the self-driving part more as an option. I don’t really see the point in self-driving cars for myself, since I’m an able-bodied 22-year-old electrician.

I don’t even own a car; I get around with Lyft or public transportation. I’ve actually carved out a nice little niche where I, as an electrician, never have to drive. At the moment I work as a construction electrician, and I just put my tools into a Lyft car and bring them right over. Soon enough I’ll have my AS and will be doing lights-out manufacturing. The constitution has made it clear that they cannot take away your right to be happy. The automated car, though, is for people who, e.g., may not have legs, lost an eye in a refinery, or are deaf.

As for the software problems, it’s not like these cars are just going to hit the road and crash. There will be a lot of safety tests for each and every car. There will be certifications granted to each car, as if it were a driver itself. If there is a software problem, chances are they will find it in the factory or wherever they sell these things. And as for the comparison to your smartphone, no, the checks for these will be a lot more intense. If the head of the NHTSA said that we should not get in the way of this type of innovation, then these cars are probably vetted extremely well.

Software problems, though, are being looked at as we speak. Tesla was among the first to try this software, and yes, they had heavy issues with it in 2013, but it’s almost 2016 now, and three years ago is ancient compared to what the software is today. If these cars were considered a danger to society, they wouldn’t even be looked at by the Feds.

I still believe that a lot of people tend to feed into an idea that really isn’t there. I remember talking to a friend of mine who said the same thing you’re saying. He hates the idea because they will force everyone to use it. And I asked why they would, and all he said back was “control.”

Coyne Tibbets (profile) says:

Secondary profit center

From the article: … whether or not cars should be programmed to kill the occupant…

Whew! Good thing you followed this up with a qualifier, because my first thought was murder for hire.

And wouldn’t murder for hire be easy for a self-driving car manufacturer to accomplish? “We don’t know what happened…everything was going well and suddenly the car swerved into the path of that monster truck. Must have been a software glitch.”

Anonymous Coward says:

who gets to control this code?

The scope of that question is a lot larger than most people think it is. If you consider “code” to extend to any software that can be reasonably maintained in an industrialized way, then the question extends as far as DNA and synaptic patterns.

The short answer is: the person who wrote it. Which presents a rather large problem for a society that is increasingly dominated by heavily leveraged aristocrats that don’t code.

The reality is that if software developers were guarding the products of their labor as jealously as the RIAA/MPAA mafia, we would still be using punch cards. The significance of that fact is lost on the armies of nodders blithely tapping away on their pocket leashes.

In this century, coding is to literacy what reading and writing were in the 17th century. Software developers are the modern journeymen, hawking their wares not because they want to, but because fair compensation is rare and fleeting.

There are dozens of farces being used to villainize coders. The cyberterrorism-oriented TV shows are not unlike the anti-Jewish propaganda posters of 1930s Germany. They are a fear tactic manufactured by the socialite barons of the corporate state in order to keep the newly literate in corporate concentration camps. We are contained by the barbed wire of our vanity, and an utterly corrupt and meaningless financial system.

The reality is that many software developers have written code that was used to do things the developer deplores. And it is up to us to mitigate the effect of those who would abuse our services. And yes, they _will_ come for us before they are forced to let go of the reins. Fortunately for us, the order to round us up will be transmitted over computer networks.

I am not so concerned about the morality of a self driving car. The moral issues of code in modern society go far beyond that. The question for us, is whether software developers will permit themselves to be turned against each other by the social wranglings of lesser men.

As the saying goes: “You can always hire one half of the poor to kill the other half.” Whether it is true in the modern era, is a question every coder needs to ask himself. For it is in our power to suspend that formerly inevitable conclusion. And that is what the social aristocracy truly fears.

Anonymous Coward says:

who gets to control this code? Will the automotive update process be transparent? Will the driver retain the ability to modify their car’s code? Will automakers adapt and stop implementing the kind of paper mache level security that has resulted in the endless parade of stories about hacked automobiles it takes five years for automakers to patch?

Here are your answers, in order:
Big Businesses. No. No. Absolutely… NOT.

As usual, there will be some pretend “safety regulations” list released by the FCC & co. that big business will take the minimum effort to respect.

Besides, no DRM would mean opening up to liability /sarcasm

I’d say:
- Prepare for an even worse future of proprietary interfaces, all based on the same security-by-obscurity protocol with minor differences for your protection (to lock you in).
- The protocol will be an over-engineered mess of spaghetti code, and the licenses will sell for millions of dollars.
- The hood will be sealed. You’ll get a fine if you pop it open.

Mason Wheeler (profile) says:

Others get more complex, including whether or not cars should be programmed to kill the occupant — if it means saving a school bus full of children.

There are only two answers to this question that make any sense at all.

First, the literal one: Don’t be ridiculous; do you have any idea how solidly built a bus is? It’s practically inconceivable that you could crash a modern car into one in a way that would put the occupants of the bus at any serious risk of death or severe injury.

Second, the “in the spirit it was asked” one: No, it absolutely should not be programmed to sacrifice the driver under any circumstances whatsoever. For two reasons: first, because if people know that such functionality exists, they’ll never want to buy it, and second (partially feeding into the first) because if such functionality exists, it becomes a target for hackers; someone will inevitably try to find a way to cause it to activate incorrectly, just because it’s there. The very real danger of this overwhelmingly outweighs the hypothetical danger of a “trolley problem” incident.

Anonymous Coward says:

Re: Re:

Seriously, there are tons of ways to wreck into a bus that could harm the occupants.

Buses of all types have sorely lacking safety features and a poor crash history when things go wrong. Check some news clippings next time you find yourself online and in reality at the same time!

But you are right, a computer system should never be programmed with “acceptable to kill in this circumstance” logic. Each system should only ever focus on the safety of its current occupant, while secondarily focusing on minimizing any and all collateral damage to its surroundings.

Mason Wheeler (profile) says:

Re: Re: Re:

Seriously, there are tons of ways to wreck into a bus that could harm the occupants.

Yeah, if you’re driving a dump truck. Buses are made with very heavy-duty steel frames, because they’re designed to take safety very seriously. When a car made of aluminum crashes into one, the car gets crunched and the bus just shakes a little.

Blaise Alleyne (profile) says:

Software freedom is a must

The more that software controls things that traditionally weren’t computerized, the more important software freedom becomes. We run cars through safety tests, but not source code audits? Cars, airplanes, drones, pacemakers, etc… the software needs to be free and open source as a prerequisite. It’s increasingly necessary (though not sufficient).
