from the I'm-sorry-I-can't-do-that,-Dave dept
So over the last few years you've probably seen white hat hackers demonstrate how easily most modern smart cars can be hacked, often with frightening results. Cybersecurity researchers Charlie Miller and Chris Valasek have made consistent headlines in particular by highlighting how they were able to manipulate and disable a Jeep Cherokee running Fiat Chrysler's UConnect platform. Initially, the duo documented how they were able to control the vehicle's internal systems -- or kill its engine entirely -- from an IP address up to 10 miles away.
But the two would go on to highlight how things were notably worse, pointing out last year that they'd also found a way to kill the vehicle's brakes, cause unexpected acceleration, or even direct the vehicle to perform sudden and extreme turns:
"Last year, they remotely hacked into the car and paralyzed it on highway I-64—while I was driving in traffic. They could even disable the car’s brakes at low speeds. By sending carefully crafted messages on the vehicle’s internal network known as a CAN bus, they’re now able to pull off even more dangerous, unprecedented tricks like causing unintended acceleration and slamming on the car’s brakes or turning the vehicle’s steering wheel at any speed."
Just the gift for intelligence or private sector ne'er-do-wells looking to cause mayhem -- or worse.
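For the curious, the "carefully crafted messages" in question are just raw CAN frames. As a rough, stdlib-only sketch of what's involved, here's how a classic CAN frame gets packed for Linux's SocketCAN interface -- the arbitration ID and payload below are hypothetical placeholders, not the actual UConnect messages, which the researchers did not publish in full:

```python
import struct

# Linux SocketCAN frame layout: 32-bit arbitration ID, 1-byte data
# length code, 3 bytes of padding, then up to 8 data bytes.
CAN_FRAME_FMT = "<IB3x8s"

def build_can_frame(arbitration_id: int, data: bytes) -> bytes:
    """Pack a classic (non-FD) CAN frame. ID and data are illustrative."""
    if len(data) > 8:
        raise ValueError("classic CAN payloads are at most 8 bytes")
    return struct.pack(CAN_FRAME_FMT, arbitration_id,
                       len(data), data.ljust(8, b"\x00"))

# Hypothetical frame -- any node on the bus that trusts this ID will act
# on it, which is exactly the weakness Miller and Valasek exploited.
frame = build_can_frame(0x123, b"\x01\x02")
print(len(frame))  # a SocketCAN frame is 16 bytes
```

The point the sketch makes is that CAN has no built-in sender authentication: frames carry an ID and a payload, nothing more, so whoever can write to the bus can impersonate any component on it.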
After Miller and Valasek's hacks made consistent headlines, the two were quietly hired by Uber to help the company secure its self-driving taxi service. Miller has since moved on to Chinese competitor Didi, and tells Wired he's much more free to speak about the perils of securing automated cars and taxis. What he's saying isn't what you'd call comforting:
"Autonomous vehicles are at the apex of all the terrible things that can go wrong,” says Miller, who spent years on the NSA’s Tailored Access Operations team of elite hackers before stints at Twitter and Uber. “Cars are already insecure, and you’re adding a bunch of sensors and computers that are controlling them… If a bad guy gets control of that, it’s going to be even worse."
The problems that Miller highlighted with the Jeep Cherokee are amplified when you're talking about a taxi that sees significantly more use each day. A taxi whose operator, under current federal law, won't be able to block consumer access to the vehicle's OBD2 port (something consumers want the freedom to tinker with in their own vehicle, but perhaps not so much in a communal car):
"There’s going to be someone you don’t necessarily trust sitting in your car for an extended period of time,” says Miller. “The OBD2 port is something that’s pretty easy for a passenger to plug something into and then hop out, and then they have access to your vehicle’s sensitive network."
Miller notes that securing an automated vehicle isn't impossible, but it's going to require the use of "code signing," restrictions built into the OBD2 port, better internal segmentation and authentication -- and basically a complete retooling of how self-driving vehicle security is implemented. But Miller notes that companies like Uber are bolting their computer systems onto already-built vehicles, which complicates things. And the slow pace of finding and patching security vulnerabilities in vehicles poses an additional layer of problems.
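The "code signing" idea Miller describes boils down to an ECU refusing to run firmware whose signature doesn't verify. Here's a deliberately toy sketch of that gatekeeping logic -- real deployments use asymmetric signatures (e.g. ECDSA) so the vehicle only ever holds a public key; an HMAC with a made-up key stands in here purely to keep the example stdlib-only:

```python
import hashlib
import hmac

SIGNING_KEY = b"factory-secret"  # hypothetical key, for illustration only

def sign_firmware(image: bytes) -> bytes:
    """What the manufacturer's build pipeline would do."""
    return hmac.new(SIGNING_KEY, image, hashlib.sha256).digest()

def ecu_accepts(image: bytes, signature: bytes) -> bool:
    """What the ECU would do before flashing: verify, or refuse."""
    expected = hmac.new(SIGNING_KEY, image, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

good_image = b"\x7fFW-v2\x00..."
sig = sign_firmware(good_image)
print(ecu_accepts(good_image, sig))            # True: genuine firmware
print(ecu_accepts(good_image + b"\xff", sig))  # False: tampered image rejected
```

Even this toy version shows why retrofitting is hard: every ECU on the bus has to enforce the check, and a single legacy component that flashes unsigned firmware undoes the rest.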
The solution will also involve greater "open conversation and cooperation" among carmakers and developers, something Miller says was lacking at Uber, and hasn't exactly been the trademark of other automated vehicle vendors.
Right now, we continue to find the lack of security in our smart fridges and TVs kind of cute. But it's threats like those being exposed by Miller that have some security researchers like Bruce Schneier consistently predicting some massive problems on the horizon that may result in notable human casualties. And we're not helping matters by letting companies monopolize repair, or consistently erode our privacy rights and our freedom to tinker.