$100 Bluetooth Hack Can Unlock All Kinds Of Devices, Including Teslas, From Miles Away

from the dumb-tech-is-smart-tech dept

While they’re not impervious, at least you know where you stand with a good, old-fashioned dumb lock. That’s in stark contrast to so-called “smart” locks, which studies have repeatedly shown to be easily compromised with minimal effort. One report showed that 12 of the 16 smart locks tested could be hacked with relative ease thanks to flimsy security standards.

Now there’s a new vulnerability to worry about. Sultan Qasim Khan, a researcher at NCC Group, has discovered a new Bluetooth vulnerability that’s relatively trivial to exploit with around $100 in hardware and potentially impacts thousands of Bluetooth devices, including Teslas.

The attack exploits a weakness in the Bluetooth Low Energy (BLE) standard adhered to by thousands of devices, including “smart” door locks, cars, laptops, and various “internet of things” devices. It’s a form of “relay attack” that usually requires two attackers, one near the target, and one near the phone used to unlock the target.

But this class of attack doesn’t even require two people. A relaying device can be placed near where the target device is located or will be located (like by your driveway), and the attacker can target the device from hundreds of yards — or even miles — away:

“Hacking into a car from hundreds of miles away tangibly demonstrates how our connected world opens us up to threats from the other side of the country—and sometimes even the other side of the world,” Sultan Qasim Khan, a principal security consultant and researcher at security firm NCC Group, told Ars. “This research circumvents typical countermeasures against remote adversarial vehicle unlocking and changes the way we need to think about the security of Bluetooth Low Energy communications.”

Device makers have implemented a bunch of countermeasures to protect against BLE relay attacks like these, but Khan found a way to circumvent those countermeasures. Many other companies are smart enough to avoid using BLE for proximity authentication at all (it was designed for data transfer, not authentication), but given that privacy and security are an afterthought for many companies, plenty still do.

All told, it’s just another reminder that dumb tech is often… smarter.



Comments on “$100 Bluetooth Hack Can Unlock All Kinds Of Devices, Including Teslas, From Miles Away”

That Anonymous Coward (profile) says:

“A relaying device can be placed near where the target device is located or will be located”

Say like a fancy restaurant where the valet parks your car a distance away.

Entrance to a mall, office building, all sorts of places.

Once again they failed to listen to (or hire) someone who looks at these awesome advances and asks, “but what if someone does x…”

I mean I’m not THAT smart and even I can see many of the issues with these sorts of things ‘making life easier’ by creating 10 more problems than they had before.

Again supports my truism that humans cannot learn. How many fscking BT hacks have we seen, and people keep using it in ways never intended, pretending their little twist will undo decades of “it’s not fscking designed to do that.”

Anonymous Coward says:

Re:

Relay attacks were documented for bank cards in 2005:

Relay attacks, also known as wormhole or chess grandmaster attacks, have been known of since at least 1987. In the context of EMV, we described how relay attacks could be used for fraud in the paper “Chip and Spin”.

To be fair, it’s not like people do much better with physical keys. Many walk around with their keys hanging off their pants, bitting fully visible for anyone with a camera to copy. And from the manufacturers’ side, I’ve heard of people getting into someone else’s car with their own key, noticing only after it fails to start the ignition (which I guess… actually checks the key properly?).

nasch (profile) says:

Re: Re:

I’ve heard of people getting into someone else’s car with their own key, noticing only after it fails to start the ignition (which I guess… actually checks the key properly?).

Right, if you use the physical key to unlock the door, I think it’s nothing more than a physical lock. But to start the car, the car’s ECU communicates with the key to ensure it’s correct, so even the right physical key with the wrong code will not start it.
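For what it’s worth, here’s a minimal sketch of the immobilizer idea described above: the car challenges the key’s transponder and only starts if the answer checks out. Real systems use dedicated transponder chips and proprietary (often weak) ciphers; the HMAC and names here are just illustrative stand-ins to show the shape of the exchange.

```python
import hashlib
import hmac
import secrets

SHARED_SECRET = b"paired-at-the-factory"  # illustrative per-key secret

def transponder_response(challenge: bytes, secret: bytes = SHARED_SECRET) -> bytes:
    # The key's transponder answers the car's challenge using its secret.
    return hmac.new(secret, challenge, hashlib.sha256).digest()

def ecu_allows_start(respond) -> bool:
    # The car's ECU issues a fresh random challenge and verifies the answer.
    challenge = secrets.token_bytes(16)
    expected = hmac.new(SHARED_SECRET, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(respond(challenge), expected)

# A key with the right physical cut but the wrong transponder secret opens
# the door, yet fails this check, so the engine won't start.
print(ecu_allows_start(transponder_response))                             # True
print(ecu_allows_start(lambda c: transponder_response(c, b"wrong-key")))  # False
```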

Anonymous Coward says:

Re: Re: Re:

Even in the days of non-electronic keys (1980s-’90s), this was known to happen in large parking lots (where car-doppelgangers were likely to be found). Each lock on a car may accept subtly different keys—e.g., a valet key wouldn’t open the glove compartment. I’m not sure if there was any good design rationale for a key that could open the doors but not turn on the car. Maybe you could order a special key so your butler could put bags in or out?

If you’ve got an electronic key, make sure you know how the physical backup works—and practice with it. A family member had to figure this out in the dark. The fob actually had some kind of miniature physical key inside, and the car had one physical lock on the driver’s door—but hidden behind some tiny cover that had to be slid away. Make sure you’ve got a physical backup for your garage door too, if relevant; I heard about someone who couldn’t get into their garage because the circuit breaker for their opener had tripped, and the electrical panel was inside the garage… (You can’t make this stuff up. And why would you? That would be a waste of everyone’s time).

Anonymous Coward says:

Re: Re: Re:2

3D-printed plastic keys can be durable enough to be used once or twice. People did it with the leaked TSA keys.

Is anyone surprised that there’s at least one online service that actually will ship you a key based on a photo:

So here’s how it works: you take a photo of your key with your smartphone, upload it to the site, and the company sends you a duplicate in less than a week. That’s it. […] While it may seem practical […], the site doesn’t require its users to verify the key’s ownership.

(That last sentence is kind of bullshit. The typical lock one might buy at Home Depot doesn’t include key registration, certificates of ownership, or anything like that. For those who don’t want to order online, Home Depot also has a rack of key blanks at the end of the lock aisle—unguarded, and as easy to palm as a washer, and probably with less surveillance because who’s gonna steal a blank key?)

Naughty Autie says:

Re: Re:

I’ve heard of people getting into someone else’s car with their own key, noticing only after it fails to start the ignition.

That’s nothing. When my uncle was a teenager in the ’90s, he and his gang of friends would frequently get into the Land Rover of one of the friends’ parents with their own front door keys. Beat that!

Christenson says:

Butt Dialing

This goes back to not having effective locks on my standard phone — and the phone taking action (such as dialing, or hanging up on friends, or unlocking cars/houses) without actual permission from its owner.

This kind of man-in-the-middle relay is possible because interaction isn’t required from the user to unlock, and no amount of cryptography in the protocol can fix that.

Apple, if you are reading, I want a pocket mode for my phone where it won’t unlock or do anything until the phone is in my hand and I have triple-clicked on it or something. Guided Access is close, but doesn’t quite do it.

That Anonymous Coward (profile) says:

Re:

Hey Guys Guys!!!
I think I found the FanBoi…

If the idea that a shitty feature would be enough to sink a company gets you to say anything, you might be a fanboi.

Whenever anyone says anything remotely connected to Tesla and your blood pressure spikes, you might be a fanboi.

If you managed to read the article and your singular takeaway was that it was an attack on Tesla, you might be a fanboi.

While the example is a Tesla, the problem isn’t only found in Teslas, but your brain can’t actually process that information… sorta like autodrive and those posts that keep cars off bike lanes.

Run along now, the adults are talking about how despite the many failures of BT to do wondrous things, they keep using it and failing.

Upstream (profile) says:

What will It take?

For people to start taking the absurd lack of security in IoT devices seriously?

I fear it will take people getting killed due to lack of security on their IoT devices. And not just any people, but lots of “people who matter” will need to perish. No one will care about a few peons here or there. I fear it will take much more than that.

I mean, it took 2 plane crashes and 346 people getting killed for the FAA and the other national aviation authorities to start taking the problems with Boeing’s 737 MAX seriously, and they basically had to ground all the planes for Boeing to start taking the problem seriously, and there was very big money involved in that situation.

I think it will likely be much more difficult to get countless manufacturers, both domestic and foreign, to take relatively cheap IoT device security seriously.

Anonymous Coward says:

Re:

What will It take? For people to start taking the absurd lack of security in IoT devices seriously?

Effective financial penalties. Which, as you said, is what really got Boeing moving. Forced recalls of defective products would be one way to do it. (We’d have to reject the common manufacturer defense of “but we stopped selling those literally months ago! How can anyone still expect software updates!?”)

Ehud Gavron (profile) says:

AC got it

Yeah, this is nothing new. AC posted a link from 2005. There are links to using car FOBs from inside your home with a relay.

Authentication is key [sorry, pun not intended]. Usually encryption is a part of this, but auth can be done without enc.

However, NO auth and NO enc and analog signal repeat means garage door openers, cars, are all fair game. Too many Goog hits to post.

This has NOTHING TO DO WITH IOT. It’s just idiots creating a mechanism to provide access, and not securing it at all.

sigh

Upstream (profile) says:

Re: NOTHING TO DO WITH IOT.

I think that one of the ACs is correct that significant financial considerations will be the proximate cause of changes in device security, and I think Ehud Gavron is correct that the problem is not limited to IoT devices, but shows up in many other electronic access situations (and non-electronic ones) as well. For instance, the Boeing case was neither an IoT issue nor an access issue, but rather a much more general “really bad design” issue.

But I think those financial considerations will not arise until there is significant loss of life due to lack of proper security, or, more generally, a lack of proper design standards. That would make the loss of life the “cause in fact” (in legal parlance) of the changes in device security measures or design standards.

Anonymous Coward says:

Re: Re:

But I think those financial considerations will not arise until there is significant loss of life due to lack of proper security

Maybe in general, but when talking specifically about cars, the elephant in the room is insurance. The insurers could band together and refuse to insure against theft any car made after (e.g.) 2025 that’s vulnerable to such attacks. Things might go similarly for house locks and garage door openers.

That’s how the UL got started long ago. More recently, however, it was government regulators rather than insurers who required immobilizers in new cars.

Upstream (profile) says:

Re: Re: Re: Insurers

It seems to me that there are many problems that insurers could mitigate by refusing to provide insurance. They could go a long way toward solving the police accountability problem by refusing to insure municipalities that hire, or refuse to fire, out-of-control cops. But for whatever reason, insurers rarely take such actions. They will refuse homeowner’s insurance (or raise the rates prohibitively) if you own the wrong breed of dog, and they will drop your auto insurance for too many claims (like >2) even if they are through no fault of yours, but they won’t drop cities with cops who have a proven track record of costing them big $$ in the form of settlements with their victims. I don’t know the details about why they won’t, but I am sure it somehow boils down to $$ in the end.

Anonymous Coward says:

Re: Re: Re:2

They will refuse homeowner’s insurance (or raise the rates prohibitively) if you own the wrong breed of dog, and they will drop your auto insurance for too many claims (like >2) even if they are through no fault of yours, but they won’t drop cities with cops who have a proven track record of costing them big $$ in the form of settlements with their victims. I don’t know the details about why they won’t, but I am sure it somehow boils down to $$ in the end.

You’re hinting at the answer. They’ll raise the rates prohibitively, and then they’ll make more money (because they’re gonna take some extra profit on the increased premiums, and because they’re gonna use the worst cities as the excuse to raise premiums for other cities). Unlike a person, a city’s just gonna pay it. As a city taxpayer, what are you gonna do except bitch when your taxes go up?

You can always buy a “good” breed of dog, but where’s a city gonna find a good breed of cops?

Rocky says:

Re: Re:

For instance, the Boeing case was neither an IoT issue nor an access issue, but rather a much more general “really bad design” issue.

It doesn’t stop at bad design; not informing pilots of a system that starts changing the flying characteristics of the plane in certain conditions compounds the problem.

Who the fuck thought it was a good idea to have such a system rely on only one pitot sensor, a type of sensor that regularly has problems, like moisture infiltration, icing up, or mud-daubers trying to nest in them?

In the end the cause is the same as with any security issue: flat out ignoring or not assessing all the risks, because that keeps the cost down.

Naughty Autie says:

Re: Re:

But I think those financial considerations will not arise until there is significant loss of life due to lack of proper security, or, more generally, a lack of proper design standards.

So basically, we need more burglaries that turn into robberies with hostage situations and abductions at the very least. 🙁

Anonymous Coward says:

Re:

That’s “fob”, not “FOB” (the latter being a shipping term).

However, NO auth and NO enc and analog signal repeat means garage door openers, cars, are all fair game. Too many Goog hits to post.

It’s not that simple. An encrypted digital signal with mutual authentication may still be “repeatable” meaning “relayable”: you receive and replay each end of the signal (but only once). It’s still the legitimate fob talking to the legitimate car or door-opener, which the relayer can neither decode nor modify nor re-use at a later time. Much as how 10-20 internet routers are repeating my encrypted and authenticated comment packets heading to techdirt.com; one could reroute them via the moon without triggering any TLS error.
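As a purely conceptual illustration of that point (not NCC Group’s tool, and not actual Bluetooth code), here’s roughly what a relay does: it shuttles bytes between two endpoints without understanding or modifying them, so encryption and authentication pass through intact. Plain TCP sockets stand in for the radio link; all names are illustrative.

```python
import socket
import threading

def pipe(src: socket.socket, dst: socket.socket) -> None:
    # Copy bytes from src to dst until the connection closes.
    while True:
        data = src.recv(4096)
        if not data:
            break
        dst.sendall(data)  # forwarded unchanged: encryption survives intact

def relay(listen_port: int, target_host: str, target_port: int) -> None:
    # Accept one client and splice it to the target, relaying both directions.
    with socket.create_server(("", listen_port)) as server:
        client, _ = server.accept()
        upstream = socket.create_connection((target_host, target_port))
        threading.Thread(target=pipe, args=(client, upstream), daemon=True).start()
        pipe(upstream, client)
```

The relay never sees plaintext and never forges anything; it only moves legitimate traffic farther than the endpoints expect, which is exactly why encryption alone doesn’t stop it.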

There are really 2 relevant ways to fix it:
1) Make sure the exchange only happens when initiated by the user, perhaps when they press a button on their fob or phone. (Perhaps a slight annoyance if one’s headed to the trunk with full hands; a buttonpress that activates for a few minutes could handle that case without a huge security risk.)
2) If operating without user input, measure the round-trip transmission time, and fail unless the devices are close. Each nanosecond indicates 15 centimeters of displacement. It’s easier than one might think to measure with nanosecond precision in a cheap device—every GPS receiver does it, and I believe EMV payment terminals do it to prevent the 2005 attack.

(A third possibility, needlessly complex and esoteric, would be to use the principles behind Quantum Key Distribution to quantum-mechanically detect the relaying itself.)
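To make the arithmetic in option 2 concrete, here’s a minimal sketch of a round-trip-time check, assuming a hypothetical challenge/response API. Real distance bounding happens in radio hardware with nanosecond-precision timestamps (UWB chips, for instance), not in application code; the numbers below are only illustrative.

```python
import time

SPEED_OF_LIGHT_M_PER_NS = 0.2998  # light covers roughly 30 cm per nanosecond
MAX_DISTANCE_M = 2.0              # only unlock if the fob is within ~2 meters
PROCESSING_BUDGET_NS = 1_000      # fixed fob processing time, calibrated per device
# The round trip covers the distance twice, so each extra nanosecond of RTT
# corresponds to ~15 cm of extra separation.
MAX_RTT_NS = 2 * MAX_DISTANCE_M / SPEED_OF_LIGHT_M_PER_NS + PROCESSING_BUDGET_NS

def fob_is_close_enough(send_challenge, receive_response) -> bool:
    # Time one challenge/response exchange; a relay adds propagation delay
    # plus its own processing, pushing the round trip over budget.
    start = time.perf_counter_ns()
    send_challenge()      # e.g. a random nonce the fob must echo or sign
    receive_response()    # blocks until the fob's reply arrives
    rtt_ns = time.perf_counter_ns() - start
    return rtt_ns <= MAX_RTT_NS
```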

While “hundreds of miles away” makes for a good headline, the more realistic attack to guard against is a car thief walking around a neighborhood with a radio repeater/amplifier until they see a car unlock—maybe using the fob in your bedroom to steal the car from your driveway while you sleep.

That Anonymous Coward (profile) says:

I’m having a Pepperidge Farm moment…

Not too terribly long ago there were all of these car thefts happening to high-end cars, stolen from people’s driveways, and car makers & cops refused to admit that extending the fob’s range could be at fault.

They were super criminals blah blah blah…

You own a Lexus and 2 dudes can drive away with your car in minutes… and the best suggestions were to add a steering wheel club or a hidden kill switch to the car.
Was a big thing in Canada for a while.

While just pressing a button & your car just working seems like a great idea… if people knew a couple of kids could take off with your car in minutes because you have this feature… how popular would it actually be?

People assume that so many of these things must be secure, & yet there aren’t any rules or even strong suggestions about how to make them more secure.

Something something garages & homes being robbed because there were only so many codes you could use & the door into the house always left unlocked.
Something something rolling codes will keep everyone safe!
Something something and when you press the button this unit sends out ALL possible codes in about a minute…
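A rough back-of-the-envelope for that last point, assuming a classic fixed-code opener with 8-12 DIP switches; the bit rate and framing overhead below are guesses for illustration, not measured values.

```python
BITS = 12                    # DIP switches on an older fixed-code opener
CODES = 2 ** BITS            # 4096 possible codes
BITS_PER_ATTEMPT = BITS * 5  # guess: framing and repeats inflate each try
BAUD = 2000                  # guess at the on-off-keying transmit rate, bits/sec

seconds = CODES * BITS_PER_ATTEMPT / BAUD
print(f"{CODES} codes, naive brute force in about {seconds / 60:.1f} minutes")
# Overlapping the attempts (a De Bruijn sequence) shrinks this to seconds,
# which is how the well-known "open any fixed-code door" demos work.
```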

Christenson says:

Fobs and security

Since TAC is telling stories, I think I should mention that I had problems setting rental cars into panic mode because the pens and my personal keys in my front pocket were pressing buttons on the key fob. These things really need a cover that prevents that.

My 2017 car and my 2010 car are both happy to unlock the doors and start when the fob is merely nearby. So yeah, a key fob radio extension is definitely a security problem, because telepresence is now a commercial triviality. On any kind of payment card, again, the radio is the problem.

I would be curious if demanding an unlock code is a sufficient mitigation.

Anonymous Coward says:

Re:

I would be curious if demanding an unlock code is a sufficient mitigation.

If it’s the car demanding the user type in a code, yes. If it’s the fob, that’s overkill: all you need to do is make sure the user wanted to start or unlock their car, and a button is enough.

(Kind of. It doesn’t prevent the relaying, but the idea is that you’ll only press the button when it’s in view and no suspicious people are near it.)
