Just As Attorney General Barr Insists iPhone Users Have Too Much Security, We Learn They Don't Have Nearly Enough

from the well,-look-at-that dept

You may recall that a few years back, John Oliver devoted one of his always excellent Last Week Tonight episodes to encryption. It concluded with an “honest Apple commercial” that highlighted the difficulty of keeping phones secure, noting that it’s a constant war against malicious attackers who are always trying to find new ways to break into people’s phones:

That commercial is a lot more realistic than people might think. And late last week, Google revealed a pretty astounding iOS exploit campaign that broadly targeted anyone who visited a series of compromised websites, using a combination of zero-day attacks that allowed the attackers to more or less own the iPhone of anyone who had visited the sites. As Wired noted in its piece about this attack, it changes most of what we know about iPhone attacks these days. At the very least, it demolished the idea that most iPhone hacking really only targeted key individuals.

It also represents a deep shift in how the security community thinks about rare zero-day attacks and the economics of “targeted” hacking. The campaign should dispel the notion, writes Google Project Zero researcher Ian Beer, that every iPhone hacking victim is a “million dollar dissident,” a nickname given to now-imprisoned UAE human rights activist Ahmed Mansour in 2016 after his iPhone was hacked. Since an iPhone hacking technique was estimated at the time to cost $1 million or more — as much as $2 million today, according to some published prices — attacks against dissidents like Mansour were thought to be expensive, stealthy, and highly focused as a rule.

The iPhone-hacking campaign Google uncovered upends those assumptions. If a hacking operation is brazen enough to indiscriminately hack thousands of phones, iPhone hacking isn’t all that expensive, says Cooper Quintin, a security researcher with the Electronic Frontier Foundation’s Threat Lab.

“The prevailing wisdom and math has been incorrect,” says Quintin, who focuses on state-sponsored hacking that targets activists and journalists. “We’ve sort of been operating on this framework, that it costs a million dollars to hack the dissident’s iPhone. It actually costs far less than that per dissident if you’re attacking a group. If your target is an entire class of people and you’re willing to do a watering hole attack, the per-dissident price can be very cheap.”
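Quintin’s amortization point can be made concrete with some back-of-the-envelope math. The $2 million figure below reflects the published exploit prices mentioned above; the victim count is purely a hypothetical assumption for illustration:

```python
# Rough per-victim cost of a watering-hole campaign. The exploit-chain
# price is a one-time cost that gets spread across every visitor to the
# compromised sites, rather than being paid per target.
exploit_chain_price = 2_000_000  # dollars (published estimate cited above)
victims = 4_000                  # hypothetical number of site visitors

cost_per_victim = exploit_chain_price / victims
print(f"${cost_per_victim:,.0f} per victim")  # prints "$500 per victim"
```

The exact numbers don’t matter; the point is that a fixed exploit price divided across thousands of indiscriminate victims shrinks the per-dissident cost by orders of magnitude.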

Now, it’s true that device encryption has nothing to do with this attack — in fact, the attack could be seen as a way around device encryption, since it planted malware on your phone that could slurp up your data once the phone decrypted it locally — but it does strike me as yet another condemnation of Attorney General William Barr’s utter nonsense lately about how the average consumer doesn’t need that much phone security these days. If you’ll recall, Barr shrugged off concerns about banning real encryption by saying that since all phones have some security vulnerabilities, what’s a few more:

All systems fall short of optimality and have some residual risk of vulnerability — a point which the tech community acknowledges when they propose that law enforcement can satisfy its requirements by exploiting vulnerabilities in their products. The real question is whether the residual risk of vulnerability resulting from incorporating a lawful access mechanism is materially greater than those already in the unmodified product. The Department does not believe this can be demonstrated.

The Department of Justice and Barr are wrong. Encryption still remains not just a key piece of fighting these vulnerabilities, but one of the most important. Creating “lawful access” points is worse than taking away a protection, it’s literally enabling a multitude of new vulnerabilities — and playing right into the hands of people looking to exploit such vulnerabilities.

Indeed, as the Wired article notes, as surprising and unexpected as the latest vulnerabilities were, it’s notable that they appear to have been out there for quite some time, with many, many victims, and no one spotted them even though the attackers were super sloppy:

The hackers still made some strangely amateurish mistakes, Williams points out, making it all the more extraordinary that they operated so long without being detected. The spyware the hackers installed with their zero-day tools didn’t use HTTPS encryption, allowing anyone on the same network as a victim to read or intercept the data it stole in transit. And that data was siphoned off to a server whose IP addresses were hardcoded into the malware, making it far easier to locate the group’s servers, and harder for them to adapt their infrastructure over time. (Google carefully left those IP addresses out of its report.)
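The hardcoded IP addresses mentioned in the excerpt above are exactly the kind of artifact that makes infrastructure easy to locate: dotted-quad strings stand out in a binary, and a command-and-control address baked into the malware can’t be rotated without shipping a new build. A minimal triage sketch, with entirely made-up sample bytes using a documentation-reserved IP range:

```python
import re

# A crude but common first analysis step: scan a binary blob for
# IPv4-looking strings. This is one reason hardcoded exfiltration
# servers are poor tradecraft compared to, say, resolving a domain.
IPV4_RE = re.compile(rb"(?:\d{1,3}\.){3}\d{1,3}")

def find_hardcoded_ips(blob: bytes) -> list[str]:
    """Return candidate IPv4 addresses embedded in raw bytes."""
    hits = []
    for match in IPV4_RE.finditer(blob):
        octets = match.group().split(b".")
        if all(int(o) <= 255 for o in octets):  # discard e.g. 999.1.2.3
            hits.append(match.group().decode())
    return hits

# Hypothetical "malware" blob containing an exfiltration endpoint.
sample = b"\x00\x01POST /upload\x00203.0.113.5\x00garbage 999.1.2.3"
print(find_hardcoded_ips(sample))  # prints ['203.0.113.5']
```

Real malware analysis involves far more than string scanning, of course, but the point stands: an attacker who hardcodes server addresses hands defenders a stable, greppable indicator.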

Given the mismatch between crude spyware and highly sophisticated zero-day chains used to plant it, Williams hypothesizes that the hackers may be a government agency that bought the zero-day exploits from a contractor, but whose own inexperienced programmers coded the malware left behind on targeted iPhones. “This is someone with a ton of money and horrible tradecraft, because they’re relatively young at this game,” Williams says.

And that certainly suggests that there are likely already much more sophisticated attacks out there — and if not, many more are coming soon. And they will target any and all possible vulnerabilities — including any “backdoor” the DOJ/FBI demands that device makers install. Contrary to what you may have heard, the debate over backdoors is not a fight between “security and privacy.” It’s a debate between “security for most people, and rare instances where law enforcement doesn’t want to do basic detective work and wants everything handed to them.”

This latest revelation should make many more people aware of the security challenges of protecting connected devices. But it should also re-emphasize how utterly ludicrous it would be to purposefully insert new vulnerabilities into phones because the DOJ can’t be bothered to do its job properly.

Companies: apple, google


Comments on “Just As Attorney General Barr Insists iPhone Users Have Too Much Security, We Learn They Don't Have Nearly Enough”

Anonymous Coward says:

The real question is whether the residual risk of vulnerability resulting from incorporating a lawful access mechanism is materially greater than those already in the unmodified product.

Yes, it is much greater, because it is known to exist, and who to infiltrate to get the details is also known. What makes it even worse is that it would be illegal to fix the vulnerability without government permission.

Category 5 hurricane! -- But for you, Category 2. says:

You have NO ACTUAL security now! Only if NOT targeted.

Barr is right that another backdoor doesn’t make anyone less "secure".

By the way, Apple and Google BOTH provide "direct access" to NSA as Snowden said. You cannot defend against the operating system itself. Even capturing actual data sent will tell you nothing: the heap of mystery "telemetry" goes to an Apple server first.

Anonymous Coward says:

Re: Re:

And one more hole in a dam doesn’t let any more water through, right?

By the way, Apple and Google BOTH provide "direct access" to NSA as Snowden said.

No they don’t, and no he didn’t. They provide code for review upon request (sometimes). That’s not the same as giving them direct access to live operating systems.

You cannot defend against the operating system itself.

Actually, you can, but regardless, the operating system is not what is attacking you.

Even capturing actual data sent will tell you nothing: the heap of mystery "telemetry" goes to an Apple server first.

Wut. You just undercut your own argument with this statement. It’s wrong, but it’s still contradictory to everything else you just said.

Anonymous Coward says:

Re: Re:

You cannot defend against the operating system itself.

So your argument is that between the companies who provide the hardware and software everyone uses, and the government who demands the surveillance and runs it… it’s the companies you want to focus on? Not the government doing questionable things? It’s the operating system you want to bitch about?

Give me a break.

That One Guy (profile) says:

'What's one more (possibly literally) fatal flaw?'

Arguing that since no security is perfect there shouldn’t be a problem adding a known security vulnerability is rather like saying that since people still die in car crashes despite the presence of things like seat-belts, crumple zones and other safety features there shouldn’t be a problem trimming some of those out.

Anonymous Coward says:

Re: 'What's one more (possibly literally) fatal flaw?'

I’d say it’s similar to adding something to a vehicle to let the police control the brakes on your car. Surely there couldn’t be any problems with that? And any vehicle fatalities that result because of it are dwarfed by the existing fatality count from existing issues, so what’s the big deal?

bhull242 (profile) says:

Not entirely relevant but related topic

One thing a lot of people implicitly assume is that everyone who uses a device runs the most up-to-date firmware and OS, and that’s simply not true. Although in many cases this comes down to laziness or carelessness, a lot of it has to do with limited memory, limited internet, or incompatibilities introduced in the update (either with the hardware or with some necessary or incredibly useful software). There are also cases where an update causes things to run slower or worse than before, or creates unwanted changes to the UI.

As such, there are a lot of devices running on old firmware or OSs with vulnerabilities patched in later versions. There’s not much that can be done about those specific problems.

Anonymous Coward says:

As a recently converted Apple tool (slowly approaching my one-year mark) I have to disagree with the content of the article. While the price of the hack per person has probably decreased well below the $1 million mark, the extreme precision and limitation of the exploit resulted in it being deployed on, indeed, targeted individuals (e.g., Uyghurs in Xinjiang, China). It does help to remember that this subterfuge was in addition to the checkpoint system authoritarian China already deploys, which includes facial recognition, rubber-hose decryption (compelled installation of a spyware app to proceed through the checkpoint), QR codes on apartments so the police can see photos of who’s supposed to live inside of them, and prosecution for observing religious rituals such as Ramadan. None of these things would save you if you had an Android phone; they have an app for that OS too.

It does suck that this has been lurking around for years, but it would’ve gained traction far more quickly had it targeted Americans. This goes for much technological exploitation in the west; Russians and the third world have long suffered from disinformation campaigns, bot accounts, and voter fraud amidst an election. It’s sadly time we in the west see our fair share.
