New Report Says Apple Dropped Plans To Fully Encrypt Backups After FBI Complained

from the encrypt-all-the-things dept

As Attorney General William Barr and other law enforcement officials continue to insist (falsely) that Apple refuses to cooperate with them in undermining encryption and security on all iPhones, plenty of people have been pointing out for years that the reality is that most iPhone encryption is effectively meaningless: if a user has iCloud backups turned on, Apple retains the key to that data and can (and does!) open it up for legitimate law enforcement requests. In other words, it’s extremely rare that full device encryption actually keeps law enforcement out (and that leaves aside the fact that technological solutions exist for law enforcement to hack into most iPhones anyway). Indeed, as you might recall, during the FBI’s last big fight about encryption with Apple, over San Bernardino shooter Syed Farook’s iPhone, it was revealed that the FBI’s own incompetence resulted in Farook’s backups being wiped out before the FBI had a chance to access them.

For quite some time now, EFF and others have urged Apple to close this loophole and allow for truly encrypted iCloud backups, such that even Apple can’t get in. Apple has toyed with the idea, but as Tim Cook has said a few times, the company chose not to do it this way after weighing the pros and cons from a user’s perspective. The key issue: if something is fully encrypted and Apple doesn’t have the key, if you lose your password, the data is effectively gone. There is no “password reset” if Apple doesn’t retain the key:

There our users have a key and we have one. We do this because some users lose or forget their key and then expect help from us to get their data back.

However, in that same interview, Cook did suggest that Apple would move towards encrypting backups as well:

It is difficult to estimate when we will change this practice. But I think that will be regulated in the future as with the devices. So we will not have a key for it in the future.
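The key-escrow tradeoff Cook describes can be sketched in a few lines. This is a toy illustration only, not real cryptography or Apple's actual implementation; all names (`derive_key`, `escrow`, the user ID) are hypothetical. The point is simply that with escrow the provider can recover (or surrender) the key, while without it a forgotten password leaves the data unrecoverable:

```python
# Toy sketch of escrowed vs. non-escrowed backup keys.
# NOT real cryptography; purely illustrative of the tradeoff.
import hashlib
import os

def derive_key(password: str, salt: bytes) -> bytes:
    # Derive a backup key from the user's password (PBKDF2-HMAC-SHA256).
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

salt = os.urandom(16)
key = derive_key("correct horse", salt)

# Escrowed model (reportedly what iCloud backups use): the provider
# keeps a copy of the key, so a forgotten password is recoverable --
# but the provider can also hand that key to law enforcement.
escrow = {"user-123": key}

# Non-escrowed ("fully encrypted") model: only the original password
# re-derives the key. Wrong password, wrong key, data effectively gone.
assert derive_key("correct horse", salt) == key
assert derive_key("wrong guess", salt) != key
```

The "no password reset" problem falls directly out of the second model: there is no `escrow` table to consult, so nothing short of the password itself recovers the key.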

I think that there are legitimate user-centric reasons for the decision that Apple made, though it seems clear that many, many people don’t realize that Apple still has the key to their backups. However, a new report from Reuters says that Apple killed plans to offer fully encrypted backups after the FBI got upset about it:

Apple Inc dropped plans to let iPhone users fully encrypt backups of their devices in the company’s iCloud service after the FBI complained that the move would harm investigations, six sources familiar with the matter told Reuters.

The tech giant’s reversal, about two years ago, has not previously been reported. It shows how much Apple has been willing to help U.S. law enforcement and intelligence agencies, despite taking a harder line in high-profile legal disputes with the government and casting itself as a defender of its customers’ information.

At the very least, this shows (yet again) that Barr and other law enforcement officials are blatantly lying when they say that Apple does not cooperate with law enforcement or that it doesn’t take the concerns they raise seriously. On the flip side, it is a bad look for Apple, in that it has chosen to avoid a more secure option for its users’ data, going against the company’s long-standing public support for encryption and protecting users’ data.

Again, even if there is a legitimate reason for not encrypting backups — and it’s equally true that if Apple did offer it, there would be public complaints of people no longer having access to their data — it’s troubling that Apple won’t even make this an option (with clear warning statements) for end users, and that they’re doing so because of blatant fearmongering by law enforcement officials.

Of course, the other way one might look at this decision is that if Apple had gone forward with fully encrypting backups, then the DOJ, FBI and other law enforcement would have gone even more ballistic in demanding a regulatory approach that blocks pretty much all real encryption. If you buy that argument, then failing to encrypt backups is a bit of appeasement. Of course, with Barr’s recent attacks on device encryption, it seems reasonable to argue that this “compromise” isn’t enough (and, frankly, probably would never be enough) for authoritarian law enforcement folks like Barr, and thus, it’s silly for Apple to even bother to try to appease them in such a manner.

Indeed, all of this seems like an argument for why Apple should actually cooperate less with law enforcement, rather than more, as the administration keeps asking. Because even when Apple tries to work with law enforcement, it gets attacked as if it has done nothing. It seems like the only reasonable move at this point is to argue that the DOJ is a hostile actor, and Apple should act accordingly.



Comments on “New Report Says Apple Dropped Plans To Fully Encrypt Backups After FBI Complained”

28 Comments
This comment has been deemed insightful by the community.
Anonymous Coward says:

Re: Re:

It’s not difficult to make your own encryption technology

But it is very difficult to make a strong encryption technology. Almost all roll-your-own encryption schemes have a weakness the implementer is blind to, which makes them easy to break.

Also, the iPhone is locked down, and not easy to program without becoming an Apple developer.
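The kind of blind spot the commenter describes can be made concrete with one classic example. The sketch below is a deliberately broken toy cipher (the `toy_encrypt` name and the design are hypothetical, invented for illustration): it derives a fixed XOR keystream from the key alone, with no nonce, so encrypting two messages leaks their XOR to any eavesdropper:

```python
# A classic roll-your-own pitfall: keystream reuse.
# Deliberately insecure; for illustration only.
import hashlib

def toy_encrypt(key: bytes, msg: bytes) -> bytes:
    # The bug: the keystream depends only on the key (no nonce),
    # so every message is XORed with the exact same bytes.
    stream = hashlib.sha256(key).digest()
    return bytes(m ^ s for m, s in zip(msg, stream))

c1 = toy_encrypt(b"secret", b"ATTACK AT DAWN")
c2 = toy_encrypt(b"secret", b"ATTACK AT DUSK")

# An eavesdropper XORs the two ciphertexts: the keystream cancels out,
# leaving plaintext1 XOR plaintext2. Identical prefixes show up as zeros.
leak = bytes(a ^ b for a, b in zip(c1, c2))
assert leak[:11] == b"\x00" * 11  # "ATTACK AT D" is shared by both messages
```

To the implementer the ciphertexts look like random noise, which is exactly why this class of flaw goes unnoticed until someone else breaks it.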

This comment has been deemed insightful by the community.
That One Guy (profile) says:

Re: Re:

That is an almost impressive amount of wrong in just two sentences.

Yes it could be ‘easy’ to make your own encryption, but making secure encryption is another thing entirely, as even the stuff pored over and/or created by experts with massive funding can and does have exploitable holes.

As for encryption not being really useful, that’s not just laughably wrong, it’s dangerously wrong. Even with encryption and other security there are occasional massive security breaches and leaks of personal data; take away encryption and that would be happening on a daily basis, if not more often, leaving things like medical and financial data there for the taking for anyone who wanted it.

This comment has been deemed insightful by the community.
Tin-Foil-Hat says:

Re: Re: Re:

I think it’s hard for the average person to use encryption outside of the product provided by the phone manufacturer.

You don’t necessarily need to reinvent the wheel. If you wanted encrypted backups you would do better to use a reputable existing product: shut off automatic backups, encrypt your data, and save it wherever you like, such as to your PC, the cloud, etc.

It would be easy for someone who was a computer power user but not so easy for anyone else.

Could you do that with an iPhone as easily as an Android? I don’t think so.

Anonymous Coward says:

Re: Re: Re: Re:

I think it’s hard for the average person to use encryption outside of the product provided by the phone manufacturer.

Then you are clearly mistaken. Creating secure encryption products is hard, but using secure encryption products isn’t nor should it be*. Anyone trying to make using secure encryption harder or impossible should be viewed with extreme suspicion.

As I’ve said before, Apple could lock down iOS devices to reject any encryption product that wasn’t approved by the powers that be, but I’ve yet to see it. There are apps on Apple’s app store that provide encryption.

*: Note that for encryption to be useful, the user must understand what they are attempting to accomplish. No amount of over-simplification here is good because it opens the user up to multiple attacks. (Supply chain, backdoors, key escrow, the list goes on.) Ultimately the encryption product you install and configure yourself will be better than what is provided to you by a device manufacturer, as long as you understand enough of it to not screw yourself over. A.K.A. don’t drive a loaded-down semi-truck on the interstate before you know how to drive a bumper car at an amusement park.

bhull242 (profile) says:

Re: Re: Re:3 Re:

[Apple] will remove anything that competes with one of their products

[citation needed]

I mean, they absolutely could (as noted by the other AC), but I don’t see evidence that they do. They allow others’ GPS maps, others’ music players/streamers, others’ word processors, others’ email apps, others’ web browsers, others’ spreadsheet apps, others’ music recorders, others’ cloud-based services, others’ calculators, others’ note-taking apps, others’ clock/timer/alarm apps, others’ calendars, others’ email apps, others’ gaming platforms (Google Stadia), others’ slideshow apps, others’ document managers, others’ ad services, etc. I think they just don’t allow others’ operating systems and app stores.

This comment has been flagged by the community.

This comment has been deemed insightful by the community.
That One Guy (profile) says:

Damned if you do, damned if you don't

Indeed, all of this seems like an argument for why Apple should actually cooperate less with law enforcement, rather than more, as the administration keeps asking. Because even when Apple tries to work with law enforcement, it gets attacked as if it has done nothing. It seems like the only reasonable move at this point is to argue that the DOJ is a hostile actor, and Apple should act accordingly.

It’s actually worse than that: not only are they attacked by dishonest individuals claiming that they aren’t doing anything, but Apple has also had the fact that it is willing to work with law enforcement, both currently and in the past, used against it, with the argument being that if Apple is willing to do A then they should have no problem doing B, even if it is significantly different both in scope and difficulty.

If Apple offers some help for one thing then they are attacked for refusing to offer help on another thing.

If they don’t offer all the help demanded then it is claimed that they aren’t willing to help at all.

If they’re going to be attacked no matter what they do then as the article notes they should probably consider no longer giving their opposition(and at this point with how they’re treated that seems an appropriate label for law enforcement) ammo to be used against them, and if law enforcement tries to attack them for that they can simply point out that it’s already been claimed that they aren’t doing anything, they’re simply making it so those claiming such are no longer lying.

That Other Guy says:

Re: Damned if you do, damned if you don't

Very well said. I would stress that the DOJ making this bad-faith argument is a bullying tactic that really should raise some red flags about where this is headed. It seems clear that no matter how much help Apple may offer, as long as it tries to maintain some balance between the privacy interests of its customers and the legitimate investigatory needs of law enforcement, it’s going to get slammed, because as far as law enforcement is concerned there’s no balancing to be done. There is no right to privacy, they need all your data. What’s not to understand?

Anonymous Coward says:

Re: Damned if you do, damned if you don't

Heck, I just realized that the FBI is like the bad user everyone hates to support.
They demand to be able to do things in a specific way, according to their own (mostly fantasy) notion of how it should be done, and when the application or security procedures block that, they message the manager complaining about how bad the support was.

This comment has been deemed insightful by the community.
Scary Devil Monastery (profile) says:

Re: Damned if you do, damned if you don't

"…if Apple is willing to do A then they should have no problem doing B, even if it is significantly different both in scope and difficulty."

Where did I hear that argument before? Ah, right, it’s what MPAA and RIAA enforcers keep using in their attacks against ISPs.

You’d think that the DoJ using the same tactics copyright trolls do ought to ring a few alarm bells. This sort of crap should have made waves. And yet everyone appears to accept it as "business as usual" for US law enforcement…

This comment has been deemed insightful by the community.
This comment has been deemed funny by the community.
Anonymous Anonymous Coward (profile) says:

But we’re doing this for YOU!

I wonder how many iPhones the government owns? Apple should take the position that it is protecting government data and encrypt everything. Then, when the government comes along and claims Apple isn’t helping, Apple can make the truthful claim that it is just protecting the government’s data, and ask why they don’t want that.

This comment has been deemed insightful by the community.
That One Guy (profile) says:

Re: 'If it's good enough for the public, it's good enough for you'

Alternatively, if they want holes in encryption then Apple can apply that to everyone. I’m sure the DOJ would be perfectly okay with using iPhones with known vulnerabilities; I mean, it’s not like the government would ever have a massive breach exposing sensitive information, one that could be made vastly bigger and/or happen more often with something like that.

Anonymous Coward says:

Re: Re: 'If it's good enough for the public, it's good enough for you'

This is why the government wants more teams to attack supposed bad actors rather than having the competence to truly run investigations or do even the simplest things to secure their own data or networks. They view corporate America the same way, which is readily seen by their recent televised "cyber warnings". (Yeah, they have propaganda reasons for those too, but still.) It’s all "look out for an attack in the near future because reasons," rather than "hey morons, implement even some basic security for your servers and networks, and maybe how about for your (pointlessly) internet-connected consumer products as well".

No, they actually want weaker security. It will never truly bite them in the ass, not at the top levels, I’m sure. /headdesk

Anonymous Coward says:

Re: Re:

but let’s add a ‘moron’ recovery mode with the password 12345 (the same as Dark Helmet’s luggage) that allows any user to recover their password.

I mean what could go wrong with a system like this? Oh, you mean people other than the one who lost their password may use this to access others information? Bah, that’s fake news, that would never happen.
