New Report Says Apple Dropped Plans To Fully Encrypt Backups After FBI Complained

from the encrypt-all-the-things dept

As Attorney General William Barr and other law enforcement officials continue to insist (falsely) that Apple refuses to cooperate with them in undermining encryption and security on all iPhones, plenty of people have pointed out for years that most iPhone encryption is effectively meaningless: if a user has iCloud backups turned on, Apple retains the key to that data and can (and does!) open it up for legitimate law enforcement requests. In other words, it's extremely rare that full device encryption actually keeps law enforcement out (and that leaves aside the fact that technological solutions exist for law enforcement to hack into most iPhones anyway). Indeed, as you might recall, during the FBI's last big fight with Apple over encryption, concerning San Bernardino shooter Syed Farook's iPhone, it was revealed that the FBI's own incompetence resulted in Farook's backups being wiped out before the FBI had a chance to access them.

For quite some time now, EFF and others have urged Apple to close this loophole and allow for truly encrypted iCloud backups, such that even Apple can't get in. Apple has toyed with the idea, but as Tim Cook has said a few times, the company chose not to go that route after weighing the pros and cons from a user's perspective. The key issue: if backups are fully encrypted and Apple doesn't hold the key, then a user who loses their password has effectively lost their data. There is no "password reset" if Apple doesn't retain the key:

There our users have a key and we have one. We do this because some users lose or forget their key and then expect help from us to get their data back.

However, in that same interview, Cook did suggest that Apple would move towards encrypting backups as well:

It is difficult to estimate when we will change this practice. But I think that will be regulated in the future as with the devices. So we will not have a key for it in the future.

I think that there are legitimate user-centric reasons for the decision that Apple made, though it seems clear that many, many people don't realize that Apple still has the key to their backups. However, a new report from Reuters says that Apple killed plans to offer fully encrypted backups after the FBI got upset about it:

Apple Inc dropped plans to let iPhone users fully encrypt backups of their devices in the company’s iCloud service after the FBI complained that the move would harm investigations, six sources familiar with the matter told Reuters.

The tech giant’s reversal, about two years ago, has not previously been reported. It shows how much Apple has been willing to help U.S. law enforcement and intelligence agencies, despite taking a harder line in high-profile legal disputes with the government and casting itself as a defender of its customers’ information.

At the very least, this shows (yet again) that Barr and other law enforcement officials are blatantly lying when they say that Apple does not cooperate with law enforcement or that it doesn't take the concerns they raise seriously. On the flip side, it is a bad look for Apple, in that it has chosen to avoid a more secure option for its users' data, going against the company's long-standing public support for encryption and protecting users' data.

Again, even if there is a legitimate reason for not encrypting backups -- and it's equally true that if Apple did offer it, there would be public complaints from people who could no longer access their data -- it's troubling that Apple won't even make this an option (with clear warning statements) for end users, and that it's doing so because of blatant fearmongering by law enforcement officials.

Of course, the other way one might look at this decision is that if Apple had gone forward with fully encrypting backups, then the DOJ, FBI and other law enforcement would have gone even more ballistic in demanding a regulatory approach that blocks pretty much all real encryption. If you buy that argument, then failing to encrypt backups is a bit of appeasement. Of course, with Barr's recent attacks on device encryption, it seems reasonable to argue that this "compromise" isn't enough (and, frankly, probably would never be enough) for authoritarian law enforcement folks like Barr, and thus, it's silly for Apple to even bother to try to appease them in such a manner.

Indeed, all of this seems like an argument for why Apple should actually cooperate less with law enforcement, rather than more, as the administration keeps asking. Because even when Apple tries to work with law enforcement, it gets attacked as if it has done nothing. It seems like the only reasonable move at this point is to argue that the DOJ is a hostile actor, and Apple should act accordingly.

Filed Under: backups, doj, encryption, fbi, going dark, icloud, pressure
Companies: apple


Reader Comments



  • Anonymous Coward, 21 Jan 2020 @ 12:19pm

    It's not difficult to make your own encryption technology if you want it. I have major problems with cyber attack and terrorism issues and have found that the encryption isn't really useful anyway.


    • Anonymous Coward, 21 Jan 2020 @ 12:29pm

      Re:

      It's not difficult to make your own encryption technology

But it is very difficult to make a strong encryption technology. Almost all roll-your-own encryption schemes have a weakness that the implementer is blind to, which makes them easy to break.

      Also, the iPhone is locked down, and not easy to program without becoming an Apple developer.
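      The blind-spot problem is easy to demonstrate. Below is a hypothetical sketch in Python: a homemade repeating-key XOR "cipher" that superficially scrambles data, yet a single known-plaintext guess (here, an invented file header) leaks the key outright. The key, message, and header are all made-up examples:

```python
# A naive "roll your own" scheme: repeating-key XOR.
# It looks encrypted, but it is trivially breakable.
def xor_cipher(data: bytes, key: bytes) -> bytes:
    # XOR is its own inverse, so this both encrypts and decrypts.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

secret_key = b"hunter2!"                            # example key
message = b"BACKUP-V1: contacts, photos, messages"  # example plaintext
ciphertext = xor_cipher(message, secret_key)

# An attacker who merely guesses the file format's fixed header
# can XOR it against the ciphertext and recover the key directly.
known_header = b"BACKUP-V1"
leaked = bytes(c ^ p for c, p in zip(ciphertext, known_header))
print(leaked)  # the key bytes fall right out: b'hunter2!h'
```

      Vetted designs are vetted precisely to rule out this whole class of failure (among many others), which is why "not difficult to make" and "not difficult to make strong" are worlds apart.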


    • That One Guy (profile), 21 Jan 2020 @ 12:47pm

      Re:

      That is an almost impressive amount of wrong in just two sentences.

Yes, it could be 'easy' to make your own encryption, but making secure encryption is another thing entirely; even the stuff pored over and/or created by experts with massive funding can and does have exploitable holes.

      As for encryption not being really useful, that's not just laughably wrong, it's dangerously wrong. Even with encryption and other security there are occasional massive security breaches and leaks of personal data; take away encryption and that would be happening on a daily basis, if not more often, leaving things like medical and financial data there for the taking for anyone who wanted it.


      • Tin-Foil-Hat, 21 Jan 2020 @ 3:53pm

        Re: Re:

        I think it's hard for the average person to use encryption outside of the product provided by the phone manufacturer.

You don't necessarily need to reinvent the wheel. If you wanted encrypted backups, you would be better off using a reputable existing product: shut off automatic backups, encrypt your data, and save it wherever you like, such as to your PC or the cloud.

        It would be easy for someone who was a computer power user but not so easy for anyone else.

        Could you do that with an iPhone as easily as an Android? I don't think so.
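        For the "reputable existing product" route, here is a minimal sketch of that manual workflow, assuming the third-party Python `cryptography` package (its Fernet recipe) plus the standard library for key derivation; the passphrase and backup contents are made-up examples. The point is that the key is derived locally from a passphrase, so whoever merely holds the backup file cannot read it:

```python
import base64
import hashlib
import os

from cryptography.fernet import Fernet  # vetted third-party library


def key_from_passphrase(passphrase: str, salt: bytes) -> bytes:
    """Derive a 32-byte Fernet key from a passphrase via PBKDF2 (stdlib)."""
    raw = hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 600_000)
    return base64.urlsafe_b64encode(raw)


salt = os.urandom(16)  # stored alongside the backup; not secret
key = key_from_passphrase("correct horse battery staple", salt)  # example

backup = b"contacts.db photos/ messages.db"  # stand-in for exported data
token = Fernet(key).encrypt(backup)          # safe to park on any PC/cloud

# Only someone who knows the passphrase (and has the salt) can decrypt:
restored = Fernet(key).decrypt(token)
assert restored == backup
```

        Shut off the automatic cloud backup, run something like this over the exported data, and upload the encrypted token wherever you like. As the comment says, though, that's power-user territory, not something most people will do.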


        • Anonymous Coward, 22 Jan 2020 @ 8:08am

          Re: Re: Re:

          I think it's hard for the average person to use encryption outside of the product provided by the phone manufacturer.

Then you are clearly mistaken. Creating secure encryption products is hard, but using secure encryption products isn't, nor should it be*. Anyone trying to make using secure encryption harder or impossible should be viewed with extreme suspicion.

As I've said before, Apple could lock down iOS devices to reject any encryption product that wasn't approved by the powers that be, but I've yet to see it. There are apps on Apple's app store that provide encryption.

*: Note that for encryption to be useful, the user must understand what they are attempting to accomplish. Over-simplification is no good here, because it opens the user up to multiple attacks (supply chain, backdoors, key escrow, the list goes on). Ultimately, the encryption product you install and configure yourself will be better than what a device manufacturer provides, as long as you understand enough of it not to screw yourself over. A.K.A. don't drive a loaded-down semi-truck on the interstate before you know how to drive a bumper car at an amusement park.


          • Anonymous Coward, 22 Jan 2020 @ 11:53am

            Re: Re: Re: Re:

            There are apps on Apple's app store that provide encryption.

So long as Apple approves of them. That is, the app store is controlled by Apple, who will remove anything that competes with one of their products, or goes against Apple's changing policies.


            • bhull242 (profile), 26 Jan 2020 @ 1:45pm

              Re: Re: Re: Re: Re:

              [Apple] will remove anything that competes with one of their products

              [citation needed]

I mean, they absolutely could (as noted by the other AC), but I don’t see evidence that they do. They allow others’ GPS maps, others’ music players/streamers, others’ word processors, others’ email apps, others’ web browsers, others’ spreadsheet apps, others’ music recorders, others’ cloud-based services, others’ calculators, others’ note-taking apps, others’ clock/timer/alarm apps, others’ calendars, others’ gaming platforms (Google Stadia), others’ slideshow apps, others’ document managers, others’ ad services, etc. I think they just don’t allow others’ operating systems and app stores.


    • Anonymous Coward, 21 Jan 2020 @ 1:44pm

      Re:

      "the encryption isn't really useful anyway"

Nice try, Bill.


    • Scary Devil Monastery (profile), 22 Jan 2020 @ 5:01am

      Re:

      "It's not difficult to make your own encryption technology if you want it."

Well, no, but the issue comes in implementation: when a third party a bit down the line sends or keeps the passkey in unencrypted form, or transmits it in a way that allows a skilled interceptor to guesstimate half of the certificate.


  • Professor Ronny, 21 Jan 2020 @ 12:37pm

    Backups

    If you buy that argument, then failing to encrypt backups is a bit of appeasement.

    Real terrorists don't back up. Problem solved.



  • That One Guy (profile), 21 Jan 2020 @ 12:42pm

    Damned if you do, damned if you don't

    Indeed, all of this seems like an argument for why Apple should actually cooperate less with law enforcement, rather than more, as the administration keeps asking. Because even when Apple tries to work with law enforcement, it gets attacked as if it has done nothing. It seems like the only reasonable move at this point is to argue that the DOJ is a hostile actor, and Apple should act accordingly.

It's actually worse than that. Not only are they attacked by dishonest individuals claiming that they aren't doing anything, but both currently and in the past Apple has had its willingness to work with law enforcement used against it, with the argument being that if Apple is willing to do A then they should have no problem doing B, even if it is significantly different both in scope and difficulty.

    If Apple offers some help for one thing then they are attacked for refusing to offer help on another thing.

    If they don't offer all the help demanded then it is claimed that they aren't willing to help at all.

If they're going to be attacked no matter what they do, then, as the article notes, they should probably consider no longer giving their opposition (and at this point, given how they're treated, that seems an appropriate label for law enforcement) ammo to be used against them. And if law enforcement tries to attack them for that, they can simply point out that it's already been claimed that they aren't doing anything; they're simply making it so that those claiming such are no longer lying.


    • That Other Guy, 21 Jan 2020 @ 8:50pm

      Re: Damned if you do, damned if you don't

Very well said. I would stress that the DOJ making this bad-faith argument is a bullying tactic that really should raise some red flags about where this is headed. It seems clear that no matter how much help Apple may offer, as long as it tries to maintain some balance between the privacy interests of its customers and the legitimate investigatory needs of law enforcement, it's going to get slammed, because as far as law enforcement is concerned there's no balancing to be done. There is no right to privacy; they need all your data. What's not to understand?


    • Anonymous Coward, 21 Jan 2020 @ 11:22pm

      Re: Damned if you do, damned if you don't

Heck, I just realized that the FBI is like the bad user everyone hates to support. They demand to be able to do things in a specific way according to their own notion of how it should be done (fantasy, mostly), and when the application or security procedures block that, they message the manager complaining about how bad the support was.


    • Scary Devil Monastery (profile), 22 Jan 2020 @ 5:05am

      Re: Damned if you do, damned if you don't

      "...if Apple is willing to do A then they should have no problem doing B, even if it is significantly different both in scope and difficulty."

Where did I hear that argument before? Ah, right, it's what MPAA and RIAA enforcers keep using in their attacks against ISPs.

      You'd think that the DoJ using the same tactics copyright trolls do ought to ring a few alarm bells. This sort of crap should have made waves. And yet everyone appears to accept it as "business as usual" for US law enforcement...


  • Anonymous Anonymous Coward (profile), 21 Jan 2020 @ 12:58pm

But we're doing this for YOU!

I wonder how many iPhones the government owns? Apple should take the position that they are protecting government data and encrypt everything. Then, when the government comes along and claims Apple isn't helping, they can make the truthful claim that they are just protecting the government's data, and ask: how come they don't want that?


    • That One Guy (profile), 21 Jan 2020 @ 1:23pm

      'If it's good enough for the public, it's good enough for you'

Alternatively, if they want holes in encryption, then Apple can apply that to everyone. I'm sure the DOJ would be perfectly okay with using iPhones with known vulnerabilities; I mean, it's not like the government would ever suffer a massive breach exposing sensitive information, the kind that holes like that would make vastly bigger and/or more frequent.


      • Anonymous Coward, 21 Jan 2020 @ 1:52pm

Re: 'If it's good enough for the public, it's good enough for you'

        This is why the government wants more teams to attack supposed bad actors rather than having the competence to truly run investigations or do even the simplest things to secure their own data or networks. They view corporate America the same way, which is readily seen by their recent televised "cyber warnings". (Yeah, they have propaganda reasons for those too, but still.) It's all "look out for an attack in the near future because reasons," rather than "hey morons, implement even some basic security for your servers and networks, and maybe how about for your (pointlessly) internet-connected consumer products as well".

No, they actually want weaker security. It will never truly bite them in the ass, not at the top levels, I'm sure. /headdesk


    • Anonymous Coward, 21 Jan 2020 @ 2:27pm

      Re: But were doing this for YOU!

I wonder how many government backups have been read by foreign government agencies? With both keys and data in the same system, one mole, or one long-running break-in, and the treasure trove can be opened.


  • bobob, 21 Jan 2020 @ 1:44pm

    Losing your data is the price you pay for losing the key unless you want someone else to be able to access your data without your consent. I'd rather take my chances on losing my key, just based on principle.


    • Anonymous Coward, 21 Jan 2020 @ 3:06pm

      Re:

But let's add a 'moron' recovery mode with the password 12345 (the same as Dark Helmet's luggage) that allows any user to recover their password.

      I mean, what could go wrong with a system like this? Oh, you mean people other than the one who lost their password may use it to access others' information? Bah, that's fake news; that would never happen.


    • christenson, 22 Jan 2020 @ 11:04am

      Re:

      I'd rather have the CHOICE here....

      Do I trust Apple with all my keys?? YMMV, and that's the point!


  • Norahc (profile), 21 Jan 2020 @ 6:07pm

    The FBI's real going dark problem is that they keep turning off the damn phones before contacting Apple for help.


    • Anonymous Coward, 21 Jan 2020 @ 7:58pm

      Re:

      They only look for evidence where the flashlight is pointed, and nobody is allowed to move it, on pain of excommunication.


  • Christenson, 22 Jan 2020 @ 6:24am

    Backdoor

    Here's a "magic backdoor encryption key" in action.

    How long before it is abused, Mr Cook?? How long before law enforcement abuses it??

    Oh, and my iPhone is refusing to backup lately... too many pictures and I'm not paying for extra space!


    • Anonymous Coward, 22 Jan 2020 @ 7:01pm

      Re: Backdoor

      How long before it is abused, Mr Cook?? How long before law enforcement abuses it??

      Five minutes after it was available. Next question?


