Apple Responds To Order To Help Decrypt Phone, As More Details Come To Light

from the both-not-as-bad-and-just-as-bad dept

Update: Please see our more recent article detailing why it appears that this attack could apply to more modern iPhones as well.

Last night, we wrote about a judge's order commanding Apple to help the FBI effectively decrypt the contents of Syed Farook's iPhone 5C. Farook, of course, along with his wife, was responsible for the San Bernardino attacks a few months ago. Many of the initial reports about the order suggested that it simply ordered Apple to break the encryption -- which made many people scoff. However, as we noted, that was not accurate. Instead, it was ordering something much more specific: that Apple create a special firmware that would disable two distinct security features within iOS -- one that would effectively wipe the encrypted contents after 10 failed attempts at entering the unlocking PIN (by throwing away the stored decryption key), and a second one that would progressively increase the delay between repeated attempts at entering the PIN.
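Taken together, those two features amount to logic along these lines (a minimal Python sketch; the delay schedule, constants, and function names here are illustrative, not Apple's actual implementation):

```python
import time

MAX_ATTEMPTS = 10

# Escalating delays (in seconds) after repeated failures. This schedule
# is made up for illustration; iOS uses its own, hardware-enforced one.
DELAY_SCHEDULE = {5: 60, 6: 300, 7: 900, 8: 3600}

def check_passcode(entered, stored, failed_count):
    """Illustrative retry logic: the delay escalates with each failure,
    and the stored decryption key is discarded (the data is effectively
    wiped) once MAX_ATTEMPTS failures accumulate."""
    if entered == stored:
        return "unlocked", 0          # success resets the counter
    failed_count += 1
    if failed_count >= MAX_ATTEMPTS:
        return "wiped", failed_count  # key thrown away; data unrecoverable
    time.sleep(DELAY_SCHEDULE.get(failed_count, 0))  # slow brute-forcing
    return "locked", failed_count
```

The firmware the order demands would remove both the `wiped` branch and the `time.sleep` call, leaving nothing but a bare passcode check that can be hammered at machine speed.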

Late last night, Apple's Tim Cook also posted a very direct Message to Our Customers that highlights the importance of strong encryption and why this order is such a problem (some of which we'll discuss below).
Compromising the security of our personal information can ultimately put our personal safety at risk. That is why encryption has become so important to all of us.

For many years, we have used encryption to protect our customers’ personal data because we believe it’s the only way to keep their information safe. We have even put that data out of our own reach, because we believe the contents of your iPhone are none of our business.
He notes -- as I did in my original post -- that the FBI is demanding (and the court has agreed) that Apple be forced to create a backdoor, and that this creates a number of concerns:
Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.

The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.

Some would argue that building a backdoor for just one iPhone is a simple, clean-cut solution. But it ignores both the basics of digital security and the significance of what the government is demanding in this case.

In today’s digital world, the “key” to an encrypted system is a piece of information that unlocks the data, and it is only as secure as the protections around it. Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge.

The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.
Having spent a bunch of time overnight reading through the details of the DOJ's original application, as well as a few public discussions by security experts -- especially Robert Graham's useful thoughts about the order -- I have some further thoughts. Among them: I think Cook may be slightly overstating his case, because it's not actually clear that the backdoor the FBI is asking for would actually work on most modern iPhones, though it might work on older phones. However, the larger concerns about the order are still very much valid.
  1. One of the concerns I raised last night was probably inaccurate: that this could force Apple into creating a tool that would put many people's privacy at risk. The order does seem fairly specific to just this phone. Yes, as Cook notes, if Apple is forced to do this and Apple does it successfully, that would open up similar orders for other phones, but the impact of that may be somewhat limited in that it only applies to older phones, and quite possibly older iPhones that have not updated their operating systems.
  2. It does seem clear that if this were a newer iPhone, one that includes Apple's "Secure Enclave" system, this request would likely be impossible to meet. It's quite interesting to read the details of how Apple's security now works: the Secure Enclave basically cuts off this possibility, because the firmware update itself would wipe out the encryption key, effectively making it impossible to decrypt the contents. It's also possible that this order is impossible even on this older phone, if the operating system was properly updated -- in part because Apple may not be able to update the firmware without entering the passcode, which is the very problem the new firmware is supposed to solve.
  3. I keep seeing people say "why can't they just copy the contents of the memory and brute force it elsewhere" but that's not possible with the iPhone, since a part of the key comes from the hardware itself, and there doesn't appear to be any way to extract it (and Apple does not keep it).
  4. The whole focus seems to be on allowing the FBI to bruteforce the passcode, which is in the realm of possibility should the two impediments above be removed, as opposed to bruteforce cracking AES encryption, which is not currently in the realm of possibility.
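The hardware-entanglement issue in point 3 can be sketched like so. This is purely illustrative: PBKDF2 stands in for Apple's actual key-derivation scheme, and all names and parameters here are assumptions, not Apple's implementation. The point it demonstrates is real, though: the key depends on both the passcode and a per-device secret, so guessing has to happen on the device itself.

```python
import hashlib, os

def derive_key(passcode: str, hardware_uid: bytes) -> bytes:
    """Illustrative key derivation: the decryption key is a function of
    BOTH the user's passcode and a secret fused into the device hardware.
    PBKDF2 here is a stand-in for Apple's real KDF."""
    # Using the hardware UID as the salt entangles the key with the device.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(),
                               hardware_uid, 100_000)

uid = os.urandom(32)   # per-device secret; not extractable on real hardware
key = derive_key("1234", uid)
# An attacker who copies the encrypted flash off the phone but lacks uid
# cannot recompute key offline, no matter how much computing power they have.
```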
Graham disagrees with me over whether this is about decrypting or a backdoor -- but to some extent that's just semantics (and Cook agrees with me that it's a backdoor). They're not asking for a systemwide backdoor -- and, indeed, it appears this approach wouldn't work at all with more recent iPhones. However, the reason I focused on a backdoor, rather than direct decryption, is that the way most people were discussing "decryption" indicated that they seemed to think the court was ordering the impossible, which was "crack the keys you don't have access to." Instead, they are asking for a backdoor -- just a narrow one that can only be used for this phone and would be ineffective against most modern iPhones. And then that backdoor will be used to brute force the passcode, which would then decrypt the content of Farook's iPhone.
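Back-of-the-envelope arithmetic shows why removing those two features makes brute-forcing the passcode feasible while brute-forcing the AES key itself remains hopeless. Apple's iOS security documentation has said the key-derivation step is calibrated to take roughly 80 milliseconds per attempt in hardware; treat that figure as approximate.

```python
# Rough arithmetic: passcode brute force vs. brute-forcing AES directly.
PER_GUESS_SECONDS = 0.08  # ~80 ms per attempt, per Apple's iOS security docs

four_digit_minutes = 10**4 * PER_GUESS_SECONDS / 60
six_digit_hours = 10**6 * PER_GUESS_SECONDS / 3600
print(f"4-digit passcode: ~{four_digit_minutes:.0f} minutes worst case")
print(f"6-digit passcode: ~{six_digit_hours:.0f} hours worst case")

# By contrast, a 256-bit AES key has 2**256 possibilities. Even at a
# wildly optimistic 10**18 guesses per second, exhausting the keyspace
# takes astronomically longer than the age of the universe.
aes_years = 2**256 / 1e18 / (3600 * 24 * 365)
print(f"AES-256 at 10**18 guesses/sec: ~{aes_years:.1e} years")
```

In other words: a 4-digit PIN falls in minutes and a 6-digit one in about a day once the wipe and delay features are gone, which is exactly why the FBI wants them gone.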

That said, there are still serious concerns. While the DOJ insists that its use of the 18th-century All Writs Act in this case is pretty ordinary and standard, it's exaggerating in the extreme. Some of the previous examples it discusses do include requirements to use certain machinery in order to execute a warrant, but that's quite different from ordering Apple to write entirely new software. The DOJ again insists that there are examples of All Writs Act requests in the past that have required software, but it's notable that when the DOJ says this, it does not immediately cite any cases, but rather says just that sometimes providers have to "write code in order to gather information in response to subpoenas or other process." There's a pretty big difference between writing some scraping or search code, which this implies, and creating a special firmware as the DOJ is asking for in this case.

Also, as Cook notes, the unprecedented nature of this is that it's not at all similar to previous cases, because this would involve actively undermining the security of devices, rather than just helping to gather information that is readily available.

In fact, even more ridiculous is the idea, as laid out in the DOJ's application for this order, that this will not be burdensome to Apple simply because Apple writes software:
While the order in this case requires Apple to provide modified software, modifying an operating system - writing software code - is not an unreasonable burden for a company that writes software code as part of its regular business.
Uh, yeah, it can be when what you're asking for is fairly complex and may not even be possible depending on the specifics of the way the security in the iPhone 5C is designed. And, seriously, just saying "Apple writes software, therefore any request for it to write software is not burdensome" is ridiculous on its face.

There's a separate question, raised by people such as Chris Soghoian, about whether or not this particular use of the All Writs Act to force Apple to use its code signing keys to "sign" this new firmware violates the First Amendment in compelling speech. It will be interesting to see if Apple raises this issue in its inevitable appeal of the order.

In the end, this is both a big deal and potentially not a big deal. It's a big deal in that after a few previous attempts to use the All Writs Act to force Apple to "decrypt" content on a phone, a court has not only done so, but done so with fairly specific instructions on what Apple has to do to create a very specific bit of software that removes a couple of security features. That raises a bunch of legal questions, which I would imagine Apple will quickly press in response as well. However, from a technological standpoint, it appears that many of these questions will soon be moot, so long as people have more modern and updated phones. But the bigger concern, as Cook notes, is the precedent here: that a court can order, at the behest of the FBI, that a tech company undermine the security of a device. As he notes, once you start down that slippery slope it's not hard to see where it can lead:
The implications of the government’s demands are chilling. If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.
This particular legal battle is going to get very, very interesting.

Filed Under: all writs act, backdoors, decryption, doj, encryption, fbi, going dark, iphone, syed farook, tim cook
Companies: apple

Reader Comments

    Anonymous Coward, 17 Feb 2016 @ 4:45pm


    See, if the truth is that Apple cannot do it, they could simply comply with the court order by attempting to do it and proving that they cannot. It would end at some point (say a year or two from now) when Apple shows there is no other choice but years of machine time spent trying to brute force it.

    Why would Apple spend a year trying to do something they know they can't do, rather than tell the court now that it can't be done? That would be insane. And the order itself tells Apple that it should tell the court within 5 days if compliance is not reasonable. Why shouldn't they just do that?

    I think part of the issue here is that Apple doesn't want to admit it can be done. If they follow the court's order and do in fact unlock the phone, they will have blown away more than half a decade of hype on the topic.

    Even if Apple did secretly have the key or a way to regenerate it, this order doesn't even ask for that. The order does not direct them to attempt to unlock the phone or provide the key; it orders them to give the government a program to do specific things to make it easier for the government to brute-force the passcode.

    "Apple's reasonable technical assistance shall accomplish the following three important functions: (1) it will bypass or disable the auto-erase function whether or not it has been enabled; (2) it will enable the FBI to submit passcodes to the SUBJECT DEVICE for testing electronically via the physical device port, Bluetooth, Wi-Fi, or other protocol available on the SUBJECT DEVICE; and (3) it will ensure that when the FBI submits passcodes to the SUBJECT DEVICE, software running on the device will not purposefully introduce any additional delay between passcode attempts beyond what is incurred by Apple hardware."
