No, A Judge Did Not Just Order Apple To Break Encryption On San Bernardino Shooter's iPhone, But To Create A New Backdoor

from the slightly-different... dept

So… have you heard the story about how a magistrate judge in California has ordered Apple to help the FBI disable encryption on the iPhone of one of the San Bernardino shooters? You may have because it’s showing up everywhere. Here’s NBC News reporting on it:

A federal judge on Tuesday ordered Apple to give investigators access to encrypted data on the iPhone used by one of the San Bernardino shooters, assistance the computer giant “declined to provide voluntarily,” according to court papers.

In a 40-page filing, the U.S. Attorney’s Office in Los Angeles argued that it needed Apple to help it find the password and access “relevant, critical … data” on the locked cellphone of Syed Farook, who with his wife Tashfeen Malik murdered 14 people in San Bernardino, California on December 2.

And you’d be forgiven for believing that the court has now ordered Apple to do the impossible. After all, for well over a year, the DOJ has been arguing that the All Writs Act of 1789 can be used to force Apple to help unlock encrypted phones. And that’s an argument it has continued to make in multiple cases.

Many people are now mocking this ruling, pointing out that with end-to-end encryption it’s actually impossible for Apple to do very much to help the FBI, which makes the order seem ridiculous. But that’s because much of the reporting on this story appears to be wrong. Ellen Nakashima, at the Washington Post, has a more detailed report that notes that Apple is actually required to do something a little different:

The order does not ask Apple to break the phone’s encryption, but rather to disable the feature that wipes the data on the phone after 10 incorrect tries at entering a password. That way, the government can try to crack the password using “brute force” — attempting tens of millions of combinations without risking the deletion of the data.

The order, signed by a magistrate judge in Los Angeles, comes a week after FBI Director James B. Comey told Congress that the bureau has not been able to open one of the killers’ phones. “It has been two months now, and we are still working on it,” he said.

In other words, the order does not tell Apple to crack the encryption when Apple does not have the key. Rather, it is asking Apple to turn off a specific feature so that the FBI can try to brute force the key — and we can still argue over whether or not it’s appropriate to force Apple to disable a key feature that is designed to protect someone’s privacy. It also raises questions about whether or not Apple can just turn off that feature or if it will have to do development work to obey the court’s order. In fact, the same report notes that there is no way for Apple to actually do this:

According to industry officials, Apple cannot unilaterally dismantle or override the 10-tries-and-wipe feature. Only the user or person who controls the phone’s settings can do so. The company could theoretically write new software to bypass the feature, but likely would see that as a “backdoor” or a weakening of device security and would resist it, said the officials, who spoke on the condition of anonymity to discuss a sensitive matter.

So you could argue that this is effectively the same thing as asking Apple to break the encryption, since it (apparently) has no direct access to turning off that feature. However, the specifics do matter — and most of the kneejerk responses to the order (and the reporting on it) are suggesting something very different than what the court order seems to say.

I think it’s still perfectly reasonable to argue that this order is highly problematic and not legally sound. However, it is quite different from what most are claiming. It also seems like something that could be quite dangerous: Apple is being pressured to write code that undermines an important security feature, and will probably have little time to debug or test it thoroughly, meaning that the feature it is being ordered to build will almost certainly put more users at risk.
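The danger of disabling the auto-erase and delay features is easy to quantify: a short numeric passcode falls to exhaustive search almost immediately. A minimal sketch of that attack, where `try_passcode` is a hypothetical callback standing in for whatever interface would submit guesses to the device:

```python
from itertools import product

def brute_force_pin(try_passcode, length=4):
    """Exhaustively try every numeric passcode of the given length.

    `try_passcode` is a hypothetical callback returning True on success.
    With auto-erase and inter-attempt delays disabled, nothing stops
    this loop from running through the entire space.
    """
    for digits in product("0123456789", repeat=length):
        pin = "".join(digits)
        if try_passcode(pin):
            return pin
    return None

# A 4-digit PIN has only 10,000 possibilities; even a 6-digit PIN has
# just one million -- trivial at any realistic guessing speed.
```

This is why the 10-tries-and-wipe feature, not the encryption math, is the real barrier the order targets.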

Update: Okay, we’ve got the full order and it is, indeed, troubling. Here’s the key part:

Apple’s reasonable technical assistance shall accomplish the following three important functions: (1) it will bypass or disable the auto-erase function whether or not it has been enabled; (2) it will enable the FBI to submit passcodes to the SUBJECT DEVICE for testing electronically via the physical device port, Bluetooth, Wi-Fi, or other protocol available on the SUBJECT DEVICE and (3) it will ensure that when the FBI submits passcodes to the SUBJECT DEVICE, software running on the device will not purposefully introduce any additional delay between passcode attempts beyond what is incurred by Apple hardware.

Apple’s reasonable technical assistance may include, but is not limited to: providing the FBI with a signed iPhone Software file, recovery bundle, or other Software Image File (“SIF”) that can be loaded onto the SUBJECT DEVICE. The SIF will load and run from Random Access Memory and will not modify the iOS on the actual phone, the user data partition or system partition on the device’s flash memory. The SIF will be coded by Apple with a unique identifier of the phone so that the SIF would only load and execute on the SUBJECT DEVICE. The SIF will be loaded via Device Firmware Upgrade (“DFU”) mode, recovery mode, or other applicable mode available to the FBI. Once active on the SUBJECT DEVICE, the SIF will accomplish the three functions specified in paragraph 2. The SIF will be loaded on the SUBJECT DEVICE at either a government facility, or alternatively, at an Apple facility; if the latter, Apple shall provide the government with remote access to the SUBJECT DEVICE through a computer allowing the government to conduct passcode recovery analysis.

If Apple determines that it can achieve the three functions stated above in paragraph 2, as well as the functionality set forth in paragraph 3, using an alternate technological means from that recommended by the government, and the government concurs, Apple may comply with this Order in that way.

The order also sets out that:

To the extent that Apple believes that compliance with this Order would be unreasonably burdensome, it may make an application to this Court for relief within five business days of receipt of the Order.

I would imagine that Apple will be taking the court up on that…



Comments on “No, A Judge Did Not Just Order Apple To Break Encryption On San Bernardino Shooter's iPhone, But To Create A New Backdoor”

341 Comments
TechDescartes (profile) says:

TL;DR

So… have you heard the story about how a magistrate judge in California has ordered Apple to help the FBI disable encryption on the iPhone of one of the San Bernardino shooters?

A magistrate judge, an Apple employee, and an FBI agent agree to meet at a local bar. Only the Apple employee makes it. Why? Because the bar didn’t have a back door.

Troy Hoyt (profile) says:

Re: TL;DR

Too long; didn’t read?
…it must be nice to be so damned important that you don’t have time to read an article about something as important, and as troubling as this.

Are you actually that important, or are you actually just lazy?
You’re going to have to be a bit less lethargic if you hope to make a go of it in the world of stand-up… Especially with jokes like that one.

Anonymous Coward says:

Re: Re: TL;DR

The author didn’t bother to read the court order. Why read his story when the comments are so much better?

Paragraph 4 (page 3, line 3) says Apple can retrieve the data in any way it wants. The FBI only needs the data, so it will concur with any technology Apple wishes to use. The court goes on to say Apple doesn’t have to give the FBI any of the technology used to get the data.

I am wondering why Tim Cook threw such a hissy-fit in a blog.

GMacGuffin (profile) says:

or maybe it's more calculated than it appears...

So, we have Judge Pym stuck between the US Attorneys and defense counsel arguing about what Apple can/cannot do, driving her mad. She’s gotta do something to move this forward, or at least to shut counsel up, and the best way to find out what Apple can or cannot do is to ask Apple.

And she may well even understand the implications of asking Apple to undermine its own encryption. But the best way to get that in the record is to give Apple a chance to fully explain why it is a bad idea, or impossible. Notice and opportunity to be heard.

And Apple is not likely to say “Yeah, we can write this backdoor brute-force buddy software” because that would mean that someone else could write that software, which would mean that Apple’s encryption now has a known point of potential compromise. So Apple will say it can’t write that software. And then the US Attys will hopefully shut up about it already.

(It’s not easy to become a federal magistrate …)

elemecca (profile) says:

Re: or maybe it's more calculated than it appears...

And Apple is not likely to say “Yeah, we can write this backdoor brute-force buddy software” because that would mean that someone else could write that software, which would mean that Apple’s encryption now has a known point of potential compromise. So Apple will say it can’t write that software. And then the US Attys will hopefully shut up about it already.

This isn’t entirely true: as noted in the order, any OS-level software to be run on an iPhone needs to be signed by a cryptographic key held only by Apple, unless it exploits a vulnerability in the phone’s existing software to install itself (i.e. jailbreaking). It is therefore much easier for Apple to provide this kind of modified software than for a third party. The signature requirement also means that if, as requested in the order, Apple makes the putative custom OS image check the device ID of its host to ensure that it’s running on the target device, that check will have teeth: if it’s edited out, the signature will no longer be valid.

Also, the modified software wouldn’t actually weaken the disk encryption scheme itself. It would make it easier to attack weaknesses in the user’s choice of key on this particular device, but if the user chose a decent password a brute-force search would still take prohibitively long.

Of course, that doesn’t really change the likelihood of Apple complying with this order without a fight. It just affects your reasoning as to their motivations.
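The "check has teeth" argument can be illustrated with a toy sketch. Apple's real signing uses asymmetric cryptography on whole firmware images; here an HMAC over the image plus an embedded device ID stands in for it, purely to show why stripping the device-ID check without the signing key just produces an image the phone will reject. All names are illustrative:

```python
import hashlib
import hmac

# Stand-in for Apple's private signing key, which only Apple holds.
SIGNING_KEY = b"apple-signing-key-stand-in"

def sign_image(image: bytes, device_id: str) -> bytes:
    """Sign a firmware image bound to a single device ID."""
    return hmac.new(SIGNING_KEY, image + device_id.encode(), hashlib.sha256).digest()

def verify_image(image: bytes, device_id: str, signature: bytes) -> bool:
    """A phone accepts the image only if the signature matches its own ID."""
    expected = hmac.new(SIGNING_KEY, image + device_id.encode(), hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

image = b"custom-unlock-sif"
sig = sign_image(image, device_id="TARGET-PHONE")
# Valid only on the target device: changing the embedded device ID, or
# the image itself, invalidates the signature unless you hold the key.
```

The design point: the binding to one device is enforced by the signature, not by the check alone, which is why the order's "unique identifier of the phone" requirement is meaningful.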

Danny says:

Re: Re: or maybe it's more calculated than it appears...

The timer that imposes the pause between guess attempts is implemented in hardware, not software, so no OS patch would affect it.

The key-deletion routine is also implemented in hardware, so it may not be bypassable either.

The other difficulty is that the OS would need to be decrypted and loaded before it could be patched.

The users on this board saying the phone can be cloned are mistaken: the key and UID are held in the Secure Enclave in the processor itself, so cloning the storage would not be a way to get unlimited guesses, as the key would not be present to test against.

This is technically impossible for Apple to achieve.
The courts just dont like that.

Anonymous Coward says:

Re: Re: Re:2 or maybe it's more calculated than it appears...

A hardware CPU emulator assumes that the CPU and the memory are separate chips, or that all CPU pins are brought out of an integrated chip, including a pin to disconnect the CPU from its signals. This is probably not possible with the chips Apple uses, which is why it is easier to use a software emulator of the system to develop the code. Flashing a chip with debug software is not permissible when gathering evidence, as any change to the firmware renders any information gathered inadmissible.
Note that the order requires that the new software run in RAM and NOT alter any of the ROMs on the system. So it looks like the FBI wants software that can be used for fishing expeditions in the future, or hopes to find evidence to incriminate other people on the phone; otherwise the restriction would not be required, as there is no court case to be brought against the deceased owner of the phone.

Gandydancer (profile) says:

Re: Re: Re: or maybe it's more calculated than it appears...

I don’t believe for a moment that decrypting the data on the phone is impossible for Apple, though the exact method specified as the default may not work (but read the last paragraph of the first quote).

My understanding is that the phone in question is an old model that doesn’t implement the “secure enclave”.

But if the data can be extracted then it’s just stupid to do the decryption in the phone.

Gandydancer (profile) says:

Re: Re: or maybe it's more calculated than it appears...

“…if the user chose a decent password a brute-force search would still take prohibitively long.”

The FBI appears to believe that a brute-force method will work, and I know of no reason to disbelieve them. Most people don’t, after all, use “decent” passwords. And my understanding is that the allowed password isn’t very long. The encryption key is, I understand, derived from a hash with the device ID, but if the latter is known, that doesn’t increase the number of necessary tries.

Jon Connor says:

Re: or maybe it's more calculated than it appears...

Actually it is easier to become a federal magistrate than you think. Typically it’s major campaign donors who are appointed, or lawyers the administration likes and wants to place on the federal bench. I know this because I used to work for a federal magistrate who donated gobs of money until they finally put him on the bench.

KidOmaha (profile) says:

Re: Re: or maybe it's more calculated than it appears...

You clearly do not know a whole lot about the appointment of federal magistrate judges. A federal magistrate judge is hired by the District Court judges in any particular district. They serve as at-will employees of the judges of the court. The “administration” has nothing to do with the selection of magistrate judges. The same is true with respect to campaign donors. Those factors are somewhat true when discussing the appointment of federal district or Circuit court judges (who receive lifetime appointments after Senate confirmation), but that simply is not the case with respect to magistrate judges.

IP Lawyer says:

Re: or maybe it's more calculated than it appears...

No, she doesn’t have to do anything. She’s a federal magistrate judge. She can tell the fed to go pound sand and that would be that.

This order is deeply troubling. Getting any order overturned is very difficult. Make no mistake – this is not an invitation for Apple to defend itself and win on appeal – it is simply a blunt ruling in the favor of the FBI. And five days for a response is an insane timeline.

Your analogy is like saying that a referee making a call that awards points to your opponent minutes before the end of the game is simply an invitation to play harder. It’s bullshit.

GMacGuffin (profile) says:

Re: Re: or maybe it's more calculated than it appears...

I was presenting a somewhat cynical but pragmatic viewpoint. The judge must know it’s a big-ticket issue, and that Apple is in the best position to answer it.

Obviously the order is troubling, and I agree that five business days to oppose is prohibitively short. But then, perhaps it doesn’t take much to show that coding new software for one case is unduly burdensome.

I was going from the basic premise that I have never been before a magistrate who was truly an idiot (which I cannot say about Article III judges), so perhaps there was some calculation behind the otherwise frightening order. (AKA, trying to find reason and order amid the chaos.)

Fred_Flintstone says:

Re: or maybe it's more calculated than it appears...

So, what is the difference between breaking into a phone yourself and turning off a key security feature so that someone else can then brute-force their way in? Same thing.

Apple’s encryption is its own technology and is pivotal to features like Apple Pay. Now, if Apple builds this back door to turn off the security feature, don’t you think hackers will find it pretty quickly and be able to do the same? This would damage Apple’s reputation and its business, so yes, Apple should speak up when the government makes demands on it.

Iverson says:

Re: or maybe it's more calculated than it appears...

The Federal Magistrate was prior to 1968, known as the “Park Commissioner”.

Following the Magistrates Act, a name change occurred, where the Park Commissioner whose duties were for the administrative adjudication of civil issues occurring within the jurisdiction of the “National Park”, now donned a black robe, to act as an administrative officer for the geographical jurisdiction outside the Federal Park to include the arena of the Federal District Court.

No, becoming a Federal Magistrate is not difficult, for it is first a political appointment.

Secondly, the statutory duties have not changed, for the Park Commissioner who now sits as the Magistrate is still an administrative officer whose actions are overseen by a presiding, politically appointed Federal District Court Judge.

When the Federal Magistrate sat as the Park Commissioner, its decisions were then reviewed by the sitting Federal District Court Judge.

The Park Commissioner, who now sits as a “Federal Magistrate”, is required to submit their decisions to be reviewed by the sitting Federal District Court Judge.

One point, is since 1968, the Park Commissioner, now known as a “Federal Magistrate” may be required to be a registered “BAR” Attorney listed on a State Registry compiled by a respective State’s high Court.

KidOmaha (profile) says:

Re: Re: or maybe it's more calculated than it appears...

This is largely incorrect. To begin with, your statement that “becoming a Federal Magistrate is not difficult, for it is first a political appointment” is simply false. Further, the term Federal Magistrate was abolished in the 1970s, and they are now called federal Magistrate Judges. They are hired by the District Judges of a court. In almost all scenarios, the magistrate judges’ decisions are not reviewed by the District Judge. They can exercise both criminal and civil jurisdiction. Federal Magistrate judges are, in my 18+ years’ experience practicing law, extremely well qualified as a general rule. While a lifetime appointment as a District or Circuit Court judge can involve politics, followed by Senate confirmation, magistrate judges are selected and retained on merit. Magistrate judges ARE NOT required to “submit their decisions to be reviewed by the sitting Federal Court District Judge.” Here’s a good article to read to refresh your stale information regarding federal magistrate judges, their authority and their process of appointment. http://www.nced.uscourts.gov/pdfs/Selection-Appointment-Reappointment-of-Magistrate-Judges.pdf

Wastrel (profile) says:

Re: Possible?

Court orders are generally written by counsel (in this case, the attorneys for the government) and argued about before they are signed by the judge. As you can see from the document that is linked, it was originally a “proposed” order. No doubt Apple had a proposed order, too, and the judge decided in favor of the government’s.

Anonymous Coward says:

Apple doesn't follow court orders

https://ia601402.us.archive.org/34/items/gov.uscourts.wieb.358648/gov.uscourts.wieb.358648.240.0.pdf

See the part “However, the court did ORDER that Mr. Rassbach is the owner of the iPad with serial number F5RKXNH1DFHW and that he is entitled to all incidents of such ownership”

The device was called “lost” by the person it was seized from in a Writ of Replevin and it is locked and otherwise unusable.

It is in Apple’s best economic interest to keep devices locked so it can attempt to sell a replacement.

Even More Anonymous Coward says:

Re: Apple doesn't follow court orders

Really? You didn’t find the statement directly before the one you pasted relevant?

“The court noted that Apple was not a party to this case, so the court could not properly order Apple to do anything.”

Whether or not Apple is invested in avoiding this for economic reasons, in your example they were not ordered to do anything except recognize that Rassbach was the owner of the device. From there, it’s up to their own terms and conditions whether or not they have to help him unlock it.

David says:

Re: Send the phone to NSA

That won’t work. The key isn’t stored anywhere on the device. It’s in the owner’s head.

User types in PIN code -> iOS runs 50,000 PBKDF2 key derivation rounds (number of rounds is a guess). This key is then used to decrypt the file system.

Without the PIN, there is nothing to find.
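The derivation David describes can be sketched with Python's standard library. The round count is the comment's own guess, and the salt here stands in for the device-specific material; both are illustrative, not Apple's actual parameters:

```python
import hashlib

def derive_key(pin: str, device_salt: bytes, rounds: int = 50_000) -> bytes:
    """Stretch a short PIN into a 256-bit key with PBKDF2-HMAC-SHA256.

    The round count mirrors the comment's guess; `device_salt` stands in
    for device-specific material. The derived key exists only while the
    PIN is being entered -- it is never stored on the device.
    """
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), device_salt, rounds)

key = derive_key("1234", device_salt=b"example-device-uid")
# 32 bytes = a 256-bit key; a different PIN or salt yields an
# unrelated key, so there is nothing on disk to "find".
```

The many rounds exist purely to make each guess expensive, which matters only if the attacker can guess at all.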

Anonymous Coward says:

I don’t really see the big deal here from a security standpoint. If Apple can find a way to disable a security feature, that doesn’t make the device any less secure than it already was. The vulnerability already existed, and anyone else could have used it. Exploiting an existing vulnerability is not what makes the device more vulnerable; the fact that the vulnerability is already there is what makes it less secure. Next time Apple needs to make a more secure device.

Now Apple being compelled to help may be a different story and I’m not sure where I stand there. To what extent can the government reasonably compel a private party to provide services they don’t wish to provide?

And I don’t really see what the big deal is. Isn’t the encrypted data stored on some sort of flash memory? Can’t the encrypted data just be directly extracted and copied from its storage medium, without going through the rest of the device, and then placed on a very fast computer that contains the decryption and verification algorithm minus the delete portion? The computer could then brute-force the password at a very fast rate. If Apple is claiming this can’t be done, I call lies.

Anonymous Coward says:

Re:

They are not lying. There are a few things you simply aren’t aware of.

First off, Apple’s security scheme assumed that only Apple would have the signing keys to modify the software running on the device, so attackers would not be able to remove the 10 try lockout limit. The court is attempting to force Apple to perform the attack themselves. So no, this is not an ‘existing vulnerability’ per se. It is outside the assumptions made by their security model. Namely, the assumption that Apple themselves would not be trying to crack a particular user’s encrypted device.

Second, the data cannot be extracted without Apple’s help. The iOS disk encryption scheme stores device-specific keys to the flash memory somewhere on the motherboard (they may now actually be on the CPU die at this point). This is so you cannot put the memory chip from one iPhone into another and read it. This is in addition to any user-supplied password, and uses encryption that even the NSA cannot break with all the computing power on earth (we think). So to get those keys, the device needs to be booted up and have its software modified to report the keys. Again, modifying the software requires Apple to sign the update.

Just to shore up that last point: reading the keys directly from the chips using X-rays/microwaves would involve destroying the chips and risk destroying the keys as well, making the data irrecoverable. So getting Apple’s help is a reasonable approach from a technical standpoint.

Anonymous Coward says:

Re: Re: Re:

The iOS disk encryption scheme stores device specific keys to the flash memory somewhere on the motherboard. (it may now actually be on the CPU die at this point)

The “wipe” feature is not a re-setting of the 1s and 0s on the flash memory; it is the “forgetting” of the encryption key.

If one can pop the top off the chip and not destroy the place where the key is kept, the key can be read via probes.

Good old-fashioned police work should get most of the same data as reading the phone data.

Anonymous Coward says:

Re: Re: Re: Re:

Probably not, unless Apple really screwed up on the security.

The PIN or whatever that the user enters is not directly used for the encryption. Instead, it gets combined with other device factors (e.g., hardware identifiers), then gets run through a key-lengthening algorithm (e.g., thousands of rounds of PBKDF2), to generate the actual encryption key. The result is that the only way to brute-force decrypt the data, lacking some of the inputs, is to try more encryption keys than there are atoms in the universe.

There could be a flaw in the encryption that makes this easier, of course, but that’s now starting to pile up the mistakes that would be required in order to successfully decrypt the data.

Gandydancer (profile) says:

Re: Re: Re:2 Re:

If the only input that is missing is the 8-character PIN then the number of tries in a pure brute-force method is the number of possible 8-character PINs. The key-lengthening etc merely imposes computational delay, and if an iPhone can do it in a reasonable amount of time then more capable hardware will breeze through that. So it comes down to my first if.
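The arithmetic behind this point is worth making explicit. An 8-digit numeric PIN gives only 10^8 candidates, so even if key-stretching holds each guess to, say, 10 ms on capable hardware (an illustrative figure, not a measured one), the whole space falls in under two weeks:

```python
# Worst-case brute-force time for an 8-digit numeric PIN, assuming a
# hypothetical 10 ms per guess once rate-limiting is out of the picture.
pin_space = 10 ** 8          # every 8-digit numeric PIN
seconds_per_guess = 0.01     # illustrative key-stretching cost per try

worst_case_seconds = pin_space * seconds_per_guess
worst_case_days = worst_case_seconds / 86_400
# About 11.6 days: the stretching imposes delay, not security, once the
# attacker controls how many guesses may be made.
```

This is the sense in which the key-lengthening "merely imposes computational delay" against a short PIN.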

Uriel-238 (profile) says:

Re: Re: Re: Apple

I’m not entirely anon, but…

~ People have a right to privacy, and to access technologies that would assure it (e.g. impossible-to-break end-to-end phone encryption).

~ That said, I don’t see a legal reason why the FBI shouldn’t be allowed to try to crack open the phones of San Bernardino killers Syed Rizwan Farook and Tashfeen Malik. They’re dead and they have no rights.

~ I don’t think Apple can be forced to provide an extensive amount of service to the courts to try to crack the phone, though if the phone is kept secure by algorithm obscurity, that is an encryption failing, and it means those phones will eventually be hacked.

Considering the way true encryption works, I can see the courts demanding that Apple de-obscure its phones’ password security.

We have ways to turn a password, even a short one (less than 64 characters), into a cipher that is expensive to crack, so Apple has no excuse for providing a form of encryption hobbled so as not to be.

KnightGeek says:

Re: Re:

You’ve got the wrong idea here. There is no vulnerability that they are taking advantage of. They are being asked to write a special version of iOS that doesn’t have the delay between passcode attempts. Only Apple could do this, because the phone checks itself each time it turns on and matches signatures to ensure iOS is still secure and hasn’t been tampered with. You also couldn’t perform a manual analysis or extraction, because, as you stated, it would need the encryption algorithm, and there is no way Apple would provide that to them or anyone. That algorithm is actually unique per device, so again, only Apple can do this for them.

This opens a precedent issue.

nasch (profile) says:

Re: Re:

To what extent can the government reasonably compel a private party to provide services they don’t wish to provide?

That’s what I was wondering. Apple has nothing to do with this case, and they don’t own the hardware in question. Why can the government compel them to help? If they decided I had skills that would be useful to them in an investigation, could they have a court order issued that forces me to help in an investigation whether I want to or not? This seems very wrong.

ijuin says:

Re: Re: Re:

To what extent can the government reasonably compel a private party to provide services they don’t wish to provide?

They can compel you on the same grounds that they can draft any random guy into the military and send them to die in combat without consent. Any argument against the federal government being allowed to compel compliance that would stand up in court could also be used as an argument against the validity of conscription.

Anonymous Coward says:

Re: Re:

Karl explains how it would make iPhones (and other devices) less secure if the FBI is able to make Apple do this. It’s a very interesting read.

Screw The FBI: The Model Phone Manufacturers MUST Adopt

http://market-ticker.org/akcs-www?post=231126

So the FBI wants a custom firmware load that will allow:

Any number of password attempts. No “10 wrong and you’re done” auto-wipe.

Any means of entering them. No “must key them on the screen.”

This then means the FBI can attempt to “brute force” the password using a computer over the USB interface and, they demanded, any other means such as Wifi, cellular or Bluetooth!

The latter would mean that in the future they would not even have to physically possess the device. That’s right — they could hack it from anywhere, at any time.

(snip rest)

Matt (profile) says:

Re: Re:

No, it is not that easy to break AES-256 encrypted data (which, after a cursory google search, it seems Apple uses). If there is a side channel attack available to cracking an iPhone due to a flaw in Apple’s implementation of the protocol then that is a different matter. Apple introducing a backdoor (which, contrary to what the title of this article states is EXACTLY what is being demanded) would be a HUGE flaw in implementation and would render the phone insecure.

Uriel-238 (profile) says:

Re: Where's the NSA

The more specific question a court might be asking is where the crack-teams-with-supercomputers are for when they have the encrypted phone of a dead / uncooperative terrorist (not that many terrorists have encrypted phones, or much data on them).

I’d think the NSA and FBI have access to something meaty and mainframey that could have a brute-force go at an iOS7 iPhone, and then the whole 20-attempts-and-the-phone-bricks software is moot.

The encryption will be less moot, but that’s the real roadblock here.

Also the jurisdiction of the court (is that the right term? IANAL) to turn to an agency (or a company) and say Here, use your big nutcracker to crack this nut.

nasch (profile) says:

Re: Re: Where's the NSA

I’d think the NSA and FBI have access to something meaty and mainframey that could have a brute-force go at an iOS7 iPhone, and then the whole 20-attempts-and-the-phone-bricks software is moot.

How would that be moot? If they start brute forcing it and it wipes itself after 10 tries, all their massively powerful computer will accomplish is to brick the phone faster.

Uriel-238 (profile) says:

Re: Re: Re: They're not playing the ten-guesses game.

Because with a mainframe attack, they’re only examining the file structure and data, not using Apple’s wipe-after-X-fails routine.

The FBI crack software assuredly doesn’t give them only a limited number of attempts, and it assuredly doesn’t wipe the data.

That’s the point of a full-phone encrypt. By contrast, a Windows account password doesn’t encrypt the drive, so when the police nick your desktop PC, they’ll probably not even boot it, but analyze the (probably unencrypted) files on the drive.

nasch (profile) says:

Re: Re: Re:2 They're not playing the ten-guesses game.

Because with a mainframe attack, they’re only examining the file structure and data, not using Apple’s wipe-after-X-fails routine.

The FBI crack software assuredly doesn’t give them only a limited number of attempts, and it assuredly doesn’t wipe the data.

I’m not sure what you’re getting at. They cannot bypass the phone’s security features without Apple’s help, or they would just do that. They cannot offload the data and crack it outside the phone because the AES-256 encryption is not susceptible to brute force attacks in a useful amount of time with current technology. They must brute force the phone’s access key (not the same as the AES encryption key), which requires the phone hardware. This means they need a way to disable the wipe-after-10 feature.
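The arithmetic behind this point can be sketched roughly. The per-guess cost below is an assumption (Apple’s iOS Security Guide cites roughly 80 ms of key derivation per passcode try; the offline guessing rate is a deliberately generous hypothetical):

```python
# Back-of-the-envelope comparison (assumed figures, not measurements):
# brute forcing the short passcode ON the phone vs. brute forcing the
# full AES-256 key OFF the phone.

ATTEMPT_COST_S = 0.08      # assumed per-guess key-derivation cost on-device
PIN_SPACE = 10 ** 4        # a 4-digit passcode

on_device_hours = PIN_SPACE * ATTEMPT_COST_S / 3600
print(f"4-digit PIN, on-device worst case: ~{on_device_hours:.2f} hours")

# Offline, with the data cloned off the phone, the attacker faces the
# whole 256-bit keyspace instead of the tiny PIN space.
GUESSES_PER_SEC = 1e12     # hypothetical trillion-guess-per-second rig
aes_years = 2 ** 256 / GUESSES_PER_SEC / (3600 * 24 * 365)
print(f"AES-256, offline: ~{aes_years:.1e} years")
```

Whatever the exact per-guess cost, the asymmetry is the point: the PIN space is searchable in hours-to-days on the device, while the raw keyspace is not searchable at all.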

Uriel-238 (profile) says:

Re: Re: Re:3 They're not playing the ten-guesses game.

The last time I played the encryption game, depending on security through obscurity was a no-no. You assume that people know how it’s encrypted.

If they can force Apple to bypass the access key and by that break (or bypass) the AES encryption, then that means the encryption is pretty much hobbled. There is a back door (or at least a thin back wall), and people who shouldn’t have access to it do.

Apple shouldn’t have an option here that will help the court. If Apple does have an option here, then that means people shouldn’t be relying on that encryption in the first place. It’s a false product.

If this is the case (say, the PIN is used to generate one of a small number of AES keys), then the FBI can just have Apple hand over the algorithm, write their own code, and attack the data.

nasch (profile) says:

Re: Re: Re:6 Do you believe in magic, in a young girl's heart...

Anything they could do within the phone’s device-wiping framework they could do outside it.

That is incorrect. The key system of the phone is partly encoded in the phone’s hardware. If they remove the data from the phone, they could no longer access it through the phone’s security system, and would be left with brute forcing the encryption directly. This is not feasible.

KidOmaha (profile) says:

Re: Re: Where's the NSA

I’m not sure why you think that “not that many terrorists have encrypted phones, or much data on them.” The growing problem our intelligence agencies are facing is the prevalence of third party apps that make encrypted communications readily available to terrorists. Further, a court lacks any power to turn to some corporation or government agency and say, “Here, use your big nutcracker to crack this nut.” Finally, the point of this article and of all the comments is that the government DOES NOT have anything readily able to crack the encryption of this iPhone.

nasch (profile) says:

Re: Re: Re: Where's the NSA

The growing problem our intelligence agencies are facing is the prevalence of third party apps that make encrypted communications readily available to terrorists.

Has there been a terrorist attack that was facilitated by encryption? Even one? Has any intelligence agency shown any evidence that any of their investigations have been hampered by encryption?

Anonymous Coward says:

Isn’t one of the major rules in computer forensics basically “Thou shalt not fuck with the actual device”? It’s been a few years since I had anything resembling a data forensics class, but at the time I was under the impression that step 1 was to make a bitstream forensic image of whatever it was you were examining; otherwise you risk contaminating the evidence. Is it no longer possible to do that?

Arthur Moore (profile) says:

Re: Re:

Well yeah, but you run into the same issue that self-encrypting drives have. The data in flash memory is encrypted with a huge key which is kept on the encryption chip. There’s no way to read the key without decapsulating the chip, and that carries significant risk. Those things are meant to be tamper-proof, after all.

If Apple did this correctly, the encryption chip is a separate component that cannot have its firmware changed. It can even still be in the same package as the CPU. By tightly defining the security element’s inputs and outputs, you can create an extremely hard-to-crack system, even if it can’t receive firmware updates.

My fear is that the checks, including the counter for the number of retries, are handled by upgradable firmware. That would mean not only that Apple could crack any phone, but that the next time a bootloader jailbreak is found, everyone else could too.
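The “tightly defined inputs and outputs” idea can be shown with a toy model. This is illustrative only: the real secure element is fixed-function hardware, and the passcode check and wipe below are my own stand-ins, not Apple’s design:

```python
# Toy model of a fixed-function secure element: one input (a passcode
# guess), one output (the data key or nothing), and no update path.
import os, hmac, hashlib

class SecureElement:
    MAX_TRIES = 10

    def __init__(self, passcode: str):
        self._salt = os.urandom(16)   # stand-in for a value set at manufacture
        self._tag = hmac.new(self._salt, passcode.encode(),
                             hashlib.sha256).digest()
        self._key = os.urandom(32)    # the actual data-encryption key
        self._fails = 0

    def try_unlock(self, guess: str):
        """The ONLY interface the element exposes: returns key or None."""
        if self._key is None:
            return None               # key already destroyed
        tag = hmac.new(self._salt, guess.encode(), hashlib.sha256).digest()
        if hmac.compare_digest(tag, self._tag):
            self._fails = 0
            return self._key
        self._fails += 1
        if self._fails >= self.MAX_TRIES:
            self._key = None          # wipe enforced inside the element
        return None

se = SecureElement("1234")
se.try_unlock("0000")                 # two wrong guesses...
se.try_unlock("9999")
print(se.try_unlock("1234") is not None)  # ...the right one still works
```

Because the retry counter and the wipe live behind that one-call interface, no amount of fiddling with the OS outside it changes the rules; the fear in the comment is precisely that on real phones the counter might live in updatable firmware instead.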

Anonymous Coward says:

Re: Digital Forensics

They seem to want to avoid polluting the bitstream too much by ordering the tool to run in RAM… but yeah, you’re right, they’re fucking with the device. AFAIK they can’t rip a clone the normal way (i.e. over USB or a backup suite) because the device is locked, and is basically saying ‘Fuck off, I ain’t telling you shit. Fifth Amendment and all that, cocksucker!’ Any attempt would take an NSA or TLA lab specifically equipped to rip locked/encrypted data that doesn’t want to be copied. I don’t know if digital forensics software can force a mount of the device yet, but it seems that either:
The decryption algorithm is locked in hardware so a copy of the data is useless
OR
The government doesn’t want to let loose they CAN rip data anyways from locked/encrypted devices
OR
They can’t actually rip data from the device in any meaningful way

z! (profile) says:

Not an iPhone user here, but is it even possible to load new software while the phone is locked? Usually software update is a user-mode application, thus can’t be started while locked. Off hand, the only alternative that I see is to in-circuit rewrite the OS in the flash memory; if there’s a file system involved (and if there’s any crypto on that) it becomes somewhat difficult.

JBDragon says:

Re: Re:

I think this is an older iPhone running iOS 7, so security isn’t as tight as it got in iOS 8 and newer, where a password is required to do anything. I think in iOS 7 you could still plug your phone into a PC, run iTunes, and install a newer OS that way without having to unlock the phone. I’m not 100% sure on that, but I think that’s the case. So if Apple created a new iOS 7 version, called it say 7.4, signed it with a correct digital signature, and made it so you could brute force the password (disabling the ten-tries-then-wipe feature and the longer and longer delays between tries), that could be possible. Install the updated OS, then brute force the unlock. Of course you could already just wipe the phone and start over unlocked, but they want the data on it.

Once you get to iOS 8, you need to enter the password to do anything. You can’t update the phone by plugging into the computer and using iTunes either without the phone being unlocked. Apple really tightened up security on the iPhone. Part of the reasoning was that if someone steals your iPhone, they can’t plug it into a PC and wipe it or anything. It’s LOCKED UP and worthless at that point without the password. You’re less likely to get mugged for your phone if people can’t steal it, wipe it, and sell it for a couple hundred. If it’s locked up, it’s almost worthless. They could part it out, I guess, but that would be worth so little it’s not worth mugging people over.

To get people to use the security, Apple made it as simple as possible by adding TouchID and building it into the home button. It’s almost as fast to get onto an iPhone with security on as without it. I use an 8-digit number on my iPhone. Good luck trying to brute force that!

Anonymous Coward says:

Free labor for the fbi now?

So now this sets a precedent that if the Government does not want to pay for its workers or hire the best, it can just get a court order to force some other company to do their work for them.

The judge showed what a fool he is. Another dinosaur making decisions that he has no understanding of.

Kern says:

Re: Free labor for the fbi now?

Don’t forget that not long after the attack the FBI said it had figured out all of what the attackers had done for a day or two, except for about 15 minutes, and asked the public to come forward with information to fill in that gap. So how did the FBI do all that legwork so quickly? What doesn’t it know?

Whatever (profile) says:

A couple of things to note here: First off, if they have the phone in theory they can clone all of the memory without wiping it. That would mean they could have an unlimited number of 10 times tries. So the limit doesn’t really exist, it’s just there to safeguard.

Second, the software that enforces that 10 swipe limit could also be disabled on a phone that clones the memory of the locked phone. So yes, it can always be disabled, if Apple so desires. It’s just code, and it can be removed.

Third, love it or hate it, Apple likely does either have a back door or knows the most direct method by which to undo the encryption on their own phones. They created it, they will generally know the answer.

It is quite possible that there is no real way to easily unlock the phone except brute force. Apple may be able to look at the encoded results and perhaps determine a key length or something similar that could narrow the search, but likely they will have to use the same blunt tool the rest of us use, albeit with the advantage of not having to deal with the 10 and fail problem.

I suspect Apple already has a tool, but they aren’t going to tell anyone about it.

Is this a good ruling? Well, it’s not a terrible ruling. Love it or hate it, it is a very good indication that there are reasons that encryption does present certain drawbacks to law enforcement. This is an extreme case, but it does show the block, and shows that in exceptional cases, perhaps options are needed. I do understand that “options” means that hackers can do the same, but is the price worth the result to do otherwise?

Mike Masnick (profile) says:

Re: Re:

Yet again, almost nothing “Whatever” has to say is accurate.

First off, if they have the phone in theory they can clone all of the memory without wiping it. That would mean they could have an unlimited number of 10 times tries. So the limit doesn’t really exist, it’s just there to safeguard.

This is wrong. The encryption key is embedded in the hardware. Clone the memory and try it somewhere else and you’re done.

Second, the software that enforces that 10 swipe limit could also be disabled on a phone that clones the memory of the locked phone. So yes, it can always be disabled, if Apple so desires. It’s just code, and it can be removed.

Again, this is wrong. The key is in the hardware and cannot be cloned.

Third, love it or hate it, Apple likely does either have a back door or knows the most direct method by which to undo the encryption on their own phones. They created it, they will generally know the answer.

Apple has always claimed that they throw away the key. If this is NOT true that would be a HUGE discovery and would destroy a ton of trust in Apple.

Whatever (profile) says:

Re: Re: Re:

Always fun to watch you wave your superiority complex over the crowd. It would almost be awe-inspiring if it weren’t quite so, how can you say it, amusing?

Apple’s system in theory is all that, you are correct. However, much of it hinges on the question of whether Apple’s system generates truly unique codes that cannot be broken, read, or otherwise accessed. There is plenty of debate online on that question. There is also a question regarding the repeatability of the process under which the UID is created. There is the potential that the UID could be gleaned or otherwise determined by repeating the process under which it was created (because random numbers are rarely truly random).

The two real security measures that have to be overcome are the ten-attempt pincode limit and the 5-second delay per failed attempt (there to stop brute force attacks). If there is any way to determine the UID, the rest is a walk in the park. Taking the data out of the phone and putting it into another device with the same UID and no other security would turn this into a short project.

“Apple has always claimed that they throw away the key. If this is NOT true that would be a HUGE discovery and would destroy a ton of trust in Apple.”

Actually, Apple is very careful in how they phrase this. As the secure area is created by others, the potential is that Apple doesn’t have the method or retain the key, but others do. Plenty of chatter online in those areas as well. One would also have to wonder how Apple would deal with governments in places like China on this topic. There appears to be plenty of wiggle room here for Apple to have been telling the truth in concept but perhaps having outs it doesn’t want to talk about.

I actually wonder why Apple would be fighting so hard if they could just simply show that it doesn’t work, that they cannot do it, and brick the phone after a particularly bad attempt. I think they are very concerned that the government already knows the real answer (that it is in fact possible) and Apple is trying very hard to stick by their “it’s impossible” story.

Anonymous Coward says:

Re: Re: Re: Re:

“Actually, Apple is very careful in how they phrase this.”

“There appears to be plenty of wiggle room here for Apple to have been telling the truth in concept but perhaps having outs it doesn’t want to talk about.”

Zero wiggle room. All the legal babble and clever wording in the world won’t change the fact that if they crack it once, then it’s not secure, and no one will trust them again. Especially after they made a big deal about protecting users’ data.

Jim McDonald says:

Re: Re:

I am sorry, but if you read the first 10 amendments of the Constitution, you discover that the intent of the document was to LIMIT powers granted to the government by the public. The 4th Amendment specifically states that people shall be secure in their papers, possessions, and persons, unless warrants are provided.
The 5th Amendment specifically states that you cannot be compelled to give testimony against yourself. IN ANY CASE.
Passwords and encryption phrases are “Products of the Mind” according to the 11th district Court of Federal Appeals, and thus testimony. If the federal government wants to work at entering the safe without the combination, they are given permission to do so. What the FBI wants here is 2 things:
1. Simple access. No need to focus a lot of resources on this; just pop it into the Faraday cage, plug in this app, and voilà, we have it all.
2. The precedent that such an app will not be rendered unusable in the future.
Why did this go to an Admin Law Judge? (Or Magistrate?) Because a real judge would have thrown this out the door based on the 11th district court of appeals ruling.

Cops have a dirty, dangerous, difficult job. It is THEIR CHOICE. Want something easy? Be a burger flipper at McDonalds.

KidOmaha (profile) says:

Re: Re: Re:

I don’t think I disagree with the main point you are trying to make, but I want to point out a couple of things. First, there is no such thing as “the 11th district Court of Federal Appeals.” There are “District Courts,” which are the federal trial courts, and there are “Circuit Courts of Appeal,” the federal appellate courts. There is then the U.S. Supreme Court. The 5th Amendment states as you say it does, but it is not implicated in this matter. No one is prosecuting Apple for a crime, thus it is not refusing to testify against itself. The court’s order was not issued by an administrative law judge but by a federal magistrate judge. Federal magistrate judges routinely issue preliminary orders in both civil and criminal cases. I agree that the order is improper, but the fact that it was issued by a magistrate judge versus a District Court Judge is of no consequence.

TruthHurts says:

Re: Re:

The FBI, CIA, NSA, Homeland Security and TSA are already on the terrorist watch list tied for pole position. They only trail behind Congress and the Presidents since 9/11.

9/11 didn’t change the Constitution, nor the bill of rights.

That means every Patriot Act-like law, executive order, and kangaroo court set up post-9/11 is an act of treason, punishable by death.

When do we start stepping on their necks and balls forcing the arrests that should have already happened?

There are how many millions of us, and only thousands of them? With the military 100% on our side, as honorable veterans swore to uphold the Constitution above all other laws or orders, we’ve got this in the bag.

rorybaust (profile) says:

what error number will this get

If Apple can stop external repairers with an error 53, all they need to do is tell the court that they tried but got an error 54, and sorry, they need to buy a new device. But really, if the best brains in US justice can’t beat the security, then of course all Apple has to say is “No, it can’t be beaten,” and the fact that they have to ask should already mean it’s pretty robust.

Anonymous Coward says:

This iPhone will self destruct in 5..4..3..2..1..

Apple should add an optional timer and or counter to the locking settings. A timer should wipe the phone a certain amount of time (set by user) after the nth wrong passcode guess. And the timer should be started after the nth reboot if the phone hasn’t been unlocked in the interim. And someone really security conscious might just want the timer to countdown from the moment the phone is first locked. If you use it every day, it wouldn’t be wiped. If it’s stolen, 36 hours later, poof.

andy says:

Re: This iPhone will self destruct in 5..4..3..2..1..

“Apple should add an optional timer and or counter to the locking settings. A timer should wipe the phone a certain amount of time (set by user) after the nth wrong passcode guess. “

Uhhh, that’s already IN the iPhone, and the FBI wants Apple to eliminate this feature to make future attempts to unlock phones easier.

Mark Wing (user link) says:

Technically this isn’t as big of a deal as everyone is making it out to be. The whole premise of encryption is that having the data doesn’t help you without the key. Because math.

Erasing the data after n failed password attempts sure doesn’t hurt from a privacy standpoint, but that feature is not what is keeping your data secure.

Politically, this could be a huge deal, since our public policies regarding technology are mostly FUD-driven. Nobody with an IQ higher than room temperature would advocate undermining encryption, but there’s a lot of empty bobble heads putting out a lot of sound bites lately.

Tripod says:

Re: Re:

“Erasing the data after n failed password attempts sure doesn’t hurt from a privacy standpoint, but that feature is not what is keeping your data secure.”

It is an essential security feature when the passcode is only 4 digits. Anyone could work their way through an average of 5000 tries, 10000 at most, in a weekend without that protection.

The FBI doesn’t have any way of knowing if that feature is turned on for that phone, but they have to assume it is.

Apple can’t help the FBI with this because they took the security implementation seriously, and also because helping to create a backdoor would significantly damage their brand.

Anonymous Coward says:

Re: Re: Re:

The FBI doesn’t have any way of knowing if that feature is turned on for that phone

According to the writ application they filed in court, while they don’t know whether it’s currently turned on, it was turned on when the county issued him the phone, and it was on at the phone’s last backup. So it’s probably safe to assume that it’s on.

Danny says:

Re: Re: Stupid Judge

Like BitLocker on a laptop, or any device with a TPM chip to store the key. There is a small boot partition that asks for the passphrase to decrypt the key that will in turn decipher the main partition where the OS is held.

https://en.wikipedia.org/wiki/BitLocker

There are lots of examples of encrypting the entire OS; they all rely on a similar method: a small boot program outside the OS, either in BIOS or a separate partition, that is used to decrypt the main drive where the main OS is stored.

cob says:

Re: Stupid Judge

You can put a locked iPhone into DFU mode and perform an update, which will erase all data on the phone. The order wants Apple to build a custom OS build that will NOT erase the user data, but leave it encrypted, and remove the ability for the OS to wipe the data if too many password attempts are tried.

AYO says:

Re: Stupid Judge

Nah, in iOS 9 only the user’s data is encrypted, not the OS partition. Which means the logic that increases the delay between guesses and wipes the phone after 10 incorrect passes can be changed.

However, in order for this to happen Apple would need to sign the software so it can in fact be installed.

So the brute force method is the best option, and thus the demand for Apple to sign and install a version that removes both the delay and the wipe feature.

Source: https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Anonymous Coward says:

“at an Apple facility; if the latter, Apple shall provide the government with remote access”

“The SIF will be coded by Apple with a unique identifier of the phone so that the SIF would only load and execute on the SUBJECT DEVICE”

This is reassuring. Also, I sure hope these maniacs used an insecure password, because I want the FBI to know as much as possible about these criminals without compromising everyone else’s cell phone security in the process.

TKnarr (profile) says:

Re: What Apple Gives, Apple Can Take Away

Up to a point. If I were designing it, I’d have set it up so that the firmware couldn’t be updated until the phone was unlocked by entry of the passcode. That would help close many of the holes exploited to root phones in general, and as a side effect would prevent what the FBI’s trying to do. Normal firmware upgrades would happen with the phone already unlocked, so it wouldn’t bother normal users, and a phone couldn’t have its firmware forcibly back-leveled to a version that was vulnerable to rooting, or a modified recovery image installed.

AYO says:

Re: Re: Re:2 so that the firmware couldn't be updated until the phone was unlocked

Only on untrusted devices.

“Trusted computers can sync with your iOS device, create backups, and access your device’s photos, videos, contacts, and other content. These computers remain trusted unless you change which computers you trust or erase your iOS device.”

/var/db/lockdown or %ProgramData% contain a list of trusted devices.

Note: you can only trust a device when your device is unlocked, something the shooters probably did when backing up candy crush to iTunes.

JBDragon says:

Re: Re: What Apple Gives, Apple Can Take Away

On iOS 7 and earlier, I think you can plug an iPhone into a PC and update the OS without the password, though you’d still need it to log into the phone. Or you could wipe the iPhone and then log in with no passcode, but all the data would be gone.

So I think it could be possible for Apple to create an iOS 7.5 with a good security certificate, where the ten-password-failures-and-wipe feature is disabled along with the slowdown between passcode entries. Brute force could then be used to unlock the phone and get the data.

With iOS 8 and newer, Apple closed everything up tight. You can’t update without a passcode. You can’t even just wipe the phone and start over. That’s part of the security, so that if someone mugs you and steals your phone, it’s almost worthless, and they’d get only a tiny fraction, if anything, of what they would have gotten before. So not even Apple can install some modified version of iOS on an iPhone with iOS 8 or newer. Security is a huge thing for Apple these days.

This whole error 53 thing is part of that. Replace the TouchID button with a third-party part that doesn’t match, and you end up with a phone that won’t work.

Anonymous Coward says:

(3) it will ensure that when the FBI submits passcodes to the SUBJECT DEVICE, software running on the device will not purposefully introduce any additional delay between passcode attempts beyond what is incurred by Apple hardware

Industry-standard algorithms derive the encryption key from the password through many rounds of computation, so anyone trying to brute force it must work through all of those rounds to see if a guessed password is correct. By design, this slows down the decryption process to help protect users with weaker passwords.

Link talks about password hashing, but the same general idea applies to disk encryption.
http://security.stackexchange.com/questions/211/how-to-securely-hash-passwords/31846#31846
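The key-stretching idea the linked answer describes can be demonstrated with Python’s standard PBKDF2. The iteration count here is illustrative, not Apple’s actual parameter:

```python
# Minimal demo of key stretching: a deliberately expensive derivation
# makes every brute-force guess cost real wall-clock time.
import hashlib, time

password = b"1234"
salt = b"per-device-salt"

t0 = time.perf_counter()
key_fast = hashlib.pbkdf2_hmac("sha256", password, salt, 1)
t_fast = time.perf_counter() - t0

t0 = time.perf_counter()
key_slow = hashlib.pbkdf2_hmac("sha256", password, salt, 200_000)
t_slow = time.perf_counter() - t0

print(f"1 iteration:        {t_fast * 1e6:.0f} µs per guess")
print(f"200,000 iterations: {t_slow * 1e3:.0f} ms per guess")
print(f"10,000 guesses at the slow rate: ~{10_000 * t_slow / 60:.1f} min")
```

The derived key is equally strong either way; the only thing the iteration count buys is that an attacker guessing passwords pays the full cost on every single try.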

Kevin Marquette (profile) says:

What are they looking for that

Is this phone really that important to the case? Can’t they get all the data they need from the CIA? They already have a copy of it, don’t they?

Besides the call logs and text messages stored at the cell carrier, what more are they really hoping to find?

What would they have done before we had cell phones?

Peter (profile) says:

Why does the FBI need to read the phone data in the first place? It is too late to prevent the crime, the perps are dead (no evidence needed for convictions), and the FBI itself has concluded that “They were not directed by [foreign terrorist] groups and were not part of any terrorist cell or network” (https://en.wikipedia.org/wiki/2015_San_Bernardino_attack)

Ryunosuke (profile) says:

hoi, mike, Apple put out a statement...

within the last couple hours, Reported from ABC within the last half hour (as of this post)

— Truncated important bits (Ie. TLDR)

1) US govt issued court order to break our own encryption (more of the court order itself) Let’s have a discussion about encryption NOW.

2) Need for encryption, How everyone uses encryption and what breaking IOS would mean.

3) Apple is shocked and outraged by San Bernardino and has co-operated to the fullest extent of the law (Important bit there)

** Important bits here **

“Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone. Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation.”

and a bit further on…

“The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.”

then goes on to describe the legality of using the All Writs Act for this purpose.

**Full article here**

http://abcnews.go.com/US/apple-opposes-judges-order-unlock-shooters-phone/story?id=36993038

Anonymous Coward says:

This is fucking insanity, talk about murrica’s “never let a good tragedy go to waste”.

If they get someone competent in there they can dump the storage on the phone and brute force it all they want. If the data is soooo valuable they can spend the money on the computing power to truly brute force it… But no, let’s use this “opportunity” to set the legal precedent that companies have to defeat their own encryption on demand.

John Bravo says:

Re: Re:

This thread seems to be suffering some technical confusion. I blame Hollywood for how badly the average person understands encryption, but that’s a rant for another day. There are two different issues at work here. First, people choose weak passwords. Second, it is impossible to brute force 256-bit encryption; the computing power to do this in any reasonable time wouldn’t fit in our universe.

Here is how these issues apply: Apple takes your password (no matter how weak), encrypts it with the 256-bit key found on a secure co-processor in your phone, and uses that result to encrypt your data. Copying the data and using it on another phone or computer system is useless; your password can only be used on your phone, because of the encryption key from the secure chip.

All the secure chip does is take one input (your password), run the 256 bit encryption on it, then output that result to the OS to decrypt your data. Neither the firmware, nor the OS can get the encryption key from the chip. It would be possible to write a new key to the chip, which is how it got there in the first place, but this would be useless, since combining it with your password would result in the wrong output to decrypt your data.

Obviously, humans are the weak link here, unless someone is really keen on information security, chances are they have a weak password. Apple has two methods to help reduce this vulnerability. First, 10 tries before they wipe some data needed to recover your data. Second, the encryption chip has a five second delay built into it. Which doesn’t seem like much, but if you want to try a few million different combinations to brute force the password, that will seriously increase the time it takes.

In practical terms, it could still take decades or centuries to brute force a simple password, but Apple isn’t going to just hand out a system vulnerability on the promise that Law Enforcement won’t just give a patch to anyone who “really needs it”. LE can’t keep control of guns or drugs, much less a small easily copied piece of software.
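The entanglement described in this comment can be sketched in a few lines. This is my own toy simplification, with PBKDF2 standing in for whatever derivation the hardware actually performs, and made-up device secrets:

```python
# Sketch: the data key mixes the user's passcode with a secret that never
# leaves the device's crypto hardware, so cloned flash is useless elsewhere.
import hashlib

def derive_data_key(passcode: str, device_secret: bytes) -> bytes:
    # Stand-in for the password/device-key tangling described above.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(),
                               device_secret, 10_000)

device_a_secret = b"\x13" * 32   # pretend: burned into phone A's silicon
device_b_secret = b"\x37" * 32   # pretend: a different phone

key_on_a = derive_data_key("1234", device_a_secret)
key_on_b = derive_data_key("1234", device_b_secret)

# Same (correct!) passcode, different hardware: different key, so the
# cloned ciphertext will not decrypt anywhere but the original phone.
print(key_on_a == key_on_b)   # False
```

This is why, as the comment says, guessing has to happen through the phone itself: even a perfect copy of the data plus the right password gets you nothing without the chip that holds the secret.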

That One Guy (profile) says:

Re: Re:

That would be the benefit of the doubt interpretation, which I would say is anything but deserved at this point. Me, I figure they simply don’t care what happens once they’ve got what they demanded.

If forcing Apple or another company to introduce a security vulnerability causes significant problems for the company and its customers in the future, it’s not their problem; they got what they were after.

Danny says:

Why this is an impossible request from the court

You mistake an iPhone’s unlock code for the iPhone’s encryption key. iPhones do typically use a 4-6 digit PIN as an unlock code. The user also has the ability to create a full alphanumeric password for the unlock code as well. However, that is simply the code that’s used to unlock the actual full encryption key that is stored within dedicated crypto hardware. Apple uses a dedicated chip to store and process the encryption. They call this the Secure Enclave. The Secure Enclave stores a full 256-bit AES encryption key.

Within the secure enclave itself, you have the device’s Unique ID (UID). The only place this information is stored is within the secure enclave. It can’t be queried or accessed from any other part of the device or OS. Within the phone’s processor you also have the device’s Group ID (GID). Both of these numbers combine to create half of the encryption key. These are numbers that are burned into the silicon, aren’t accessible outside of the chips themselves, and aren’t recorded anywhere once they are burned into the silicon. Apple doesn’t keep records of these numbers. Since these two different pieces of hardware combine to make half of the encryption key, you can’t separate the secure enclave from its paired processor.

The second half of the encryption key is generated using a random number generator chip. It creates entropy using the various sensors on the iPhone itself during boot (microphone, accelerometer, camera, etc.) This part of the key is stored within the Secure Enclave as well, where it resides and doesn’t leave. This storage is tamper resistant and can’t be accessed outside of the encryption system. Even if the UID and GID components of the encryption key are compromised on Apple’s end, it still wouldn’t be possible to decrypt an iPhone since that’s only 1/2 of the key.

The secure enclave is part of an overall hardware-based encryption system that completely encrypts all of the user storage. It will only decrypt content if provided with the unlock code. The unlock code itself is entangled with the device’s UID so that all attempts to decrypt the storage must be done on the device itself. You must have all three pieces present: the specific secure enclave, the specific processor of the iPhone, and the flash memory that you are trying to decrypt. Basically, you can’t pull the device apart to attack an individual piece of the encryption or get around parts of the encryption storage process. You can’t run the decryption or brute forcing of the unlock code in an emulator. It requires that the actual hardware components are present, and it can only be done on the specific device itself.

The secure enclave also has hardware-enforced time delays and key destruction. You can set the phone to wipe the encryption key (and all the data contained on the phone) after 10 failed attempts. If you have the data-wipe turned on, then the secure enclave will nuke the key that it stores after 10 failed attempts, effectively erasing all the data on the device. Whether the device-wipe feature is turned on or not, the secure enclave still has a hardware-enforced delay between attempts at entering the code: attempts 1-4 have no delay, attempt 5 has a delay of 1 minute, attempt 6 has a delay of 5 minutes, attempts 7 and 8 have a delay of 15 minutes, and attempts 9 or more have a delay of 1 hour. This delay is enforced by the secure enclave and cannot be bypassed, even if you completely replace the operating system of the phone itself. If you have a 6-digit PIN, it will take, on average, nearly 6 years to brute-force the code. A 4-digit PIN will take almost a year. If you have an alphanumeric password, the amount of time required could extend beyond the heat death of the universe. Key destruction is turned on by default.
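
For a rough sense of what that escalating delay schedule means for a brute-force attempt, here is a small calculator using exactly the delays quoted above (the specific figures are the commenter’s; actual delays vary by iOS version):

```python
# Cumulative hardware-enforced delay for n passcode attempts, using the
# schedule described above: attempts 1-4 free, then 1 min, 5 min,
# 15 min, 15 min, and 1 hour for every attempt from the 9th onward.
def cumulative_delay_seconds(attempts: int) -> int:
    schedule = {5: 60, 6: 300, 7: 900, 8: 900}
    total = 0
    for i in range(1, attempts + 1):
        if i <= 4:
            continue
        total += schedule.get(i, 3600)
    return total

# Trying all 10,000 four-digit PINs at the steady-state rate of one
# attempt per hour works out to over 400 days in the worst case.
worst_case_days = cumulative_delay_seconds(10_000) / 86_400
```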

Even if you pull the flash storage out of the device, image it, and attempt to get around key destruction that way, it won’t be successful. The key isn’t stored in the flash itself; it’s only stored within the secure enclave, whose storage you can’t remove or image.

Each boot, the secure enclave creates its own temporary encryption key, based on its own UID and a random number generator with proper entropy, which it uses to store the full device encryption key in RAM. Since the encryption key is stored encrypted even in RAM, it can’t simply be read out of system memory by reading the RAM bus.
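
The idea of keeping the master key encrypted even in RAM can be illustrated with a toy key-wrapping sketch. This is hypothetical: the real enclave would use its AES engine rather than an XOR pad, which is used here only to keep the example dependency-free:

```python
import secrets

# Toy sketch of per-boot key wrapping: the master key never sits in RAM
# in the clear. A fresh random pad stands in for the enclave's per-boot
# ephemeral key; XOR with a one-time pad is correct but illustrative only.
def wrap_master_key(master_key: bytes) -> tuple[bytes, bytes]:
    ephemeral = secrets.token_bytes(len(master_key))  # regenerated every boot
    wrapped = bytes(m ^ e for m, e in zip(master_key, ephemeral))
    return ephemeral, wrapped

# Unwrapping requires the ephemeral key, which exists only inside the
# enclave for the duration of this boot.
def unwrap_master_key(ephemeral: bytes, wrapped: bytes) -> bytes:
    return bytes(w ^ e for w, e in zip(wrapped, ephemeral))
```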

The only way I can possibly see to potentially unlock the phone without the unlock code is to use an electron microscope to read the encryption key from the secure enclave’s own storage. This would take considerable time and expense (likely millions of dollars and several months) to accomplish. This also assumes that the secure enclave chip itself isn’t built to be resistant to this kind of attack. The chip could be physically designed such that the very act of exposing the silicon to read it with an electron microscope could itself be destructive.

TLDR: Brute forcing the unlock code isn’t at all possible through pretty much any means…reasonable or even unreasonable…maybe…JUST MAYBE…it’s possible through absurdly unreasonable means.

Mike Masnick (profile) says:

Re: Why this is an impossible request from the court

This is all good and accurate… except for the fact that it’s an iPhone 5C, which doesn’t have the Secure Enclave.

This point, which I discuss in a post that will be going up soon… does make it clear that the request is impossible for newer iPhones but could still apply to older iPhones, including the one in this case.

What’s not entirely clear is if there is still a key encoded in the hardware of the 5C which may effectively do the same thing. But that seems to be something that no one is quite sure of.

John says:

Re: Why this is an impossible request from the court

Just a question – referring back to the ‘safe’ analogy (inappropriate if talking about decryption, but not if talking about brute-force attacks on a passphrase, where the attacks are being ‘slowed down’ or ‘frozen’ if too many attempts are made) – I have had firmware recovered from locked microprocessors by a company that ‘destroys the bit’ that essentially ‘pins the lock’ – which then allows them to download the firmware and create a file that lets me program other microprocessors (for the techies: the recovered code is an image, not decompiled source). Maybe the government should be requesting the location of the circuit trace, or memory location, in the processor or memory that would enable the flash erase – and just blow that trace. (Drill the safe.)
Granted – this was done on a microprocessor much smaller, and less capable – but maybe brute force is the answer. As is so often the case, insider attacks are easiest:)

jameshogg says:

Who needs copyright law when you can design device-unique combined hardware/softwa-*cough*- Digital Rights Management that makes the copying of flash data exponentially hard? Apple seems to be able to stop the copying process so well with natural scarcity as it is that they don’t even need copyright law! I guess the FBI will be joining forces with the EFF to condemn DRM then.

Anyway, I wouldn’t want to be one of those Apple programmers. One slip-up, one bug, one wrong library, and you face possible federal prosecution for triggering the self-erase function.

What would be hilarious is if they unlock the phone only to find third-party advanced RSA encryption on the phone without any private prime-number keys – the only location being inside the criminal’s mind, memorised using mnemonics. Because after all that is the only way you can guarantee (for now) that nobody will ever crack your messages.

jameshogg says:

Re: Hmmmmm

You’re quite right that the murderer should be subject to the full force of the law.

And “the full force” is not the same thing as an unstoppable force. Also, you’re up against an immovable object by the looks of things.

From the post by Danny above, I highly doubt Apple will be able to do this faster than the FBI can.

morganwick (profile) says:

Re: Hmmmmm

Forgetting that there’s this little thing called the Constitution that explicitly says murderers do have rights at least until they’re prosecuted in a court of law (yes, it still exists as much as the government likes to pretend otherwise), it’s technically impossible to open the phone without opening up a massive vulnerability in everyone’s iPhone (and no, “those smart people in Silicon Valley should be able to find a way” isn’t good enough, again as much as the government likes to think otherwise). What’s so hard to understand here?

Leigh Beadon (profile) says:

Re: Hmmmmm

What’s so hard to understand here?

Well, for me, it’s what exactly you mean by “murderers should have no rights.” Leaving the encryption thing aside momentarily, that statement by itself is either (a) poorly thought through, or (b) quite radical and monstrous.

Are you saying convicted murderers shouldn’t have the rights that protect them from cruel and unusual punishment, for example? What about their ongoing right to due process, including their right to appeal their conviction? Or their right to access the parole process?

What about the right under the 14th Amendment not to be treated unequally based on race, sex or creed? Prisoners retain that right. Should they lose that, so we’re free to punish murderers more or less harshly based on their race? Do they lose their right to medical care? Do disabled prisoners lose their right to accessible prison facilities?

Debate the encryption aspect all you want – but “murderers should have no rights” is a barbaric starting point.

Leigh Beadon (profile) says:

Re: Re: Re: Hmmmmm

Too bad for those persecuted and arrested unjustly by the Government, no?

That’s part of it, but it’s also about those justly arrested and convicted.

I mean look, I’m not overflowing with sympathy for murderers, and there are certain breeds of human monster that make me feel they deserve any horrors they must endure. But it isn’t all about sympathy or what anyone “deserves” — it’s about what our treatment of the guilty does to us. It’s not healthy for a human being to be able to throw someone in a hole to suffer and die, to stand over a starving wretch and feel nothing, to hurl stones with glee and cheer when they draw blood. It’s not healthy for a society to condone those things, or to ignore them. Perhaps one can personally believe that certain people deserve those treatments, because their actions have rendered their humanity forfeit — but there is no way to dish them out without sacrificing your own humanity to do so.

Ninja (profile) says:

Re: Re: Re:2 Hmmmmm

I hear you and agree, my bad for not being clear on my reply. I’m not that good of a human in these regards. I don’t have sympathy towards those monsters and I honestly don’t feel bad when they are lynched or killed in any ‘vengeful’ way. Wrath is a bad feeling. But I’m wrong. I know it and I actively fight this part of me and I wholeheartedly agree with you.

Leigh Beadon (profile) says:

Re: Re: Re:3 Hmmmmm

Yeah but that’s the thing – everyone feels that, and I wasn’t suggesting you’re any different. And that’s precisely why America and so many other modern states recognized that bans on cruel and unusual punishment, and other rights even for criminals, were an important founding principle – because a system can (theoretically) be immune to wrath in a way people can’t. So sure, people can and will all feel that rage, but when a whole community endorses it you end up with children treating “stone the prisoner” (metaphorically or literally) as a game they look forward to. And anyone who can do that, can do it to anyone. Our brains are only concerned with justice at the upper levels – the deeper bits that get numbed to, and then practiced at, harming people are much less discriminating.

Anonymous Coward says:

Re: Hmmmmm

Apple should open this phone and any other similarly scummy murderers’ phones as required.

Doing so makes millions of other people’s phones also insecure.

Murderers should have no secrets — whether dead or alive. They should have no rights.

What about everybody else?

What’s so hard to understand here?

Indeed.

john says:

Re: Re: Hmmmmm

Involuntary servitude or Involuntary slavery is a United States legal and constitutional term for a person laboring against that person’s will to benefit another, under some form of coercion other than the worker’s financial needs. While laboring to benefit another occurs also in the condition of slavery, involuntary servitude does not necessarily connote the complete lack of freedom experienced in chattel slavery; involuntary servitude may also refer to other forms of unfree labor. Involuntary servitude is not dependent upon compensation or its amount.

Decorus says:

Re: Hmmmmm

This isn’t about one terrorist’s phone.

This is about giving someone the ability to hack every iPhone ever made. Once this exists, the next step will be all phones ever made. Do you want it so that anyone in the world can download a program for 5 bucks that can hack your phone and post all your private photos, texts and notes online? Then sure, let’s give the FBI what they want.

If you are like me, you don’t want anyone to be able to do what they are asking.

Let’s put it this way: a Senator uses an iPhone and has pictures he took with a prostitute. Now that China has paid an FBI agent 1 million dollars for the program, China can access his phone, copy those pictures and blackmail a Senator.

A movie star has an iPhone; now some sleazebag who paid 50 bucks online can copy all of his texts to his gay lover and post them online for the entire world to see.

Karl (profile) says:

Apple's security implementation, from Apple itself

If anyone is interested, here is the explanation of how Apple uses encryption to make their devices secure:

https://www.apple.com/business/docs/iOS_Security_Guide.pdf

This is for iOS 9.0 or later, so I don’t know if it applies to the phone in question.

But everything in this PDF shows that Danny’s comment is correct (and Whatever’s is wrong).

DannyB (profile) says:

The REAL questions we are dancing around

It could be argued that since the device can receive over the air (OTA) updates, that Apple can subvert the device, even if it cannot break the encryption.

Apple could develop (at whose expense?) an OTA update that collects the password the next time the user unlocks the encryption.

But can Apple be compelled to develop this? Even if the government pays, it cannot pay Apple for the lost opportunity cost of diverting resources from developing new products. Money cannot make up for time-to-market.

In this situation, Apple could selectively deploy such an OTA update to a single target. But then this is just the camel’s nose under the tent, or the foot in the door.

The real questions we’re dancing around, by suggesting that Apple COULD defeat encryption by expending tremendous resources, are really these:

Should Apple (and everyone else) be LEGALLY BARRED from building a secure product?

Even though there may be no such law in writing, the effect becomes just the same. If you can, through tremendous cost and effort, manage to defeat encryption, then you should be required to do so at the government’s mere whim and slightest wish.

Other questions:

Can Apple (and anyone else) be COMPELLED to expend tremendous resources to break into a device? At whose expense? Will all of their expenses be compensated, including a lost market because Apple diverted resources away from product development? Is there some level of cost (exact dollar value please) at which Apple is no longer required to break into a device? What if Apple deliberately engineers a device to ensure that the cost to break in will exceed this threshold?

And finally, a lesson for those concerned with privacy: once your phone has been seized, they may not be able to unlock it, but once the bad guys return it to you, it may have been compromised with software such that your next successful unlock of the device opens it up for them to rummage through, fishing for – or manufacturing – evidence.

JoeDetroit (profile) says:

What could possibly be on this phone anyhow?!

Regardless of what is being asked of Apple…
what exactly could be on this phone that is not already known? Is it not true that in the first 24 hours law enforcement obtained all communications to & from this device?

Seriously. What am I missing here? Will there be a to-do list or something? Something other than pictures of their kids or their last vacation?

David Taylor says:

As an Operating Systems and Information Risk Management expert having worked for Dell, Microsoft, and two other Fortune 500 companies directly – not to mention having visited most of the rest – I cannot even begin to tell you just how serious this problem is at this time. As a Veteran and concerned citizen, I want to see terrorism stopped. But I agree with Apple on this matter: http://www.apple.com/customer-letter

KidOmaha (profile) says:

Re: Re:

Ummm… I don’t think the FBI is interested merely in the phone numbers called by that phone, or that called it. They are interested in the wealth of information we all store on our phones: identities/locations of possible terrorists, banking information, travel plans, etc. The information stored on an iPhone far exceeds just phone numbers.

KingChris says:

US Government INEPT!

Why don’t the FBI and US Government just do a better job and stop importing terrorism? Millions of Americans must lose their privacy and personal freedom because the U.S. Government continues to give green cards to radicalized terrorists!

What is Apple supposed to do about it? Even if the government had the encryption keys it needs, would it still have been able to stop the San Bernardino massacre? NO!

Government can go F***** themselves!

John Digweed says:

Apple doesn’t have to give the Feds anything special. The order says it can be done at an Apple facility. Why can’t they just unlock the phone and give it to the Feds? There is nothing that says they need to provide the Feds with software or anything of the sort – simply a means to unlock the phone, or an unlocked phone itself. It also says they need to provide the government with an estimate of costs. These costs should include the line item that their best programmers will be working on this and not other projects which would advance the future of Apple, so it becomes very expensive. Apple should take the phone, find a way to unlock it, give it back to the Feds, and charge them $50 billion to do it. I doubt the government would ask again unless they felt it was very necessary. Money rules the decision on both sides of the coin; give ’em a price they can’t pay unless they really need it. If the government would agree to spend $50 billion for an unlocked device, it’s probably important.

Monday (profile) says:

There is no such fallacy that gets around what's going to happen.

This is a really technical request, MEANING whoever helped draft the request is obviously a “Nerd” (I’ve always been fond of that title, and it has now become chic). They know completely what this means in the short term and, even more troubling, the long term: access to any phone, anytime, is at hand, and they are trying to help make it happen.

I have had the same mobile phone since January 2002 – the battery lasts a week – and have recently been considering getting something new (Galaxy, iPhone, etc.), although I really don’t need the internet, or to read someone’s texts when I’m doing something else I’m sure is more important. But as I read this post, I was impressed with the ’10 fails and it’s gone’ feature you’ve mentioned… I did not know that.

Nevertheless, getting software, hacks, and other little treasures from Apple to get around this will definitely lead to more audacious requests because it sets precedent, and the FBI won’t really need anything after that – including physical warrants – to access any phone anytime. It also sets precedent in ‘forcing’ other manufacturers to ‘comply’ with the FBI’s ‘requests’.

It is not a ‘slippery slope’. The inevitable bench warrants to access phones – for doing anything from watching a .tor to liking Met-Art to finding out why pressure cooker bombs were so popular to learning about the bridges of New York – are going to be fair fodder for any request, by any agency… do you think that technology is going to stay locked up in FBI offices and that they will be the only ones to use it?

I hope Apple can get around it, through some law that was written in the eighteenth century, and that they do not comply with the order. It is a disaster waiting to happen.

Kevin says:

While it’s worth noting that the order doesn’t ask Apple to decrypt the data (Apple, and all other known agencies, lack the computing power needed to do so), or to provide the password (which they wouldn’t know unless the guy told them), asking Apple to remove the only safeguard against brute-force attacks on a 4/8-digit PIN (easily doable) is tantamount to asking them to allow the device to be easily decrypted by anyone with moderate technical capabilities.
If you had a locker combo with 3 numbers, someone with a lot of time on their hands could break it, but that time constraint makes you comfortable that nobody will. But now imagine there exists a person who can test 1,000 combos a second. To stop such a person, one reasonable option if you use locker combos would be to set a maximum number of tries per day – or a maximum number of tries, period, after which the contents of the locker are considered compromised and are destroyed.
Obviously Apple already has a backdoor of sorts, in that they can update their OS and have it run on the same data. My guess is that if someone were dedicated enough, they could probably exploit this at some great cost. Since the FBI likely lacks both the time and resources to accomplish this for such a relatively small concern as investigating this attack, they are making a move to have Apple make the change, since Apple already has the OS source code and would know how to go about making the change in a much cheaper, more timely manner.
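
Kevin’s point about why the safeguards matter can be made concrete. If the wipe and delay features were removed, the only remaining throttle would be the key-derivation time itself – roughly 80 ms per guess, per Apple’s iOS Security Guide. A back-of-the-envelope sketch (the 80 ms figure and numeric-only PINs are assumptions):

```python
# Worst-case brute-force time if the escalating delays and the 10-try
# wipe are disabled, leaving only the ~80 ms per-guess key-derivation
# cost (figure from Apple's iOS Security Guide; treated as an assumption).
def brute_force_hours(pin_digits: int, per_attempt_s: float = 0.08) -> float:
    return (10 ** pin_digits) * per_attempt_s / 3600

# A 4-digit PIN falls in well under an hour; a 6-digit PIN in about a day.
```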

Wyldkat (profile) says:

VERY misleading headline (doesn't match article's updates)

Please update your headline so we can share this with others in response to the ongoing dialogue.

With the updates you’ve added to the article, it now reads the OPPOSITE of what the headline actually says. I see what you’re trying to say with the headline (that it’s essentially worse than what the news media are reporting, or at best, semantics regarding methodology), but it’s too long of a headline and most social media truncates it, so every time I post the article as a supportive follow-up, it looks like I’m correcting the info with an urban legends corrective instead of adding more detailed info and a link to the court order.

john says:

Involuntary servitude ???

Isn’t the court ordering Involuntary servitude which is prohibited by the 13th amendment?

Involuntary servitude or Involuntary slavery is a United States legal and constitutional term for a person laboring against that person’s will to benefit another, under some form of coercion other than the worker’s financial needs. While laboring to benefit another occurs also in the condition of slavery, involuntary servitude does not necessarily connote the complete lack of freedom experienced in chattel slavery; involuntary servitude may also refer to other forms of unfree labor. Involuntary servitude is not dependent upon compensation or its amount.

Anonymous Coward says:

Re: Involuntary servitude ???

Yes, it is involuntary servitude. And the Constitution’s prohibition is absolute: it states that involuntary servitude shall not exist within the US. Do you think that will stop the government here any more than it stops them from compelling jury duty, or the draft, or hospital employees to draw blood from DUI suspects, or photographers from attending weddings they’d rather not attend? Heck, every time it snows, the government even compels you to shovel the government’s sidewalk that happens to run in front of your property.

But another interesting thing here is that Apple actually claims to own the specific copy of the software on each of their phones. “This software is licensed, not sold”. A small part of me wants to make that have a negative consequence for them, for once. You claim you don’t sell this software? Then fine, it’s yours; but if you retain ownership that means you can’t just claim you have nothing to do with it anymore.

Jim Gianatsis (user link) says:

Apple Is Being Unreasonable.

Apple is damn stupid. They should have treated this as a 1-time request from the federal government to help solve a terrorist crime, and help keep all people safe. Now Apple could be charged with hindering an investigation and aiding terrorism. The NSA already monitors all world phone and internet communications in the name of National Security, so what’s the difference?

Anonymous Coward says:

Re: Apple Is Being Unreasonable.

They should have treated this as a 1-time request from the federal government

This isn’t exactly the first time the government has demanded Apple do something to help them break encryption. And once the software is written, the government will come back to them every single time it wants them to unlock a phone. You can’t treat something as a 1-time request unless it only happens 1 time.

KidOmaha (profile) says:

Re: Apple Is Being Unreasonable.

The difference is that, in this case, the government is attempting to force a private company to create code to defeat its own encryption. And it isn’t for just this one case: creating a backdoor to encryption would mean that it works on ALL iPhones, not just the one in question. Further, it wouldn’t just be for one phone. If this order stands you can rest assured that the FBI, other law enforcement agencies and other governments will demand the same for other phones. The Manhattan DA admitted in a recent interview that he will demand Apple do the same with respect to 155-160 phones in his possession. Here’s a link to Apple’s open letter to its customers that also explains why your argument must fail: http://www.apple.com/customer-letter/

John Roberts says:

iPhone sales

I’m reasonably confident that for Mr. Cook this is more about future sales of Apple products than it is about protecting American rights. TBH I’m not an Apple user or fan, but just so no one tries to label me a hater, I’m pretty sure Samsung, HTC, LG, etc. would all take the same stand. Money over rights, the American way.

Anonymous Coward says:

iPhone is not cool anymore!

Apple is insulting my intelligence. It cares more for the safety of my privacy than the safety of my life?

Apple’s decision to protect their customers’ privacy has also provided unbreakable protection to ISIS organizations, letting them continue their operations of terrorism without exposure to the US law enforcement system.

It’s easy for Apple to reject the court’s request based on a righteous business principle.

I want to ask Apple:

Why does Apple want to separate its business interest from its community interest? Business is inseparable from community; how do you separate these two?

What about Apple’s people’s citizen awareness of public safety, homeland security and the well-being of humanity in general?

I look at my iPhone and I am very sad for all the money that I paid to its maker.

They don’t care about my true concern for life safety; they care more about promoting their unbreakable business image in the name of customer privacy.

nasch (profile) says:

Re: Re:

Apple’s decision to protect their customers’ privacy has also provided unbreakable protection to ISIS organizations, letting them continue their operations of terrorism without exposure to the US law enforcement system.

This article may put things in a different perspective: https://www.techdirt.com/articles/20160206/06570933540/senator-john-mccain-weighs-going-dark-debate-insists-that-he-understands-cryptography-better-than-cryptographers.shtml

Michael Corrieri says:

Apple could give the government a hardware solution.

Apple has their head up their backside, as usual. The backdoor could be the connection of a hardware solution, like Periscope was for CPU debugging – similar idea. A future iPhone could have connection points which allow the hardware to be connected to it, and provide the requested backdoor. Without the physical hardware connected, and a machine running that hardware with the right encryption, there would be no risk to consumers (unless they get a search warrant). I am anti-terrorist, and pro-security and privacy as well, but I also recognize that a search warrant is the people’s demand of discovery, and must be honored. They could do this in a way that hackers would need to have your phone and the hardware device with its own encryption keys to access your data. Simple enough – that’s not an online threat.

MB says:

NSA

Our definition of brute force and the NSA’s are not the same thing. They’ve got stuff we’ve never even heard of (if quantum code breakers exist, they’ve got ’em).

I’d argue it’s VERY likely this phone was cracked by the NSA, probably weeks ago. But there are other reasons this is happening:
1) If there is incriminating evidence against somebody alive, the FBI will want an alternative chain of custody so NSA agents aren’t appearing in court. That’s a huge thing – and it happens a lot. The NSA passes tips to the FBI, who go snooping until they find a plausible tack of investigation they COULD have found themselves, then just treat it like they DID find it themselves. It’s legal, and it happens with a lot of evidence that wasn’t obtained legally or whose source they want to keep out of the limelight; sometimes the judge will send the prosecution back to go find a path that will pass muster.
2) This isn’t about this case. The Feds have been strong-arming tech companies to provide them back doors for years (Google immediately rolled over like a whipped dog). I picture the conversation going: “So Apple, what do you think the public will say when we tell them you aren’t cooperating in this terror investigation? This wouldn’t even be an issue if you’d give us a tool to disable the multiple-failed-login bricking.” This is another chess move about encryption.

Anonymous Coward says:

Use the iforgot feature

Stick an activated SIM with a data plan in it, then press to enter the password and click “I forgot”; as they already have access to his ivlouf they can then log in to the email and click the link to reset the iPhone password and, hey presto, voila, they got in the phone, no warrant needed! Or, as the phone had been identified as being on iOS 7, Google for the password bypass bug, which means you can get access to all the stuff they are after! And again, hey presto, they got their Intel! No stupid Apple getting in the way!

nasch (profile) says:

Re: Use the iforgot feature

as they already have access to his ivlouf they can then log in

What is that word supposed to be? And is it really possible to reset the phone’s passcode via email? I know it’s not on Android because the phone PIN and Google account password are totally separate. This would be a major security oversight by Apple.

smaines (profile) says:

This is a non-issue, isn't it?

I am a developer, but am not as familiar with iPhone as with Android.

I believe these two things to be true (please tell me if these are wrong),

* An update must be signed by a secret key secured by Apple
* S/N in HW can be unspoofably interrogated in signed update

If the signed update says “if s/n equals bad-guy-phone open command pipe over wifi and bypass authentication-attempt delays”, why does it matter whether the FBI (or anyone else) has it?

This order should change nothing in either legal precedent or user security, and may be performed with a modicum of low-complexity code.

What am I missing here?

-SM
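
smaines’s two assumptions can be sketched in code. Caveat: this toy uses a symmetric HMAC as a stand-in for Apple’s actual asymmetric code signature, and the serial-number gate is hypothetical; the point is only that a serial check baked into the signed blob can’t be altered without invalidating the signature.

```python
import hashlib
import hmac
import json

SIGNING_KEY = b"stand-in for Apple's private signing key"  # assumption: symmetric stand-in

# Sign an update that names the one device it may run on. Because the
# target serial sits inside the signed blob, editing it breaks the signature.
def sign_update(payload: bytes, target_serial: str) -> dict:
    blob = json.dumps({"serial": target_serial, "payload": payload.hex()}).encode()
    sig = hmac.new(SIGNING_KEY, blob, hashlib.sha256).hexdigest()
    return {"blob": blob, "sig": sig}

# What the device's boot chain would check before installing.
def device_will_install(update: dict, device_serial: str) -> bool:
    expected = hmac.new(SIGNING_KEY, update["blob"], hashlib.sha256).hexdigest()
    if not hmac.compare_digest(update["sig"], expected):
        return False  # tampered or unsigned
    return json.loads(update["blob"])["serial"] == device_serial
```

Of course, the binding lives in software the signer wrote, not in physics: the same signing key could equally sign a build without the serial check, which is the concern the replies in this thread raise.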

Anonymous Coward says:

Re: This is a non-issue, isn't it?

What am I missing here?

The next step: being able to order any company to install a modified operating system on any identified device, totally bypassing all protection given by code signing in proprietary software. The company could be required to do anything to help a government agency break into a device, such as installing a key-logger. Note they would not need physical access to the device, just a unique device identifier for the company’s update servers to serve up the modified version of the operating system.

smaines (profile) says:

Re: Re: This is a non-issue, isn't it?

SM>> What am I missing here?

AC> The next step….

Thanks for replying.

Before we get to next steps, are my two technical assumptions (required key, unspoofable s/n), and the specific inference (the code to be provided in the order can only be used for the one device), correct?

-SM

Anonymous Coward says:

Re: Re: Re: This is a non-issue, isn't it?

The only required key is the signing key for software updates, and targeting a single device is a smoke screen, as it takes extra code in the update to target a single device. That is, once the compromised version of the OS is written, the OS is compromised wherever that version is deployed. Given how deep the spy agencies are into the Internet, they would be able to deploy the compromised version at will.

smaines (profile) says:

Re: Re: Re:2 This is a non-issue, isn't it?

AC> That is once the compromised version of the OS is written, the OS is compromised wherever that version is deployed.

Why is it compromised? That update on your otherwise identical phone will not function, as the serial number check putatively included in the update will fail, won’t it?

smaines (profile) says:

Re: Re: Re:4 This is a non-issue, isn't it?

“The operative word is putative, especially after several hundred warrants have been served on the company to produce a compromised version.”

If you’re suggesting that Apple will weary of protecting its customers and open the floodgates, that is a different discussion. If you are suggesting that the Government will demand Apple covertly install a digital wiretap on a future target, that is also a different discussion.

I am disputing the notion that specific compliance with this order gives the Government a broader capability. Those who say that the Government can change the code appear not to understand code signing, but are numerous in the discussion.

Anonymous Coward says:

Re: Re: Re:5 This is a non-issue, isn't it?

Code signing only states that the code comes from the signer, and is only useful so long as the signing key is not compromised; it has nothing to do with targeting an actual device. This goes beyond requiring a company to provide information that they have, and tries to set the precedent that a government can order a company to compromise their own code and then sign it.
Also remember there are only two numbers of interest, one and many: there is only one Earth, but many iPhones that could contain evidence. This would never remain a one-time request, but could rapidly become such a drain on Apple’s resources that they just give the signed code to the FBI.

smaines (profile) says:

Re: Re: Re:6 This is a non-issue, isn't it?

AC> Code signing only states that the code comes from the signer, and is only useful so long as their signing key is not compromised, and it has nothing to do with targeting an actual device.

Code signing is not a mere autograph: if the code is not signed, it will not run on any Apple device requiring a valid signature, correct? The code targets a single device, the signature enables it, and the code cannot be modified and still function. Can you not concede this point?
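The tamper-evidence claim can be illustrated with a toy example, using an HMAC from Python's standard library as a crude stand-in for Apple's actual asymmetric code signature (the real scheme uses public-key cryptography and a hardware-rooted chain of trust, not a shared key):

```python
import hashlib
import hmac

SIGNING_KEY = b"stand-in-for-apples-private-key"  # closely held by the signer

def sign(firmware: bytes) -> bytes:
    """Produce a signature over the exact firmware bytes."""
    return hmac.new(SIGNING_KEY, firmware, hashlib.sha256).digest()

def device_will_run(firmware: bytes, signature: bytes) -> bool:
    """The device refuses any image whose signature does not verify."""
    return hmac.compare_digest(sign(firmware), signature)

firmware = b'if serial != "TARGET": refuse_update()'
signature = sign(firmware)

assert device_will_run(firmware, signature)        # the signed image runs
tampered = firmware.replace(b"TARGET", b"ANYONE")
assert not device_will_run(tampered, signature)    # one changed byte fails
```

In other words, removing or widening the serial-number check invalidates the signature; to re-sign the modified image you would also need the signing key, which is the separate compromised-key discussion.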

Compromised keys are a separate discussion, and only Apple’s problem. They lost this round. Build a better phone next time.

Anonymous Coward says:

Re: Re: Re:7 This is a non-issue, isn't it?

If it is signed, then unless Apple writes even more code, it will install and run on any iOS 7 device. Installers do not target a particular machine, although they may check that the software they are upgrading is properly registered.
The real problem is that this could set the precedent that the Government can order any company to WRITE and install compromised code on any machine. Then they use terrorism as the reason for demanding that they be given signed code that they can install on any machine to aid an investigation.

nasch (profile) says:

Re: Re: Re:10 This is a non-issue, isn't it?

The installer doesn’t target the machine, nor does the signature. The signed code does.

I don’t know a lot about iOS code, I was just trying to make the point that Apple would put something in this update that would make sure it could only run on this one device. Perhaps that is already understood and I didn’t need to mention it.

Anonymous Coward says:

Re: Re: Re:9 This is a non-issue, isn't it?

At issue is not the specific request, but rather the precedent that is set if Apple loses this battle. It would mean that government agencies could use writs to force companies to compromise their software for installation on targeted machines. It is a small step from compromising machines in the government's possession to targeting the machines of suspected terrorists or criminals.

smaines (profile) says:

Re: Re: Re:10 This is a non-issue, isn't it?

I’m saying that it is not even a legal precedent. This order, once upheld, does not affect the prospects of any future order where a judge says, “open this specific phone”, and does not legally enable further activity, e.g. installing a covert wiretap.

Apple claims that the Government’s request is unprecedented. In opposing the order, it is Apple, not the Government, who seeks to make new law.

Apple lost this round when they tried and failed to build a device secure against themselves. It’s embarrassing. Too bad. Build a more secure phone next time, boys.

Can you imagine a theory under which a safe manufacturer could refuse to help defeat a safe in the shooters’ storage space? “But, then the Government will know how easy it is, and may ask more of us in the future!” The government already knows how easy it is, and yes they might ask more in the future. That’s for a judge to decide then, and having opened this safe today won’t affect it.

KidOmaha (profile) says:

Re: Re: Re:11 This is a non-issue, isn't it?

“Apple claims that the Government’s request is unprecedented. In opposing the order, it is Apple, not the Government, who seeks to make new law.”

How so?

“Apple lost this round when they tried and failed to build a device secure against themselves. It’s embarrassing. Too bad. Build a more secure phone next time, boys.”

How so? The point is that Apple’s phone IS secure, such that the FBI cannot get info off of it without getting a court order forcing Apple to write a different OS that introduces a backdoor to encryption. Once Apple writes that OS, after being compelled to do so, the cat will be out of the bag, and what was once a secure system will have been undermined by the court order. That is how it affects future activity.

Please do answer my question regarding your statement that it is Apple that seeks to make new law by opposing the court’s order. I am genuinely interested in your answer.

CalebBoone (profile) says:

Apple Must Open The San Bernardino Terrorist's iPhone.

Dear Ladies and Gentlemen:

The court order which requires Apple to assist law enforcement in opening the San Bernardino terrorism defendants’ Apple iPhones and other devices is correct and will be affirmed and enforced. Apple is embarrassingly wrong in its idiotic argument that the order jeopardizes Apple customers’ privacy.

Apple’s argument against the court order assumes that allowing law enforcement access to the San Bernardino Defendant’s Apple ‘phone will allow computer hackers worldwide access to all other Apple ‘phones.

That conclusion is based upon another assumption: that American state and federal law enforcement personnel cannot be trusted to maintain Apple’s security outside the specific criminal investigation at hand.

Or, another assumption: that the Apple personnel involved in opening the Apple ‘phone cannot be trusted to maintain Apple’s security outside the specific task of cooperating with the government on this single case.

If Apple cannot trust its own employees, that is Apple’s dilemma: not the American state or federal governments’. If Apple employees are not reliable, trustworthy and ethical, then Apple needs new employees.

If Apple does not trust California state law enforcement personnel in San Bernardino County or within the United States Government, then Apple needs to change its mind.

This entire debate is childish and silly. Apple must obey the court order. Surely Apple has the wherewithal to absolutely guarantee the integrity of the process of opening one telephone to comply with one subpoena.

It is idiotic to conclude that every Apple employee involved in opening this single ‘phone is a thief and a pirate who would leak the password to the world and destroy Apple customers’ privacy worldwide.

If that were true, then Apple’s worldwide security is already in shambles and all customers’ data are already exposed.

It is embarrassing to observe that everyone with an iPad or a Notebook, who can type, now proclaims himself an expert in law, mathematics, engineering, politics, worldwide business and government.

I have watched Tim Cook talk. Bless his heart. He has absolutely no knowledge of anything.

I have no doubt that Apple will be required to open the ‘phone and any other device necessary to this law enforcement investigation.

But not until after every Silicon Valley Hippy has had the opportunity to give a Ted Talk on American Corporate Capitalism and President Dwight David Eisenhower’s Military Industrial Complex.

Have a Dovely.

Sincerely yours,
Caleb Boone.

CalebBoone (profile) says:

Re: Re: Apple Must Open The San Bernardino Terrorist's iPhone.

Dear Gandydancer:

Please see the 5:15 a.m. CT (US) reply which I typed this morning for Nasch below.

I disagree with you.

I admit that I have not read all the Orders issued in this case, but I don’t need to. I don’t need to know the precise procedural path which this case has taken.

All I need to know is the Constitution.

The United States Constitution’s Fourth Amendment, and the hundreds of thousands of cases construing it, support the Order to Apple to disclose or provide the information necessary to open the iPhone, just as if it were a metal key to the door of the Defendant’s apartment and the murder weapon were inside the apartment.

Have a Dovely.

Sincerely yours,
Caleb Boone.

CalebBoone (profile) says:

Re: Re: Apple Must Open The San Bernardino Terrorist's iPhone.

Dear Gandydancer:

I have re-read your comment.

I note that you did not state disagreement with me, as such.

Therefore, I retract the sentence which I typed: “I disagree with you.”

I don’t disagree with you.

Instead, I offer the comment which I typed a few moments ago to further articulate what I typed yesterday.

Have a Dovely.

Sincerely yours,
Caleb Boone.