No, A Judge Did Not Just Order Apple To Break Encryption On San Bernardino Shooter's iPhone, But To Create A New Backdoor

from the slightly-different... dept

So… have you heard the story about how a magistrate judge in California has ordered Apple to help the FBI disable encryption on the iPhone of one of the San Bernardino shooters? You may have because it’s showing up everywhere. Here’s NBC News reporting on it:

A federal judge on Tuesday ordered Apple to give investigators access to encrypted data on the iPhone used by one of the San Bernardino shooters, assistance the computer giant “declined to provide voluntarily,” according to court papers.

In a 40-page filing, the U.S. Attorney’s Office in Los Angeles argued that it needed Apple to help it find the password and access “relevant, critical … data” on the locked cellphone of Syed Farook, who with his wife Tashfeen Malik murdered 14 people in San Bernardino, California, on December 2.

And you’d be forgiven for believing that the court has now ordered Apple to do the impossible. After all, for well over a year, the DOJ has been arguing that the All Writs Act of 1789 can be used to force Apple to help unlock encrypted phones. And that’s an argument it has continued to make in multiple cases.

Many people are now mocking this ruling, pointing out that with end-to-end encryption it’s actually impossible for Apple to do very much to help the FBI, which makes the order seem ridiculous. But that’s because much of the reporting on this story appears to be wrong. Ellen Nakashima, at the Washington Post, has a more detailed report that notes that Apple is actually required to do something a little different:

The order does not ask Apple to break the phone’s encryption, but rather to disable the feature that wipes the data on the phone after 10 incorrect tries at entering a password. That way, the government can try to crack the password using “brute force” – attempting tens of millions of combinations without risking the deletion of the data.

The order, signed by a magistrate judge in Los Angeles, comes a week after FBI Director James B. Comey told Congress that the bureau has not been able to open one of the killers’ phones. “It has been two months now, and we are still working on it,” he said.

In other words, the order does not tell Apple to crack the encryption when Apple does not have the key. Rather, it is asking Apple to turn off a specific feature so that the FBI can try to brute force the key — and we can still argue over whether or not it’s appropriate to force Apple to disable a key feature that is designed to protect someone’s privacy. It also raises questions about whether or not Apple can just turn off that feature or if it will have to do development work to obey the court’s order. In fact, the same report notes that Apple has no existing way to simply switch the feature off; it would have to write new software:

According to industry officials, Apple cannot unilaterally dismantle or override the 10-tries-and-wipe feature. Only the user or person who controls the phone’s settings can do so. The company could theoretically write new software to bypass the feature, but likely would see that as a “backdoor” or a weakening of device security and would resist it, said the officials, who spoke on the condition of anonymity to discuss a sensitive matter.

So you could argue that this is effectively the same thing as asking Apple to break the encryption, since it (apparently) has no direct access to turning off that feature. However, the specifics do matter — and most of the kneejerk responses to the order (and the reporting on it) are suggesting something very different than what the court order seems to say.

I think it’s still perfectly reasonable to argue that this order is highly problematic, and not legally sound. However, it is still quite different from what most are claiming. It also seems quite dangerous in its own right: Apple is being pressured to write code that undermines an important security feature, and will probably have little time to test or debug it, meaning the tool it is being ordered to build will almost certainly put more users at risk.

Update: Okay, we’ve got the full order and it is, indeed, troubling. Here’s the key part:

Apple’s reasonable technical assistance shall accomplish the following three important functions: (1) it will bypass or disable the auto-erase function whether or not it has been enabled; (2) it will enable the FBI to submit passcodes to the SUBJECT DEVICE for testing electronically via the physical device port, Bluetooth, Wi-Fi, or other protocol available on the SUBJECT DEVICE and (3) it will ensure that when the FBI submits passcodes to the SUBJECT DEVICE, software running on the device will not purposefully introduce any additional delay between passcode attempts beyond what is incurred by Apple hardware.

Apple’s reasonable technical assistance may include, but is not limited to: providing the FBI with a signed iPhone Software file, recovery bundle, or other Software Image File (“SIF”) that can be loaded onto the SUBJECT DEVICE. The SIF will load and run from Random Access Memory and will not modify the iOS on the actual phone, the user data partition or system partition on the device’s flash memory. The SIF will be coded by Apple with a unique identifier of the phone so that the SIF would only load and execute on the SUBJECT DEVICE. The SIF will be loaded via Device Firmware Upgrade (“DFU”) mode, recovery mode, or other applicable mode available to the FBI. Once active on the SUBJECT DEVICE, the SIF will accomplish the three functions specified in paragraph 2. The SIF will be loaded on the SUBJECT DEVICE at either a government facility, or alternatively, at an Apple facility; if the latter, Apple shall provide the government with remote access to the SUBJECT DEVICE through a computer allowing the government to conduct passcode recovery analysis.

If Apple determines that it can achieve the three functions stated above in paragraph 2, as well as the functionality set forth in paragraph 3, using an alternate technological means from that recommended by the government, and the government concurs, Apple may comply with this Order in that way.

The order also sets out that:

To the extent that Apple believes that compliance with this Order would be unreasonably burdensome, it may make an application to this Court for relief within five business days of receipt of the Order.

I would imagine that Apple will be taking the court up on that…

Companies: apple


Comments on “No, A Judge Did Not Just Order Apple To Break Encryption On San Bernardino Shooter's iPhone, But To Create A New Backdoor”

341 Comments
TechDescartes (profile) says:

TL;DR

So… have you heard the story about how a magistrate judge in California has ordered Apple to help the FBI disable encryption on the iPhone of one of the San Bernardino shooters?

A magistrate judge, an Apple employee, and an FBI agent agree to meet at a local bar. Only the Apple employee makes it. Why? Because the bar didn’t have a back door.

Troy Hoyt (profile) says:

Re: TL;DR

Too long; didn’t read?
…it must be nice to be so damned important that you don’t have time to read an article about something as important, and as troubling as this.

Are you actually that important, or are you actually just lazy?
You’re going to have to be a bit less lethargic if you hope to make a go of it in the world of stand-up… Especially with jokes like that one.

Anonymous Coward says:

Re: Re: TL;DR

The author didn’t bother to read the court order. Why read his story when the comments are so much better?

Paragraph 4, page 3, line 3 says Apple can retrieve the data in any way it wants. The FBI only needs the data, so they will concur with any technology Apple wishes to use. The court goes on to say Apple doesn’t have to give the FBI any of the technology used to get the data.

I am wondering why Tim Cook threw such a hissy-fit in a blog.

GMacGuffin (profile) says:

or maybe it's more calculated than it appears...

So, we have Judge Pym stuck between the US Attorneys and defense counsel arguing about what Apple can/cannot do, driving her mad. She’s gotta do something to move this forward, or at least to shut counsel up, and the best way to find out what Apple can or cannot do is to ask Apple.

And she may well even understand the implications of asking Apple to undermine its own encryption. But the best way to get that in the record is to give Apple a chance to fully explain why it is a bad idea, or impossible. Notice and opportunity to be heard.

And Apple is not likely to say “Yeah, we can write this backdoor brute-force buddy software” because that would mean that someone else could write that software, which would mean that Apple’s encryption now has a known point of potential compromise. So Apple will say it can’t write that software. And then the US Attys will hopefully shut up about it already.

(It’s not easy to become a federal magistrate …)

elemecca (profile) says:

Re: or maybe it's more calculated than it appears...

And Apple is not likely to say “Yeah, we can write this backdoor brute-force buddy software” because that would mean that someone else could write that software, which would mean that Apple’s encryption now has a known point of potential compromise. So Apple will say it can’t write that software. And then the US Attys will hopefully shut up about it already.

This isn’t entirely true: as noted in the order, any OS-level software to be run on an iPhone needs to be signed by a cryptographic key held only by Apple unless it exploits a vulnerability in the phone’s existing software to install itself (i.e. jailbreaking). It is therefore much easier for Apple to provide this kind of modified software than for a third party. The signature requirement also means that if, as requested in the order, Apple makes the putative custom OS image check the device ID of its host to ensure that it’s running on the target device, that check will have teeth because if it’s edited out the signature will no longer be valid.
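To make that concrete, here is a minimal sketch of the idea (my own illustration, not Apple’s actual boot chain; an HMAC stands in for Apple’s real asymmetric code signature, and the key and device identifier are invented):

    import hashlib
    import hmac

    APPLE_SIGNING_SECRET = b"stand-in for Apple's private signing key"  # assumption: only Apple holds this
    SUBJECT_DEVICE_ID = b"SUBJECT-DEVICE-0001"  # hypothetical identifier baked into the image

    def sign(image: bytes) -> bytes:
        return hmac.new(APPLE_SIGNING_SECRET, image, hashlib.sha256).digest()

    def boot_accepts(image: bytes, signature: bytes, running_on: bytes) -> bool:
        # The boot chain rejects any image whose signature doesn't verify...
        if not hmac.compare_digest(sign(image), signature):
            return False
        # ...and the signed image itself refuses to run on any other device,
        # so stripping out that check would invalidate the signature.
        return running_on in image

    sif = b"custom RAM-only image, locked to " + SUBJECT_DEVICE_ID
    sig = sign(sif)
    print(boot_accepts(sif, sig, SUBJECT_DEVICE_ID))      # True: right device, intact signature
    print(boot_accepts(sif, sig, b"SOME-OTHER-DEVICE"))   # False: wrong device
    tampered = sif.replace(SUBJECT_DEVICE_ID, b"ANY-DEVICE")
    print(boot_accepts(tampered, sig, b"ANY-DEVICE"))     # False: the edit broke the signature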

Also, the modified software wouldn’t actually weaken the disk encryption scheme itself. It would make it easier to attack weaknesses in the user’s choice of key on this particular device, but if the user chose a decent password a brute-force search would still take prohibitively long.

Of course, that doesn’t really change the likelihood of Apple complying with this order without a fight. It just affects your reasoning as to their motivations.

Danny says:

Re: Re: or maybe it's more calculated than it appears...

The timer for the pause between guess attempts is implemented in the hardware, not software, so no OS patch would affect it.

The key-deletion routine is also implemented in hardware, so this may not be bypassable either.

The other difficulty is that the OS would need to be decrypted and loaded in order to be patched.

The users on this board saying the phone can be cloned are mistaken: the key and UID are held in the secure enclave in the processor itself, so cloning the storage would not be a way to get unlimited batches of 10 guesses, as the key would not be present to test against.

This is technically impossible for Apple to achieve.
The courts just don’t like that.

Anonymous Coward says:

Re: Re: Re:2 or maybe it's more calculated than it appears...

A hardware CPU emulator assumes that the CPU and the memory are separate chips, or that all CPU pins are brought out of an integrated chip, and includes a pin to disconnect the CPU from its signals. This is probably not possible with the chips Apple uses, which is why it is easier to use a software emulator of the system to develop the code. Flashing a chip with debug software is not permissible when gathering evidence, as any change to the firmware renders any information gathered inadmissible.
Note that the order requires that the new software run in RAM and NOT alter any of the ROMs on the system. So it looks like the FBI wants software that can be used for fishing expeditions in the future, or hopes to find evidence to incriminate other people on the phone; otherwise the restriction would not be required, as there is no court case to be brought against the deceased owner of the phone.

Gandydancer (profile) says:

Re: Re: Re: or maybe it's more calculated than it appears...

I don’t believe for a moment that decrypting the data on the phone is impossible for Apple, though the exact method specified as the default may not work (but read the last paragraph of the first quote).

My understanding is that the phone in question is an old model that doesn’t implement the “secure enclave”.

But if the data can be extracted then it’s just stupid to do the decryption in the phone.

Gandydancer (profile) says:

Re: Re: or maybe it's more calculated than it appears...

“…if the user chose a decent password a brute-force search would still take prohibitively long.”

The FBI appears to believe that a brute force method will work, and I know of no reason to disbelieve them. Most people don’t, after all, use “decent” passwords. And my understanding is that the allowed password isn’t very long. The encryption key is, I understand, derived from a hash with the device ID, but if the latter is known, that doesn’t increase the number of necessary tries.

Jon Connor says:

Re: or maybe it's more calculated than it appears...

Actually, it is easier to become a federal magistrate than you think. Typically it’s major campaign donors who are appointed, or lawyers that the administration likes and wants to place on the federal bench. I know this because I used to work for a federal magistrate who donated gobs of money until they finally put him on the bench.

KidOmaha (profile) says:

Re: Re: or maybe it's more calculated than it appears...

You clearly do not know a whole lot about the appointment of federal magistrate judges. A federal magistrate judge is hired by the District Court judges in any particular district. They serve as at-will employees of the judges of the court. The “administration” has nothing to do with selection of magistrate judges. The same is true with respect to campaign donors. Those factors are somewhat true when discussing the appointment of federal district or Circuit court judges (who receive lifetime appointments after Senate confirmation), but that simply is not the case with respect to magistrate judges.

IP Lawyer says:

Re: or maybe it's more calculated than it appears...

No, she doesn’t have to do anything. She’s a federal magistrate judge. She can tell the fed to go pound sand and that would be that.

This order is deeply troubling. Getting any order overturned is very difficult. Make no mistake – this is not an invitation for Apple to defend itself and win on appeal – it is simply a blunt ruling in the favor of the FBI. And five days for a response is an insane timeline.

Your analogy is like saying that a referee making a call that awards points to your opponent minutes before the end of the game is simply an invitation to play harder. It’s bullshit.

GMacGuffin (profile) says:

Re: Re: or maybe it's more calculated than it appears...

I was presenting a somewhat cynical but pragmatic viewpoint. The judge must know it’s a big-ticket issue, and that Apple is in the best position to answer it.

Obviously the order is troubling, and I agree that five business days to oppose is prohibitively short. But then, perhaps it doesn’t take much to show that coding new software for one case is unduly burdensome.

I was going from the basic premise that I have never been before a magistrate who was truly an idiot (which I cannot say about Article III judges), so perhaps there was some calculation behind the otherwise frightening order. (AKA, trying to find reason and order amid the chaos.)

Fred_Flintstone says:

Re: or maybe it's more calculated than it appears...

So, what is the difference between breaking into a phone and turning off a key security feature so that someone can then ‘brute force’ their way into the phone? Same thing.

Apple has its own security technology, and it is pivotal to features like Apple Pay. Now, if Apple builds this back door to turn off the security feature, don’t you think hackers will find it pretty quickly and be able to do the same? This will damage Apple’s reputation and their business, so yes, they should speak up when the government makes demands on them.

Iverson says:

Re: or maybe it's more calculated than it appears...

The Federal Magistrate was prior to 1968, known as the “Park Commissioner”.

Following the Magistrates Act, a name change occurred, where the Park Commissioner whose duties were for the administrative adjudication of civil issues occurring within the jurisdiction of the “National Park”, now donned a black robe, to act as an administrative officer for the geographical jurisdiction outside the Federal Park to include the arena of the Federal District Court.

No, becoming a Federal Magistrate is not difficult, for it is first a political appointment.

Secondly, the statutory duties have not changed, for the Park Commissioner who now sits as the Magistrate is still an administrative officer whose actions are overseen by a Presiding Politically appointed Federal District Court Judge.

When the Federal Magistrate sat as the Park Commissioner, its decisions were then reviewed by the sitting Federal District Court Judge.

The Park Commissioner, who now sits as a “Federal Magistrate”, is required to submit their decisions to be reviewed by the sitting Federal District Court Judge.

One point is that, since 1968, the Park Commissioner, now known as a “Federal Magistrate”, may be required to be a registered “BAR” attorney listed on a State Registry compiled by a respective State’s high Court.

KidOmaha (profile) says:

Re: Re: or maybe it's more calculated than it appears...

This is largely incorrect. To begin with, your statement that “becoming a Federal Magistrate is not difficult, for it is first a political appointment” is simply false. Further, the term Federal Magistrate was abolished in the 1970s and they are now called federal Magistrate Judges. They are hired by the District Judges of a court. In almost all scenarios, the magistrate judges’ decisions are not reviewed by the District Judge. They can exercise both criminal and civil jurisdiction. Federal Magistrate judges are, in my 18+ years’ experience of practicing law, extremely well qualified as a general rule. While appointment to a lifetime District or Circuit Court judgeship can involve politics and requires Senate confirmation, magistrate judges are selected by merit and retained the same way. Magistrate judges ARE NOT required to “submit their decisions to be reviewed by the sitting Federal Court District Judge.” Here’s a good article to read to refresh your stale information regarding federal magistrate judges, their authority and their process of appointment. http://www.nced.uscourts.gov/pdfs/Selection-Appointment-Reappointment-of-Magistrate-Judges.pdf

Wastrel (profile) says:

Re: Possible?

Court orders are generally written by counsel (in this case, the attorneys for the government) and argued about before they are signed by the judge. As you can see from the document that is linked, it was originally a “proposed” order. No doubt Apple had a proposed order, too, and the judge decided in favor of the government’s.

Anonymous Coward says:

Apple doesn't follow court orders

https://ia601402.us.archive.org/34/items/gov.uscourts.wieb.358648/gov.uscourts.wieb.358648.240.0.pdf

See the part “However, the court did ORDER that Mr. Rassbach is the owner of the iPad with serial number F5RKXNH1DFHW and that he is entitled to all incidents of such ownership”

The device was called “lost” by the person it was seized from in a Writ of Replevin and it is locked and otherwise unusable.

It is in Apple’s best economic interest to keep devices locked so they can attempt to sell a replacement.

Even More Anonymous Coward says:

Re: Apple doesn't follow court orders

Really? You didn’t find the statement directly before the one you pasted relevant?

“The court noted that Apple was not a party to this case, so the court could not properly order Apple to do anything.”

Whether or not Apple is invested in avoiding this for economic reasons, in your example they were not ordered to do anything except recognize that Rassbach was the owner of the device. From there, it’s up to their own terms and conditions whether or not they have to help him unlock it.

David says:

Re: Send the phone to NSA

That won’t work. The key isn’t stored anywhere on the device. It’s in the owner’s head.

User types in PIN code -> iOS runs 50,000 PBKDF2 key derivation rounds (number of rounds is a guess). This key is then used to decrypt the file system.

Without the PIN, there is nothing to find.
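A rough sketch of that kind of derivation, using Python’s standard library (the round count is the comment’s own guess, the salt handling here is invented, and a real iPhone entangles the passcode with a hardware UID inside the crypto engine rather than doing this in plain software):

    import hashlib
    import os

    DEVICE_UID = os.urandom(32)  # stand-in for the per-device hardware key
    ROUNDS = 50_000              # the guessed iteration count from the comment above

    def derive_fs_key(pin: str) -> bytes:
        # Stretch a short PIN into a 256-bit key. Because the per-device value acts
        # as the salt, the same PIN yields a different key on every device, which is
        # why the data can't simply be carried off and decrypted elsewhere.
        return hashlib.pbkdf2_hmac("sha256", pin.encode(), DEVICE_UID, ROUNDS)

    print(derive_fs_key("1234").hex())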

Anonymous Coward says:

I don’t really see the big deal here from a security standpoint. If Apple can find a way to disable a security feature, that doesn’t make the device any less secure than it already was. That vulnerability already existed, and anyone else could have used it. Exploiting an existing vulnerability is not what makes the device more vulnerable; the fact that the vulnerability is already there is what makes it less secure. Next time Apple needs to make a more secure device.

Now, Apple being compelled to help may be a different story, and I’m not sure where I stand there. To what extent can the government reasonably compel a private party to provide services they don’t wish to provide?

And I don’t really see what the big deal is. Isn’t the encrypted data stored on some sort of flash memory? Can’t the encrypted data just be directly extracted and copied from its storage medium without going through the rest of the device, and then be placed on a very fast computer that contains the decryption and verification algorithm minus the delete portion of said algorithm? The computer could then proceed to brute force the password at a very fast rate. If Apple is claiming this can’t be done, I call lies.

Anonymous Coward says:

Re:

They are not lying. There are a few things you simply aren’t aware of.

First off, Apple’s security scheme assumed that only Apple would have the signing keys to modify the software running on the device, so attackers would not be able to remove the 10 try lockout limit. The court is attempting to force Apple to perform the attack themselves. So no, this is not an ‘existing vulnerability’ per se. It is outside the assumptions made by their security model. Namely, the assumption that Apple themselves would not be trying to crack a particular user’s encrypted device.

Second, the data can not be extracted without Apple’s help. The iOS disk encryption scheme stores device specific keys to the flash memory somewhere on the motherboard. (it may now actually be on the CPU die at this point) This is so you can not put the memory chip from one iPhone into another and read it. This is in addition to any user-supplied password, and uses encryption that even the NSA can not break with all the computing power on earth (we think). So to get those keys, the device needs to be booted up, and have its software modified to report the keys. Again, modifying the software requires Apple to sign the update.

Just to shore up that last point, reading the keys directly from the chips using x-rays/microwaves would involve destroying the chips and risk destroying the keys as well, making the data irrecoverable. So getting Apple’s help is a reasonable approach from a technical standpoint.

Anonymous Coward says:

Re: Re: Re:

The iOS disk encryption scheme stores device specific keys to the flash memory somewhere on the motherboard. (it may now actually be on the CPU die at this point)

The “wipe” ‘feature’ is not a re-setting of the 1’s and 0’s on the flash memory; it is the “forgetting” of the encryption key.

If one can pop the top off the chip and not destroy the place where the key is kept, the key can be read via probes.

Good old fashioned police work should get most of the same data as reading the phone data.

Anonymous Coward says:

Re: Re: Re: Re:

Probably not, unless Apple really screwed up on the security.

The PIN or whatever that the user enters is not directly used for the encryption. Instead, it gets combined with other device factors (e.g., hardware identifiers), then gets run through a key-lengthening algorithm (e.g., thousands of rounds of PBKDF2), to generate the actual encryption key. The result is that the only way to brute-force decrypt the data, lacking some of the inputs, is to try more encryption keys than there are atoms in the universe.

There could be a flaw in the encryption that makes this easier, of course, but that’s now starting to pile up the mistakes that would be required in order to successfully decrypt the data.

Gandydancer (profile) says:

Re: Re: Re:2 Re:

If the only input that is missing is the 8-character PIN then the number of tries in a pure brute-force method is the number of possible 8-character PINs. The key-lengthening etc merely imposes computational delay, and if an iPhone can do it in a reasonable amount of time then more capable hardware will breeze through that. So it comes down to my first if.
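Back-of-the-envelope numbers for that scenario, with every rate here an assumption made purely to show the shape of the argument: if the derivation has to run on the phone at roughly 80 ms per guess, an 8-digit PIN space takes months; if the device ID were known and the work could move to outside hardware, the same space collapses to minutes.

    PIN_SPACE = 10 ** 8                  # every possible 8-digit numeric PIN
    ON_DEVICE_SECS_PER_GUESS = 0.080     # assumed cost of one attempt on the phone itself
    OFFLINE_GUESSES_PER_SEC = 1_000_000  # assumed rate on dedicated hardware, UID in hand

    on_device_days = PIN_SPACE * ON_DEVICE_SECS_PER_GUESS / 86_400
    offline_seconds = PIN_SPACE / OFFLINE_GUESSES_PER_SEC

    print(f"on the phone : {on_device_days:.0f} days")      # ~93 days
    print(f"off the phone: {offline_seconds:.0f} seconds")  # ~100 seconds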

Uriel-238 (profile) says:

Re: Re: Re: Apple

I’m not entirely anon, but…

~ People have a right to privacy, and access to technologies that would assure it (e.g. impossible-to-break end-to-end phone encryption).

~ That said, I don’t see a legal reason why the FBI shouldn’t be allowed to try to crack open the phones of San Bernardino killers Syed Rizwan Farook and Tashfeen Malik. They’re dead and they have no rights.

~ I don’t think Apple can be forced to provide an extensive amount of service to the courts to try to crack the phone, though if it is kept secure by algorithm obscurity, that is an encryption failing, and it means those phones will eventually be hacked.

Considering the way true encryption works, I can see the courts demanding that Apple de-obscure their phone’s password security.

We have ways to turn a password, even a short one (less than 64 characters), into a cipher that is expensive to crack, and so Apple has no excuse for providing a form of encryption that is hobbled so as not to be.

KnightGeek says:

Re: Re:

You’ve got the wrong idea here. There is no vulnerability that they are taking advantage of. They are being asked to write a special version of iOS that doesn’t have the delay between passcode attempts. Only Apple could do this, because the phone checks itself each time it turns on and matches signatures to ensure iOS is still secure and hasn’t been tampered with. You also couldn’t perform a manual analysis or extraction, because, as you stated, it would need the encryption algorithm, and there is no way Apple would provide that to them or anyone. That algorithm is actually unique per device, so again, only Apple can do this for them.

This opens a precedent issue.

nasch (profile) says:

Re: Re:

To what extent can the government reasonably compel a private party to provide services they don’t wish to provide?

That’s what I was wondering. Apple has nothing to do with this case, and they don’t own the hardware in question. Why can the government compel them to help? If they decided I had skills that would be useful to them in an investigation, could they have a court order issued that forces me to help in an investigation whether I want to or not? This seems very wrong.

ijuin says:

Re: Re: Re:

To what extent can the government reasonably compel a private party to provide services they don’t wish to provide?

They can compel you on the same grounds that they can draft any random guy into the military and send them to die in combat without consent. Any argument against the federal government being allowed to compel compliance that would stand up in court could also be used as an argument against the validity of conscription.

Anonymous Coward says:

Re: Re:

Karl explains how it would make iPhones (and other devices) less secure if the FBI is able to make Apple do this. It’s a very interesting read.

Screw The FBI: The Model Phone Manufacturers MUST Adopt

http://market-ticker.org/akcs-www?post=231126

So the FBI wants a custom firmware load that will allow:

Any number of password attempts. No “10 wrong and you’re done” auto-wipe.

Any means of entering them. No “must key them on the screen.”

This then means the FBI can attempt to “brute force” the password using a computer over the USB interface and, they demanded, any other means such as Wifi, cellular or Bluetooth!

The latter would mean that for the future they would not even have to physically possess the device. That’s right — they could hack it any time they wanted, from anywhere, at any time.

(snip rest)

Matt (profile) says:

Re: Re:

No, it is not that easy to break AES-256 encrypted data (which, after a cursory google search, it seems Apple uses). If there is a side channel attack available to cracking an iPhone due to a flaw in Apple’s implementation of the protocol then that is a different matter. Apple introducing a backdoor (which, contrary to what the title of this article states is EXACTLY what is being demanded) would be a HUGE flaw in implementation and would render the phone insecure.

Uriel-238 (profile) says:

Re: Where's the NSA

The more specific question a court might be asking is where the crack-teams-with-supercomputers are for when they have the encrypted phone of a dead / uncooperative terrorist (not that many terrorists have encrypted phones, or much data on them).

I’d think the NSA and FBI have access to something meaty and mainframey that could have a brute-force go at an iOS7 iPhone, and then the whole 20-attempts-and-the-phone-bricks software is moot.

The encryption will be less moot, but that’s the real roadblock here.

Also the jurisdiction of the court (is that the right term? IANAL) to turn to an agency (or a company) and say Here, use your big nutcracker to crack this nut.

nasch (profile) says:

Re: Re: Where's the NSA

I’d think the NSA and FBI have access to something meaty and mainframey that could have a brute-force go at an iOS7 iPhone, and then the whole 20-attempts-and-the-phone-bricks software is moot.

How would that be moot? If they start brute forcing it and it wipes itself after 10 tries, all their massively powerful computer will accomplish is to brick the phone faster.

Uriel-238 (profile) says:

Re: Re: Re: They're not playing the ten-guesses game.

Because with a mainframe attack, they’re only examining the file structure and data, not using Apple’s wipe-after-X-fails routine.

The FBI crack software assuredly doesn’t give them only a limited number of attempts, and it assuredly doesn’t wipe the data.

That’s the point of a full phone encrypt. Similarly, Windows account password doesn’t encrypt the drive, so when the police nick your desktop PC, they’ll probably not even boot it, but analyze the (probably unencrypted) files of the drive.

nasch (profile) says:

Re: Re: Re:2 They're not playing the ten-guesses game.

Because with a mainframe attack, they’re only examining the file structure and data, not using Apple’s wipe-after-X-fails routine.

The FBI crack software assuredly doesn’t give them only a limited number of attempts, and it assuredly doesn’t wipe the data.

I’m not sure what you’re getting at. They cannot bypass the phone’s security features without Apple’s help, or they would just do that. They cannot offload the data and crack it outside the phone because the AES-256 encryption is not susceptible to brute force attacks in a useful amount of time with current technology. They must brute force the phone’s access key (not the same as the AES encryption key), which requires the phone hardware. This means they need a way to disable the wipe-after-10 feature.

Uriel-238 (profile) says:

Re: Re: Re:3 They're not playing the ten-guesses game.

The last time that I played the encryption game, depending on security through obscurity is a no-no. You assume that people know how it’s encrypted.

If they can force Apple to bypass the access key and by that break (or bypass) the AES encryption, then that means the encryption is pretty much hobbled. There is a back door (or at least a thin back wall), and people who shouldn’t have access to it do.

Apple shouldn’t have an option here that will help the court. If Apple does have an option here, then that means people shouldn’t be relying on that encryption in the first place. It’s a false product.

If this is the case, say that the pin is used to generate one of a small number of AES keys, then the FBI can just have Apple hand over the algorithm, write their own code and attack the data.

nasch (profile) says:

Re: Re: Re:6 Do you believe in magic, in a young girl's heart...

Anything they could do within the phone’s device-wiping framework they could do outside it.

That is incorrect. The key system of the phone is partly encoded in the phone’s hardware. If they remove the data from the phone, they could no longer access it through the phone’s security system, and would be left with brute forcing the encryption directly. This is not feasible.

KidOmaha (profile) says:

Re: Re: Where's the NSA

I’m not sure why you think that “not that many terrorists have encrypted phones, or much data on them.” The growing problem our intelligence agencies are facing is the prevalence of third party apps that make encrypted communications readily available to terrorists. Further, a court lacks any power to turn to some corporation or government agency and say, “Here, use your big nutcracker to crack this nut.” Finally, the point of this article and of all the comments is that the government DOES NOT have anything readily able to crack the encryption of this iPhone.

nasch (profile) says:

Re: Re: Re: Where's the NSA

The growing problem our intelligence agencies are facing is the prevalence of third party apps that make encrypted communications readily available to terrorists.

Has there been a terrorist attack that was facilitated by encryption? Even one? Has any intelligence agency shown any evidence that any of their investigations have been hampered by encryption?

Anonymous Coward says:

Isn’t one of the major rules in computer forensics basically “Thou shalt not fuck with the actual device”? It’s been a few years since I had anything resembling a data forensics class, but at the time I was under the impression that step 1 was to make a bitstream forensic image of whatever it was you were examining, otherwise you risk contaminating the evidence. Is it no longer possible to do that?

Arthur Moore (profile) says:

Re: Re:

Well yeah, but you run into the same issue that self encrypting drives have. The data in flash memory is encrypted with a huge key which is kept on the encryption chip. There’s no way to read the key without de-encapsulating the chip, and that carries with it significant risk. Those things are meant to be tamper proof after all.

If Apple did this correctly, the encryption chip is a separate component that cannot have its firmware changed. It can even still be in the same package as the CPU. By tightly defining the security element’s inputs and outputs you can create an extremely hard-to-crack system, even if it can’t receive firmware updates.

My fear is that the checks, including the counter for the number of retries, are handled by upgradable firmware. That would mean not only could Apple crack any phone, but the next time a bootloader jailbreak is found, everyone else could too.

Anonymous Coward says:

Re: Digital Forensics

They seem to want to avoid polluting the bitstream too much by ordering it to run in RAM… however, yeah, you’re right, they’re fucking with the device. AFAIK they can’t rip a clone the normal way (over USB or a backup suite) because the device is locked, and a locked, encrypted device that doesn’t want to be copied is basically saying ‘Fuck off, I ain’t telling you shit. 5th amendment and all that, cocksucker!’ to anything outside an NSA or TLA lab specifically equipped to rip data from locked/encrypted devices. I don’t know if digital forensics software can force a mount of the device yet. It seems that either:
The decryption algorithm is locked in hardware so a copy of the data is useless
OR
The government doesn’t want to let loose they CAN rip data anyways from locked/encrypted devices
OR
They can’t actually rip data from the device in any meaningful way

z! (profile) says:

Not an iPhone user here, but is it even possible to load new software while the phone is locked? Usually software update is a user-mode application, thus can’t be started while locked. Off hand, the only alternative that I see is to in-circuit rewrite the OS in the flash memory; if there’s a file system involved (and if there’s any crypto on that) it becomes somewhat difficult.

JBDragon says:

Re: Re:

I think this is an older iPhone running iOS 7, so security isn’t as tight as it got in iOS 8 and newer, where a password is required to do anything. I think in iOS 7 you could still plug your phone into a PC, run iTunes, and install a newer OS that way without having to unlock the phone. I’m not 100% sure on that, but I think that’s the case. So if Apple created a new iOS 7 version, called it say 7.4, with a valid digital signature, and made it so you could brute force the password by disabling the 10-tries-then-wipe feature and the escalating delays between attempts, that could be possible. Install the updated OS, then brute force the unlock. You could of course already just wipe the phone and start over unlocked, but they want the data on it.

Once you get to iOS 8, you need to enter the password to do anything. You can’t update the phone by plugging it into the computer and using iTunes without the phone being unlocked. Apple really tightened up security on the iPhone. Part of the reasoning was that if someone steals your iPhone, they can’t plug it into a PC and wipe it or anything. It’s LOCKED UP and worthless at that point without the password. You’re less likely to get mugged for your phone if people can’t steal it, wipe it and sell it for a couple hundred. If it’s locked up, it’s almost worthless. They could part it out, I guess, but it would fetch so little as to make it not worth mugging people over.

To get people to use the security, Apple made it as simple as possible by adding TouchID and building it into the home button. It’s almost as fast to get onto an iPhone with security on as without it. I use an 8-digit number on my iPhone. Good luck trying to brute force that!

Anonymous Coward says:

Free labor for the fbi now?

So now this sets a precedent that if the Government does not want to pay for its workers or hire the best, it can just get a court order to force some other company to do their work for them.

The judge showed what a fool he is. Another dinosaur making decisions that he has no understanding of.

Kern says:

Re: Free labor for the fbi now?

Don’t forget that not long after the attack the FBI said it had figured out all of what the attackers had done for a day or two except for about 15 minutes, and asked the public to come forward with information to fill in that gap. So how did the FBI do all that legwork so quickly? What doesn’t it know?

Whatever (profile) says:

A couple of things to note here: First off, if they have the phone in theory they can clone all of the memory without wiping it. That would mean they could have an unlimited number of 10 times tries. So the limit doesn’t really exist, it’s just there to safeguard.

Second, the software that enforces that 10 swipe limit could also be disabled on a phone that clones the memory of the locked phone. So yes, it can always be disabled, if Apple so desires. It’s just code, and it can be removed.

Third, love it or hate it, Apple likely does either have a back door or knows the most direct method by which to undo the encryption on their own phones. They created it, they will generally know the answer.

It is quite possible that there is no real way to easily unlock the phone except brute force. Apple may be able to look at the encoded results and perhaps determine a key length or something similar that could narrow the search, but likely they will have to use the same blunt tool the rest of us use, albeit with the advantage of not having to deal with the 10 and fail problem.

I suspect Apple already has a tool, but they aren’t going to tell anyone about it.

Is this a good ruling? Well, it’s not a terrible ruling. Love it or hate it, it is a very good indication that there are reasons that encryption does present certain drawbacks to law enforcement. This is an extreme case, but it does show the block, and shows that in exceptional cases, perhaps options are needed. I do understand that “options” means that hackers can do the same, but is the price worth the result to do otherwise?

Mike Masnick (profile) says:

Re: Re:

Yet again, almost nothing “Whatever” has to say is accurate.

First off, if they have the phone in theory they can clone all of the memory without wiping it. That would mean they could have an unlimited number of 10 times tries. So the limit doesn’t really exist, it’s just there to safeguard.

This is wrong. The encryption key is embedded in the hardware. Clone the memory and try it somewhere else and you’re done.

Second, the software that enforces that 10 swipe limit could also be disabled on a phone that clones the memory of the locked phone. So yes, it can always be disabled, if Apple so desires. It’s just code, and it can be removed.

Again, this is wrong. The key is in the hardware and cannot be cloned.

Third, love it or hate it, Apple likely does either have a back door or knows the most direct method by which to undo the encryption on their own phones. They created it, they will generally know the answer.

Apple has always claimed that they throw away the key. If this is NOT true that would be a HUGE discovery and would destroy a ton of trust in Apple.

Whatever (profile) says:

Re: Re: Re:

Always fun to watch you wave your superiority complex over the crowd. It’s almost awe inspiring, if it wasn’t quite so, how can you say it, amusing?

Apple’s system in theory is all that, you are correct. However, much of it hinges on whether Apple’s system really does generate truly unique codes that cannot be broken, read, or otherwise accessed. There is plenty of debate online on that question. There is also a question regarding the repeatability of the process under which the UID is created. There is the potential that the UID could be gleaned or otherwise determined by repeating the process under which it was created (because random numbers are rarely truly random).

The two real security measures that have to be overcome are the 10-attempt passcode limit and the 5-second delay per failed attempt (there to stop brute force attacks). If there is any way to determine the UID, the rest is a walk in the park. Taking the data out of the phone and putting it into another device with the same UID and no other security would turn this into a short project.

“Apple has always claimed that they throw away the key. If this is NOT true that would be a HUGE discovery and would destroy a ton of trust in Apple.”

Actually, Apple is very careful in how they phrase this. As the secure area is created by others, the potential is that Apple doesn’t have the method or retain the key, but others do. Plenty of chatter online in those areas as well. One would also have to wonder how Apple would deal with governments in places like China on this topic. There appears to be plenty of wiggle room here for Apple to have been telling the truth in concept but perhaps having outs it doesn’t want to talk about.

I actually wonder why Apple would be fighting so hard if they can just simply show that it doesn’t work, that they cannot do it, and brick the phone after a particularly bad attempt. I think they are very concerned that the government already knows the real answer (that it is in fact possible) and Apple is trying very hard to stick by their “it’s impossible” story.

Anonymous Coward says:

Re: Re: Re: Re:

“Actually, Apple is very careful in how they phrase this.”

“There appears to be plenty of wiggle room here for Apple to have been telling the truth in concept but perhaps having outs it doesn’t want to talk about.”

Zero wiggle room; all the legal babble and clever wording in the world won’t change the fact that if they crack it once, then it’s not secure, and no one will trust them again. Especially after they made a big deal about protecting users’ data.

Jim McDonald says:

Re: Re:

I am sorry, but if you read the first 10 amendments of the Constitution, you discover that the intent of the document was to LIMIT the powers granted to the government by the public. The 4th Amendment specifically states that people shall be secure in their papers, possessions, and persons, unless warrants are provided.
The 5th Amendment specifically states that you cannot be compelled to give testimony against yourself. IN ANY CASE.
Passwords and encryption phrases are “Products of the Mind” according to the 11th district Court of Federal Appeals, and thus testimony. If the federal government wants to work at entering the safe without the combination, they are given permission to do so. What the FBI wants here is 2 things.
1. Simple access. No need to focus a lot of resources on this; just pop it into the Faraday cage, plug in this app, and voila, we have it all.
2. The precedent that such an app will not be rendered unusable in the future.
Why did this go to an Admin Law Judge? (Or Magistrate?) Because a real judge would have thrown this out the door based on the 11th district court of appeals ruling.

Cops have a dirty, dangerous, difficult job. It is THEIR CHOICE. Want something easy? Be a burger flipper at McDonalds.

KidOmaha (profile) says:

Re: Re: Re:

I don’t think I disagree with the main point you are trying to make, but I want to point out a couple of things. First, there is no such thing as “the 11th district Court of Federal Appeals.” There are “District Courts,” which are the federal trial courts, and there are “Circuit Courts of Appeal,” the federal appellate courts. There is then the U.S. Supreme Court. The 5th Amendment states as you say it does, but it is not implicated in this matter. No one is prosecuting Apple for a crime, thus it is not refusing to testify against itself. The court’s order was not issued by an administrative law judge but by a federal magistrate judge. Federal Magistrate Judges routinely issue preliminary orders in both civil and criminal cases. I agree that the order is improper, but the fact that it was issued by a magistrate judge versus a District Court Judge is of no consequence.

TruthHurts says:

Re: Re:

The FBI, CIA, NSA, Homeland Security and TSA are already on the terrorist watch list tied for pole position. They only trail behind Congress and the Presidents since 9/11.

9/11 didn’t change the Constitution, nor the bill of rights.

That means every patriot act like law, executive order, kangaroo courts setup post 9/11 are all acts of treason, punishable by death.

When do we start stepping on their necks and balls forcing the arrests that should have already happened?

There are how many millions of us, and only thousands of them, with the military 100% on our side as the honorable veterans swore to uphold the Constitution above all other laws or orders, we’ve got this in the bag.

rorybaust (profile) says:

what error number will this get

If Apple can stop external repairers with an error 53, all they need to do is tell the court that they tried but got an error 54 and, sorry, they need to buy a new device. But really, if the best brains in US justice can’t beat the security, then of course all Apple has to say is no, it can’t be beaten, and the fact they have to ask should already mean it’s pretty robust.

Anonymous Coward says:

This iPhone will self destruct in 5..4..3..2..1..

Apple should add an optional timer and/or counter to the locking settings. A timer should wipe the phone a certain amount of time (set by the user) after the nth wrong passcode guess. And the timer should be started after the nth reboot if the phone hasn’t been unlocked in the interim. And someone really security conscious might just want the timer to count down from the moment the phone is first locked. If you use it every day, it wouldn’t be wiped. If it’s stolen, 36 hours later, poof.
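A rough sketch of one variant of that policy (purely hypothetical; the names and defaults are invented, and nothing like this is claimed to exist in iOS today): wipe after too many bad guesses, or once too much time has passed since the last successful unlock.

    import time

    class WipePolicy:
        """Hypothetical lock-screen policy: wipe after N failed guesses, or once a
        user-set deadline has elapsed since the last successful unlock."""

        def __init__(self, max_attempts=10, deadline_hours=36):
            self.max_attempts = max_attempts
            self.deadline_secs = deadline_hours * 3600
            self.failed = 0
            self.last_unlock = time.time()

        def record_failure(self):
            self.failed += 1

        def record_unlock(self):
            # Daily use keeps pushing the deadline back, so the owner never trips it.
            self.failed = 0
            self.last_unlock = time.time()

        def should_wipe(self, now=None):
            now = time.time() if now is None else now
            too_many_guesses = self.failed >= self.max_attempts
            gone_stale = (now - self.last_unlock) >= self.deadline_secs
            return too_many_guesses or gone_stale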

andy says:

Re: This iPhone will self destruct in 5..4..3..2..1..

“Apple should add an optional timer and or counter to the locking settings. A timer should wipe the phone a certain amount of time (set by user) after the nth wrong passcode guess. “

Uhhh, that’s already IN the iPhone, and the FBI wants Apple to eliminate this feature to make future attempts to unlock phones easier.

Mark Wing (user link) says:

Technically this isn’t as big of a deal as everyone is making it out to be. The whole premise of encryption is that having the data doesn’t help you without the key. Because math.

Erasing the data after n failed password attempts sure doesn’t hurt from a privacy standpoint, but that feature is not what is keeping your data secure.

Politically, this could be a huge deal, since our public policies regarding technology are mostly FUD-driven. Nobody with an IQ higher than room temperature would advocate undermining encryption, but there’s a lot of empty bobble heads putting out a lot of sound bites lately.

Tripod says:

Re: Re:

“Erasing the data after n failed password attempts sure doesn’t hurt from a privacy standpoint, but that feature is not what is keeping your data secure.”

It is an essential security feature when the passcode is only 4 digits. Anyone could work their way through an average of 5000 tries, 10000 at most, in a weekend without that protection.
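Rough numbers behind that point, assuming an attacker could feed guesses automatically and that each attempt costs about 80 ms of on-device key derivation (a commonly cited estimate; treat it as an assumption) with no extra software delays and no wipe:

    SECONDS_PER_GUESS = 0.080  # assumed per-attempt cost of the on-device key derivation

    def worst_case_days(keyspace: int) -> float:
        return keyspace * SECONDS_PER_GUESS / 86_400

    print(f"4-digit PIN        : {worst_case_days(10 ** 4) * 24 * 60:8.1f} minutes")  # ~13 minutes
    print(f"6-digit PIN        : {worst_case_days(10 ** 6):8.1f} days")               # ~0.9 days
    print(f"8-char a-z/0-9 code: {worst_case_days(36 ** 8):8,.0f} days")              # millions of days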

The FBI doesn’t have any way of knowing if that feature is turned on for that phone, but they have to assume it is.

Apple can’t help the FBI with this because they took the security implementation seriously, and also because helping to create a backdoor would significantly damage their brand.

Anonymous Coward says:

Re: Re: Re:

The FBI doesn’t have any way of knowing if that feature is turned on for that phone

According to the writ application they filed in court, while they don’t know whether it’s currently turned on, it was turned on when the county issued him the phone, and it was on at the phone’s last backup. So it’s probably safe to assume that it’s on.

Danny says:

Re: Re: Stupid Judge

Like BitLocker on a laptop or any device with a TPM chip to store the key. There is a small boot partition that asks for the passphrase to decrypt the key that will in turn decipher the main partition where the OS is held.

https://en.wikipedia.org/wiki/BitLocker

There are lots of examples of encrypting the entire OS; they all rely on a similar method: a small boot program outside the OS, either in BIOS or a separate partition, that is used to decrypt the main drive where the main OS is stored.

cob says:

Re: Stupid Judge

You can put a locked iPhone into DFU mode and perform an update, which will erase all data on the phone. The order wants Apple to create a custom OS build that will NOT erase the user data, but leave it encrypted, and remove the ability for the OS to wipe the data if too many password attempts are tried.

AYO says:

Re: Stupid Judge

Nah, in iOS 9 only the user’s data is encrypted, not the OS partition. Which means the logic that escalates the delay between guesses and wipes the phone after 10 incorrect passes can be changed.

However, in order for this to happen Apple would need to sign the software so it can in fact be installed.

So the brute force method is the best option, and thus the demand for Apple to sign and install a version that removes both the delay and the wipe feature.

Source: https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Anonymous Coward says:

“at an Apple facility; if the latter, Apple shall provide the government with remote access”

“The SIF will be coded by Apple with a unique identifier of the phone so that the SIF would only load and execute on the SUBJECT DEVICE”

This is reassuring. Also, I sure hope these maniacs used an insecure password, because I want the FBI to know as much as possible about these criminals without compromising everyone else’s cell phone security in the process.

TKnarr (profile) says:

Re: What Apple Gives, Apple Can Take Away

Up to a point. If I were designing it, I’d have set it up so that the firmware couldn’t be updated until the phone was unlocked by entry of the passcode. That would help close many of the holes exploited to root phones in general, and as a side-effect would prevent what the FBI’s trying to do. Normal firmware upgrades would be happening with the phone already unlocked so it wouldn’t bother normal users, and a phone couldn’t have its firmware forcibly back-leveled to a version that was vulnerable to rooting or a modified recovery image installed.
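A minimal sketch of the policy being suggested (my illustration of the idea only, not a description of how iOS actually behaves):

    class UpdateGate:
        """Hypothetical rule: firmware images are only accepted while the phone is unlocked."""

        def __init__(self):
            self.unlocked = False

        def unlock(self, passcode_ok: bool) -> None:
            self.unlocked = passcode_ok

        def install_firmware(self, image: bytes) -> bool:
            # DFU/recovery restores would go through the same gate, so a locked phone
            # can't be force-fed a modified or back-leveled image.
            if not self.unlocked:
                return False
            # ...verify the signature and version here, then flash...
            return True

    gate = UpdateGate()
    print(gate.install_firmware(b"new image"))  # False: still locked
    gate.unlock(passcode_ok=True)
    print(gate.install_firmware(b"new image"))  # True: owner unlocked it first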

AYO says:

Re: Re: Re:2 so that the firmware couldn't be updated until the phone was unlocked

Only on untrusted devices.

“Trusted computers can sync with your iOS device, create backups, and access your device’s photos, videos, contacts, and other content. These computers remain trusted unless you change which computers you trust or erase your iOS device.”

/var/db/lockdown or %ProgramData% contain a list of trusted devices.

Note: you can only trust a device when your device is unlocked, something the shooters probably did when backing up candy crush to iTunes.

JBDragon says:

Re: Re: What Apple Gives, Apple Can Take Away

On iOS 7 and earlier, you can plug an iPhone into a PC and update the OS without the password, I think, though you still need the password to log into the phone. Or you could wipe the iPhone and then log in with no passcode, but all the data would be gone.

So I think it could be possible for Apple to create an iOS 7.5 with a valid security certificate, where the 10-failures-and-wipe feature is disabled along with the slowdown on entering passcodes, so that brute force could then be used to unlock the phone and get at the data.

With iOS 8 and newer, Apple closed everything up tight. You can’t update without a passcode. You can’t even just wipe the phone and start over. That’s part of the security, so that if someone mugs you and steals your phone, it’s almost worthless, and they’d get only a tiny fraction of what they would have gotten before, if anything. So not even Apple can install some modified version of iOS on an iPhone running iOS 8 or newer. Security is a huge thing for Apple these days.

This whole error 53 thing is part of that. Replace the TouchID button with a third-party part that doesn’t match and you end up with a phone that won’t work.

Anonymous Coward says:

(3) it will ensure that when the FBI submits passcodes to the SUBJECT DEVICE, software running on the device will not purposefully introduce any additional delay between passcode attempts beyond what is incurred by Apple hardware

Industry-standard schemes deliberately stretch the password through many rounds of key derivation, so anyone trying to brute force it must repeat all of that work for every guess to see if the guessed password is correct. This, by design, slows down the decryption process to help protect users with weaker passwords.

Link talks about password hashing, but the same general idea applies to disk encryption.
http://security.stackexchange.com/questions/211/how-to-securely-hash-passwords/31846#31846

Kevin Marquette (profile) says:

What are they looking for that

Is this phone really that important to the case? Can’t they get all the data they need from the CIA? They already have a copy of it, don’t they?

Besides the call logs and text messages stored at the cell carrier, what more are they really hoping to find?

What would they have done before we had cell phones?

Peter (profile) says:

Why does the FBI need to read the phone data in the first place? It is too late to prevent the crime, the perps are dead (no evidence needed for convictions), and the FBI itself has concluded that “They were not directed by [foreign terrorist] groups and were not part of any terrorist cell or network” (https://en.wikipedia.org/wiki/2015_San_Bernardino_attack).

Ryunosuke (profile) says:

hoi, mike, Apple put out a statement...

within the last couple of hours; reported by ABC within the last half hour (as of this post)

— Truncated important bits (i.e. TL;DR)

1) The US govt issued a court order to break our own encryption (more on the court order itself). Let’s have a discussion about encryption NOW.

2) The need for encryption, how everyone uses encryption, and what breaking iOS would mean.

3) Apple is shocked and outraged by San Bernardino and has cooperated to the fullest extent of the law (important bit there)

** Important bits here **

“Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone. Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation.”

and a bit further on…

“The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.”

then goes on to describe the legality of using the All Writs Act for this purpose.

**Full article here**

http://abcnews.go.com/US/apple-opposes-judges-order-unlock-shooters-phone/story?id=36993038

Anonymous Coward says:

This is fucking insanity, talk about murrica’s “never let a good tragedy go to waste”.

If they get someone competent in there they can dump the storage on the phone and brute force it all they want. If the data is soooo valuable they can spend the money on the computing power to truly brute force it… But no, let’s use this “opportunity” to set the legal precedent that companies have to defeat their own encryption on demand.

John Bravo says:

Re: Re:

This thread seems to be suffering some technical confusion. I blame Hollywood for how badly the average person understands encryption, but that’s a rant for another day. There are two different issues at work here. First, people choose weak passwords. Second, it is impossible to brute force a 256-bit key; the computing power to do it in any reasonable time wouldn’t fit in our universe.

Here is how those issues apply here: Apple takes your password (no matter how weak), combines it with a 256-bit key held in a secure co-processor in your phone, and uses the result to encrypt your data. Copying the data and using it on another phone or computer system is useless; your password can only be used on your phone, due to the encryption key from the secure chip.

All the secure chip does is take one input (your password), run the 256 bit encryption on it, then output that result to the OS to decrypt your data. Neither the firmware, nor the OS can get the encryption key from the chip. It would be possible to write a new key to the chip, which is how it got there in the first place, but this would be useless, since combining it with your password would result in the wrong output to decrypt your data.

Obviously, humans are the weak link here; unless someone is really keen on information security, chances are they have a weak password. Apple has two methods to help reduce this vulnerability. First, 10 tries before the phone wipes some data needed to recover your data. Second, the encryption chip has a five-second delay built in, which doesn’t seem like much, but if you want to try a few million different combinations to brute force the password, it will seriously increase the time it takes.

In practical terms, it could still take decades or centuries to brute force a simple password, but Apple isn’t going to just hand out a system vulnerability on the promise that Law Enforcement won’t just give a patch to anyone who “really needs it”. LE can’t keep control of guns or drugs, much less a small easily copied piece of software.
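
A toy model of that “password entangled with a key baked into the hardware” idea; the HMAC construction and DEVICE_SECRET below are stand-ins for illustration, not Apple’s real design:

import hmac, hashlib, os

# Stand-in for the secret burned into the secure chip at the factory.
# On a real device this never leaves the chip; here it is just a variable.
DEVICE_SECRET = os.urandom(32)

def unlock_key(passcode: str) -> bytes:
    # The data key depends on BOTH the passcode and the device secret, so
    # copying the encrypted flash to another machine is useless: without this
    # chip's secret, even the correct passcode derives the wrong key.
    return hmac.new(DEVICE_SECRET, passcode.encode(), hashlib.sha256).digest()

key_on_this_device = unlock_key("1234")
key_on_other_device = hmac.new(os.urandom(32), b"1234", hashlib.sha256).digest()
print(key_on_this_device != key_on_other_device)   # True: same passcode, wrong device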

That One Guy (profile) says:

Re: Re:

That would be the benefit of the doubt interpretation, which I would say is anything but deserved at this point. Me, I figure they simply don’t care what happens once they’ve got what they demanded.

If forcing Apple or another company to introduce a security vulnerability causes significant problems for the company and its customers in the future, that’s not their problem; they got what they were after.

Danny says:

Why this is an impossible request from the court

You mistake an iPhone’s unlock code with the iPhone’s encryption key. the iPhones do typically use a 4-6 digit pin as an unlock code. The user also has the ability to create a full alphanumeric password for the unlock code as well. However, that is simply the code that’s used to unlock the actual full encryption key that is stored within dedicated crypto hardware. Apple uses a dedicated chip to store and process the encryption. They call this the Secure Enclave. The secure enclave stores a full 256-bit AES encryption key.

Within the secure enclave itself, you have the device’s Unique ID (UID). The only place this information is stored is within the secure enclave. It can’t be queried or accessed from any other part of the device or OS. Within the phone’s processor you also have the device’s Group ID (GID). Both of these numbers combine to create 1/2 of the encryption key. These are numbers that are burned into the silicon, aren’t accessible outside of the chips themselves, and aren’t recorded anywhere once they are burned in. Apple doesn’t keep records of these numbers. Since these two different pieces of hardware combine to make 1/2 of the encryption key, you can’t separate the secure enclave from its paired processor.

The second half of the encryption key is generated using a random number generator chip. It creates entropy using the various sensors on the iPhone itself during boot (microphone, accelerometer, camera, etc.) This part of the key is stored within the Secure Enclave as well, where it resides and doesn’t leave. This storage is tamper resistant and can’t be accessed outside of the encryption system. Even if the UID and GID components of the encryption key are compromised on Apple’s end, it still wouldn’t be possible to decrypt an iPhone since that’s only 1/2 of the key.

The secure enclave is part of an overall hardware-based encryption system that completely encrypts all of the user storage. It will only decrypt content if provided with the unlock code. The unlock code itself is entangled with the device’s UID, so all attempts to decrypt the storage must be done on the device itself. You must have all 3 pieces present: the specific secure enclave, the specific processor of the iPhone, and the flash memory that you are trying to decrypt. Basically, you can’t pull the device apart to attack an individual piece of the encryption or get around parts of the encrypted storage process. You can’t run the decryption or the brute forcing of the unlock code in an emulator. It requires that the actual hardware components are present, and it can only be done on the specific device itself.

The secure enclave also has hardware-enforced time delays and key destruction. You can set the phone to wipe the encryption key (and all the data contained on the phone) after 10 failed attempts. If you have the data-wipe turned on, then the secure enclave will nuke the key that it stores after 10 failed attempts, effectively erasing all the data on the device. Whether the device-wipe feature is turned on or not, the secure enclave still has a hardware-enforced delay between attempts at entering the code: attempts 1-4 have no delay, attempt 5 has a delay of 1 minute, attempt 6 has a delay of 5 minutes, attempts 7 and 8 have a delay of 15 minutes, and attempts 9 or more have a delay of 1 hour. This delay is enforced by the secure enclave and cannot be bypassed, even if you completely replace the operating system of the phone itself. If you have a 6-digit PIN, it will take, on average, nearly 6 years to brute force the code. A 4-digit PIN will take almost a year. If you have an alphanumeric password, the amount of time required could extend beyond the heat death of the universe. Key destruction is turned on by default.

Even if you pull the flash storage out of the device, image it, and attempt to get around key destruction that way, it won’t be successful. The key isn’t stored in the flash itself; it’s only stored within the secure enclave, whose storage you can’t remove or image.

On each boot, the secure enclave creates its own temporary encryption key, based on its own UID and a random number generator with proper entropy, which it uses to store the full device encryption key in RAM. Since the encryption key is stored in RAM encrypted, it can’t simply be read out of system memory by snooping the RAM bus.

The only way I can possibly see to potentially unlock the phone without the unlock code is to use an electron microscope to read the encryption key from the secure enclave’s own storage. This would take considerable time and expense (likely millions of dollars and several months) to accomplish. This also assumes that the secure enclave chip itself isn’t built to be resistant to this kind of attack. The chip could be physically designed such that the very act of exposing the silicon to read it with an electron microscope could itself be destructive.

TLDR: Brute forcing the unlock code isn’t at all possible through pretty much any means…reasonable or even unreasonable…maybe…JUST MAYBE…it’s possible through absurdly unreasonable means.
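
Some back-of-envelope arithmetic on the delay schedule described above, plus the “delays removed” case the FBI is asking for (the ~80 ms per try is roughly the key-derivation cost cited in Apple’s iOS security guide, linked elsewhere in this thread; all figures are rough worst cases that try every combination):

HOUR = 3600.0

def with_delays(n_combos: int) -> float:
    # attempts 1-4 free, 5th = 1 min, 6th = 5 min, 7th-8th = 15 min, then 1 hour each
    return 60 + 300 + 2 * 900 + max(0, n_combos - 8) * HOUR

def delays_bypassed(n_combos: int, per_try: float = 0.08) -> float:
    # escalating delays disabled; only the ~80 ms key-derivation cost remains
    return n_combos * per_try

for label, combos in [("4-digit PIN", 10_000), ("6-digit PIN", 1_000_000)]:
    print(f"{label}: ~{with_delays(combos) / HOUR / 24 / 365:.1f} years with delays, "
          f"~{delays_bypassed(combos) / HOUR:.1f} hours if the delays are disabled")

Which is the whole point of the order: with the delays in force, even a short PIN is effectively out of reach; with them gone, it falls in hours.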

Mike Masnick (profile) says:

Re: Why this is an impossible request from the court

This is all good and accurate… except for the fact that it’s an iPhone 5C, which doesn’t have the Secure Enclave.

This point, which I discuss in a post that will be going up soon… does make it clear that the request is impossible for newer iPhones but could still apply to older iPhones, including the one in this case.

What’s not entirely clear is if there is still a key encoded in the hardware of the 5C which may effectively do the same thing. But that seems to be something that no one is quite sure of.

John says:

Re: Why this is an impossible request from the court

Just a question, referring back to the ‘safe’ analogy (inappropriate if we’re talking about decryption, but not if we’re talking about brute force attacks on a passphrase, where the attempts are being slowed down or frozen once too many are made). I have had firmware recovered from locked microprocessors by a company that destroys the bit that essentially pins the lock, which then allows them to download the firmware and create a file that lets me program other microprocessors (for the techies: the recovered code is an image, not decompiled source). Maybe the government should be requesting the location of the circuit trace, or memory location, in the processor or memory that enables the flash erase, and just blow that trace (drill the safe).
Granted, this was done on a microprocessor much smaller and less capable, but maybe brute force is the answer. As is so often the case, insider attacks are easiest :)

jameshogg says:

Who needs copyright law when you can design device-unique combined hardware/softwa-*cough*- Digital Rights Management that makes the copying of flash data exponentially hard? Apple seems to be able to stop the copying process so well with natural scarcity as it is that they don’t even need copyright law! I guess the FBI will be joining forces with the EFF to condemn DRM then.

Anyway, I wouldn’t want to be one of those Apple programmers. One slip-up, one bug, one wrong library, and you face possible federal prosecution for triggering the self-erase function.

What would be hilarious is if they unlock the phone only to find third-party advanced RSA encryption on the phone without any private prime-number keys – the only location being inside the criminal’s mind, memorised using mnemonics. Because after all that is the only way you can guarantee (for now) that nobody will ever crack your messages.

jameshogg says:

Re: Hmmmmm

You’re quite right that the murderer should be subject to the full force of the law.

And “the full force” is not the same thing as an unstoppable force. Also, you’re up against an immovable object by the looks of things.

From the post by Danny above, I highly doubt Apple will be able to do this faster than the FBI can.

morganwick (profile) says:

Re: Hmmmmm

Forgetting that there’s this little thing called the Constitution that explicitly says murderers do have rights at least until they’re prosecuted in a court of law (yes, it still exists as much as the government likes to pretend otherwise), it’s technically impossible to open the phone without opening up a massive vulnerability in everyone’s iPhone (and no, “those smart people in Silicon Valley should be able to find a way” isn’t good enough, again as much as the government likes to think otherwise). What’s so hard to understand here?

Leigh Beadon (profile) says:

Re: Hmmmmm

What’s so hard to understand here?

Well, for me, it’s what exactly you mean by “murderers should have no rights.” Leaving the encryption thing aside momentarily, that statement by itself is either (a) poorly thought through, or (b) quite radical and monstrous.

Are you saying convicted murderers shouldn’t have the rights that protect them from cruel and unusual punishment, for example? What about their ongoing right to due process, including their right to appeal their conviction? Or their right to access the parole process?

What about the right under the 14th Amendment to not be treated unequally based on race, sex or creed? Prisoners retain that right. Should they lose that, so we’re free to punish murderers more or less harshly based on their race? Do they lose their right to medical care? Do disabled prisoners lose their right to accessible prison facilities?

Debate the encryption aspect all you want – but “murderers should have no rights” is a barbaric starting point.

Leigh Beadon (profile) says:

Re: Re: Re: Hmmmmm

Too bad for those persecuted and arrested unjustly by the Government, no?

That’s part of it, but it’s also about those justly arrested and convicted.

I mean look, I’m not overflowing with sympathy for murderers, and there are certain breeds of human monster that make me feel they deserve any horrors they must endure. But it isn’t all about sympathy or what anyone “deserves” — it’s about what our treatment of the guilty does to us. It’s not healthy for a human being to be able to throw someone in a hole to suffer and die, to stand over a starving wretch and feel nothing, to hurl stones with glee and cheer when they draw blood. It’s not healthy for a society to condone those things, or to ignore them. Perhaps one can personally believe that certain people deserve those treatments, because their actions have rendered their humanity forfeit — but there is no way to dish them out without sacrificing your own humanity to do so.

Ninja (profile) says:

Re: Re: Re:2 Hmmmmm

I hear you and agree, my bad for not being clear on my reply. I’m not that good of a human in these regards. I don’t have sympathy towards those monsters and I honestly don’t feel bad when they are lynched or killed in any ‘vengeful’ way. Wrath is a bad feeling. But I’m wrong. I know it and I actively fight this part of me and I wholeheartedly agree with you.

Leigh Beadon (profile) says:

Re: Re: Re:3 Hmmmmm

Yeah but that’s the thing: everyone feels that, and I wasn’t suggesting you’re any different. And that’s precisely why America and so many other modern states recognized that bans on cruel and unusual punishment, and other rights even for criminals, were an important founding principle: because a system can (theoretically) be immune to wrath in a way people can’t. So sure, people can and will all feel that rage, but when a whole community endorses it you end up with children treating “stone the prisoner” (metaphorically or literally) as a game they look forward to. And anyone who can do that, can do it to anyone. Our brains are only concerned with justice at the upper levels; the deeper bits that get numbed to, and then practiced at, harming people are much less discriminating.

Anonymous Coward says:

Re: Hmmmmm

Apple should open this phone and any other similarly scummy murderers’ phones as required.

Doing so makes millions of other people’s phones also insecure.

Murderers should have no secrets — whether dead or alive. They should have no rights.

What about everybody else?

What’s so hard to understand here?

Indeed.

john says:

Re: Re: Hmmmmm

Involuntary servitude or Involuntary slavery is a United States legal and constitutional term for a person laboring against that person’s will to benefit another, under some form of coercion other than the worker’s financial needs. While laboring to benefit another occurs also in the condition of slavery, involuntary servitude does not necessarily connote the complete lack of freedom experienced in chattel slavery; involuntary servitude may also refer to other forms of unfree labor. Involuntary servitude is not dependent upon compensation or its amount.

Decorus says:

Re: Hmmmmm

This isn’t about one terrorists phone.

This is about giving someone the ability to hack every iPhone ever made. Once this exists, the next step will be all phones ever made. If you want it so that anyone in the world can download a program for 5 bucks that can hack your phone and post all your private photos, texts and notes online, then sure, let’s give the FBI what they want.

If you are like me, you don’t want anyone to be able to do what they are asking.

Let’s put it this way: a senator uses an iPhone and has pictures he took with a prostitute. Now that China has paid an FBI agent a million dollars for the program, China can access his phone, copy those pictures and blackmail a senator.

A movie star has an iPhone; now some sleazebag who paid 50 bucks online can copy all of his texts to his gay lover and post them online for the entire world to see.

Karl (profile) says:

Apple's security implementation, from Apple itself

If anyone is interested, here is the explanation of how Apple uses encryption to make their devices secure:

https://www.apple.com/business/docs/iOS_Security_Guide.pdf

This is for iOS 9.0 or later, so I don’t know if it applies to the phone in question.

But everything in this PDF shows that Danny’s comment is correct (and Whatever’s is wrong).

DannyB (profile) says:

The REAL questions we are dancing around

It could be argued that since the device can receive over the air (OTA) updates, that Apple can subvert the device, even if it cannot break the encryption.

Apple could develop (at whose expense?) an OTA update that collects the password the next time the user unlocks the encryption.

But can Apple be compelled to develop this? Even if the government pays, it cannot pay Apple for the lost opportunity cost of diverting resources from developing new products. Money cannot make up for time-to-market.

In this situation, Apple could selectively deploy such an OTA update to a single target. But then this is just the camel’s nose under the tent, or the foot in the door.

The real question we’re dancing around, by suggesting that Apple COULD defeat encryption by expending tremendous resources, is really this:

Should Apple (and everyone else) be LEGALLY BARRED from building a secure product?

Even if there is no such law in writing, the effect becomes just the same: if you can, through tremendous cost and effort, manage to defeat encryption, then you can be required to do so at the government’s mere whim and slightest wish.

Other questions:

Can Apple (and anyone else) be COMPELLED to expend tremendous resources to break into a device? At whose expense? Will all of their expenses be compensated, including a lost market because Apple diverted resources away from product development? Is there some level of cost (exact dollar value please) at which Apple is no longer required to break into a device? What if Apple deliberately engineers a device to ensure that the cost to break in will exceed this threshold?

And finally, a lesson for those concerned with privacy: once your phone has been seized, they may not be able to unlock it, but once the bad guys return it to you, it may have been compromised with software such that your next successful unlock of the device opens it up for them to rummage through, fishing for evidence or manufacturing it.

JoeDetroit (profile) says:

What could possibly be on this phone anyhow?!

Regardless of what is being asked of Apple…
what exactly could be on this phone that is not already known? Is it not true that within the first 24 hours law enforcement already had all communications to & from this device?

Seriously. What am I missing here? Will there be a to-do list or something? Something other than pictures of their kids or their last vacation?

David Taylor says:

As an operating systems and information risk management expert who has worked directly for Dell, Microsoft, and two other Fortune 500 companies, not to mention having visited most of the rest… I cannot even begin to tell you just how serious this problem is right now. As a veteran and concerned citizen, I want to see terrorism stopped. But I agree with Apple on this matter: http://www.apple.com/customer-letter.

KidOmaha (profile) says:

Re: Re:

Ummm… I don’t think the FBI is interested merely in the phone numbers called by that phone or that called that phone. They are interested in the wealth of information we all store on our phones. Identities/locations of possible terrorists, banking information, travel plans etc… The information stored on an iPhone far exceeds just phone numbers.

KingChris says:

US Government INEPT!

Why don’t the FBI and US Government just do a better job? Stop importing terrorism. Millions of Americans must lose their privacy and personal freedom because the retarded U.S. Government continues to give green cards to radicalized terrorists!

And what is Apple supposed to do about it? Had the government had the encryption keys it wants, would it have been able to stop the San Bernardino massacre? NO!

Government can go F***** themselves!

John Digweed says:

Apple doesn’t have to give the Feds anything special. The order says it can be done at an Apple facility. Why can’t they just unlock the phone and give it to the Feds? There is nothing that says they need to provide the Feds with software or anything of the sort, simply a means to unlock the phone, or an unlocked phone itself. It also says they need to provide the government with an estimate of costs. Those costs should include a line item noting that their best programmers will be working on this and not on other projects that would advance Apple’s future, so it comes out very expensive. Apple should take the phone, find a way to unlock it, give it back to the Feds and charge them $50 billion to do it. I doubt the government would ask again unless it felt it was very necessary. Money rules the decision on both sides of the coin; give them a price they can’t pay unless they really need it. If the government would agree to spend $50 billion for an unlocked device, it’s probably important.

Monday (profile) says:

There is no such fallacy that gets around what's going to happen.

This is a really technical request, MEANING whoever helped draft it is obviously a “Nerd” (I’ve always been fond of that title, and it has now become chic), and they know exactly what this means in the short term and, even more troubling, in the long term: access to any phone, anytime, is at hand, and they are trying to help make it happen.

I have had the same mobile phone since January 2002 (the battery lasts a week) and have recently been considering getting something new (Galaxy, iPhone, etc.), although I really don’t need the internet, or to read someone’s texts while I’m doing something else I’m sure is more important. But as I read this post, I was impressed with the ‘10 fails and it’s gone’ feature you’ve mentioned… I did not know that.

Nevertheless, the FBI getting software, hacks, and other little treasures from Apple to get around this will definitely lead to more audacious requests, because it sets precedent, and the FBI won’t really need anything after that, including physical warrants, to access any phone anytime. It also sets precedent for ‘forcing’ other manufacturers to ‘comply’ with the FBI’s ‘requests.’

It is not just a ‘slippery slope.’ The inevitable bench warrants to access phones for anything from watching a .tor to liking Met-Art to finding out why pressure cooker bombs were so popular to learning about the bridges of New York are going to be fair fodder for any request, by any agency… do you think that technology is going to stay locked up in FBI offices and that they will be the only ones to use it?

I hope Apple can get around it, through some law that was written in the eighteenth century, and that they do not comply with the order. It is a disaster waiting to happen.

Kevin says:

While it’s worth noting that the order doesn’t ask Apple to decrypt the data (Apple, and every other known agency, lacks the computing power needed to do so), or to provide the password (which they wouldn’t know unless the guy told them), asking Apple to remove the only safeguard against brute force attacks on a 4/8 digit PIN (easily doable) is tantamount to asking them to allow the device to be easily decrypted by anyone with moderate technical capabilities.
If you had a locker combo with 3 numbers, someone with a lot of time on their hands could break it, but that time constraint makes you comfortable that nobody will. Now imagine there exists a person who can test 1,000 combos a second. To stop such a person, one reasonable option would be to set a maximum number of tries per day, or a maximum number of tries, period, after which the contents of the locker are considered compromised and are destroyed.
Obviously Apple already has a backdoor of sorts, in that they can update their OS and have it run against the same data. My guess is that if someone were dedicated enough, they could probably exploit this at great cost. Since the FBI likely lacks both the time and the resources to do that for such a relatively small concern as investigating this attack, they are moving to have Apple make the change, since Apple already has the OS source code and would know how to go about making it much more cheaply and quickly.
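
The locker analogy in a few lines of Python, as a toy rate limiter (nothing to do with Apple’s real implementation):

class Locker:
    # Toy combination lock that destroys its contents after too many bad guesses.
    def __init__(self, combo: str, max_tries: int = 10):
        self._combo, self._tries_left, self.destroyed = combo, max_tries, False

    def try_combo(self, guess: str) -> bool:
        if self.destroyed:
            return False
        if guess == self._combo:
            return True
        self._tries_left -= 1
        if self._tries_left == 0:
            self.destroyed = True          # contents considered compromised: wiped
        return False

locker = Locker("042")
# Even a 1,000-combo search fails: the locker wipes itself long before reaching "042".
print(any(locker.try_combo(f"{i:03d}") for i in range(1000)))   # False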

Wyldkat (profile) says:

VERY misleading headline (doesn't match article's updates)

Please update your headline so we can share this with others in response to the ongoing dialogue.

With the updates you’ve added to the article, it now reads the OPPOSITE of what the headline actually says. I see what you’re trying to say with the headline (that it’s essentially worse than what the news media are reporting, or at best, semantics regarding methodology), but it’s too long of a headline and most social media truncates it, so every time I post the article as a supportive follow-up, it looks like I’m correcting the info with an urban legends corrective instead of adding more detailed info and a link to the court order.

john says:

Involuntary servitude ???

Isn’t the court ordering Involuntary servitude which is prohibited by the 13th amendment?

Involuntary servitude or Involuntary slavery is a United States legal and constitutional term for a person laboring against that person’s will to benefit another, under some form of coercion other than the worker’s financial needs. While laboring to benefit another occurs also in the condition of slavery, involuntary servitude does not necessarily connote the complete lack of freedom experienced in chattel slavery; involuntary servitude may also refer to other forms of unfree labor. Involuntary servitude is not dependent upon compensation or its amount.

Anonymous Coward says:

Re: Involuntary servitude ???

Yes, it is involuntary servitude. And the Constitution’s prohibition is absolute: it states that involuntary servitude shall not exist within the US. Do you think that will stop the government here any more than it stops them from compelling jury duty, or the draft, or hospital employees to draw blood from DUI suspects, or photographers from attending weddings they’d rather not attend? Heck, every time it snows, the government even compels you to shovel the government’s sidewalk that happens to run in front of your property.

But another interesting thing here is that Apple actually claims to own the specific copy of the software on each of their phones. “This software is licensed, not sold”. A small part of me wants to make that have a negative consequence for them, for once. You claim you don’t sell this software? Then fine, it’s yours; but if you retain ownership that means you can’t just claim you have nothing to do with it anymore.

Jim Gianatsis (user link) says:

Apple Is Being Unreasonable.

Apple is damn stupid. They should have treated this as a 1-time request from the federal government to help solve a terrorist crime, and help keep all people safe. Now Apple could be charged with hindering an investigation and aiding terrorism. The NSA already monitors all world phone and internet communications in the name of national security, so what’s the difference?

Anonymous Coward says:

Re: Apple Is Being Unreasonable.

They should have treated this as a 1-time request from the federal government

This isn’t exactly the first time the government has demanded Apple do something to help them break encryption. And once the software is written, the government will come back to them every single time it wants a phone unlocked. You can’t treat something as a 1-time request unless it only happens 1 time.

KidOmaha (profile) says:

Re: Apple Is Being Unreasonable.

The difference is that, in this case, the government is attempting to force a private company to create code to defeat its own encryption. And it isn’t just for this one case. Creating a backdoor to the encryption would mean that it works on ALL iPhones, not just the one in question. Further, it wouldn’t just be for one phone: if this order stands you can rest assured that the FBI, other law enforcement agencies and other governments will demand the same for other phones. The Manhattan DA admitted in a recent interview that he will demand Apple do the same with respect to 155-160 phones in his possession. Here’s a link to Apple’s open letter to its customers that also explains why your argument must fail: http://www.apple.com/customer-letter/

John Roberts says:

iPhone sales

I’m reasonably confident that for Mr. Cook this is more about future sales of apple products than it is about protecting American rights. TBH I’m not an apple user or fan but just so no one tries to label me a hater, I’m pretty sure samsung, htc, lg, etc… would all take the same stand. Money over rights, the American way.

Anonymous Coward says:

iPhone is not cool anymore !

Apple is insulting my intelligence; it cares more for the safety of my privacy than for the safety of my life?

Apple’s decision to protect their customers’ privacy had also provided unbreakable protection to those ISIS organizations to be able to continue their operations of terrorism without exposures to US law enforcement system

It’s easy for Apple to reject the court’s request based on a righteous business principle.

I want to ask Apple:

Why does Apple want to separate its business interest from its community interest? Business is inseparable from community; how do you separate the two?

What about Apple employees’ civic awareness of public safety, homeland security and the well-being of humanity in general?

I look at my iPhone and I am very sad for all the money that I paid to its maker.

They don’t care about my true concern for life safety; they care more about promoting their unbreakable business image in the name of customer privacy.

nasch (profile) says:

Re: Re:

Apple’s decision to protect their customers’ privacy had also provided unbreakable protection to those ISIS organizations to be able to continue their operations of terrorism without exposures to US law enforcement system

This article may put things in a different perspective: https://www.techdirt.com/articles/20160206/06570933540/senator-john-mccain-weighs-going-dark-debate-insists-that-he-understands-cryptography-better-than-cryptographers.shtml

Michael Corrieri says:

Apple could give the government a hardware solution.

Apple has their head up their backside, as usual. The backdoor could be the connection of a hardware solution, like Periscope was for CPU debugging; a similar idea. A future iPhone could have connection points which allow the hardware to be connected to it and provide the requested backdoor. Without the physical hardware connected to it, and a machine running that hardware with the right encryption, there would be no risk to consumers (unless they get a search warrant). I am anti-terrorist, and pro-security and privacy as well, but I also recognize that a search warrant is the people’s demand of discovery, and must be honored. They could do this in a way that hackers would need to have your phone and the hardware device, with its own encryption keys, to access your data. Simple enough; that’s not an online threat.

MB says:

NSA

Our definition of brute force and the NSA’s are not the same thing. They’ve got shit we’ve never even heard of (if quantum code breakers exist, they’ve got ’em).

I’d argue it’s VERY likely this phone was cracked by the NSA, probably weeks ago. But there are other reasons this is happening:
1) If there is incriminating evidence against somebody alive, the FBI will want an alternative chain of custody so NSA agents aren’t appearing in court. That’s a huge thing, and it happens a lot. The NSA passes tips to the FBI, who go snooping until they find a plausible tack of investigation they COULD have found themselves, then just treat it like they DID find it themselves. It’s legal, and it happens with a lot of evidence that wasn’t obtained legally or whose source they want to keep out of the limelight; sometimes the judge will send the prosecution back to go find a path that will pass muster.
2) This isn’t about this case. The Feds have been strong-arming tech companies to provide them back doors for years (Google immediately rolled over like a whipped dog). I picture the conversation going: “So Apple, what do you think the public will say when we tell them you aren’t cooperating in this terror investigation? This wouldn’t even be an issue if you’d give us a tool to disable the multiple-failed-login bricking.” This is another chess move about encryption.

Anonymous Coward says:

Use the iforgot feature

Stick an activated sim with a data plan in it, then press to enter the password and click Iforgot; as they already have access to his ivlouf they can then log in to the email and click the link to reset the iPhone password, and hey presto, voila, they got in the phone, no warrant needed! Or, as the phone has been identified as being on iOS 7, Google for the password bypass bug, which means you can get access to all the stuff they are after! And again, hey presto, they got their intel! No stupid Apple getting in the way!

nasch (profile) says:

Re: Use the iforgot feature

as they already have access to his ivlouf they can then log in

What is that word supposed to be? And is it really possible to reset the phone’s passcode via email? I know it’s not on Android because the phone PIN and Google account password are totally separate. This would be a major security oversight by Apple.

smaines (profile) says:

This is a non-issue, isn't it?

I am a developer, but am not as familiar with iPhone as with Android.

I believe these two things to be true (please tell me if these are wrong),

* An update must be signed by a secret key secured by Apple
* S/N in HW can be unspoofably interrogated in signed update

If the signed update says “if s/n equals bad-guy-phone open command pipe over wifi and bypass authentication-attempt delays”, why does it matter whether the FBI (or anyone else) has it?

This order should change nothing in either legal precedent or user security, and may be performed with a modicum of low-complexity code.
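
Roughly what I mean, as a sketch (the serial number and function names below are invented; this is the shape of the check, not Apple’s actual update format):

TARGET_SERIAL = "F2LLXXXXXXXX"         # placeholder for the subject device

def apply_update(device_serial: str, signature_valid: bool) -> str:
    if not signature_valid:             # device refuses anything not signed by Apple
        return "rejected: bad signature"
    if device_serial != TARGET_SERIAL:  # the hard-coded check inside the signed payload
        return "installed: behaves exactly like stock iOS"
    return "installed: passcode-retry limits disabled on this one device"

print(apply_update("F2LLXXXXXXXX", True))    # the one targeted phone
print(apply_update("C8PMYYYYYYYY", True))    # any other phone: nothing changes
print(apply_update("F2LLXXXXXXXX", False))   # modified/unsigned copy: rejected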

What am I missing here?

-SM

Anonymous Coward says:

Re: This is a non-issue, isn't it?

What am I missing here?

The next step: being able to order any company to install a modified operating system on any identified device, totally bypassing all protection given by code signing in proprietary software. The company could be required to do anything to help a government agency break into a device, such as installing a key-logger. Note they would not need physical access to the device, just a unique device identifier for the company’s update servers to serve up the modified version of the operating system.

smaines (profile) says:

Re: Re: This is a non-issue, isn't it?

SM>> What am I missing here?

AC> The next step….

Thanks for replying.

Before we get to next steps, are my two technical assumptions (required key, unspoofable s/n), and the specific inference (the code to be provided under the order can only be used for the one device), correct?

-SM

Anonymous Coward says:

Re: Re: Re: This is a non-issue, isn't it?

The only required key is the signing key for software updates, and targeting a single device is a smoke screen as it takes extra code in the update to target a single device. That is once the compromised version of the OS is written, the OS is compromised wherever that version is deployed. Given how deep the spy agencies are in the Internet, they would be able to deploy the compromised version at will.

smaines (profile) says:

Re: Re: Re:2 This is a non-issue, isn't it?

AC> That is once the compromised version of the OS is written, the OS is compromised wherever that version is deployed.

Why is it compromised? That update on your otherwise identical phone will not function, as the serial number check putatively included in the update will fail, won’t it?

smaines (profile) says:

Re: Re: Re:4 This is a non-issue, isn't it?

“The operative word is putative, especially after several hundred warrants have been served on the company to produce a compromised version.”

If you’re suggesting that Apple will weary of protecting its customers and open the floodgates, that is a different discussion. If you are suggesting that the Government will demand Apple covertly install a digital wiretap on a future target, that is also a different discussion.

I am disputing the notion that specific compliance with this order gives the Government a broader capability. Those who say that the Government can change the code appear not to understand code signing, but are numerous in the discussion.

Anonymous Coward says:

Re: Re: Re:5 This is a non-issue, isn't it?

Code signing only states that the code comes from the signer, and is only useful so long as their signing key is not compromised, and it has nothing to do with targeting an actual device. This goes beyond requiring a company to provide information that they have; it is trying to set the precedent that a government can order a company to compromise their own code and then sign it.
Also remember there are only two numbers of interest, one and many: there is only one Earth, but many iPhones that could contain evidence. This would never remain a one-time request, but could rapidly become such a drain on Apple’s resources that they just give the signed code to the FBI.

smaines (profile) says:

Re: Re: Re:6 This is a non-issue, isn't it?

AC> Code signing only states that the code comes from the signer, and is only useful so long as their signing key is not compromised, and it has nothing to do with targeting an actual device.

Code signing is not a mere autograph, if it is not signed, it will not run on any Apple device requiring a valid key, correct? The code targets a single device, the signature enables it, the code cannot be modified and still function. Can you not concede this point?

Compromised keys are a separate discussion, and only Apple’s problem. They lost this round. Build a better phone next time.

Anonymous Coward says:

Re: Re: Re:7 This is a non-issue, isn't it?

If it is signed then, unless Apple writes even more code, it will install and run on any iOS 7 device. Installers do not target a particular machine, although they may check that the software they are upgrading is properly registered.
The real problem is that this could set the precedent that the Government can order any company to WRITE and install compromised code on any machine. Then they use terrorism as the reason for demanding that they be given signed code that they can install on any machine to aid an investigation.
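
To make that concrete, a toy example (HMAC standing in for Apple’s real asymmetric code signing; the point is what the signature does and does not cover):

import hmac, hashlib, os

APPLE_SIGNING_KEY = os.urandom(32)      # stand-in for Apple's private signing key

def sign(update: bytes) -> bytes:
    return hmac.new(APPLE_SIGNING_KEY, update, hashlib.sha256).digest()

def device_accepts(update: bytes, sig: bytes) -> bool:
    # The device only checks that the bytes were signed.  Nothing in the signature
    # says WHICH device may run them; any targeting has to be code inside the
    # signed payload itself, which is why "one phone only" depends entirely on
    # Apple writing, and keeping, that extra check.
    return hmac.compare_digest(sign(update), sig)

update = b"iOS build with passcode-retry limits removed"
sig = sign(update)
print(device_accepts(update, sig))                  # True on every compatible device
print(device_accepts(update + b" tampered", sig))   # False: any modification breaks it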

nasch (profile) says:

Re: Re: Re:10 This is a non-issue, isn't it?

The installer doesn’t target the machine, nor does the signature. The signed code does.

I don’t know a lot about iOS code, I was just trying to make the point that Apple would put something in this update that would make sure it could only run on this one device. Perhaps that is already understood and I didn’t need to mention it.

Anonymous Coward says:

Re: Re: Re:9 This is a non-issue, isn't it?

At issue is not the specific request, but rather the precedent that is set if Apple loses this battle. It would mean that government agencies could use writs to force companies to compromise their software for installation on targeted machines. It is a small step from compromising machines in the government’s possession to targeting the machines of suspected terrorists or criminals.

smaines (profile) says:

Re: Re: Re:10 This is a non-issue, isn't it?

I’m saying that it is not even a legal precedent. This order, once upheld, does not affect the prospects of any future order where a judge says, “open this specific phone”, and does not legally enable further activity, e.g. installing a covert wiretap.

Apple claims that the Government’s request is unprecedented. In opposing the order, it is Apple, not the Government, who seeks to make new law.

Apple lost this round when they tried and failed to build a device secure against themselves. It’s embarrassing. Too bad. Build a more secure phone next time, boys.

Can you imagine a theory under which a safe manufacturer could refuse to help defeat a safe in the shooters’ storage space? “But, then the Government will know how easy it is, and may ask more of us in the future!” The government already knows how easy it is, and yes they might ask more in the future. That’s for a judge to decide then, and having opened this safe today won’t affect it.

KidOmaha (profile) says:

Re: Re: Re:11 This is a non-issue, isn't it?

“Apple claims that the Government’s request is unprecedented. In opposing the order, it is Apple, not the Government, who seeks to make new law.” How so?

“Apple lost this round when they tried and failed to build a device secure against themselves. It’s embarrassing. Too bad. Build a more secure phone next time, boys.” How so? The point is that Apple’s phone IS secure, such that the FBI cannot get info off of it without getting a court order forcing Apple to write a different OS that introduces a backdoor to the encryption. Once Apple writes that OS, after being compelled to do so, the cat will be out of the bag, and what was once a secure system will have been undermined by the court order. That is how it affects future activity.

Please do answer my question regarding your statement that it is Apple that seeks to make new law by opposing the court’s order. I am genuinely interested in your answer.

CalebBoone (profile) says:

Apple Must Open The San Bernardino Terrorist's IPhone.

Dear Ladies and Gentlemen:

The court order which requires Apple to assist law enforcement to open the San Bernardino Terrorism Defendants’ Apple IPhones and other devices is correct and will be affirmed and enforced. Apple is embarrassingly wrong in its idiotic argument that the order jeopardizes Apple customers’ privacy.

Apple’s argument against the court order assumes that allowing law enforcement access to the San Bernardino Defendant’s Apple ‘phone will allow computer hackers worldwide access to all other Apple ‘phones.

That conclusion is based upon another assumption: that American state and federal law enforcement personnel cannot be trusted to maintain Apple’s security outside the specific criminal investigation at hand.

Or, another assumption: that the Apple personnel involved in opening the Apple ‘phone cannot be trusted to maintain Apple’s security outside the specific task of cooperating with the government on this single case.

If Apple cannot trust its own employees, that is Apple’s dilemma: not the American state or federal governments’. If Apple employees are not reliable, trustworthy and ethical, then Apple needs new employees.

If Apple does not trust California state law enforcement personnel in San Bernardino County or within the United States Government, then Apple needs to change its mind.

This entire debate is childish and silly. Apple must obey the court order. Surely Apple has the wherewithal to absolutely guarantee the integrity of the process of opening one telephone to comply with one subpoena.

It is idiotic to conclude that every Apple employee involved in opening this single ‘phone is a thief and a pirate who would leak the password to the world and destroy Apple customers’ privacy worldwide.

If that were true, then Apple’s worldwide security is already in shambles and all customers’ data are already exposed.

It is embarrassing to observe that everyone with an iPad or a Notebook, who can type, now proclaims himself an expert in law, mathematics, engineering, politics, worldwide business and government.

I have watched Tim Cook talk. Bless his heart. He has absolutely no knowledge of anything.

I have no doubt that Apple will be required to open the ‘phone and any other device necessary to this law enforcement investigation.

But not until after every Silicon Valley Hippy has had the opportunity to give a Ted Talk on American Corporate Capitalism and President Dwight David Eisenhower’s Military Industrial Complex.

Have a Dovely.

Sincerely yours,
Caleb Boone.

CalebBoone (profile) says:

Re: Re: Apple Must Open The San Bernardino Terrorist's IPhone.

Dear Gandydancer:

Please see the 5:15 a.m. CT (US) reply which I typed this morning for Nasch below.

I disagree with you.

I admit that I have not read all the Orders issued in this case, but I don’t need to. I don’t need to know the precise procedural path which this case has taken.

All I need to know is the Constitution.

The United States Constitution, Fourth Amendment, and the hundreds of thousands of cases construing it, supports the Order to Apple to disclose or provide the information necessary to open the IPhone, just as if it were a metal key to a door to the Defendant’s apartment and the murder weapon were inside the apartment.

Have a Dovely.

Sincerely yours,
Caleb Boone.

CalebBoone (profile) says:

Re: Re: Apple Must Open The San Bernardino Terrorist's IPhone.

Dear Gandydancer:

I have re-read your comment.

I note that you did not state disagreement with me, as such.

Therefore, I retract the sentence which I typed: “I disagree with you.”

I don’t disagree with you.

Instead, I offer the comment which I typed a few moments ago to further articulate what I typed yesterday.

Have a Dovely.

Sincerely yours,
Caleb Boone.

katie higgins (profile) says:

Re: Apple Must Open The San Bernardino Terrorist's IPhone.

Yeah– just bypass the nerdy technical stuff and assume away– based on “assumptions”. This is exactly the methodology that almost got SOPA passed–

But, speaking strictly about this being a case of “one iPhone”: pretty damn lame, when you stop and THINK about it not being the personal/private phone of the dead terrorist.

I sense that Apple will be garnering stronger support from those who understand both the nature of the demand and the inherent risks of complying.

Speaking for myself only, I appreciate hearing from people who address the core issues with an emphasis on technical knowledge and consideration for unintended consequences.

What does it mean to have the wherewithal to guarantee the integrity of a process, as you claim Apple surely must have? I think Apple customers would hope to see integrity expressed with regard to Apple’s promise to its customers.

CalebBoone (profile) says:

Re: Re: Apple Must Open The San Bernardino Terrorist's IPhone.

Dear Miss Higgins:

I disagree.

The legal issue is very simple.

The courts have ordered Apple to open one ‘phone.

The issue is whether Apple is legally justified in its refusal.

There is no legal or scientific justification for Apple’s refusal.

Apple’s IPhone contains evidence of multiple murders committed by a terrorist.

Apple must open the ‘phone and disgorge the evidence.

There is nothing technologically esoteric about this at all.

It is no different from opening a milk can to find a pistol a murderer hid inside the milk can.

So put away your slide rule and your copy of Hermann Hesse’s “Siddhartha.”

They cannot help you now.

Have a Dovely.

Sincerely yours,
Caleb.

nasch (profile) says:

Re: Re: Re: Apple Must Open The San Bernardino Terrorist's IPhone.

Apple’s IPhone contains evidence of multiple murders committed by a terrorist.

Firstly, it is the county’s iPhone. I don’t know if you meant to indicate Apple’s ownership or just the origin of the phone. Secondly, how do you know what’s on the phone?

It is no different from opening a milk can to find a pistol a murderer hid inside the milk can.

If that is true, why do they need Apple’s help? Can’t the FBI open a milk can?

katie higgins (profile) says:

Re: Re: Re: Apple Must Open The San Bernardino Terrorist's IPhone.

“There is no legal or scientific justification for Apple’s refusal. “

But there is a legal process that will address what I can safely bet you will not be asked to decide.

“There is nothing technologically esoteric about this at all.”

Oh, but there is– to a breathtaking degree–.And, there is something very esoteric about your claiming to know what type of evidence is on this iPhone.

“It is no different from opening a milk can to find a pistol a murderer hid inside the milk can.”

Really? — you buy your milk in a can? Big enough to conceal a pistol?

Expect a knock on your door 😉

smaines (profile) says:

Re: Re: Re:2 Apple Must Open The San Bernardino Terrorist's IPhone.

CB> There is nothing technologically esoteric about this at all.

KH> Oh, but there is– to a breathtaking degree.

Could you please explain for us, in pure technology terms, not in terms of presumed social, legal or business consequences, what makes this technologically esoteric at all, let alone to a degree that should take one’s breath away?

I have asked what there would be in the technical dimensions of performing the order that would contradict the assertion that this is a low-complexity/high-criticality code deliverable.

katie higgins (profile) says:

Re: Re: Re:3 Apple Must Open The San Bernardino Terrorist's IPhone.

@smaines

Your assertion contradicts the dimensions of Apple’s claims regarding the security features specific to their iPhones. Actually, your assertion begs the question: “Why order Apple to perform a ‘low-complexity/high-criticality’ piece of code?”

IF Apple disclosed every detail of the security features they designed, I think it is reasonable to assume that they weren’t serious about security– and their assistance would not be crucial.

So– imo this is a stunning example of esoteric -laced technology– simply because the technical terms for an Apple exclusive design feature are alluded to– imagined, in the context of impossible to breach— by brilliant nerds, whose musings are captivating–.

No one here accepts that it is possible to design an impenetrable security system– or that the design engineer would not be able to reverse or unlock a system he/she designed.

Steve Jobs did not take his genius with him, he built a company that expanded on his very unique perspective on the relationships we develop with our cell phones. There isn’t a technical term to describe the desire for privacy and primacy over what defines us. It is an esoteric concept, and a treasure. The Apple imitators employ technological expertise to capture a market– but the connection Apple has forged with shared core values is not replicable.

I appreciate that esoteric and technical are contradictory terms– but in the combination I have attempted to describe, they hold the key to protecting our most fundamentally human right to privacy.

Remember what happened when Eve took a bite of the apple from the Tree of Knowledge? Metaphorically speaking, in a nearly universally known symbolic context, she attained knowledge of good and evil. This is pure metaphor, symbolic of the realization that our advancing technological capabilities were greatly exceeding our capacity to foresee unintended consequences. Apple’s brand is a promise that its products were not designed by fools.

In all likelihood– any attempt to override or unlock this iPhone will enable the data erasure function. Why? Because that would be the only technological design feature that requires the person who set the security function — to unlock it.

My opinions are both personal and influenced by association with Apple.

Not pushing a case or an argument here– just answering your question.

smaines (profile) says:

Re: Re: Re:4 Apple Must Open The San Bernardino Terrorist's IPhone.

Nothing you’ve said would contradict the assertion that this is a low-complexity/high-criticality code deliverable. Your argument that if it were so, Apple would not be needed suggests that you have no working knowledge of code signing.

Your reply doesn’t speak in any real technical terms, not even simple ones. I’m guessing you are not a developer.

I’m sorry to be blunt, but I think you find this technology breathtakingly esoteric because you have no actual understanding of it.

katie higgins (profile) says:

Re: Re: Re:5 Apple Must Open The San Bernardino Terrorist's IPhone.

Sorry? I am not obligated to disclose my credentials here– nor my sources.

But, though this may appear to be more a product of legal reasoning, or rather of case-supporting strategy, the fact remains that despite the impressive number of top-notch hackers who have either volunteered for this job or reported the ease with which the task of retrieving the data from this iPhone could be performed, the FBI sought the strongest method available to them to coerce Apple to do their bidding.

You think the FBI wants to cripple Apple’s credibility– slash their global market to shreds? Or, maybe the FBI wants to broadcast their frustration over NOT having a master key for yet another lock that secures so much more than they have any right to know.

Do you believe Apple was selected due to their having ostensible means to wage a legal battle? Just the sort of opponent the government would hope to intimidate? I don’t think so.

The premise that this iPhone is encrypted with crucial, life-saving information about the global threat of terrorism is a long shot. However, this iPhone is loaded with all of the buzzwords needed to force an unprecedented overreach of the government into private industry.

During WW2 there were exceptions upheld by the Supreme Court re: First Amendment rights. Any statements that undermined the war effort were seen as potentially life-threatening on a national level; encouraging someone to boycott government-sanctioned work that supported the military, for example, was *a crime*. These loose interpretations were employed because we were at war, and the magnitude of the threat of that war was very strongly ingrained via many unprecedented violations of civil rights for Americans. Obviously this was a temporary wartime overreach by the government, but it has found new expression in our Patriot Act: direct violations of our constitutional rights, enacted under the same mindset that prevailed during WW2.

There is something inherently misguided about defining this iPhone as crucial evidence pertaining to a horrific attack on innocent people, as well as a few bells and whistles sounding in concert with the targeting of Apple for this low-complexity task.

BTW I simply don't have access to the technical language that captures Apple's brand design of the security systems in their iPhones. Your inferring I am simply lacking understanding is amusing. The technical details have yet to be disclosed. Don't hold your breath. 🙂

smaines (profile) says:

Re: Re: Re:6 Apple Must Open The San Bernardino Terrorist's IPhone.

“BTW I simply don't have access to the technical language that captures Apple's brand design of the security systems in their iPhones”

This statement is gibberish. Miss South Carolina-level gibberish. I didn't ask about your credentials; I asked whether you could answer a question. You can't.

“….Your inferring I am simply lacking understanding is amusing”

The preceding gibberish alone suffices to support my inference.

Matt (profile) says:

Re: Re: Re:4 Apple Must Open The San Bernardino Terrorist's IPhone.

“No one here accepts that it is possible to design an impenetrable security system– or that the design engineer would not be able to reverse or unlock a system he/she designed.”

I mean, that's the idea. And if a design engineer were able to unlock a system he/she designed, that would be a TERRIBLE system that was not safe. Also, if Apple DIDN'T disclose the details of their security protocols, that would also be a very bad thing.

Security that is done secretly is bad security because once someone figures out the flaw they can exploit it. Much better is to (as is actually done) publish all the protocols and let everyone figure out the flaws and fix the flaws.

Why should an engineer have access to encrypted data? That makes no sense. Do people believe that when Apple says they can’t access the data, that they are lying and can actually access the data?

katie higgins (profile) says:

Re: Re: Re:5 Apple Must Open The San Bernardino Terrorist's IPhone.

Matt, my understanding is that the secret aspect of Apple's security design is the individual purchaser: both access and the access-denied/erasure response are scripted to the individual's passcode; the encrypted data is tied to it and cannot be retrieved without it, and a back-door attempt would be detected and the data-erasure function initiated.

I have heard that all attempts to retrieve encrypted data from locked/secure iPhones have resulted in loss of data, AND that government nerds believe Apple has signing keys that function like clones of whatever passcode has been used, and would be master keys.

Essentially, the one means that has been attributed to Apple for unlocking a secured iPhone (signing keys that clone passcodes) fits all.

There is no reason to believe that Apple can create a single-phone, one-time-only backdoor, or that they would really be fool enough to give the master key to any third party.

Is this the swim suit competition? I am having wardrobe issues.
Yours truly :-),
Miss South Carolina

Uriel-238 (profile) says:

Re: Re: Re:6 Apple Must Open The San Bernardino Terrorist's IPhone.

I have heard that all attempts to retrieve encrypted data from locked/secure iPhones have resulted in loss of data

I’m pretty sure the phone’s data can be retrieved (albeit salted thoroughly with AES) by dismantling the phone and pulling it directly from the flash memory.

The problem is whether the relevant key data from the Trusted Platform Module can be extracted without triggering its tamper protections.

I do suspect a few tiger teams of engineers are working on how to do a memory pull of that very component.

It, for all its strengths and vulnerabilities, is what is slowing the FBI from doing the hack themselves. And when someone succeeds, they won't need Apple to break its own security protocols.
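For what it's worth, here is a minimal sketch of why a raw flash dump on its own gets investigators nowhere, assuming a generic AES-GCM scheme purely for illustration (not Apple's actual file-encryption design, in which the key is derived inside the secure hardware from the device's unique ID and the passcode and never leaves it):

import os

from cryptography.exceptions import InvalidTag
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# The data-protection key lives in, and never leaves, the secure hardware.
hardware_bound_key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)

plaintext = b"contacts, messages, photos"
flash_dump = AESGCM(hardware_bound_key).encrypt(nonce, plaintext, None)

# Imaging the flash chip yields only ciphertext; a guessed key recovers nothing.
guessed_key = AESGCM.generate_key(bit_length=256)
try:
    AESGCM(guessed_key).decrypt(nonce, flash_dump, None)
except InvalidTag:
    print("Dump is unreadable without the key held inside the secure hardware.")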

KidOmaha (profile) says:

Re: Re: Re: Apple Must Open The San Bernardino Terrorist's IPhone.

Caleb, your continued posts regarding your thoughts on the legality of the court order at issue continue to illustrate the fact that you do not even have a firm grasp on what issues are at play in this matter.

You state, “The legal issue is very simple. The courts have ordered Apple to open one ‘phone. The issue is whether Apple is legally justified in its refusal.”

Incorrect. The court has not ordered Apple to “open one phone.” Read the Order! It requires Apple to invent a new operating system that would provide a backdoor to encryption and make it possible for the FBI to “brute force” its way into the phone at issue.

You further state, “The issue is whether Apple is legally justified in its refusal. There is no legal or scientific justification for Apple’s refusal. Apple’s IPhone contains evidence of multiple murders committed by a terrorist.” All these statements that you present as factual are incorrect! The issue is whether the federal government can compel Apple to create and implement the operating system described above which would provide backdoor access to Apple’s encryption. There are a host of Constitutional arguments that support the notion that the federal government does not possess the power to compel Apple to create the operating system it desires. Among those are 1st Amendment issues regarding compelled speech, 4th Amendment concerns regarding unreasonable search and seizure of Apple intellectual property, public taking of Apple property without just compensation, due process of law concerns etc… The list goes on.

You next state: “Apple’s IPhone contains evidence of multiple murders committed by a terrorist.” That is pure speculation. You have no idea what is contained on the phone. Nonetheless, whether or not it contains evidence of multiple murders committed by terrorists does not change the legal issues discussed above.

“Apple must open the ‘phone and disgorge the evidence.” Incorrect. See legal issues addressed above.

“There is nothing technologically esoteric about this at all.” No, there are pressing legal/constitutional concerns at play that override any “esoteric concerns.”

Finally, you state, “It is no different from opening a milk can to find a pistol a murderer hid inside the milk can.” I have no idea what you are attempting to say here. I assume you mean that opening the phone is like opening a container that contains a murder weapon. For all the reasons cited above, your analysis is very flawed and incorrect.

So put away your slide rule and your copy of Hermann Hesse's "Siddhartha." They cannot help you now. Have a Dovely. Instead, I would suggest that you refrain from asserting your legal theories without significantly more education and/or research, and that you remember to include yourself when you caution that people who have a notebook have become experts in law, etc. Your posts demonstrate that, at least when it comes to the law, you have no idea what you are talking about. Let the big boys who have gone to law school handle these tricky issues for you. Rest your weary head confident in the knowledge that you might be the smartest guy currently in your house.

CalebBoone (profile) says:

Re: Re: Re:2 Apple Must Open The San Bernardino Terrorist's IPhone.

Dear KidOmaha:

I typed my reply to you below, before reading this.

First, I view the order as requiring Apple to do whatever is necessary to open the phone.

If opening a home’s front door at the command of a law enforcement officer holding a court-issued search warrant standing outside the door requires someone sitting down inside the house to stand up and walk to the door and open it with his hand, then the command of that law enforcement officer to open the door implies a command that someone create, assemble and contrive the function of the whole complex array of nerves, blood vessels, bones, sinews, muscles, tendons, skin, etc., which comprise the human body to exist, function, operate and act to open the door.

If there were no other way of opening the door than by the act of a living person inside the house, then that law enforcement officer’s simple spoken command does require all those things, and, implicates the act of Creation, if you will. Surely that is the height of unfathomable scientific complexity.

If that were true, every simple knock-and-announce case would fail because it would require God to create a person.

The novelty and intricacy of computers invites us to explore the complexity of the ordered act of opening an IPhone. However, the intellectual attractiveness of that factual aspect of this legal issue should not cloud our judgment or distract us from the simplicity of the act which has been ordered by the court.

This reminds me of a portion of a children’s song written as a medley from Walt Disney’s “Snow White:”

“Open the door, open the door, cried seven little men;
One at a time they knocked on the door: [knock-knock-knock-knock-knock-knock-knock!]
Open the door, open the door! [etc.]”

All we want the Apple Corporation to do is open the door.

Whatever it takes to open the door can be properly ordered by the court.

The fact that it is a very complicated, computer door, is of no consequence.

It is still a door which must be opened.

If it has to be opened by a copyrighted, trademarked key, invented for the occasion by Alexander Graham Bell, then so be it.

It is still a simple key which is the only device which can actually open the door.

Whenever, however, whoever, whatever must occur to make the key, the court absolutely has the power to command, to accomplish the legitimate ends of federal or state law enforcement.

Now, we may not agree.

You may have a contrary opinion.

If so, I understand that.

And, I consider that our exchange has been a gentlemanly good-natured debate.

Have a Dovely.

Sincerely yours,
Caleb Boone.

KidOmaha (profile) says:

Re: Apple Must Open The San Bernardino Terrorist's IPhone.

Apparently, your concern revealed in your statement, “that everyone with an iPad or a Notebook, who can type, now proclaims himself an expert in law, mathematics, engineering, politics, worldwide business and government” does not apply to your own proclaimed expertise in law, business, government etc… The argument that Apple should trust its employees and various levels of government is misplaced. The court order at issue requires that Apple create a backdoor to encryption. Once that is accomplished, it is not necessarily Apple employees or government officials that Apple needs to be concerned with, it is with the multitudes of hackers that will try to exploit the backdoor that they will now know exists. I would love to know your reasoning that leads you to conclude that “the court order which requires Apple to assist law enforcement to open the San Bernardino Terrorism Defendants’ Apple IPhones and other devices is correct and will be affirmed and enforced.” My opinions regarding the matter come, generally speaking, from practicing law in federal courts for the last 18+ years. From what knowledge base do you arrive at your conclusions?

CalebBoone (profile) says:

Re: Re: Apple Must Open The San Bernardino Terrorist's IPhone.

Dear KidOmaha:

I have read your reply of 4:55 a.m. today.

I am an actively practicing trial lawyer in state and federal courts. I have practiced since 1982. My practice is general, but my cases are concentrated in representing plaintiffs in negligence claims involving personal injury and property damage, and in criminal defense.

In my opinion, Apple should obey a court order to open the IPhone owned by one of the recent San Bernardino terrorists or murderers, who is now deceased.

The issue has prompted many articles which essentially glorify Apple Chairman Tim Cook as a guardian of the Constitution, or a champion of the privacy rights of people worldwide.

I disagree with the thesis of all those articles.

A good example of such articles is one by Steve Petrow, for USA Today. Mr. Petrow wrote the article below on Thursday, February 24, 2016:

http://www.usatoday.com/…/got-hacked-my-mac-while…/80844720/

The article is about Mr. Petrow’s experience of writing and sending anti-government, pro-privacy comments or messages/draft articles on this issue, using a computer during a ‘plane flight. Mr. Petrow used the airline’s on-board or in-flight wireless internet service to send his anti-government, pro-Apple-privacy messages during the flight. At the end of the flight, another passenger on the flight, a stranger, stopped Mr. Petrow and introduced himself. This anonymous passenger told Mr. Petrow he had hacked into all of Mr. Petrow’s in-flight wireless communications and those of other passengers. The hacker discussed his support for Apple’s privacy arguments with Mr. Petrow. That is, the hacker stated he agreed with Apple’s assertion that disclosing its passwords to the government for the San Bernardino investigation would easily allow worldwide hacking into Apple’s customers’ private accounts on an indiscriminate basis by anyone. Afterward, Mr. Petrow reflected upon the poignancy of the hacker’s comments, approving them, and noting how they were vividly illustrated by the hacker’s own act of in-flight hacking of his personal communications.

Mr. Petrow’s article is interesting not for its insight but for its stunning irony and Mr. Petrow’s naiveté.

The hacker, who spoke to Mr. Petrow after the flight, violated the law by obtaining unauthorized access to Mr. Petrow's communications, whether they were being transmitted or were in electronic storage, or in a stored state.

He admitted he hacked into Mr. Petrow's communications or hacked into Mr. Petrow's stored information because he said so, to Mr. Petrow: "I hacked."

Hacking is getting into something you are not supposed to get into. It is a slang word for electronic pilfering or electronic stealing.

Hacking is electronic pick-pocketing.

Intercepting electronic transmissions is a federal crime under 18 U.S.C. 2511.

Obtaining access to stored electronic information is also a federal crime under 18 U.S.C. 2701.

Please see:

U.S. v. Szymuszkiewicz, 622 F.3d 701 (7th Cir. Wisconsin 2010)

and

Shefts v. Petrakis, 2012 WL 4049484 (D.C., C.D., Ill. 2012)

The irony in Mr. Petrow’s failure to understand that he was the victim of a crime emphasizes the fallacy of his argument.

He portrays the anonymous airline-passenger-hacker as an angel in disguise.

He ignorantly overlooks the hacker’s corrupt, evil nature and misinterprets it as virtue.

He receives the hacker’s words as wisdom when in fact they are mendacious.

The hacker’s message, and that of Mr. Petrow’s article, is that we must resist any and all government access to private information even if it would save our lives.

That is, we should be willing to die at the hands of a terrorist to protect that terrorist’s right to privacy in his personal IPhone messages.

The hacker has failed to recognize the horrible evil inconsistency in his position. Mr. Petrow has also failed to perceive that same self-evident inconsistency in his position, which is the same as the hacker’s.

Terrorism or multiple murder is wrong. It is a horrible crime. The government must be able to conduct the proper investigations necessary to detect it, prosecute it, punish it and prevent it. Such investigations further a basic function of government: enforcement of law for the physical protection of the lives of citizens in our representative republic.

We are citizens of that republic. We rely every day on the integrity of our elected officials and civil servants such as policemen, sheriff’s officers, detectives, United States Marshals, F.B.I. Agents, C.I.A. Agents, N.S.A. Agents, Homeland Security Agents, Treasury Agents and a host of countless other state, federal and local law enforcement agents and officers for our protection.

We repose trust in those persons.

We expect them to perform their tasks with honesty and integrity.

We conduct our daily lives based upon the assumption that they will maintain proper security, secrecy and privacy of the extremely sensitive information which they constantly obtain, utilize, review and read.

The premise of Mr. Petrow’s article is that everyone in the categories I have mentioned above is dishonest, opportunistic, evil and untrustworthy.

He implies, in his article, that each and every one of those persons will use every bit of law enforcement investigative information obtained for improper purposes and will disclose it, disseminate it and spread it abroad.

If we believe Mr. Petrow is right, then there is no safety, no protection, no privacy, no security and no integrity in anything and anarchy reigns at every level of local, state and federal government throughout the United States.

I must state that Mr. Petrow is wrong. The anonymous hacker is wrong.

KidOmaha, I believe your conclusion is the same. That is, I believe that you assume that, if allowed by the courts, the activities of law enforcement in retrieving this information from the Apple Phone will be stupid, bungling, insecure, open to hacking from outside, and completely vulnerable to copying and theft by hackers intruding into the law enforcement computers used to perform the IPhone data extraction.

I believe law enforcement is better than that.

I believe that Apple personnel, cooperating wholeheartedly and expertly with law enforcement personnel, will be able to extract the data and no hackers will be able to avail themselves of the programs or passwords which Apple may have to create or assign to facilitate the law enforcement investigation.

They may have to do it inside a fully self-contained vault buried two miles underneath the ground, lined with sixty feet of solid lead on all six sides with solid lead radiating outward sixty feet from all eight corners, but it can be done.

I must clearly state the law in this area. I believe you will agree with me.

A criminal investigation is the investigation of illegal activity by one or more suspected criminals. If probable cause exists, a law enforcement officer may apply to a judge for the issuance of a search warrant to obtain otherwise private, secret information in the possession of one or more suspected criminals.

A search warrant is a detailed written order signed by a judge. It is based upon detailed written or spoken statements, under oath, provided by law enforcement personnel or private citizens, to the judge. The judge hears or reads and analyzes that sworn testimony or written statements before issuing the warrant.

The judge may either issue a warrant or refuse to do so if he believes the information does not establish probable cause.

Probable cause is a clear, articulable suspicion of criminal activity.

If probable cause does not exist, a warrant will not issue, or, evidence obtained without probable cause can be suppressed by a judge after it is obtained. If all evidence obtained against a criminal defendant was obtained in searches conducted without probable cause, or was the “fruit of the poisonous tree” because it was evidence which was obtained as a result of other unlawfully obtained evidence, then a criminal’s conviction will be overturned or nullified and the convicted criminal will be exonerated and he will go free.

A criminal investigation is limited to the case within which it is conducted.

Computer passwords which are obtained and used in that criminal investigation are limited in use to that criminal investigation.

A search warrant issued in one case cannot be used to conduct a search in another case, with the exception that inadvertently discovered evidence of an unrelated crime (crime B) found during the execution of a search warrant issued for suspected crime A may become the premise of a new probable cause affidavit in support of a new search warrant to investigate suspected crime B.

Each time a new warrant is requested, a judge must make a new determination of probable cause, and a new warrant must be issued based upon new facts and new circumstances relevant to that new case.

We all trust law enforcement officers and judges to follow the law and the procedure I have described generally above.

I am not willing to die to further the aims of terrorists and murderers.

The illogic of Mr. Petrow’s thesis is similar to the illogic of the following argument I have read, made recently in many articles by Christian authors against other Christians.

The argument is as follows.

If an armed mass-murderer who is not Christian, holding a room full of hostages in a high school, says to the hostages, “Stand up if you are a Christian and are a follower of Jesus,” and several of those students being held hostage hide and do not stand up, then Christian authors have written that those hostages who do not stand up are, truly, not Christian.

Of course, any high school student who would stand up in such a situation would surely be shot by the armed hostage-taker and killed.

The fallacy in the argument that the reluctant hostages are not true Christians, is obvious. The argument assumes that the motives of the gunman are pure and holy. Of course his motives are evil. Standing up will do only one thing: further his evil motive to commit a murder. Standing up will bring no glory to God or Jesus. Standing up will not spread the Gospel of Jesus. Standing up will only cause senseless loss of life.

Standing up in that scenario would be exactly like Jesus acceding to Satan’s challenge to Jesus that He cast Himself down from a high point, or pinnacle of the temple:

“And he brought him to Jerusalem, and set him on a pinnacle of the temple, and said unto him, If thou be the Son of God, cast thyself down from hence: for it is written, He shall give his angels charge over thee, to keep thee: and in their hands they shall bear thee up, lest at any time thou dash thy foot against a stone. And Jesus answering said unto him, It is said, Thou shalt not tempt the Lord thy God. And when the devil had ended all the temptation, he departed from him for a season.” Luke 4:9-13.

It is just as illogical to accuse the sensible high-schoolers who did not stand up of impiety as it is to accuse sensible citizens who cooperate with legitimate law enforcement investigations of being unpatriotic.

We must not allow the novelty and intricacy of computers or the cachet of Steve Jobs and Tim Cook to cloud our understanding of the basic principles of good law enforcement essential to wholesome peace and safety.

KidOmaha, thank you for reading my response. I realize you may not agree with everything I have typed. Whether you agree or not, I offer my statements in a spirit of good-natured, gentlemanly debate.

Sincerely yours,
Caleb Boone.

nasch (profile) says:

Re: Re: Re: Apple Must Open The San Bernardino Terrorist's IPhone.

We are citizens of that republic. We rely every day on the integrity of our elected officials and civil servants such as policemen, sheriff’s officers, detectives, United States Marshals, F.B.I. Agents, C.I.A. Agents, N.S.A. Agents, Homeland Security Agents, Treasury Agents and a host of countless other state, federal and local law enforcement agents and officers for our protection.

We repose trust in those persons.

We expect them to perform their tasks with honesty and integrity.

We conduct our daily lives based upon the assumption that they will maintain proper security, secrecy and privacy of the extremely sensitive information which they constantly obtain, utilize, review and read.

Have you been following current events… at all? How can you still believe all those people conduct their jobs with honesty, integrity, and proper attention and competence in security?


I believe law enforcement is better than that.

Based on what??

CalebBoone (profile) says:

Re: Re: Apple Must Open The San Bernardino Terrorist's IPhone.

Dear KidOmaha:

I realize my last letter did not contain an analysis of your argument that Apple should not be required to create or devise anything new, such as a new computer program, for the government’s benefit.

I believe your argument is incorrect in this case.

If the thing which is new is, practically-speaking, nothing more than a key to open a door which cannot be opened any other way, then Apple must be compelled to create that key.

This is no different than requiring Google to open email accounts:

Please see: In Re Search of Google Email Accounts, 99 F.Supp.3d 992 (D. Alaska, 2015).

I understand that in the above case, Google was not required to create a list of messages for the government.

However, Google was required to unlock the door to provide the messages to the government.

In the San Bernardino case, the government cannot open the IPhone without destroying the data on the IPhone, because the government does not know how to do that. The government does not have the right non-destructive programs or passwords.

Those programs, sophisticated though they may be, are, therefore, in this case, functionally, nothing more than simple metal keys to open a standard metal deadbolt lock in a standard wooden door.

Law enforcement can use them and protect them and Apple will not be harmed.

Apple must provide them.

Have a Dovely.

Sincerely yours,
Caleb Boone.

CalebBoone (profile) says:

Apple Must Open The San Bernardino Terrorist's IPhone.

Dear Nasch:

In this case a search warrant has been issued, after a proper application. That application was made by a law enforcement officer under oath in writing, in the form of an affidavit. The affidavit was detailed and provided to the judge a comprehensive recitation of facts. Those facts established a reasonably articulable suspicion of criminal activity. Further, they established that a third party, Apple, had possession of information necessary to obtain access to the desired evidence, which is reasonably believed to exist inside the Defendant’s IPhone or electronic device.

Even if Apple and the deceased murderer or deceased terrorist had a reasonable expectation of privacy in the evidence which is sought, the search warrant procedure, conducted by a neutral judge or magistrate, sufficiently protects that expectation of privacy. It is legal to pierce the expectation of privacy so long as the search warrant procedure is followed via application by sworn affidavit and independent judicial consideration of the affidavit by a neutral magistrate. That procedure is constitutionally sufficient and complies with the Fourth Amendment requirement that no warrant shall issue except upon probable cause.

Probable cause has been established and the Fourth Amendment has been satisfied. There is probable cause to believe that there is evidence of murders within the dead terrorist’s IPhone. The Judiciary has been fully involved in the issuance of the warrant against the IPhone and other associated orders to obtain the proper passwords or access keys from Apple, which is the manufacturer of the IPhone and the software which is part of the IPhone. There is nothing more to do.

This is similar to countless cases which have involved search warrants or criminal court orders against a landlord to provide a key to a barn occupied by the Defendant, or a key to a Defendant’s apartment, or physical access to a treehouse used by a Defendant. The landlords or owners of barns or yards may not be guilty of crimes, but they have access to the places where evidence of crimes generated by Defendants may be found because it has been left there by criminal suspects or criminal Defendants.

In this case the evidence is electronic and the access keys have fancy electronic names. In this case there are hundreds of millions or billions of people worldwide who have similar devices. Further, the news story about this case has been broadcast in excruciating detail on the internet. There is a tremendous financial incentive for Apple executives to make public pronouncements via the internet to make it appear they are zealous to protect the widely-perceived privacy rights of their customers. Therefore, those millions or billions of people who are Apple customers worldwide have the luxury of complete immediate electronic access to each and every detail of this case. They can read about this existing legal dispute and they can know that the accused, dead, San Bernardino terrorists or murderer’s IPhone is about to be opened pursuant to a court order, in each minute detail, moment-by-moment.

However, the widely-perceived privacy rights of Apple and its customers do not exist here. This particular legal scenario involves evidence of multiple, gruesome, highly-public murders which are quite reasonably suspected as having been committed by publicly-known and observed terrorists or murderers. Those terrorists or murderers possessed IPhones and at least one of them still exists and probably contains invaluable information about the murders which they committed.

The dead murderers may have some type of former expectation of privacy in the information at issue, but the government has an overriding interest in its disclosure. Apple may have a business expectation of privacy and secrecy in its passwords, software and computers or computer-like devices, but the government has an overriding interest in its disclosure for the limited purposes of this case to prosecute these known terrorists or known murderers. Those interests are legitimate. The privacy of the murderers and Apple has been properly protected and respected in this excruciatingly-drawn-out and painstakingly-antiseptic legal process. Surely the court has ordered that everything which must be strictly safeguarded and protected, will be very carefully protected, by the conduct of the process of opening the IPhone and other associated devices in total secrecy and security under the watchful eyes of the appropriate Apple personnel and law enforcement officers or agents.

That is good enough. Good heavens.

Now, below, I will provide for you an excerpt from a Tenth Circuit Court of Appeals opinion on this point. It contains statements which recognize both sides of this issue, but the decision in the case quoted was in favor of disclosure.

No two cases are exactly alike, but there is overwhelming logic and legal support for the disclosure of all the information which is sought in this case, using the proper security procedures to protect Apple’s business interests in the copyrights and intellectual property (computer programs, passwords, etc.) it owns.

The case I have chosen is: United States v. Perrine, 518 F.3d 1196, 1204-1205 (10th Cir. Kan. 2008). The Kansas Federal District Court opinion by Senior District Judge Monti Belot which was affirmed, can be found on Westlaw at: 2006 WL 1232852

The District Court opinion was not published in the Federal Supplement but of course is readily available on Westlaw.

Following is the excerpt I have chosen from pages 1204 and 1205 of the published Tenth Circuit Opinion. I have not enclosed it in quotation marks:

Every federal court to address this issue has held that subscriber information provided to an internet provider is not protected by the Fourth Amendment’s privacy expectation. See, e.g., Guest v. Leis, 255 F.3d 325, 336 (6th Cir.2001) (holding, in a non-criminal context, that “computer users do not have a legitimate expectation of privacy in their subscriber information because they have conveyed it to another person-the system operator”); United States v. Hambrick, 225 F.3d 656 (4th Cir.2000) (unpublished), affirming United States v. Hambrick, 55 F.Supp.2d 504, 508–09 (W.D.Va.1999) (holding that there was no legitimate expectation of privacy in noncontent customer information provided to an internet service provider by one of its customers); United States v. D’Andrea, 497 F.Supp.2d 117, 120 (D.Mass.2007) (“The Smith line of cases has led federal courts to uniformly conclude that internet users have no reasonable expectation of privacy in their subscriber information, the length of their stored files, and other noncontent data to which service providers must have access.”); Freedman v. America Online, Inc., 412 F.Supp.2d 174, 181 (D.Conn.2005) (“In the cases in which the issue has been considered, courts have universally found that, for purposes of the Fourth Amendment, a subscriber does not maintain a reasonable expectation of privacy with respect to his subscriber information.”); United States v. Sherr, 400 F.Supp.2d 843, 848 (D.Md.2005) (“The courts that have already addressed this issue … uniformly have found that individuals have no Fourth Amendment privacy interest in subscriber information given to an ISP.”); United States v. Cox, 190 F.Supp.2d 330, 332 (N.D.N.Y.2002) (same); United States v. Kennedy, 81 F.Supp.2d 1103, 1110 (D.Kan.2000) (“Defendant’s constitutional rights were not violated when [internet provider] divulged his subscriber information to the government. Defendant has not demonstrated an objectively reasonable legitimate expectation of privacy in his subscriber information.”). Cf. United States v. Forrester, 512 F.3d 500, 510 (9th Cir.2008) (“e-mail and Internet users have no expectation of privacy in the to/from addresses of their messages or the IP addresses of the websites they visit because they should know that this information is provided to and used by Internet *1205 service providers for the specific purpose of directing the routing of information.”); United States v. Lifshitz, 369 F.3d 173, 190 (2d Cir.2004) (“Individuals generally possess a reasonable expectation of privacy in their home computers…. They may not, however, enjoy such an expectation of privacy in transmissions over the Internet or e-mail that have already arrived at the recipient.”).

Please do not misinterpret the “expectation of privacy” language above. That language is significant only if there has been no judicially-authorized search warrant. An expectation of privacy is adequately protected and can be pierced through, if the warrant procedure is followed and probable cause to issue the warrant exists, and a neutral judge issues the warrant. That surely has occurred here.

The only difference between this situation and countless others is that instead of a small kaffeeklatsch of five lawyers discussing this case at a restaurant near the courthouse, we have an international kaffeeklatsch of two billion Apple IPhone subscribers, all members of the new technology middle-class, who have attended high school or college, and all of whom fancy themselves graduates of Harvard Law School.

Magna Cum Laude.

Have a Dovely.

Sincerely yours,
Caleb Boone.

nasch (profile) says:

Re: Apple Must Open The San Bernardino Terrorist's IPhone.

I will assume you have not read very much of the discussion here, and invite you to do so if you would like to understand the difference between producing a key to a barn and what Apple is being told to do. They are not the same.

Also not the same is providing extant subscriber data. I don’t see how that case has any relevance at all. Apple’s own press release addressed the difference, let alone all the coverage of the issue.

CalebBoone (profile) says:

Apple Must Open The San Bernardino Terrorist's IPhone.

Dear Nasch:

I will read the articles and this discussion in full, if I have enough time after hours.

I will provide one short comment for the moment.

I understand that Apple objects in part because it believes that what the court order requires it to do is create, out of whole cloth, something it contends does not exist: a new program, password or unlocking device.

Apple believes that, at most, it should not be required to do anything more than produce what already exists: not do the work of the court system for the courts, or do law enforcement’s work for law enforcement.

(Again, I realize that Apple has many more reasons for its objections than this, but I am just concentrating on this one reason for the moment.)

I believe Apple is clearly wrong. Its objection ignores logic.

Apple created the IPhone. Only Apple can properly open an IPhone and only Apple can engineer a device which completely and perfectly will remove all the data from an IPhone, including all the little scraps, odds and ends, which blunt-force law enforcement techniques would either destroy, lose or never recognize.

Therefore, it is like providing the key to a barn.

The barn is locked and the officer wants to get inside the barn.

He needs a key to do that.

The farmer is the only one who has the key.

The farmer needs to give him the key.

If the farmer has lost the key, the farmer must provide entry to the barn by giving the officer another key, or pointing out a rope which will pull up a sliding wooden door which the officer did not notice, so the officer can go inside through the other, sliding door, without having to use a key (now lost) to the main door.

The farmer did not commit a crime but the farmer has evidence of the crime committed by his farmhand, inside the barn: a gun, or a knife.

The farmer is ordered by the court to open the barn for the officer by whatever means are available.

The farmer may not have to re-program the old wooden barn, but if it were a modern, year-3000 Star-Trek barn and could only be entered using an elaborate computer program, then the farmer would have to rewrite the program, or re-wire the barn, or re-configure a new password, or invent a device which would open the barn, manufacture the device, and give it to the officer, if necessary.

That is what it means to fully comply with a lawful court order.

The farmer built the barn, and the farmer hired the farmhand.

The farmer may have made an error in judgment in hiring the murderous farmhand.

But the farmer must abide the consequences of his poor judgment.

Apple may have exercised poor judgment in selling an IPhone to the terrorist couple.

Apple must now abide the consequences of its decision to sell that IPhone to that terrorist couple.

Of course, it is not a question of poor judgment, but, on the other hand, there is nothing illegal in Apple requiring a criminal, personal or other background check before a prospective customer can purchase an IPhone.

No one has the Constitutional right to buy an IPhone.

Many countries in which people can buy IPhones do not have a Constitution or Bill of Rights anyway.

But of course the Bill of Rights and the Constitution do not apply to this aspect (choice-of-customer) of this private commercial transaction if the restriction on purchasing is legitimate, logical and not otherwise a violation of American (or other countries’) anti-discrimination laws, which it would not be.

An interesting thought.

Have a Dovely.

Sincerely yours,
Caleb Boone.

nasch (profile) says:

Re: Apple Must Open The San Bernardino Terrorist's IPhone.


The farmer may not have to re-program the old wooden barn, but if it were a modern, year-3000 Star-Trek barn and could only be entered using an elaborate computer program, then the farmer would have to rewrite the program, or re-wire the barn, or re-configure a new password, or invent a device which would open the barn, manufacture the device, and give it to the officer, if necessary.

That is the controversy here. You’ve clearly made up your mind, but Apple is going to contest whether the government can make them do this, and we’ll see where it goes.

No one has the Constitutional right to buy an IPhone.

You’re looking at the question backwards. The states and the people reserve all rights not explicitly conferred on the federal government, and the people have the right to do anything not explicitly banned by law. So Apple has the right to sell iPhones and I have the right to buy one, because the government has no right to tell me I cannot.

And… that was a short comment? 😉

katie higgins (profile) says:

Re: Apple Must Open The San Bernardino Terrorist's IPhone.


“Apple must now abide the consequences of its decision to sell that IPhone to that terrorist couple.”

Sorry? Not only have you gone way beyond conjecture to establish *as fact* that this particular iPhone contains evidence that will save lives–You, and most people hearing this fear-mongering hook attached to a goal the FBI has nurtured for years, fail to employ common sense regarding WHAT A TERRORIST IS LIKELY TO STORE ON A DEVICE THAT HE/SHE CANNOT DISPOSE OF (as these two already did with the cell phones they had in their possession on the day of the attack)–

REALLY! Ultimately one must come to grips with the implications of the secrecy and subversive methods terrorists employ. Only a freaking idiot would encrypt every detail of a large-scale terrorist network on a freaking iPhone, which is why every terrorist likely does not have this information. Or perhaps the FBI believes the key to cracking the global terrorist network is on one of their iPhones, the one they leave behind when they stage their attack?

OR perhaps the FBI hopes they can sell this story to obtain what they cannot create — for purposes they are NOT obligated to disclose?

And just maybe, the FBI, CIA and NSA imagine they can only perform their duties IF No One has the ability to secure personal data?

Whatever merit you may find in these more realistic conjectures, Caleb, the fact remains that this case furthers an agenda that many of us have strong reasons to oppose.

BTW, Caleb, there is nothing inherently dangerous about an iPhone– or any cell phone that has security features for the user. However, the vigilance needed to follow up on tips that certain individuals may pose a threat to our safety, and other means long available to these national security oriented organizations that somehow aren’t being employed effectively – all of these matters conveniently fade into the background — don’t they? Maybe not a coincidence.

Criminal background checks required for purchase of an iPhone? You see that as the bottom of the slippery slope Apple is heading towards by refusing to yield to this outrageous court order? Your reasoning is far more frightening than the platform you created to support your argument.

Daniel Gray (profile) says:

Oh PUH-LEAZE

Apple is being a dick. The FBI has a legal court order requiring Apple to break this one phone and give them the data from it. I mean my God how hard would it be to take the phone and get the info off of it and then return the phone and the info and be done with it.

But nooooo, Apple wants to be a dick. Ok, then when the Federal Judge holds the CEO of Apple in contempt and fines Apple 1 million dollars a day until they comply, then what are they going to do?

THEY were the ones demanding that the Feds get a court order to get this info, so they did. And now Apple is refusing to comply even WITH a valid court order.

Apple is going to learn March 22nd what the judge is going to decide and I can assure you that Apple is NOT going to like the end result.

nasch (profile) says:

Re: Oh PUH-LEAZE

I mean my God how hard would it be to take the phone and get the info off of it and then return the phone and the info and be done with it.

They have to update their operating system in ways they never planned for, and make sure to do it in a way that bypasses security features not intended to be bypassed and that doesn’t damage any data. Sounds like it could be tricky.
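A minimal sketch of what that change would buy the FBI, using a hypothetical check_passcode() stand-in for the device's own passcode test: once the ten-try wipe and the escalating delays are gone, a short numeric passcode falls to plain enumeration.

from itertools import product
from string import digits

def check_passcode(guess: str) -> bool:
    # Hypothetical stand-in for the device's own passcode check.
    return guess == "7391"

def brute_force(length: int = 4) -> str | None:
    # With the wipe and the escalating delays removed, every combination can be tried.
    for combo in product(digits, repeat=length):
        guess = "".join(combo)
        if check_passcode(guess):
            return guess
    return None

print(brute_force())  # finds "7391" after at most 10,000 attempts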

Anonymous Coward says:

Re: Oh PUH-LEAZE

Let's look at the general case that this is setting a precedent for. Microsoft and Apple have the potential ability to set key loggers and/or exfiltrate data remotely; they just have to modify their update systems to allow that to be done on targeted phones when presented with a court order. Should the government be able to use the courts to order them to do that?

smaines (profile) says:

Re: Re: Oh PUH-LEAZE

I had a thought about this: the update must be transparent. Once done, the user-in-possession must be able to see and/or hear that this is a seized phone, on the screen and with an audible tone. This can be done even in complying with the present order (where it doesn’t matter).

Whether or not the phone is in the possession of law enforcement, it cannot be made into a wiretap.

Could the Vendor be compelled to enable espionage against persons who don’t enjoy the protection of U.S. privacy law? There is probably settled law to the contrary, but I really don’t know.

Selling a phone in a foreign jurisdiction which would yield to a U.S. federal warrant may violate that jurisdiction’s own privacy laws. I suspect there are settled principles in international law with respect to this, maybe embodied in treaties. Again, I don’t know.

Apple could use the opportunity to narrow the precedent established by the order.

I know what questions to ask to test this off-the-cuff idea.

katie higgins (profile) says:

Re: Re: Apple Must Open The San Bernardino Terrorist's IPhone.

Excellent question-

You refer to a potential ability, one that neither Microsoft nor Apple has indicated any desire to acquire. Nor would either relish the publicity that would surround this relationship with the government.

It is no secret that attempts to solicit this type of relationship have failed. And it is fairly easy to assume that this court order is a convenient, fear-mongering test case to force their compliance.

BUT– can our government order private industry to serve whatever need it is able to sway the public into believing is vital for national security? This is definitely the case to test those waters. But it is also a case for probing other government agencies– other than law enforcement, for comprehensive approaches to the threat of terrorism.

What other options are available to our government in terms of protecting us against the threat of homeland terrorist acts? Somehow the focus has become intelligence gathering, with increasing violations of our Constitutional rights.

Azstec (profile) says:

Remember what Benjamin Franklin wrote

In 1755 Benjamin Franklin wrote “Those who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety.” With a government mandated backdoor we’ll be giving liberty to the bad guys and sacrificing our safety based not on Apple’s “marketing strategy” but on the government’s.

https://articles.azstec.com/encryption-backdoor-battle-government-sues-apple/

Uriel-238 (profile) says:

Re: "He was scum."

The DoJ likes to call lots of people scum when it suits their purposes.

Police detection dogs give false positives more often than they find contraband. In Chicago, some dogs alert falsely 97% of the time, and yet those alerts are still grounds for probable cause.

If we can’t trust them not to abuse detection dogs, how can we trust them not to use this?

We can’t, ergo, the battle for your privacy is fought at this front.

CalebBoone (profile) says:

Apple Must Open The San Bernardino Terrorist's IPhone.

Dear Miss Higgins, Uriel-238, Nasch, M2 MVP, Flip, Matt, Azstec and Smaines:

I think this problem is very easily solved.

Open the telephone and allow law enforcement to have every single bit of data which is inside it.

Benjamin Franklin would have recommended the same.

Have a Dovely.

Sincerely yours,
Caleb Boone.

CalebBoone (profile) says:

Apple Must Open The San Bernardino Terrorist's IPhone.

Dear Nurse Chapel:

“I’ll sit on the warp engines and nurse ’em myself!”

Er, I mean, Scotty.

But yes, you’re right.

“He will prick that annual blister: marriage to deceased wife’s sister.”

— W. S. Gilbert, “Iolanthe.”

I will prick this thrice-weekly blister: “Don’t you dare touch my IPhone, Mister!”

I will yank out the full-breech baby with my bare hands, slathered in olive oil to make the delivery that much smoother!

Have a Dovely.

Sincerely Yours,
CALEB BOONE.

CalebBoone (profile) says:

Apple Must Open The San Bernardino Terrorist's IPhone.

Dear Nasch:

I have been following current events.

There are several dramatic specific news stories about horrible crimes in the United States, which have been widely discussed in the American media. Many people have expressed extremely negative opinions about the integrity of law enforcement personnel at all levels.

However, these are anecdotal. They are truly uncharacteristic of the proven, long-term, national trend demonstrated by nationwide statistics gathered during the last twenty-two years.

The FBI publishes national “Crime in the U. S.” statistics in tabular form.

The following table presents the nationwide trend from 1994 to 2014:

https://www.fbi.gov/about-us/cjis/ucr/crime-in-the-u.s/2014/crime-in-the-u.s.-2014/tables/table-1

The next table presents the FBI’s nationwide “preliminary semiannual estimate” change for January to June, when comparing 2014 to 2015:

https://www.fbi.gov/about-us/cjis/ucr/crime-in-the-u.s/2015/preliminary-semiannual-uniform-crime-report-januaryjune-2015/tables/table-1

The first table demonstrates that crime in absolute numbers has dramatically decreased from 1994 to 2014 in all categories. In some categories it has decreased by about 50%. This is true despite an increase in population of about 23% from roughly 260 million to 320 million. In my opinion, that is stunning.
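(For the record, a quick per-capita check using the round numbers above, i.e. counts down roughly 50% while the population grew from about 260 million to 320 million; on those figures the per-capita rate falls even further than the raw counts do.)

# Per-capita arithmetic using the round numbers cited above; illustrative only.
count_1994, count_2014 = 100.0, 50.0   # index the 1994 count at 100, halved by 2014
pop_1994, pop_2014 = 260e6, 320e6      # approximate US population, 1994 vs. 2014

rate_1994 = count_1994 / pop_1994
rate_2014 = count_2014 / pop_2014

print(f"raw counts down {1 - count_2014 / count_1994:.0%}")       # 50%
print(f"per-capita rate down {1 - rate_2014 / rate_1994:.0%}")    # ~59%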

There was an overall increase of 1.7 percent from 2014 to 2015 when comparing the first six months of both years, and this is shown in the second table. However, this hardly diminishes the awe-inspiring dramatic decrease overall from 1994 to 2014, which is the overwhelming trend shown in the first table.

I could not find any FBI tables for the period from July, 2015 to the present.

These tables prove I am exactly right. Law enforcement has reduced crime dramatically during the last twenty-two years in the United States.

The European Institute for Crime Prevention and Control in Helsinki, Finland, published a study of crime worldwide in 2010, and it may be found at:

http://www.unodc.org/documents/data-and-analysis/Crime-statistics/International_Statistics_on_Crime_and_Justice.pdf

I have not viewed the entire document, but I have scanned the first sixty or more pages. The United States is in the lower one-third on some tables such as one table for murder but the United States ranks in the highest quartile in crimes like assault, rape, theft and burglary.

I believe my statements about American law enforcement are right: our law enforcement agencies have made excellent progress in the last twenty-two years in cutting crime in the United States in half in many categories.

This proves my statement that law enforcement personnel in the United States are trustworthy, reliable, honest and ethical.

They can surely be trusted to preserve the secrecy, security and privacy of Apple’s products, designs, programs, passwords and other trade secrets.

I understand that you may not agree with me.

I respect your right to your opinion, and I realize that it may be completely contrary to mine.

I consider that our exchange is conducted as gentlemen and in the spirit of good-natured debate.

Thank you for your reply.

Have a Dovely.

Sincerely yours,
Caleb Boone.

nasch (profile) says:

Re: Apple Must Open The San Bernardino Terrorist's IPhone.


These tables prove I am exactly right. Law enforcement has reduced crime dramatically during the last twenty-two years in the United States.

You have proven nothing, other than crime has gone down. Those tables provide absolutely no evidence for why crime has gone down, and they certainly say nothing about the honesty, integrity, or competence of law enforcement.

I hope the arguments you make on behalf of your clients are better.

CalebBoone (profile) says:

Apple Must Open The San Bernardino Terrorist's IPhone.

Dear Nasch:

We must disagree.

I believe law enforcement officers are doing their job because I can drive to work and not be run over by a herd of buffalo.

I will take anything that I can get.

I would rather live in peace and safety and wear a double-knit plaid suit than live in terror and dress in haute couture.

I would rather make Apple write a Neiman-Marcus program and live in security than allow the San Bernardino terrorists to call Osama Bin Laden one more time.

Collect.

To buy a dozen cruise missiles.

Have a Dovely.

Sincerely yours,
Caleb Boone.

Uriel-238 (profile) says:

Re: Apple Must Open The San Bernardino Terrorist's IPhone.

I believe law enforcement officers are doing their job because I can drive to work and not be run over by a herd of buffalo.

I'm pretty sure the herds of buffalo would not manifest even if we didn't have our law enforcement.

Were I to assume for the moment that you were being metaphoric, I think crime would sharply drop were law enforcement to disappear, given that their asset forfeiture programs displace more money from innocent civilians than all the burglaries combined.

And the police do love their guns, and shooting people they don’t like, and then not reporting it.

I think the peace and safety you think you live by is a false product, and that the police have demonstrated that they are more interested in attacking people on the presumption of guilt than in seeing justice done. They are, now, part of the problem.

Perhaps you don’t have the capacity to care beyond your own well being, about the rest of America’s pluralist population. Or perhaps you’re just ignorant.

SteveChase says:

I can't believe they let you write.

You sir are a moron lost in the detail to such an extent that you cannot see any part of the picture at large.

The very thing that makes encryption functional is that it cannot be broken without the key, and enabling a brute force free of any time limit renders it useless.

You seem to be aware of this, so one must assume that you are either too stupid to be considered an expert of any kind, or a willing participant in what amounts to a flexing of government muscle.
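The nuance is worth pinning down with rough arithmetic: a full 256-bit key really cannot be brute-forced, but a short passcode can, which is why the retry limits and the wipe are what actually protect the phone. A quick sketch, assuming roughly 80 ms per hardware-bound passcode attempt (an oft-cited ballpark, not a measured figure):

SECONDS_PER_TRY = 0.08  # assumed ballpark for one hardware-bound passcode attempt

def worst_case_hours(keyspace: int) -> float:
    # Time to exhaust the whole keyspace at the assumed rate.
    return keyspace * SECONDS_PER_TRY / 3600

print(f"4-digit PIN : {worst_case_hours(10**4):.2f} hours")    # minutes, in practice
print(f"6-digit PIN : {worst_case_hours(10**6):.2f} hours")    # about a day
print(f"256-bit key : {worst_case_hours(2**256):.2e} hours")   # effectively never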
