Apple Responds To Order To Help Decrypt Phone, As More Details Come To Light

from the both-not-as-bad-and-just-as-bad dept

Update: Please see our more recent article detailing why it appears that this attack could apply to more modern iPhones as well.

Last night, we wrote about a judge’s order commanding Apple to help the FBI effectively decrypt the contents of Syed Farook’s iPhone 5C. Farook, of course, along with his wife, was responsible for the San Bernardino attacks a few months ago. Many of the initial reports about the order suggested that it simply ordered Apple to break the encryption — which made many people scoff. However, as we noted, that was not accurate. Instead, it was ordering something much more specific: that Apple create a special firmware that would disable two distinct security features within iOS — one that would effectively wipe the encrypted contents after 10 failed attempts at entering the unlocking PIN (by throwing away the stored decryption key), and a second that would progressively increase the required delay between repeated PIN attempts.
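As a rough illustration of how those two features interact, consider the sketch below. This is a hypothetical model of the concepts only, not Apple’s actual implementation; the class, constants, and delay schedule are invented for this example.

```python
import time

# Invented delay schedule (seconds) per prior failed attempt; iOS's real
# schedule differs, but the shape is the same: it escalates sharply.
DELAYS = [0, 0, 0, 0, 1, 5, 15, 60, 300, 900]
MAX_ATTEMPTS = 10

class PinGuard:
    """Toy model of the two protections the order asks Apple to disable."""

    def __init__(self, correct_pin: str):
        self._correct_pin = correct_pin
        self._failures = 0
        self._wiped = False

    def try_pin(self, guess: str, sleep=time.sleep) -> bool:
        if self._wiped:
            raise RuntimeError("decryption key destroyed; data unrecoverable")
        # Feature 2: progressively slow down repeated attempts.
        sleep(DELAYS[min(self._failures, len(DELAYS) - 1)])
        if guess == self._correct_pin:
            self._failures = 0
            return True
        self._failures += 1
        # Feature 1: after 10 failures, throw away the stored decryption key.
        if self._failures >= MAX_ATTEMPTS:
            self._wiped = True
        return False
```

The firmware the FBI wants would, in effect, remove both the `sleep` call and the wipe check, leaving nothing but the PIN comparison to brute force.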

Late last night, Apple’s Tim Cook also posted a very direct Message to Our Customers that highlights the importance of strong encryption and why this order is such a problem (some of which we’ll discuss below).

Compromising the security of our personal information can ultimately put our personal safety at risk. That is why encryption has become so important to all of us.

For many years, we have used encryption to protect our customers’ personal data because we believe it’s the only way to keep their information safe. We have even put that data out of our own reach, because we believe the contents of your iPhone are none of our business.

He notes — as I did in my original post — that the FBI is demanding (and the court has agreed) to force Apple to create a backdoor and that creates a number of concerns:

Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.

The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.

Some would argue that building a backdoor for just one iPhone is a simple, clean-cut solution. But it ignores both the basics of digital security and the significance of what the government is demanding in this case.

In today’s digital world, the “key” to an encrypted system is a piece of information that unlocks the data, and it is only as secure as the protections around it. Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge.

The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.

Having spent a bunch of time overnight reading through the details of the DOJ’s original application, as well as a few public discussions by security experts and, especially, Robert Graham’s useful thoughts about the order, I have some further thoughts — including that I think Cook may be slightly overstating his case with his comments, because it’s not actually clear that the backdoor the FBI is asking for would work on most modern iPhones, though it might work on older phones. However, the larger concerns about the order are still very much valid.

  1. One of the concerns I raised last night was probably inaccurate: that this could force Apple into creating a tool that would put many people’s privacy at risk. The order does seem fairly specific to just this phone. Yes, as Cook notes, if Apple is forced to do this and Apple does it successfully, that would open up similar orders for other phones, but the impact of that may be somewhat limited in that it only applies to older phones, and quite possibly older iPhones that have not updated their operating systems.
  2. It does seem clear that if this were a newer iPhone, which includes Apple’s “Secure Enclave” system, this request would likely be impossible to meet. It’s quite interesting to read the details of how Apple’s security now works, where the Secure Enclave basically cuts off this possibility, because the firmware update itself would wipe out the encryption key, effectively making it impossible to decrypt the content. It’s also possible that even on this older phone the order is impossible to satisfy if the operating system was properly updated — in part because Apple may not be able to update the firmware without the passcode being entered, which is the very problem the new firmware is supposed to solve.
  3. I keep seeing people say “why can’t they just copy the contents of the memory and brute force it elsewhere” but that’s not possible with the iPhone, since a part of the key comes from the hardware itself, and there doesn’t appear to be any way to extract it (and Apple does not keep it).
  4. The whole focus seems to be on allowing the FBI to brute force the passcode, which is in the realm of possibility should the two impediments above be removed, as opposed to brute-force cracking the AES encryption itself, which is not currently in the realm of possibility.
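Some back-of-envelope arithmetic shows why that distinction matters. The ~80 ms per attempt figure below is an assumption for illustration (the hardware-entangled key derivation is deliberately slow; actual timing depends on the device), as is the optimistic guessing rate for attacking AES directly.

```python
# Brute forcing a short PIN vs. brute forcing the AES key itself.
SECONDS_PER_TRY = 0.08   # assumed cost of the hardware-entangled key derivation
pin_space = 10 ** 4      # 4-digit numeric passcode: 10,000 possibilities

worst_case_hours = pin_space * SECONDS_PER_TRY / 3600
print(f"4-digit PIN, worst case: {worst_case_hours:.2f} hours")  # roughly 0.22 hours

# Versus attacking the AES key directly: even at a wildly optimistic
# 10^12 guesses per second, a 256-bit keyspace is out of reach.
aes_keyspace = 2 ** 256
years_for_aes = aes_keyspace / 1e12 / (3600 * 24 * 365)
print(f"AES-256 at 10^12 guesses/s: about {years_for_aes:.1e} years")
```

With the wipe and the delays removed, the passcode falls in hours; without the phone’s hardware key, the raw ciphertext doesn’t fall at all.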

Graham disagrees with me over whether this is about decrypting or a backdoor — but to some extent that’s just semantics (and Cook agrees with me that it’s a backdoor). They’re not asking for a systemwide backdoor — and, indeed, it appears this approach wouldn’t work at all with more recent iPhones. However, the reason I focused on a backdoor, rather than direct decryption, is that the way most people were discussing “decryption” indicated that they seemed to think the court was ordering the impossible, which was “crack the keys you don’t have access to.” Instead, they are asking for a backdoor — just a narrow one that can only be used for this phone and would be ineffective against most modern iPhones. And then that backdoor will be used to brute force the passcode, which would then decrypt the content of Farook’s iPhone.

That said, there are still serious concerns. While the DOJ insists that its use of the 18th Century All Writs Act in this case is pretty ordinary and standard, they’re exaggerating in the extreme. Some of the previous examples they discuss do include requirements to use certain machinery in order to execute a warrant, but that’s quite different from ordering Apple to write entirely new software. The DOJ again insists that there are examples of All Writs Act requests in the past that have required software, but it’s notable that when the DOJ says this, it does not immediately cite any cases, but rather says just that sometimes providers have to “write code in order to gather information in response to subpoenas or other process.” There’s a pretty big difference between writing some scraping or search code, which this implies, and creating a special firmware as the DOJ is asking for in this case.

Also, as Cook notes, the unprecedented nature of this is that it’s not at all similar to previous cases, because this would involve actively undermining the security of devices, rather than just helping to gather information that is readily available.

In fact, even more ridiculous is the idea, as laid out in the DOJ’s application for this order, that this will not be burdensome to Apple simply because Apple writes software:

While the order in this case requires Apple to provide modified software, modifying an operating system – writing software code – is not an unreasonable burden for a company that writes software code as part of its regular business.

Uh, yeah, it can be when what you’re asking for is fairly complex and may not even be possible depending on the specifics of the way the security in the iPhone 5C is designed. And, seriously, just saying “Apple writes software, therefore any request for it to write software is not burdensome” is ridiculous on its face.

There’s a separate question, raised by people such as Chris Soghoian, about whether or not this particular use of the All Writs Act to force Apple to use its code signing keys to “sign” this new firmware violates the First Amendment in compelling speech. It will be interesting to see if Apple raises this issue in its inevitable appeal of the order.

In the end, this is both a big deal and potentially not a big deal. It’s a big deal in that after a few previous attempts to use the All Writs Act to force Apple to “decrypt” content on a phone, a court has not only done so, but done so with fairly specific instructions on what Apple has to do to create a very specific bit of software that removes a couple of security features. That raises a bunch of legal questions, which I would imagine Apple will quickly raise in response as well. However, from a technological standpoint, it appears that many of these questions will soon be moot, so long as people have more modern and updated phones. But the bigger concern, as Cook notes, is the precedent here that a court can order, at the behest of the FBI, that a tech company undermine the security of a device. As he notes, once you start down that slippery slope it’s not hard to see where that can lead:

The implications of the government’s demands are chilling. If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.

This particular legal battle is going to get very, very interesting.

Companies: apple


Comments on “Apple Responds To Order To Help Decrypt Phone, As More Details Come To Light”

That One Guy (profile) says:

Not just a promise, a /pinky/ promise

As I understand it the FBI is basically asking Apple to create the equivalent of a key manufacturing device, able to create a key that will allow them to unlock a particular lock. The problem is that once they have the device, absolutely nothing is stopping them from re-purposing it and using it to create other keys, to other locks, other than a laughable promise/assertion that they won’t do so.

If the code can be used to unlock one device, it can be used to do the same to any other device with similar security, and given how utterly pathetic government computer security seems to be I wouldn’t give it so much as a month, max, before someone else got their hands on the code and started using it.

Kenneth Michaels (profile) says:

Re: No Stone Unturned?

The DOJ says: “We will leave no stone unturned as we gather as much information and evidence as possible. These victims and families deserve nothing less.” (NYT link below.) What about the shredded documents that the FBI left behind in the shooters’ apartment?

The DOJ and FBI ignored possible critical evidence in the shooters’ apartment when they released it to the landlord just two days after the shooting.

NYPD Detective Harry Houck said he was “shocked” the FBI released the apartment. Houck told Anderson Cooper on CNN: “This apartment clearly is full of evidence. I don’t see any fingerprint dust on the walls where they went in there and checked for fingerprints for other people that might have been connected to these two. You’ve got documents laying all over the place — you’ve got shredded documents that need to be taken out of there and put together to see what was shredded.” “You have passports, drivers licenses — now you have thousands of fingerprints all over inside this crime scene.”

Yea, the victims’ families should be angry that their loved ones are being used as an excuse to undermine civil liberties.

NYT article:

Anonymous Coward says:

Re: Re: No Stone Unturned?

Yea, the victims’ families should be angry that their loved ones are being used as an excuse to undermine civil liberties.

Just the Families?

I think it is fair to say WE ALL should be a bit miffed about this whole problem. Terrorism has become the wet dream of our TYRANT leaders and 3 letter agencies!

Jason says:

Re: Re: No Stone Unturned?

Why bother combing through the physical premises when you can just dump everything off a phone (because surely the suspects put every last detail of their master plan on their smart phone)? After all, they seem committed to the idea that they shouldn’t need to trouble themselves for a warrant since “it’s just a pen register” or whatever other decades- (or century-) old law is their favorite investigative tool these days, not actual detective work.

Anonymous Coward says:

There is another more problematic issue here

Such as a government entity telling a business how to code and develop their products.

This is essentially tyranny, abuse of power, and reckless to the point that the Judge should be considered to have willfully sullied the bench on which he sits and should be removed, disbarred, and criminally charged! This is no different than a court telling a safe manufacturer that they must use a specific set of locks that will allow a key they must then give to the court to use on said safe.

There are so many things wrong with this “Order” that it is despicable! The fact that this type of stuff will continue is why those idiots in the federal land cases are going to gain traction. All of the government’s branches are beginning to lean into a full assault on “The People” and their “Liberties”. It’s like the US Government is wanting another Civil War.

Richard (profile) says:

Re: There is another more problematic issue here

Such as a government entity telling a business how to code and develop their products.

Actually that issue is not raised by this order.

The reason is simple.

If it is possible for Apple to comply with this order in respect of the particular device – then it is also possible for someone else to do it – in other words this particular cat is already out of the bag.

On the other hand if Apple cannot do it then they cannot comply.

What would be problematic would be for the government to order Apple to make sure that its future phones are crackable – in which case the issue that you raise would be valid – and very troubling – however it doesn’t seem that that is what is being requested here.

Anonymous Coward says:

Re: Re: There is another more problematic issue here

The builder (manufacturer) of a home (phone) sells the house (phone) to someone. Later the government demands that the builder (manufacturer) bypass the security of the home (phone) so they can enter it.

How is that even legal?

Sure the government could hire a locksmith (developer/cryptographer) and pay them to break through the security.

But to demand that someone perform the work for them, even with compensation, just seems like something that should not be legal in a free society.

Anonymous Coward says:

Re: Re: There is another more problematic issue here

If it is possible for Apple to comply with this order in respect of the particular device – then it is also possible for someone else to do it – in other words this particular cat is already out of the bag.

No, because the program has to be digitally signed by Apple in order to run on the phone. Unless they give away their digital signature key (which would pretty much render any security on their phones useless, because anyone could just update the phone with whatever code they wanted) they are, in fact, the only ones who can do this.
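The signing requirement that comment describes can be sketched as below. HMAC is used here as a simplified symmetric stand-in; real iOS update signing is asymmetric (Apple signs with a private key, the phone verifies with a baked-in public key), and the key value is obviously invented.

```python
import hashlib
import hmac

# Hypothetical stand-in for Apple's signing key; in reality it never leaves Apple.
APPLE_SIGNING_KEY = b"held-only-by-apple"

def sign_firmware(firmware: bytes, key: bytes) -> bytes:
    """Produce a signature over a firmware image with the given key."""
    return hmac.new(key, firmware, hashlib.sha256).digest()

def phone_will_install(firmware: bytes, signature: bytes) -> bool:
    # The device refuses any image whose signature doesn't verify against
    # the key it trusts, which is why only Apple can produce this firmware.
    expected = hmac.new(APPLE_SIGNING_KEY, firmware, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

legit = sign_firmware(b"ios-update", APPLE_SIGNING_KEY)
forged = sign_firmware(b"ios-update", b"fbi-guess")  # attacker lacks the real key
```

This is why the FBI needs Apple specifically: anyone can write the code, but only the holder of the signing key can make the phone accept it.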

Anonymous Coward says:

This reminds me of how in science fiction movies/shows people with super speed can crack electronic combination keypads to high security places by simply trying a thousand passcodes a second. There are so many problems with that.

A: Even if you are fast enough to try a thousand passwords a second the hardware probably couldn’t withstand you pushing so many buttons in so little time. Chances are you’ll physically break something.

B: Any half decent security system will introduce a delay between successive password attempts after you failed to enter the correct password three times or so.

C: Any half decent security system will then proceed to disable you from attempting any more passwords and alert the administrator of your many failed attempts.

But those science fiction movies never consider these things.

DannyB (profile) says:

Re: Re: Re:

I realize that you are making fun of the government for having abysmally poor security.

This kind of incompetence is not reserved to government.

AT&T for example, had a web site that would show you your information. If you increased or decreased a number in the address bar by typing over it, you could see another customer’s information. In fact, you could easily get information for hundreds or thousands of customers.

If you were to tell anyone of this bad security — then YOU ARE THE CRIMINAL!!!

Anonymous Coward says:

Metaphors equating encrypted digital data to physical objects (i.e., accessing digital data == opening a safe) seem to fail at a very low level. But, because metaphors are so useful in understanding complex subjects, this action might be described as a judge ordering a doctor to reanimate a corpse in order to further question it.

Anonymous Coward says:

If there is a security problem here, the problem isn’t that the court is asking Apple to create a backdoor to existing hardware. If there is a ‘back door’ or vulnerability (that is, an ability for Apple to exploit a weakness), then said backdoor was already there for Apple to exploit. The problem is that said vulnerability was already there to begin with.

Bluedarky says:

Re: Re:

There are a few issues with your claim. First, Apple’s options are expanded by having physical access to the device; it’s entirely plausible that with physical access Apple could force a device update.

Secondly, this isn’t about using it to brute force one iPhone. It’s that if this is forced to happen, the government could start using the same legal tools to force developers to include backdoors in all their software “in the name of national security,” and would have a precedent that could force this through.

Anonymous Coward says:

Re: Re: Re:

There is a difference between requiring manufacturers to include a backdoor that enables decryption prior to encryption and requiring manufacturers to create a ‘backdoor’ after the fact. If the device is already encrypted any ‘backdoor’ that’s been created after that is not really a back door it’s exploiting an existing vulnerability.

Anonymous Coward says:

Re: Re: Re:2 Re:

And let’s not forget the definition of a backdoor, to distinguish it from a vulnerability.

A vulnerability is a security weakness that wasn’t necessarily put there intentionally.

A backdoor is a vulnerability that was intentionally put there prior to deployment so that it can later be exploited.

The government is not asking Apple to create a backdoor. They are either asking Apple to exploit an existing vulnerability or to exploit an existing backdoor (an already existing way that enables Apple to weaken a security feature). Apple can’t create a backdoor after deployment; they can only exploit an existing vulnerability or backdoor.

nasch (profile) says:

Re: Re: Re:3 Re:

Apple can’t create a backdoor after deployment

If they replace the firmware and/or OS with a new version, isn’t that creating a new backdoor? You could say that the fact that it’s possible to do is an existing vulnerability, but I don’t see how that makes this any less a new backdoor installed on the phone after release.

Anonymous Coward says:

Re: Re: Re:4 Re:

The possibility to do this in a way that weakens security after deployment, after something has been encrypted and ‘secured’, is either an existing backdoor or an existing vulnerability. It’s not a new backdoor. A backdoor refers to a weakness that the manufacturer intentionally placed there prior to deployment intended to enable the manufacturer to more easily snoop on your data.

Anonymous Coward says:

Re: Re: Re:3 Re:

A vulnerability that requires physical possession of the phone by an adverse party AND requires Apple itself to custom-build software to target your phone isn’t the type of vulnerability most people are concerned about, though. (If you’re worried about Apple accessing your phone, then you can’t ever install any updates from them – and if you never install updates, then you possibly aren’t fixing other, more important, vulnerabilities.) And I think I read somewhere here that they’ve corrected even this vulnerability in future phones.

Anonymous Coward says:

Re: Re: Re:4 Re:

“A vulnerability that requires physical possession of the phone by an adverse party AND requires Apple itself to custom-build software to target your phone isn’t the type of vulnerability most people are concerned about, though.”

and that point can be argued separately. But to claim that this is bad because it’s a backdoor distracts from the other points that could be argued and ends up just discrediting the source that’s conflating a back door with something that’s not a backdoor. A backdoor has a specific meaning and we shouldn’t misuse it just because of headlines.

kallethen says:

I very much applaud Cook for not just giving ‘privacy’ reasons, but highlighting how it’s a security problem to create a backdoor. Too often, it seems the argument is focused on privacy vs. national security and leaves private security out of it.

While I’m not a fan of Apple’s walled garden approach to their products, I have huge respect for them taking a stance for private security and it’s certainly something I’ll keep in mind when I’m eventually shopping for a new phone.

Anonymous Coward says:

It all starts with just one...

It will not be long before it is “urgently” needed again for another shooter or terrorist. After a while it will be child molesters. After that they will start to complain that Apple refuses to play ball with iPhone 6 and demand they write new software for that. When that isn’t possible, some judge who doesn’t know a smartphone from a laptop will expand it to demand that Apple and others remove those features that make it impossible to rewrite the software.
I don’t fancy Apple that much, but I value the stand they take here immensely, because if they give in, the escalating effect is sure to come to the detriment of us all.

DannyB (profile) says:

Re: It all starts with just one...

On a serious note, I don’t fancy Apple either. (Although I once was a card carrying fanboy of the old Apple in the 80’s and 90’s.)

I am inclined to believe that it is already too late. The escalating effect is sure to come to the detriment of us all, as you say, whether Apple wins or loses this one. It will come up again. This battle already happened in the 90’s, with the Clipper chip and mandatory weak encryption with a government key. Now the government wants more. Much more.

Sorry to sound like a pessimist. But I think I am actually a realist.

Anonymous Coward says:

On the one hand, if Apple gives “a file” to the FBI that they can install on the subject device, that file *will* end up in the hands of the NSA, be decompiled, analyzed, and repurposed.

On the other hand, if Apple insists they have to have the device in-shop to do the install, the courts will insist they keep on doing this until hell freezes over. “You did it once, it can’t be burdensome to do it again.”

On the gripping hand, if there’s a flaw with the implementation of the fix that causes the device to be wiped anyway, well, “oops, these things happen in software development. Sorry.” And “no, we HAD to test it on the target device and none other.” And maybe, “so get YOUR software development experts to prove this was deliberate, if you feel so strongly about it. We’ll see you in court. Again.”

Citizen says:

I don’t see what the controversy is here. Apple can’t be compelled to turn over the modified version of the firmware, but only to use it themselves to unlock the phone. They could then provide all of the extracted information from the phone, and restore the previous version of the firmware before returning the phone. That firmware version would be kept entirely under Apple’s control. That would ensure, (as long as Apple adheres to their current philosophy), that a fully vetted warrant would be needed for any subsequent use of that version. It would be the challenge of another watchdog to ensure that Apple doesn’t abuse that authority.

Anonymous Coward says:

Re: Re:

The controversy is threefold:

‘We want you to fork your entire OS to change an underlying security feature, run it through all your normal processes, force the phone to update to it, and then dump the fork.’

‘Also, we’re not going to pay you for this.’

‘Also, once we’ve done this once, we’ll come back again and again, and eventually try to force you to make this a standard feature in your OS. After all, won’t that be easier?’

Anonymous Coward says:

Re: Re:

I believe what the FBI is asking Apple to do is create a weak version of iOS, install it on a similar phone, then swap firmware chips between the two devices. This would allow the FBI’s phone to be booted and brute forced. If the phone has a good password, the FBI will likely not be successful.

For the FBI to brute force the phone, Apple has to give it to them with the compromised firmware in place and once it’s out of Apple’s control, all bets are off.

DannyB (profile) says:

Re: Re:

EVEN IF what you describe were true, and it would not be. The FIB would grab that new firmware at gunpoint under color of law the first chance it gets.

But even if, then this simply becomes the camel’s nose under the tent. Or the foot in the door.

All law enforcement agencies will now want a revolving door into Apple for an endless stream of ‘break into this iPhone’ demands.

Of course, with the newest phone hardware this is simply impossible. What makes this case more interesting is that this is an older iPhone.

Anonymous Coward says:

Re: Re: Re:

I think law enforcement just wants to return to the days of iPhone 4 where Apple could (and did) brute force phones for law enforcement.

I hope Apple sticks to their guns, but I think in the end Apple will lose. If the FBI can’t force Apple to comply, new telecom rules will come into place that bans wireless operators from activating devices that can’t be decrypted by law enforcement. Telecom companies are notoriously friendly to the feds.

Lisa says:

Re: Re:

That’s not entirely accurate. The Order does require that the firmware be turned over to the FBI. In addition, the backdoor is created by the mere existence of the firmware, which does not currently exist. As long as it has not been created, there is no risk, which is what Apple has promised its customers: that even the company can’t bypass the encryption. Once they create it, that no longer holds true. Not to mention the fact that there would then be one or more people running around who know exactly how to do it.

Ben (profile) says:

Two sides to this for Apple

they are asking for a backdoor — just a narrow one that can only be used for this phone and would be ineffective against most modern iPhones

Which would imply that if Apple seriously considers this everyone will be dumping their old iPhones for the “modern iPhones” (i.e. more sales!) — if they trust Apple at all after this (i.e. fewer sales!).

As for brute forcing:
I would think they could copy all the data off the existing phone, encrypted though it may be, just in case they “accidentally” cause a data wipe; whether they could copy that (encrypted) data back may be an issue, but it would seem to be a way to start.

DannyB (profile) says:

Re: Two sides to this for Apple

If you read the link about how iOS security works, at least in the latest phones, newer than this present case, a data wipe simply consists of the Secure Enclave destroying the decryption key. No need to wipe anything. It’s all just a bunch of encrypted data that statistically resembles random noise. Good luck decrypting it without the key.
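The crypto-erase idea that comment describes can be shown with a toy sketch. The XOR keystream below is an invented stand-in for the AES the device actually uses, and all names are hypothetical; the point is only that destroying the key is equivalent to destroying the data.

```python
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    """Derive n bytes of keystream from the key (toy counter-mode construction)."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt  # XOR stream ciphers are their own inverse

key = b"device-key"
ciphertext = encrypt(key, b"contacts, photos, messages")
key = None  # the "wipe": the Secure Enclave discards the key, and without it
            # the ciphertext is statistically indistinguishable from noise
```

No amount of copying the flash storage helps after that; the only thing worth attacking was the key, and it’s gone.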

What the FBI wants here is to get hold of the key. But wouldn’t it be easier to simply torture the defendant? It seems the court could simply order that, it is more likely to be effective, and would cost significantly less. And it is in accordance with the values of the Untied Police States of America.

JBDragon says:

Re: Re: Two sides to this for Apple

The Government can already get a court-issued warrant and make you unlock your phone. If you don’t do it you get thrown in jail. It’s been done in the past. In this case, the TERRORISTS were killed!!! So they can’t get anyone to unlock it. Why are they making some big stink about this one? No one is going to trial. They’re DEAD!!!

nasch (profile) says:

Re: Two sides to this for Apple

I would think they could copy all the data off the existing phone, encrypted though it may be, just in case they “accidentally” cause a data wipe; whether they could copy that (encrypted) data back may be an issue, but it would seem to be a way to start.

Because that wouldn’t do them any good. They are currently trying to brute force the phone’s access key, which requires the phone hardware. If all they had was the encrypted data, they would have to directly brute force the encryption, which isn’t feasible in any useful time frame. I don’t know if it’s hundreds of years, or millions, or billions, but way too long. That’s my understanding anyway.

DannyB (profile) says:

Unduly Burdensome

Who would pay for this new software development? Apple? The Taxpayers?

Will Apple be compensated for the opportunity cost of lost time to market? What if Apple were to miss a major market opportunity because they have to divert significant resources into a development effort? Would Apple be compensated for that?

Can the court require the FBI to post a several billion dollar bond to cover this possibility?

Or maybe the FBI can use their own resources to recruit and employ the developers who would do this work? (Oh, yeah, a government run software development project. Those always work out well.)

Is there some standard dollar amount for how high a barrier Apple should be required to jump over? What if Apple designs a device where the cost to break into it is absurdly high? Why should Apple be considered more capable of breaking it than our good fiends at the NSA? [sic]

In this present case, is the cost too absurdly high to require Apple to comply with this order without compensation?

The ultimate question: can the government require that nobody is allowed to build devices that are secure enough that it is infeasible to break into them? Or should the world be aware that US devices are mandated to be insecure by design?

Anonymous Coward says:

Re: Unduly Burdensome

To address your points:

According to the order, “Apple shall advise the government of the reasonable cost of providing this service.” So I would guess that means the government (and therefore the taxpayers) are paying.

According to the order, “To the extent that Apple believes that compliance with this Order would be unreasonably burdensome, it may make an application to this Court for relief within five business days of receipt of the Order.” So if they stood to lose a billion dollars from a missed market opportunity, they could inform the court and get this order delayed or something.

The FBI cannot create this program – even if they had the knowledge allowing them to create it, they cannot provide the signature that would allow it to run, since only Apple has that. And Apple, given the choice, would much rather write this themselves than blindly sign something that the FBI gives them. There’s a huge difference in security risk.

Some government officials do think that backdoors should be mandated, but that’s not currently the law.

mcinsand (profile) says:

risks and penalties

Cellphones are far, far beyond a simple phone, and most of us conduct transactions as well as communications on ours. So, if someone with ill intent gets access to such a backdoor, imagine the damage that could result. What would happen if millions of bank accounts were compromised?

Now, let’s consider just who is supposed to have custody of said backdoor. We’re now learning how our past four secretaries of state mishandled critical state secrets. This isn’t a partisan issue, and it’s more than just one person. Powell, Rice, Clinton, and Kerry seem to have issues with keeping critical information to appropriately secure communications pathways. Those guys are at the top of the security food chain, and they can’t stick to procedure.

Such backdoors should never happen, and even pressuring for such a security weakness should be considered providing material support when that backdoor is eventually misused. A government that cannot protect the secrets that it has certainly can’t be trusted to protect secrets when we weaken security even further.

Anonymous Coward says:

Here’s what I don’t get… even if this is the newest iPhone 6S, couldn’t someone open up the phone and JTAG the chip to get the secret key? Apple has the schematics, so the FBI could get a warrant for those and not have to labor at reverse engineering the way a jailbreaking community (think mod chips for various consoles) would. I don’t agree with them getting the data, as we know this will be used against various others, like Angela Merkel again, but it definitely seems possible.

seedeevee (profile) says:

Let's blame Sheri Pym

“Prior to her appointment, she served as an Assistant U.S. Attorney and Chief of the Riverside branch office of the United States Attorney’s Office, doing mostly criminal prosecution work.”

Need we say that the Riverside branch is one of the most corrupt and abusive branches in one of the most corrupt and abusive federal jurisdictions?

Adam (profile) says:

The scariest part is...

The scariest part is not even this order, but that when reading the various outlets covering this news you find there are masses of people who are completely accepting of what the FBI wants… some going so far as to bash others for being against it and claiming they support terrorism. Sorry, I’m not willing to give up my rights to our own government because they promise it helps protect us, when in fact they are slowly dousing the rag with ether, hoping to surprise us, slip it over our faces, and drag us off to conformity.

Anonymous Coward says:

I have often thought that there are 3 things that would tell me that my Federal Government is no longer representative of me, nor cared one whit about the Constitution, and that would presage the implementation of even darker policies that are indistinguishable from Fascism.

At the point any or all of the 3 are implemented, I would thus have to make a decision as to whether I wished to support the Government that implemented them any longer. Support with my votes, with my taxes, or with my consent that this Government represented me.

They are:

– Banning of guns
– Banning of physical cash
– Banning encryption

It sure appears that several of these are soon to be on the table to be banned. And when one goes, the next…and all 3…are likely soon to follow.

If my nightmare scenario comes true, there may be a point at which all those dildos and lube sent to the Malheur Mormon Crazies are no longer so damn humorous.

They may have been (in my opinion) the wrong people, at the wrong time, in the wrong place, but at their heart…all they are saying is that the Federal Government is overreaching.

Reddit bathroom humor and derision aside…is there not the slightest grain of truth to this fear?

Pronounce (profile) says:

It's All About the Attitude Here

The FBI NOW seems to be concerned about preserving the evidence on the phone. Why weren’t they concerned about preserving the life of the owner of the phone?

And since enhanced interrogation seems to be OK in the U.S. I’m sure (not really, but some think it works) they could have extracted the password in time.

Some will say the terrorists were heavily armed and maybe had explosives with them, and so force begets force. But I think, with a bit more tactical thinking and a bit of logic, the SUV could easily have been trapped; the police could have hidden inside their armored vehicles and waited the terrorists out, if they had thought about this before they “Bonnie and Clyde”-ed the crap out of them.

And maybe they would have martyred themselves anyway, but at least try non-lethal means in such a situation. What is the real cost of a little bit of time and a little bit of non-lethal persuasion?

Anonymous Coward says:

PR and Propaganda circle jerk.

That statement by Cook is so stuffed full of deliberately implied BS it’s staggering that no one called out any of it; an average uninformed person would conclude that iPhones can’t be spied on or tracked.

Here’s a much better press release:

I wonder how many favours a company has to do to get away with shit like that….

Apple doesn’t need to protect customers- they need customers to believe they’re trying to protect them; and they need plausible deniability when it’s shown they didn’t.

Anonymous Coward says:

Re: PR and Propaganda circle jerk.

Except that we’re discussing encryption of a phone that is already turned off, and creating a new operating system that disables bricking a phone for EVERY customer on the planet using iOS, not to mention the eventual requirement to unlock EVERY OS on EVERY telephone manufactured, all of which has absolutely NOTHING to do with the article you linked. Nice try.

Rich Kulawiec (profile) says:

Here's an idea for the FBI

Instead of using time, money and personnel to create ersatz terrorists by entrapping impressionable and unstable people, why not — and I know this is highly unorthodox — spend those resources on good old-fashioned detective work (the kind that sufficed nicely before cell phones existed) directed at real live actual bona fide terrorists?

Anonymous Coward says:

What scares me the most is that they WANT to decrypt what is, essentially, our lives, like an open book, not out in the open where the search without a warrant is obvious.

What scares me is their mentality; they see absolutely nothing wrong with this… so I ask myself, what are these people going to turn our world into?

THAT’s what’s scary! A million times more so than the distractions they pile on us… THIS, going down this road, has the potential to be SOOOO much worse than anything we face today.

Absolute power, unrestrained, and no meaningful safeguards… and even then, they should only be granted, say, 0.1% of everything they’re asking… as some of the shit they’re asking to have the ability to do should be straight up denied… ON INSTINCT.

Anonymous Coward says:

Just to put things in perspective, this is the level of trust I have in “civilised” governments these days.

With an observation like

“DOJ again insists that there are examples of All Writs Act requests in the past that have required software, but it’s notable that when the DOJ says this, it does not immediately cite any cases”

I immediately think:
…because they’re lying, or stretching a “truth”
…or they’re afraid of opening another can of secret worms

Whatever (profile) says:

I think part of the issue here is that Apple doesn’t want to admit it can be done. If they follow the court’s order and do in fact unlock the phone, they will have blown away more than half a decade of hype on the topic.

Moreover, if Apple can do it easily (say a supplier does keep a list of keys, or Apple has in fact kept a list or an algo which can be re-run to obtain the key), then they would take a seriously huge hit.

See, if the truth is that Apple cannot do it, they would just comply with the court order by attempting to do it, prove that they cannot do it, and it would end at some point (say a year or two from now) when Apple proves it would take years of machine time to try to brute force because there is no other choice.

The choice to fight the court order may be philosophical, may be part of their belief system, but the vigor and aggressiveness of the response make me think that they already know the answer, and that they don’t want you to know it.

Anonymous Coward says:

Re: Re:

See, if the truth is that Apple cannot do it, they would just comply with the court order by attempting to do it, prove that they cannot do it, and it would end at some point (say a year or two from now) when Apple proves it would take years of machine time to try to brute force because there is no other choice.

Why would Apple spend a year trying to do something they know they can’t do, rather than tell the court now that it can’t be done? That would be insane. And the order itself tells Apple that it should tell the court within 5 days if compliance is not reasonable. Why shouldn’t they just do that?

I think part of the issue here is that Apple doesn’t want to admit it can be done. If they follow the court’s order and do in fact unlock the phone, they will have blown away more than half a decade of hype on the topic.

Even if Apple did secretly have the key or a way to regenerate it, this order doesn’t even ask for that. The order does not direct them to attempt to unlock the phone or provide the key; it orders them to give the government a program to do specific things to make it easier for the government to brute-force the passcode.

“Apple’s reasonable technical assistance shall accomplish the following three important functions: (1) it will bypass or disable the auto-erase function whether or not it has been enabled; (2) it will enable the FBI to submit passcodes to the SUBJECT DEVICE for testing electronically via the physical device port, Bluetooth, Wi-Fi, or other protocol available on the SUBJECT and (3) it will ensure that when the FBI submits passcodes to the SUBJECT DEVICE, software running on the device will not purposefully introduce any additional delay between passcode attempts beyond what is incurred by Apple hardware.”

Whatever (profile) says:

Re: Re: Re:

Part of the issue here is that while Apple makes all sorts of noise about the “secret” UID used as part of the salt in the encoding and decoding process, they don’t happen to mention that the actual security really comes down to two things: a 5-second delay per PIN attempt, and a maximum of 10 tries. Both of those are entirely artificial and should be removable or circumventable. Once you remove them, the relatively small size of the passcodes (4 digits on older phones, up to 6 in newer versions of their OS) makes them an absolute doddle to get past: at most 1 million possibilities, and at 1 per second it would take less than half a month to plow through them.
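A quick sanity check on those numbers. This sketch assumes only an ~80 ms hardware-bound key-derivation cost remains per attempt once the artificial delays and the 10-try wipe are gone (that per-attempt figure is my assumption, not something stated above):

```python
# Worst-case exhaustive search over the full passcode space, assuming
# only an ~80 ms hardware cost per attempt once software limits are removed.

per_attempt_s = 0.08             # assumed ~80 ms per try, hardware-bound

for digits in (4, 6):
    attempts = 10 ** digits      # full passcode space for n-digit PINs
    worst_case_s = attempts * per_attempt_s
    print(f"{digits}-digit PIN: {attempts:,} tries, "
          f"worst case ~{worst_case_s / 3600:.1f} hours")
```

Under that assumption, even a 6-digit PIN falls in roughly a day of machine time, and the commenter’s 1-per-second estimate of “less than half a month” is actually the slow case.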

The key of course is that this would render Apple’s encryption scheme moot for law enforcement, as having the physical device (and an applied patch) would allow any device to be viewed within a reasonable period of time.

Apple may find itself fighting a losing battle on this one.

Dan says:

I do not agree with Masnick’s confidence that this will only affect older devices. Once the FBI gets the firmware from Apple, they will not settle with just using it on Farook’s phone. They will reverse engineer it to make it work on all Apple devices. I doubt that Apple would totally change the security architecture of their devices between releases.

nasch (profile) says:

Re: Re:

They will reverse engineer it to make it work on all Apple devices.

Do you have any references indicating that this is possible?

I doubt that Apple would totally change the security architecture of their devices between releases.

ARM (which Apple uses) substantially changed the security architecture of their chipset, and Apple took advantage of that new architecture.

I’m sure there are plenty of other references on Secure Enclave if you’re interested.

Don't quote me (profile) says:

What if Apple fails?

What if Apple complies after losing on appeal but ends up bricking the phone or destroying the contents? Is the government then going to accuse them of criminal contempt for the failure, and if so, how could they prove it without threatening Apple programmers with jail unless they testify against others to help the government get convictions? There’s a whole other layer of evil lurking here depending on how this plays out.

Anonymous Coward says:


just ask the NSA, Facebook, and a couple of other entities. AT&T probably still has the text messages (isn’t it required that ISPs hold all data for 3+ years in case of this kind of stuff), and there are other things on the phone.

turn on the phone (no password), and see what it is communicating with (MITM attack).

this way, we the people, can truly see who our friends are (right now, apparently Apple)
