As we pointed out earlier this week, it's pretty obvious that the Justice Department lied to a federal magistrate judge when it claimed to have exhausted all possible ways into the work iPhone of Syed Farook, given that it has now put the case on hold to test out a "new way" to get into the phone. The DOJ had made a filing claiming that Apple's help was the only way to get into the phone, yet it now says that's probably not true. However, the FBI is insisting that the DOJ wasn't lying. In a letter to the editor at the Wall Street Journal, FBI Director James Comey reacts angrily to a WSJ opinion piece suggesting the DOJ lied:
You are simply wrong to assert that the FBI and the Justice Department lied about our ability to access the San Bernardino killer’s phone. I would have thought that you, as advocates of market forces, would realize the impact of the San Bernardino litigation. It stimulated creative people around the world to see what they might be able to do. And I’m not embarrassed to admit that all technical creativity does not reside in government. Lots of folks came to us with ideas. It looks like one of those ideas may work and that is a very good thing, because the San Bernardino case was not about trying to send a message or set a precedent; it was and is about fully investigating a terrorist attack.
James B. Comey
It's difficult to take much of that at face value -- especially as the government continues to push for similar court orders in other cases. And especially as Comey has been whining on and on about "going dark" for well over a year and a half now. At the very least, it does seem clear that the FBI failed to truly explore all possible options. As some iPhone forensics folks have noted, if this were truly a brand new solution, the FBI would need a hell of a lot more than two weeks of testing to make sure it really worked.
In the meantime, I'd heard from a few folks, and now others are reporting as well, that the assumptions that many had made about the Israeli company Cellebrite providing the solution are simply not true -- along with the idea that the solution involves reflashing the chip. The FBI itself now says it's a "software-based" solution.
FBI Director James Comey, in response to a reporter's question at a briefing, said making a copy of the iPhone’s chip in an effort to circumvent the password lockout “doesn’t work.” Comey wouldn't identify the company that's helping it or discuss details of the technique.
Law enforcement officials speaking on background debunked another report that had named Israeli forensics firm Cellebrite as the mystery firm helping it break into the phone.
Of course, this is after Cellebrite got a ton of free publicity from press reports claiming that it was the company (all of which was based on a few rumors from within the forensics world).
At this point it's not clear that you can trust the FBI or DOJ on anything about these issues, as they've managed the messaging very, very carefully, and at times have made statements that are somewhere in that gray zone between "misleading" and "outright lies." But Comey's actions over the last year and a half make it quite clear that this is not just about this one iPhone and he very, very much wants a precedent that will effectively stop the possibility of encryption that the FBI can't easily circumvent.
So now that there's been a little time to process the Justice Department's last minute decision to bail out on the hearing in the San Bernardino case, claiming it was because some mysterious third party had demonstrated a way to hack into Syed Farook's iPhone, it's becoming increasingly clear that (1) the DOJ almost certainly lied at some point in this case and (2) this move was almost entirely about running away from a public relations battle that it was almost certainly losing (while also recognizing that it had a half-decent chance of losing the court case as well). Just replace "Sir Robin" with "the DOJ" in Monty Python's "brave Sir Robin ran away" scene.
That said, there are still some things to clear up. First, did the DOJ lie? It seems pretty obvious that it must have. After all, it insisted earlier in the case, multiple times, that it had "exhausted" all other possibilities and "the only" way to get into the phone was with Apple's help. That's certainly raised some eyebrows.
The DOJ and its supporters, of course, will argue that "new shit has come to light, man," but that seems... doubtful. My first thought was that when the FBI said it had been alerted to a way in over the weekend, it was potentially referring to the announcement from researchers at Johns Hopkins about a flaw in iMessage encryption. If so, that would be particularly bogus, since everyone admits that the vulnerability found would not apply to this case.
However, there's now a ton of speculation going around about the likely method (and the likely third party) the FBI is probably using, involving copying the storage off the chip and then copying it back to brute force the passcode without setting off the security features or deleting the data. But, again, this possible solution isn't really new. Just a few weeks ago, during a Congressional hearing, Rep. Darrell Issa quizzed FBI Director James Comey about this very technique (which was so deep in the technical weeds that many reporters and other policy folks were left scratching their heads).
That video is worth watching, because Director Comey insists, pretty clearly, that there is no way to get into the phone:
Comey: We wouldn't be litigating it if we could [get in ourselves]. We've engaged all parts of the US government to see 'does anyone have a way -- short of asking Apple to do it -- with a 5c running iOS 9 to do this?' and we do not.
At that point Issa starts asking really technical questions about whether the FBI could remove the memory from the phone, make copies of the storage, pair a copy with the encryption chip, try passcodes, and then reflash the memory before the 10 chances are used up -- thus brute forcing the passcode without setting off the security features. As Issa notes:
If you haven't asked that question, how can you come before this committee and before a federal judge and demand that somebody else invent something if you can't answer the question that your people have tried this? ... I'm asking who did you go to? Have you asked these questions? Because you're expecting to get an order and have somebody obey something they don't want to do and you haven't even figured out if you can do it yourself.
Comey is clearly befuddled by the questions and basically says that he's sure that his people must have thought about this, but he assumes that they're watching and if they haven't thought of this then they'll test it out. But, really, a few people had suggested similar things early on, so if that is the solution then it only adds weight to the idea that the FBI didn't do everything it could possibly do before running to the judge.
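The chip-copying idea Issa walks through -- widely referred to as "NAND mirroring" -- can be illustrated with a toy simulation. This is purely a sketch under invented assumptions (the `ToyPhone` class and all of its details are made up for the example, not a model of actual iOS internals): snapshot the flash storage, burn through failed passcode guesses, and reflash the original image before the auto-erase counter reaches ten.

```python
import itertools

class ToyPhone:
    """Invented toy model: a device that auto-erases after 10 failed attempts."""
    WIPE_LIMIT = 10

    def __init__(self, passcode: str):
        # "Flash" holds the user data plus the failed-attempt counter.
        self._flash = {"secret": "user data", "fail_count": 0}
        self._passcode = passcode

    def snapshot_flash(self) -> dict:
        # Mirroring step 1: copy the storage chip's contents off-device.
        return dict(self._flash)

    def restore_flash(self, image: dict) -> None:
        # Mirroring step 2: reflash the chip, which also resets the counter.
        self._flash = dict(image)

    def try_passcode(self, guess: str):
        if self._flash["secret"] is None:
            raise RuntimeError("device wiped")
        if guess == self._passcode:
            return self._flash["secret"]
        self._flash["fail_count"] += 1
        if self._flash["fail_count"] >= self.WIPE_LIMIT:
            self._flash = {"secret": None, "fail_count": 0}  # auto-erase fires
        return None

def brute_force(phone: ToyPhone, digits: int = 4):
    """Try every passcode, reflashing the saved image before the limit is hit."""
    image = phone.snapshot_flash()
    codes = ("".join(p) for p in itertools.product("0123456789", repeat=digits))
    for n, code in enumerate(codes):
        if n and n % (ToyPhone.WIPE_LIMIT - 1) == 0:
            phone.restore_flash(image)  # roll back before the 10th failure
        data = phone.try_passcode(code)
        if data is not None:
            return code, data
    return None, None
```

Run against a `ToyPhone`, `brute_force` recovers any four-digit passcode without ever triggering the wipe; without the reflash step, the tenth wrong guess would erase the data -- which is exactly the protection the technique sidesteps.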
Others have questioned the "two week" timeframe for the DOJ to issue a status report to the court, noting that a brand new solution would almost certainly take much longer to test thoroughly before using it on the iPhone in question.
And then there's the other question: if the FBI really has tracked down a new "vulnerability" in Apple's encryption... will it tell Apple about it so that Apple can patch it? Remember, the White House has told the various parts of the federal government that they should have a "bias" towards revealing flaws so they can be patched... while leaving a "broad exception for 'a clear national security or law enforcement need.'" It's pretty clear from how the DOJ has acted that it believes this kind of hole is a "law enforcement need."
So, if the FBI really did figure out a vulnerability in Apple's encryption, it probably won't actually reveal it -- but I'd imagine that Apple's security engineers are scrambling just the same to see if they can patch whatever flaws there may be here, because that's their job. And, again, that gets back to the key point here: there are always some vulnerabilities in encryption schemes, and part of the job of security folks is to keep patching them. And one of the worries with the demand for backdoors is that they introduce a whole bunch of vulnerabilities that companies are then not allowed to patch.
Either way, the DOJ's actions here are highly questionable, and it seems pretty clearly an attempt to save face in this round. But the overall fight is far from over.
For many years, if you mentioned the term "cybersecurity czar" in the federal government, it only meant Richard Clarke. He was one of the earliest people to focus on computer security as an issue, and as such, became an advisor to multiple presidents on the issue. I haven't always agreed with him -- there have been points in the past where he's appeared as a leading voice in support of greater surveillance and exaggerated claims about a coming "cyberwar". However, in the past few years, Clarke has become much better on these issues, warning (just prior to Snowden's revelations) that the US's focus on surveillance has actually made the public less safe by leaving vulnerabilities open, rather than closing them.
And now he's weighed in, quite vocally, on the whole Apple v. the FBI thing, strongly in support of Apple. To some extent, this isn't a huge surprise as he was among a large group of smart folks who signed onto a letter a year ago opposing encryption backdoors, but his NPR interview gave him a chance to be quite explicit in just how dumb the FBI/DOJ's requests are. It's worth listening to the whole thing, or at least reading the transcript, but here are a few key highlights. Specifically, he argues that the FBI is lying in saying that it can't get access to the content on the phone, and just wants to set a precedent:
If I were in the job now, I would have simply told the FBI to call Fort Meade, the headquarters of the National Security Agency, and NSA would have solved this problem for them. They're not as interested in solving the problem as they are in getting a legal precedent.... Every expert I know believes that NSA could crack this phone. They want the precedent that the government can compel a computer device manufacturer to allow the government in.
Earlier in the interview, he totally dismisses the idea that there's a big dispute in the administration about this, saying that it's just the FBI and DOJ exaggerating:
Well, I don't think it's a fierce debate. I think the Justice Department and the FBI are on their own here. You know, the secretary of defense has said how important encryption is when asked about this case. The National Security Agency director and three past National Security Agency directors, a former CIA director, a former Homeland Security secretary have all said that they're much more sympathetic with Apple in this case. You really have to understand that the FBI director is exaggerating the need for this and is trying to build it up as an emotional case, organizing the families of the victims and all of that. And it's Jim Comey and the attorney general is letting him get away with it.
It's good to see more officials speaking out and calling bullshit on the FBI/DOJ claims on all of this.
One of the key lines that various supporters of backdooring encryption have repeated over the last year is that they "just want to have a discussion" about the proper way to... put backdoors into encryption. Over and over again you had the likes of James Comey insisting that he wasn't demanding backdoors, but really just wanted a "national conversation" on the issue (despite the fact that we had just such a conversation in the 90s and concluded: backdoors bad, let's move on):
My goal today isn’t to tell people what to do. My goal is to urge our fellow citizens to participate in a conversation as a country about where we are, and where we want to be, with respect to the authority of law enforcement.
And, yet, now we're having that conversation. Very loudly. And while the conversation really has been going on for almost two years, in the last month it moved from a conversation among tech geeks and policy wonks into the mainstream, thanks to the DOJ's decision to force Apple to write some code that would undermine security features on the work iPhone of Syed Farook, one of the San Bernardino attackers. According to some reports, the DOJ and FBI purposely chose this case in the belief that it was a perfect "test" case for its side: one that appeared to involve "domestic terrorists" who murdered 14 people. There were reports claiming that Apple was fine fighting this case under seal, but that the DOJ purposely chose to make this request public.
Officials had hoped the Apple case involving a terrorist’s iPhone would rally the public behind what they see as the need to have some access to information on smartphones. But many in the administration have begun to suspect that the F.B.I. and the Justice Department may have made a major strategic error by pushing the case into the public consciousness.
Many senior officials say an open conflict between Silicon Valley and Washington is exactly what they have been trying to avoid, especially when the Pentagon and intelligence agencies are trying to woo technology companies to come back into the government’s fold, and join the fight against the Islamic State. But it appears it is too late to confine the discussion to the back rooms in Washington or Silicon Valley.
While the various public polling on the issue has led to very mixed results, it's pretty clear that the public did not universally swing to the government's position on this. In fact, it appears that the more accurately the situation is described to the public, the more likely they are to side with Apple over the FBI. Given that, John Oliver's recent video on the subject certainly isn't good news for the DOJ.
Either way, the DOJ and FBI insisted they wanted a conversation on this, and now they're getting it. Perhaps they should have been more careful what they wished for.
Not surprisingly, Oliver's take is much clearer and much more accurate than many mainstream press reports on the issues in the case, appropriately mocking the many law enforcement officials who seem to think that, just because Apple employs smart engineers, they can somehow do the impossible and "safely" create a backdoor into an encrypted iPhone that won't have dangerous consequences. He even spends a bit of time reviewing the original Crypto Wars over the Clipper Chip and highlights cryptographer Matt Blaze's contribution in ending those wars by showing that the Clipper Chip could be hacked.
But the biggest contribution to the debate -- which I hope that people pay most attention to -- is the point that Oliver made in the end with his faux Apple commercial. Earlier in the piece, Oliver noted that this belief among law enforcement that Apple engineers can somehow magically do what they want is at least partially Apple's own fault, with its somewhat overstated marketing. So, Oliver's team made a "more realistic" Apple commercial which noted that Apple is constantly fighting security cracks and vulnerabilities and is consistently just half a step ahead of hackers with malicious intent (and, in many cases, half a step behind them).
This is the key point: building secure products is very, very difficult, and even the most secure products have security vulnerabilities that need to be constantly watched and patched. And what the government is doing here is not only asking Apple not to patch a security vulnerability it has found, but actively forcing Apple to create a new vulnerability and then effectively forcing Apple to keep it open. For all the talk of how Apple can just create the backdoor just this once and throw it away, this is more like asking Apple to set off a bomb that blows the back off all the houses in a city, and then saying, "okay, just throw away the bomb after you set it off."
Hopefully, as in cases like net neutrality, Oliver's piece does its job in informing the public what's really going on.
Apple didn't need to reply until tomorrow, but has now released its Motion to Vacate the magistrate judge's order from last week, compelling Apple to create a new operating system that undermines a couple of key security features, so that the FBI could then brute force the passcode on Syed Farook's work iPhone. It's clearly a bit of a rush job as there are a few typos (and things like incorrect page numbers in the table of contents). However, it's not too surprising to see the crux of Apple's argument. In summary it's:
1. The 1789 All Writs Act doesn't apply at all to this situation, for a whole long list of reasons that most of this filing will explain.
2. Even if it does, the order is an unconstitutional violation of the First Amendment (freedom of expression) and the Fifth Amendment (due process).
I really do recommend reading the 65-page filing (it goes fast!). But on the assumption that you have more of a life than we do, let's dig in and detail what Apple's argument is. The brief is quite well written (other than the typos) in making the issues pretty clear:
This is not a case about one isolated iPhone. Rather, this case is about the
Department of Justice and the FBI seeking through the courts a dangerous power that
Congress and the American people have withheld: the ability to force companies like
Apple to undermine the basic security and privacy interests of hundreds of millions of
individuals around the globe. The government demands that Apple create a back door
to defeat the encryption on the iPhone, making its users’ most confidential and
personal information vulnerable to hackers, identity thieves, hostile foreign agents, and
unwarranted government surveillance. The All Writs Act, first enacted in 1789 and on
which the government bases its entire case, “does not give the district court a roving
commission” to conscript and commandeer Apple in this manner. Plum Creek Lumber
Co. v. Hutton, 608 F.2d 1283, 1289 (9th Cir. 1979). In fact, no court has ever
authorized what the government now seeks, no law supports such unlimited and
sweeping use of the judicial process, and the Constitution forbids it.
The motion also notes the importance of strong encryption in keeping people safe and secure:
Since the dawn of the computer age, there have been malicious people dedicated
to breaching security and stealing stored personal information. Indeed, the government
itself falls victim to hackers, cyber-criminals, and foreign agents on a regular basis,
most famously when foreign hackers breached Office of Personnel Management
databases and gained access to personnel records, affecting over 22 million current and
former federal workers and family members. In the face of this daily siege, Apple is
dedicated to enhancing the security of its devices, so that when customers use an
iPhone, they can feel confident that their most private personal information—financial
records and credit card information, health information, location data, calendars,
personal and political beliefs, family photographs, information about their children—will be safe and secure. To this end, Apple uses encryption to protect its customers
from cyber-attack and works hard to improve security with every software release
because the threats are becoming more frequent and sophisticated. Beginning with
iOS 8, Apple added additional security features that incorporate the passcode into the
encryption system. It is these protections that the government now seeks to roll back
by judicial decree.
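A rough illustration of what "incorporat[ing] the passcode into the encryption system" means: since iOS 8, the data-protection key is derived from both the user's passcode and a secret fused into the device hardware, so passcode guesses have to run on the phone itself. This is a simplified sketch, not Apple's actual design -- the function name, algorithm, and iteration count here are all assumptions for illustration:

```python
import hashlib
import os

def derive_data_key(passcode: str, device_uid: bytes) -> bytes:
    # Entangle the passcode with a per-device secret, so the derived key can
    # only be computed on hardware that holds the UID. (PBKDF2 and the
    # iteration count are illustrative stand-ins, not Apple's parameters.)
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), device_uid, 100_000)

uid = os.urandom(32)  # stand-in for a key fused into the device at manufacture
key = derive_data_key("1234", uid)

# Even with the passcode known, a different device derives a useless key,
# which is why brute forcing can't simply be moved to a fast offline cluster.
assert derive_data_key("1234", os.urandom(32)) != key
```

This entanglement is what the GovtOS order is designed to work around: rather than breaking the key derivation, it would strip the rate limits and auto-erase guarding the on-device guessing.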
And the filing makes it clear that the government is lying in claiming that this is all just about this phone:
The government says: “Just this once” and “Just this phone.” But the
government knows those statements are not true; indeed the government has filed
multiple other applications for similar orders, some of which are pending in other
courts.2 And as news of this Court’s order broke last week, state and local officials
publicly declared their intent to use the proposed operating system to open hundreds of
other seized devices—in cases having nothing to do with terrorism. If this order is
permitted to stand, it will only be a matter of days before some other prosecutor, in
some other important case, before some other judge, seeks a similar order using this
case as precedent. Once the floodgates open, they cannot be closed, and the device
security that Apple has worked so tirelessly to achieve will be unwound without so
much as a congressional vote. As Tim Cook, Apple’s CEO, recently noted: “Once
created, the technique could be used over and over again, on any number of devices.
In the physical world, it would be the equivalent of a master key, capable of opening
hundreds of millions of locks—from restaurants and banks to stores and homes. No
reasonable person would find that acceptable.”
There's a footnote in the middle of that which points to Manhattan DA Cyrus Vance already talking about why he supports the FBI, and how he has 155 to 160 phones that he wants to force Apple to help unlock.
Apple also details how accepting the government's interpretation of the All Writs Act here could easily extend in absolutely crazy ways:
Finally, given the government’s boundless interpretation of the All Writs Act, it
is hard to conceive of any limits on the orders the government could obtain in the
future. For example, if Apple can be forced to write code in this case to bypass
security features and create new accessibility, what is to stop the government from
demanding that Apple write code to turn on the microphone in aid of government
surveillance, activate the video camera, surreptitiously record conversations, or turn on
location services to track the phone’s user? Nothing.
The filing also points out that the FBI foreclosed an easier route to the data through its own blunder:

Unfortunately, the FBI, without consulting Apple or reviewing its public
guidance regarding iOS, changed the iCloud password associated with one of the
attacker’s accounts, foreclosing the possibility of the phone initiating an automatic
iCloud back-up of its data to a known Wi-Fi network... which could have obviated the need
to unlock the phone and thus for the extraordinary order the government now seeks.21
Had the FBI consulted Apple first, this litigation may not have been necessary.
Apple's filing also does a good job debunking the DOJ's ridiculous "this is no burden, because it's just software and Apple writes software" argument:
The compromised operating system that the government demands would require
significant resources and effort to develop. Although it is difficult to estimate, because
it has never been done before, the design, creation, validation, and deployment of the
software likely would necessitate six to ten Apple engineers and employees dedicating
a very substantial portion of their time for a minimum of two weeks, and likely as
many as four weeks.... Members of the team would
include engineers from Apple’s core operating system group, a quality assurance
engineer, a project manager, and either a document writer or a tool writer....
No operating system currently exists that can accomplish what the government
wants, and any effort to create one will require that Apple write new code, not just
disable existing code functionality.... Rather, Apple will need to design and
implement untested functionality in order to allow the capability to enter passcodes
into the device electronically in the manner that the government describes.... In
addition, Apple would need to either develop and prepare detailed documentation for
the above protocol to enable the FBI to build a brute-force tool that is able to interface
with the device to input passcode attempts, or design, develop and prepare
documentation for such a tool itself.... Further, if the tool is utilized remotely
(rather than at a secure Apple facility), Apple will also have to develop procedures to
encrypt, validate, and input into the device communications from the FBI.... This
entire development process would need to be logged and recorded in case Apple’s
methodology is ever questioned, for example in court by a defense lawyer for anyone
charged in relation to the crime....
Once created, the operating system would need to go through Apple’s quality
assurance and security testing process.... Apple’s software ecosystem is
incredibly complicated, and changing one feature of an operating system often has
ancillary or unanticipated consequences.... Thus, quality assurance and
security testing would require that the new operating system be tested on multiple devices and validated before being deployed.... Apple would have to undertake
additional testing efforts to confirm and validate that running this newly developed
operating system to bypass the device’s security features will not inadvertently destroy
or alter any user data.... To the extent problems are identified (which is almost
always the case), solutions would need to be developed and re-coded, and testing
would begin anew.... As with the development process, the entire quality
assurance and security testing process would need to be logged, recorded, and
preserved.... Once the new custom operating system is created and validated, it
would need to be deployed on to the subject device, which would need to be done at an
Apple facility.... And if the new operating system has to be destroyed and
recreated each time a new order is issued, the burden will multiply.
From there we dig into the meat of the filing: that the All Writs Act doesn't apply.
The All Writs Act (or the “Act”) does not provide the judiciary with the
boundless and unbridled power the government asks this Court to exercise. The Act is
intended to enable the federal courts to fill in gaps in the law so they can exercise the
authority they already possess by virtue of the express powers granted to them by the
Constitution and Congress; it does not grant the courts free-wheeling authority to
change the substantive law, resolve policy disputes, or exercise new powers that
Congress has not afforded them. Accordingly, the Ninth Circuit has squarely rejected
the notion that “the district court has such wide-ranging inherent powers that it can
impose a duty on a private party when Congress has failed to impose one. To so rule
would be to usurp the legislative function and to improperly extend the limited federal
court jurisdiction.”
Congress has never authorized judges to compel innocent third parties to
provide decryption services to the FBI. Indeed, Congress has expressly withheld that
authority in other contexts, and this issue is currently the subject of a raging national
policy debate among members of Congress, the President, the FBI Director, and state
and local prosecutors. Moreover, federal courts themselves have never recognized an
inherent authority to order non-parties to become de facto government agents in
ongoing criminal investigations. Because the Order is not grounded in any duly
enacted rule or statute, and goes well beyond the very limited powers afforded by
Article III of the Constitution and the All Writs Act, it must be vacated.
In short, Apple is leaning heavily on the idea that CALEA pre-empts the All Writs Act here, and that CALEA explicitly says that companies can't be forced into helping to decrypt encrypted content. Beyond that, Apple is claiming that it's "too far removed" from the case for the All Writs Act to apply and mocks the idea (put forth by the DOJ) that because Apple licenses its software instead of selling it, that makes it okay:
Apple is no more connected to this phone than General Motors is to a
company car used by a fraudster on his daily commute. Moreover, that Apple’s
software is “licensed, not sold,”..., is “a total red herring,” as Judge
Orenstein already concluded.... A licensing
agreement no more connects Apple to the underlying events than a sale. The license
does not permit Apple to invade or control the private data of its customers. It merely
limits customers’ use and redistribution of Apple’s software. Indeed, the government’s
position has no limits and, if accepted, would eviscerate the “remoteness” factor
entirely, as any company that offers products or services to consumers could be
conscripted to assist with an investigation, no matter how attenuated their connection
to the criminal activity. This is not, and never has been, the law.
From there, Apple attacks the argument that there is no undue burden on Apple if it's forced to build this system, which Apple calls GovtOS. It starts out by noting that the idea that Apple can just create the software for this one phone and delete it appears nonsensical when put in context:
Moreover, the government’s flawed suggestion to delete the program and erase
every trace of the activity would not lessen the burden, it would actually increase it
since there are hundreds of demands to create and utilize the software waiting in the
wings..... If Apple creates new software to open a back door, other federal
and state prosecutors—and other governments and agencies—will repeatedly seek
orders compelling Apple to use the software to open the back door for tens of
thousands of iPhones. Indeed, Manhattan District Attorney Cyrus Vance, Jr., has made
clear that the federal and state governments want access to every phone in a criminal
investigation.... [Charlie Rose, Television Interview of Cyrus Vance (Feb. 18, 2016)]
(Vance stating “absolutely” that he “want[s] access to all those phones that [he thinks]
are crucial in a criminal proceeding”). This enormously intrusive burden—building
everything up and tearing it down for each demand by law enforcement—lacks any
support in the cases relied on by the government, nor do such cases exist.
The alternative—keeping and maintaining the compromised operating system
and everything related to it—imposes a different but no less significant burden, i.e.,
forcing Apple to take on the task of unfailingly securing against disclosure or
misappropriation the development and testing environments, equipment, codebase,
documentation, and any other materials relating to the compromised operating system.... Given the millions of iPhones in use and the value of the data on them,
criminals, terrorists, and hackers will no doubt view the code as a major prize and can
be expected to go to considerable lengths to steal it, risking the security, safety, and
privacy of customers whose lives are chronicled on their phones. Indeed, as the
Supreme Court has recognized, “[t]he term ‘cell phone’ is itself misleading shorthand;
. . . these devices are in fact minicomputers” that “could just as easily be called
cameras, video players, rolodexes, calendars, tape recorders, libraries, diaries, albums,
televisions, maps, or newspapers.”...

By forcing Apple to write code to compromise its encryption defenses, the
Order would impose substantial burdens not just on Apple, but on the public at large.
And in the meantime, nimble and technologically savvy criminals will continue to use
other encryption technologies, while the law-abiding public endures these threats to
their security and personal liberties—an especially perverse form of unilateral
disarmament in the war on terror and crime.
That last point is key. Criminals will still use other forms of encryption, while forcing Apple to do this harms everyone else by putting them more at risk.
Here Apple goes even deeper, questioning what the limits of the All Writs Act really are:
For example, under the
same legal theories advocated by the government here, the government could argue
that it should be permitted to force citizens to do all manner of things “necessary” to
assist it in enforcing the laws, like compelling a pharmaceutical company against its
will to produce drugs needed to carry out a lethal injection in furtherance of a lawfully
issued death warrant, or requiring a journalist to plant a false story in order to help
lure out a fugitive, or forcing a software company to insert malicious code in its auto-update
process that makes it easier for the government to conduct court-ordered....
Next, Apple calls bullshit on the DOJ's claim that it absolutely needs Apple's help here. First, the FBI messed things up with the whole iCloud password reset, and second, what about the NSA? Why can't the NSA just hack in? That's what the following says, in more legalistic terms:
... the government has failed to demonstrate that the requested
order was absolutely necessary to effectuate the search warrant, including that it
exhausted all other avenues for recovering information. Indeed, the FBI foreclosed
one such avenue when, without consulting Apple or reviewing its public guidance
regarding iOS, the government changed the iCloud password associated with an
attacker’s account, thereby preventing the phone from initiating an automatic iCloud back-up.... Moreover, the government has not made any showing that it
sought or received technical assistance from other federal agencies with expertise in
digital forensics, which assistance might obviate the need to conscript Apple to create
the back door it now seeks. See... (Judge Orenstein asking the government “to make a representation for
purposes of the All Writs Act” as to whether the “entire Government,” including the
“intelligence community,” did or did not have the capability to decrypt an iPhone, and
the government responding that “federal prosecutors don’t have an obligation to
consult the intelligence community in order to investigate crime”).
From there, we move on to the Constitutional arguments, which the court might not even address if it decides the All Writs Act doesn't apply. But here Apple starts with the First Amendment concerns about "compelled" speech.
Under well-settled law, computer code is treated as speech within the meaning
of the First Amendment.... The Supreme Court has made clear that where, as here, the government seeks to
compel speech, such action triggers First Amendment protections..... Compelled speech is a content-based restriction subject to exacting
scrutiny... and so may only be upheld if it is narrowly tailored to obtain a compelling state interest....
The government cannot meet this standard here. Apple does not question the
government’s legitimate and worthy interest in investigating and prosecuting terrorists,
but here the government has produced nothing more than speculation that this iPhone
might contain potentially relevant information... It is well known that terrorists and other criminals use highly sophisticated
encryption techniques and readily available software applications, making it likely that
any information on the phone lies behind several other layers of non-Apple encryption....
This argument feels a bit weakly supported. Then there's the Fifth Amendment argument, concerning due process:
In addition to violating the First Amendment, the government’s requested order,
by conscripting a private party with an extraordinarily attenuated connection to the
crime to do the government’s bidding in a way that is statutorily unauthorized, highly
burdensome, and contrary to the party’s core principles, violates Apple’s substantive
due process right to be free from “‘arbitrary deprivation of [its] liberty by....
Again, this feels a bit underdeveloped, but not surprisingly so. Apple is betting heavily that its main argument -- that the All Writs Act doesn't apply -- will win the day (which seems quite likely). The Constitutional arguments are just being thrown in so that they're in the case at this stage, and can then be raised on appeal, should it get that far.
I imagine the DOJ will respond to this before long as well, so stay tuned (we certainly will).
Earlier this week, we highlighted a questionable poll done by the Pew Research folks concerning the Apple/FBI fight, noting that the actual questions it asked were wrong and misleading (and also leading...), that this produced fairly meaningless results, and that those results were then spun by reporters into false claims that the public backed the FBI in this fight. Now there's a new Reuters/Ipsos poll on the same dispute.
And, once again, the poll is basically meaningless when it comes to the actual issues in this case. You can read the details of the questions in the linked document, which shows that, before asking the key question, the pollsters asked a bunch of questions about whether or not people were willing to "give up privacy" to help the US government on a variety of things. And lots of people said no. These questions more or less framed the issue as one about protecting your own privacy -- as compared to the Pew poll, which framed it more as being about investigating the San Bernardino attacks. Then, after all those questions, the poll asks about the specifics of the Apple case, framing the question much more broadly than Pew did. Here's the Reuters question:
Apple is opposing a court order to unlock a smart phone
that was used by one of the shooters in the San Bernardino attack.
Apple is concerned that if it helps the FBI this time, it will be forced to
help the government in future cases that may not be linked to
national security, opening the door for hackers and potential future data breaches for smartphone users.
Do you agree or disagree with Apple’s decision to oppose the court order?
And, to refresh your memory, here's how Pew asked it:
As you may know, RANDOMIZE: [the FBI has said that accessing the iPhone is an important part of their ongoing investigation into the San Bernardino attacks] while [Apple has said that unlocking the iPhone could compromise the security of other users’ information] do you think Apple [READ; RANDOMIZE]?
(1) Should unlock the iPhone (2) Should not unlock the iPhone (3) Don't Know.
Notice that the Reuters/Ipsos version focuses solely on the downsides laid out by Apple, and not the supposed intent of the FBI, while the Pew poll tries to "balance" the two. Meanwhile, both polls get the basic facts wrong: the request is not to "unlock a smart phone," because Apple cannot unlock it. The actual ask is that Apple build a new operating system (which presents some big challenges) that purposely undermines two key security features on the iPhone, so that the FBI can then brute force the passcode and access the phone. The specifics here matter, and neither poll gets them right.
So while I, personally, think Apple is the one to support in this fight, I don't think either poll really says much about anything, other than that depending on how you word a poll, you can get very, very different results. That's really not particularly interesting as it pertains to the actual debate here. Stupid polls get stupid answers.
There are all sorts of interesting (and frustrating and challenging) legal questions raised by the FBI's use of the All Writs Act to try to force Apple to build a system to allow the FBI to hack Apple's customers. But there's one interesting one raised by Albert Gidari that may cut through a lot of the "bigger" questions (especially the Constitutional ones that everyone leaps to) and just makes a pretty simple point: the DOJ is simply wrong that the All Writs Act applies here, rather than the existing wiretapping statute, the Communications Assistance for Law Enforcement Act, or 47 USC 1002, better known by basically everyone as CALEA. CALEA is the law that some (including the DOJ) have wanted "updated" in ways that might force internet companies and mobile phone companies to make their devices more wiretap-ready. But that hasn't happened.
And, as Gidari points out, it seems clear that CALEA preempts the All Writs Act and explicitly forbids what the FBI is requesting here. The DOJ is claiming that CALEA doesn't apply to Apple:
Put simply, CALEA is entirely inapplicable to the present dispute [because] Apple is not acting as a telecommunications carrier, and the Order concerns access to stored data rather than real time interceptions and call-identifying information
But Gidari notes that this misrepresents CALEA, which also applies to "manufacturers and providers of telecommunications support services" -- and Apple could be seen as qualifying, since it's providing the "equipment" here. And if CALEA, rather than the All Writs Act, applies, the DOJ's argument is basically dead on arrival. As many have noted, CALEA already says that you can't force a provider to decrypt encrypted communications:
A telecommunications carrier shall not be responsible for decrypting, or ensuring the government’s ability to decrypt, any communication encrypted by a subscriber or customer, unless the encryption was provided by the carrier and the carrier possesses the information necessary to decrypt the communication.
Now, some may argue that in this case Apple "possesses the information necessary," but that's not actually the case. Apple doesn't possess the information necessary to decrypt. It's being asked to build a system that would let the FBI then hack the system to decrypt. And that's different. And on that point, there's this in CALEA as well:
(1) Design of features and systems configurations. This subchapter does not authorize any law enforcement agency or office
(a) to require any specific design of equipment, facilities, services, features, or system configurations to be adopted by any provider of a wire or electronic communication service, any manufacturer of telecommunications equipment, or any provider of telecommunications support services;
(b) to prohibit the adoption of any equipment, facility, service, or feature by any provider of a wire or electronic communication service, any manufacturer of telecommunications equipment, or any provider of telecommunications support services.
In a follow-up post, Gidari looks at the legislative history of CALEA as well, noting that the law was a compromise between law enforcement (which wanted access to everything) and telcos (which didn't want to give that much access). The end result was that CALEA was designed to make clear that, no, law enforcement can't always get everything, and certainly can't force companies to build new tools:
Indeed, Congress outright rejected the government’s initial CALEA proposal to actually prevent deployment of new technologies that didn’t have a wiretap back door. As Congress noted, “[t]his is the exact opposite of the original versions of the legislation, which would have barred introduction of services or features that could not be tapped.” In other words, Congress accepted the fact that some new technologies would put some evidence that law enforcement wanted, needed, and may have had access to in the past, beyond its reach in some cases.
Congress also determined that carriers would have no responsibility to decrypt encrypted communications unless the carrier provided the encryption and could in fact decrypt it. CALEA did not prohibit a carrier from deploying an encryption service for which it did not retain the ability to decrypt communications for law enforcement access, period. Here again, CALEA recognized that some evidence that may be necessary to an investigation will not be available to the government because it is encrypted and the provider lacks the key to access it.
So while CALEA provided law enforcement with some surveillance capabilities on phone networks (which the Federal Communications Commission later extended to broadband Internet access and two-way Voice over IP), it precluded the government from requiring “any specific design of equipment, facilities, services, features, or system configurations to be adopted by any manufacturer of telecommunications equipment.” Requiring Apple by court order to create and implement a work-around for the iPhone’s security features is, in fact, doing what CALEA prohibited.
While a big Constitutional battle may be more interesting (and more long lasting), it's possible that an argument like this one might win the actual lawsuit.
Of course, then the battle will shift back to Congress to try to update CALEA...
Look, let's face facts here. For all the talk coming from the law enforcement community that they need backdoors into encryption to stop crime, they absolutely know that the reverse is true: strong encryption prevents crime. Lots of it. Strong encryption on phones makes stealing those phones a lot less worthwhile, because all the information on them is locked up. As we noted back in 2014, the FBI had a webpage advocating for mobile encryption to protect your phone's data:
Of course, after that started to look inconvenient for the FBI, they quietly removed that page. I have a FOIA request in asking why, but the FBI has told me I shouldn't expect an answer for another year or two.
But it's not just the FBI. Trevor Timm alerts us to the amazing fact that just a couple of years ago, the New York City Police Department (NYPD) was literally roaming the streets, giving people fliers telling them to upgrade their iPhones to enable greater security features to protect against crime. Michael Hoffman tweeted a picture of the flier he received:
As of Wednesday, September 18, 2013 the new iOS7 software update available for your Apple product brings added security to your devices.
By downloading the new operating system, should your device be lost or stolen it cannot be reprogrammed without an Apple ID and Password.
The download is FREE from Apple.
In other words, law enforcement in NYC absolutely knows that stronger security on phones prevents crime. And yet, Manhattan District Attorney Cyrus Vance is running around pretending that these phones have created a crime wave in NY?
And, it appears that the data absolutely supports what the FBI and the NYPD apparently used to know, but are now pretending to forget. An article last summer by Kevin Bankston laid out the details, noting that phone theft is a massive epidemic, with criminals swiping millions of phones -- and many of them then seeking to get access to the data on those phones. While the introduction of remote kill switches has helped reduce some of that, encryption is a much better solution.
So what happened? Did the FBI and NYPD really "forget" everything they knew two and a half years ago about encryption and how it stops crime?
So... over the past couple days, plenty of folks (including us) have reported that the backdoor demanded by the FBI (and currently granted by a magistrate judge) would likely work on the older iPhone model in question, the iPhone 5C, but that it would not work on modern iPhones that have Apple's "Secure Enclave" -- basically a separate secure coprocessor that manages the encryption keys.
Plenty of reports -- including the Robert Graham post that we linked to, and a story by Bruce Schneier -- suggested that an attempt to follow through with the FBI's request in the presence of the Secure Enclave would effectively eliminate the key and make decryption nearly impossible.
However, earlier this morning Apple started telling a bunch of people, including reporters, that this is not true. Effectively, Apple is saying that, yes, the new software could update the Secure Enclave firmware while keeping the key intact -- meaning that this backdoor absolutely can be used against modern iPhones. John Kelley, one of the people who helped design the whole Secure Enclave setup in the first place, has said basically the same thing, noting that updating the firmware will not delete the key:
@AriX Not true, if Apple can be forced to modify iOS, they can be forced to modify SEP firmware as well. @trailofbits has SEP details wrong
A blog post by Dan Guido -- which originally asserted that the Secure Enclave would be wiped on update -- now admits that's not true and, yes, this backdoor likely works on modern iPhones as well:
Apple can update the SE firmware, it does not require the phone passcode, and it does not wipe user data on update. Apple can disable the passcode delay and disable auto erase with a firmware update to the SE. After all, Apple has updated the SE with increased delays between passcode attempts and no phones were wiped.
I've asked some security folks if it's possible that future iPhones could be designed to work the way people thought the Secure Enclave worked, and the basic answer appears to be "that's a fairly difficult problem." People have some ideas of how it might work, but all came back with reasons why it might not work. I asked one security expert whether there was a way for Apple to build a more secure version that was immune to this kind of FBI request, and the response was: "I don't know. I sure hope so."
Update: I should add that this backdoor still just makes it easier for the FBI to try to brute force a user's PIN or passcode. A user who sets a sufficiently strong passcode has a much better chance of protecting their data -- but that protection is on the user (and many users likely find a strong passcode hellishly inconvenient to type on a phone).
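To get a rough sense of why disabling the auto-erase and escalating-delay features matters so much, here's a back-of-the-envelope sketch. It assumes each guess costs roughly 80ms of hardware-bound key derivation -- a figure drawn from Apple's published iOS security documentation, not from anything disclosed in this case -- so the numbers are illustrative only:

```python
# Back-of-the-envelope brute-force estimates, assuming the requested firmware
# disables auto-erase and the escalating delays, leaving only the per-guess
# key-derivation cost. SECONDS_PER_GUESS is an assumed figure (~80ms per
# attempt, per Apple's iOS security documentation), not a measured one.

SECONDS_PER_GUESS = 0.08

def worst_case(alphabet_size: int, length: int) -> float:
    """Worst-case seconds to exhaust every possible passcode."""
    return (alphabet_size ** length) * SECONDS_PER_GUESS

scenarios = {
    "4-digit PIN": (10, 4),
    "6-digit PIN": (10, 6),
    "6-char alphanumeric (a-z, 0-9)": (36, 6),
}

for name, (alphabet, length) in scenarios.items():
    secs = worst_case(alphabet, length)
    if secs < 3600:
        print(f"{name}: ~{secs / 60:.0f} minutes")
    else:
        print(f"{name}: ~{secs / 86400:.1f} days")
```

The takeaway: with the safeguards stripped, a 4-digit PIN falls in minutes and a 6-digit PIN in about a day, while even a modest alphanumeric passcode pushes the worst case out to years -- which is exactly why the remaining protection comes down to the user's choice of passcode.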