One of the more ridiculous claims in the DOJ's filing against Apple last week was its decision to pick up on former NSA lawyer Stewart Baker's conspiracy theory that Apple had built backdoors into its products for China (side note: I met Stewart in person for the first time recently, and he mocked me about this, saying that I should agree with him on this point). However, as we noted in our post last week, there doesn't seem to be much evidence to support Baker's claims. The two key issues were Apple's use of the Chinese wireless standard WAPI -- which some have claimed includes some sort of backdoor, but which was also the only real local area wireless tech in China for a while -- and its decision to store iCloud data in China. However, as we noted, there have been reports that the Chinese government then tried to conduct a man-in-the-middle attack against the iCloud servers. If Apple had actually given the government a backdoor, why would it need to do that?
Apple uses the same security protocols everywhere in the world.
Apple has never made user data, whether stored on the iPhone or in iCloud, more technologically accessible to any country's government. We believe any such access is too dangerous to allow. Apple has also not provided any government with its proprietary iOS source code. While governmental agencies in various countries, including the United States, perform regulatory reviews of new iPhone
releases, all that Apple provides in those circumstances is an unmodified iPhone device.
It is my understanding that Apple has never worked with any government agency from any country to create a "backdoor" in any of our products or services.
Now, some may push back on the point about WAPI, but again, making use of a third party technology that potentially has backdoors (some of which could be protected against) and being told by the government to build special backdoors just for that government are still vastly different scenarios.
We already covered Apple's reply brief in the fight over getting into Syed Farook's encrypted work iPhone, highlighting a number of lies in the DOJ's filing. But I wanted to focus on a few more points highlighted in the additional declarations filed by Apple as well. The DOJ kept insisting that Apple built this feature specifically to keep law enforcement out, which is ridiculous. Apple notes repeatedly that it built the feature to keep its customers safer from malicious attacks, most of which are not from law enforcement. But the DOJ keeps pretending that it was a deliberate attempt to mock law enforcement. From the DOJ's filing:
Here, Apple has deliberately used its control over its software to block law-enforcement requests for access to the contents of its devices, and it has advertised that feature to sell its products.
Since the introduction of iOS 8 in October 2014, Apple has placed approximately 1,793 advertisements worldwide—627 in the United States alone—of different types, including print ads, television ads, online ads, cinema ads, radio ads and billboards. Those advertisements have generated an estimated 253 billion impressions worldwide and 99 billion impressions in the United States alone (an impression is an estimate of the number of times an ad is viewed or displayed online).
Of those advertisements, not a single one has ever advertised or promoted the ability of Apple’s software to block law enforcement requests for access to the contents of Apple devices.
Indeed, only three of those advertisements reference security at all, and all three related to the Apple Pay service, and then only to say that Apple Pay is "safer than a credit card, and keeps your info yours."
I'm assuming the DOJ, if it decides to push this point, will argue that it wasn't talking about those kinds of advertisements, but rather Apple's statements to the press. Still, there's a strong point here. Contrary to what the DOJ is saying, no, the company does not proactively advertise its encryption as a way to keep law enforcement out. Or, in short: no, FBI, strong encryption on the iPhone just isn't about you.
As expected, Apple has now responded to the DOJ in the case about whether or not it can be forced to write code to break its own security features to help the FBI access the encrypted work iPhone of Syed Farook, one of the San Bernardino attackers. As we noted, the DOJ's filing was chock-full of blatantly misleading claims, and Apple was flabbergasted by just how ridiculous that filing was. That comes through in this response.
The government attempts to rewrite history by portraying the Act as an all-powerful magic wand rather than the limited procedural tool it is. As theorized by the government, the Act can authorize any and all relief except in two situations: (1) where Congress enacts a specific statute prohibiting the precise action (i.e., says a court may not “order a smartphone manufacturer to remove barriers to accessing stored data on a particular smartphone,” ... or (2) where the government seeks to “arbitrarily dragoon” or “forcibly deputize” “random citizens” off the street.... Thus, according to the government, short of kidnapping or breaking an express law, the courts can order private parties to do virtually anything the Justice Department and FBI can dream up. The Founders would be appalled.
The Founders would be appalled. That's quite a statement.
Apple also slams the DOJ for insisting that this really is all about the one iPhone and that the court should ignore the wider precedent, citing FBI Director James Comey's own statements:
It has become crystal clear that this case is not about a “modest” order and a “single iPhone,” Opp. 1, as the FBI Director himself admitted when testifying before Congress two weeks ago. Ex. EE at 35 [FBI Director James Comey, Encryption Hr’g] (“[T]he broader question we’re talking about here goes far beyond phones or far beyond any case. This collision between public safety and privacy—the courts cannot resolve that.”). Instead, this case hinges on a contentious policy issue about how society should weigh what law enforcement officials want against the widespread repercussions and serious risks their demands would create. “Democracies resolve such tensions through robust debate” among the people and their elected representatives, Dkt. 16-8 [Comey, Going Dark], not through an unprecedented All Writs Act proceeding.
Apple then, repeatedly, points out where the DOJ selectively quoted, misquoted or misleadingly quoted arguments in its favor. For example:
The government misquotes Bank of the United States v. Halstead,..., for the proposition that “‘[t]he operation of [the Act]’” should not be limited “‘to that which it would have had in the year 1789.’” ... (misquoting Halstead, 23 U.S. (10 Wheat.) at 62) (alterations are the government’s). But what the Court actually said was that the “operation of an execution”—the ancient common law writ of “venditioni exponas”—is not limited to that “which it would have had in the year 1789.” ... see also... (“That executions are among the writs hereby authorized to be issued, cannot admit of a doubt . . . .”). The narrow holding of Halstead was that the Act (and the Process Act of 1792) allowed courts “to alter the form of the process of execution.” ... (courts are not limited to the form of the writ of execution “in use in the Supreme Courts of the several States in the year 1789”). The limited “power given to the Courts over their process is no more than authorizing them to regulate and direct the conduct of the Marshal, in the execution of the process.”
The authority to alter the process by which courts issue traditional common law writs is not authority to invent entirely new writs with no common law analog. But that is precisely what the government is asking this Court to do: The Order requiring Apple to create software so that the FBI can hack into the iPhone has no common law analog.
The filing then goes step by step in pointing out how the government is wrong about almost everything. The DOJ, for example, kept insisting that CALEA doesn't apply at all to Apple, but Apple points out that the DOJ just seems to be totally misreading the law:
Contrary to the government’s assertion that its request merely “brush[es] up against similar issues” to CALEA..., CALEA, in fact, has three critical limitations—two of which the government ignores entirely—that preclude the relief the government seeks.... First, CALEA prohibits law enforcement agencies from requiring “electronic communication service” providers to adopt “any specific design of equipment, facilities, services, features, or system configurations . . . .” The term “electronic communication service” provider is broadly defined to encompass Apple. ... (“any service which provides to users thereof the ability to send or receive wire or electronic communications”). Apple is an “electronic communication services” provider for purposes of the very services at issue here because Apple’s software allows users to “send or receive . . . communications” between iPhones through features such as iMessage and Mail....
The government acknowledges that FaceTime and iMessage are electronic communication services, but asserts that this fact is irrelevant because “the Court’s order does not bear at all upon the operation of those programs.” ... Not so. The passcode Apple is being asked to circumvent is a feature of the same Apple iOS that runs FaceTime, iMessage, and Mail, because an integral part of providing those services is enabling the phone’s owner to password-protect the private information contained within those communications. More importantly, the very communications to which law enforcement seeks access are the iMessage communications stored on the phone.... And, only a few pages after asserting that “the Court’s order does not bear at all upon the operation of” FaceTime and iMessage for purposes of the CALEA analysis..., the government spends several pages seeking to justify the Court’s order based on those very same programs, arguing that they render Apple “intimately close” to the crime for purposes of the New York Telephone analysis.
Second, the government does not dispute, or even discuss, that CALEA excludes “information services” providers from the scope of its mandatory assistance provisions.... Apple is indisputably an information services provider given the features of iOS, including FaceTime, iMessage, and Mail....
Finally, CALEA makes clear that even telecommunications carriers (a category of providers subject to more intrusive requirements under CALEA, but which Apple is not) cannot be required to “ensure the government’s ability” to decrypt or to create decryption programs the company does not already “possess.”... If companies subject to CALEA’s obligations cannot be required to bear this burden, Congress surely did not intend to allow parties specifically exempted by CALEA (such as Apple) to be subjected to it. The government fails to address this truism.
Next, Apple rebuts the DOJ's argument that, because CALEA doesn't address this specific situation, Congress simply left it up to the courts to use the All Writs Act. As Apple points out, in some cases Congress not doing something doesn't mean it rejected certain positions, but here the legislative history is quite clear that Congress did not intend for companies to be forced to help in this manner.
Here, Congress chose to require limited third-party assistance in certain statutes designed to aid law enforcement in gathering electronic evidence (although none as expansive as what the government seeks here), but it has declined to include similar provisions in other statutes, despite vigorous lobbying by law enforcement and notwithstanding its “prolonged and acute awareness of so important an issue” as the one presented here.... Accordingly, the lack of statutory authorization in CALEA or any of the complementary statutes in the “comprehensive federal scheme” of surveillance and telecommunications law speaks volumes.... To that end, Congress chose to “greatly narrow” the “scope of [CALEA],” which ran contrary to the FBI’s interests but was “important from a privacy standpoint.” ... Indeed, CALEA’s provisions were drafted to “limit the scope of [industry’s] assistance requirements in several important ways.”....
That the Executive Branch recently abandoned plans to seek legislation expanding CALEA’s reach... provides renewed confirmation that Congress has not acceded to the FBI’s wishes, and belies the government’s view that it has possessed such authority under the All Writs Act since 1789.
In fact, in a footnote, Apple goes even further in not just blasting the DOJ's suggestion that Congress didn't really consider a legislative proposal to update CALEA to suck in requirements for internet communications companies, but also highlighting the infamous quote from top intelligence community lawyer Robert Litt about how they'd just wait for the next terrorist attack and get the law passed in their favor at that point.
The government attempts to minimize CALEA II, saying its plans consisted of “mere vague discussions” that never developed into a formal legislative submission..., but federal officials familiar with that failed lobbying effort confirmed that the FBI had in fact developed a “draft proposal” containing a web of detailed provisions, including specific fines and compliance timelines, and had floated that proposal with the White House.... As The Washington Post reported, advocates of the proposal within the government dropped the effort because they determined they could not get what they wanted from Congress at that time: “Although ‘the legislative environment is very hostile today,’ the intelligence community’s top lawyer, Robert S. Litt, said to colleagues in an August e-mail, which was obtained by The Post, ‘it could turn in the event of a terrorist attack or criminal event where strong encryption can be shown to have hindered law enforcement.’ There is value, he said, in ‘keeping our options open for such a situation.’”
Next Apple goes through the arguments for saying that, even if the All Writs Act does apply, and even if the court accepts the DOJ's made up three factor test, Apple should still prevail. It notes, again, that it is "far removed" from the issue and reminds the court that the order sought here is very different from past cases where Apple has cooperated:
The government argues that “courts have already issued AWA orders” requiring manufacturers to “unlock” phones ... but those cases involved orders requiring “unlocking” assistance to provide access through existing means, not the extraordinary remedy sought here, i.e., an order that requires creating new software to undermine the phones’ (or in the Blake case, the iPad’s) security safeguards.
It also mocks that weird argument from the DOJ that said because Apple "licenses" rather than "sells" its software, that means Apple is more closely tied to the case:
The government discusses Apple’s software licensing and data policies at length, equating Apple to a feudal lord demanding fealty from its customers (“suzerainty”). ... But the government does not cite any authority, and none exists, suggesting that the design features and software that exist on every iPhone somehow link Apple to the subject phone and the crime. Likewise, the government has cited no case holding that a license to use a product constituted a sufficient connection under New York Telephone. Indeed, under the government’s theory, any ongoing postpurchase connection between a manufacturer or service provider and a consumer suffices to connect the two in perpetuity—even where, as here, the data on the iPhone is inaccessible to Apple.
From there, Apple dives in on the question of how much of a "burden" this would be. This is the issue that Judge Pym has indicated she's most interested in, and Apple goes deep here -- again and again focusing on how the DOJ was blatantly misleading in its motion:
Forcing Apple to create new software that degrades its security features is unprecedented and unlike any burden ever imposed under the All Writs Act. The government’s assertion that the phone companies in Mountain Bell and In re Application of the U.S. for an Order Authorizing the Installation of a Pen Register or Touch-Tone Decoder and a Terminating Trap ..., were conscripted to “write” code, akin to the request here... mischaracterizes the actual assistance required in those cases. The government seizes on the word “programmed” in those cases and superficially equates it to the process of creating new software..... But the “programming” in those cases—back in 1979 and 1980—consisted of a “technician” using a “teletypewriter” in Mountain Bell ..., and “t[ook] less than one minute” in Penn Bell... Indeed, in Mountain Bell, the government itself stated that the only burden imposed “was a large number of print-outs on the teletype machine”—not creating new code..... More importantly, the phone companies already had and themselves used the tracing capabilities the government wanted to access.... And although relying heavily on Mountain Bell, the government neglects to point out the court’s explicit warning that “[t]his holding is a narrow one, and our decision today should not be read to authorize the wholesale imposition upon private, third parties of duties pursuant to search warrants.” ...This case stands light years from Mountain Bell. The government seeks to commandeer Apple to design, create, test, and validate a new operating system that does not exist, and that Apple believes—with overwhelming support from the technology community and security experts—is too dangerous to create.
Seeking to belittle this widely accepted policy position, the government grossly mischaracterizes Apple’s objection to the requested Order as a concern that “compliance will tarnish its brand”..., a mischaracterization that both the FBI Director and the courts have flatly rejected. [See Comey] (“I don’t question [Apple’s] motive”);... (disagreeing “with the government’s contention that Apple’s objection [to being compelled to decrypt an iPhone] is not ‘conscientious’ but merely a matter of ‘its concern with public relations’”). As Apple explained in its Motion, Apple prioritizes the security and privacy of its users, and that priority is reflected in Apple’s increasingly secure operating systems, in which Apple has chosen not to create a back door.
Apple also calls out the DOJ's technical ignorance.
The government’s assertion that “there is no reason to think that the code Apple writes in compliance with the Order will ever leave Apple’s possession” ... simply shows the government misunderstands the technology and the nature of the cyber-threat landscape. As Apple engineer Erik Neuenschwander states:
I believe that Apple’s iOS platform is the most-attacked software platform in existence. Each time Apple closes one vulnerability, attackers work to find another. This is a constant and never-ending battle. Mr. Perino’s description of third-party efforts to circumvent Apple’s security demonstrates this point. And the protections that the government now asks Apple to compromise are the most security-critical software component of the iPhone—any vulnerability or back door, whether introduced intentionally or unintentionally, can represent a risk to all users of Apple devices simultaneously.
... The government is also mistaken in claiming that the crippled iOS it wants Apple to build can only be used on one iPhone:
Mr. Perino’s characterization of Apple’s process . . . is inaccurate. Apple does not create hundreds of millions of operating systems each tailored to an individual device. Each time Apple releases a new operating system, that operating system is the same for every device of a given model. The operating system then gets a personalized signature specific to each device. This personalization occurs as part of the installation process after the iOS is created.
Once GovtOS is created, personalizing it to a new device becomes a simple process. If Apple were forced to create GovtOS for installation on the device at issue in this case, it would likely take only minutes for Apple, or a malicious actor with sufficient access, to perform the necessary engineering work to install it on another device of the same model.
. . . [T]he initial creation of GovtOS itself creates serious ongoing burdens and risks. This includes the risk that if the ability to install GovtOS got into the wrong hands, it would open a significant new avenue of attack, undermining the security protections that Apple has spent years developing to protect its customers.
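The declaration's point about personalization is worth making concrete: the OS build is identical for every device of a model, and only a final per-device signature differs, which is why retargeting GovtOS would be cheap. Here is a toy sketch of that build-once, sign-per-device flow. It assumes nothing about Apple's actual scheme (which uses asymmetric signatures tied to a device identifier via Apple's signing servers); the HMAC key, function names, and device IDs below are invented for illustration.

```python
import hashlib
import hmac

# Invented stand-in for the vendor's private signing key.
SIGNING_KEY = b"hypothetical-vendor-signing-key"

def personalize(os_image: bytes, device_id: str) -> bytes:
    """Bind one OS build to one device by signing (image hash + device ID).

    The expensive artifact -- the OS image -- is shared across all devices
    of a model; only this cheap signature is device-specific.
    """
    image_hash = hashlib.sha256(os_image).digest()
    return hmac.new(SIGNING_KEY, image_hash + device_id.encode(), hashlib.sha256).digest()

# One build, many devices: retargeting means re-signing, not rebuilding.
build = b"single OS build, same bytes for every device of this model"
sig_a = personalize(build, "DEVICE-ID-AAAA")
sig_b = personalize(build, "DEVICE-ID-BBBB")

assert sig_a != sig_b                                  # signatures are device-specific
assert personalize(build, "DEVICE-ID-AAAA") == sig_a   # and deterministic per device
```

The takeaway matches Neuenschwander's declaration: once the dangerous build exists, producing a new per-device signature is a minutes-long step, not a fresh engineering effort.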
And, not surprisingly, Apple angrily attacks the DOJ's misleading use of Apple's transparency report statements about responding to lawful government requests for information in China, pointing out how that's quite different from this situation:
Finally, the government attempts to disclaim the obvious international implications of its demand, asserting that any pressure to hand over the same software to foreign agents “flows from [Apple’s] decision to do business in foreign countries . . . .”. Contrary to the government’s misleading statistics ..., which had to do with lawful process and did not compel the creation of software that undermines the security of its users, Apple has never built a back door of any kind into iOS, or otherwise made data stored on the iPhone or in iCloud more technically accessible to any country’s government.... The government is wrong in asserting that Apple made “special accommodations” for China, as Apple uses the same security protocols everywhere in the world and follows the same standards for responding to law enforcement requests.
Apple also points out that the FBI appears to be contradicting itself as well:
Moreover, while they now argue that the FBI’s changing of the iCloud passcode—which ended any hope of backing up the phone’s data and accessing it via iCloud—“was the reasoned decision of experienced FBI agents”, the FBI Director himself admitted to Congress under oath that the decision was a “mistake”.... The Justice Department’s shifting, contradictory positions on this issue—first blaming the passcode change on the County, then admitting that the FBI told the County to change the passcode after the County objected to being blamed for doing so, and now trying to justify the decision in the face of Director Comey’s admission that it was a mistake—discredits any notion that the government properly exhausted all viable investigative alternatives before seeking this extraordinary order from this Court.
On the Constitutional questions, again Apple points out that the DOJ doesn't appear to understand what it's talking about:
The government begins its First Amendment analysis by suggesting that “[t]here is reason to doubt that functional programming is even entitled to traditional speech protections” ... , evincing its confusion over the technology it demands Apple create. Even assuming there is such a thing as purely functional code, creating the type of software demanded here, an operating system that has never existed before, would necessarily involve precisely the kind of expression of ideas and concepts protected by the First Amendment. Because writing code requires a choice of (1) language, (2) audience, and (3) syntax and vocabulary, as well as the creation of (4) data structures, (5) algorithms to manipulate and transform data, (6) detailed textual descriptions explaining what code is doing, and (7) methods of communicating information to the user, “[t]here are a number of ways to write code to accomplish a given task.”... As such, code falls squarely within the First Amendment’s protection, as even the cases cited by the government acknowledge...
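The brief's quoted observation that "[t]here are a number of ways to write code to accomplish a given task" is easy to demonstrate: even a trivial task admits multiple, equally valid expressions, and the differences are authorial choices rather than functional requirements. A minimal illustration (the task and function names are invented for this example):

```python
# Two functionally identical ways to sum the squares of the even numbers
# in a list -- the behavior is fixed, but naming, structure, and idiom
# are expressive choices made by the author.

def sum_even_squares_imperative(numbers):
    """Explicit loop: each step spelled out for the reader."""
    total = 0
    for n in numbers:
        if n % 2 == 0:
            total += n * n
    return total

def sum_even_squares_declarative(numbers):
    """Generator expression: the same logic stated declaratively."""
    return sum(n * n for n in numbers if n % 2 == 0)

data = [1, 2, 3, 4, 5, 6]
# Both express the same function: 4 + 16 + 36 = 56.
assert sum_even_squares_imperative(data) == sum_even_squares_declarative(data) == 56
```

Same input, same output, visibly different expression -- which is the core of Apple's claim that writing code involves protected expressive choices.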
Later, addressing the DOJ's claim that because Apple can write such code however it wants it's not compelled speech, Apple points out that the argument proves the exact opposite:
The government attempts to evade this unavoidable conclusion by insisting that, “[t]o the extent [that] Apple’s software includes expressive elements . . . the Order permits Apple to express whatever it wants, so long as the software functions” by allowing it to hack into iPhones.... This serves only to illuminate the broader speech implications of the government’s request. The code that the government is asking the Court to force Apple to write contains an extra layer of expression unique to this case. When Apple designed iOS 8, it consciously took a position on an issue of public importance.... The government disagrees with Apple’s position and asks this Court to compel Apple to write new code that reflects its own viewpoint—a viewpoint that is deeply offensive to Apple.
The filing is basically Apple, over and over again, saying, "uh, what the DOJ said was wrong, clueless, technically ignorant, or purposely misleading." Hell, they even attack the DOJ's claim that the All Writs Act was used back in 1807 to force Aaron Burr's secretary to decrypt one of Burr's cipher-protected letters. Apple points out that the DOJ is lying.
The government contends that Chief Justice Marshall once ordered a third party to “provide decryption services” to the government.... He did nothing of the sort, and the All Writs Act was not even at issue in Burr. In that case, Aaron Burr’s secretary declined to state whether he “understood” the contents of a certain letter written in cipher, on the ground that he might incriminate himself.... The Court held that the clerk’s answer as to whether he understood the cipher could not incriminate him, and the Court thus held that “the witness may answer the question now propounded”—i.e., whether he understood the letter.... The Court did not require the clerk to decipher the letter.
If anything, to be honest, I'm surprised that Apple didn't go even harder on the DOJ for misrepresenting things. Either way, Apple is pretty clearly highlighting just how desperate the DOJ seems in this case.
Sen. Lindsey Graham (R-S.C.), who last December called on Silicon Valley to stop selling encrypted devices, expressed serious concern on Wednesday about the precedent the Department of Justice would set if it successfully compels Apple to break iPhone security features.
“I was all with you until I actually started getting briefed by the people in the Intel Community,” Graham told Attorney General Loretta Lynch during an oversight hearing in the Senate Judiciary Committee. “I will say that I’m a person that’s been moved by the arguments about the precedent we set and the damage we might be doing to our own national security.”
This is what happens when legislators stop following their gut instincts on subjects they know little about and actually seek input from those who do know what's involved and what's at stake. Graham -- without speaking to "people in the Intel Community" -- originally presented terrorism as Apple's problem. With the benefit of technically adept hindsight, Graham now sees this for what it is: a push for a dangerous precedent that won't end with this one iPhone and Apple. It will move on to other manufacturers, service providers and communications platforms. Because this one iPhone (which is actually twelve iPhones) is just the foot in the door. Apple does not hold a monopoly on encrypted communications.
“One of the arguments Apple makes is that there are other companies that make encryption,” Graham said to Lynch during the hearing. “So from a terrorist point of view, you’re not limited to Apple’s iPhone to communicate are you?”
“I think the terrorists use any device they can to communicate,” the Attorney General responded.
“So this encryption issue, if you require Apple to unlock that phone that doesn’t deny terrorist the ability to communicate privately does it, there are others ways they can do this,” Graham noted.
The FBI -- which sees any communications it can't access as nothing more than a collection of smoking guns comprised of 0s and 1s -- will not stop with Apple. It already has its eyes on WhatsApp, one of the biggest messaging apps in the world -- one that also features end-to-end encryption.
The underlying point Graham is making -- having now spoken with those with the most at stake -- is that a successful push to force American companies to provide unprecedented access to law enforcement does little to stop global terrorism, while causing tremendous damage to those forced into complicity. If the FBI manages to pry open the front door, every other nation in the world is going to expect Apple to hold the door open for them as well. And if they can't find a way to force Apple to do that, they may block it from selling its products in their countries. Or Apple may decide the market isn't worth the security hit. Either way, it hurts Apple, and terrorists will just move on to the next service/platform/manufacturer.
It's heartening to see Graham come around on this, especially considering he's spent the last few months coming down harshly on phone manufacturers for refusing to immediately comply with every ridiculous government demand.
Not surprisingly, John Oliver's take is much clearer and much more accurate than many mainstream press reports on the issues in the case, appropriately mocking the many law enforcement officials who seem to think that, just because Apple employs smart engineers, they can somehow do the impossible and "safely" create a backdoor into an encrypted iPhone that won't have dangerous consequences. He even spends a bit of time reviewing the original Crypto Wars over the Clipper Chip and highlights cryptographer Matt Blaze's contribution in ending those wars by showing that the Clipper Chip could be hacked.
But the biggest contribution to the debate -- which I hope that people pay most attention to -- is the point that Oliver made in the end with his faux Apple commercial. Earlier in the piece, Oliver noted that this belief among law enforcement that Apple engineers can somehow magically do what they want is at least partially Apple's own fault, with its somewhat overstated marketing. So, Oliver's team made a "more realistic" Apple commercial which noted that Apple is constantly fighting security cracks and vulnerabilities and is consistently just half a step ahead of hackers with malicious intent (and, in many cases, half a step behind them).
This is the key point: Building secure products is very, very difficult, and even the most secure products have security vulnerabilities that need to be constantly watched and patched. And what the government is doing here is not only asking Apple to not patch a security vulnerability that it has found, but actively forcing Apple to create a new vulnerability and then effectively forcing Apple to keep it open. For all the talk of how Apple can just create the backdoor this once and throw it away, this is more like asking Apple to set off a bomb that blows the back off all the houses in a city, and then saying, "okay, just throw away the bomb after you set it off."
Hopefully, as in cases like net neutrality, Oliver's piece does its job of informing the public about what's really going on.
As recently as this past week, officials said, the Justice Department was discussing how to proceed in a continuing criminal investigation in which a federal judge had approved a wiretap, but investigators were stymied by WhatsApp’s encryption.
The Justice Department and WhatsApp declined to comment. The government officials and others who discussed the dispute did so on condition of anonymity because the wiretap order and all the information associated with it were under seal. The nature of the case was not clear, except that officials said it was not a terrorism investigation. The location of the investigation was also unclear.
And, as long as we're operating on hearsay and conjecture, there's also this:
“You’re getting useless data,” said Joseph DeMarco, a former federal prosecutor who now represents law enforcement agencies that filed briefs supporting the Justice Department in its fight with Apple. “The only way to make this not gibberish is if the company helps.”
“As we know from intercepted prisoner wiretaps,” he added, “criminals think that advanced encryption is great.”
You'd think that access to prisoner wiretaps would somewhat negate the need to break encryption, but maybe these mouthy inmates spend more time chatting about encryption than the allegations against them. And while I understand law enforcement's complaint that they used to be able to get all of this data with a warrant, they also used to have to run license plates by hand and perform stakeouts in person. So, it's not as though advances in technology have delivered no concurrent benefits.
Make no mistake about it: given the multitude of choices, the DOJ would rather have unfettered access to phones and all they contain. WhatsApp may have a billion or so users -- all protected by end-to-end encryption -- but if the FBI can crack open a phone, it can likely get to the content of the messages.
In the case of Amy Fletcher’s son Justin Bloxom, privacy advocates question whether phone evidence was critical to the cases. But Ms. Fletcher said: “Everything that was done was done through texts from a damn cell phone.”
“Had we not had that information, you wouldn’t realize how evil this man was,” said Ms. Fletcher, who didn’t know her son’s 2010 murder in Mansfield, La., had become part of the national debate until contacted by The Wall Street Journal.
There's no mention of WhatsApp in the Wall Street Journal's article, so it may be that all the recovered texts were of the SMS variety. But WhatsApp is supplanting SMS and the DOJ is definitely interested in the heavily-used messaging app. Last year, its requests to Facebook (which owns WhatsApp) for the contents of these messages jumped astronomically.
In the first six months of 2015, US law enforcement agencies sent Facebook 201 wiretap requests (referred to as “Title III” in the report) for 279 users or accounts. In all of 2014, on the other hand, Facebook only received 9 requests for 16 users or accounts.
Motherboard notes that this number, while still seemingly small, represents a 2133% increase. Not only that, but the total number of requests to Facebook for this data dwarfs similar requests from Google, which only saw 30 total for 2013-2014 combined.
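As a quick sanity check on that percentage, the math works out from the transparency-report figures quoted above (treating the full-year 2014 total as the baseline):

```python
# Sanity-check the reported jump in US wiretap ("Title III") requests
# to Facebook, using the figures quoted above.
requests_2014 = 9       # all of 2014
requests_h1_2015 = 201  # first six months of 2015

increase_pct = (requests_h1_2015 - requests_2014) / requests_2014 * 100
print(f"{increase_pct:.0f}%")  # → 2133%
```

Note the comparison is between six months of 2015 and all of 2014, so if anything the figure understates the rate of growth.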
The FBI and DOJ have yet to say much publicly about this particular case, probably feeling it's better to fight only one heavily-opposed battle at a time. But whatever the result of the Apple case, it will hardly be the end of the DOJ's efforts to force service providers to assist them in undermining their own protective efforts.
This is not all that surprising, but President Obama, during his SXSW keynote interview, appears to have joined the crew of politicians making misleading statements pretending to be "balanced" on the question of encryption. The interview (the link above should start at the very beginning) talks about a variety of issues related to tech and government, but eventually the President zeroes in on the encryption issue. The embed below should start at that point (if not, it's at the 1 hour, 16 minute mark in the video). Unfortunately, the interviewer, Evan Smith of the Texas Tribune, falsely frames the issue as one of "security v. privacy" rather than what it actually is -- which is "security v. security."
In case you can't watch that, the President says he won't comment directly on the Apple legal fights, but then launches into the standard politician talking point of "yes, we want strong encryption, but bad people will use it so we need to figure out some way to break in."
If you watch that, the President is basically doing the same thing as all the Presidential candidates, stating that there's some sort of equivalency on both sides of the debate and that we need to find some sort of "balanced" solution short of strong encryption that will somehow let in law enforcement in some cases.
This is wrong. This is ignorant.
To his at least marginal credit, the President (unlike basically all of the Presidential candidates) did seem to acknowledge the arguments of the crypto community, but then tells them all that they're wrong. In some ways, this may be slightly better than those who don't even understand the actual issues at all, but it's still problematic.
Let's go through this line by line.
All of us value our privacy. And this is a society that is built on a Constitution and a Bill of Rights and a healthy skepticism about overreaching government power. Before smartphones were invented, and to this day, if there is probable cause to think that you have abducted a child, or that you are engaging in a terrorist plot, or you are guilty of some serious crime, law enforcement can appear at your doorstep and say 'we have a warrant to search your home' and they can go into your bedroom to rifle through your underwear to see if there's any evidence of wrongdoing.
Again, this is overstating the past and understating today's reality. Yes, you could always get a warrant to go "rifle through" someone's underwear, if you could present probable cause that such a search was reasonable to a judge. But that does not mean that the invention of smartphones really changed things so dramatically as President Obama presents here. For one, there has always been information that was inaccessible -- such as information that came from an in-person conversation or information in our brains or information that has been destroyed.
In fact, as lots of people have noted, today law enforcement has much more recorded evidence that it can obtain, totally unrelated to the encryption issue. This includes things like location information or information on people you called. That information used to not be available at all. So it's hellishly misleading to pretend that we've entered some new world of darkness for law enforcement when the reality is that the world is much, much brighter.
And we agree on that. Because we recognize that just like all our other rights, freedom of speech, freedom of religion, etc. there are going to be some constraints that we impose in order to make sure that we are safe, secure and living in a civilized society. Now technology is evolving so rapidly that new questions are being asked. And I am of the view that there are very real reasons why we want to make sure that government cannot just willy nilly get into everyone's iPhones, or smartphones, that are full of very personal information and very personal data. And, let's face it, the whole Snowden disclosure episode elevated people's suspicions of this.
That was a real issue. I will say, by the way, that -- and I don't want to go too far afield -- but the Snowden issue vastly overstated the dangers to US citizens in terms of spying. Because the fact of the matter is that actually our intelligence agencies are pretty scrupulous about US persons -- people on US soil. What those disclosures did identify were excesses overseas with respect to people who are not in this country. A lot of those have been fixed. Don't take my word for it -- there was a panel that was constituted that just graded all the reforms that we set up to avoid those charges. But I understand that that raised suspicions.
Again, at least some marginal kudos for admitting that this latest round was brought on by "excesses" (though we'd argue that it was actually unconstitutional, rather than mere overreach). And nice of him to admit that Snowden actually did reveal such "excesses." Of course, that raises a separate question: Why is Obama still trying to prosecute Snowden when he's just admitted that what Snowden did was clearly whistleblowing, in revealing questionable spying?
Also, the President is simply wrong that it was just about issues involving non-US persons. The major reform that has taken place wasn't about US persons at all, but rather about Section 215 of the PATRIOT Act, which was used almost entirely on US persons to collect all their phone records. So it's unclear why the President is pretending otherwise. The stuff outside of the US is governed by Executive Order 12333, and there's been completely no evidence that the President has changed that at all. I do agree, to some extent, that many do believe in an exaggerated view of NSA surveillance, and that's distracting. But the underlying issues about legality and constitutionality -- and the possibilities for abuse -- absolutely remain.
But none of that actually has to do with the encryption fight, beyond the recognition -- accurately -- that the government's actions, revealed by Snowden, caused many to take these issues more seriously. And, on that note, it would have been at least a little more accurate for the President to recognize that it wasn't Snowden who brought this on the government, but the government itself by doing what it was doing.
So we're concerned about privacy. We don't want government to be looking through everybody's phones willy-nilly, without any kind of oversight or probable cause or a clear sense that it's targeted who might be a wrongdoer.
What makes it even more complicated is that we also want really strong encryption. Because part of us preventing terrorism or preventing people from disrupting the financial system or our air traffic control system or a whole other set of systems that are increasingly digitalized is that hackers, state or non-state, can just get in there and mess them up.
So we've got two values. Both of which are important.... And the question we now have to ask is, if technologically it is possible to make an impenetrable device or system where the encryption is so strong that there's no key. There's no door at all. Then how do we apprehend the child pornographer? How do we solve or disrupt a terrorist plot? What mechanisms do we have available to even do simple things like tax enforcement? Because if, in fact, you can't crack that at all, government can't get in, then everybody's walking around with a Swiss bank account in their pocket. So there has to be some concession to the need to be able to get into that information somehow.
The answer to those questions in that final paragraph is good old fashioned detective work. In a time before smartphones, detectives were still able to catch child pornographers or disrupt terrorist plots. And, in some cases, the government failed to stop either of those things. But it wasn't because strong encryption stymied them; it's because there are always going to be some plots that people are able to get away with. We shouldn't undermine our entire security setup just because there are some bad people out there. In fact, that makes us less safe.
Also: tax enforcement? Tax enforcement? Are we really getting to the point that the government wants to argue that we need to break strong encryption to better enforce taxes? Really? Again, there are lots of ways to go after tax evasion. And, yes, there are lots of ways that people and companies try to hide money from the IRS. And sometimes they get away with it. To suddenly say that we should weaken encryption because the IRS isn't good enough at its job just seems... crazy.
Now, what folks who are on the encryption side will argue, is that any key, whatsoever, even if it starts off as just being directed at one device, could end up being used on every device. That's just the nature of these systems. That is a technical question. I am not a software engineer. It is, I think, technically true, but I think it can be overstated.
This is the part that's most maddening of all. He almost gets the point right. He almost understands. The crypto community has been screaming from the hills for ages that introducing any kind of third party access to encryption weakens it for all, introducing vulnerabilities that ensure that those with malicious intent will get in much sooner than they would otherwise. The President is mixing up that argument with one of the other arguments in the Apple/FBI case, about whether it's about "one phone" or "all the phones."
But even assuming this slight mixup is a mistake, and that he does recognize the basics of the arguments from the tech community, to have him then say that this "can be overstated" is crazy. A bunch of cryptography experts -- including some who used to work for Obama -- laid out in a detailed paper the risks of undermining encryption. To brush that aside as some sort of rhetorical hyperbole -- to brush aside the realities of cryptography and math -- is just crazy.
Encryption expert Matt Blaze (whose research basically helped win Crypto War 1.0) responded to this argument by noting that the "nerd harder, nerds" argument fundamentally misunderstands the issue:
Figuring out how to build the reliable, secure systems required to "compromise" on crypto has long been a central problem in CS.
If you can't read that, Blaze is basically saying that all crypto includes backdoors -- they're known as vulnerabilities. And the key focus in crypto is closing those backdoors, because leaving them open is disastrous. And yet the government is now demanding that tech folks purposely put in more backdoors and not close them, without recognizing the simple fact that vulnerabilities in crypto always lead to disastrous results.
So the question now becomes that, we as a society, setting aside the specific case between the FBI and Apple, setting aside the commercial interests, the concerns about what could the Chinese government do with this, even if we trust the US government. Setting aside all those questions, we're going to have to make some decisions about how do we balance these respective risks. And I've got a bunch of smart people, sitting there, talking about it, thinking about it. We have engaged the tech community, aggressively, to help solve this problem. My conclusions so far is that you cannot take an absolutist view on this. So if your argument is "strong encryption no matter what, and we can and should in fact create black boxes," that, I think, does not strike the kind of balance that we have lived with for 200, 300 years. And it's fetishizing our phones above every other value. And that can't be the right answer.
This is not an absolutist view. It is not an absolutist view to say that anything you do to weaken the security of phones creates disastrous consequences for overall security, far beyond the privacy of individuals holding those phones. And, as Julian Sanchez rightly notes, it's ridiculous that it's the status quo on the previous compromise that is now being framed as an "absolutist" position:
CALEA--with obligations on telecoms to assist, but user-side encryption protected--WAS the compromise. Now that's "absolutism".
Also, the idea that this is about "fetishizing our phones" is ridiculous. No one is even remotely suggesting that. No one is even suggesting -- as Obama hints -- that this is about making phones "above and beyond" what other situations are. It's entirely about the nature of computer security and how it works. It's about the risks to our security in creating deliberate vulnerabilities in our technologies. To frame that as "fetishizing our phones" is insulting.
There's a reason why the NSA didn't want President Obama to carry a Blackberry when he first became President. And there's a reason the President wanted a secure Blackberry. And it's not because of fetishism in any way, shape or form. It's because securing data on phones is freaking hard and it's a constant battle. And anything that weakens the security puts people in harm's way.
I suspect that the answer is going to come down to how do we create a system where the encryption is as strong as possible. The key is as secure as possible. It is accessible by the smallest number of people possible for a subset of issues that we agree are important. How we design that is not something that I have the expertise to do. I am way on the civil liberties side of this thing. Bill McRaven will tell you that I anguish a lot over the decisions we make over how to keep this country safe. And I am not interested in overthrowing the values that have made us an exceptional and great nation, simply for expediency. But the dangers are real. Maintaining law and order and a civilized society is important. Protecting our kids is important.
You suspect wrong. Because while your position sounds reasonable and "balanced" (and I've seen some in the press describe President Obama's position here as "realist"), it's actually dangerous. This is the problem. The President is discussing this like it's a political issue rather than a technological/math issue. People aren't angry about this because they're "extremists" or "absolutists" or people who "don't want to compromise." They're screaming about this because "the compromise" solution is dangerous. If there really were a way to have strong encryption with a secure key where only a small number of people could get in on key issues, then that would be great.
But the key point that all of the experts keep stressing is: that's not reality. So, no the President's not being a "realist." He's being the opposite.
So I would just caution against taking an absolutist perspective on this. Because we make compromises all the time. I haven't flown commercial in a while, but my understanding is that it's not great fun going through security. But we make the concession because -- it's a big intrusion on our privacy -- but we recognize that it is important. We have stops for drunk drivers. It's an intrusion. But we think it's the right thing to do. And this notion that somehow our data is different and can be walled off from those other trade-offs we make, I believe is incorrect.
Again, this is not about "making compromises" or some sort of political perspective. And the people arguing for strong encryption aren't being "absolutist" about it because they're unwilling to compromise. They're saying that the "compromise" solution means undermining the very basis of how we do security and putting everyone at much greater risk. That's ethically horrific.
And, also, no one is saying that "data is different." There has always been information that is "walled off." What people are saying is that one consequence of strong encryption is that it has to mean that law enforcement is kept out of that information too. That does not mean they can't solve crimes in other ways. It does not mean that they don't get access to lots and lots of other information. It just means that this kind of content is harder to access, because we need it to be harder to access to protect everyone.
It's not security v. privacy. It's security v. security, where the security the FBI is fighting for is to stop the 1 in a billion attack and the security everyone else wants is to prevent much more likely and potentially much more devastating attacks.
Meanwhile, of all the things for the President to cite as an analogy, TSA security theater may be the worst. Very few people think it's okay, especially since it's been shown to be a joke. Setting that up as the precedent for breaking strong encryption is... crazy. And, on top of that, using the combination of TSA security and DUI checkpoints as evidence for why we should break strong encryption with backdoors again fails to recognize the issue at hand. Neither of those undermine an entire security setup.
We do have to make sure, given the power of the internet and how much our lives are digitalized, that it is narrow and that it is constrained and that there's oversight. And I'm confident this is something that we can solve, but we're going to need the tech community, software designers, people who care deeply about this stuff, to help us solve it. Because what will happen is, if everybody goes to their respective corners, and the tech community says "you know what, either we have strong perfect encryption, or else it's Big Brother and Orwellian world," what you'll find is that after something really bad happens, the politics of this will swing and it will become sloppy and rushed and it will go through Congress in ways that have not been thought through. And then you really will have dangers to our civil liberties, because the people who understand this best, and who care most about privacy and civil liberties have disengaged, or have taken a position that is not sustainable for the general public as a whole over time.
I have a lot of trouble with the President's line about everyone going to "their respective corners," as it suggests a ridiculous sort of tribalism in which the natural state is the tech industry against the government and even suggests that the tech industry doesn't care about stopping terrorism or child pornographers. That, of course, is ridiculous. It's got nothing to do with "our team." It has to do with the simple realities of encryption and the fact that what the President is suggesting is dangerous.
Furthermore, it's not necessarily the "Orwellian/big brother" issue that people are afraid of. That's a red herring from the "privacy v. security" mindset. People are afraid of this making everyone a lot less safe. No doubt, the President is right that if there's "something really bad" happening then the politics moves in one way -- but it's pretty ridiculous for him to be saying that, seeing as the latest skirmish in this battle is being fought by his very own Justice Department, and that he's the one who jumped on the San Bernardino attacks as an excuse to push this line of argument.
If the President is truly worried about stupid knee-jerk reactions following "something bad" happening, rather than trying to talk about "balance" and "compromise," he could and should be doing more to fairly educate the American public, and to make public statements about this issue and how important strong encryption is. Enough of this bogus "strong encryption is important, but... the children" crap. The children need strong encryption. The victims of crimes need encryption. The victims of terrorists need encryption. Undermining all that because just a tiny bit of information is inaccessible to law enforcement is crazy. It's giving up the entire ballgame to those with malicious intent, just so that we can have a bit more information in a few narrow cases.
President Obama keeps mentioning trade-offs, but it appears that he refuses to actually understand the trade-offs at issue here. Giving up on strong encryption is not about finding a happy middle compromise. Giving up on strong encryption is putting everyone at serious risk.
We've written a few times now about the somewhat bizarre Matthew Keys case. While he still denies having done anything, he has been found guilty under the CFAA for sharing the login information to the Tribune Company's computer systems, which apparently resulted in someone hacking a story on the LA Times website. The hack was nonsensical and lasted for all of about 40 minutes. There's no indication that this bit of vandalism did any actual harm -- or even that very many people saw it. And yet... the Feds had to work overtime to figure out how to turn this minor bit of vandalism (which everyone agrees Keys did not actually do directly) into nearly $1 million in damages (thanks to emails that the Tribune Company says were worth $200+ each, and random claims about "ratings declines" due to a separate incident involving Keys and the Tribune-owned TV station Keys used to work for).
The United States recommends that the Court impose a sentence of sixty months imprisonment. The Probation Department’s recommendation of eighty-seven months is reasonable and the best way to promote sentencing uniformity. But this prosecutor has been with this case since its inception in 2010 and submits that a five-year sentence is sufficient, but not greater than necessary, to comply with the purposes of sentencing. A sentence of five years imprisonment reflects Keys’s culpability and places his case appropriately among those of other white collar criminals who do not accept responsibility for their crimes.
Even assuming that the DOJ's claims about what Keys did are entirely accurate, Keys comes off as an immature jackass, rather than some sort of criminal hacking mastermind. It's difficult to see how that is worth five years in jail. Much of the DOJ's reasoning is because Keys has continued to assert his innocence. And while the DOJ does seem to have a fair amount of evidence that Keys did some fairly stupid and immature stuff, insisting he is innocent shouldn't be a reason to lock him up longer. The DOJ keeps going back to some stupid stuff Keys was accused of doing with a database of information he apparently had access to from the TV station -- including emailing people in the TV station's database with misleading messages -- but even the DOJ admits that this wasn't what the trial was about, and much of the evidence related to it was "redacted before it went to the jury" because it "created a substantial risk that the jury might read it and return a verdict based on something other than the elements of the offense."
So the DOJ recognizes that... but still argues he should be sentenced for those very same things that are not elements of the offense he's been charged with.
“Worried” would be an understatement for the emotions of at least one Fox 40 viewer. The Court will recall that Mercer told “Cancer Man” that Mercer had just talked down a tearful elderly woman who had been having a “panic attack” over the emails while her husband was in kidney failure. The Court ordered this and Keys’s reaction redacted before it went to the jury. The Rule 403 exclusion meant that the Court thought the unredacted email would have created a substantial risk that the jury might read it and return a verdict based on something other than the elements of the offense. The Government respects the Court’s ruling. The way Keys reacted shows he is a different and worse kind of person. Keys ridiculed the station for “reporting to the old folks home” and casually passed judgment on the poor woman for having her priorities “in the wrong place there.” ... Keys’s haughty, cold reaction to that woman’s suffering was the other reason that Mercer said he came to take a personal interest in the outcome of this case.... The Court should sentence to reflect the characteristics of the Defendant.... Keys’s characteristics include narcissism and an arrogant indifference to the suffering of innocent and vulnerable people.
Again, from all of this, Keys clearly comes off as an immature jackass -- but that's not necessarily a reason to lock someone up.
Meanwhile, Keys' lawyer has filed a much longer sentencing memorandum arguing for no jail time at all for Keys, still arguing that Keys' actions were all part of an investigative reporting effort. Considering that the court has pretty much already rejected this line of thinking, I'm not sure it's going to be very effective here either. There's a huge section in the memorandum about Keys' history working in journalism (going back to school), most of which I'm guessing the court will ignore as well -- though the detailed explanations of his more recent investigative reporting does act as something of a counterbalance to the DOJ presenting him as nothing more than a petty internet vandal, annoyed at his former employer.
Despite his indictment, Matthew continued to report on matters of crucial public interest, bringing to light important facts on critical matters that, without his reporting, may never have seen the light of day. Taken as a whole, his commitment to journalism also demonstrates a commitment to public service. At a time when other journalists concern themselves with which burrito restaurant a presidential candidate patronizes or the numerous antics of a real estate mogul-turned-politician, someone who has dedicated serious personal and professional effort, sometimes at his own considerable expense, to research and publish impactful stories on topics that matter to the public should not be incarcerated. If he were to be sentenced to any prison term, people in positions of authority will go unchecked and stories of public importance will go untold.
Frankly, I find this stuff to be about as relevant as the stuff about Keys being kind of a jerk to his former employer. Neither thing is at issue in this case. So it shouldn't be reflected in the sentencing either.
What seems much more relevant is discussions about people who actually were breaking into computers and doing forms of computer vandalism... and who weren't penalized nearly as much as the DOJ is seeking for Keys, who is charged with just handing over a login and encouraging people to hack stuff (which only resulted in very minor vandalism).
But the biggest issue of all is the fact that, despite all of this, no one has ever gone after the actual hacker who did the vandalism, who goes by the name "sharpie." UK officials apparently know who sharpie is and told the FBI (it's someone in Scotland), and yet, still, no one has done anything about it. If what Keys did is deserving of five years in jail, why does no one even think about going after the person who actually made the edits to the Tribune website?
George David “Sharpie”. Sharpie was the individual who actually accessed the Tribune Company's CMS and caused the damage Matthew was convicted for. Sharpie was never charged on either side of the Atlantic. He was visited once at his home in Scotland by the FBI and Scotland Yard. He spoke to them and that was the last of his contact with this case.
Either way, this case, yet again, demonstrates the ridiculousness of the Computer Fraud and Abuse Act (CFAA). Even if we accept that Keys did some immature things, this case is about a minor vandalism of a website. Hell, many years back, someone hacked into Techdirt and did much more serious vandalism (deleting the most recent 10 stories and all the comments), and if whoever did that was ever found, I wouldn't even want them sent to jail at all. That seems like a pretty extreme punishment for what honestly amounts to little more than internet graffiti.
If I had to guess, I'd predict that the judge will side with the DOJ, because that's what judges quite frequently do. The DOJ has done a good job distracting from the actual issues involved in this case and focusing it on other, unrelated issues, while painting Keys as something of a jerk. But on the actual issue of the CFAA, the whole thing seems like a massive stretch. Unfortunately, I think Keys' lawyer's own filing is somewhat weak. It should have focused much more clearly on a few issues, rather than overloading it with what feels like a rambling attempt to throw every possible idea at the wall to reduce the jail term. His lawyer correctly notes that if the DOJ's focus is on "deterrence," that has already happened. Keys was fired from his job at Reuters, and no major news organization will hire him these days. That seems like plenty of deterrence for his activities. What, exactly, is five years in jail going to do at this point?
It must be admitted that the Apple/FBI fight over iPhone encryption has had much more "outside the courtroom" drama than most cases -- what with both sides putting out their own blog posts and commenting publicly at length on various aspects. But things have been taken up a notch, it seems, with the latest. We wrote about the DOJ's crazy filing in the case, which is just chock full of incredibly misleading claims. Most of the time, when we call out misleading claims in lawsuits, the various parties stay quiet about it. But this one was apparently so crazy that Apple's General Counsel Bruce Sewell called a press conference where he just blasted the DOJ through and through. It's worth looking at his whole statement (highlights by me):
First, the tone of the brief reads like an indictment. We've all heard Director Comey and Attorney General Lynch thank Apple for its consistent help in working with law enforcement. Director Comey's own statement that "there are no demons here." Well, you certainly wouldn't conclude it from this brief. In 30 years of practice I don't think I've seen a legal brief that was more intended to smear the other side with false accusations and innuendo, and less intended to focus on the real merits of the case.
For the first time we see an allegation that Apple has deliberately made changes to block law enforcement requests for access. This should be deeply offensive to everyone that reads it. An unsupported, unsubstantiated effort to vilify Apple rather than confront the issues in the case.
Or the ridiculous section on China where an AUSA, an officer of the court, uses unidentified Internet sources to raise the spectre that Apple has a different and sinister relationship with China. Of course that is not true, and the speculation is based on no substance at all.
To do this in a brief before a magistrate judge just shows the desperation that the Department of Justice now feels. We would never respond in kind, but imagine Apple asking a court if the FBI could be trusted "because there is this real question about whether J. Edgar Hoover ordered the assassination of Kennedy — see ConspiracyTheory.com as our supporting evidence."
We add security features to protect our customers from hackers and criminals. And the FBI should be supporting us in this because it keeps everyone safe. To suggest otherwise is demeaning. It cheapens the debate and it tries to mask the real and serious issues. I can only conclude that the DoJ is so desperate at this point that it has thrown all decorum to the winds....
We know there are great people in the DoJ and the FBI. We work shoulder to shoulder with them all the time. That's why this cheap shot brief surprises us so much. We help when we're asked to. We're honest about what we can and cannot do. Let's at least treat one another with respect and get this case before the American people in a responsible way. We are going before court to exercise our legal rights. Everyone should beware because it seems like disagreeing with the Department of Justice means you must be evil and anti-American. Nothing could be further from the truth.
Somehow, I don't think Apple and the DOJ will be exchanging holiday cards this year. Apple's reply brief is due on Tuesday. I imagine it'll be an interesting weekend in Cupertino.
The Justice Department has now filed its response to Apple's motion to vacate the order forcing it to undermine the security features of Syed Farook's work iPhone. It's... quite a piece of work. The DOJ is pulling out all the stops in this one, and it seems to be going deeper and deeper into the ridiculous as it does so. Of course, it repeats many of the arguments in its earlier filings (both its original application for the All Writs Act order and its Motion to Compel -- which even the judge told the DOJ she didn't think it should file). For example, it continues to assert that this should be judged on the "three-factor test" that it made up from a Supreme Court decision that doesn't actually contain a three-factor test.
But the crux of the DOJ's argument is basically "how dare Apple make a warrant-proof phone" and thus it's Apple's fault that they haven't made it easy for the FBI to get what it wants. This argument is bonkers on many levels. Let's dig in:
By Apple’s own reckoning, the corporation—which grosses hundreds of billions of dollars a year—would need to set aside as few as six of its 100,000 employees for perhaps as little as two weeks. This burden, which is not unreasonable, is the direct result of Apple’s deliberate marketing decision to engineer its products so that the government cannot search them, even with a warrant.
This is a purposeful misrepresentation. The judge has made clear that the key issue she's concerned with is whether the DOJ's request represents an "unreasonable burden" on Apple -- "burden" being the only actual test laid out in the US v. NY Telephone case the DOJ keeps pointing to. But Apple didn't present those time and manpower figures to argue that the resources themselves are the unreasonable burden; the burden is the potential impact on the safety and security of its customers. The time involved is not the issue, but of course, the DOJ pretends it is.
Second, the DOJ continues its ridiculous insistence that making your products safe and secure is a "deliberate marketing decision" -- which somehow makes it offensive in some way. Apple didn't engineer its products "so that the government cannot search them"; it engineered them so that your information is safe and secure from anyone, including criminals. You would think that law enforcement officials in the FBI and DOJ would appreciate more secure devices that reduce crime. There was a time when they did. To sneeringly suggest that better protecting the public is nothing more than a "marketing decision" is ridiculous. Hell, even if it were a "marketing decision," a big part of the reason "the market" wanted such features so badly is that the US government itself overstepped its bounds with mass surveillance.
The Court’s Order is modest. It applies to a single iPhone, and it allows Apple to decide the least burdensome means of complying. As Apple well knows, the Order does not compel it to unlock other iPhones or to give the government a universal “master key” or “back door.” It is a narrow, targeted order that will produce a narrow, targeted piece of software capable of running on just one iPhone, in the security of Apple’s corporate headquarters.
It has been explained -- at length -- by both Apple and various amicus briefs how ridiculous this is. Everyone -- including the FBI -- has now admitted that this case is almost entirely about the precedent, and that a win for the DOJ will inevitably mean a long line of local and federal law enforcement lining up outside Apple's headquarters in Cupertino with court orders in hand, demanding that Apple help them crack into iPhones. That's a big deal. It also sets a precedent even beyond Apple: that companies can be forced to deliberately (1) weaken security on their devices and services and (2) lie to the public about it by "signing" the compromised software as legit.
The government and the community need to know what is on the terrorist’s phone, and the government needs Apple’s assistance to find out.
Instead of complying, Apple attacked the All Writs Act as archaic, the Court’s Order as leading to a “police state,” and the FBI’s investigation as shoddy, while extolling itself as the primary guardian of Americans’ privacy.... Apple’s rhetoric is not only false, but also corrosive of the very institutions that are best able to safeguard our liberty and our rights: the courts, the Fourth Amendment, longstanding precedent and venerable laws, and the democratically elected branches of government.
Apple didn't attack the AWA as "archaic" so much as inapplicable in this situation. Once again, the DOJ is doing some serious misrepresentation in this filing (and we're just three paragraphs in).
This case—like the three-factor Supreme Court test on which it must be decided—is about specific facts, not broad generalities. Here, Apple deliberately raised technological barriers that now stand between a lawful warrant and an iPhone containing evidence related to the terrorist mass murder of 14 Americans. Apple alone can remove those barriers so that the FBI can search the phone, and it can do so without undue burden. Under those specific circumstances, Apple can be compelled to give aid. That is not lawless tyranny. Rather, it is ordered liberty vindicating the rule of law. This Court can, and should, stand by the Order. Apple can, and should, comply with it.
Three factors! Drink! And, yes, Apple put in place these "barriers" -- but not as barriers to the government; they're security for everyone. And there's a very big question, which the DOJ so desperately wishes to avoid with the mumbo jumbo above: whether a company can be forced to purposely write and sign code that deliberately undermines its own security features.
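The "sign" part of that question matters because iPhones will only boot and run software that carries Apple's cryptographic signature. Here's a minimal sketch of that gatekeeping logic -- using an HMAC as a simplified stand-in for the asymmetric (RSA/ECDSA) signatures real devices use, with all key material and build strings purely illustrative:

```python
import hashlib
import hmac

# Illustrative stand-in for the vendor's private signing key, which
# never leaves the vendor's control. (Real code signing uses asymmetric
# keys; HMAC is used here only to show the gatekeeping logic.)
VENDOR_SIGNING_KEY = b"hypothetical-private-signing-key"

def sign_firmware(image: bytes) -> bytes:
    # Only the holder of the signing key can produce a valid signature.
    return hmac.new(VENDOR_SIGNING_KEY, image, hashlib.sha256).digest()

def device_will_boot(image: bytes, signature: bytes) -> bool:
    # The device refuses any image whose signature does not verify --
    # which is why third-party or tampered firmware cannot simply be
    # loaded onto the phone without the vendor's cooperation.
    expected = hmac.new(VENDOR_SIGNING_KEY, image, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

official = b"official OS build (placeholder bytes)"
tampered = b"OS build with security checks removed (placeholder bytes)"

sig = sign_firmware(official)
print(device_will_boot(official, sig))   # True
print(device_will_boot(tampered, sig))   # False: signature doesn't match
```

This is exactly why the order needs Apple specifically: a weakened OS image written by anyone else simply won't run, so compelling Apple means compelling it to vouch for code built to defeat its own protections.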
In deciding New York Telephone, the Supreme Court directly confronted and expressly rejected the policy arguments Apple raises now. Like Apple, the telephone company argued: that Congress had not given courts the power to issue such an order in its prior legislation; that the AWA could not be read so broadly; that it was for Congress to decide whether to provide such authority; and that relying on the AWA was a dangerous step down a slippery slope ending in arbitrary police powers.
Once again, the DOJ is misrepresenting the issues at play both in this case and in NY Telephone. In that case, a key part of the SCOTUS decision was based on the fact that NY Telephone was a public utility and therefore had certain responsibilities. That's not true of Apple. The DOJ also misrepresents the Congressional situation, which is different here, in that Congress did pass a specific law in this area, CALEA, which explicitly says that Apple need not help in this situation. The All Writs Act is a "gap filling" law, for when Congress has not spoken. But on this issue, it has.
The Supreme Court’s approach to the AWA does not create an unlimited source of judicial power, as Apple contends. The Act is self-limiting because it can only be invoked in aid of a court’s jurisdiction. Here, that jurisdiction rests on a lawful warrant, issued by a neutral magistrate pursuant to Rule 41. And New York Telephone provides a further safeguard, not through bright-line rules but rather through three factors courts must consider before exercising their discretion: (1) how far removed a party is from the investigative need; (2) how unreasonable a burden would be placed on that party; and (3) how necessary the party’s assistance is to the government. This three-factor analysis respects Congress’s mandate that the Act be flexible and adaptable, while eliminating the concern that random citizens will be forcibly deputized.
The DOJ insists that CALEA's silence here doesn't matter, because CALEA, it argues, is all about what companies can be forced to do prior to a warrant, not after one is issued.
CALEA, passed in 1994, does not “meticulously,” “intricately,” or “specifically” address when a court may order a smartphone manufacturer to remove barriers to accessing stored data on a particular smartphone. Rather, it governs what steps telecommunications carriers involved in transmission and switching must take in advance of court orders to ensure their systems can isolate information to allow for the real-time interception of network communications
But of course, under that interpretation, the All Writs Act grants tremendous powers -- exactly the kinds of powers the DOJ insists elsewhere in this brief aren't at issue in this case. I don't see how the DOJ can have it both ways.
As Apple recognizes, this Court must consider three equitable factors: (1) how “far removed” Apple is “from the underlying controversy”; (2) how “unreasonable [a] burden” the Order would place on Apple; and (3) how “necessary” its assistance is to searching Farook’s iPhone.
Apple is not so far removed from the underlying controversy that it should be excused from assisting in the execution of the search warrant. In New York Telephone, the phone company was sufficiently close to the controversy because the criminals used its phone lines. See 434 U.S. at 174. The Court did not require that the phone company know criminals were using its phone lines, or that it be involved in the crime. See id. Here, as a neutral magistrate found, there is probable cause to believe that Farook’s iPhone contains evidence related to his crimes. That alone would be sufficient proximity under the AWA and New York Telephone, even if Apple did not also own and control the software on Farook’s iPhone.
But again, under such an interpretation, the AWA can be used to force basically any tech company to figure out ways to spy on users if the FBI comes calling and gets a magistrate judge to rubber stamp an order. That's... crazy. Just because they use your technology does not mean that you're somehow legally on the hook for helping the FBI investigate their usage.
As Apple’s business model and its representations to its investors and customers make clear, Apple intentionally and for commercial advantage retains exclusive control over the software that can be used on iPhones, giving it monopoly-like control over the means of distributing software to the phones. As detailed below, Apple does so by: (1) firmly controlling iPhones’ operating systems and first-party software; (2) carefully managing and vetting third-party software before authenticating it for use on iPhones; and (3) continually receiving information from devices running its licensed software and its proprietary services, and retaining continued access to data from those devices about how its customers are using them. Having established suzerainty over its users’ phones—and control over the precise features of the phones necessary for unlocking them—Apple cannot now pretend to be a bystander, watching this investigation from afar.
This is kind of an incredible argument when you think about it: because Apple makes sure that its devices have updated software to keep them safe from vulnerabilities, Apple is somehow connected to any use of the phone and responsible for helping the FBI crack into it. Does the FBI really want to encourage companies to stop offering any follow-on support for their software? Because that's the argument it's making here.
Thus, by its own design, Apple remains close to its iPhones through careful management and constant vigil over what software is on an iPhone and how that software is used. Indeed, Apple is much less “removed from the controversy”—in this case, the government’s inability to search Farook’s iPhone—than was the New York Telephone company because that company did not deliberately place its phone lines to prevent inconspicuous government access.... Here, Apple has deliberately used its control over its software to block law-enforcement requests for access to the contents of its devices, and it has advertised that feature to sell its products.
This argument is particularly maddening: basically continuing the ridiculous line of thinking that protecting user privacy is some sort of deliberate marketing strategy against the government, rather than in favor of protecting customers' own security and privacy.
And then we get even more maddening. In discussing the "burden" the DOJ literally tries to argue that if there is a burden, it's Apple's fault for designing a system so secure.
Apple is one of the richest and most tech-savvy companies in the world, and it is more than able to comply with the AWA order. Indeed, it concedes it can do so with relatively little effort. Even this modest burden is largely a result of Apple’s own decision to design and market a nearly warrant-proof phone.
This is monumentally misleading. The DOJ's whole premise is that Apple is deliberately trying to interfere with legal investigations. But that's bonkers. Apple is just trying to build a secure phone for its users -- and a natural and unavoidable consequence is that it becomes more difficult for law enforcement to get access to that info. But that's because the whole point of such security is to make it more difficult for everyone other than the phone's owner to get access, because that's how you protect them.
The DOJ is so vain it thinks Apple's security is all about them.
Then we get back to the lying:
Apple’s primary argument regarding undue burden appears to be that it should not be required to write any amount of code to assist the government.
Not really. Its primary argument is that the burden is in writing any amount of code that undermines the safety and security of its customers. That last part is kind of the important part. No wonder the DOJ ignores it.
Apple asserts that it would take six to ten employees two to four weeks to develop new code in order to carry out the Court’s Order.... Even taking Apple at its word, this is not an undue burden, especially given Apple’s vast resources and the government’s willingness to find reasonable compromises and provide reasonable reimbursement.
Apple is a Fortune 5 corporation with tremendous power and means: it has more than 100,000 full-time-equivalent employees and had an annual income of over $200 billion dollars in fiscal year 2015—more than the operating budget for California.... Indeed, Apple’s revenues exceed the nominal GDPs of two thirds of the world’s nations. To build the ordered software, no more than ten employees would be required to work for no more than four weeks, perhaps as little as two weeks.
Again, this is misleading (sense a theme?). First, as noted above, the "burden" is not so much in the time or engineers allotted to the task. Second, even if we accept the DOJ's numbers, they're misleading. Apple's filing said it would take that much effort just to create and test the initial code, and then noted -- quite rightly -- that if any problems arose in testing, as they almost certainly would, it would need to basically redo the process. Part of the point, which can slip by non-technical people who have never developed and deployed code, is that this process could take a long, long time, and involve a lot of effort, before it's actually safe to use on the actual phone.
Next up, the DOJ continues to insist that there can't possibly be any danger in creating this code, because Apple surely knows how to guard it, and further, that even if the code got out, that it wouldn't matter because it's asking for code that will only run on the Farook phone.
Next, contrary to Apple’s stated fears, there is no reason to think that the code Apple writes in compliance with the Order will ever leave Apple’s possession. Nothing in the Order requires Apple to provide that code to the government or to explain to the government how it works. And Apple has shown it is amply capable of protecting code that could compromise its security. For example, Apple currently protects (1) the source code to iOS and other core Apple software and (2) Apple’s electronic signature, which as described above allows software to be run on Apple hardware.... Those —which the government has not requested—are the keys to the kingdom. If Apple can guard them, it can guard this.
But, again, that leaves out the reality of testing this particular code and how that makes it much more likely the code will get out. This argument was presented in the amicus brief filed by iPhone forensics and security experts.
Next up, the DOJ totally misrepresents Apple's handling of information requests from the Chinese government. The DOJ is trying to argue, misleadingly, that Apple has no problem doing the same stuff for China, so its worry that this case would create a precedent for authoritarian regimes is nonsense. But it's the DOJ's argument that's truly nonsense:
Apple suggests that, as a practical matter, it will cease to resist foreign governments’ efforts to obtain information on iPhone users if this Court rules against it. It offers no evidence for this proposition, and the evidence in the public record raises questions whether it is even resisting foreign governments now. For example, according to Apple’s own data, China demanded information from Apple regarding over 4,000 iPhones in the first half of 2015, and Apple produced data 74% of the time.... Apple appears to have made special accommodations in China as well: for example, moving Chinese user data to Chinese government servers, and installing a different WiFi protocol for Chinese iPhones.... Such accommodations provide Apple with access to a huge, and growing, market.... This Court’s Order changes neither the carrots nor the sticks that foreign governments can use on Apple. Thus, it does not follow that if America forgoes Apple’s assistance in this terrorism investigation, Apple will refuse to comply with the demands of foreign governments. Nor does it follow that if the Court stands by its Order, Apple must yield to foreign demands, made in different circumstances without the safeguards of American law.
What the DOJ is referring to here is Apple's latest transparency report, in which Apple notes that it complied with 74% of government requests for information from China.
But again, Apple has always been willing to respond to legitimate government requests for information that it has access to. That's why that same report shows it complied with 81% of US requests as well. But that says absolutely nothing about a requirement to build a special system to hack in and access data that it does not currently have access to.
The rest of the China stuff, about servers and WAPI, is just the DOJ picking up on Stewart Baker's conspiracy theory from a few weeks back. Lots of countries (stupidly) demand local data storage, not necessarily for surveillance reasons, but because they think it's good for their economies. And Apple used WAPI because that was the standard used in China for WiFi-like wireless. As for the idea that Apple magically gave the Chinese government access: that makes no sense, given that Apple then had to fight off a man-in-the-middle attack against iCloud in China that was claimed to have originated from the Chinese government. If Apple had given it access, why would the government need to run a MitM attack? The whole argument makes no sense.
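The man-in-the-middle point is worth unpacking: a MitM attacker has to present its own certificate during the TLS handshake, and a client that "pins" the fingerprint of the legitimate certificate can detect the swap. A minimal sketch of that check, with placeholder bytes standing in for real certificates:

```python
import hashlib

# Hypothetical pinned certificate: in a real client this would be the
# DER-encoded certificate (or its public key) shipped with the app.
# These byte strings are placeholders, not real iCloud certificates.
LEGIT_CERT = b"-----legitimate server certificate (placeholder)-----"
PINNED_FINGERPRINT = hashlib.sha256(LEGIT_CERT).hexdigest()

def connection_is_trusted(presented_cert: bytes) -> bool:
    """Accept the connection only if the certificate presented during
    the TLS handshake matches the pinned fingerprint exactly."""
    return hashlib.sha256(presented_cert).hexdigest() == PINNED_FINGERPRINT

# A man-in-the-middle must substitute its own certificate, which cannot
# reproduce the pinned fingerprint -- so the attack is detectable.
mitm_cert = b"-----attacker-issued certificate (placeholder)-----"
print(connection_is_trusted(LEGIT_CERT))  # True
print(connection_is_trusted(mitm_cert))   # False: fingerprint mismatch
```

Which is the point: an attacker (state-sponsored or otherwise) who already had backdoor access to the data wouldn't need to sit on the wire impersonating the server in the first place.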
In the first half of 2015 alone, Apple handled 27,000 “device requests”—often covering multiple devices—and provided data approximately 60% of the time.... If Apple can provide data from thousands of iPhones and Apple users to China and other countries, it can comply with the AWA in America. (Id.) This is not speculation because, in fact, Apple complied for years with American court orders to extract data from passcode-locked iPhones, dedicating infrastructure and personnel in order to do so.
Again, that's different. That's about supplying information Apple had access to, not about writing code to undermine its own security features. Apples and oranges.
Finally, the DOJ mocks Apple's constitutional arguments on the First and Fifth Amendments.
Apple’s claim is particularly weak because it does not involve a person being compelled to speak publicly, but a for-profit corporation being asked to modify commercial software that will be seen only by Apple. There is reason to doubt that functional programming is even entitled to traditional speech protections....
To the extent Apple’s software includes expressive elements—such as variable names and comments—the Order permits Apple to express whatever it wants, so long as the software functions.
We're not "compelling" you to say this exactly, we're letting you say whatever you want... so long as it does what we want it to. That still seems like compelled speech, no?
Apple lastly asserts that the Order violates its Fifth Amendment right to due process. Apple is currently availing itself of the considerable process our legal system provides, and it is ludicrous to describe the government’s actions here as “arbitrary.”
Once again, it appears that many of the DOJ's arguments here are misleading in the extreme. Apple's response is due next week, and I imagine it will be quite a read as well.