Sen. Lindsey Graham (R-S.C.), who last December called on Silicon Valley to stop selling encrypted devices, expressed serious concern on Wednesday about the precedent the Department of Justice would set if it successfully compels Apple to break iPhone security features.
“I was all with you until I actually started getting briefed by the people in the Intel Community,” Graham told Attorney General Loretta Lynch during an oversight hearing in the Senate Judiciary Committee. “I will say that I’m a person that’s been moved by the arguments about the precedent we set and the damage we might be doing to our own national security.”
This is what happens when legislators stop following their gut instincts on subjects they know little about and actually seek input from those who do know what's involved and what's at stake. Graham -- without speaking to "people in the Intel Community" -- originally presented terrorism as Apple's problem. With the benefit of technically-adept hindsight, Graham is now seeing this for what it is: a push for a dangerous precedent that won't end with this one iPhone and Apple. It will move on to other manufacturers, service providers and communications platforms. Because this one iPhone (which is actually twelve iPhones) is just the foot in the door. Apple does not hold a monopoly on encrypted communications.
“One of the arguments Apple makes is that there are other companies that make encryption,” Graham said to Lynch during the hearing. “So from a terrorist point of view, you’re not limited to Apple’s iPhone to communicate are you?”
“I think the terrorists use any device they can to communicate,” the Attorney General responded.
“So this encryption issue: if you require Apple to unlock that phone, that doesn’t deny terrorists the ability to communicate privately, does it? There are other ways they can do this,” Graham noted.
The FBI -- which sees any communications it can't access as nothing more than a collection of smoking guns composed of 0s and 1s -- will not stop with Apple. It already has its eyes on WhatsApp, one of the biggest messaging apps in the world -- one that also features end-to-end encryption.
The underlying point Graham is making -- having now spoken with those with the most at stake -- is that a successful push to force American companies to provide unprecedented access to law enforcement does little to stop global terrorism, while causing tremendous damage to those forced into complicity. If the FBI manages to pry open the front door, every other nation in the world is going to expect Apple to hold the door open for them as well. And if they can't find a way to force Apple to do that, they may block it from selling its products in their countries. Or Apple may decide the market isn't worth the security hit. Either way, it hurts Apple, and terrorists will just move on to the next service/platform/manufacturer.
It's heartening to see Graham come around on this, especially considering he's spent the last few months coming down harshly on phone manufacturers for refusing to immediately comply with every ridiculous government demand.
One of the key lines that various supporters of backdooring encryption have repeated in the last year is that they "just want to have a discussion" about the proper way to... put backdoors into encryption. Over and over again you had the likes of James Comey insisting that he wasn't demanding backdoors, but really just wanted a "national conversation" on the issue (despite the fact that we had just such a conversation in the '90s and concluded: backdoors bad, let's move on):
My goal today isn’t to tell people what to do. My goal is to urge our fellow citizens to participate in a conversation as a country about where we are, and where we want to be, with respect to the authority of law enforcement.
And, yet, now we're having that conversation. Very loudly. And while the conversation really has been going on for almost two years, in the last month it moved from a conversation among tech geeks and policy wonks into the mainstream, thanks to the DOJ's decision to force Apple to write some code that would undermine security features on the work iPhone of Syed Farook, one of the San Bernardino attackers. According to some reports, the DOJ and FBI purposely chose this case in the belief that it was a perfect "test" case for their side: one that appeared to involve "domestic terrorists" who murdered 14 people. There were reports claiming that Apple was fine fighting this case under seal, but that the DOJ purposely chose to make this request public.
Officials had hoped the Apple case involving a terrorist’s iPhone would rally the public behind what they see as the need to have some access to information on smartphones. But many in the administration have begun to suspect that the F.B.I. and the Justice Department may have made a major strategic error by pushing the case into the public consciousness.
Many senior officials say an open conflict between Silicon Valley and Washington is exactly what they have been trying to avoid, especially when the Pentagon and intelligence agencies are trying to woo technology companies to come back into the government’s fold, and join the fight against the Islamic State. But it appears it is too late to confine the discussion to the back rooms in Washington or Silicon Valley.
While the various public polling on the issue has led to very mixed results, it's pretty clear that the public did not universally swing to the government's position on this. In fact, it appears that the more accurately the situation is described to the public, the more likely they are to side with Apple over the FBI. Given that, John Oliver's recent video on the subject certainly isn't good news for the DOJ.
Either way, the DOJ and FBI insisted they wanted a conversation on this, and now they're getting it. Perhaps they should have been more careful what they wished for.
Not surprisingly, Oliver's take is much clearer and much more accurate than many mainstream press reports on the issues in the case, appropriately mocking the many law enforcement officials who seem to think that, just because Apple employs smart engineers, they can somehow do the impossible and "safely" create a backdoor into an encrypted iPhone that won't have dangerous consequences. He even spends a bit of time reviewing the original Crypto Wars over the Clipper Chip and highlights cryptographer Matt Blaze's contribution in ending those wars by showing that the Clipper Chip could be hacked.
But the biggest contribution to the debate -- which I hope that people pay most attention to -- is the point that Oliver made in the end with his faux Apple commercial. Earlier in the piece, Oliver noted that this belief among law enforcement that Apple engineers can somehow magically do what they want is at least partially Apple's own fault, with its somewhat overstated marketing. So, Oliver's team made a "more realistic" Apple commercial which noted that Apple is constantly fighting security cracks and vulnerabilities and is consistently just half a step ahead of hackers with malicious intent (and, in many cases, half a step behind them).
This is the key point: Building secure products is very, very difficult, and even the most secure products have security vulnerabilities in them that need to be constantly watched and patched. And what the government is doing here is not only asking Apple to not patch a security vulnerability that it has found, but actively forcing Apple to create a new vulnerability and then effectively forcing Apple to keep it open. For all the talk of how Apple can just create the backdoor just this once and throw it away, this is more like asking Apple to set off a bomb that blows the backs off all the houses in a city, and then saying, "okay, just throw away the bomb after you set it off."
Hopefully, as in cases like net neutrality, Oliver's piece does its job in informing the public what's really going on.
Sheriff Grady Judd has also made news for falsely arresting and then publicly shaming men, calling them "sexual predators" and parading them in front of the press, seizing their money and possessions and then "negotiating" to give them back only some of what was seized. Oh, and then there was the time that Judd used Craigslist to help arrest prostitutes... but then blamed Craigslist for the problem.
"Let me tell you, the first time we do have trouble getting into a cell phone, we're going to seek a court order from Apple and when they deny us I'm going to go lock the CEO of Apple up," Judd said in a press conference Wednesday.
Another report of the press conference said that Judd followed this up, for emphasis, with: "I'll lock the rascal up."
Yeah, you see, that's not how the law actually works. And you'd think, as Sheriff, Judd should know that. But he doesn't. Or he does and he doesn't care. Neither of which is a good sign in a sheriff.
"You cannot create a business model to go, 'we're not paying attention to the federal judge or to the state judge, because we're above the law,'" Judd said.
Of course, that's not the issue at all. It's not about ignoring a judge, it's about building a secure product, and what kinds of things a court can or cannot force a company to do to the security of its products. No one is saying they're "above the law." Except, it seems, Sheriff Grady Judd, who thinks that he can put Apple's CEO in jail based on his own desires, rather than what the law actually says.
This is not all that surprising, but President Obama, during his SXSW keynote interview, appears to have joined the crew of politicians making misleading statements pretending to be "balanced" on the question of encryption. The interview (the link above should start at the very beginning) talks about a variety of issues related to tech and government, but eventually the President zeroes in on the encryption issue. The embed below should start at that point (if not, it's at the 1 hour, 16 minute mark in the video). Unfortunately, the interviewer, Evan Smith of the Texas Tribune, falsely frames the issue as one of "security v. privacy" rather than what it actually is -- which is "security v. security."
In case you can't watch that, the President says he won't comment directly on the Apple legal fights, but then launches into the standard politician talking point of "yes, we want strong encryption, but bad people will use it so we need to figure out some way to break in."
If you watch that, the President is basically doing the same thing as all the Presidential candidates, stating that there's some sort of equivalency on both sides of the debate and that we need to find some sort of "balanced" solution short of strong encryption that will somehow let in law enforcement in some cases.
This is wrong. This is ignorant.
To his at least marginal credit, the President (unlike basically all of the Presidential candidates) did seem to acknowledge the arguments of the crypto community, but then tells them all that they're wrong. In some ways, this may be slightly better than those who don't even understand the actual issues at all, but it's still problematic.
Let's go through this line by line.
All of us value our privacy. And this is a society that is built on a Constitution and a Bill of Rights and a healthy skepticism about overreaching government power. Before smartphones were invented, and to this day, if there is probable cause to think that you have abducted a child, or that you are engaging in a terrorist plot, or you are guilty of some serious crime, law enforcement can appear at your doorstep and say 'we have a warrant to search your home' and they can go into your bedroom to rifle through your underwear to see if there's any evidence of wrongdoing.
Again, this is overstating the past and understating today's reality. Yes, you could always get a warrant to go "rifle through" someone's underwear, if you could show a judge probable cause that such a search was reasonable. But that does not mean that the invention of smartphones really changed things as dramatically as President Obama presents here. For one, there has always been information that was inaccessible -- such as information that came from an in-person conversation, information in our brains, or information that has been destroyed.
In fact, as lots of people have noted, today law enforcement has much more recorded evidence that it can obtain, totally unrelated to the encryption issue. This includes things like location information or information on people you called. That information used to not be available at all. So it's hellishly misleading to pretend that we've entered some new world of darkness for law enforcement when the reality is that the world is much, much brighter.
And we agree on that. Because we recognize that just like all our other rights, freedom of speech, freedom of religion, etc. there are going to be some constraints that we impose in order to make sure that we are safe, secure and living in a civilized society. Now technology is evolving so rapidly that new questions are being asked. And I am of the view that there are very real reasons why we want to make sure that government cannot just willy nilly get into everyone's iPhones, or smartphones, that are full of very personal information and very personal data. And, let's face it, the whole Snowden disclosure episode elevated people's suspicions of this.
That was a real issue. I will say, by the way, that -- and I don't want to go too far afield -- but the Snowden issue vastly overstated the dangers to US citizens in terms of spying. Because the fact of the matter is actually that our intelligence agencies are pretty scrupulous about US persons -- people on US soil. What those disclosures did identify were excesses overseas with respect to people who are not in this country. A lot of those have been fixed. Don't take my word for it -- there was a panel that was constituted that just graded all the reforms that we set up to avoid those charges. But I understand that that raised suspicions.
Again, at least some marginal kudos for admitting that this latest round was brought on by "excesses" (though we'd argue that it was actually unconstitutional, rather than mere overreach). And nice of him to admit that Snowden actually did reveal such "excesses." Of course, that raises a separate question: Why is Obama still trying to prosecute Snowden when he's just admitted that what Snowden did was clearly whistleblowing, in revealing questionable spying?
Also, the President is simply wrong that it was just about issues involving non-US persons. The major reform that has taken place wasn't about non-US persons at all, but rather about Section 215 of the PATRIOT Act, which was used almost entirely on US persons to collect all their phone records. So it's unclear why the President is pretending otherwise. The stuff outside of the US is governed by Executive Order 12333, and there's no evidence at all that the President has changed that. I do agree, to some extent, that many believe in an exaggerated view of NSA surveillance, and that's distracting. But the underlying issues about legality and constitutionality -- and the possibilities for abuse -- absolutely remain.
But none of that actually has to do with the encryption fight, beyond the recognition -- accurately -- that the government's actions, revealed by Snowden, caused many to take these issues more seriously. And, on that note, it would have been at least a little more accurate for the President to recognize that it wasn't Snowden who brought this on the government, but the government itself by doing what it was doing.
So we're concerned about privacy. We don't want government to be looking through everybody's phones willy-nilly, without any kind of oversight or probable cause or a clear sense that it's targeted who might be a wrongdoer.
What makes it even more complicated is that we also want really strong encryption. Because part of us preventing terrorism or preventing people from disrupting the financial system or our air traffic control system or a whole other set of systems that are increasingly digitalized is that hackers, state or non-state, can just get in there and mess them up.
So we've got two values. Both of which are important.... And the question we now have to ask is, if technologically it is possible to make an impenetrable device or system where the encryption is so strong that there's no key. There's no door at all. Then how do we apprehend the child pornographer? How do we solve or disrupt a terrorist plot? What mechanisms do we have available to even do simple things like tax enforcement? Because if, in fact, you can't crack that at all, government can't get in, then everybody's walking around with a Swiss bank account in their pocket. So there has to be some concession to the need to be able to get into that information somehow.
The answer to those questions in that final paragraph is good old-fashioned detective work. In a time before smartphones, detectives were still able to catch child pornographers or disrupt terrorist plots. And, in some cases, the government failed to stop either of those things. But it wasn't because strong encryption stymied them; it's because there are always going to be some plots that people get away with. We shouldn't undermine our entire security setup just because there are some bad people out there. In fact, that makes us less safe.
Also: tax enforcement? Tax enforcement? Are we really getting to the point that the government wants to argue that we need to break strong encryption to better enforce taxes? Really? Again, there are lots of ways to go after tax evasion. And, yes, there are lots of ways that people and companies try to hide money from the IRS. And sometimes they get away with it. To suddenly say that we should weaken encryption because the IRS isn't good enough at its job just seems... crazy.
Now, what folks who are on the encryption side will argue, is that any key, whatsoever, even if it starts off as just being directed at one device, could end up being used on every device. That's just the nature of these systems. That is a technical question. I am not a software engineer. It is, I think, technically true, but I think it can be overstated.
This is the part that's most maddening of all. He almost gets the point right. He almost understands. The crypto community has been screaming from the hills for ages that introducing any kind of third party access to encryption weakens it for all, introducing vulnerabilities that ensure that those with malicious intent will get in much sooner than they would otherwise. The President is mixing up that argument with one of the other arguments in the Apple/FBI case, about whether it's about "one phone" or "all the phones."
But even assuming this slight mixup is a mistake, and that he does recognize the basics of the arguments from the tech community, to have him then say that this "can be overstated" is crazy. A bunch of cryptography experts -- including some who used to work for Obama -- laid out in a detailed paper the risks of undermining encryption. To brush that aside as some sort of rhetorical hyperbole -- to brush aside the realities of cryptography and math -- is just crazy.
Encryption expert Matt Blaze (whose research basically helped win Crypto War 1.0) responded to this argument by noting that the "nerd harder, nerds" argument fundamentally misunderstands the issue:
Figuring out how to build the reliable, secure systems required to "compromise" on crypto has long been a central problem in CS.
If you can't read that, Blaze is basically saying that all crypto includes backdoors -- they're known as vulnerabilities. And the key focus in crypto is closing those backdoors, because leaving them open is disastrous. And yet the government is now demanding that tech folks purposely put in more backdoors and not close them, without recognizing the simple fact that vulnerabilities in crypto always lead to disastrous results.
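To make Blaze's point a bit more concrete, here's a minimal sketch -- not Apple's design, not any real escrow proposal, just a toy illustration in Python using the `cryptography` package with entirely hypothetical key names -- of what "exceptional access" means structurally: the data key gets wrapped for the user and for an escrow holder, so the data is only ever as safe as the most exposed copy of that second key.

```python
from cryptography.fernet import Fernet

# Toy illustration only: a user key plus a hypothetical "exceptional access" escrow key.
user_key = Fernet.generate_key()
escrow_key = Fernet.generate_key()   # the new thing everyone must now protect forever

# The message is encrypted with a per-message data key.
data_key = Fernet.generate_key()
ciphertext = Fernet(data_key).encrypt(b"private message")

# Exceptional access = wrapping the data key for the user AND for the escrow holder.
wrapped_for_user = Fernet(user_key).encrypt(data_key)
wrapped_for_escrow = Fernet(escrow_key).encrypt(data_key)

# Anyone who ever gets escrow_key -- leak, theft, or a foreign court order --
# recovers the data key and the message, no matter how careful the user was.
recovered_key = Fernet(escrow_key).decrypt(wrapped_for_escrow)
assert Fernet(recovered_key).decrypt(ciphertext) == b"private message"
```

The point of the sketch is the last two lines: that extra wrapped copy is, functionally, just another vulnerability, and the entire engineering challenge Blaze describes is keeping that kind of path closed, not deliberately adding it.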
So the question now becomes that, we as a society, setting aside the specific case between the FBI and Apple, setting aside the commercial interests, the concerns about what could the Chinese government do with this, even if we trust the US government. Setting aside all those questions, we're going to have to make some decisions about how do we balance these respective risks. And I've got a bunch of smart people, sitting there, talking about it, thinking about it. We have engaged the tech community, aggressively, to help solve this problem. My conclusions so far is that you cannot take an absolutist view on this. So if your argument is "strong encryption no matter what, and we can and should in fact create black boxes," that, I think, does not strike the kind of balance that we have lived with for 200, 300 years. And it's fetishizing our phones above every other value. And that can't be the right answer.
This is not an absolutist view. It is not an absolutist view to say that anything you do to weaken the security of phones creates disastrous consequences for overall security, far beyond the privacy of individuals holding those phones. And, as Julian Sanchez rightly notes, it's ridiculous that the status quo -- the result of the previous compromise -- is now being framed as an "absolutist" position:
CALEA--with obligations on telecoms to assist, but user-side encryption protected--WAS the compromise. Now that's "absolutism".
Also, the idea that this is about "fetishizing our phones" is ridiculous. No one is even remotely suggesting that. No one is even suggesting -- as Obama hints -- that phones should be placed "above and beyond" every other consideration. It's entirely about the nature of computer security and how it works. It's about the risks to our security in creating deliberate vulnerabilities in our technologies. To frame that as "fetishizing our phones" is insulting.
There's a reason why the NSA didn't want President Obama to carry a Blackberry when he first became President. And there's a reason the President wanted a secure Blackberry. And it's not because of fetishism in any way, shape or form. It's because securing data on phones is freaking hard and it's a constant battle. And anything that weakens the security puts people in harm's way.
I suspect that the answer is going to come down to how do we create a system where the encryption is as strong as possible. The key is as secure as possible. It is accessible by the smallest number of people possible for a subset of issues that we agree are important. How we design that is not something that I have the expertise to do. I am way on the civil liberties side of this thing. Bill McRaven will tell you that I anguish a lot over the decisions we make over how to keep this country safe. And I am not interested in overthrowing the values that have made us an exceptional and great nation, simply for expediency. But the dangers are real. Maintaining law and order and a civilized society is important. Protecting our kids is important.
You suspect wrong. Because while your position sounds reasonable and "balanced" (and I've seen some in the press describe President Obama's position here as "realist"), it's actually dangerous. This is the problem. The President is discussing this like it's a political issue rather than a technological/math issue. People aren't angry about this because they're "extremists" or "absolutists" or people who "don't want to compromise." They're screaming about this because "the compromise" solution is dangerous. If there really were a way to have strong encryption with a secure key where only a small number of people could get in on key issues, then that would be great.
But the key point that all of the experts keep stressing is: that's not reality. So, no, the President's not being a "realist." He's being the opposite.
So I would just caution against taking an absolutist perspective on this. Because we make compromises all the time. I haven't flown commercial in a while, but my understanding is that it's not great fun going through security. But we make the concession because -- it's a big intrusion on our privacy -- but we recognize that it is important. We have stops for drunk drivers. It's an intrusion. But we think it's the right thing to do. And this notion that somehow our data is different and can be walled off from those other trade-offs we make, I believe is incorrect.
Again, this is not about "making compromises" or some sort of political perspective. And the people arguing for strong encryption aren't being "absolutist" about it because they're unwilling to compromise. They're saying that the "compromise" solution means undermining the very basis of how we do security and putting everyone at much greater risk. That's ethically horrific.
And, also, no one is saying that "data is different." There has always been information that is "walled off." What people are saying is that one consequence of strong encryption is that it has to mean that law enforcement is kept out of that information too. That does not mean they can't solve crimes in other ways. It does not mean that they don't get access to lots and lots of other information. It just means that this kind of content is harder to access, because we need it to be harder to access to protect everyone.
It's not security v. privacy. It's security v. security, where the security the FBI is fighting for is to stop the 1 in a billion attack and the security everyone else wants is to prevent much more likely and potentially much more devastating attacks.
Meanwhile, of all the things for the President to cite as an analogy, TSA security theater may be the worst. Very few people think it's okay, especially since it's been shown to be a joke. Setting that up as the precedent for breaking strong encryption is... crazy. And, on top of that, using the combination of TSA security and DUI checkpoints as evidence for why we should break strong encryption with backdoors again fails to recognize the issue at hand. Neither of those undermine an entire security setup.
We do have to make sure, given the power of the internet and how much our lives are digitalized, that it is narrow and that it is constrained and that there's oversight. And I'm confident this is something that we can solve, but we're going to need the tech community, software designers, people who care deeply about this stuff, to help us solve it. Because what will happen is, if everybody goes to their respective corners, and the tech community says "you know what, either we have strong perfect encryption, or else it's Big Brother and Orwellian world," what you'll find is that after something really bad happens, the politics of this will swing and it will become sloppy and rushed and it will go through Congress in ways that have not been thought through. And then you really will have dangers to our civil liberties, because the people who understand this best, and who care most about privacy and civil liberties have disengaged, or have taken a position that is not sustainable for the general public as a whole over time.
I have a lot of trouble with the President's line about everyone going to "their respective corners," as it suggests a ridiculous sort of tribalism in which the natural state is the tech industry against the government and even suggests that the tech industry doesn't care about stopping terrorism or child pornographers. That, of course, is ridiculous. It's got nothing to do with "our team." It has to do with the simple realities of encryption and the fact that what the President is suggesting is dangerous.
Furthermore, it's not necessarily the "Orwellian/Big Brother" issue that people are afraid of. That's a red herring from the "privacy v. security" mindset. People are afraid of this making everyone a lot less safe. No doubt, the President is right that if "something really bad" happens, the politics will move in one direction -- but it's pretty ridiculous for him to be saying that, seeing as the latest skirmish in this battle is being fought by his very own Justice Department, which jumped on the San Bernardino attacks as an excuse to push this line of argument.
If the President is truly worried about stupid knee-jerk reactions following "something bad" happening, rather than trying to talk about "balance" and "compromise," he could and should be doing more to fairly educate the American public, and to make public statements about this issue and how important strong encryption is. Enough of this bogus "strong encryption is important, but... the children" crap. The children need strong encryption. The victims of crimes need encryption. The victims of terrorists need encryption. Undermining all that because just a tiny bit of information is inaccessible to law enforcement is crazy. It's giving up the entire ballgame to those with malicious intent, just so that we can have a bit more information in a few narrow cases.
President Obama keeps mentioning trade-offs, but it appears that he refuses to actually understand the trade-offs at issue here. Giving up on strong encryption is not about finding a happy middle compromise. Giving up on strong encryption is putting everyone at serious risk.
A couple of months ago, I wrote a long post trying to dig into the details of David Lowery's class action lawsuit against Spotify. In the end, while there was some question over whether or not streaming music services really need to get compulsory mechanical licenses for producing reproductions of songs, given that such licenses are compulsory and can be obtained easily by having the Harry Fox Agency issue a "Notice of Intention" under Section 115, it seemed crazy to think that the various music services had not done that. In fact, we noted that the only way the lawsuits made any sense was if the various music services and HFA ignored this and didn't send out such NOIs. At the time, I noted that this would be a surprise, and it could mean the services were in deep trouble.
Or perhaps not a surprise... and, yes, some folks may be in deep trouble. Beyond Lowery's lawsuit, a few other similar lawsuits have been filed. Earlier this month, Tim Geigner wrote about a very similar lawsuit filed by Yesh Music against Tidal. Of course, what didn't get as much attention is that Yesh filed very similar lawsuits against a bunch of other music services as well, including Google Music, Slacker, Line Corporation (which runs Mix Radio) and Guerva (which I think is a misspelling of the music site Guvera). Yesh also sued Deezer a few months ago.
One of the key questions that came up following the reporting on all of these cases is the Harry Fox Agency's role in all of this. HFA, an organization that was set up by the publishers themselves, is supposed to be responsible for managing compulsory licensing for the vast majority (though not all) of popular songwriters (remember, HFA is about compositions/publishing, not sound recordings). But it's beginning to look seriously like HFA just fell asleep on the job and didn't bother to do the one key thing it was supposed to do for all these music services: file Section 115 NOIs.
Both David Lowery and another songwriter, Ari Herstand, have recently posted examples of HFA suddenly sending them NOIs that appear to be rushed and are showing up way after they're supposed to. I rarely agree with Lowery about anything, but it's seriously looking like HFA totally fucked up here. Big time. Here's the notice Lowery received:
As Lowery notes, this NOI was sent on February 16th, 2016, but was signed by a Spotify exec who left the company in 2015, for a song that showed up on Spotify in 2011 and using an HFA address that didn't exist until 2012. Basically... it looks like HFA is rushing around trying to send out NOIs that it failed to do properly, and doing a pretty half-assed job about it. And that seems especially stupid when it comes to issuing those NOIs to the guy who is already suing over those missing NOIs.
Herstand just received a similarly late NOI from HFA for his music appearing on Apple Music. As he notes, his notice says the music should appear on Apple Music as of March 10th of 2016, but it's actually been there since Apple Music launched last summer. He also notes this is the first NOI he's ever received from HFA, while he has received plenty of NOIs from the much smaller HFA competitor Music Reports "on a regular basis."
So, given all that, it sure looks like HFA didn't do the one thing that it was supposed to be doing all along, and that's... going to be bad news for someone. The big question is who? All of the lawsuits have been against the various music services, but without being privy to the contracts between HFA and the music services themselves, I'd be shocked if they didn't include some sort of indemnity clauses, basically saying that if music isn't licensed because of HFA's own failure to do its job, any liability falls back on HFA.
And, if that's the case, HFA could be on the hook for a ton of copyright infringement. If it's true that it's basically been ignoring the fairly simple NOI process for a lot of artists, then that's going to be a major scandal -- but one that seems a lot harder to pin on the music services themselves. They went to HFA and hired the company to handle mechanical licenses. There may be more going on behind the scenes here, but at a first pass, based on what appears to be happening, HFA may be in some seriously deep trouble.
It must be admitted that the Apple/FBI fight over iPhone encryption has had much more "outside the courtroom" drama than most cases -- what with both sides putting out their own blog posts and commenting publicly at length on various aspects. But things have been taken up a notch, it seems, with the latest. We wrote about the DOJ's crazy filing in the case, which is just chock full of incredibly misleading claims. Most of the time, when we call out misleading claims in lawsuits, the various parties stay quiet about it. But this one was apparently so crazy that Apple's General Counsel Bruce Sewell called a press conference where he just blasted the DOJ through and through. It's worth looking at his whole statement (highlights by me):
First, the tone of the brief reads like an indictment. We've all heard Director Comey and Attorney General Lynch thank Apple for its consistent help in working with law enforcement. Director Comey's own statement that "there are no demons here." Well, you certainly wouldn't conclude it from this brief. In 30 years of practice I don't think I've seen a legal brief that was more intended to smear the other side with false accusations and innuendo, and less intended to focus on the real merits of the case.
For the first time we see an allegation that Apple has deliberately made changes to block law enforcement requests for access. This should be deeply offensive to everyone that reads it. An unsupported, unsubstantiated effort to vilify Apple rather than confront the issues in the case.
Or the ridiculous section on China where an AUSA, an officer of the court, uses unidentified Internet sources to raise the spectre that Apple has a different and sinister relationship with China. Of course that is not true, and the speculation is based on no substance at all.
To do this in a brief before a magistrate judge just shows the desperation that the Department of Justice now feels. We would never respond in kind, but imagine Apple asking a court if the FBI could be trusted "because there is this real question about whether J. Edgar Hoover ordered the assassination of Kennedy — see ConspiracyTheory.com as our supporting evidence."
We add security features to protect our customers from hackers and criminals. And the FBI should be supporting us in this because it keeps everyone safe. To suggest otherwise is demeaning. It cheapens the debate and it tries to mask the real and serious issues. I can only conclude that the DoJ is so desperate at this point that it has thrown all decorum to the winds....
We know there are great people in the DoJ and the FBI. We work shoulder to shoulder with them all the time. That's why this cheap shot brief surprises us so much. We help when we're asked to. We're honest about what we can and cannot do. Let's at least treat one another with respect and get this case before the American people in a responsible way. We are going before court to exercise our legal rights. Everyone should beware because it seems like disagreeing with the Department of Justice means you must be evil and anti-American. Nothing could be further from the truth.
Somehow, I don't think Apple and the DOJ will be exchanging holiday cards this year. Apple's reply brief is due on Tuesday. I imagine it'll be an interesting weekend in Cupertino.
The Justice Department has now filed its response to Apple's motion to vacate the order forcing it to undermine the security features of Syed Farook's work iPhone. It's... quite a piece of work. The DOJ is pulling out all the stops in this one, and it seems to be going deeper and deeper into the ridiculous as it does so. Of course, it repeats many of the arguments in its earlier filings (both its original application for the All Writs Act order as well as its Motion to Compel -- which even the judge told the DOJ she didn't think it should file). For example, it continues to assert that this should be judged on the "three factor test" that it made up from a Supreme Court decision that doesn't actually have a three factor test.
But the crux of the DOJ's argument is basically "how dare Apple make a warrant-proof phone" and thus it's Apple's fault that they haven't made it easy for the FBI to get what it wants. This argument is bonkers on many levels. Let's dig in:
By Apple’s own reckoning, the corporation—which grosses hundreds of billions of dollars a year—would need to set aside as few as six of its 100,000 employees for perhaps as little as two weeks. This burden, which is not unreasonable, is the direct result of Apple’s deliberate marketing decision to engineer its products so that the government cannot search them, even with a warrant.
This is a purposeful misrepresentation. The issue here is that the judge has made it clear that the key issue she's concerned with is whether or not the request from the DOJ represents an "unreasonable burden" on Apple -- as the "burden" is the only actual test laid out in the US v. NY Telephone case the DOJ keeps pointing to. But Apple didn't cite the time and manpower to argue that the resources themselves are the unreasonable burden; the burden it pointed to is the potential impact on the safety and security of its customers. Focusing on the time is not the issue, but of course, the DOJ pretends it is.
Second, the DOJ continues its ridiculous insistence that making your products safe and secure is a "deliberate marketing decision" -- which somehow makes it offensive in some way. Apple didn't engineer its products "so that the government cannot search them"; it engineered them so that your information is safe and secure from anyone, including criminals. You would think that law enforcement people in the FBI and DOJ would appreciate more secure devices that reduce crime. There was a time when they did. To sneeringly suggest that better protecting the public is nothing more than a "marketing decision" is ridiculous. Hell, even if it was a "marketing decision," a big part of the reason that "the market" wanted such features so badly was because the US government itself overstepped its bounds with mass surveillance.
The Court’s Order is modest. It applies to a single iPhone, and it allows Apple to decide the least burdensome means of complying. As Apple well knows, the Order does not compel it to unlock other iPhones or to give the government a universal “master key” or “back door.” It is a narrow, targeted order that will produce a narrow, targeted piece of software capable of running on just one iPhone, in the security of Apple’s corporate headquarters.
It has been explained -- at length -- by both Apple and various amicus briefs, how ridiculous this is. Everyone -- including the FBI -- has now admitted that this case is almost entirely about the precedent, and that a win for the DOJ will inevitably mean a long line of local and federal law enforcement lining up outside Apple's headquarters in Cupertino with court orders in their hands, demanding that Apple help them crack into iPhones. That's a big deal. It also sets a precedent even beyond Apple: that companies can be forced to deliberately (1) weaken security on their devices and services and (2) lie to the public about it by "signing" the devices as legit.
The government and the community need to know what is on the terrorist’s phone, and the government needs Apple’s assistance to find out.
Instead of complying, Apple attacked the All Writs Act as archaic, the Court’s Order as leading to a “police state,” and the FBI’s investigation as shoddy, while extolling itself as the primary guardian of Americans’ privacy.... Apple’s rhetoric is not only false, but also corrosive of the very institutions that are best able to safeguard our liberty and our rights: the courts, the Fourth Amendment, longstanding precedent and venerable laws, and the democratically elected branches of government.
Apple didn't attack the AWA as "archaic" so much as inapplicable in this situation. Once again, the DOJ is doing some serious misrepresentation in this filing (and we're just three paragraphs in).
This case—like the three-factor Supreme Court test on which it must be decided—is about specific facts, not broad generalities. Here, Apple deliberately raised technological barriers that now stand between a lawful warrant and an iPhone containing evidence related to the terrorist mass murder of 14 Americans. Apple alone can remove those barriers so that the FBI can search the phone, and it can do so without undue burden. Under those specific circumstances, Apple can be compelled to give aid. That is not lawless tyranny. Rather, it is ordered liberty vindicating the rule of law. This Court can, and should, stand by the Order. Apple can, and should, comply with it.
Three factors! Drink! And, yes, Apple put in place these "barriers," but not as barriers to the government -- as security for everyone. And there's a very big question, which the DOJ so desperately wishes to avoid with the mumbo jumbo above: whether or not a company can be forced to purposely write and sign code that deliberately undermines security features.
In deciding New York Telephone, the Supreme Court directly confronted and expressly rejected the policy arguments Apple raises now. Like Apple, the telephone company argued: that Congress had not given courts the power to issue such an order in its prior legislation; that the AWA could not be read so broadly; that it was for Congress to decide whether to provide such authority; and that relying on the AWA was a dangerous step down a slippery slope ending in arbitrary police powers.
Once again, the DOJ is misrepresenting the issues at play both in this case and in NY Telephone. In that case, a key part of the SCOTUS decision was based on the fact that NY Telephone was a public utility and therefore had certain responsibilities. That's not true of Apple. The DOJ also misrepresents the Congressional situation, which is different here, in that Congress did pass a specific law in this area, CALEA, which explicitly says that Apple need not help in this situation. The All Writs Act is a "gap filling" law, for when Congress has not spoken. But on this issue, it has.
The Supreme Court’s approach to the AWA does not create an unlimited source of judicial power, as Apple contends. The Act is self-limiting because it can only be invoked in aid of a court’s jurisdiction. Here, that jurisdiction rests on a lawful warrant, issued by a neutral magistrate pursuant to Rule 41. And New York Telephone provides a further safeguard, not through bright-line rules but rather through three factors courts must consider before exercising their discretion: (1) how far removed a party is from the investigative need; (2) how unreasonable a burden would be placed on that party; and (3) how necessary the party’s assistance is to the government. This three-factor analysis respects Congress’s mandate that the Act be flexible and adaptable, while eliminating the concern that random citizens will be forcibly deputized.
The DOJ insists that it doesn't matter that CALEA doesn't authorize this, because -- in its reading -- CALEA is only about what companies can be forced to do prior to a warrant, not after:
CALEA, passed in 1994, does not “meticulously,” “intricately,” or “specifically” address when a court may order a smartphone manufacturer to remove barriers to accessing stored data on a particular smartphone. Rather, it governs what steps telecommunications carriers involved in transmission and switching must take in advance of court orders to ensure their systems can isolate information to allow for the real-time interception of network communications
But of course, under that interpretation, then the All Writs Act grants tremendous powers -- exactly the kinds of powers the DOJ insists elsewhere in this brief that isn't at issue in this case. I don't see how the DOJ can have it both ways.
As Apple recognizes, this Court must consider three equitable factors: (1) how “far removed” Apple is “from the underlying controversy”; (2) how “unreasonable [a] burden” the Order would place on Apple; and (3) how “necessary” its assistance is to searching Farook’s iPhone.
Apple is not so far removed from the underlying controversy that it should be excused from assisting in the execution of the search warrant. In New York Telephone, the phone company was sufficiently close to the controversy because the criminals used its phone lines. See 434 U.S. at 174. The Court did not require that the phone company know criminals were using its phone lines, or that it be involved in the crime. See id. Here, as a neutral magistrate found, there is probable cause to believe that Farook’s iPhone contains evidence related to his crimes. That alone would be sufficient proximity under the AWA and New York Telephone, even if Apple did not also own and control the software on Farook’s iPhone.
But again, under such an interpretation, the AWA can be used to force basically any tech company to figure out ways to spy on users if the FBI comes calling and gets a magistrate judge to rubber stamp an order. That's... crazy. Just because they use your technology does not mean that you're somehow legally on the hook for helping the FBI investigate their usage.
As Apple’s business model and its representations to its investors and customers make clear, Apple intentionally and for commercial advantage retains exclusive control over the software that can be used on iPhones, giving it monopoly-like control over the means of distributing software to the phones. As detailed below, Apple does so by: (1) firmly controlling iPhones’ operating systems and first-party software; (2) carefully managing and vetting third-party software before authenticating it for use on iPhones; and (3) continually receiving information from devices running its licensed software and its proprietary services, and retaining continued access to data from those devices about how its customers are using them. Having established suzerainty over its users’ phones—and control over the precise features of the phones necessary for unlocking them—Apple cannot now pretend to be a bystander, watching this investigation from afar.
This is kind of an incredible argument when you think about it: because Apple makes sure that its devices have updated software to keep them safe from vulnerabilities, that means that Apple is somehow connected to any use of the phone and responsible for helping the FBI crack into it. Does the FBI really want to encourage companies to stop offering any follow-on support for their software? Because that's the argument it's making here.
Thus, by its own design, Apple remains close to its iPhones through careful management and constant vigil over what software is on an iPhone and how that software is used. Indeed, Apple is much less “removed from the controversy”—in this case, the government’s inability to search Farook’s iPhone—than was the New York Telephone company because that company did not deliberately place its phone lines to prevent inconspicuous government access.... Here, Apple has deliberately used its control over its software to block law-enforcement requests for access to the contents of its devices, and it has advertised that feature to sell its products.
This argument is particularly maddening: basically continuing the ridiculous line of thinking that protecting user privacy is some sort of deliberate marketing strategy against the government, rather than in favor of protecting customers' own security and privacy.
And then it gets even more maddening. In discussing the "burden," the DOJ literally tries to argue that if there is a burden, it's Apple's own fault for designing such a secure system.
Apple is one of the richest and most tech-savvy companies in the world, and it is more than able to comply with the AWA order. Indeed, it concedes it can do so with relatively little effort. Even this modest burden is largely a result of Apple’s own decision to design and market a nearly warrant-proof phone.
This is monumentally misleading. The whole DOJ premise is that Apple deliberately is trying to interfere with legal investigations. But that's bonkers. Apple is just trying to build a secure phone for its users -- and a natural and unavoidable consequence of that is that it makes it more difficult for law enforcement to get access to that info. But that's because the whole point of such security is to make it more difficult for everyone who is not the phone's owner to get access, because that's how you protect them.
The DOJ is so vain it thinks Apple's security is all about them.
Then we get back to the lying:
Apple’s primary argument regarding undue burden appears to be that it should not be required to write any amount of code to assist the government.
Not really. Its primary argument is that the burden is in writing any amount of code that undermines the safety and security of its customers. That last part is kind of the important part. No wonder the DOJ ignores it.
Apple asserts that it would take six to ten employees two to four weeks to develop new code in order to carry out the Court’s Order.... Even taking Apple at its word, this is not an undue burden, especially given Apple’s vast resources and the government’s willingness to find reasonable compromises and provide reasonable reimbursement.
Apple is a Fortune 5 corporation with tremendous power and means: it has more than 100,000 full-time-equivalent employees and had an annual income of over $200 billion dollars in fiscal year 2015—more than the operating budget for California.... Indeed, Apple’s revenues exceed the nominal GDPs of two thirds of the world’s nations. To build the ordered software, no more than ten employees would be required to work for no more than four weeks, perhaps as little as two weeks.
Again, this is misleading (sense a theme?). First, as noted above, the "burden" is not so much in the time or engineers allotted to this issue. Second, even if we accept the DOJ's assertions here, it's misleading. The Apple filing noted that it would take that much effort just to create the initial code and to test it, but then noted -- quite rightly -- that if in the testing any problems arose, as they almost certainly would, it would need to basically redo the process. Part of the point, which can slip by non-technical people who have no experience developing and deploying code, is that this process could take a long, long time, and involve a lot of effort before it's actually safe to use on the actual phone.
Next up, the DOJ continues to insist that there can't possibly be any danger in creating this code, because Apple surely knows how to guard it, and further, that even if the code got out, that it wouldn't matter because it's asking for code that will only run on the Farook phone.
Next, contrary to Apple’s stated fears, there is no reason to think that the code Apple writes in compliance with the Order will ever leave Apple’s possession. Nothing in the Order requires Apple to provide that code to the government or to explain to the government how it works. And Apple has shown it is amply capable of protecting code that could compromise its security. For example, Apple currently protects (1) the source code to iOS and other core Apple software and (2) Apple’s electronic signature, which as described above allows software to be run on Apple hardware.... Those—which the government has not requested—are the keys to the kingdom. If Apple can guard them, it can guard this.
But, again, that leaves out the reality of testing this particular code and how that makes it much more likely the code will get out. This argument was presented in the amicus brief filed by iPhone forensics and security experts.
Next up, the DOJ totally misrepresents Apple's current assistance with government requests for information from the Chinese government. The DOJ is trying to argue, misleadingly, that Apple has no problem doing the same stuff for China, so its worry that this case would create a precedent for authoritarian regimes is nonsense. But it's the DOJ's argument that's truly nonsense:
Apple suggests that, as a practical matter, it will cease to resist foreign governments’ efforts to obtain information on iPhone users if this Court rules against it. It offers no evidence for this proposition, and the evidence in the public record raises questions whether it is even resisting foreign governments now. For example, according to Apple’s own data, China demanded information from Apple regarding over 4,000 iPhones in the first half of 2015, and Apple produced data 74% of the time.... Apple appears to have made special accommodations in China as well: for example, moving Chinese user data to Chinese government servers, and installing a different WiFi protocol for Chinese iPhones.... Such accommodations provide Apple with access to a huge, and growing, market.... This Court’s Order changes neither the carrots nor the sticks that foreign governments can use on Apple. Thus, it does not follow that if America forgoes Apple’s assistance in this terrorism investigation, Apple will refuse to comply with the demands of foreign governments. Nor does it follow that if the Court stands by its Order, Apple must yield to foreign demands, made in different circumstances without the safeguards of American law.
What the DOJ is referring to here is Apple's latest transparency report in which it notes that it complied with 74% of government requests for information from China. You can see it here:
But again, Apple has always been willing to respond to legitimate government requests for information that it has access to. That's why that same chart shows that it complied with 81% of US requests as well. But that says absolutely nothing about the requirement to build a special system to hack in and access data that it does not currently have access to.
The rest of the China stuff, about servers and WAPI, is just the DOJ picking up on Stewart Baker's conspiracy theory that he posted a few weeks back. Lots of countries (stupidly) demand local storage, not necessarily for surveillance reasons, but because they think it's good for their economy. And the reason Apple used WAPI was because that was the standard used in China for WiFi-like wireless. And as for the idea that Apple magically gave access to the Chinese, that makes no sense, given that Apple then had to fight a man-in-the-middle attack against iCloud in China that was claimed to have originated from the Chinese government. If Apple gave it access, why would the government need to run a MitM attack? The whole argument makes no sense.
In the first half of 2015 alone, Apple handled 27,000 “device requests”—often covering multiple devices—and provided data approximately 60% of the time.... If Apple can provide data from thousands of iPhones and Apple users to China and other countries, it can comply with the AWA in America. (Id.) This is not speculation because, in fact, Apple complied for years with American court orders to extract data from passcode-locked iPhones, dedicating infrastructure and personnel in order to do so.
Again, that's different. That's about supplying the info that Apple had access to, not about writing code to undermine security features. Apples and oranges.
Finally, the DOJ mocks Apple's constitutional arguments on the First and Fifth Amendments.
Apple’s claim is particularly weak because it does not involve a person being compelled to speak publicly, but a for-profit corporation being asked to modify commercial software that will be seen only by Apple. There is reason to doubt that functional programming is even entitled to traditional speech protections....
To the extent Apple’s software includes expressive elements—such as variable names and comments—the Order permits Apple to express whatever it wants, so long as the software functions.
We're not "compelling" you to say this exactly, we're letting you say whatever you want... so long as it does what we want it to. That still seems like compelled speech, no?
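For what it's worth, the DOJ's "expressive elements" framing is easy to illustrate. The two functions below (a hypothetical toy example, not anything from iOS) behave identically, which is all the Order would care about; what differs is everything the code says to a human reader, which is exactly the part being compelled into a particular shape.

```python
# Toy illustration of "expressive elements" in code. Both functions are
# functionally identical; only the names and comments -- the expression -- differ.

def f(a, b):
    return a + b

def add_under_protest(first_value, second_value):
    """We object to being ordered to write this, but here is the sum."""
    return first_value + second_value

assert f(2, 3) == add_under_protest(2, 3)  # same behavior, very different speech
```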
Apple lastly asserts that the Order violates its Fifth Amendment right to due process. Apple is currently availing itself of the considerable process our legal system provides, and it is ludicrous to describe the government’s actions here as “arbitrary.”
Once again, it appears that many of the DOJ's arguments here are misleading in the extreme. Apple's response is due next week, and I imagine it will be quite a read as well.
While so many have tried to frame the Apple v. FBI fight as one of "privacy v. security," it's really about security v. security. It comes down to what you're more afraid of: the off-chance that someone will secretly plan a terrorist attack on an encrypted iPhone, or the far more likely scenario of millions of phones being stolen or hacked by criminals looking to swipe your private information. Apple's VP of software engineering, Craig Federighi, has now taken to the pages of the Washington Post to try to highlight this issue, and to explain that the FBI and DOJ are really trying to make everyone a lot less safe.
But the threat to our personal information is just the tip of the iceberg. Your phone is more than a personal device. In today’s mobile, networked world, it’s part of the security perimeter that protects your family and co-workers. Our nation’s vital infrastructure -- such as power grids and transportation hubs -- becomes more vulnerable when individual devices get hacked. Criminals and terrorists who want to infiltrate systems and disrupt sensitive networks may start their attacks through access to just one person’s smartphone.
And he also has a good response to those, like Manhattan DA Cyrus Vance, who insist they just want Apple "to go back" to the way it handled security on phones prior to iOS 8. In other words, make everyone less secure. Their argument is that if that level of security was okay a few years ago, why isn't it okay now? The answer is that security holes are discovered over time, and each one makes a system less secure. Rolling back isn't just like stepping back a couple of years; it's much, much worse, because by now plenty of people know how to get past those older security features:
Of course, despite our best efforts, nothing is 100 percent secure. Humans are fallible. Our engineers write millions of lines of code, and even the very best can make mistakes. A mistake can become a point of weakness, something for attackers to exploit. Identifying and fixing those problems are critical parts of our mission to keep customers safe. Doing anything to hamper that mission would be a serious mistake.
That’s why it’s so disappointing that the FBI, Justice Department and others in law enforcement are pressing us to turn back the clock to a less-secure time and less-secure technologies. They have suggested that the safeguards of iOS 7 were good enough and that we should simply go back to the security standards of 2013. But the security of iOS 7, while cutting-edge at the time, has since been breached by hackers. What’s worse, some of their methods have been productized and are now available for sale to attackers who are less skilled but often more malicious.
And, as he notes, the FBI's demands in the San Bernardino case are akin to doing the same thing to the security of iOS 8: creating a vulnerability that will almost certainly "spread around the world in the blink of an eye." It's a good, straightforward piece explaining why the FBI and DOJ's demands are so dangerous here.
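To see why the demanded changes amount to a vulnerability, consider what's left once the auto-erase and escalating-delay protections the FBI wants disabled are gone: the only thing slowing a brute-force attack is the device's key-derivation time, which Apple's security documentation puts at roughly 80 milliseconds per passcode attempt. A quick back-of-the-envelope calculation (the figures are illustrative estimates, not measurements):

```python
# Rough estimate of brute-force time once retry limits and delays are removed.
# ~80 ms per attempt is the approximate key-derivation cost Apple cites; treat
# these numbers as illustrative, not as a measured attack.
ATTEMPT_TIME_S = 0.08

for digits in (4, 6):
    keyspace = 10 ** digits
    hours = keyspace * ATTEMPT_TIME_S / 3600
    print(f"{digits}-digit passcode: {keyspace:,} guesses, ~{hours:.1f} hours worst case")
# A 4-digit passcode falls in well under an hour; even 6 digits falls in about a day.
```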
The head of GCHQ has decided to make some belated overtures to the tech companies intelligence agencies have alienated over the past few years. With the UK's ever-evolving Investigatory Powers Act hanging around like an unwanted, hungover and extremely nosy houseguest, GCHQ Director Robert Hannigan says it's time for the government and tech companies to work together, for the mutual betterment of both, to give government agencies the access they're stopping just short of legislating into existence.
“It should be possible for technical experts to sit down together and work out solutions,” he said. “I am not in favor of banning encryption. Nor am I asking for mandatory back doors. …
...before stating that backdoors would be a wonderful thing. They just might need to undergo rebranding.
"Not everything is a back door, still less a door which can be exploited outside a legal framework."
OH. You want those kinds of backdoors: the ones that can only be exploited within a legal framework. Sorry, we're fresh out. It's not that tech companies don't want to help; it's that even "technical experts" can't craft an exploitable "not a backdoor" that can only be exploited by whoever the government decides should be able to exploit it. ("Working together" or not, it will be the government that determines access, rather than the technical experts at the tech companies who designed it.)
So, Hannigan doesn't want a backdoor. He wants another set of keys for the front door, and he's asking all parties to work together to decide whether this set should be left in the mailbox or under the welcome mat. This skewed view likely stems from Hannigan's assumption that the many years of tech company cooperation intelligence and law enforcement agencies have enjoyed flow from a deep well of heartfelt goodwill, rather than from the numerous laws compelling that cooperation.
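The "extra set of keys" framing is worth unpacking, because it's where the whole plan falls apart. In any escrow scheme, the extra key is just a key: it decrypts for whoever holds it, and no "legal framework" is encoded into the math. A minimal sketch, using the third-party Python cryptography package purely for illustration (the key names are hypothetical):

```python
# Minimal sketch of key escrow, i.e. "another set of keys for the front door."
# Requires the third-party `cryptography` package; names are illustrative.
from cryptography.fernet import Fernet

user_key = Fernet.generate_key()     # held by the device owner
escrow_key = Fernet.generate_key()   # the government's "front door" copy

data_key = Fernet.generate_key()                     # key that actually protects the data
ciphertext = Fernet(data_key).encrypt(b"private data")

# The data key is wrapped once for the user and once for the escrow holder.
wrapped_for_user = Fernet(user_key).encrypt(data_key)      # user-side unwrap works the same way
wrapped_for_escrow = Fernet(escrow_key).encrypt(data_key)

# Whoever holds the escrow key can recover the data, legal framework or not.
recovered = Fernet(Fernet(escrow_key).decrypt(wrapped_for_escrow)).decrypt(ciphertext)
print(recovered)  # b'private data'
```

The mailbox-or-welcome-mat question is just where to store escrow_key; the security problem is that it exists at all.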
Nonetheless, Hannigan—making just his second appearance in a public forum since taking the helm of GCHQ in 2014—said tech companies should work more closely with governments to try to come up with ways to give law enforcement what it wants. “The perception that there is nothing but conflict between governments and the tech industry is a caricature,” he said in his speech. “In reality, companies are routinely providing help within the law, and I want to acknowledge that today.”
It may have been less of a caricature pre-2013, but it was never simply about tech companies giving the government whatever it wanted whenever it asked for it. That's why the world's intelligence agencies are hoarding exploits, buying malware from shady merchants, and intercepting hardware shipments to add their own backdoors.
And so, yet another request goes out for cooperation in the ongoing search for an intelligence/law enforcement unicorn while War's "Why Can't We Be Friends?" plays in the background. Nothing's going to move forward until officials like Hannigan admit the thing they want (a safe backdoor) isn't something they can actually have -- at least not without a lot of collateral damage. If they can at least be honest enough to state they want it no matter how many problems it causes elsewhere, then maybe they'll be ready to move to the next level of discussion -- even if the "next level" means the discussion has reached its logical endpoint.