Yesterday, at the excellent RightsCon event in San Francisco, Senator Ron Wyden gave a barn burner of a speech, in which he detailed why it was so important to protect our privacy and security in a digital age, at a time when law enforcement and the intelligence communities are digging deeper and deeper into all of our personal information. He started out with a clear and emphatic statement on how he will block any attempt by Congress to undermine encryption:
I am here to tell you why I will use every power I have as a senator to block plans to weaken strong encryption. I am here to tell you why FBI Director Comey's plans and expected legislation will be a lose-lose -- they would lead to less security and less liberty.
Furthermore, he made it clear that anyone who says that this is just a debate between privacy and security has it totally wrong:
And let me be clear at the outset that the debate about data security is not about choosing security or choosing privacy. It is about choosing less security or choosing more security. People who think that the government should have more surveillance powers will often try to frame this debate as a choice between privacy and security. They are wrong. Our job is to convince the public that when politicians or the news media say that, we are here to tell you it’s not the case. It’s less security versus more security.
He further pointed out that, contrary to the claims of James Comey and others in law enforcement, this is the "golden age of surveillance" in that modern technologies have given law enforcement much more access to private information than they've ever had before. And he compared the freakout claims from law enforcement to similar claims 50 years ago, when the Supreme Court ruled in Miranda v. Arizona that law enforcement had to read people their rights -- and law enforcement and the media insisted this would lead to much more criminal activity:
I think that it is useful to compare this discussion to another one that was playing out fifty years ago. Fifty years ago this summer, the Supreme Court handed down a landmark decision in the case of Miranda vs. Arizona, in which the Court ruled that before law enforcement officers interrogate a suspect, they must advise that person of his or her constitutional rights. Everyone who’s ever watched a TV cop show knows this – you have the right to remain silent, you have the right to an attorney, and so forth. Today, this is a very important feature of the American justice system. It helps ensure that poor people know that they have the same rights under the law as rich people who can afford high-priced lawyers. And it helps reduce the likelihood of innocent people who are unsure about their rights being pressured to sign false confessions. The Miranda ruling helped bring our country closer to the promise of equal justice for all.
But if you had been following the public debate back in the summer of 1966, you would have heard a lot of politicians and prosecutors saying that the sky was falling. A few weeks after the decision, a New York Times headline read “Miranda Decision Said to End Effective Use of Confessions.” The article quoted some of the most respected prosecutors and law enforcement officials in the country warning that this decision was an absolute catastrophe. Future president Richard Nixon called the ruling a “Dickensian legalism” that would “hamstring” law enforcement, and he even suggested that the Constitution should be amended to overturn it.
Needless to say, the sky did not fall. In fact, crime rates have been dropping for the past twenty or thirty years. The national murder rate and burglary rate are both lower than they were the day that the Miranda ruling was handed down. Obviously there are a lot of factors that go into crime rates, but I think it’s clear that despite all of the dire warnings from both politicians and respected law enforcement officials, this ruling did not lead to the end of law enforcement in America. Fifty years later, the Miranda ruling remains a cornerstone of American due process.
He also laid out a plan of action for defending Americans' security and liberty:

Protecting strong encryption to safeguard Americans' private data. Wyden's Secure Data Act would ban the government from forcing companies to build backdoors or otherwise weaken the security of their products.
Overhauling the Third Party Doctrine to make clear individuals do not lose their privacy rights just because they share some of their personal information with a particular company.
Increasing transparency by holding at least three congressional hearings each year on the privacy impacts of surveillance laws, authorities and practices.
Being on high alert for fresh attempts to undermine checks on government power. Right now the Justice Department is seeking a change to the rules for getting warrants to track computer hackers that would allow DOJ to use a single warrant to remotely access any computer that a suspected hacker is believed to have broken into. This rule change could potentially allow federal investigators to use a single warrant to access millions of computers, and it would treat the victims of the hack the same as the hacker himself.
Finally, the government must do much more to hire people who understand technology and the implications of weakening digital security and privacy.
He spent some time in his speech blasting the concept of the third party doctrine and how ridiculous it is in practice:
Here’s the problem. A few decades ago, courts began ruling that if you provide information to a third party, like your bank or your phone company, you are no longer keeping it private, and it is no longer protected under the Fourth Amendment to the Constitution.
There is a huge, glaring problem with that logic. When you share your information with a single private company, that is not the same thing as making it public. Your phone company may have records of who you call, and your bank may have records of how you spend your money, but your contract with them will have rules for when and how they are allowed to share that information. They are not allowed to just disclose it freely.
This is true in the digital world as well. When I post a handsome new profile picture on Facebook, or send out a tweet to tell people that I’m holding a town hall in Oregon, I’ve chosen to make that information public. But when I send an email to my wife, or store a document in the cloud so I can work on it later, my service provider and I have an agreement that my information will stay private. The premise in current law is that I have agreed to make that information public just because my service provider is holding it. And that premise is simply absurd.
It's yet another great speech on an important topic from Senator Wyden -- and he includes a call to action, urging people who support this vision to speak out. As it stands right now, there are a few others in Congress who get how important all of this is, but many do not. And that needs to change. And while many people will be quite cynical about this and say that we'll never get the rest of Congress to recognize this issue, Senator Wyden reminded everyone that many people had the same view about SOPA/PIPA -- and the public eventually shifted Congress' position on that as well:
We can win this fight for security and liberty. It obviously won’t be easy, but we’ve done it before. Remember, in January of 2012, we were talking about the anti-Internet SOPA and PIPA bills. The first vote was on whether to override my hold on PIPA. Talk about long odds. The Chamber of Commerce, Hollywood, all the powerful special interests were against us. When that debate started, no one gave us a chance. Then the Internet community mobilized. Websites went dark in protest. And when the dust settled, well, everyone here knows how that ended. We won. Let’s work together and do it again.
While I do worry about the tendency of some to always roll out the "SOPA example" as proof, it is true that when enough people speak up, all the lobbyists and money in the world can be defeated. And this is a time when it would be nice to see that happen again.
Anyone who reads Techdirt knows our opinion on encryption: stronger is better, and giving the government (or anyone else) a back door is a dangerous idea. We've decried a lot of the stupid arguments that we've heard in favor of back doors — usually coming from technologically clueless politicians and law enforcement officers — but that doesn't mean we aren't open to considering some smart ones. This week, we've invited Albert Wenger (who you may recall from a discussion about basic income way back in Episode 16) to share his pro-backdoor position and engage in some friendly debate.
If you're a Techdirt regular, you know we've been working hard to cover the truth about a debate that is so often distorted by fear-mongering and empty rhetoric. Earlier this week, we released a new explainer video to summarize the subject and explain why it matters so much. Following the tragic attack in Brussels, the response to which is echoing what happened after the Paris attack, we've been keeping tabs on the politicians who are rushing to blame encryption even though the real issue was an intelligence community failure. And we're committed to covering each new legal development between the FBI and Apple, and (as is always Techdirt's policy) to include the full text of court filings and decisions whenever possible, so those who wish to go beyond our analysis can delve in themselves — something we're surprised still isn't the norm on most news websites.
With your help, we'll be able to keep doing this kind of important reporting on the encryption debate and even expand our coverage. We're also dedicating some of the funding towards spreading the word -- buying up some of the internet's cheap ad impressions that normally go to spammy banners and replacing them with public messages about encryption and links to our most important stories. We couldn't do any of this without your help, so if you haven't yet backed our project on Beacon, please consider doing so today before it closes. Thanks!
One of the points that seems to be widely misunderstood by people who don't spend much time in the computer security world is that building secure encryption systems is really hard, and almost everything has some sort of vulnerability somewhere. This is why security researchers, cryptographers and security engineers are in a constant struggle to poke holes in encryption and to fix up and patch systems. It's also why the demand for backdoors is idiotic: unintentional holes probably already exist in some form. But purposely building in backdoors that, by law, can't be closed almost certainly blasts open much larger holes for those with nefarious intent to get in.
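To see why a mandated way in concentrates risk, consider a toy key-escrow design. This is a minimal sketch in Python (it uses the real `cryptography` package, but the protocol, names, and keys are invented for illustration, not drawn from any actual proposal): every message gets a second copy encrypted to a single "lawful access" key, so stealing that one key unlocks everything.

```python
# Hypothetical key-escrow sketch: NOT any real product's design.
from cryptography.fernet import Fernet

escrow_key = Fernet.generate_key()      # the mandated "golden key"
escrow = Fernet(escrow_key)

def send(sender_key: bytes, plaintext: bytes) -> tuple[bytes, bytes]:
    """Encrypt for the recipient, plus the escrow copy the design mandates."""
    normal_copy = Fernet(sender_key).encrypt(plaintext)
    backdoor_copy = escrow.encrypt(plaintext)
    return normal_copy, backdoor_copy

alice_key = Fernet.generate_key()
_, copy1 = send(alice_key, b"medical records")
bob_key = Fernet.generate_key()
_, copy2 = send(bob_key, b"bank credentials")

# An attacker who steals only the escrow key reads *everyone's* traffic,
# even though Alice's and Bob's own keys were never compromised:
for c in (copy1, copy2):
    print(escrow.decrypt(c))
```

The point of the sketch is the asymmetry: in the normal design, breaking one user costs one key; in the escrowed design, breaking one key costs every user.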
Case in point: over the weekend, computer science professor Matthew Green and some other researchers announced that they'd discovered a serious hole in the encryption used for Apple's iMessage platform, one that would allow a sophisticated hacker to access encrypted messages and pictures. And Green, who has been vocal about the ridiculousness of the DOJ's request against Apple, notes that this is yet more evidence that the DOJ's request is a bad idea:
“Even Apple, with all their skills — and they have terrific cryptographers — wasn’t able to quite get this right,” said Green, whose team of graduate students will publish a paper describing the attack as soon as Apple issues a patch. “So it scares me that we’re having this conversation about adding back doors to encryption when we can’t even get basic encryption right.”
It's worth noting that the flaw that he and his team found would not have helped the FBI get what it wants off of Syed Farook's iPhone, but it's still a reminder of just how complex cryptography currently is, at a time when people are trying to keep everyone out. Offer up any potential backdoor, and you're almost certainly blasting major holes throughout the facade.
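To make the "getting basic encryption right is hard" point concrete, here's a short Python sketch using the `cryptography` package. It illustrates one classic pitfall in this space -- unauthenticated encryption is malleable -- and is only a generic example, not a reproduction of the actual iMessage bug Green's team found:

```python
# Toy demo: ciphertext without authentication can be tampered with silently.
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key, nonce = os.urandom(32), os.urandom(16)
msg = b"pay mallory $0001"

# AES-CTR with no MAC: flipping a ciphertext bit flips the same plaintext
# bit, and nothing detects the tampering.
enc = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
ct = bytearray(enc.update(msg) + enc.finalize())
ct[13] ^= ord("0") ^ ord("9")            # turns "$0001" into "$9001"
dec = Cipher(algorithms.AES(key), modes.CTR(nonce)).decryptor()
print(dec.update(bytes(ct)) + dec.finalize())   # b'pay mallory $9001'

# An authenticated (AEAD) mode rejects the same tampering outright.
aead = AESGCM(key)
ct2 = bytearray(aead.encrypt(nonce[:12], msg, None))
ct2[13] ^= 1
try:
    aead.decrypt(nonce[:12], bytes(ct2), None)
except Exception as e:                   # cryptography raises InvalidTag
    print("tampering detected:", type(e).__name__)
```

Professionals know to reach for the authenticated mode, and subtle mistakes still slip through anyway -- which is exactly why deliberately adding holes on top is so reckless.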
Apple is getting ready to push out a software update shortly that will fix the flaw. And this, alone, is yet another reason why the DOJ's case is so dangerous -- since the method it wants to use to get into Farook's phone relies on Apple's ability to push software updates. Patching software holes is a major reason to accept regular software updates, but the FBI is now trying to co-opt that process to install unsafe code. That, in turn, may prompt people to avoid software updates altogether, which in most cases will make them less safe.
from the concerns-can-be-ignored-when-you're-in-power dept
We've written a few times in the past year about the latest UK efforts to enact its "Snooper's Charter" law, officially the Investigatory Powers Bill, which would give the government much greater surveillance capabilities. Right after last year's election, Prime Minister David Cameron and Home Secretary Theresa May made it clear that they were going to go full Orwell, and do whatever possible to grant themselves greater powers to spy on everyone. As more concerns were raised, we noted that the government pretended to back down, while still including all the bad stuff people predicted.
As more and more complaints about the bill were raised, we noted that May decided to try to rush the bill through, along with a healthy dose of "if you don't do this we're all going to die!" FUD. That included releasing a new draft of the bill, which pretended to address the privacy concerns people had raised -- but did so basically by just adding the word "privacy" to a heading, while making no substantive changes to protect privacy at all (and possibly making some changes that made things worse).
Critics have spelled out in detail just how far short the bill falls. As one analysis put it:

At present the draft law fails to meet international standards for surveillance powers. It requires significant revisions to do so.
First, a law that gives public authorities generalised access to electronic communications contents compromises the essence of the fundamental right to privacy and may be illegal. The investigatory powers bill does this with its “bulk interception warrants” and “bulk equipment interference warrants”.
Second, international standards require that interception authorisations identify a specific target – a person or premises – for surveillance. The investigatory powers bill also fails this standard because it allows “targeted interception warrants” to apply to groups or persons, organisations, or premises.
Third, those who authorise interceptions should be able to verify a “reasonable suspicion” on the basis of a factual case. The investigatory powers bill does not mention “reasonable suspicion” – or even suspects – and there is no need to demonstrate criminal involvement or a threat to national security.
These are international standards found in judgments of the European court of justice and the European court of human rights, and in the recent opinion of the UN special rapporteur for the right to privacy. At present the bill fails to meet these standards – the law is unfit for purpose.
On Tuesday, the House of Commons held its "Second Reading" of the bill. The debate allowed some MPs to raise concerns, but with various parties deciding to abstain rather than vote against it, the bill moved forward easily (it'll come back to Parliament after the House of Lords goes through it). Even worse, the main "opposition" to the bill wasn't raised all that strongly:
Andy Burnham, former Home Office minister, stood to offer the Labour party's official perspective. If there is substantive opposition to the contents of the IP Bill within the Labour party - and I know there is from MPs like Tom Watson and David Winnick - then there was little evidence of it from Mr Burnham's contribution to the debate. He opened by trotting out the dire need to combat the four horsemen of the infocalypse and the false and distorting 'balance security with privacy' dichotomy. From those foundations he was highly unlikely to get anywhere enlightened.
While we're fighting against backdoors and for encryption here in the US, it looks like the UK government is potentially moving very much in the other direction.
The US government has made numerous attempts to obtain source code from tech companies in an effort to find security flaws that could be used for surveillance or investigations.
The government has demanded source code in civil cases filed under seal but also by seeking clandestine rulings authorized under the secretive Foreign Intelligence Surveillance Act (FISA), a person with direct knowledge of these demands told ZDNet. We're not naming the person as they relayed information that is likely classified.
With these hearings held in secret and away from the public gaze, the person said that the tech companies hit by these demands are losing "most of the time."
That's hardly heartening. The DOJ would only go so far as to confirm this has happened before, likely because there's no way to deny it. The documents from the Lavabit case have been made public -- with the DOJ using a formerly-sealed document to hint at what could be in store for Apple if it refuses to write FBiOS for it.
Unfortunately, because of the secrecy surrounding the government's requests for source code -- and the court where those requests have been made -- it's extremely difficult to obtain outside confirmation. ZDNet's Zack Whittaker contacted more than a dozen Fortune 500 companies about the unnamed official's claims and received zero comments.
A few, however, flatly denied ever having handed over source code to the US government.
Cisco said in an emailed statement: "We have not and we will not hand over source code to any customers, especially governments."
IBM referred to a 2014 statement saying that the company does not provide "software source code or encryption keys to the NSA or any other government agency for the purpose of accessing client data." A spokesperson confirmed that the statement is still valid, but did not comment further on whether source code had been handed over to a government agency for any other reason.
Cisco is likely still stinging from leaked documents showing its unwitting participation in an NSA unboxing photo shoot, and has undoubtedly decided to take a stronger stance against government meddling since then. As for IBM, its statement is a couple of years old and hinges on a major qualifier: it only denies handing over source code "for the purpose of accessing client data."
Previously leaked documents somewhat confirm the existence of court orders allowing the NSA to perform its own hardware/software surgery. Presumably, the introduction of backdoors and exploits is made much easier with access to source code. Whittaker points to Kaspersky Lab's apparent discovery of evidence that the NSA possesses "several hard drive manufacturers'" source code -- another indication that the government's history of demanding source code from manufacturers and software creators didn't begin (or end) with Lavabit.
The government may be able to talk the FISA court into granting these requests, given that the court's purview generally only covers foreign surveillance (except for all the domestic dragnets and "inadvertent" collections) and national security issues. The FBI's out-in-the-open battle with Apple has already proceeded far past the point any quasi-hearing in front of the FISC would have reached. That's the sort of thing an actually adversarial system -- unlike the mostly-closed loop of the FISA court -- tends to produce: a give-and-take played out (mostly) in public, rather than one party saying "we need this" and the other applying ink to the stamp.
In all the discussions about Apple v. the FBI, a few people occasionally ask what would happen if Apple's engineers just refused to write the code demanded (some also ask about writing the code, but purposely messing it up). And now it appears that at least some Apple engineers are thinking about just this scenario. According to the NY Times:
Apple employees are already discussing what they will do if ordered to help law enforcement authorities. Some say they may balk at the work, while others may even quit their high-paying jobs rather than undermine the security of the software they have already created, according to more than a half-dozen current and former Apple employees.
As the NY Times notes, these details certainly add some pretty hefty weight to the First Amendment arguments about "compelled speech" that Apple has made (and that the EFF doubled down on in its amicus brief). As for what would happen then... that's up to the court, but it's likely that the court would find Apple in contempt and/or start fining it. But that still leaves open the question of how Apple complies if not a single engineer is willing to help out.
This particular legal dispute gets more interesting day by day...
Although the benefits of sharing big datasets are well-known, so are the privacy issues that can arise as a result. The tension between a desire to share information widely and the need to respect the wishes of those to whom it refers is probably most acute in the medical world. Although the hope is that aggregating health data on a large scale can provide new insights into diseases and their treatments, doing so makes issues of consent even trickier to deal with. A new study of Parkinson's disease from Sage Bionetworks, which describes itself as a "non-profit biomedical research organization," takes a particularly interesting approach. Unusually, it used an iPhone app to gather data directly from the participants:
The mPower app, built by Sage with support from the Robert Wood Johnson Foundation, collects data on capacities affected by Parkinson's disease, including dexterity, balance and gait, memory, and certain vocal characteristics, through tasks that make use of iPhone sensors. For example, to measure dexterity, participants complete a speed tapping exercise on their iPhone’s touchscreen. To evaluate speech, participants use their iPhone's microphone to record themselves pronouncing a vowel -- saying Aaaaah -- for 10 seconds. The app also allows participants to track when each task is completed alongside the time they take their medication, to help determine the effects of that medicine on their symptoms. Participants also complete regular surveys, rating the severity of their symptoms and what they think makes them better or worse.
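As a hypothetical illustration of how raw readings from a task like the one above could become research data, here's a short Python sketch summarizing a tapping test from touch timestamps. mPower's actual analysis code isn't shown in the article, so the function, field names, and example values are invented:

```python
# Invented sketch of a tapping-task summary; not Sage's actual pipeline.
from statistics import mean, stdev

def tap_summary(tap_times_s: list[float]) -> dict:
    """Summarize a speed-tapping test from raw touchscreen timestamps."""
    intervals = [b - a for a, b in zip(tap_times_s, tap_times_s[1:])]
    return {
        "tap_count": len(tap_times_s),
        "mean_interval_s": round(mean(intervals), 3),
        # interval variability is the kind of signal a dexterity
        # measure would plausibly care about
        "interval_sd_s": round(stdev(intervals), 3),
    }

# e.g. timestamps captured from the phone's touchscreen:
print(tap_summary([0.00, 0.21, 0.40, 0.63, 0.81, 1.05, 1.24]))
```

However the app actually computes its metrics, the design turns routine phone interactions into longitudinal clinical measurements.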
That's a clever use of smartphone capabilities to allow people to become active participants in the study -- citizen scientists, almost -- but hardly a major breakthrough. What is much more impressive is the way in which the study has handled the issue of how widely the resulting data will be shared, and with whom:
Unlike traditional studies, mPower participants are able to choose who to share their data with. Sharing options include only those researchers associated with mPower, or qualified researchers worldwide. So far, over 75 percent of the more than 12,000 mPower participants chose to share their data broadly with researchers. This cutting-edge consent process, which is the driving force behind the decision to widely release the mPower data set, is outlined in a third paper published today in Nature Biotechnology, and represents a sea change in participant control over data sharing.
There's a double benefit here. Not only are participants given direct control over how their data will be used, but a broader range of "qualified researchers" can gain access. To be classed as "qualifying," researchers must:
demonstrate their awareness and understanding of the data-sharing framework and applied ethics through a short, 18-question examination;
validate their identity to Sage Bionetworks through a variety of approved methods, such as an academic letter from a signing official, a notarized letter attesting to identity or a copy of a professional license;
make a public statement of intended data use, which we can in turn feed back to participants in the spirit of engagement and transparency;
explicitly agree to a 'contract' of data sharing, including the following: (i) downloading, initialing, signing, scanning and uploading a researcher oath to adhere to a code of behavior; (ii) complying with any data-specific conditions of use.
As you might expect, the researchers at Sage Bionetworks are great believers in openness:
As part of living our philosophy, all of our tools, platforms, and products are open source. Our software is available on GitHub, and our non-software creative works are licensed under the Creative Commons Attribution 3.0 Unported license except for legacy publications in closed journals.
We now dedicate funds to pay author processing charges for full Open Access on research publications and are actively working to free up our backfile of closed publications.
Open source, open access, open data released with informed consent: this is what open science looks like.
This is not all that surprising, but President Obama, during his SXSW keynote interview, appears to have joined the crew of politicians making misleading statements pretending to be "balanced" on the question of encryption. The interview (the link above should start at the very beginning) talks about a variety of issues related to tech and government, but eventually the President zeroes in on the encryption issue. The embed below should start at that point (if not, it's at the 1 hour, 16 minute mark in the video). Unfortunately, the interviewer, Evan Smith of the Texas Tribune, falsely frames the issue as one of "security v. privacy" rather than what it actually is -- which is "security v. security."
In case you can't watch that, the President says he won't comment directly on the Apple legal fights, but then launches into the standard politician talking point of "yes, we want strong encryption, but bad people will use it so we need to figure out some way to break in."
If you watch that, the President is basically doing the same thing as all the Presidential candidates, stating that there's some sort of equivalency on both sides of the debate and that we need to find some sort of "balanced" solution short of strong encryption that will somehow let in law enforcement in some cases.
This is wrong. This is ignorant.
To his at least marginal credit, the President (unlike basically all of the Presidential candidates) did seem to acknowledge the arguments of the crypto community -- but then told them all that they're wrong. In some ways, this may be slightly better than those who don't even understand the actual issues at all, but it's still problematic.
Let's go through this line by line.
All of us value our privacy. And this is a society that is built on a Constitution and a Bill of Rights and a healthy skepticism about overreaching government power. Before smartphones were invented, and to this day, if there is probable cause to think that you have abducted a child, or that you are engaging in a terrorist plot, or you are guilty of some serious crime, law enforcement can appear at your doorstep and say 'we have a warrant to search your home' and they can go into your bedroom to rifle through your underwear to see if there's any evidence of wrongdoing.
Again, this is overstating the past and understating today's reality. Yes, you could always get a warrant to go "rifle through" someone's underwear -- if you could show a judge probable cause that such a search was reasonable. But that does not mean the invention of smartphones changed things as dramatically as President Obama presents here. For one, there has always been information that was inaccessible: information from an in-person conversation, information in our heads, or information that has been destroyed.
In fact, as lots of people have noted, today law enforcement has much more recorded evidence that it can obtain, totally unrelated to the encryption issue. This includes things like location information or information on people you called. That information used to not be available at all. So it's hellishly misleading to pretend that we've entered some new world of darkness for law enforcement when the reality is that the world is much, much brighter.
And we agree on that. Because we recognize that just like all our other rights, freedom of speech, freedom of religion, etc. there are going to be some constraints that we impose in order to make sure that we are safe, secure and living in a civilized society. Now technology is evolving so rapidly that new questions are being asked. And I am of the view that there are very real reasons why we want to make sure that government cannot just willy nilly get into everyone's iPhones, or smartphones, that are full of very personal information and very personal data. And, let's face it, the whole Snowden disclosure episode elevated people's suspicions of this.
That was a real issue. I will say, by the way, that -- and I don't want to go too far afield -- but the Snowden issue vastly overstated the dangers to US citizens in terms of spying. Because the fact of the matter is that actually our intelligence agencies are pretty scrupulous about US persons -- people on US soil. What those disclosures did identify were excesses overseas with respect to people who are not in this country. A lot of those have been fixed. Don't take my word for it -- there was a panel that was constituted that just graded all the reforms that we set up to avoid those charges. But I understand that that raised suspicions.
Again, at least some marginal kudos for admitting that this latest round was brought on by "excesses" (though we'd argue that it was actually unconstitutional, rather than mere overreach). And nice of him to admit that Snowden actually did reveal such "excesses." Of course, that raises a separate question: Why is Obama still trying to prosecute Snowden when he's just admitted that what Snowden did was clearly whistleblowing, in revealing questionable spying?
Also, the President is simply wrong that it was just about issues involving non-US persons. The major reform that has taken place wasn't about US persons at all, but rather about Section 215 of the PATRIOT Act, which was used almost entirely on US persons to collect all their phone records. So it's unclear why the President is pretending otherwise. The stuff outside of the US is governed by Executive Order 12333, and there's been completely no evidence that the President has changed that at all. I do agree, to some extent, that many do believe in an exaggerated view of NSA surveillance, and that's distracting. But the underlying issues about legality and constitutionality -- and the possibilities for abuse -- absolutely remain.
But none of that actually has to do with the encryption fight, beyond the recognition -- accurately -- that the government's actions, revealed by Snowden, caused many to take these issues more seriously. And, on that note, it would have been at least a little more accurate for the President to recognize that it wasn't Snowden who brought this on the government, but the government itself by doing what it was doing.
So we're concerned about privacy. We don't want government to be looking through everybody's phones willy-nilly, without any kind of oversight or probable cause or a clear sense that it's targeted who might be a wrongdoer.
What makes it even more complicated is that we also want really strong encryption. Because part of us preventing terrorism or preventing people from disrupting the financial system or our air traffic control system or a whole other set of systems that are increasingly digitalized is that hackers, state or non-state, can just get in there and mess them up.
So we've got two values. Both of which are important.... And the question we now have to ask is, if technologically it is possible to make an impenetrable device or system where the encryption is so strong that there's no key. There's no door at all. Then how do we apprehend the child pornographer? How do we solve or disrupt a terrorist plot? What mechanisms do we have available to even do simple things like tax enforcement? Because if, in fact, you can't crack that at all, government can't get in, then everybody's walking around with a Swiss bank account in their pocket. So there has to be some concession to the need to be able to get into that information somehow.
The answer to the questions in that final paragraph is good old-fashioned detective work. In a time before smartphones, detectives were still able to catch child pornographers and disrupt terrorist plots. And, in some cases, the government failed to stop those things -- not because strong encryption stymied investigators, but because there will always be some plots that people get away with. We shouldn't undermine our entire security infrastructure just because there are some bad people out there. In fact, doing so makes us less safe.
Also: tax enforcement? Tax enforcement? Are we really getting to the point that the government wants to argue that we need to break strong encryption to better enforce taxes? Really? Again, there are lots of ways to go after tax evasion. And, yes, there are lots of ways that people and companies try to hide money from the IRS. And sometimes they get away with it. To suddenly say that we should weaken encryption because the IRS isn't good enough at its job just seems... crazy.
Now, what folks who are on the encryption side will argue, is that any key, whatsoever, even if it starts off as just being directed at one device, could end up being used on every device. That's just the nature of these systems. That is a technical question. I am not a software engineer. It is, I think, technically true, but I think it can be overstated.
This is the part that's most maddening of all. He almost gets the point right. He almost understands. The crypto community has been screaming from the hills for ages that introducing any kind of third party access to encryption weakens it for all, introducing vulnerabilities that ensure that those with malicious intent will get in much sooner than they would otherwise. The President is mixing up that argument with one of the other arguments in the Apple/FBI case, about whether it's about "one phone" or "all the phones."
But even assuming this slight mixup is a mistake, and that he does recognize the basics of the arguments from the tech community, to have him then say that this "can be overstated" is crazy. A bunch of cryptography experts -- including some who used to work for Obama -- laid out in a detailed paper the risks of undermining encryption. To brush that aside as some sort of rhetorical hyperbole -- to brush aside the realities of cryptography and math -- is just crazy.
Encryption expert Matt Blaze (whose research basically helped win Crypto War 1.0) responded to this argument by noting that the "nerd harder, nerds" argument fundamentally misunderstands the issue:
Figuring out how to build the reliable, secure systems required to "compromise" on crypto has long been a central problem in CS.
If you can't read that, Blaze is basically saying that all crypto includes backdoors -- they're known as vulnerabilities. And the key focus in crypto is closing those backdoors, because leaving them open is disastrous. And yet the government is now demanding that tech folks purposely put in more backdoors and not close them, without recognizing the simple fact that vulnerabilities in crypto always lead to disastrous results.
So the question now becomes that, we as a society, setting aside the specific case between the FBI and Apple, setting aside the commercial interests, the concerns about what could the Chinese government do with this, even if we trust the US government. Setting aside all those questions, we're going to have to make some decisions about how do we balance these respective risks. And I've got a bunch of smart people, sitting there, talking about it, thinking about it. We have engaged the tech community, aggressively, to help solve this problem. My conclusion so far is that you cannot take an absolutist view on this. So if your argument is "strong encryption no matter what, and we can and should in fact create black boxes," that, I think, does not strike the kind of balance that we have lived with for 200, 300 years. And it's fetishizing our phones above every other value. And that can't be the right answer.
This is not an absolutist view. It is not absolutist to say that anything you do to weaken the security of phones creates disastrous consequences for overall security, far beyond the privacy of the individuals holding those phones. And, as Julian Sanchez rightly notes, it's ridiculous that the status quo -- itself the product of the previous compromise -- is now being framed as an "absolutist" position:
CALEA--with obligations on telecoms to assist, but user-side encryption protected--WAS the compromise. Now that's "absolutism".
Also, the idea that this is about "fetishizing our phones" is ridiculous. No one is even remotely suggesting that. No one is suggesting -- as Obama hints -- that phones deserve to be elevated above every other value. It's entirely about the nature of computer security and how it works. It's about the risks to our security in creating deliberate vulnerabilities in our technologies. To frame that as "fetishizing our phones" is insulting.
There's a reason why the NSA didn't want President Obama to carry a Blackberry when he first became President. And there's a reason the President wanted a secure Blackberry. And it's not because of fetishism in any way, shape or form. It's because securing data on phones is freaking hard and it's a constant battle. And anything that weakens the security puts people in harm's way.
I suspect that the answer is going to come down to how do we create a system where the encryption is as strong as possible. The key is as secure as possible. It is accessible by the smallest number of people possible for a subset of issues that we agree are important. How we design that is not something that I have the expertise to do. I am way on the civil liberties side of this thing. Bill McRaven will tell you that I anguish a lot over the decisions we make over how to keep this country safe. And I am not interested in overthrowing the values that have made us an exceptional and great nation, simply for expediency. But the dangers are real. Maintaining law and order and a civilized society is important. Protecting our kids is important.
You suspect wrong. Because while your position sounds reasonable and "balanced" (and I've seen some in the press describe President Obama's position here as "realist"), it's actually dangerous. This is the problem. The President is discussing this like it's a political issue rather than a technological/math issue. People aren't angry about this because they're "extremists" or "absolutists" or people who "don't want to compromise." They're screaming about this because "the compromise" solution is dangerous. If there really were a way to have strong encryption with a secure key where only a small number of people could get in on key issues, then that would be great.
But the key point that all of the experts keep stressing is: that's not reality. So, no, the President's not being a "realist." He's being the opposite.
So I would just caution against taking an absolutist perspective on this. Because we make compromises all the time. I haven't flown commercial in a while, but my understanding is that it's not great fun going through security. But we make the concession because -- it's a big intrusion on our privacy -- but we recognize that it is important. We have stops for drunk drivers. It's an intrusion. But we think it's the right thing to do. And this notion that somehow our data is different and can be walled off from those other trade-offs we make, I believe is incorrect.
Again, this is not about "making compromises" or some sort of political perspective. And the people arguing for strong encryption aren't being "absolutist" about it because they're unwilling to compromise. They're saying that the "compromise" solution means undermining the very basis of how we do security and putting everyone at much greater risk. That's ethically horrific.
And, also, no one is saying that "data is different." There has always been information that is "walled off." What people are saying is that one consequence of strong encryption is that it has to mean that law enforcement is kept out of that information too. That does not mean they can't solve crimes in other ways. It does not mean that they don't get access to lots and lots of other information. It just means that this kind of content is harder to access, because we need it to be harder to access to protect everyone.
It's not security v. privacy. It's security v. security -- where the security the FBI is fighting for is stopping the one-in-a-billion attack, and the security everyone else wants is preventing much more likely, and potentially much more devastating, attacks.
Meanwhile, of all the things for the President to cite as an analogy, TSA security theater may be the worst. Very few people think it's okay, especially since it's been shown to be a joke. Setting that up as the precedent for breaking strong encryption is... crazy. And, on top of that, using the combination of TSA security and DUI checkpoints as evidence for why we should break strong encryption with backdoors again fails to recognize the issue at hand. Neither of those undermine an entire security setup.
We do have to make sure, given the power of the internet and how much our lives are digitalized, that it is narrow and that it is constrained and that there's oversight. And I'm confident this is something that we can solve, but we're going to need the tech community, software designers, people who care deeply about this stuff, to help us solve it. Because what will happen is, if everybody goes to their respective corners, and the tech community says "you know what, either we have strong perfect encryption, or else it's Big Brother and Orwellian world," what you'll find is that after something really bad happens, the politics of this will swing and it will become sloppy and rushed and it will go through Congress in ways that have not been thought through. And then you really will have dangers to our civil liberties, because the people who understand this best, and who care most about privacy and civil liberties have disengaged, or have taken a position that is not sustainable for the general public as a whole over time.
I have a lot of trouble with the President's line about everyone going to "their respective corners," as it suggests a ridiculous sort of tribalism in which the natural state is the tech industry against the government and even suggests that the tech industry doesn't care about stopping terrorism or child pornographers. That, of course, is ridiculous. It's got nothing to do with "our team." It has to do with the simple realities of encryption and the fact that what the President is suggesting is dangerous.
Furthermore, it's not necessarily the "Orwellian/Big Brother" issue that people are afraid of. That's a red herring from the "privacy v. security" mindset. People are afraid of this making everyone a lot less safe. No doubt, the President is right that if "something really bad" happens, the politics will swing in one direction -- but it's pretty ridiculous for him to say that when the latest skirmish in this battle is being fought by his very own Justice Department, which jumped on the San Bernardino attacks as an excuse to push this line of argument.
If the President is truly worried about stupid knee-jerk reactions following "something bad" happening, rather than trying to talk about "balance" and "compromise," he could and should be doing more to fairly educate the American public, and to make public statements about this issue and how important strong encryption is. Enough of this bogus "strong encryption is important, but... the children" crap. The children need strong encryption. The victims of crimes need encryption. The victims of terrorists need encryption. Undermining all that because just a tiny bit of information is inaccessible to law enforcement is crazy. It's giving up the entire ballgame to those with malicious intent, just so that we can have a bit more information in a few narrow cases.
President Obama keeps mentioning trade-offs, but it appears that he refuses to actually understand the trade-offs at issue here. Giving up on strong encryption is not about finding a happy middle compromise. Giving up on strong encryption is putting everyone at serious risk.
As had been hinted at for months, the FCC formally unveiled its plan to apply some relatively basic privacy protections to broadband service. And while you'll likely see the broadband industry bitching up a storm over the next few months, none of the requirements are particularly onerous (and many are things ISPs are doing already). The agency's full announcement (pdf) notes the rules will require that ISPs a) are transparent about what they're doing, b) have basic systems in place to protect collected data and alert users in case of a data breach, and c) provide users with opt-out technology that actually works.
When the FCC reclassified ISPs as common carriers under Title II, ISPs were automatically subjected to Title II’s Section 222 privacy protections regarding "customer proprietary network information" (CPNI). But because those rules were largely designed for ye olde phone company, the FCC is updating them to become very basic rules of the road for handling customer data. ISPs have been quick to complain that they shouldn't face privacy protections that go beyond what Apple and Google deal with, but the FCC correctly argues that the lack of broadband competition makes ISPs a unique animal:
"A consumer’s relationship with her ISP is very different than the one she has with a website or app. Consumers can move instantaneously to a different website, search engine or application. But once they sign up for broadband service, consumers can scarcely avoid the network for which they are paying a monthly fee."
ISPs have also argued for the better part of a decade that they should be allowed to self-regulate on the privacy front, but their behavior has repeatedly indicated they're not actually capable of this. Verizon, for example, was caught last year modifying user packets to track users around the Internet and build detailed subscriber profiles. The company engaged in this behavior for two years before security researchers even discovered it. It took another six months of public and FCC pressure for Verizon to even provide working opt-out tools.
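To see how little machinery this kind of tracking requires, here's a hypothetical Python sketch of the mechanism: an in-path box appending a persistent per-subscriber header to every unencrypted HTTP request (Verizon's injected header was reported as "X-UIDH"). The code is illustrative, not Verizon's:

```python
# Invented sketch of carrier header injection; not Verizon's actual system.
import hashlib

def inject_tracking_header(raw_request: bytes, subscriber_id: str) -> bytes:
    """Add a stable tracking header to a plaintext HTTP/1.1 request."""
    # Derive a persistent token from the subscriber account.
    token = hashlib.sha256(subscriber_id.encode()).hexdigest()[:24]
    head, _, body = raw_request.partition(b"\r\n\r\n")
    head += b"\r\nX-UIDH: " + token.encode()
    return head + b"\r\n\r\n" + body

req = b"GET /page HTTP/1.1\r\nHost: example.com\r\n\r\n"
print(inject_tracking_header(req, "acct-1138").decode())
```

Every site the subscriber visits over plain HTTP sees the same token, with no cookie to clear and no opt-in -- which is why header injection drew so much fire, and why HTTPS forecloses it.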
Preparing for the FCC's announcement, the broadband industry has been pushing a number of "studies" arguing that broadband privacy protections aren't necessary because users can protect themselves with encryption or VPNs. But plenty of household data isn't encrypted (including traffic from most IoT devices), and the FCC is quick to note that the lack of broadband competition makes the sector notably different from, say, search engines:
"Even when data is encrypted, broadband providers can still see the websites that a customer visits, how often they visit them, and the amount of time they spend on each website. Using this information, ISPs can piece together enormous amounts of information about their customers – including private information such as a chronic medical condition or financial problems."
In short, what the FCC's doing is again relatively straightforward and basic, and most ISPs will be doing this stuff anyway. But that didn't stop AT&T from taking to its policy blog to complain that the modern broadband industry is just too damned evolved for privacy protections:
"In the 1980’s telecommunications industry, when companies were required by law to stay in their lanes, it might have made sense to have rules that applied only to one set of providers in an industry. But that was 30+ years ago, and we are long past that stage in U.S. communications policy...Limiting ISPs’ ability to compete with ad supported business models – which are overwhelmingly favored by consumers – is bad for consumers and ultimately bad for broadband investment in this country."
Except you should stop and take a look at what AT&T's vision of modern broadband privacy standards looks like. The company has been using deep packet inspection to track its U-verse broadband customers for years. But instead of making it easy and transparent to opt out, AT&T requires that its broadband customers not only jump through a number of confusing hoops, it actually charges them upwards of $60 per month to do so. When you understand that AT&T's vision of the broadband privacy future involves making actual privacy a cumbersome paid luxury, you can understand why companies like AT&T oppose even the most basic protections.
The FCC is also making it very clear it's not thwarting "innovation" in the user-tracking space, or seriously hindering the cash cow that is tracking users online. That was illustrated by the agency's recent settlement with Verizon over the aforementioned stealth trackers; the settlement lets Verizon happily continue fiddling with user traffic for the benefit of tracking, just as long as the company is clear about what's happening and provides working opt-out tools. That's really not a big deal by any standard.
So while you're going to see ISPs and friends bitch endlessly about this being "yet another heavy-handed FCC power grab" and the like, there's really not much here to be afraid of. It's also worth remembering that ISPs like AT&T and Verizon had every opportunity to preempt privacy regulations by showing they were capable of self-regulating, but failed repeatedly and painfully at the task.