Earlier this week, we noted that a huge list of companies, non-profits and cybersecurity experts had signed a letter to the White House about the stupidity and danger of trying to order backdoors into encryption (disclaimer: we signed the letter as well). While many in the press focused on the companies that had signed onto the letter (including Google, Apple, Cisco, Microsoft, Twitter and Facebook), as we noted, what was much more interesting was the long list of cybersecurity/encryption experts who signed onto the letter. Just in case you don't feel like searching it out, I'll post the entire list of those experts after this post.
It's a who's who of the brightest minds in encryption and cryptography. Whitfield Diffie invented public key cryptography. Phil Zimmermann created PGP. Ron Rivest is the "R" in "RSA." Peter Neumann was working on these issues decades before I was born. And many more on the list are just as impressive.
So how do you think FBI director James Comey -- who has been leading the charge on backdooring encryption -- responded to these experts?
A group of tech companies and some prominent folks wrote a letter to the President yesterday that I frankly found depressing. Because their letter contains no acknowledgment that there are societal costs to universal encryption. Look, I recognize the challenges facing our tech companies. Competitive challenges, regulatory challenges overseas, all kinds of challenges. I recognize the benefits of encryption, but I think fair-minded people also have to recognize the costs associated with that. And I read this letter and I think, “Either these folks don’t see what I see or they’re not fair-minded.” And either one of those things is depressing to me. So I’ve just got to continue to have the conversation.
First of all, it's kind of hilarious for the FBI director to be arguing that the people who signed that letter haven't done a cost-benefit analysis, since we've noted that the intelligence and law enforcement communities almost never do such an analysis. They always insist "more surveillance" must be better, without considering the costs involved.
And then there's this, showing that Comey still doesn't understand the letter at all:
We’ve got to have a conversation long before the logic of strong encryption takes us to that place. And smart people, reasonable people will disagree mightily. Technical people will say it’s too hard. My reaction to that is: Really? Too hard? Too hard for the people we have in this country to figure something out? I’m not that pessimistic. I think we ought to have a conversation.
Hey, Comey! No one is saying it's "too hard." They're saying it's IMPOSSIBLE to do this without weakening everyone's security. Impossible. It's not a "hard" problem, it's an impossible problem. Because if you weaken security to let the FBI in, by definition you are weakening the security to let others in as well. That's the point that was being made.
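To see why this is impossibility rather than pessimism, here's a toy sketch (deliberately simplified, NOT real cryptography -- the XOR "cipher" and all the names are purely illustrative) of what every escrow or "front door" proposal boils down to: a second copy of the decryption key held by someone other than the intended recipient.

```python
# Toy illustration (NOT real cryptography): a "lawful access" key is just
# another key. Whoever holds it can decrypt everything, full stop.
import secrets

def xor(data: bytes, key: bytes) -> bytes:
    # One-time-pad-style XOR; stands in for a real cipher here.
    return bytes(d ^ k for d, k in zip(data, key))

def encrypt_with_escrow(msg: bytes, recipient_key: bytes, escrow_key: bytes):
    session_key = secrets.token_bytes(len(msg))
    ciphertext = xor(msg, session_key)
    # The session key is wrapped twice: once for the recipient,
    # and once for the escrow holder (the mandated "front door").
    return {
        "ciphertext": ciphertext,
        "wrapped_for_recipient": xor(session_key, recipient_key),
        "wrapped_for_escrow": xor(session_key, escrow_key),
    }

def decrypt(blob, key: bytes, wrap: str) -> bytes:
    session_key = xor(blob[wrap], key)
    return xor(blob["ciphertext"], session_key)

msg = b"attack at dawn"
recipient_key = secrets.token_bytes(len(msg))
escrow_key = secrets.token_bytes(len(msg))
blob = encrypt_with_escrow(msg, recipient_key, escrow_key)

# The intended recipient can read the message...
assert decrypt(blob, recipient_key, "wrapped_for_recipient") == msg
# ...but so can ANYONE who obtains the escrow key -- the FBI today, a
# thief, rogue insider or foreign intelligence service tomorrow.
assert decrypt(blob, escrow_key, "wrapped_for_escrow") == msg
```

The math doesn't know who's asking: there's no way to write `decrypt` so that it works for the "good guys" and fails for everyone else. The escrow key is simply a second attack surface.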
And this is important. For all of the ridiculous claims by Comey and others that we need to "have a conversation" on this, we do not. A conversation is counterproductive. All of these people can and should be working on systems to make us all more safe and secure. But if they have to keep explaining to ignorant folks like Comey why this is a bad idea, then that time is taken away from making us safer. You can have a discussion over things that are hard. But there is no point in having a discussion over things that are impossible.
Secretary of State John Kerry gave a speech in South Korea this week about the importance of an "open and secure internet." Of course, that sounds a little hypocritical coming from the very same government that is actively working to undermine encryption, so it seems worth contrasting it with comments made by Secretary of Homeland Security Jeh Johnson, in which he whines about a secure internet making things better for terrorists. Kerry's speech is mostly good (with some caveats that we'll get to), in talking about the importance of not freaking out over moral panics and FUD:
Freedom. The United States believes strongly in freedom – in freedom of expression, freedom of association, freedom of choice. But particularly, this is important with respect to freedom of expression, and you believe in that freedom of expression here in Korea. We want that right for ourselves and we want that right for others even if we don’t agree always with the views that others express. We understand that freedom of expression is not a license to incite imminent violence. It’s not a license to commit fraud. It’s not a license to indulge in libel, or sexually exploit children. No. But we do know that some governments will use any excuse that they can find to silence their critics and that those governments have responded to the rise of the internet by stepping up their own efforts to control what people read, see, write, and say.
This is truly a point of separation in our era – now, in the 21st century. It’s a point of separation between governments that want the internet to serve their citizens and those who seek to use or restrict access to the internet in order to control their citizens.
That sounds good... until you compare it to Kerry's cabinet partner Johnson, who was doing exactly what Kerry said governments should not do:
“We are concerned that with deeper and deeper encryption, the demands of the marketplace for greater cybersecurity, deeper encryption in basic communications,” Johnson said on MSNBC’s “Morning Joe” on Friday. “It is making it harder for the FBI and state and local law enforcement to track crime, to track potential terrorist activity.”
Let's not even bother with the question of just what is "deeper and deeper encryption" or why we should have someone who clearly doesn't understand encryption in charge of Homeland Security. But it seems clear that Kerry and Johnson's views here are quite different. Kerry is saying that "governments will use any excuse they can" including bogus claims about "terrorism" and "criminals" -- and yet that's exactly what Johnson is doing.
Of course, later in his speech, Kerry starts enumerating a similar list for any country to use, should they want to control speech as well:
First, no country should conduct or knowingly support online activity that intentionally damages or impedes the use of another country’s critical infrastructure. Second, no country should seek either to prevent emergency teams from responding to a cybersecurity incident, or allow its own teams to cause harm. Third, no country should conduct or support cyber-enabled theft of intellectual property, trade secrets, or other confidential business information for commercial gain. Fourth, every country should mitigate malicious cyber activity emanating from its soil, and they should do so in a transparent, accountable and cooperative way. And fifth, every country should do what it can to help states that are victimized by a cyberattack.
In other words, here are the guidelines for any other countries to attack freedom of expression and openness online. Just claim it violates one of the list above and the US can't complain. We've certainly seen it happen before. DDoS attacks launched based on claims that it's in "response" to a hacking attempt. Or Russia cracking down on dissidents by arguing that they must be infringing on copyright law.
Kerry's statement is the kind of thing that very few people would argue against. It seems obvious: of course we don't want attacks on critical infrastructure (though the government likes to define "critical infrastructure" in a manner that best serves its own needs), or corporate espionage. But Kerry defines things in such a broad manner (including the bogus use of "theft" for "intellectual property") that it leaves the US wide open to abuse. Kerry was right at the beginning in arguing that governments will use any excuse they can find, so why give them this kind of opening? As we've seen for years: when the US beat up on China for not respecting our patents, China eventually "turned things around" by focusing on figuring out ways to use patents to block American companies from beating local Chinese firms in its market.
This isn't arguing that cyberattacks or infringement of intellectual property are good things -- just that giving foreign nations an "open internet, but..." framework allows them to make use of the "but..." portion to do all sorts of horrible things that suppress dissent and free expression, and then argue that they had to do it, because the US told them to do so. And, of course, it's not just foreign governments but, as Johnson's comments make clear, those at home as well. None of this is meant to encourage bad or illegal behavior online -- but to recognize that pushing for internet freedom means actually pushing for internet freedom, which is difficult to do when you immediately encumber it with your own set of conditions, and your colleagues are undermining the very foundation of a secure internet.
Nearly 150 tech companies (including us, via the Copia Institute), non-profits and computer security experts have all teamed up to send a letter to President Obama telling him to stop these stupid ideas about backdooring encryption that keep coming out of his administration. The press headlines will note that big companies -- like Google, Apple, Cisco, Microsoft, Twitter and Facebook -- are signing the letter. But significantly more interesting are the signatures from a huge list of computer security experts, all putting their names down on paper to make it clear what a ridiculously bad idea it is to even think about backdooring encryption. Among those signing on are Phil Zimmermann (who lived through this sort of thing before), Whitfield Diffie (the guy who invented public key cryptography), Brian Behlendorf, Ron Rivest, Peter Neumann, Gene Spafford, Bruce Schneier, Matt Blaze, Richard Clarke (long-time counterterrorism guy in the White House), Hal Abelson and many, many more. Basically a who's who of people who actually know what they're talking about.
We urge you to reject any proposal that U.S. companies deliberately weaken the security of their products. We request that the White House instead focus on developing policies that will promote rather than undermine the wide adoption of strong encryption technology. Such policies will in turn help to promote and protect cybersecurity, economic growth, and human rights, both here and abroad.

Strong encryption is the cornerstone of the modern information economy's security. Encryption protects billions of people every day against countless threats—be they street criminals trying to steal our phones and laptops, computer criminals trying to defraud us, corporate spies trying to obtain our companies' most valuable trade secrets, repressive governments trying to stifle dissent, or foreign intelligence agencies trying to compromise our and our allies' most sensitive national security secrets.

Encryption thereby protects us from innumerable criminal and national security threats. This protection would be undermined by the mandatory insertion of any new vulnerabilities into encrypted devices and services. Whether you call them "front doors" or "back doors", introducing intentional vulnerabilities into secure products for the government's use will make those products less secure against other attackers. Every computer security expert that has spoken publicly on this issue agrees on this point, including the government's own experts.
There's much more in the full letter which I highly recommend reading. It very nicely summarizes why this is a completely insane idea, and highlights why anyone raising it should be immediately told to move on to some other project instead:
The Administration faces a critical choice: will it adopt policies that foster a global digital ecosystem that is more secure, or less? That choice may well define the future of the Internet in the 21st century. When faced with a similar choice at the end of the last century, during the so-called "Crypto Wars", U.S. policymakers weighed many of the same concerns and arguments that have been raised in the current debate, and correctly concluded that the serious costs of undermining encryption technology outweighed the purported benefits. So too did the President's Review Group on Intelligence and Communications Technologies, who unanimously recommended in their December 2013 report that the US Government should "(1) fully support and not undermine efforts to create encryption standards; (2) not in any way subvert, undermine, weaken, or make vulnerable generally available commercial software; and (3) increase the use of encryption and urge US companies to do so, in order to better protect data in transit, at rest, in the cloud, and in other storage."
The Washington Post quotes another surprising signatory: Paul Rosenzweig, the former Deputy Assistant Secretary for Policy at Homeland Security. If that name sounds familiar, it's because we've quoted his defense of the NSA, once arguing that "too much transparency defeats the very purpose of democracy." If even he is arguing against backdooring encryption, you know it's an idea that should be killed off. In his case, it's because he recognizes the simple reality that seems to have eluded the FBI director:
The signatories include policy experts who normally side with national-security hawks. Paul Rosenzweig, a former Bush administration senior policy official at the Department of Homeland Security, said: “If I actually thought there was a way to build a U.S.-government-only backdoor, then I might be persuaded. But that’s just not reality.”
And the world would be much better off if all of these security experts and companies could focus on better protecting us from harm, rather than having to join in ridiculous debates about what a bunch of clueless bureaucrats think might be some sort of mythical magic unicorn encryption breaker.
When asked directly if the FBI wants a backdoor, [Amy] Hess [Asst. Director of FBI's Science & Technology branch] dodged the question and did not describe in detail what actual solution the FBI is seeking.
“We are simply asking for information that we seek in response to a lawful order in a readable format,” Hess responded, while also repeating that the Bureau supports strong encryption. “But how that actually happens should be the decision of the provider.”
When pressed again, Hess said that it would be okay for the FBI not to have a key to decrypt data, if the provider “can get us that information by maintaining the key themselves.”
That's asking the impossible -- for a great many reasons. First and foremost, compromised encryption is compromised encryption. It can be exploited by criminals and other unwanted entities just as certainly as it can assist law enforcement agencies in obtaining the information they're seeking. There's no way around this fact. You cannot have "good guys only" encryption.
First off, if Google gives the FBI the backdoors it wants, that only nails down Google. But Google also distributes thousands of third-party apps through its Play store. And these apps may not contain the subverted encryption the FBI is looking for. Now, Google has to be in the business of regulating third-party apps to ensure they meet the government's standard for compromised encryption.
The obvious answer is that Google can’t stop with just backdooring disk encryption. It has to backdoor the entire Android cryptography library. Whenever a third-party app generates an encrypted blob of data, for any purpose, that blob has to include a backdoor.
This move may work, but it only affects apps using Google's encryption. Other offerings may rely on other encryption methods. Then what? It has a few options, all of them carrying horrendous implications.
One option: require Google to police its app store for strong cryptography. Another option: mandate a notice-and-takedown system, where the government is responsible for spotting secure apps, and Google has a grace period to remove them. Either alternative would, of course, be entirely unacceptable to the technology sector—the DMCA’s notice-and-takedown system is widely reviled, and present federal law (CDA 230) disfavors intermediary liability.
At this point, Mayer suggests the "solution" is already outside the realm of political feasibility. Would the FBI really push this far to obtain encryption backdoors? The FBI itself seems unsure of how far it's willing to go, and many officials quoted (like the one above) seem to think all the FBI really needs to do is be very insistent on this point, and techies will come up with some magical computing solution that maintains the protective qualities of encryption while simultaneously allowing the government to open the door and have a look around any time it wants to.
So, if the FBI is willing to travel this very dark road littered with an untold amount of collateral damage, it still hasn't managed to ensure the phones it encounters will open at its command. Considering phone users could still acquire apps from other sources, the government's reach would only extend as far as the heavily-policed official app store (and other large competitors' app stores). Now what? More government power and less operational stability.
The only solution is an app kill switch. (Google’s euphemism is “Remote Application Removal.”) Whenever the government discovers a strong encryption app, it would compel Google to nuke the app from Android phones worldwide. That level of government intrusion—reaching into personal devices to remove security software—certainly would not be well received. It raises serious Fourth Amendment issues, since it could be construed as a search of the device or a seizure of device functionality and app data. What’s more, the collateral damage would be extensive; innocent users of the app would lose their data.
Even if the government were willing to take it this far, it still doesn't eradicate apps that it can't crack. (But it may be sufficient to only backdoor the most used apps, which may be all it's looking to achieve...) App creators could decide to avoid Google's government-walled garden and mandated kill switch by assigning random identifiers and handling a majority of the app's services (like a messaging service, etc.) via a website, out of reach of app removal tools and government intervention. To stop this, the US government would need to do the previously unimaginable:
In order to prevent secure data storage and end-to-end secure messaging, the government would have to block these web apps. The United States would have to engage in Internet censorship.
Robert Graham at Errata Security makes similar points in his post on the subject, but raises a couple of other interesting (in the horrific train wreck meaning of the word) points. While the government may try to regulate the internet, it can't (theoretically) touch services hosted in foreign countries. (Although it may soon be able to hack away at them with zero legal repercussions…)
Such services could be located in another country, because there are no real national borders in cyberspace. In any event, such services aren't "phone" services, but instead just "contact" services. They let people find each other, but they don't control the phone call. It's possible to bypass such services anyway, by either using a peer-to-peer contact system, or overloading something completely different, like DNS.
Like crypto, the entire Internet is based on the concept of end-to-end, where there is nothing special inside the network that provides a service you can regulate.
The FBI likely has no desire to take its fight against encryption this far. The problem is that it thinks its "solution" to encryption is "reasonable." But it isn't.
The point is this. Forcing Apple to insert a "Golden Key" into the iPhone looks reasonable, but the truth is the problem explodes to something far outside of any sort of reasonableness. It would mean outlawing certain kinds of code -- which is probably not possible in our legal system.
The biggest problem here is that no one arguing for "golden keys," key escrow, "good guy" backdoors, etc. seems to have any idea what implementing this could actually result in. They think it's just tech companies sticking it to The Man, possibly because a former NSA sysadmin went halfway around the world with a pile of documents and a suitcase of whistles with "BLOW ME" printed on the side.
But it isn't. And their continual shrugged assertion that the "smart guys" at tech companies will figure this all out for them is not only lazy, it's colossally ignorant. There isn't a solution. The government can't demand that companies not provide encryption. It's not willing to ban encryption, nor is it in any position to make that ban stick. It doesn't know what it needs. It only knows what it wants. And it can't have what it wants -- not because no one is willing to give it to them, but because no one can.
Yes, many tech companies are far more wary of collaborating with the government in this post-Snowden era, but in this case, the tech world cannot give the FBI what it wants without destroying nearly everything surrounding the "back door." And continually trotting out kidnappers, child porn enthusiasts and upskirt photographers as reasons for breaking cell phone platforms doesn't change the fact that it cannot be done without potentially harming every non-criminal phone owner and the services they use.
Yesterday, the House Oversight Committee held a hearing over this whole stupid kerfuffle about mobile encryption. If you don't recall, back in the fall, both Apple and Google said they would start encrypting data on mobile devices by default, leading to an immediate freakout by law enforcement types and launching a near-exact replica of the crypto wars of the 1990s.
While many who lived through the first round had hoped this would die a quick death, every week or so, we see someone else in law enforcement demonizing encryption, without seeming to recognize how ridiculous they sound. There was quite a bit of that in the hearing yesterday, which you can sit and watch in its entirety if you'd like:
Thankfully, there were folks like cryptographer Matt Blaze and cybersecurity policy expert Kevin Bankston on hand to make it clear how ridiculous all of this is -- but it didn't stop law enforcement from making their usual claims. The most ridiculous, without a doubt, was Daniel Conley, the District Attorney from Suffolk County, Massachusetts, whose opening remarks were so ridiculous that it's tough to read them without loudly guffawing. It's full of the usual "but bad guys -- terrorists, kidnappers, child porn people -- use this" arguments, along with the usual "law enforcement needs access" stuff. And he blames Apple and Google for using a "hypothetical" situation as reason to encrypt:
Apple and Google are using an unreasonable, hypothetical narrative of government intrusion as the rationale for the new encryption software, ignoring altogether the facts as I’ve just explained them. And taking it to a dangerous extreme in these new operating systems, they’ve made legitimate evidence stored on handheld devices inaccessible to anyone, even with a warrant issued by an impartial judge. For over 200 years, American jurisprudence has refined the balancing test that weighs the individual’s rights against those of society, and with one fell swoop Apple and Google has upended it. They have created spaces not merely beyond the reach of law enforcement agencies, but beyond the reach of our courts and our laws, and therefore our society.
The idea that anything in mobile encryption "upends" anything is ridiculous. First, we've had encryption tools for both computers and mobile devices for quite some time. Apple and Google making them more explicit hardly upends anything. Second, note the implicit (and totally incorrect) assumption that historically law enforcement has always had access to all your communications. That's not true. People have always been able to talk in person, or they've been able to communicate in code. Or destroy communications after making them. There have always been "spaces" that are "beyond the reach of law enforcement."
But to someone so blind as to be unaware of all of this, Conley thinks this is somehow "new":
I can think of no other example of a tool or technology that is specifically designed and allowed to exist completely beyond the legitimate reach of law enforcement, our courts, our Congress, and thus, the people. Not safe deposit boxes, not telephones, not automobiles, not homes. Even if the technology existed, would we allow architects to design buildings that would keep police and firefighters out under any and all circumstances? The inherent risk of such a thing is obvious so the answer is no. So too are the inherent risks of what Apple and Google have devised with these operating systems that will provide no means of access to anyone, anywhere, anytime, under any circumstance.
As Chris Soghoian pointed out, just because Conley can't think of any such technology, it doesn't mean it doesn't exist. Take the shredder for example. Or fire.
During the hearing, Conley continued to show just how far out of his depth he was. Rep. Blake Farenthold (right after quizzing the FBI on why it removed its recommendation on mobile encryption from its website -- using the screenshot and highlighting I made), asked the entire panel:
Is there anybody on the panel [who] believes we can build a technically secure backdoor with a golden key -- raise your hand?
No one did -- neither DA Conley nor the FBI's Amy Hess:
But, just a few minutes later, Conley underscored his near absolute cluelessness by effectively arguing "if we can put a man on the moon, we can make backdoor encryption that doesn't put people at risk." Farenthold catalogs a variety of reasons why backdoor encryption is ridiculously stupid -- and even highlights how every other country is going to demand their own backdoors as well -- and asks if anyone on the panel has any solutions. Conley then raises his hand and volunteers the following bit of insanity:
I'm no expert. I'm probably the least technologically savvy guy in this room, maybe. But, there are a lot of great minds in the United States. I'm trying to figure out a way to balance the interests here. It's not an either/or situation. Dr. Blaze said he's a computer scientist. I'm sure he's brilliant. But, geeze, I hate to hear talk like 'that cannot be done.' I mean, think about if Jack Kennedy said 'we can't go to the moon. That cannot be done.' [smirks] He said something else. 'We're gonna get there in the next decade.' So I would say to the computer science community, let's get the best minds in the United States on this. We can balance the interests here.
No, really. Watch it here:
As Julian Sanchez notes, this response is "all the technical experts are wrong because AMERICA FUCK YEAH."
This is why it's kind of ridiculous that we continue to let technologically clueless people lead these debates. There are things that are difficult (getting to the moon) and things that are impossible (letting only "good people" go to the moon). There are reasons for that. This isn't about technologists not working hard enough on this problem. This is a fundamental reality: creating backdoors weakens the infrastructure, absolutely and unavoidably. That's a fact, not a condition of poor engineering practices.
And, really, this idea of "getting the best minds" in the computer science community to work on this, I say please don't. That's like asking the best minds in increasing food production to stop all their work and spend months trying to research how to make it rain apples from clouds in the sky. It's not just counterproductive and impossible, but it takes away from the very real and important work they are doing on a daily basis, including protecting us from people who actually are trying to do us harm. That a law enforcement official is actively asking for computer scientists and cybersecurity experts to stop focusing on protecting people and, instead, to help undermine the safety of the public, is quite incredible. How does someone like Conley stay in his job while publicly advocating for putting the American people in more danger like that?
President Obama’s newly installed defense secretary, Ashton B. Carter, toured Silicon Valley last week to announce a new military strategy for computer conflict, starting the latest Pentagon effort to invest in promising start-ups and to meet with engineers whose talent he declared the Pentagon desperately needed in fending off the nation’s adversaries.
I'm sure the government could use the help but sending pitchmen tied to domestic surveillance/crotch-grabbing airport "security" (as in the case of DHS Secretary Jeh Johnson) or extrajudicial killings/endless wars (as in the case of Carter and the DoD) isn't going to win many new converts. It's going to have even less success winning over those who've already decided there's no way they're partnering up with the US government, not after two years of leaked documents showing the NSA has backdoored hardware, software, mobile devices... basically anything these companies touch.
Carter wants to rebuild trust. He could start by declassifying a pile of documents on Dept. of Defense activities before some leaker does it for him, but he's really not here to offer increased transparency. All he's offering is the same talking point agencies have routinely deferred to when commenting on exposed surveillance programs.
“I think that people and companies need to be convinced that everything we do in the cyber domain is lawful and appropriate and necessary,” Mr. Carter told students and faculty at Stanford.
That sentence is full of truth, but fundamentally dishonest. Yes, people and companies need to be "convinced" that these government agencies are acting lawfully and only doing what's "appropriate and necessary." But a really good place to start would be actually ensuring that government agencies act lawfully and only do what is appropriate and necessary. Simply claiming you are when the facts show otherwise doesn't do anything for anybody.
There's a CyberWar coming and the government is heavily scouting the West Coast for foot soldiers. If the government finds itself continually rebuffed by tech companies, will it decide to institute a cyberdraft? Legislators are pushing through bills to make "information sharing" -- something that would normally describe voluntary efforts -- mandatory. What Carter says sounds like he's prepared to initiate a cyber-Vietnam Conflict in hopes of heading off the next cyber-Pearl Harbor.
He urged the next generation of software pioneers and entrepreneurs to take a break from developing killer apps and consider a tour of service fending off Chinese, Russian and North Korean hackers…
The Pentagon plans to open its first office in Silicon Valley and provide venture capital in an effort to tap commercial technology that can be used to develop more advanced weapons and intelligence systems.
The desire for bright, young minds is understandable. What isn't is the government's apparent belief that a few chats and moving into the neighborhood will somehow make years of uncovered abuses simply vanish. The outreach would be admirable if it wasn't mired in the usual talking points. The government should expect nothing from the tech world -- for years.
The DoD and DHS opening branch offices in the Silicon Valley just as cybersecurity bills edge closer to becoming law is no coincidence. Much like many military-industrial contractors build offices and plants in the Beltway area to ensure maximum access to legislators, the government must also have a West Coast presence if it wants to efficiently "lobby" for information sharing and surveillance-ready products and services. And let's not forget the government's desire to "share" information is still mostly about obtaining usable exploits and beefing up existing surveillance programs, rather than ensuring the security of its constituents. Any statements to the contrary aren't to be trusted.
Last November, we ran through the list of senior law enforcement officers on both sides of the Atlantic who all came out with suspiciously similar whines about how strong crypto was turning the internet into a "dark and ungoverned" place. Judging by this story in Reuters, others want to join the choir:
Some technology and communication firms are helping militants avoid detection by developing systems that are "friendly to terrorists", Britain's top anti-terrorism police officer said on Tuesday.
That remark comes from Assistant Commissioner Mark Rowley, who is the UK's National Policing Lead for Counter-Terrorism, replacing Cressida Dick. Here's the problem according to Rowley:
"Some of the acceleration of technology, whether it's communications or other spheres, can be set up in different ways," Rowley told a conference in London.
"It can be set up in a way which is friendly to terrorists and helps them ... and creates challenges for law enforcement and intelligence agencies. Or it can be set up in a way which doesn't do that."
"Set up in a way which is friendly to terrorists and helps them" obviously means using strong crypto; "set up in a way which doesn't do that" therefore means with compromised crypto. Like his colleagues, Rowley too blames the current mistrust between the intelligence agencies and computer companies on Edward Snowden:
"Snowden has created an environment where some technology companies are less comfortable working with law enforcement and intelligence agencies and the bad guys are better informed," Rowley told Reuters after his speech.
Well, no, actually. That "environment" has been created by the NSA and GCHQ working together to break into the main online services, and undermine key aspects of digital technology, with no thought for the collateral damage that ruining internet security might cause for the world. Rowley is also quoted as saying:
"We all love the benefit of the internet and all the rest of it, but we need [technology companies'] support in making sure that they're doing everything possible to stop their technology being exploited by terrorists. I'm saying that needs to be front and centre of their thinking and for some it is and some it isn't."
The technology is not being "exploited" by terrorists, it's being used by them, just as they use telephones or microwaves or washing machines. That's what those devices are there for. The idea that deliberately breaking internet technologies should be "front and center" of technology companies' thinking bespeaks a complete contempt for their users.
This constant refrain about how awful strong crypto is, and how we must break it, is simply the intelligence services implicitly admitting that they find the idea of doing their job in a free society, where people are able to keep some messages private, too hard, so they would be really grateful if technology companies could just fall in line and make life easier by destroying privacy for everyone.
Today I am pleased to announce that the Department of Homeland Security is also finalizing plans to open up a satellite office in Silicon Valley, to serve as another point of contact with our friends here. We want to strengthen critical relationships in Silicon Valley and ensure that the government and the private sector benefit from each other’s research and development.
That's Jeh Johnson addressing the crowd at the RSA Conference. Of all the news no one wanted to hear, this has to be close to the top of the list. Three-lettered government agencies are pretty much NIMBY as far as the tech world is concerned, especially after Snowden's revelations have seriously and swiftly eroded trust in the government.
No one wants a next-door neighbor who's going to constantly be dropping by for a cup of decryption.
The current course we are on, toward deeper and deeper encryption in response to the demands of the marketplace, is one that presents real challenges for those in law enforcement and national security.
Let me be clear: I understand the importance of what encryption brings to privacy. But, imagine the problems if, well after the advent of the telephone, the warrant authority of the government to investigate crime had extended only to the U.S. mail.
Our inability to access encrypted information poses public safety challenges. In fact, encryption is making it harder for your government to find criminal activity, and potential terrorist activity.
We in government know that a solution to this dilemma must take full account of the privacy rights and expectations of the American public, the state of the technology, and the cybersecurity of American businesses.
We need your help to find the solution.
"Let me be clear: I understand the importance of what doors bring to privacy. But, imagine the problems if, well after humanity moved out of caves, the warrant authority of the government to investigate crime had only extended to dwellings without doors."
Bullshit. What the DHS -- along with other law enforcement agencies -- is seeking is the path of least resistance. It can get warrants to search encrypted devices. It just may not be able to immediately crack them open and feast on the innards. It may also get court orders to compel decryption. This is far less assured and risks dragging the Fifth Amendment down to the Fourth's level, but it's still an option.
Then there's the option of subpoenaing third parties, like cloud storage services, to find the content that can't be accessed on the phone. So, it's not as though it's locked out forever. This may happen occasionally but it won't suddenly turn law enforcement into a wholly futile pursuit.
Silicon Valley isn't going to help the DHS "find a solution." There isn't one. The DHS may as well get some legislation going and force companies to provide a stupid "good guys only" backdoor because the tech world already knows you can't keep bad guys out with broken encryption. This should be painfully obvious and yet, the "good guy" agencies seem to think tech companies are just holding out on them.
From there, Johnson switches to his most disingenuous rhetorical device: the assertion that Americans are clamoring for an unrealistic level of safety.
I tell audiences that I can build you a perfectly safe city on a hill, but it will constitute a prison.
Who the fuck is asking you to do that? The only people pushing for "perfectly safe" are government agencies who like big budgets and increased power and the private companies that profit from this sort of fearmongering. Most Americans are far more pragmatic and they'd rather keep what's left of their privacy and civil liberties, even if it means the safety of the country is slightly less assured.
And this makes me want to vomit with contempt:
In the name of homeland security, we can build more walls, erect more screening devices, interrogate more people, and make everybody suspicious of each other, but we should not do this at the cost of who we are as a nation of people who cherish privacy and freedom to travel, celebrate our diversity, and who are not afraid.
Jeh Johnson hasn't been in the position long, but he's already descended into inadvertent self-parody. This speech was apparently delivered with complete sincerity, which means Johnson has no idea how his agency is perceived. There are very few people who believe the DHS is some sort of civil liberties champion. Jeh Johnson is obviously one of them.
from the broken-encryption-isn't-broken-said-no-one-ever dept
The government continues to look for ways to route around Apple and Google's phone encryption. The plans range from legislated backdoors to a mythical "golden key" to split-key escrow where the user holds one key and the government shares the other with device makers.
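To make concrete what "split-key escrow" means (a hypothetical sketch for illustration only, not any agency's actual proposal), here's the simplest possible version: the encryption key is divided into two shares, and decryption requires recombining both. The function names and the XOR-splitting scheme are my own choices for the example. Note that the math itself is the objection the experts keep raising: whoever holds the escrowed share, and can compel or steal the other, holds the key.

```python
import os

def split_key(key: bytes) -> tuple[bytes, bytes]:
    """Split a key into two XOR shares; both are needed to recover it."""
    share_a = os.urandom(len(key))  # random pad, same length as the key
    share_b = bytes(a ^ k for a, k in zip(share_a, key))
    return share_a, share_b

def recombine(share_a: bytes, share_b: bytes) -> bytes:
    """XOR the shares back together to recover the original key."""
    return bytes(a ^ b for a, b in zip(share_a, share_b))

key = os.urandom(32)  # a device's hypothetical encryption key
user_share, escrow_share = split_key(key)
assert recombine(user_share, escrow_share) == key
# Either share alone reveals nothing about the key -- but the
# escrowed share is now a single high-value target, and anyone
# who obtains both shares gets everything.
```

Each share on its own is indistinguishable from random noise, which is why the scheme sounds safe on paper; the problem is operational, not mathematical. The escrow database, and the process for handing out shares, becomes exactly the kind of "good guys only" door the signatories of the letter say cannot be built.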
None of these are solutions. And there's no consensus that this is a problem in search of one. Law enforcement and intelligence agencies will still find ways to get what they want from these phones, but it may involve more legwork/paperwork and the development of new tools and exploits. Without a doubt, encryption will not leave law enforcement unable to pursue investigations. Cellphones are a relatively recent development in the lifespan of law enforcement and no crime prior to the rise of cellphone usage went uninvestigated because suspects weren't walking around with the entirety of their lives in their pockets.
But still the government continues to believe there's some way to undermine this encryption in a way that won't allow criminals to exploit it. This belief is based on nothing tangible. One can only imagine how many deafeningly silent beats passed between question and answer during White House cybersecurity policy coordinator Michael Daniel's conversation with reporters following the recent RSA conference.
In a meeting with a handful of reporters, Daniel was asked whether or not he could name a respected technology figure who believed it possible to have strong encryption that could be circumvented by just one party's legal authority.
"I don't have any off the top of my head," Daniel said…
And he never will. No one who knows anything about encryption will ever say it's possible to create a "good guys only" backdoor. Or front door. Or whatever analogy government officials choose to deploy when arguing for the "right" to access anyone's device with minimum effort.
But that's not the end of Daniel's embarrassing response. He went on to disingenuously toss this back at "Silicon Valley" with a back-handed compliment insinuating that if these companies don't solve this "problem" for the government, they're either stupid or evil.
[Daniel] added that if any place could come up with an answer, it would be the "enormously creative" Silicon Valley.
The government believes there's a solution out there -- some magical alignment of hashes that would keep malicious hackers out and let the government in. It certainly can't figure out this conundrum, so it's going to keep insinuating that tech companies already know how to solve the problem but they hate children/law enforcement/America so much they won't even consider meeting the government halfway.
But the tech companies know -- as do security experts -- that there's no "halfway." You can have encryption that works and keeps everyone locked out or you can have the government's "encryption," which is spelled exactly the same but has extremely leaky quote marks constantly appended, and which lets everyone in the same "door," no matter who they are or what their intent is.
Some months back, our own Glyn Moody wrote about the music industry in Australia and its attempt to basically broadly multiply copyright protections, routing around the public's representatives in government to get ISPs to act as judge, jury and executioner. Then, because Glyn Moody is a witch who turned my sister into a newt, he wondered aloud whether VPNs would be the next target in the copyright industry's crosshairs.
If it is passed, copyright owners would be able to apply for a federal court order requiring internet service providers to block overseas sites whose primary purpose is infringing copyright or facilitating the infringement of copyright. While the bill is designed to target BitTorrent sites, such as the Pirate Bay, there are concerns other online services such as VPNs and digital storage lockers could fall victim.
The campaigns manager for Choice, Erin Turner, says at least 684,000 Australian households currently employ VPNs to bypass geoblocks and access overseas content at globally competitive prices.
No need to go halfway here: if the bill is written and passed in its current vague iteration, VPNs and storage lockers absolutely will be under attack. Entertainment companies both foreign and domestic have been complaining for years about Australians using VPNs to route around geo-restrictions and get overseas content, and it would be silly to pretend infringers don't use VPNs to conceal themselves. All that said, there are a ton of legitimate reasons to use a VPN or storage locker. That's why crafting industry-specific legislation like this is so tricky, particularly when the target of the law is a widely used product or platform. There are simply going to be consequences that the public would consider unintended and that I consider specifically intended in the vagueness of the law. Copyright protection advocates always want more, never less, and they aren't exactly known for showing restraint when they feel they have tools at their disposal.
The enemy here is ambiguity.
Copyright expert Kimberlee Weatherall says it is difficult to predict whether the bill will be used by copyright holders to argue for an injunction against a VPN service, because it lacks clarity regarding services and sites whose primary purpose is not copyright infringement but which may be used for that purpose.
Which means that the law cannot be allowed to pass as it is currently written. Legislation doesn't necessarily have to be specifically proscriptive, but a lack of clarity on a technology service so common and so tangential to the chief target of the bill means the bill sucks. Hell, it's not like I'm making this concern up, even. Already content providers are arguing for tightened screws on Aussie VPNs.
Cordell Jigsaw Zapruder managing director Nick Murray told Mumbrella the current arrangements are only benefitting international players like Netflix because under the current production deals content is sold by territories.
Asked if it should be illegal for Australians to access overseas platforms using a VPN he said: “It should be. It should absolutely be regulated somehow to make it so people in Australia shouldn’t use VPNs.” Murray defended the arrangement of selling content by territory, saying “that’s how we get our money,” adding: “The people who say we should get rid of the geo-blocker, it’s just bizarre, as that is how content is sold.”
Yes, arguing that something should change is bizarre because that thing hasn't changed yet. Great argument you have there. But we can at least give Murray credit for being blatantly open and honest about his desire to take technology tools away from Australian citizens.