Well, this is disappointing. Back in September, we were happy to see both Apple and Google announce that their mobile platforms would be encrypted by default (for local storage, not for data transmissions). That kicked off something of a new round of Crypto Wars, as law enforcement types have shoved each other aside to spread as much FUD as possible about the "dangers" of mobile encryption (ignoring that they also recommend mobile encryption to keep your data safe).
However, as Ars Technica reported earlier this week, it appears that while Google is encrypting by default on its own Nexus phones that have the latest Android (Lollipop), it slightly eased back the requirements for its OEM partners, such as Motorola and Samsung, that make their own devices. Default encryption is now "very strongly RECOMMENDED" rather than required. And even with that "very strong RECOMMENDATION," it appears that neither Samsung nor Motorola is enabling default encryption on their latest devices.
While some will likely jump to the conclusion that law enforcement pressure is at work here, a much more likely explanation is just the performance drag created by encryption. Last fall, Anandtech did some benchmarking of the Nexus 6 both with encryption on and off, and as the site itself says, the results are "not pretty." Given the competitive market, there's a decent chance that the big phone manufacturers didn't want to get bad benchmark ratings when phones are compared, and thus made the decision to go against the "very strong recommendation."
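The overhead those benchmarks measure comes from every storage read and write passing through an encryption layer. Here's a minimal, illustrative sketch of that cost, with a toy SHA-256 counter-mode keystream standing in for the cipher; real Android full-disk encryption uses dm-crypt with AES, often hardware-accelerated, which this deliberately does not reproduce:

```python
import hashlib
import time

# Toy model only: a SHA-256 counter-mode keystream stands in for AES,
# to show where the per-block CPU cost of an encrypted storage path lands.

def keystream(key: bytes, nblocks: int) -> bytes:
    # One 32-byte keystream block per counter value.
    return b"".join(
        hashlib.sha256(key + i.to_bytes(8, "big")).digest()
        for i in range(nblocks)
    )

def xor_encrypt(key: bytes, data: bytes) -> bytes:
    ks = keystream(key, (len(data) + 31) // 32)
    return bytes(a ^ b for a, b in zip(data, ks))

def bench(data: bytes, key: bytes) -> tuple[float, float]:
    t0 = time.perf_counter()
    plain_copy = bytes(data)         # "unencrypted write": a plain copy
    t1 = time.perf_counter()
    ciphertext = xor_encrypt(key, data)  # "encrypted write": encrypt, then copy
    t2 = time.perf_counter()
    return t1 - t0, t2 - t1

key = b"\x01" * 32
data = b"A" * (1 << 20)              # 1 MiB of payload
plain_t, enc_t = bench(data, key)
# The encrypted path costs strictly more CPU time per byte written --
# the same per-block tax that dm-crypt imposes on every storage I/O.
```

Hardware AES instructions shrink that gap dramatically, which is why optimizing new phones for encryption is largely a matter of wiring the crypto into dedicated silicon rather than doing it on the CPU.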
Hopefully this gets sorted out quickly as phonemakers optimize new phones for encryption. And, honestly, as the Anandtech report itself notes, these benchmarks are basically meaningless for real world performance:
The real question we have to ask is whether or not any of these storage benchmarks really matter on a mobile device. After all, the number of intensive storage I/O operations being done on smartphones and tablets is still relatively low, and some of the situations where NAND slowdowns are really going to have an effect can be offset by holding things in memory.
But, it appears, while mobile phone makers don't want to take the chance of bad benchmarks hurting their reputation, they're less concerned about leaving consumers' data exposed.
It's disappointing that this is where things are today, after so much focus on default encryption just a few months ago, but hopefully it's just a temporary situation and we'll get to default encryption very, very soon.
Back in January, we pointed out that, just after US and EU law enforcement officials started freaking out about mobile encryption and demanding backdoors, China was also saying that it wanted to require backdoors for itself in encrypted products. Now, President Obama claims he's upset about this, saying that he's spoken directly with China's President Xi Jinping about it:
In an interview with Reuters, Obama said he was concerned about Beijing's plans for a far-reaching counterterrorism law that would require technology firms to hand over encryption keys, the passcodes that help protect data, and install security "backdoors" in their systems to give Chinese authorities surveillance access.
"This is something that I’ve raised directly with President Xi," Obama said. "We have made it very clear to them that this is something they are going to have to change if they are to do business with the United States."
This comes right after the US Trade Rep Michael Froman issued a statement criticizing China for doing the same damn thing that the US DOJ is arguing the US should be doing:
U.S. Trade Representative Michael Froman issued a statement on Thursday criticizing the banking rules, saying they "are not about security – they are about protectionism and favoring Chinese companies".
"The Administration is aggressively working to have China walk back from these troubling regulations," Froman said.
Just last week, Yahoo's chief security officer Alex Stamos raised this exact issue with NSA director Admiral Mike Rogers, asking if Rogers thinks it's appropriate for tech companies to build backdoors for other countries if they build them for the US. Rogers ignored the question, just saying "I think we can work our way through this," which is not an answer. And now we're "working our way through this" by having to deal with other countries, such as China, leaping at this opportunity.
And the week before, President Obama himself claimed that he was all for strong encryption, but argued that there were tradeoffs worth discussing, and that some in his administration believed that demanding backdoors made sense to try to stop terrorist attacks. But it's tough to see how he can claim that it's okay to entertain those ideas on the one hand, while using the other hand to try to slap China for doing the exact same thing.
As security researcher Matthew Green rightly points out, "someday, US officials will look back and realize how much global damage they've enabled with their silly requests for key escrow." But that day is apparently not today.
The administration keeps bleating on and on about how China is a massive cybersecurity "threat" out there, and then hands the country this massive gift by having a kneejerk reaction to better encryption that protects American citizens.
There are way too many stories of Paypal unfairly and ridiculously cutting off services that rely on it as a payment mechanism, but here's yet another one. Mega, the cloud storage provider that is perhaps well-known for being Kim Dotcom's "comeback" act after the US government shut down Megaupload, has had its Paypal account cut off. The company claims that Paypal was pressured by Visa and Mastercard to cut it off:
Visa and MasterCard then pressured PayPal to cease providing payment services to MEGA.
MEGA provided extensive statistics and other evidence showing that MEGA's business is legitimate and legally compliant. After discussions that appeared to satisfy PayPal’s queries, MEGA authorised PayPal to share that material with Visa and MasterCard. Eventually PayPal made a non-negotiable decision to immediately terminate services to MEGA. PayPal has apologised for this situation and confirmed that MEGA management are upstanding and acting in good faith. PayPal acknowledged that the business is legitimate, but advised that a key concern was that MEGA has a unique model with its end-to-end encryption which leads to “unknowability of what is on the platform”.
MEGA has demonstrated that it is as compliant with its legal obligations as USA cloud storage services operated by Google, Microsoft, Apple, Dropbox, Box, Spideroak etc, but PayPal has advised that MEGA's "unique encryption model" presents an insurmountable difficulty.
That last line is particularly bizarre, given that if anyone recognizes the value of encryption it should be a freaking payments company. And, of course, Paypal can't know what's stored on any of those other platforms, so why is it being pressured to cut off Mega?
Mega's theory -- which is mostly reasonable -- is that because Mega was mistakenly listed in a report released by the "Digital Citizens Alliance" that insisted Mega was a rogue cyberlocker storing infringing content, payment companies were told to cut it off. If true, this is problematic on multiple levels. The methodology of the report was absolutely ridiculous. Because most Mega files are stored privately (like any Dropbox or Box or Google Drive account), the researchers at NetNames have no idea what's actually being stored there or whether it's being done perfectly legitimately. Instead, they found a few links to infringing works, and then extrapolated. That's just bad research practice.
Furthermore, the Digital Citizens Alliance is hardly an unbiased third party. It's an MPAA front group that was the key force in the MPAA's (now revealed) secret plan to have state attorneys general attack Google. Think the MPAA has reasons to try to go after any potential revenue source for Kim Dotcom? Remember, taking down Megaupload and winning in court against Dotcom has been a key focus of the MPAA since 2010 or so, and Dotcom recently noted that he's out of money and pleading with the court to release some of the funds seized by the government to continue to fight his case. The lawyers who represented him all along quit late last year when he ran out of money. It seems like the MPAA might have ulterior motives in naming Mega to that list, don't you think?
And, this all goes back to this dangerous effort by the White House a few years ago to set up these "voluntary agreements" in which payment companies would agree to cut off service to sites that the entertainment industry declared "bad." There's no due process. There's no adjudication. There's just one industry getting to declare websites it doesn't like as "bad" and all payment companies refusing to serve it. This seems like a pretty big problem.
Last week, The Intercept revealed how the NSA and GCHQ had hacked into the major supplier of SIM cards to swipe encryption keys for tons of mobile phones. Earlier this week, we noted that Gemalto appeared to be taking the Lenovo approach to insisting that no one was put at risk. Today the company presented the "findings" of its internal analysis of what happened, admitting that there were sophisticated hack attacks, but insisting that those attacks could not have reached the goldmine source of encryption keys. First, the admission of the hack:
In June 2010, we noticed suspicious activity in one of our French sites where a third party was trying to spy on the office network. By office network we mean the one used by employees to communicate with each other and the outside world. Action was immediately taken to counter the threat.
In July 2010, a second incident was identified by our Security Team. This involved fake emails sent to one of our mobile operator customers spoofing legitimate Gemalto email addresses. The fake emails contained an attachment that could download malicious code. We immediately informed the customer and also notified the relevant authorities both of the incident itself and the type of malware used.
During the same period, we also detected several attempts to access the PCs of Gemalto employees who had regular contact with customers.
At the time we were unable to identify the perpetrators but we now think that they could be related to the NSA and GCHQ operation.
And then the "but don't worry about it" part:
These intrusions only affected the outer parts of our networks – our office networks - which are in contact with the outside world. The SIM encryption keys and other customer data in general, are not stored on these networks. It is important to understand that our network architecture is designed like a cross between an onion and an orange; it has multiple layers and segments which help to cluster and isolate data.
While the intrusions described above were serious, sophisticated attacks, nothing was detected in other parts of our network. No breaches were found in the infrastructure running our SIM activity or in other parts of the secure network which manage our other products such as banking cards, ID cards or electronic passports. Each of these networks is isolated from one another and they are not connected to external networks.
The report also notes that it appears that someone (again, probably NSA/GCHQ) also targeted communications between Gemalto and its carrier partners using highly targeted spearphishing attacks -- but that the company sought to block those and has long used a "highly secure exchange process" to protect such transmissions.
The company also says that some of the operators listed in the leaked documents are ones that Gemalto has never worked with anyway, so if NSA/GCHQ got access to their keys, it wasn't via Gemalto. It further notes that even where the NSA/GCHQ may have gotten access to keys (via other means) it may have only been of limited use, while also noting that the encryption that was targeted was already pretty weak:
In 2010-2011 most operators in the targeted countries were still using 2G networks. The security level of this second generation technology was initially developed in the 1980s and was already considered weak and outdated by 2010. If the 2G SIM card encryption keys were to be intercepted by the intelligence services, it would be technically possible for them to spy on communications when the SIM card was in use in a mobile phone. This is a known weakness of the old 2G technology and for many years we have recommended that operators deploy extra security mechanisms. However, even if the encryption keys were intercepted by the Intelligence services they would have been of limited use. This is because most 2G SIMs in service at that time in these countries were prepaid cards which have a very short life cycle, typically between 3 and 6 months.
This known weakness in the original 2G standards was removed with the introduction of proprietary algorithms, which are still used as an extra level of security by major network operators. The security level was further increased with the arrival of 3G and 4G technologies which have additional encryption. If someone intercepted the encryption keys used in 3G or 4G SIMs they would not be able to connect to the networks and consequently would be unable to spy on communications. Therefore, 3G and 4G cards could not be affected by the described attack. However, though backward compatible with 2G, these newer products are not used everywhere around the world as they are a bit more expensive and sometimes operators base their purchasing decision on price alone.
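Why the stolen keys matter at all comes down to how GSM derives its session keys. The SIM holds a long-term secret, Ki; the network sends a random challenge, RAND; both sides run Ki and RAND through the operator's A3/A8 algorithms to produce an authentication response (SRES) and the radio encryption key (Kc). A sketch of that structure, using HMAC-SHA256 as a stand-in for the operator's real A3/A8 (often COMP128 variants, which this does not reproduce):

```python
import hashlib
import hmac
import os

# Toy model of GSM challenge-response key derivation. HMAC-SHA256 is a
# stand-in for the operator's secret A3/A8 algorithms; real SIMs use
# e.g. COMP128 variants, and Kc is what encrypts the radio link.

def a3_a8(ki: bytes, rand: bytes) -> tuple[bytes, bytes]:
    digest = hmac.new(ki, rand, hashlib.sha256).digest()
    sres = digest[:4]    # authentication response sent back to the network
    kc = digest[4:12]    # 64-bit session key for the radio link
    return sres, kc

ki = os.urandom(16)      # long-term secret burned into the SIM at manufacture
rand = os.urandom(16)    # network's random challenge, sent in the clear

# SIM and network independently derive the same session key...
sres_sim, kc_sim = a3_a8(ki, rand)
sres_net, kc_net = a3_a8(ki, rand)

# ...and so does an eavesdropper who stole Ki and captured RAND off the
# air. That is why bulk theft of Ki keys defeats GSM encryption entirely,
# with no need to break the cipher itself.
sres_spy, kc_spy = a3_a8(ki, rand)
```

With Ki in hand, an agency can derive Kc passively for every call that SIM makes, which is exactly what made Gemalto's key-provisioning pipeline such an attractive target.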
While I will admit to being pretty skeptical based on Gemalto's initial comments, its explanation here is somewhat more reasonable. While some may question whether Gemalto really was able to figure out what the NSA/GCHQ got access to, it does not appear that the company is merely brushing this off as a non-story. However, if the company really was hacked back in 2010/2011, one can reasonably question how much it can actually determine about what happened.
Update: Many of Gemalto's claims are now coming under scrutiny, with some suggesting that the company's "research" into things misses the point, and the details...
Admiral Mike Rogers, the NSA Director, has barely been on the job for a year, and so far he'd mostly avoided making the same kinds of absolutely ridiculous statements that his predecessor General Keith Alexander was known for. Rogers had, at the very least, appeared slightly more thoughtful in his discussions about the surveillance state and his own role in it. However, Rogers ran into a bit of trouble at New America's big cybersecurity event on Monday -- in that there were actual cybersecurity folks in the audience and they weren't accepting any of Rogers' bullshit answers. The most notable exchange was clearly between Rogers and Alex Stamos, Yahoo's chief security officer and a well-known privacy/cybersecurity advocate.
Alex Stamos (AS): “Thank you, Admiral. My name is Alex Stamos, I’m the CISO for Yahoo!. … So it sounds like you agree with Director Comey that we should be building defects into the encryption in our products so that the US government can decrypt…
Mike Rogers (MR): That would be your characterization. [laughing]
AS: No, I think Bruce Schneier and Ed Felten and all of the best public cryptographers in the world would agree that you can’t really build backdoors in crypto. That it’s like drilling a hole in the windshield.
MR: I’ve got a lot of world-class cryptographers at the National Security Agency.
AS: I’ve talked to some of those folks and some of them agree too, but…
MR: Oh, we agree that we don’t accept each others’ premise. [laughing]
AS: We’ll agree to disagree on that. So, if we’re going to build defects/backdoors or golden master keys for the US government, do you believe we should do so — we have about 1.3 billion users around the world — should we do for the Chinese government, the Russian government, the Saudi Arabian government, the Israeli government, the French government? Which of those countries should we give backdoors to?
MR: So, I’m not gonna… I mean, the way you framed the question isn’t designed to elicit a response.
AS: Well, do you believe we should build backdoors for other countries?
MR: My position is — hey look, I think that we’re lying that this isn’t technically feasible. Now, it needs to be done within a framework. I’m the first to acknowledge that. You don’t want the FBI and you don’t want the NSA unilaterally deciding, so, what are we going to access and what are we not going to access? That shouldn’t be for us. I just believe that this is achievable. We’ll have to work our way through it. And I’m the first to acknowledge there are international implications. I think we can work our way through this.
AS: So you do believe then, that we should build those for other countries if they pass laws?
MR: I think we can work our way through this.
AS: I’m sure the Chinese and Russians are going to have the same opinion.
MR: I said I think we can work through this.
AS: Okay, nice to meet you. Thanks.
MR: Thank you for asking the question. I mean, there are going to be some areas where we’re going to have different perspectives. That doesn’t bother me at all. One of the reasons why, quite frankly, I believe in doing things like this is that when I do that, I say, “Look, there are no restrictions on questions. You can ask me anything.” Because we have got to be willing as a nation to have a dialogue. This simplistic characterization of one-side-is-good and one-side-is-bad is a terrible place for us to be as a nation. We have got to come to grips with some really hard, fundamental questions. I’m watching risk and threat do this, while trust has done that. No matter what your view on the issue is, or issues, my only counter would be that that’s a terrible place for us to be as a country. We’ve got to figure out how we’re going to change that.
[Moderator Jim Sciutto]: For the less technologically knowledgeable, which would describe only me in this room today, just so we’re clear: You’re saying it’s your position that in encryption programs, there should be a backdoor to allow, within a legal framework approved by the Congress or some civilian body, the ability to go in a backdoor?
MR: So “backdoor” is not the context I would use. When I hear the phrase “backdoor,” I think, “well, this is kind of shady. Why would you want to go in the backdoor? It would be very public.” Again, my view is: We can create a legal framework for how we do this. It isn’t something we have to hide, per se. You don’t want us unilaterally making that decision, but I think we can do this.
As you read it, you realize that Rogers keeps thinking that if he says "legal framework" enough times, he can pretend he's not really talking about undermining encryption entirely. Well-known cybersecurity guy Bruce Schneier pushed back, pointing out that:
It’s not the legal framework that’s hard, it’s the technical framework. That’s why it’s all or nothing.
“If these are the paths that criminals, foreign actors, terrorist are going to use to communicate, how do we access that?” he asked, citing the need for a “formalized process” to break through encrypted technology.
Rogers pointed toward cooperation between tech companies and law enforcement to combat child pornography. “We have shown in other areas that through both technology, a legal framework, and social compact that we have been able to take on tough issues. I think we can do the same thing here.”
Yes, but that's very different. Even though anyone looking to rip apart important privacy and free speech tools loves to shout "child porn," the examples are not even remotely comparable. And no one's looking to backdoor everything just to get at people passing around child porn. But the larger point stands. Rogers seems to think that there is a magic bullet/golden key that will magically only let the good guys through if only the tech industry is willing to work with him on this.
“You don’t want the FBI and you don’t want the NSA unilaterally deciding what” is permissible, Mr. Rogers said.
Except that presumes that if only the surveillance community and the tech industry got together they could come up with such a safe system, and as everyone else is telling him, that's impossible. And for a guy who is supposed to be running an agency that understands cryptography better than anyone else, that's really troubling.
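The structural problem with any "golden key" scheme is easy to see once you write it down: every message's session key gets wrapped not only for the intended recipient but also under one escrow key, so a single compromise of that key unlocks all traffic from all users. A toy sketch of that structure, using a hash-derived XOR pad as the key-wrapping step; a real system would use something like RSA-OAEP or AES key wrap, but the primitives aren't the point here:

```python
import hashlib
import os

# Toy model of key escrow. Wrapping is XOR with a SHA-256-derived pad,
# purely for illustration; the structure, not the primitives, is what
# matters: one escrow key opens every message ever sent.

def wrap(wrapping_key: bytes, session_key: bytes) -> bytes:
    pad = hashlib.sha256(wrapping_key).digest()[: len(session_key)]
    return bytes(a ^ b for a, b in zip(session_key, pad))

unwrap = wrap  # XOR wrapping is its own inverse

def send_message(recipient_key: bytes, escrow_key: bytes) -> dict:
    session_key = os.urandom(16)
    return {
        "for_recipient": wrap(recipient_key, session_key),
        "for_escrow": wrap(escrow_key, session_key),  # the "golden key" copy
        "session_key": session_key,  # kept only so this demo can check itself
    }

escrow_key = b"G" * 32   # the single government-held key

# Two unrelated users, two unrelated messages...
msg_a = send_message(os.urandom(32), escrow_key)
msg_b = send_message(os.urandom(32), escrow_key)

# ...but whoever holds (or steals, or is another government demanding)
# the one escrow key recovers both session keys.
assert unwrap(escrow_key, msg_a["for_escrow"]) == msg_a["session_key"]
assert unwrap(escrow_key, msg_b["for_escrow"]) == msg_b["session_key"]
```

That single point of failure is exactly what Stamos was driving at: once the escrow slot exists, nothing in the math restricts it to the "good guys," and every government that can compel the vendor gets its own copy.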
On Friday morning, we noted that the CEOs of Google, Facebook and Yahoo had declined to appear at the President's cybersecurity summit at Stanford, but that Apple CEO Tim Cook was going. However, we pointed out that all signs suggested Cook was going to send a message that he wasn't going to give in and allow the government a backdoor to iOS encryption. Cook had recently noted that the government "would have to cart us out in a box" before Apple would add a backdoor. And, indeed, speaking right before President Obama's speech, Cook delivered a strong defense of encryption and privacy:
“We believe deeply that everyone has a right to privacy and security,” said Cook. “So much of our information now is digital: photos, medical information, financial transactions, our most private conversations. It comes with great benefits; it makes our lives better, easier and healthier. But at Apple, we have always known this also comes with a great responsibility. Hackers are doing everything they can to steal your data, so we’re using every tool at our disposal to build the most secure devices that we can.”
“People have trusted us with their most personal and private information and we must give them the best technology we can to secure it,” said Cook. “Sacrificing our right to privacy can have dire consequences. We live in a world where people are not treated equally. There are people who don’t feel free to practice their religion, express their opinion or love who they choose. Technology can mean the difference between life and death.”
“If we don’t do everything we can to protect privacy, we risk more than money,” said Cook. “We risk our way of life.”
It's great to see tech companies taking a stronger and stronger stand in protecting the privacy of their users and customers. Once again, thank Snowden for making this an issue that companies actually need to care about.
Last Friday, at the White House's Cybersecurity Summit at Stanford, reporter Kara Swisher sat down for a half-hour interview with President Obama (and she even dragged her famous red chairs along). It's a better, more in-depth interview than you're ever likely to see from the established mainstream press, and touches on a variety of issues regarding technology and security. While I don't agree with some of the answers, I will say that the President appears to be extremely well-briefed on these issues, and didn't make any totally ridiculous or glaringly misleading remarks. You can see the whole interview here:
In it, he admits that the "Snowden disclosures" (as he calls them) hurt "trust" between DC and the tech industry, and admits that the government has been "a little slow" in updating the laws for how the NSA operates online. However, he does say that surveillance on US persons is very carefully controlled and that he can say "with almost complete certainty that there haven't been abuses on US soil." He admits that's not entirely the case overseas, where there are basically no limits on the NSA's surveillance, and he recognizes that needs to change. Of course, if that's the case, he can do that right now -- because the NSA's authority for all of that is an executive order, 12333, and he could revoke it and write a new one. But he hasn't.
Then he gets to the area I found most interesting and want to focus on, the question of encryption. After discussing how he's looking to update the rules for surveillance and his relationship with tech, the interview proceeds like this:
Obama: There's still some issues like encryption...
Swisher: Let's talk about encryption.
Obama: ... that are challenging, and that's something that's been brought up...
Swisher: What's wrong with what Google and Apple are doing? You have encrypted email.
Swisher: Shouldn't everybody have encrypted email and have their protections?
Obama: Everybody should. And I'm a strong believer in strong encryption. Where the tension has come up, is historically what's happened is that... let's say you knew a particular person was involved in a terrorist plot, and the FBI is trying to figure out who else are they trying to communicate with to prevent the plot. Traditionally, what's been able to happen is they get a court order, the FBI goes to the company, they request those records, the same way they'd go get a court order to request a wiretap. The company technically can comply.
The issue here is, partly in response to consumer demand, partly in response to legitimate concerns about consumer privacy, the technologies may be built to a point where, when the government goes...
Swisher: They can't get the information.
Obama: ... the company says "sorry, we just can't pull it. It's so sealed and tight that even though the government has a legitimate request, technologically we cannot do it."
Swisher: Is what they're doing wrong?
Obama: No. I think they are properly responding to a market demand. All of us are really concerned about making sure our...
Swisher: So what are you going to do?
Obama: Well, what we're going to try to do is see if there's a way for us to narrow this gap. Ultimately, everybody -- and certainly this is true for me and my family -- we all want to know if we're using a smartphone for transactions, sending messages, having private conversations, we don't have a bunch of people compromising that process. There's no scenario in which we don't want really strong encryption.
The narrow question is going to be: if there is a proper request for -- this isn't bulk collection, this isn't fishing expeditions by government -- where there's a situation in which we're trying to get a specific case of a possible national security threat, is there a way of accessing it? If it turns out there's not, then we're really going to have to have a public debate. And, I think some in Silicon Valley would make the argument -- which is a fair argument, and I get -- that the harms done by having any kind of compromised encryption are far greater than...
Swisher: That's an argument you used to make, you would have made. Has something changed?
Obama: No, I still make it. It's just that I'm sympathetic to law enforcement...
Swisher: Why? What happened? Because you were much stronger on...
Obama: No, I'm as strong as I have been. I think the only concern is... our law enforcement is expected to stop every plot. Every attack. Any bomb on a plane. The first time that attack takes place, where it turns out we had a lead and couldn't follow up on it, the public's going to demand answers. This is a public conversation that we should be having. I lean probably further in the direction of strong encryption than some do inside law enforcement. But I am sympathetic to law enforcement, because I know the kind of pressure they're under to keep us safe. And it's not as black and white as it's sometimes portrayed. Now, in fairness, I think those in favor of air tight encryption also want to be protected from terrorists.
Obama: One of the interesting things about being in this job, is that it does give you a bird's eye view. You are smack dab in the middle of these tensions that exist. But, there are times where folks who see this through a civil liberties or privacy lens reject that there's any tradeoffs involved. And, in fact, there are. And you've got to own the fact that it may be that we want to value privacy and civil liberties far more than we do the safety issues. But we can't pretend that there are no tradeoffs whatsoever.
I actually think this is a very good, nuanced answer to this issue. It doesn't descend into hyperbole about child predators and ticking time bombs like law enforcement officials have done. He admits that there are tradeoffs and, at least publicly, seems to be willing to admit that stronger encryption without compromise might be the best solution.
Of course, where we're left with questions is about his requested "public debate." Where and how is that happening? Because, to date, the only noise on this issue coming out of his administration has been on the other side, pushing for new legislation that would require backdoors and compromise encryption. We haven't seen anyone in the administration presenting the other side at all. And, for those of us who strongly believe that a basic cost/benefit analysis of weakening encryption vs. letting law enforcement do their job through traditional detective work would show that the "costs" of weakened encryption vastly outweigh the "threats" of criminals getting away with stuff, it would be nice to see the government at least recognizing that as well.
President Obama chides civil liberties and privacy folks for not getting that there are tradeoffs here, and I don't think that's accurate. Most do recognize the tradeoffs. It's just that they believe the true benefit in terms of "stopping criminals" to weakening encryption is not very great, while the cost to everyone in risking their own privacy is massive. What we have not seen is any indication that law enforcement recognizes that there are tradeoffs, or that they care. Yes, as the President admits, they're weighing some of this against "not getting blamed" when an inevitable "bad event" happens -- but they don't seem to be willing to recognize, at all, the risks to everyone's privacy. That's why they keep talking about golden keys and magic wizards who can make special encryption that only good guys can use.
So I'm glad that the President at least seems to recognize this is a nuanced issue with tradeoffs, but I wish that others in his administration, especially from the law enforcement side, were willing to recognize that as well.
Britain's security services have acknowledged they have the worldwide capability to bypass the growing use of encryption by internet companies by attacking the computers themselves.
The Home Office release of the innocuously sounding "draft equipment interference code of practice" on Friday put into the public domain the rules and safeguards surrounding the use of computer hacking outside the UK by the security services for the first time.
The publication of the draft code follows David Cameron's speech last month in which he pledged to break into encryption and ensure there was no "safe space" for terrorists or serious criminals which could not be monitored online by the security services with a ministerial warrant, effectively spelling out how it might be done.
Encryption works. Properly implemented strong crypto systems are one of the few things that you can rely on. Unfortunately, endpoint security is so terrifically weak that NSA can frequently find ways around it.
The new consultation document from the UK's Home Office seems to confirm that GCHQ can also find ways around it. It is one of two draft "codes of practice" for the main UK law governing surveillance, the Regulation of Investigatory Powers Act 2000 (RIPA). Although it's welcome that more details about the legislative framework are being provided, the way that is being done is problematic, as Carly Nyst, legal director of Privacy International, points out in the Guardian article:
"GCHQ cannot legitimise their unlawful activities simply by publishing codes of conduct with no legislative force. In particular, the use by intelligence agencies of hacking -- an incredibly invasive and intrusive form of surveillance -- cannot be snuck in by the back door through the introduction of a code of conduct that has undergone neither parliamentary nor judicial scrutiny. It is surely no mistake that this code of conduct comes only days before GCHQ is due to argue the lawfulness of its hacking activities in court."
It is also striking that the codes of conduct were released on the same day that the UK's secretive Investigatory Powers Tribunal ruled that British intelligence services had broken the law, but that they were now in compliance because previously unknown policies had been made public. As Nyst speculates, it could be that the UK government is releasing more details of its spying in the form of these consultation documents in an attempt to head off future losses in the courts.
Whether or not that is the case, it certainly seems that the attempts by civil liberties groups to end or at least limit mass surveillance are already having an effect on the UK government, forcing it to provide basic details of its hitherto completely secret activities. That success is a strong incentive to continue fighting for more proportionality and meaningful oversight here.
Yesterday, we reposted Julia Angwin's article from ProPublica about how Werner Koch, the guy behind GPG, a key tool for email encryption, was basically broke, and how attempts to crowdfund money to keep the project going hadn't been all that successful. The story seemed to resonate with lots of people, and the donations started flowing. After getting a grand total of just about €34,000 in 2014, he's already well over €100,000 this year, with most of that coming yesterday after Angwin's story went up. On top of that, Stripe and Facebook each agreed to fund him to the tune of $50,000 per year (from each of them, so $100k total), and the Linux Foundation has agreed to give him $60k (though Koch admits that deal was actually signed last week).
Either way, this is great to see, though it's unfortunate that it had to wait until an article detailing his plight came out. We've seen this sort of thing a few times now, such as when the Heartbleed bug made everyone realize that OpenSSL was basically supported by volunteers with almost no budget at all. Thankfully, the attention there got the project the funds necessary to continue keeping us safe.
It really is quite incredible when you realize how much of the internet you rely on is built by people as a true labor of love. Often, people have no idea there is even an opportunity to support those projects, and it's great that Angwin was able to highlight this one and get it the funding it needs to keep moving forward.
The man who built the free email encryption software used by whistleblower Edward Snowden, as well as hundreds of thousands of journalists, dissidents and security-minded people around the world, is running out of money to keep his project alive.
Werner Koch wrote the software, known as Gnu Privacy Guard, in 1997, and since then has been almost single-handedly keeping it alive with patches and updates from his home in Erkrath, Germany. Now 53, he is running out of money and patience with being underfunded.
"I'm too idealistic," he told me in an interview at a hacker convention in Germany in December. "In early 2013 I was really about to give it all up and take a straight job." But then the Snowden news broke, and "I realized this was not the time to cancel."
Like many people who build security software, Koch believes that offering the underlying software code for free is the best way to demonstrate that there are no hidden backdoors in it giving access to spy agencies or others. However, this means that many important computer security tools are built and maintained by volunteers.
Now, more than a year after Snowden's revelations, Koch is still struggling to raise enough money to pay himself and to fulfill his dream of hiring a full-time programmer. He says he's made about $25,000 per year since 2001 — a fraction of what he could earn in private industry. In December, he launched a fundraising campaign that has garnered about $43,000 to date — far short of his goal of $137,000 — which would allow him to pay himself a decent salary and hire a full-time developer.
The fact that so much of the Internet's security software is underfunded is becoming increasingly problematic. Last year, in the wake of the Heartbleed bug, I wrote that while the U.S. spends more than $50 billion per year on spying and intelligence, pennies go to Internet security. The bug revealed that an encryption program used by everybody from Amazon to Twitter was maintained by just four programmers, only one of whom called it his full-time job. A group of tech companies stepped in to fund it.
Koch's code powers most of the popular email encryption programs GPGTools, Enigmail, and GPG4Win. "If there is one nightmare that we fear, then it's the fact that Werner Koch is no longer available," said Enigmail developer Nicolai Josuttis. "It's a shame that he is alone and that he has such a bad financial situation."
The programs are also underfunded. Enigmail is maintained by two developers in their spare time. Both have other full-time jobs. Enigmail's lead developer, Patrick Brunschwig, told me that Enigmail receives about $1,000 a year in donations — just enough to keep the website online.
GPGTools, which allows users to encrypt email from Apple Mail, announced in October that it would start charging users a small fee. The other popular program, GPG4Win, is run by Koch himself.
Email encryption first became available to the public in 1991, when Phil Zimmermann released a free program called Pretty Good Privacy, or PGP, on the Internet. Prior to that, powerful computer-enabled encryption was only available to the government and large companies that could pay licensing fees. The U.S. government subsequently investigated Zimmermann for violating arms trafficking laws because high-powered encryption was subject to export restrictions.
In 1997, Koch attended a talk by free software evangelist Richard Stallman, who was visiting Germany. Stallman urged the crowd to write their own version of PGP. "We can't export it, but if you write it, we can import it," he said.
Inspired, Koch decided to try. "I figured I can do it," he recalled. He had some time between consulting projects. Within a few months, he released an initial version of the software he called Gnu Privacy Guard, a play on PGP and an homage to Stallman's free Gnu operating system.
Koch's software was a hit even though it only ran on the Unix operating system. It was free, the underlying software code was open for developers to inspect and improve, and it wasn't subject to U.S. export restrictions.
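To give a flavor of the tool Koch built, here is a minimal sketch of encrypting and decrypting a file with GnuPG from the command line. It uses symmetric (passphrase) mode purely for brevity; real email encryption with GPG relies on public/private key pairs. The flags shown assume a modern GnuPG 2.x installation, where batch scripts need `--pinentry-mode loopback` to supply a passphrase non-interactively:

```shell
# Minimal GnuPG sketch (assumes GnuPG 2.x is installed as `gpg`).
# Symmetric (passphrase) mode is used here for brevity; email
# encryption with GPG normally uses public/private key pairs.

echo 'a secret message' > note.txt

# Encrypt: -c selects symmetric mode; --pinentry-mode loopback
# allows passing the passphrase non-interactively in a script.
gpg --batch --yes --pinentry-mode loopback --passphrase 'demo-pass' \
    -c -o note.txt.gpg note.txt

# Decrypt with the same passphrase; the plaintext goes to stdout.
gpg --batch --quiet --pinentry-mode loopback --passphrase 'demo-pass' \
    -d note.txt.gpg
```

The same `gpg` binary also handles the key generation, signing, and verification that email front-ends like Enigmail drive under the hood.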
Koch continued to work on GPG in between consulting projects until 1999, when the German government gave him a grant to make GPG compatible with the Microsoft Windows operating system. The money allowed him to hire a programmer to maintain the software while also building the Windows version, which became GPG4Win. This remains the primary free encryption program for Windows machines.
In 2005, Koch won another contract from the German government to support the development of another email encryption method. But in 2010, the funding ran out.
For almost two years, Koch continued to pay his programmer in the hope that he could find more funding. "But nothing came," Koch recalled. So, in August 2012, he had to let the programmer go. By summer 2013, Koch was himself ready to quit.
But after the Snowden news broke, Koch decided to launch a fundraising campaign. He set up an appeal at a crowdsourcing website, made t-shirts and stickers to give to donors, and advertised it on his website. In the end, he earned just $21,000.
The campaign gave Koch, who has an 8-year-old daughter and a wife who isn't working, some breathing room. But when I asked him what he will do when the current batch of money runs out, he shrugged and said he prefers not to think about it. "I'm very glad that there is money for the next three months," Koch said. "Really I am better at programming than this business stuff."