from the might-want-to-stay-off-the-internet-for-a-bit dept
Last night, while the mainstream press was yammering on about the security implications of Microsoft ending support for Windows XP (it's already vulnerable; this won't really change anything), a much bigger issue had security folks worried. A massive vulnerability in OpenSSL, dubbed Heartbleed, was revealed. As Matt Blaze notes, the bug actually leaks data beyond what it's protecting, which makes it worse than no crypto at all. The vulnerability likely impacts a huge number of servers -- including Yahoo's (many other major sites, including Google, Facebook, Twitter, Dropbox and Microsoft, are apparently not impacted). Oh, and the vulnerability has been there for years. Over at the Tor Project, they made the most succinct statement of how serious this is:
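For those wondering what the bug actually looks like, here's a simplified sketch (in Python, not OpenSSL's actual C code) of the core mistake: the heartbeat handler trusted the length claimed in the request instead of checking it against the payload actually received.

```python
# A simplified sketch of the Heartbleed flaw -- not OpenSSL's actual
# code, just the shape of the bug. Server memory holds the heartbeat
# payload, with other sensitive data sitting right next to it.
MEMORY = bytearray(b"PING" + b"secret-session-key-material")
PAYLOAD_LEN = 4  # the client really only sent b"PING"

def heartbeat_vulnerable(claimed_len: int) -> bytes:
    # The bug: echo back however many bytes the request *claimed*
    # to contain, reading straight past the real payload.
    return bytes(MEMORY[:claimed_len])

def heartbeat_fixed(claimed_len: int) -> bytes:
    # The fix: reject heartbeats whose claimed length exceeds the
    # payload actually received.
    if claimed_len > PAYLOAD_LEN:
        raise ValueError("heartbeat length > actual payload; dropping")
    return bytes(MEMORY[:claimed_len])

leaked = heartbeat_vulnerable(len(MEMORY))  # attacker over-asks
assert b"secret-session-key" in leaked      # adjacent memory leaks out
```

Because the over-read pulls whatever happens to sit next to the payload in memory, repeated requests can slowly dredge up private keys, session cookies and passwords -- which is exactly why this is worse than no crypto at all.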
If you need strong anonymity or privacy on the Internet, you might want to stay away from the Internet entirely for the next few days while things settle.
Of course, that also means that if you needed strong anonymity or privacy on the internet, there's a good chance some of the services you use left you vulnerable for quite some time until now.
Among the many problems with President Obama's weak statement concerning NSA surveillance was the fact that he didn't even address the serious issue of the NSA undermining cryptography with backdoors. The White House's task force had included a recommendation to end this practice, and the President appeared to ignore it entirely. Now, a large group of US computer security and cryptography researchers have sent a strongly worded open letter to the President condemning these efforts (and his failure to stop the program).
Indiscriminate collection, storage, and processing of unprecedented amounts of personal information chill free speech and invite many types of abuse, ranging from mission creep to identity theft. These are not hypothetical problems; they have occurred many times in the past. Inserting backdoors, sabotaging standards, and tapping commercial data-center links provide bad actors, foreign and domestic, opportunities to exploit the resulting vulnerabilities.
The value of society-wide surveillance in preventing terrorism is unclear, but the threat that such surveillance poses to privacy, democracy, and the US technology sector is readily apparent. Because transparency and public consent are at the core of our democracy, we call upon the US government to subject all mass-surveillance activities to public scrutiny and to resist the deployment of mass-surveillance programs in advance of sound technical and social controls. In finding a way forward, the five principles promulgated at http://reformgovernmentsurveillance.com/ provide a good starting point.
The choice is not whether to allow the NSA to spy. The choice is between a communications infrastructure that is vulnerable to attack at its core and one that, by default, is intrinsically secure for its users. Every country, including our own, must give intelligence and law-enforcement authorities the means to pursue terrorists and criminals, but we can do so without fundamentally undermining the security that enables commerce, entertainment, personal communication, and other aspects of 21st-century life. We urge the US government to reject society-wide surveillance and the subversion of security technology, to adopt state-of-the-art, privacy-preserving technology, and to ensure that new policies, guided by enunciated principles, support human rights, trustworthy commerce, and technical innovation.
That ReformGovernmentSurveillance.com site is the one launched by a bunch of the biggest internet companies, so it's good to see these researchers and technologists lining up behind that effort as well.
One of the things that's been glaring about all of the investigations and panels and research into these programs is that they almost always leave out actual technologists, and especially leave out security experts. That seems like a big weakness, and now those security researchers are speaking out anyway. At some point, the politicians backing these programs are going to have to realize that almost no one who actually understands this stuff thinks what they're doing is the right way to go about this.
Alan Turing is a name you're required to know if you have any interest in computers, cryptology or artificial intelligence. The famed "Turing Test" is still used as one way to test the functionality of artificial intelligence, he's considered the father of modern-day computing, and his work in decrypting the Nazis' Enigma code quite possibly shortened the war by years, saving who knows how many lives from an even further prolonged conflict. The word hero gets tossed around a lot these days, too often utilized to describe athletes and entertainers when it should probably be reserved for people like Turing. He was an amazing person, smart as hell, and dedicated to a craft that unarguably moved humanity forward and simultaneously saved lives.
And, in 1952, he was convicted of being a homosexual and sentenced to chemical castration by hormone injection, leading to his suicide two years later, in 1954. And, though it sadly took sixty years, the Queen has officially pardoned Turing for his non-crime.
Announcing the pardon, Grayling said: "Dr. Alan Turing was an exceptional man with a brilliant mind. His brilliance was put into practice at Bletchley Park during the second world war, where he was pivotal to breaking the Enigma code, helping to end the war and save thousands of lives.
"His later life was overshadowed by his conviction for homosexual activity, a sentence we would now consider unjust and discriminatory and which has now been repealed. Dr. Turing deserves to be remembered and recognised for his fantastic contribution to the war effort and his legacy to science. A pardon from the Queen is a fitting tribute to an exceptional man."
It is undoubtedly a good thing that Turing has been pardoned, though the need for such a pardon should never have arisen. For a government to have chemically castrated one of its very best was a crime for which I issue no pardon of my own. And that's important, because the very same queen that was queening over the UK when Turing was convicted, sentenced, and killed himself was the same queen that queeningly issued this pardon. And, amazingly, it took Elizabeth the Second four years to do so after then-UK Prime Minister Gordon Brown issued an "unequivocal apology" to Turing and his family. There is a very firm lesson here for all of us in how we treat one another, even those who are different from us.
Writer David Leavitt, professor of English at the University of Florida and author of The Man Who Knew Too Much: Alan Turing and the Invention of the Computer (2006), said it was "great news". The conviction had had "a profound and devastating" effect on Turing, Leavitt said, as the mathematician felt he was being "followed and hounded" by the police "because he was considered a security risk".
"There was this paranoid idea in 1950s England of the homosexual traitor, that he would be seduced by a Russian agent and go over to the other side," Leavitt said. "It was such a misjudgment of Alan Turing because he was so honest, and was so patriotic."
More importantly, it was a misjudgment of Alan Turing as a human being. To use our fear and dislike of those that are different from us to completely negate the possible benefits our fellow human Alan Turing could have brought us had he not been so abused shows the very worst in all of us. So, while it may feel warm and fuzzy that Turing has been officially pardoned, I'd suggest we all keep our eye on the ball this holiday season and make an effort to judge each other not on old and antiquated biases, but on our character and actions. Had humanity done so, who knows where Alan Turing could have brought us? And I would hope that any fear some of us might have of those that are different from us would be outweighed by the fear of losing the contributions of those very same people.
There's a reason why patent trolls love east Texas -- and a big part of that is that the juries there have a long history of favoring patent holders, no matter how ridiculous or how trollish. That was on display last night, when the jury in Marshall, Texas sided with patent troll Erich Spangenberg and his TQP shell company over Newegg. As we've been describing, Newegg brought out the big guns to prove pretty damn thoroughly that this guy Mike Jones and his encryption patent were both not new at the time the patent was granted and, more importantly, totally unrelated to the encryption that Newegg and other ecommerce providers rely on. Having Whit Diffie (who invented public key cryptography) and Ron Rivest (who basically made it practical in real life) present on your behalf, showing that they did everything prior to Jones' patent, while further showing that what Newegg was doing relied on their work, not Jones', should have ended the case.
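For anyone curious what Diffie's invention actually does, here's a toy sketch of a Diffie-Hellman key exchange, the public-key technique at the root of the encryption ecommerce sites rely on. The tiny numbers below are purely for illustration; real deployments use enormous primes.

```python
# Toy Diffie-Hellman exchange. Two parties agree on a shared secret
# over a public channel without ever transmitting the secret itself.
p, g = 23, 5                      # public prime and generator (toy values)

alice_secret, bob_secret = 6, 15  # private values, never sent anywhere

A = pow(g, alice_secret, p)       # Alice publishes A = g^a mod p
B = pow(g, bob_secret, p)         # Bob publishes B = g^b mod p

# Each side combines the other's public value with its own secret,
# arriving at the same number: g^(a*b) mod p.
shared_alice = pow(B, alice_secret, p)
shared_bob   = pow(A, bob_secret, p)
assert shared_alice == shared_bob
```

An eavesdropper sees only p, g, A and B; recovering the shared secret from those requires solving the discrete logarithm problem, which is what makes the scheme practical for securing a connection with a total stranger.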
But, apparently, TQP's lawyers' technique of attacking Diffie's credibility somehow worked. The jury said both that the patent was valid and that Newegg infringed -- and it awarded TQP $2.3 million, a little less than half of what TQP wanted, but still a lot more than TQP settled for with many other companies (including those with much bigger ecommerce operations than Newegg). In other words, yet another travesty of justice from a jury in east Texas. Newegg will appeal, as it did in its last big patent troll lawsuit, a much bigger case against Soverain Software. There too, Newegg had lost in east Texas but prevailed big time on appeal. Hopefully history repeats itself.
Joe Mullin's coverage (linked above) has a bunch of little tidbits about how everyone responded to the verdict, but I think Diffie's response is the most honest. Asked how he was feeling:
"Distressed," he said. "I was hoping to be rid of this business."
Yeah, he's not the only one. The sheer ridiculousness of a jury simply not believing the very people who created the very building blocks of modern encryption, and instead buying the story of someone who did nothing special either with the concept behind his patent or with that patent once it existed, is just distressing. It shows how arbitrary jury trials can be, especially when you have jurors who simply don't understand the technology or the history at play. Blech. I think I may have to go buy an anti-patent troll t-shirt from Newegg.
The review, announced late Friday afternoon by the National Institute of Standards and Technology, will also include an assessment of how the institute creates encryption standards.
The institute sets national standards for everything from laboratory safety to high-precision timekeeping. NIST's cryptographic standards are used by software developers around the world to protect confidential data. They are crucial ingredients for privacy on the Internet, and are designed to keep Internet users safe from being eavesdropped on when they make purchases online, pay bills or visit secure websites.
But as the investigation by ProPublica, The Guardian and The New York Times in September revealed, the National Security Agency spends $250 million a year on a project called "SIGINT Enabling" to secretly undermine encryption. One of the key goals, documents said, was to use the agency's influence to weaken the encryption standards that NIST and other standards bodies publish.
"Trust is crucial to the adoption of strong cryptographic algorithms," the institute said in a statement on its website. "We will be reviewing our existing body of cryptographic work, looking at both our documented process and the specific procedures used to develop each of these standards and guidelines."
The NSA is no stranger to NIST's standards-development process. Under current law, the institute is required to consult with the NSA when drafting standards. NIST also relies on the NSA for help with public standards because the institute doesn't have as many cryptographers as the agency, which is reported to be the largest employer of mathematicians in the country.
"Unlike NSA, NIST doesn't have a huge cryptography staff," said Thomas Ptacek, the founder of Matasano Security. "NIST is not the direct author of many of its most important standards."
Matthew Scholl, the deputy chief at the Computer Security Division of the institute, echoed that statement, "As NIST Director Pat Gallagher has said in several public settings, NIST is designed to collaborate and the NSA has some of the world's best minds in cryptography." He continued, "We also have parallel missions to protect federal IT systems, so we will continue to work with the NSA."
Some of these standards are products of public competitions among academic cryptography researchers, while others are the result of NSA recommendations. An important standard, known as SHA2, was designed by the NSA and is still trusted by independent cryptographers and software developers worldwide.
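As a quick illustration of just how ubiquitous these NIST standards are, both SHA-2 and the newer SHA-3 ship in Python's standard library (the constructor names below are Python's, not NIST terminology):

```python
import hashlib

# SHA-2 (here the 256-bit variant, SHA-256) is the NSA-designed,
# NIST-published standard that software developers worldwide rely on.
digest = hashlib.sha256(b"hello").hexdigest()
assert len(digest) == 64  # 256 bits = 64 hex characters

# SHA-3, the competition winner discussed later in this piece,
# is available as sha3_256 in Python 3.6+.
digest3 = hashlib.sha3_256(b"hello").hexdigest()
assert len(digest3) == 64
assert digest != digest3  # different algorithms, different outputs
```

The point is that these algorithms aren't exotic government tooling; they're baked into the default toolkits developers reach for every day, which is why trust in the standards process matters so much.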
NIST withdrew one cryptographic standard, called Dual EC DRBG, after documents provided to news organizations by the former intelligence contractor Edward Snowden raised the possibility that the standard had been covertly weakened by the NSA.
Soon after, a leading cryptography company, RSA, told software writers to stop using the algorithm in a product it sells. The company promised to remove the algorithm in future releases.
Many cryptographers have expressed doubt about NIST standards since the initial revelations were published. One popular encryption library changed its webpage to boast that it did not include NIST-standard cryptography. Silent Circle, a company that makes encryption apps for smartphones, promised to replace the encryption routines in its products with algorithms not published by NIST.
If the NIST review prompts significant changes to existing encryption standards, consumers will not see the benefit immediately. "If the recommendations change, lots of code will need to change," said Tanja Lange, a cryptographer at Eindhoven University of Technology in the Netherlands. "I think that implementers will embrace such a new challenge, but I can also imagine that vendors will be reluctant to invest the extra time."
In Friday's announcement, NIST pointed to its long history of creating standards, including the role it had in creating the first national encryption standard in the 1970s — the Data Encryption Standard, known as DES. "NIST has a proud history in open cryptographic standards, beginning in the 1970s with the Data Encryption Standard," the bulletin said. But even that early standard was influenced by the NSA.
During the development of DES, the agency insisted that the algorithm use weaker keys than originally intended -- keys more susceptible to being broken by supercomputers. At the time, Whitfield Diffie, a digital cryptography pioneer, raised serious concerns about the keys. "The standard will have to be replaced in as few as five years," he wrote.
The weakened keys in the standard were not changed. DES was formally withdrawn by the institute in 2005.
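Some back-of-envelope arithmetic shows why the 56-bit key worried Diffie. The billion-keys-per-second attacker below is an assumed rate chosen purely for illustration:

```python
# Why a 56-bit key was a concern: the whole key space can plausibly
# be searched by a well-funded attacker.
des_keys = 2 ** 56       # DES effective key space
aes128_keys = 2 ** 128   # modern minimum key size, for comparison

# Hypothetical attacker trying one billion keys per second:
seconds_to_exhaust_des = des_keys / 1e9
years_to_exhaust_des = seconds_to_exhaust_des / (60 * 60 * 24 * 365)

assert des_keys == 72_057_594_037_927_936
assert years_to_exhaust_des < 3          # roughly 2.3 years at 10^9 keys/sec
assert aes128_keys // des_keys == 2 ** 72  # astronomically larger space
```

And that's for one machine at a guessed rate; dedicated hardware does far better, which is exactly what happened -- the EFF's purpose-built DES cracker broke a key in days in 1998, long before the formal 2005 withdrawal.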
The announcement is the latest effort by NIST to restore the confidence of cryptographers. A representative from NIST announced on a public mailing list, also on Friday, that the institute would restore the original version of a new encryption standard, known as SHA3, that had won a recent design competition but was altered by the institute after the competition ended. Cryptographers charged that NIST's changes to the algorithm had weakened it.
The SHA3 announcement referred directly to cryptographers' concerns. "We were and are comfortable with that version on technical grounds, but the feedback we've gotten indicates that a lot of the crypto community is not comfortable with it," wrote John Kelsey, NIST's representative. There is no evidence the NSA was involved in the decision to change the algorithm.
The reversal took Matthew Green, a cryptographer at Johns Hopkins University, by surprise. "NIST backed down! I'm not sure they would have done that a year ago," he said.
The full details here aren't clear, but it looks like another "secure" service based in the US has felt the need to shut down over fears about US surveillance efforts compromising actual security. VPN provider CryptoSeal has announced that it's shuttered the service (via Hacker News):
CryptoSeal Privacy Consumer VPN service terminated with immediate effect
With immediate effect as of this notice, CryptoSeal Privacy, our consumer VPN service, is terminated. All cryptographic keys used in the operation of the service have been zerofilled, and while no logs were produced (by design) during operation of the service, all records created incidental to the operation of the service have been deleted to the best of our ability.
Essentially, the service was created and operated under a certain understanding of current US law, and that understanding may not currently be valid. As we are a US company and comply fully with US law, but wish to protect the privacy of our users, it is impossible for us to continue offering the CryptoSeal Privacy consumer VPN product.
Specifically, the Lavabit case, with filings released by Kevin Poulsen of Wired.com (https://www.documentcloud.org/documents/801182-redacted-pleadings-exhibits-1-23.html) reveals a Government theory that if a pen register order is made on a provider, and the provider's systems do not readily facilitate full monitoring of pen register information and delivery to the Government in realtime, the Government can compel production of cryptographic keys via a warrant to support a government-provided pen trap device. Our system does not support recording any of the information commonly requested in a pen register order, and it would be technically infeasible for us to add this in a prompt manner. The consequence, being forced to turn over cryptographic keys to our entire system on the strength of a pen register order, is unreasonable in our opinion, and likely unconstitutional, but until this matter is settled, we are unable to proceed with our service.
We encourage anyone interested in this issue to support Ladar Levison and Lavabit in their ongoing legal battle. Donations can be made at https://rally.org/lavabit We believe Lavabit is an excellent test case for this issue.
We are actively investigating alternative technical ways to provide a consumer privacy VPN service in the future, in compliance with the law (even the Government's current interpretation of pen register orders and compelled key disclosure) without compromising user privacy, but do not have an estimated release date at this time.
To our affected users: we are sincerely sorry for any inconvenience. For any users with positive account balances at the time of this action, we will provide 1 year subscriptions to a non-US VPN service of mutual selection, as well as a refund of your service balance, and free service for 1 year if/when we relaunch a consumer privacy VPN service. Thank you for your support, and we hope this will ease the inconvenience of our service terminating.
For anyone operating a VPN, mail, or other communications provider in the US, we believe it would be prudent to evaluate whether a pen register order could be used to compel you to divulge SSL keys protecting message contents, and if so, to take appropriate action.
From this, it doesn't sound like the company had actually been approached by the feds yet; rather, it's shutting down proactively, highlighting the chilling effects of the US government's overreach into online security services.
PandoDaily has a fascinating story from Peiter Zatko, who had known Julian Assange way back in the day and last ran into him at Chaos Congress in 2009 (Wikileaks was going strong at that point, but it was well before the release of the "Collateral Murder" video, the accusations of rape, the Ecuadorian embassy asylum and all that). They caught up over dinner, and Assange revealed to Zatko how the US government had funded, and then shut down, his research into a cryptographic file system that could be decrypted in a way that revealed only innocuous files, so you could be "forced" to decrypt something without actually revealing anything. Of course, as we pointed out not that long ago, the NSA has a bit of a history of trying to stifle crypto research if it fears that the crypto might be too good, and it sounds like Assange was a victim of that, despite not being American nor working at an American university:
Julian told me his graduate work had been funded by a US government grant, specifically NSA and DARPA money, which was supposed to be used for fundamental security research. It was a time when the Bush Administration and Department of Defense were seen to be classifying a great deal of fundamental research and pulling back on university funds. These universities were getting the message that they could no longer work on the research they had been conducting, and what they had already done was classified. In a Joseph Heller-like twist, they weren’t even allowed to know what it was they had already discovered.
According to Julian, the US government cast such a wide net that even general scientific research, whose output had always been published openly, was swept up in America’s secrecy nets. As you can imagine this did not sit well with Julian, because his work had also been funded by one of these fundamental research funding lines and yanked.
Zatko notes that this experience is a big part of what drove Assange to dedicate his life to openness and helping to expose organizations that tried to keep the public ignorant of important things. While the crux of the article is that the US government inadvertently "created" Wikileaks, that seems like a bit of an exaggeration. However, it is somewhat interesting to know that Assange's work was, at one point, funded by the US government.
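To get a feel for the "deniable decryption" idea, here's a toy sketch; Assange's actual design was far more sophisticated, and this just illustrates the principle. With a one-time-pad-style XOR cipher, a single ciphertext can be opened with different keys to reveal different plaintexts, so a coerced user can hand over a decoy key:

```python
# Toy illustration of deniable decryption: one ciphertext, two keys,
# two different plaintexts. NOT a real cryptosystem -- just the idea.
import secrets

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

real_plaintext  = b"the real secret "
decoy_plaintext = b"grocery list: .."  # same length, innocuous

real_key = secrets.token_bytes(len(real_plaintext))
ciphertext = xor_bytes(real_plaintext, real_key)

# The decoy key is simply whatever maps the ciphertext to the decoy
# plaintext -- with XOR, such a key always exists.
decoy_key = xor_bytes(ciphertext, decoy_plaintext)

assert xor_bytes(ciphertext, real_key) == real_plaintext
assert xor_bytes(ciphertext, decoy_key) == decoy_plaintext
```

The party applying pressure has no way to tell, from the ciphertext alone, whether the key they've been handed is the real one or a decoy, which is exactly the property that makes coerced decryption useless.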
Last week, cryptographer and research professor Matthew Green published a great blog post providing some fantastic details about the NSA's likely attack vectors for compromising encryption schemes. It's a well-written and detailed piece from someone who clearly knows what he's talking about. Oh, and it kicks off with an amusing story about how the reporters working on the "NSA builds backdoors into encryption" story had contacted him for comment and, because they didn't reveal too many details, he was concerned about coming off as too paranoid or too much of a "crank." However, after the details came out, he realized he "wasn't cranky enough."
Oddness aside it was a fun (if brief) set of conversations, mostly involving hypotheticals. If the NSA could do this, how might they do it? What would the impact be? I admit that at this point one of my biggest concerns was to avoid coming off like a crank. After all, if I got quoted sounding too much like an NSA conspiracy nut, my colleagues would laugh at me. Then I might not get invited to the cool security parties.
All of this is a long way of saying that I was totally unprepared for today's bombshell revelations describing the NSA's efforts to defeat encryption. Not only does the worst possible hypothetical I discussed appear to be true, but it's true on a scale I couldn't even imagine. I'm no longer the crank. I wasn't even close to cranky enough.
He then goes on to explain where the most probable attacks are coming from, what we should be most worried about, and what's likely still safe. I had hoped to write up something about the post in general, but today something new came up. Green noted that the dean at Johns Hopkins, where he teaches, had asked him to remove the blog post from the university's servers. The post had been cross-posted both to a blog on the university's servers and to Green's personal blog on Blogger. The personal blog post is still up (and now about to get that much more attention because of the takedown). He also notes that this "isn't my Dean's fault," though plenty of folks are curious whose fault it might be. For what it's worth, it appears that Hopkins has a close relationship with the NSA, and the school really isn't that far from the NSA's headquarters.
Either way, for a whole variety of reasons, demanding the blog post be taken down seems fairly pointless. Not only will it draw much more attention to the original post, it now creates additional scrutiny towards Johns Hopkins as to why it's stifling the speech of one of its professors on a key topic of public interest.