One of the most bizarre points that became clear in yesterday's Senate hearings on encryption was that many Senators are so focused on the big bad threat of theoretical ISIS violence in the US, that they don't understand the very real (and not at all theoretical) threat of our personal data that is being hacked into and exposed on a regular basis, often due to a lack of encryption. The ACLU's Chris Soghoian summed it up nicely with the following tweet:
Congress: OPM should have encrypted federal employee data.
Congress: Apple has blood on its hands for encrypting user data.
Indeed, there has been plenty of talk, including from Congress, over the fact that the Office of Personnel Management, whose computers were hacked to reveal all sorts of information on government employees (past and present), didn't use encryption, in part because their computers were too old. To be fair, there are indications that encryption might not have mattered that much, since the hackers allegedly got working credentials to access the system, and thus may have been able to decrypt anything anyway.
However, it does seem quite telling that at the same time Congress is freaking out about the supposed evils of encryption, the National Institute of Standards and Technology (NIST) is trying to design a better system for encrypting emails via end-to-end encryption -- the very thing that the FBI and some Senators have been complaining about.
The National Institute of Standards and Technology is designing a “security platform” to authenticate mail servers using cryptographic keys. The platform would let individual users encrypt emails.
The system aims to “provide Internet users confidence that entities to which they believe they are connecting are the entities to which they are actually connecting," according to a NIST draft report on the topic. A subpar system, the draft said, could result in "unauthorized parties being able to read or modify supposedly secure information, or to use email as a vector for inserting malware into the system," among other consequences. The draft report is open for comment until Aug. 14, 2015.
NIST soon plans to issue Federal Register notices to vendors developing individual parts of the end-to-end system.
In other words, as clueless Senators and FBI officials demand ways to undermine end-to-end encryption, the folks who actually understand technology (NIST) are asking for stronger end-to-end encryption. Perhaps, instead of letting FBI director James Comey prattle on about how he doesn't actually understand this stuff (as he said repeatedly), the Senators could have someone from NIST explain why end-to-end encryption is so important.
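The NIST draft's stated goal -- giving users "confidence that entities to which they believe they are connecting are the entities to which they are actually connecting" -- is, at bottom, server authentication via cryptographic keys. One familiar form of that idea is certificate pinning, sketched below in Python; the certificate bytes and fingerprint are made up for illustration, and NIST's actual platform design is considerably more involved:

```python
import base64
import hashlib
import ssl

# Toy stand-in for a server's X.509 certificate (a real deployment would
# pin the DER bytes of an actual certificate, obtained out of band).
der = b"not-a-real-certificate"
body = base64.encodebytes(der).decode()
pem = f"-----BEGIN CERTIFICATE-----\n{body}-----END CERTIFICATE-----\n"

# The client stores a known-good fingerprint ahead of time...
pinned = hashlib.sha256(der).hexdigest()

# ...and at connection time hashes whatever certificate the server
# presents, refusing to proceed unless the two match exactly.
presented = hashlib.sha256(ssl.PEM_cert_to_DER_cert(pem)).hexdigest()
assert presented == pinned  # only then is the connection trusted
```

The point of any such scheme is the same one the NIST draft makes: without a cryptographic check, a user has no way to know whether the "mail server" on the other end is the real one or an impostor.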
As we've been saying, the passage of the USA Freedom Act is just a small first step on the long road to real surveillance reform. On Wednesday, the House took another small step, voting overwhelmingly in favor of an amendment to an appropriations bill, put forth by Rep. Thomas Massie, that blocks funding to the National Institute of Standards and Technology (NIST) for working with the NSA or CIA to undermine or backdoor encryption. This appears to be quite similar to part of last year's amendment, which banned both this kind of NIST coordination and the NSA's use of backdoor searches under Section 702. As far as I can tell, this new amendment does not include that latter bit. Either way, the amendment passed 383 to 43.
It appears that another amendment, put forth by Rep. Ted Poe, also passed by voice vote. It would block DOJ/FBI funds from being used "to mandate or request that a person alter the product or service of the person to permit electronic surveillance of any user or service" except in cases required under existing wiretapping law.
Both of these are very big deals, and the fact that they passed so easily suggests that the House is nowhere near done on pushing for real surveillance reform. Of course, whether or not these actually go anywhere is another story. As you may recall, after passing overwhelmingly last year, under pressure to get a big omnibus bill done at the end of the year, the House leadership agreed to drop those provisions under pressure from the intelligence community.
One other interesting amendment, put forth by Rep. Jared Polis, also appears to have passed easily by voice vote. It would make clear that the DEA cannot do bulk collection under its subpoena authority. As was detailed a few weeks ago, for many years the DEA had been using this authority to collect tons of phone records, and the program only ended once the administration realized that the claims it was using in support of the NSA's bulk collection didn't apply to the DEA's collection, and thus it couldn't really continue. Polis's amendment means that this particular loophole is closed for good (not that others might still be open...).
Again, all three of these may not survive all the way into law, but it does show that there's still a very strong interest in the House to continue pushing back against surveillance abuse.
It's no secret that some in the law enforcement and intelligence communities are hell bent on stopping encryption from being widely deployed to protect your data. They've made it 100% clear that they want backdoors into any encryption scheme. But when actual security folks press government officials on how they're going to do this without undermining people's own security and privacy, we get a lot of bureaucratic gobbledygook in response. Either that or magical fairy thinking about golden keys that basically any security expert will tell you are impossible without weakening security.
Not surprisingly, the law enforcement and intelligence communities are not giving up yet. The latest is that the White House appears to be floating a proposal to set up a backdoor to encryption that requires multi-party keys. That is, rather than just having a single key that can decrypt the content, it would require multiple parties with "pieces" of the "key" to come together to unlock it:
Recently, the head of the National Security Agency provided a rare hint of what some U.S. officials think might be a technical solution. Why not, said Adm. Michael S. Rogers, require technology companies to create a digital key that could open any smartphone or other locked device to obtain text messages or photos, but divide the key into pieces so that no one person or agency alone could decide to use it?
“I don’t want a back door,” said Rogers, the director of the nation’s top electronic spy agency during a speech at Princeton University, using a tech industry term for covert measures to bypass device security. “I want a front door. And I want the front door to have multiple locks. Big locks.”
Of course, this proposal is nothing new. As Declan McCullagh points out, during the first "Crypto Wars" of the 1990s, the NSA proposed the same sort of thing with two parties holding parts of the escrow key. It was a dumb idea then and it's a dumb idea now.
The idea being floated here is that by setting up such a system, it's less open to abuse by government/law enforcement/intelligence communities. And maybe that's true. It makes it marginally less likely to be abused by the government. But it can still be abused quite a bit. It's not like we haven't seen multiple government agencies team up to do nefarious things in the past, or even federal officials and private companies. Hell, just look at the recent discussions about the DEA's phone records surveillance program, where the DEA later teamed up with the NSA. And, also, that program required the more or less voluntary cooperation of telcos. So the idea that the requirement of multiple parties somehow lessens the risk seems like a stretch.
But, even if it actually did reduce the risk of direct abuse, it doesn't get anywhere near the real problem with this approach. If you're building in a back door, you're building in a vulnerability that others will eventually be able to exploit. You are flat out weakening the system -- whether or not you split up the key. You're still exposing the data to those with nefarious intent by weakening the overall system.
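For a concrete sense of what "splitting the key" does and doesn't buy, here is a minimal sketch (an assumed illustration, not any actual government proposal) of an XOR-based secret split. No single share reveals anything about the key, but the full key still exists the instant the shares are combined -- and that recombined copy is exactly what an attacker, or a coerced set of custodians, would go after:

```python
import secrets

def split_key(key: bytes, parties: int) -> list[bytes]:
    """Split `key` into XOR shares; all shares together recover it."""
    shares = [secrets.token_bytes(len(key)) for _ in range(parties - 1)]
    last = key
    for share in shares:
        last = bytes(a ^ b for a, b in zip(last, share))
    shares.append(last)  # equals key XOR all the random shares
    return shares

def combine(shares: list[bytes]) -> bytes:
    """XOR all shares back together to reconstruct the key."""
    out = bytes(len(shares[0]))
    for share in shares:
        out = bytes(a ^ b for a, b in zip(out, share))
    return out

master = secrets.token_bytes(32)
shares = split_key(master, parties=3)
assert combine(shares) == master       # all parties together: full access
assert combine(shares[:2]) != master   # a strict subset learns nothing
```

Note what the split does not change: for the scheme to work at all, the master key must be reconstructable, so the system as a whole is only as strong as the process that guards, transports, and recombines the shares.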
Thankfully, at least some in the government seem to recognize this:
“The basic question is, is it possible to design a completely secure system” to hold a master key available to the U.S. government but not adversaries, said Donna Dodson, chief cybersecurity advisor at the Commerce Department’s National Institute of Standards and Technologies. “There’s no way to do this where you don’t have unintentional vulnerabilities.”
So now the question is whether the White House will actually listen to the cybersecurity experts at NIST -- or to the people who want to undermine cybersecurity at the NSA and the FBI.
Update: While the article in question claimed that Dr. Wertheimer was the Director of Research for the NSA, an email from the NSA alerts us that Wertheimer left the NSA before writing the article.
As you may recall, one of the big Snowden revelations was the fact that the NSA "took control" over a key security standard allowing backdoors to be inserted (or, at least, a weakness that made it easy to crack). It didn't take long for people to realize that the standard in question was Dual_EC_DRBG, or the Dual Elliptic Curve Deterministic Random Bit Generator. It also came out that the NSA had given RSA $10 million to push this compromised random bit generator as the default. That said, as we noted, many had already suspected something was up and had refused to use Dual_EC_DRBG. In fact, all the way back in 2007, there was a widespread discussion about the possibility of the NSA putting a backdoor in Dual_EC_DRBG, which is why so few actually trusted it.
Still, to have the details come out in public was a pretty big deal, so it also seemed like a fairly big deal to see that the Director of Research at the NSA, Dr. Michael Wertheimer (also former Assistant Deputy Director and CTO in the Office of the Director of National Intelligence), had apparently written something of an apology in the latest Notices of the American Mathematical Society. In a piece entitled, "The Mathematics Community and the NSA," Wertheimer sort of apologizes, admitting that mistakes were made. After admitting that concerns were raised by Microsoft researchers in 2007, and again with the Snowden documents (though without saying why they were raised the second time), here's Wertheimer's "apology."
With hindsight, NSA should have ceased supporting the Dual_EC_DRBG algorithm immediately after security researchers discovered the potential for a trapdoor. In truth, I can think of no better way to describe our failure to drop support for the Dual_EC_DRBG algorithm as anything other than regrettable. The costs to the Defense Department to deploy a new algorithm were not an adequate reason to sustain our support for a questionable algorithm. Indeed, we support NIST’s April 2014 decision to remove the algorithm. Furthermore, we realize that our advocacy for the Dual_EC_DRBG casts suspicion on the broader body of work NSA has done to promote secure standards. Indeed, some colleagues have extrapolated this single action to allege that NSA has a broader agenda to “undermine Internet encryption.” A fair reading of our track record speaks otherwise. Nevertheless, we understand that NSA must be much more transparent in its standards work and act according to that transparency. That effort can begin with the AMS now.
However, as security researcher/professor Matthew Green quickly shot back, this is a bullshit apology, because he's really only apologizing for not dropping the standard when they got caught red handed back in 2007.
The trouble is that on closer examination, the letter doesn't express regret for the inclusion of Dual EC DRBG in national standards. The transgression Dr. Wertheimer identifies is simply the fact that NSA continued to support the algorithm after major questions were raised. That's bizarre.
Green also takes on Wertheimer's weak attempt to still defend pushing the compromised Dual_EC_DRBG as ridiculous. Here were Wertheimer's arguments for why it was still okay:
The Dual_EC_DRBG was one of four random number generators in the NIST standard; it is neither required nor the default.

The NSA-generated elliptic curve points were necessary for accreditation of the Dual_EC_DRBG but only had to be implemented for actual use in certain DoD applications.

The trapdoor concerns were openly studied by ANSI X9F1, NIST, and by the public in 2007.
But, again, those don't make much sense and actually make Wertheimer's non-apology that much worse. As Green notes, even though there were other random number generators, the now infamous RSA deal did lead some to use it since it was the "default" in a popular software library and because NIST had declared the standard safe, meaning that people trusted it. Green also goes into great detail describing how the second point is also incredibly misleading. It's worth reading his full explanation, but the short version is that despite some people fearing the NSA's plan would have a backdoor, the details and the possible "alternatives" to avoid that were completely hidden away and more or less dropped.
And that final point, well... really? That's basically saying, "Well, people thought we might have put in a backdoor, but couldn't prove it, so there, you guys had your chance to debate it." Never mind the fact that there actually was a backdoor, and that it wasn't confirmed until years later. And, as Green notes, many of the concerns were raised earlier and swept under the rug. The standard was also pushed and adopted by RSA as a default long before some of these concerns were raised.
This might all be academic, but keep this in mind: we now know that RSA Security began using the Dual EC DRBG random number generator in BSAFE -- as the default, I remind you -- in 2004. That's three years during which concerns were not openly studied by the public.
To state that the trapdoor concerns were 'openly' studied in 2007 is absolutely true. It's just completely irrelevant.
In other words, this isn't an apology. It's an apology that the NSA got caught (and didn't stop pushing things the first time it got caught), and then a weak defense of why they still went ahead with a compromised offering.
Wertheimer complains that this one instance has resulted in distrust from the mathematics and cryptography community. If so, his weak response isn't going to help very much.
Even as major NSA reform appears to have become a cruel joke, there are still some small wins happening elsewhere. As noted by Access, the House Science, Space, and Technology Committee adopted an amendment to the FIRST Act (Frontiers in Innovation, Research, Science, and Technology -- which is supposed to be about increasing funding in science and technology) that says the National Institute of Standards and Technology (NIST) no longer has to consult with the NSA on encryption standards.
As you may recall, the NSA secretly took over an encryption standard, purposely weakened it, paid RSA to make it a "default" in one of its products and basically weakened everyone's security. NIST has been dealing with the consequences ever since.
The Amendment, authored by Rep. Alan Grayson, would mean that NIST can skip dealing with the NSA altogether. As Grayson noted in a statement:
These are serious allegations. NIST, which falls solely under the jurisdiction of the Science, Space, and Technology Committee, has been given "the mission of developing standards, guidelines, and associated methods and techniques for information systems". To violate that charge in a manner that would deliberately lessen encryption standards, and willfully diminish American citizens' and business' cyber-security, is appalling and warrants a stern response by this Committee. Many businesses, from Facebook to Google, have lamented the NSA's actions in the cyber world; and some, such as Lavabit, have consciously decided to shut their doors rather than continue to comply with the wishes of the NSA. Changes need to be made at NIST to protect its work in the encryption arena.
Back in December, it was revealed that the NSA had given RSA $10 million to push weakened crypto. Specifically, RSA took the $10 million to make the Dual Elliptic Curve Deterministic Random Bit Generator, better known as Dual_EC_DRBG, the default random number generator in its BSAFE offering. The random number generator is a key part of crypto, because true randomness is nearly impossible to achieve, so you need a generator whose output is as close to random as possible. If it's not, you've basically made incredibly weak crypto that is easy to break. And that's clearly what happened here. There were other stories, released earlier, about how the NSA spent hundreds of millions of dollars to effectively take over security standards surreptitiously, including at least one standard from the National Institute of Standards and Technology (NIST). People quickly realized they were talking about Dual_EC_DRBG, meaning that the algorithm was suspect from at least September of last year (though there were indications many suspected it much earlier).
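A hypothetical sketch of why a predictable generator is fatal: Dual_EC_DRBG's trapdoor involves elliptic-curve math rather than a leaked seed, but for anyone who can predict the generator's output, the effect is the same as in this toy example.

```python
import random

# Toy illustration only: if an attacker can learn or predict a generator's
# internal state (here, a guessable seed), every "secret" derived from it
# can simply be regenerated.  Python's random module is a deterministic
# Mersenne Twister and must never be used for keys; use secrets instead.
guessable_seed = 20150714
victim = random.Random(guessable_seed)
session_key = victim.getrandbits(128)

attacker = random.Random(guessable_seed)  # replays the identical stream
assert attacker.getrandbits(128) == session_key
```

The encryption wrapped around such a key can be mathematically flawless and it won't matter: the attacker never has to break the cipher, only to rerun the generator.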
The revised document retains three of the four previously available options for generating pseudorandom bits needed to create secure cryptographic keys for encrypting data. It omits an algorithm known as Dual_EC_DRBG, or Dual Elliptic Curve Deterministic Random Bit Generator. NIST recommends that current users of Dual_EC_DRBG transition to one of the three remaining approved algorithms as quickly as possible.
In September 2013, news reports prompted public concern about the trustworthiness of Dual_EC_DRBG. As a result, NIST immediately recommended against the use of the algorithm and reissued SP 800-90A for public comment.
Some commenters expressed concerns that the algorithm contains a weakness that would allow attackers to figure out the secret cryptographic keys and defeat the protections provided by those keys. Based on its own evaluation, and in response to the lack of public confidence in the algorithm, NIST removed Dual_EC_DRBG from the Rev. 1 document.
In the announcement, NIST also points out that it's reviewing its cryptographic standards development process, to try to prevent this sort of thing from happening again.
On Friday, a very big story broke on Reuters, saying that the NSA had paid RSA $10 million in order to promote Dual EC DRBG encryption as the default in its BSAFE product. It had been suspected for a few years, and more or less confirmed earlier this year, that the NSA had effectively taken over the standards process for this standard, allowing it to hide a weakness, making it significantly easier for the NSA to crack any encrypted content using it.
As plenty of people noted, the news that RSA took $10 million to promote a compromised crypto standard pretty much destroys RSA's credibility. The company, now owned by EMC, has now put out a statement in response to all of this, which some claim is RSA denying the story. In fact, RSA itself states: "we categorically deny this allegation." But, as you read the details, that doesn't appear to be the case at all. The company more or less says that it doesn't reveal details of contracts, so it won't confirm or deny any particular one, and that while it did promote Dual EC DRBG, and knew that the NSA was involved, it never knew that the algorithm was compromised.
In short: yes, RSA did exactly what the Reuters article claimed, but its best defense is that it didn't know that Dual EC DRBG was compromised, so they didn't take money to weaken crypto... on purpose. Even if that's what happened.
We made the decision to use Dual EC DRBG as the default in BSAFE toolkits in 2004, in the context of an industry-wide effort to develop newer, stronger methods of encryption. At that time, the NSA had a trusted role in the community-wide effort to strengthen, not weaken, encryption.
Right, but that raises questions of why RSA trusted NSA to be a good player here, rather than trying to insert compromises or backdoors into key standards.
This algorithm is only one of multiple choices available within BSAFE toolkits, and users have always been free to choose whichever one best suits their needs.
Yes, but it was the default. And, as everyone knows, a very large percentage of folks just use the default.
We continued using the algorithm as an option within BSAFE toolkits as it gained acceptance as a NIST standard and because of its value in FIPS compliance. When concern surfaced around the algorithm in 2007, we continued to rely upon NIST as the arbiter of that discussion.
Again, this doesn't make RSA look good. As has now become clear, the NSA had basically sneakily taken over the whole standardization process. RSA more or less trusting NIST without looking into the matter themselves raises questions. Especially if there was a $10 million contract that incentivized them not to dig too deeply. RSA promoted this standard as the default in BSAFE. You would hope that a company with the stature in the space like RSA would be more careful than just to rely on someone else's say so that a particular standard is secure.
RSA claiming it didn't know the standard the NSA paid them $10 million to make default was suspect is hardly convincing. Why else would the NSA suddenly pay them $10 million to promote that standard? Furthermore, it appears that news of this $10 million contract was known a bit more widely. Chris Soghoian points to an email from cypherpunk Lucky Green, from back in September, to a cryptography mailing list in which he more or less reveals the same info that Reuters reported on Friday, though without naming the company.
According to published reports that I saw, NSA/DoD pays $250M (per
year?) to backdoor cryptographic implementations. I have knowledge of
only one such effort. That effort involved DoD/NSA paying $10M to a
leading cryptographic library provider to both implement and set as
the default the obviously backdoored Dual_EC_DRBG as the default RNG.
This was $10M wasted. While this vendor may have had a dominating
position in the market place before certain patents expired, by the
time DoD/NSA paid the $10M, few customers used that vendor's
While this describes the right amount, if the NSA is really spending $250 million, it's certainly possible that it has quite a few other $10 million contracts out there to promote or avoid certain other encryption standards depending on what it desires. Hopefully, some reporters are currently reaching out to all the companies on this list to see if they've got any contracts with the NSA concerning Dual EC DRBG.
Companies taking money from the NSA, but claiming that they didn't realize the encryption the contract pushed them to promote was compromised, aren't going to find a very sympathetic audience outside of the NSA. RSA's "categorical denial" here misses the point. It certainly doesn't suggest that the Reuters story was wrong -- just that RSA was so blinded by a mere $10 million that it didn't bother to make sure the standard wasn't compromised.
The review, announced late Friday afternoon by the National Institute of Standards and Technology, will also include an assessment of how the institute creates encryption standards.
The institute sets national standards for everything from laboratory safety to high-precision timekeeping. NIST's cryptographic standards are used by software developers around the world to protect confidential data. They are crucial ingredients for privacy on the Internet, and are designed to keep Internet users safe from being eavesdropped on when they make purchases online, pay bills or visit secure websites.
But as the investigation by ProPublica, The Guardian and The New York Times in September revealed, the National Security Agency spends $250 million a year on a project called "SIGINT Enabling" to secretly undermine encryption. One of the key goals, documents said, was to use the agency's influence to weaken the encryption standards that NIST and other standards bodies publish.
"Trust is crucial to the adoption of strong cryptographic algorithms," the institute said in a statement on their website. "We will be reviewing our existing body of cryptographic work, looking at both our documented process and the specific procedures used to develop each of these standards and guidelines."
The NSA is no stranger to NIST's standards-development process. Under current law, the institute is required to consult with the NSA when drafting standards. NIST also relies on the NSA for help with public standards because the institute doesn't have as many cryptographers as the agency, which is reported to be the largest employer of mathematicians in the country.
"Unlike NSA, NIST doesn't have a huge cryptography staff," said Thomas Ptacek, the founder of Matasano Security. "NIST is not the direct author of many of its most important standards."
Matthew Scholl, the deputy chief at the Computer Security Division of the institute, echoed that statement, "As NIST Director Pat Gallagher has said in several public settings, NIST is designed to collaborate and the NSA has some of the world's best minds in cryptography." He continued, "We also have parallel missions to protect federal IT systems, so we will continue to work with the NSA."
Some of these standards are products of public competitions among academic cryptography researchers, while others are the result of NSA recommendations. An important standard, known as SHA2, was designed by the NSA and is still trusted by independent cryptographers and software developers worldwide.
NIST withdrew one cryptographic standard, called Dual EC DRBG, after documents provided to news organizations by the former intelligence contractor Edward Snowden raised the possibility that the standard had been covertly weakened by the NSA.
Soon after, a leading cryptography company, RSA, told software writers to stop using the algorithm in a product it sells. The company promised to remove the algorithm in future releases.
Many cryptographers have expressed doubt about NIST standards since the initial revelations were published. One popular encryption library changed its webpage to boast that it did not include NIST-standard cryptography. Silent Circle, a company that makes encryption apps for smartphones, promised to replace the encryption routines in its products with algorithms not published by NIST.
If the NIST review prompts significant changes to existing encryption standards, consumers will not see the benefit immediately. "If the recommendations change, lots of code will need to change," said Tanja Lange, a cryptographer at the University of Technology at Eindhoven, in the Netherlands. "I think that implementers will embrace such a new challenge, but I can also imagine that vendors will be reluctant to invest the extra time."
In Friday's announcement, NIST pointed to its long history of creating standards, including the role it had in creating the first national encryption standard in the 1970s — the Data Encryption Standard, known as DES. "NIST has a proud history in open cryptographic standards, beginning in the 1970s with the Data Encryption Standard," the bulletin said. But even that early standard was influenced by the NSA.
During the development of DES, the agency insisted that the algorithm use weaker keys than originally intended — keys more susceptible to being broken by supercomputers. At the time, Whitfield Diffie, a digital cryptography pioneer, raised serious concerns about the keys. "The standard will have to be replaced in as few as five years," he wrote.
The weakened keys in the standard were not changed. DES was formally withdrawn by the institute in 2005.
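Diffie's arithmetic is easy to reproduce. A 56-bit key admits 2^56 possibilities, a space small enough to search exhaustively; the search rate below is an illustrative assumption, and dedicated cracking hardware does far better:

```python
# Back-of-the-envelope brute-force times.  The 10**9 keys/second rate is
# an illustrative assumption, not a benchmark of any particular machine.
rate = 10 ** 9                       # keys tried per second
year = 365 * 24 * 3600               # seconds per year

des_years = 2 ** 56 / rate / year    # 56-bit DES keyspace
aes_years = 2 ** 128 / rate / year   # a modern 128-bit keyspace

print(f"DES (56-bit):  ~{des_years:.1f} years to exhaust")
print(f"128-bit:       ~{aes_years:.1e} years to exhaust")
```

At that rate the whole DES keyspace falls in a couple of years of machine time (and in 1998 the EFF's purpose-built Deep Crack machine found a DES key in days), while a 128-bit keyspace remains many orders of magnitude beyond reach.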
The announcement is the latest effort by NIST to restore the confidence of cryptographers. A representative from NIST announced in a public mailing list, also on Friday, that the institute would restore the original version of a new encryption standard, known as SHA3, that had won a recent design competition but had been altered by the institute after the competition ended. Cryptographers charged that NIST's changes to the algorithm had weakened it.
The SHA3 announcement referred directly to cryptographers' concerns. "We were and are comfortable with that version on technical grounds, but the feedback we've gotten indicates that a lot of the crypto community is not comfortable with it," wrote John Kelsey, NIST's representative. There is no evidence the NSA was involved in the decision to change the algorithm.
The reversal took Matthew Green, a cryptographer at Johns Hopkins University, by surprise. "NIST backed down! I'm not sure they would have done that a year ago," he said.
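Both hash families discussed above now ship in ordinary tooling; Python's standard library, for instance, exposes SHA-2 and (since version 3.6) SHA-3. A small sketch:

```python
import hashlib

msg = b"attack at dawn"

# SHA-2 (an NSA design that independent cryptographers still trust) and
# SHA-3 (the Keccak competition winner) are unrelated constructions and
# produce entirely different digests for the same input.
sha2 = hashlib.sha256(msg).hexdigest()
sha3 = hashlib.sha3_256(msg).hexdigest()
assert sha2 != sha3 and len(sha2) == len(sha3) == 64

# A one-word change flips the digest completely (the avalanche effect).
assert hashlib.sha256(b"attack at dusk").hexdigest() != sha2
```

Having two independently designed, widely vetted families is itself a hedge: doubts about one standard's provenance don't automatically taint the other.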
After the revelations of how the NSA basically authored a crypto standard surreptitiously with obligatory backdoors, plenty of people started exploring exactly which standard it was -- and called on the various reporters with access to Snowden's documents to come clean, mainly to protect people who were now using insecure crypto. Buried in a blog post that focuses more on the NIST's non-response to the news, the NY Times finally revealed both what standard it was, the Dual EC DRBG standard, and how Canadian intelligence basically was the cover, helping to hide the NSA's efforts:
But internal memos leaked by a former N.S.A. contractor, Edward Snowden, suggest that the N.S.A. generated one of the random number generators used in a 2006 N.I.S.T. standard — called the Dual EC DRBG standard — which contains a back door for the N.S.A. In publishing the standard, N.I.S.T. acknowledged “contributions” from N.S.A., but not primary authorship.
Internal N.S.A. memos describe how the agency subsequently worked behind the scenes to push the same standard on the International Organization for Standardization. “The road to developing this standard was smooth once the journey began,” one memo noted. “However, beginning the journey was a challenge in finesse.”
At the time, Canada’s Communications Security Establishment ran the standards process for the international organization, but classified documents describe how ultimately the N.S.A. seized control. “After some behind-the-scenes finessing with the head of the Canadian national delegation and with C.S.E., the stage was set for N.S.A. to submit a rewrite of the draft,” the memo notes. “Eventually, N.S.A. became the sole editor.”
That same article notes that people inside NIST "feel betrayed by their colleagues at the NSA," but I wonder if NIST will ever be able to regain any real sense of trust with the crypto community.
from the that's-not-going-to-calm-anyone-down dept
One of the key revelations from last week, of course, was the fact that the NSA surreptitiously took over the standards making process on certain encryption standards. Here was the key revelation:
Independent security experts have long suspected that the NSA has been introducing weaknesses into security standards, a fact confirmed for the first time by another secret document. It shows the agency worked covertly to get its own version of a draft security standard issued by the US National Institute of Standards and Technology approved for worldwide use in 2006.
"Eventually, NSA became the sole editor," the document states.
It took NIST a few days to figure out a response to this, but it's now been posted, and it says... basically nothing at all. Let's go through it piece by piece.
Recent news reports have questioned the cryptographic standards development process at NIST. We want to assure the IT cybersecurity community that the transparent, public process used to rigorously vet our standards is still in place.
Um, except that as the leaks revealed, that's not actually true. The NSA was the "sole editor" of the standard. So claiming that the standards are rigorously vetted is simply false. Furthermore, as John Gilmore recently revealed, concerning IPSec, the NSA made sure that the standards were so complicated that no one could actually vet the security.
NIST would not deliberately weaken a cryptographic standard. We will continue in our mission to work with the cryptographic community to create the strongest possible encryption standards for the U.S. government and industry at large.
That's not a response to the charges at all.
NIST has a long history of extensive collaboration with the world’s cryptography experts to support robust encryption. The National Security Agency (NSA) participates in the NIST cryptography development process because of its recognized expertise. NIST is also required by statute to consult with the NSA.
In other words, yes, the NSA is involved -- which was not a secret. But what was a secret, and what NIST does not even begin to address, is the idea that the NSA took control of the standard and became its "sole editor."
Recognizing community concern regarding some specific standards, we reopened the public comment period for Special Publication 800-90A and draft Special Publications 800-90B and 800-90C to give the public a second opportunity to view and comment on the standards.
Again, that does little to address the specific questions raised. If the standards are designed by the NSA in a manner that makes the security aspect inscrutable to even the most experienced cryptographers without simplifying the standard, then that's not doing any good.
If vulnerabilities are found in these or any other NIST standards, we will work with the cryptographic community to address them as quickly as possible.
Yes, but the "cryptographic community" seems to include the NSA... sometimes in key positions.
Basically this is a total non-response to the revelations from last week. It's just NIST saying "yes, we work with the NSA, but you have nothing to fear" without giving any basis to support the end of that claim.