A former Defense Intelligence Agency officer has taken to LinkedIn to point out that all of us griping about the broken Vulnerability Equities Process -- exposed by hackers holding NSA zero-days -- have it all wrong. Michael Tanji says the NSA isn't here to protect developers from malicious attacks. It never was and it never will be.
Intelligence agencies exist to gather information, analyze it, and deliver their findings to policymakers so that they can make decisions about how to deal with threats to the nation. Period. You can, and agencies often do, dress this up and expand on it in order to motivate the workforce, or more likely grab more money and authority, but when it comes down to it, stealing and making sense of other people’s information is the job. Doing code reviews and QA for Cisco is not the mission.
Suck it up, Cisco. That gaping hole uncovered by the Shadow Brokers was discovered at least three years ago by the NSA and if it chose not to tell you about it, it had its reasons. Namely: national security.
The Obama administration made sympathetic noises in the wake of the Snowden leaks, suggesting the NSA err on the side of disclosure. It simultaneously gave the agency no reason to ever do that by appending "unless national security, etc." to the statement.
But part of the phrase "national security" is the word "security." (And the other part -- "national" -- suggests this directive also covers protecting US companies from attacks, not just the more amorphous "American public.") Allowing tech companies that provide network security software and hardware to other prime hacking targets to remain unaware of security holes doesn't exactly serve the nation or its security. So, while Tanji may claim the NSA isn't in the QA business, it sort of is. The thing is, the NSA prefers to exploit QA issues rather than give affected developers a chance to patch them.
And if an NSA operative left behind a bag of tech tools in a compromised server, it really doesn't do much for the argument that the government can be trusted with encryption backdoors -- the sort of thing FBI Director James Comey is still hoping will materialize as a result of his never-ending "going dark" sales pitch. Julian Sanchez, writing for Cato, points out that the NSA's mistake should lead to some pretty severe trust issues.
This hack also ought to give pause to anyone swayed by the government’s assurances that we can mandate government backdoors in encryption software and services, allowing the “good guys” (law enforcement and intelligence agencies) to access the communications of criminals and terrorists without compromising the security of millions of innocent users. If even the NSA’s most closely guarded hacking tools cannot be secured, why would any reasonable person believe that keys to cryptographic backdoors could be adequately protected by far less sophisticated law enforcement agencies? The Equation Group hack is a disturbingly concrete demonstration of what network security experts have been saying all along: Once you create a backdoor, there is no realistic way to guarantee that only the good guys will be able to walk through it.
So, that's one huge problem with both the hoarding of exploits and the NSA's refusal to actually participate in the Vulnerability Equities Process. The definition the NSA has chosen for "national security" doesn't mesh with statements made by its cybersecurity overseers.
Back in 2014, federal cybersecurity coordinator Michael Daniel insisted in a post on the White House blog that the process is strongly weighted in favor of disclosure. The government, he assured the public, understands that “[b]uilding up a huge stockpile of undisclosed vulnerabilities while leaving the Internet vulnerable and the American people unprotected would not be in our national security interest.”
Maybe things have changed in the past couple of years, but they haven't changed as much as Michael Tanji claims. He states that the NSA is no longer charged with playing cyber-defense.
The one element in the intelligence community that was charged with supporting defense is no more. I didn’t like it then, and it seems pretty damn foolish now, but there you are, all in the name of “agility.” NSA’s IAD had the potential to do the things that all the security and privacy pundits imagine should be done for the private sector, but their job was still keeping Uncle Sam secure, not Wal-Mart.
That's simply not true. The NSA may secretly wish it had been completely rerouted to "attack" mode. That would more easily justify the hoarding of vulnerabilities and its ongoing refusal to hand over info to affected developers. But it's still supposed to be playing defense -- which means it has an obligation both to the American public, who use the software and hardware the NSA would rather see left unpatched, and to the developers it's purposefully leaving open to malicious attacks.
The NSA has decided the best way to handle these competing directives is to muddy the waters by making them inseparable.
Because computers are now the easiest way to spy on people, and because everyone — even U.S. adversaries — uses the same Internet, there has long been what officials like to call a "healthy" or "creative" tension between the foreign espionage mission and the information assurance mission of the NSA.
Crudely put, the IA's cyber mission is to find security holes in Internet infrastructure and common software and patch them; the signals intelligence mission is to find the same holes and keep them open as long as possible so they can be used to spy on foreigners.
When the two directorates merge, some fear that the much larger and better funded signals intelligence mission will simply absorb the IA mission.
As it stands now, the offensive side of the NSA's cybersquad is roughly twice the size of its defensive team -- which clearly indicates which end of the equation the NSA believes is more important to its national security mission.
The NSA's actions regarding the Vulnerability Equities Process show it believes some forms of national security are more equal than others. It's far more interested in ensuring its collections continue to be fed than it is with patching security holes -- holes it has often created -- that affect millions of US citizens and dozens of hacker-tempting firms.
It also shows the government is not to be trusted when it demands "good guy only" access. It can't protect the backdoors it's already created, and it has only the slightest interest in protecting the nation from the bad guys who will inevitably find its secret entrances.