Former NSA Official Argues The Real Problem With Undisclosed Exploits Is Careless End Users

from the sorry-about-all-the-ransomware dept

As leaked NSA software exploits have been redeployed to cause computer-based misery all over the world, the discussion about vulnerability disclosures has become louder. The argument for secrecy is based on the assumption that fighting an existential threat (terrorism, but likely also a variety of normal criminal behavior) outweighs concerns the general public might have about the security of their software/data/personal information. Plenty of recent real-world examples (hospital systems ransomed! etc.) do the arguing for those seeking expanded disclosure of vulnerabilities and exploits.

Former Deputy Director of the NSA Rick Ledgett appears on the pages of Lawfare to argue against disclosure, just as one would have gathered by reading his brief author bio. Ledgett’s arguments, however, feel more like dodges. First off, Ledgett says the NSA shouldn’t have to disclose every vulnerability/exploit it has in its arsenal, an argument very few on the other side of the issue are actually making. Then he says arguments against exploit hoarding “oversimplify” the issue.

The WannaCry and Petya malware, both of which are partially based on hacking tools allegedly developed by the National Security Agency, have revived calls for the U.S. government to release all vulnerabilities that it holds. Proponents argue that this would allow patches to be developed, which in turn would help ensure that networks are secure. On its face, this argument might seem to make sense—but it is a gross oversimplification of the problem, one that not only would not have the desired effect but that also would be dangerous.

At this point, you’d expect Ledgett to perform some de-simplification. Instead, the post detours for a bit to do some victim-blaming. It’s not the NSA’s fault if undisclosed exploits wreak worldwide havoc. It’s the end users who are the problem — the ones who (for various reasons) use outdated system software or don’t keep current with patches. This isn’t a good argument to make for the very reasons outlined in Ledgett’s opening paragraph: software vendors can’t patch flaws they’re unaware of. This is where disclosure would help protect more users, even if it meant the loss of some surveillance intercepts.

Then Ledgett argues the NSA’s leaked exploits weren’t really the problem. If they hadn’t been available, the malware purveyors just would have used something else.

The actors behind WannaCry and Petya, believed by some to be from North Korea and Russia, respectively, had specific goals when they unleashed their attacks. WannaCry seemed to be straightforward but poorly executed ransomware, while Petya appeared to have a more sinister, destructive purpose, especially in the early Ukraine-based infection vector. Those actors probably would have used whatever tools were available to achieve their goals; had those specific vulnerabilities not been known, they would have used others. The primary damage caused by Petya resulted from credential theft, not an exploit.

This is undoubtedly true. Bad actors use whatever tools help them achieve their ends. It’s just that these specific cases — the cases used by Ledgett to argue against increased disclosure — were based on NSA exploits vendors hadn’t been informed of yet. The patches that addressed more current vulnerabilities weren’t issued until after the NSA told Microsoft about them, and it only did that because its toolset was no longer under its control.

Ledgett also points out that the NSA does better than most state entities in terms of disclosure:

Most of the vulnerabilities discovered by the U.S. government are disclosed, and at the National Security Agency the percentage of vulnerabilities disclosed to relevant companies has historically been over 90 percent. This is atypical, as most world governments do not disclose the vulnerabilities they find.

Maybe so, but there’s not much honor in merely being better than the worst governments. Ledgett only says the NSA is better than “most.” This doesn’t turn the NSA into a beacon of surveillance state forthrightness. All it does is place it above governments less concerned about the security and wellbeing of their citizens.

Ledgett then goes back to the well, claiming a) the two recent attacks had nothing to do with the NSA, and b) disclosing vulnerabilities would make the NSA less effective.

WannaCry and Petya exploited flaws in software that had either been corrected or superseded, on networks that had not been patched or updated, by actors operating illegally. The idea that these problems would be solved by the U.S. government disclosing any vulnerabilities in its possession is at best naive and at worst dangerous. Such disclosure would be tantamount to unilateral disarmament in an area where the U.S. cannot afford to be unarmed… Neither our allies nor our adversaries would give away the vulnerabilities in their possession, and our doing so would probably cause those allies to seriously question our ability to be trusted with sensitive sources and methods.

The problem here is that Ledgett ignores the obvious: leaked NSA tools helped create the problem. The NSA never disclosed these vulnerabilities to affected software vendors — at least not until it became obvious it could no longer keep these tools secret.

I’m guessing the NSA is already living through the last part of Ledgett’s paragraph. A set of effective, still-undisclosed vulnerabilities being digitally spirited away and dumped into the public’s lap probably makes it less likely foreign surveillance partners will be sharing their malware toolkits with the NSA.

This leads right into another argument against vulnerability hoarding: it has been shown with complete clarity that the NSA can’t guarantee its exploits will never be used by criminals and malicious governments. The leak of its toolkit shows any suggestion that only the “good guys” will have access to undisclosed vulnerabilities is both ignorant and arrogant. The NSA isn’t untouchable. Neither are all the surveillance partners the NSA has shared its tools with.

In the end, it’s the private sector’s fault, according to Ledgett. The solution is for vendors to write better software and end users to patch more frequently. This is good advice, but not an absolution of the NSA’s vulnerability secrecy.

The NSA needs to do better balancing its needs and the security of the general public. Very few people are arguing the NSA should have zero undisclosed exploits. But the exploits dumped by the Shadow Brokers affected older versions of Microsoft system software dating back to Windows XP and they still weren’t patched until the exploits had already been made public. These were exploits some in the NSA thought were too powerful, and yet, the NSA did nothing until the malware offspring of its secret exploit stash were taking down systems all over the world.



Comments on “Former NSA Official Argues The Real Problem With Undisclosed Exploits Is Careless End Users”

Anonymous Coward says:

The solution is for vendors to write better software…..

And if companies manage to write secure software, the NSA would agitate for backdoors to be implemented, just like it wants backdoors in encryption. Such backdoors will also be exploitable by the bad guys when, like those exploits, somebody accidentally or deliberately leaks them.

Anonymous Coward says:

I think it is 100% the NSA’s fault. Basic Security 101: the biggest security vulnerability in any system is the human factor. If they found a security hole, other governments have certainly found the same holes. Security through obscurity is not a good plan, especially for an agency that deals in security.

madasahatter (profile) says:

Re: Re:

The spooks do not want accountability but are passing the buck for their own ineptitude. First, any large program will have bugs that can be exploited. Second, many of these bugs will be found. Third, a few pose serious risks to computers and networks. Fourth, all the spooks in the world are hunting bugs. Fifth, crackers are hunting bugs. Sixth, both the spooks and crackers will find harmful bugs.

Anonymous Coward says:

Re: Re: Re:

Exactly. There is NO such thing as a perfect system. Never will be. All systems have failures, errors, and “holes”. This is a fact and principle of Systems Theory. If there were just ONE perfect system, all of mankind’s problems would have disappeared already, we would all look the same, and we would all earn the exact same salary. Hell, maybe we would all be just one organism.

But what do those idiots do instead of patching the hole and alerting everyone to it? They keep the hole open and then, “by their own ineptitude,” they let others duplicate (exploit) that hole.

When a system has a problem you fix it!

ECA (profile) says:


If you think about it: MS installed backdoors into your computers.
Adobe Flash, the same.
iTunes?? DOES report on you.
MS’s music player had a “phone home” until a few years back that reported all the music you played and used in the program.

There is 1 consideration: a TOTALLY secure system is a BITCH. If you forget your password, or CAN’T get it, you ERASE EVERYTHING AND START OVER.

Personanongrata says:

Liar, Liar, Pants on Fire

Former NSA Official Argues The Real Problem With Undisclosed Exploits Is Careless End Users

The undisclosed exploits have nothing to do with surveillance and everything to do with gaining access to a person’s machine/device.

The unconstitutional surveillance the criminals at the NSA (et al.) are carrying out is accomplished by intercepting network traffic at key hubs (e.g. where undersea fiber optic cables make landfall) and then storing the data in various repositories (e.g. Bluffdale, Utah).

Gaining access to a machine/device in order to trespass on a person’s private property is what the criminals at the NSA use the undisclosed exploits for.

Some of the exploits in the NSA’s toolkit are likely in part a collaborative effort between software/hardware manufacturers and the criminals of the NSA, working together during development to create Easter eggs (i.e. back doors) for gaining surreptitious access to a person’s private property.

The italicized/bold text below was excerpted from a report titled Revealed: The NSA’s Secret Campaign to Crack, Undermine Internet Security

Beginning in 2000, as encryption tools were gradually blanketing the Web, the N.S.A. invested billions of dollars in a clandestine campaign to preserve its ability to eavesdrop. Having lost a public battle in the 1990s to insert its own “back door” in all encryption, it set out to accomplish the same goal by stealth.

The agency, according to the documents and interviews with industry officials, deployed custom-built, superfast computers to break codes, and began collaborating with technology companies in the United States and abroad to build entry points into their products. The documents do not identify which companies have participated.

How many terrorists or terror plots have been stopped using the NSA’s (i.e. Five Eyes) criminal global surveillance regime?

The italicized/bold text below was excerpted from a report titled U.S. Mass Surveillance Has No Record of Thwarting Large Terror Attacks, Regardless of Snowden Leaks:

A White House panel concluded in December 2013 that the NSA’s bulk collection of Americans’ telephone information was “not essential in preventing attacks.” A member of the panel took it one step further, when he told NBC News that there were no examples of the NSA stopping “any [terror attacks] that might have been really big” using the program.

The answer is zero.

What the unconstitutional and criminal surveillance is good for is blackmail, industrial espionage, insider stock trading tips and keeping tabs on personal relationships.

stderric (profile) says:

Re: 0 undisclosed exploits

They could offer to disclose everything and promise to start obeying the Constitution, on the condition that people update & patch their systems with much greater diligence. I’d take the deal.

No, we shouldn’t have to buy our rights back from our own security agencies, but over the past decades the government has worn through my hard shell of idealism to the soft nougat of pragmatism underneath.

Anonymous Coward says:

Re: Re: 0 undisclosed exploits

Yes, but those updates should be provided for free by the vendor. In that case it would not be buying our rights back; they simply wouldn’t be guaranteed (as in, if you don’t update, you might not be covered).

But I agree with you, we should never pay for our rights. That is why VPNs suck (pay for your right to anonymity).

orbitalinsertion (profile) says:

So, who is the end user who left their exploits unsecured on a computer somewhere?

What group of end users is it, of which the aforementioned end user is a member, who have notoriously insecure, unpatched, and poorly configured systems which have repeatedly exposed metric craptons of data?

Who? Who are these pebkac monkeys? I. Just. Don’t. Know…

Ninja (profile) says:

There are two problems here.

One is regular updating by the companies. Microsoft does it and I believe they are doing a good security job (much better than, say, 10 years ago actually).

Then there’s the end user. How the fucking fucks do they expect anybody to protect themselves against undisclosed exploits that don’t even need real user input to get in? I mean, even if you could prevent infection by being completely paranoid about security, not everybody would have the expertise to take those added steps, even if you disregard the added hassle of operating the system that comes with them.

No, the problem is you shouldn’t be hoarding exploits. If you must, use them, gather some intel, and disclose as soon as possible.

Personanongrata says:

Exploit This!

The italicized/bold text below was excerpted from a report titled:

How the CIA made Google, Inside the secret network behind mass surveillance, endless war, and Skynet

INSURGE INTELLIGENCE, a new crowd-funded investigative journalism project, breaks the exclusive story of how the United States intelligence community funded, nurtured and incubated Google as part of a drive to dominate the world through control of information. Seed-funded by the NSA and CIA, Google was merely the first among a plethora of private sector start-ups co-opted by US intelligence to retain ‘information superiority.’

The origins of this ingenious strategy trace back to a secret Pentagon-sponsored group, that for the last two decades has functioned as a bridge between the US government and elites across the business, industry, finance, corporate, and media sectors. The group has allowed some of the most powerful special interests in corporate America to systematically circumvent democratic accountability and the rule of law to influence government policies, as well as public opinion in the US and around the world. The results have been catastrophic: NSA mass surveillance, a permanent state of global war, and a new initiative to transform the US military into Skynet.

Hat tip:

DannyB (profile) says:

Careless end users?

An undisclosed exploit probably also means an undisclosed vulnerability which that exploit takes advantage of.

What possible care is an end user expected to take against an undisclosed and therefore unknown vulnerability?

Even if you know that the rebel alliance has the death star plans, what measures can you take if you don’t know what the vulnerability is and how it will be exploited? Once you try to analyze how the X wing fighters are attacking, it is probably too late.

Anonymous Coward says:

Yes, all government-funded agencies should report all vulnerabilities to the public (preferably immediately after they are corrected). Yes, the NSA shouldn’t have built malware and used it for any purpose (which I think would be illegal for civilians). I agree with pretty much this whole post but…

They’re right. No matter how much you try to secure something the end-user is always the most vulnerable part of an IT ecosystem. As long as you have hired at least one person dumb enough to open an email attachment called “sexyladies.exe” you have more liabilities than security assets.



i had what eventually came to be called the sony root kit for 4 years

BEFORE it got …unexploited…..

the backdoor for all this was clever and required some neat compiling but …..

now ask yourself how 4 of us never had probs till we let one jerk get hold of the binaries….and why sony had such a hard time solving what they put into practice on their products….

ask yourself if its careless users, or careless security on my compatriots’ part to let an idiot get hold of the code, who then pandered it to sony like he was some smart ass…

perhaps this former unemployed bum of a govt agent might do well to really start thinking with his brain rather than his politics….

ever see pirated apps? i was warned 20 years ago that nearly 95% of all the applications that can be pirated are exploitable as well….what does this tell you when you see windows ten ….ohhhh all that spyware built right in ohhhhhhhhhhh
ya good f-in idea right?

now no one is gonna see it coming
people will get so peeved they might start doing the unthinkable

leaving this so called internet until it smartens back up

i doubt that happens for a while yet….

back to my video games ….that are all hackable but i bought anyways

oh and do keep raising prices so more and more people cant afford to use the net and stuff on it….

the goal is no longer about saving the net or the freedoms we can have, its about ripping to pieces this shit that has taken over it all.

the fbi makes malware …don’t you know back in 2000 they had 65 million honey pot servers on the net and its far more now….how do you think all those ipv4 addresses became rare….

i could go on and on what ive seen but …..maybe ill write a book and sell it on amazon ROFL

Anonymous Coward says:

First off, a factual issue.

It’s just that these specific cases — the cases used by Ledgett to argue against increased disclosure — were based on NSA exploits vendors hadn’t been informed of yet.

This isn’t true, or is at least misleading. The NSA never needed to inform Microsoft because Microsoft patched the vulnerability the day after it became public. This was 7 months before WannaCry hit, and 8 months before Petya.

8 months is a long time. If the NSA had informed Microsoft, the same machines would have been vulnerable. The problem is IT negligence on the part of affected businesses.

Second, I don’t think the NSA should be doing Microsoft’s job. The NSA discovered this exploit for the purpose of intelligence operations, and it was in fact being used for intelligence operations. This is the whole point of the NSA. Disclosing the vulnerability would have compromised those intelligence operations. You’re really asking the NSA to do something that isn’t their job. Your problem isn’t the NSA’s actions, it’s the NSA’s mission.

Now I’m gonna let you in on a little hacker secret. Virtually all exploits are found by analyzing Microsoft patches and patch notes. The NSA leak did not matter. If the NSA had earlier informed Microsoft, and Microsoft had earlier released a patch, the attacks would have just happened earlier. Microsoft isn’t the defense here, it’s just another vector by which the vulnerability could be discovered.

I will grant you that if the NSA had not researched the bug, the vulnerability may never have been discovered, and WannaCry/Petya would not have happened. Maybe they shouldn’t be in the business of cybersecurity research at all, because every discovered vulnerability is a time bomb simply waiting to go off.

If you want to prevent another Petya, I think there are only two approaches.

1) We need cultural changes so that we handle computers in a less exploitable way. This could be done by getting people to just take security more seriously (Mr. Ledgett’s approach). Microsoft has the most sway here, I believe.

2) We need to hold the bad actors (NK and Russia allegedly) responsible.

Anonymous Coward says:

Re: Re:

“Disclosing the vulnerability would have compromised those intelligence operations. “

Complete bullshit. Remember, intelligence is war, and there are no winners in war.

It is not the NSA’s job to patch vulnerabilities, but it is their DUTY to make them public as soon as they “discover” any. Otherwise we end up with WannaCry problems. Again, not all infected machines were due to the users’ negligence or IT’s negligence.
