Leaked NSA Exploits Shifting From Ransomware To Cryptocurrency Mining

from the now...-for-my-next-trick dept

Will we ever see a complete postmortem of the damage done by leaked NSA software exploits? All signs point to “no.”

[M]ore than a year since Microsoft released patches that slammed the backdoor shut, almost a million computers and networks are still unpatched and vulnerable to attack.

Although WannaCry infections have slowed, hackers are still using the publicly accessible NSA exploits to infect computers to mine cryptocurrency.

This report, from Zack Whittaker at TechCrunch, says there’s really no endpoint in sight for the unintended consequences of exploit hoarding. But at this point, it’s really no longer the NSA or Microsoft that’s to blame for the continued rampage. Stats from Shodan show more than 300,000 unpatched machines in the United States alone.

EternalBlue-based malware still runs rampant, but the focus has shifted from ransom to cryptocurrency. An unnamed company recently watched the NSA’s exploit turn its computers into CPU ATMs.

Nobody knows that better than one major Fortune 500 multinational, which was hit by a massive WannaMine cryptocurrency mining infection just days ago.

“Our customer is a very large corporation with multiple offices around the world,” said Amit Serper, who heads the security research team at Boston-based Cybereason.

“Once their first machine was hit the malware propagated to more than 1,000 machines in a day,” he said, without naming the company.

Fun stuff. And all made possible by the US government. Sure, indirectly, but it’s not like no one in the private sector ever expressed concerns about the agency’s vulnerability hoarding and the possibility of exactly this sort of thing happening. The exploit the NSA thought was too good to give up was taken from it and handed over to the malware-crafting masses to inflict misery around the world. Enemies were made — and not all of them were software and hardware developers.

There will never be a full accounting of the damage done. Yes, the NSA never thought its secret stash would go public, but that doesn’t excuse its informal policy of never disclosing massive vulnerabilities until it’s able to wring every last piece of intel from their deployment. And there’s a chance this will happen again in the future if the agency isn’t more proactive on the disclosure front. It was foolhardy to believe its tools would remain secret indefinitely. It’s especially insane to believe this now.



Comments on “Leaked NSA Exploits Shifting From Ransomware To Cryptocurrency Mining”

27 Comments
Anonymous Hero says:

> Stats from Shodan show more than 300,000 unpatched machines in the United States alone.

This is a topic in itself. In the last week or so I’ve been wondering how to incentivize patching. I swear, it has nothing to do with a recent update that borked my laptop’s integrated camera.

Unfortunately, weighing the remote chance of a malware attack against the disruption an update can wreak is easy in a bubble, but with work deadlines and the rest of life in play, there’s a strong incentive to stick with what works.

Anyone have any ideas on how to incentivize patching? I mean, I’d love something like, “Every time I install updates, Microsoft will pay me $100.” It’s a great deal, especially because I run Linux.

Anonymous Coward says:

Re: Re:

If you run Linux, run two or more machines with the same distro, and update one and check that everything works before updating the other(s). That way you have a backup in case of hardware or software failure. Also run a distro that allows manual kernel installs, which lets you keep the previous kernel and its drivers available as a fallback in case a kernel upgrade causes driver issues.
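
Something like the rough sketch below can act as a pre-upgrade sanity check. It’s only an illustration, and it assumes your distro keeps kernel images in /boot under names matching vmlinuz-*; adjust the pattern for your own setup.

```python
#!/usr/bin/env python3
# Pre-upgrade sanity check (a sketch, not a polished tool): make sure more
# than one kernel image is present in /boot so there is something to fall
# back on if the new kernel misbehaves. Assumes images are named vmlinuz-*.
import glob
import sys

kernels = sorted(glob.glob("/boot/vmlinuz-*"))

if len(kernels) < 2:
    sys.exit("Only %d kernel image(s) in /boot -- keep a known-good fallback before upgrading." % len(kernels))

print("Kernel images available as fallback candidates:")
for path in kernels:
    print("  " + path)
```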

Lawrence D’Oliveiro says:

Re: Re: If you run Linux, run two or more machines with the same dis

It’s easy enough to do a file-level backup of the entire OS installation before an upgrade. Then if you need to, you can roll it back afterwards. Boot off a USB stick with a copy of SystemRescueCD on it, and then you can revert the regular OS install from your backup.

rsync is a great tool for this.
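
A rough sketch of that backup step, wrapped in Python for readability, is below. The destination path and exclude list are placeholders I made up; adapt them to your own layout before trusting this with a real system.

```python
#!/usr/bin/env python3
# Sketch of a file-level backup of the OS installation, done by shelling
# out to rsync. DEST and EXCLUDES are illustrative placeholders.
import subprocess

DEST = "/mnt/backup/rootfs/"          # assumed mount point of the backup drive
EXCLUDES = ["/proc/*", "/sys/*", "/dev/*", "/run/*", "/tmp/*",
            "/mnt/*", "/media/*", "/lost+found"]

cmd = ["rsync", "-aAXH", "--delete"]  # archive mode plus ACLs, xattrs, hard links
for pattern in EXCLUDES:
    cmd += ["--exclude", pattern]
cmd += ["/", DEST]

# To roll back, boot a rescue medium (e.g. SystemRescueCD) and run the same
# command with source and destination swapped.
subprocess.run(cmd, check=True)
```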

Anonymous Coward says:

Re: Re: Re: If you run Linux, run two or more machines with the same dis

My approach is colored a bit by running Manjaro, where an update can break things installed from the AUR. There, figuring out what is broken and fixing it by re-installing is a better approach than a rollback. Also, Synergy makes a desktop cluster practical by sharing the keyboard, mouse, and clipboard, and the extra screens come in useful for reference manuals and the like while working on a project.

I.T. Guy says:

Re: Re:

In my 15+ years as a tech supporting Windows I have yet to have a patch break a personal machine to the point of having to re-image. Not that it doesn’t happen, but in my experience I have had to rebuild many more machines due to adware/spyware/viruses than due to a bad patch.

Most people don’t realize it until they are paying Geek Squad 200 bucks to restore their machine.

“incentivize patching?”
On the other end… fines for an unpatched machine. LOL. Just kidding.

ECA (profile) says:

VIVA LINUX

This is so much fun…
MS thinking it can create a server system that competes with an OS designed to be SERVER SOFTWARE…
The only comment I get from admins is that MS is easier.
The only problem I see is PAYING for it yearly and expecting them to stay AHEAD of the hacking… and MS isn’t the only thing you will need, and PAY for.

Anonymous Coward says:

Re: VIVA LINUX

> MS thinking it can create a server system that competes with an OS designed to be SERVER SOFTWARE…
> The only comment I get from admins is that MS is easier.
> The only problem I see is PAYING for it yearly and expecting them to stay AHEAD of the hacking… and MS isn’t the only thing you will need, and PAY for.

Admins use Windows because in large deployment scenarios it is easier to manage.

That’s not to say linux hasn’t had improvements in recent years; it certainly has. Actually, systemd, despite being universally hated, was a step in that direction before it turned into a Borg drone and started assimilating everything. Linux is still far from ready for large-scale deployment, however.

One of the most basic functions that admins take for granted in Windows is centralized configuration policy. This works under Windows regardless of version, and it can be fine-grained enough to apply only to the systems that actually need it. There is no equivalent functionality under linux. The best you can hope for is rsyncing /etc, but that assumes the same distro, and the same version of said distro, is used everywhere the "policy" is applied, because of config differences between versions of an application, maintainer compile-time settings per application, and other distro-specific changes.

Case in point: go configure an Asterisk server under Ubuntu 18.04. If you try to set the astdatadir variable in /etc/asterisk/asterisk.conf it will crash on startup, because the value was set at compile time to a non-standard directory so the maintainer could integrate their update-alternatives package. This also has a habit of breaking user-defined system recordings for use with IVRs. Now, that’s just one package, but it’s an important thing to keep track of if you decide to rsync that package’s config from another system, and as I said, it’s one package. Now imagine doing that for hundreds of packages across multiple distros.
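
To make that fragility concrete, here is a toy sketch of what an "rsync /etc as policy" script ends up looking like once you gate it on the exact distro release. The paths and the Ubuntu 18.04 target are placeholders of mine, not anything standard:

```python
#!/usr/bin/env python3
# Toy illustration (not a recommended tool): refuse to install a config
# bundle unless /etc/os-release matches the release it was built for.
# EXPECTED, POLICY_SRC, and POLICY_DST are hypothetical values.
import shutil
import sys

EXPECTED = {"ID": "ubuntu", "VERSION_ID": "18.04"}        # hypothetical target
POLICY_SRC = "/srv/policy/asterisk/asterisk.conf"          # hypothetical source
POLICY_DST = "/etc/asterisk/asterisk.conf"

def read_os_release(path="/etc/os-release"):
    info = {}
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if "=" in line:
                key, _, value = line.partition("=")
                info[key] = value.strip('"')
    return info

info = read_os_release()
if all(info.get(k) == v for k, v in EXPECTED.items()):
    shutil.copy2(POLICY_SRC, POLICY_DST)
    print("Policy applied.")
else:
    # On any other release the package's compile-time defaults (astdatadir,
    # file locations, etc.) may differ, so copying the file blindly risks
    # exactly the breakage described above.
    sys.exit("Release mismatch -- refusing to apply policy on " +
             info.get("PRETTY_NAME", "unknown system"))
```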

Another feature admins take for granted with Windows is the ability to define wireless and VPN profiles for laptops. Once again, this is not easy to do under a standard linux distro. NetworkManager could do this, but its API does not currently support such a capability, and its configs are non-portable between machines with different hardware configurations. NetworkManager also has a bad habit of assigning MAC addresses to profiles when they are activated for the first time, so any future hardware change causes the profile to be disabled and a new one made. NetworkManager also has a bad set of GUI config tools that tend not to work correctly, which confuses users.

Don’t get me started on dnsmasq vs. systemd-resolved vs. resolvconf either. I can’t set the DNS settings in our VPN software because it will cause the connection to fail on the clients when they try to use a non-installed DNS config manager to apply the config. Worse, the error message returned by the clients is flat-out wrong, because it was the client that failed to set its DNS config, not some server error. And that’s assuming you can get the wireless working to begin with…
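
For what it’s worth, the closest thing I know of to pushing a wireless profile is dropping a NetworkManager keyfile into /etc/NetworkManager/system-connections/ and reloading. The sketch below is only an illustration with a placeholder connection name, SSID, and passphrase (run as root), and it still doesn’t solve the per-machine MAC problem described above:

```python
#!/usr/bin/env python3
# Sketch of pushing a Wi-Fi profile as a NetworkManager keyfile -- the
# nearest analogue to Windows-style profile deployment I know of.
# Connection name, SSID, and passphrase are placeholders.
import os
import subprocess
import textwrap

KEYFILE = textwrap.dedent("""\
    [connection]
    id=corp-wifi
    type=wifi

    [wifi]
    ssid=ExampleCorpNet
    mode=infrastructure

    [wifi-security]
    key-mgmt=wpa-psk
    psk=changeme

    [ipv4]
    method=auto
    """)

path = "/etc/NetworkManager/system-connections/corp-wifi.nmconnection"
with open(path, "w") as fh:
    fh.write(KEYFILE)
os.chmod(path, 0o600)   # NetworkManager ignores keyfiles readable by other users
subprocess.run(["nmcli", "connection", "reload"], check=True)
```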

Now some of you may say: "Well, just use Puppet or some other tool!" There are two problems with that. First, it costs more money. For a system that bills itself as the free alternative, having to walk into a board meeting to justify spending money on making it work the way we need it to, and then also admit there is a non-zero chance it still won’t be up to the task afterwards, is not the best move from a business perspective. Especially when Microsoft and friends will happily sell you a system that does work out of the box, and that any competent admin will know how to use, no questions asked.

Second, with Puppet and the others you still have to sort out the distro differences yourself, and there is no standard between them or between the different automation solutions. Windows has the GUI included by default and a large, mostly non-fragmented user base, which means that when problems come up you have a much better chance of getting help, and the solutions are consistent across versions. Honestly, it would have been easier just adapting Group Policy to linux where possible. Which brings me to my last point…

Windows has the advantage because people know what to expect from it. Linux requires a lot of retooling and re-education to make it work, both on the end-user side and the admin side of things. Granted, that’s to be expected when moving to a new system, but the amount of change mandated is too daunting a task to complete in a short amount of time. It’s also not cheaper, given that you need to pay more than you paid for the system to retrain everyone to use it and maintain it. And given that the penalty for making the switch without paying upfront is a non-working system that thousands of employees need for day-to-day operations, it’s no surprise that admins are hesitant to make the jump.

Disclaimer: I say this as someone who has tried and failed to make that jump, and paid the price for it.

ECA (profile) says:

Re: Re: VIVA LINUX

And if you really want to add to this… the automated ADMIN-TYPE programs are SPIT.
Thinking a program can monitor the activities of the SIGNED-IN PERSON has been shown to be STUPID.
Letting someone stay online beyond a certain TIME FRAME is F’ STUPID… knowing what they are DOING IS THE ADMIN’S JOB.

Someone stealing a TERABYTE of data… IS THE ADMIN’S PROBLEM.

Anonymous Anonymous Coward (profile) says:

Follow the Money

I am wondering what proof we have that the NSA is not still using these exploits. For that matter, what proof do we have that the NSA isn’t responsible for the cryptocurrency mining?

I can imagine there are some black ops for which they don’t want to ask for funding in any kind of outright way. If one wants to keep a secret, tell no one. If one must, tell one other. Funding takes more than one.

Anonymous Coward says:

> But at this point, it’s really no longer the NSA or Microsoft that’s to blame for the continued rampage.

I disagree. Microsoft substantially contributed, over many years, to users’ distrust of patching. They are not alone in this, but they are a major contributor, especially with how they abused their patch infrastructure to push Windows 10 on users who did not want it. Further, their longstanding policy that they patch only recent releases (which, by itself, is not unreasonable), combined with a series of unpopular releases that people would rather not use, makes it simply impossible for some users to get patches to the version of Windows they want to use, even if those users are willing to install patches. Look at how many people still cling to Windows XP, due in large part to problems with the later releases (bad default UI, missing drivers for legacy hardware, …). Those people can never be patched, unless Microsoft reopens XP support (which will never happen) or releases a Windows iteration that convinces them to move forward (which, after how Microsoft handled Windows 8 and Windows 10, also seems very unlikely).

Put all that together and you have many users who would rather run the risk of getting infected than deal with the certainty of unpleasant patches.

Anonymous Coward says:

> Look at how many people still cling to Windows XP, due in large part to problems with the later releases (bad default UI, missing drivers for legacy hardware, …).

That could happily be me. I’m much less satisfied with Windows 7, and I’d be happy to go all the way back to Windows 3.1. At that stage in development the operating system was far less locked down, giving the user more control over customizing how the OS worked. It was great to learn on. Break it. Fix it. Make it better for the lessons learned.

Today Linux is the closest experience to this.

The road out of Windows 3.1 has been an incremental march toward less useful, more locked down, and more dumbed down. Terrible from the consumer perspective. Wonderful from the profit-driven perspective. Windows XP marked a line of demarcation where the user experience was shoe-horned and stuffed with features sufficiently annoying that users were willing to either not patch or move elsewhere.

New iterations continue to be more annoying, and I predict that trend continues. Eventually the pool of users sufficiently annoyed and eagerly looking for a new ship to jump to will see Microsoft using its market position to strangle potential competition in the crib.

The sort of attack we’re seeing on the Linux kernel these last few weeks.

Anonymous Anonymous Coward (profile) says:

Re: Re:

I hear ya. Remember WordPerfect’s reveal codes? I tried the same thing in MS Word and was sadly disappointed. I only bring it up because it’s an explicit example of MS diminishing program capabilities.

My first personal computer was an Apple IIc: a whole 128 KB of RAM and one 350 KB floppy disk drive. I had to switch between the OS, program, and data floppies. Used the hell out of their spreadsheet and text programs.

Then I worked for a company that only had DOS-style machines with Lotus 1-2-3 and WordPerfect (I forget the version). What I learned was how to do the things I had done on the Apple in DOS-based programs, but I also learned how locked down the Apple OS was. It gave me just enough to run the programs. DOS, however, gave a whole lot more, and I tried, and erred, more and more. I learned that Apple wanted control, and DOS less so. Then came Windows, and the control started to disappear. Then came Linux, and the control came back.

What’s next?
