US Government Making Another Attempt To Regulate Code Like It Regulates International Weapons Sales

from the a-zero-day-may-now-last-20-years dept

When code is treated like weapons, bad things happen. Governing bodies have previously treated encryption as weaponry, ensuring that only the powerful will have access to strong encryption while the general public must make do with weaker or compromised variants.

More recently, the US government went after the creator of a 3D-printed gun, claiming the very existence of printing instructions violated international arms regulations. So, it’s not just the end result that’s (potentially) covered under this ban (the actual weapon) but the data and coding itself. That’s currently being fought in court, carrying with it some potentially disturbing implications for several Constitutional rights.

Now, it appears the conflation of physical weapons and weaponized code may make things much, much worse. The EFF notes that the US government’s adoption of recommended changes to an international arms trafficking agreement (the Wassenaar Arrangement) will likely cause very serious problems for security researchers and analysts in the future.

The BIS’s version of the Wassenaar Arrangement’s 2013 amendments contains none of the recommended security research exceptions and vastly expands the amount of technology subject to government control.

Specifically, the BIS proposal would add to the list of controlled technology:

Systems, equipment, components and software specially designed for the generation, operation or delivery of, or communication with, intrusion software include network penetration testing products that use intrusion software to identify vulnerabilities of computers and network-capable devices.


Technology for the development of intrusion software includes proprietary research on the vulnerabilities and exploitation of computers and network-capable devices.

On its face, it appears that BIS has just proposed prohibiting the sharing of vulnerability research without a license.

As if things weren’t already dangerous enough for security researchers, what with companies responding with threats and lawyers — rather than apologies and appreciation — when informed of security holes and the US government always resting its finger on the CFAA trigger. Violating the terms of this agreement could see researchers facing fines of up to $1 million and/or 20 years in prison.

Wassenaar was originally limited to physical items used in conventional weapons, like guns, landmines and missiles. It was amended in December 2013 to include surveillance tech, mainly in response to stories leaking out about Western companies like Gamma (FinFisher) and Hacking Team selling exploits and malware to oppressive governments, which then used these tools to track down dissidents and journalists.

The push to regulate the distribution of these tools had its heart in the right place, but the unintended consequences will keep good people from doing good things, while doing very little to prevent bad people from acquiring and deploying weaponized software.

The Wassenaar Arrangement’s attempt to wrestle a mostly ethereal problem into regulatable form was, for the most part, handled well. It defined the software it intended to control very narrowly and provided some essential exceptions:

Notably, the controls are not intended to apply to software or technology that is generally available to the public, in the public domain, or part of basic scientific research.

But, even so, it still contained the potential to do more harm than good.

We have significant problems with even the narrow Wassenaar language; the definition risks sweeping up many of the common and perfectly legitimate tools used in security research.

Either interpretation (Wassenaar, BIS) is a problem. The BIS version is much worse, but both will result in a less-secure computing world, despite being implemented with an eye on doing the opposite, as Robert Graham at Errata Security points out.

[G]ood and evil products are often indistinguishable from each other. The best way to secure your stuff is for you to attack yourself.

That means things like bug bounties that encourage people to find 0-days in your software, so that you can fix them before hackers (or the NSA) exploit them. That means scanning tools that hunt for any exploitable conditions in your computers, to find those bugs before hackers do. Likewise, companies use surveillance tools on their own networks (like intrusion prevention systems) to monitor activity and find hackers.

Thus, while Wassenaar targets evil products, they inadvertently catch the bulk of defensive products in their rules as well.

And the results will disproportionately harm those who need these protections the most. This is the end result of controls written with physical items in mind (items that originate from physical manufacturing plants and travel by physical means of conveyance) but copy-pasted to handle “items” that can traverse the internet with no known originating point.

That’s not to say export controls would have no leverage. For example, these products usually require an abnormally high degree of training and technical support that can be tracked. However, the little good export controls provide is probably outweighed by the harm, such as preventing dissidents in the affected countries from being able to defend themselves. We know they do little good because we watch Bashar Al Assad brandish the latest iPhone his wife picked up in Paris. Such restrictions may stop the little people in his country from getting things, but they won’t stop him.

The “open-source” exception in Wassenaar can be useful, up to a point. Researchers could post their findings to Github, as Graham points out, to ensure they’re still protected. This, of course, means the Arrangement is still mostly useless, as the moment it’s put into the public domain, any entity cut out of the distribution loop by this agreement can immediately make use of posted vulnerabilities and exploits. It also makes research destined to be open-sourced forbidden weaponry until the point it’s actually made public. So, a laptop full of research is a prohibited weapon, while a Github post containing the same is not.

When security researchers discover a 0-day, they typically write a proof-of-concept exploit, then present their findings at the next conference. That means they have unpublished code on their laptop, code that they may make public later, but which is not yet technically open-source. If they travel outside the country, they have technically violated both the letter and the spirit of the export restrictions, and can go to jail for 20 years and be forced to pay a $1 million fine.

Pro tip:

Thus, make sure you always commit your latest changes to GitHub before getting on a plane.
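As a concrete (and only half-joking) illustration of that pro tip, here's a minimal sketch of the pre-flight routine: get the proof-of-concept research committed, so the push to a public host is the only step left. The repository path, commit message, and placeholder identity are all hypothetical; only standard git commands are used.

```shell
# Hypothetical pre-flight routine: make sure PoC research is committed
# (and, in practice, pushed to a public host) before crossing a border.
set -e

repo=$(mktemp -d)            # stand-in for your actual research repo
cd "$repo"
git init -q
git config user.email "researcher@example.com"   # placeholder identity
git config user.name  "Security Researcher"

echo "proof-of-concept exploit notes" > poc.md
git add -A
git commit -qm "Publish PoC before travel"

# The step that actually matters under an open-source exception --
# pushing to a public remote -- is shown but not run here:
#   git push origin main

git log --oneline            # confirm the work is committed
```

The absurdity the article describes falls out of the last two lines: until that push happens, the laptop allegedly carries controlled "technology"; once it runs, the same bytes are public-domain and exempt.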

Statements made by the BIS aren’t exactly comforting. The BIS’s implementation doesn’t include an open-source exception, but supposedly, this will still be taken into consideration when the US government starts throwing around fines and prison sentences. Randy Wheeler of the BIS:

“We generally agree that vulnerability research is not controlled, nor is the technology related to choosing a target or finding a target, controlled.” However, she undermined her message by stating that any software that is used to help develop 0-day exploits for sale would be covered by the proposal.

Again, bad for researchers. This gives the government leeway to imply intent when prosecuting, because the allowed and the forbidden look very similar while still in their formative stages.

[T]he only difference between an academic proof of concept and a 0-day for sale is the existence of a price tag.

Even if the exploit is not on the market at the point the government steps in, it would take very little to insinuate that it would have been headed to market, if not for the speedy intervention of regulators.

There is some good news, however. The BIS is accepting comments on its proposed adoption (and partial rewrite) of the amendments to the Wassenaar Arrangement. The comment period ends on July 20, 2015, so sooner rather than later would be good if you’re interested in steering the government away from doing further damage to the livelihoods of security researchers.



Comments on “US Government Making Another Attempt To Regulate Code Like It Regulates International Weapons Sales”

Uriel-238 (profile) says:

Re: Re: To some degree you are correct.

Malicious programs are as dangerous as the range of damage done by their payloads. Explosives in an open field make for a spectacle and nothing more. Put them in the middle of a hospital, however…

But you can plant a bomb pretty much anywhere and it will function. Malware has to be compatible with the system it affects, and has to get injected somehow, which is harder to do.

And malware usually just destroys data (which can cause damage on its own, if it’s the database of a bank or a hospital), but in order to effect real damage, it has to rely on the resources available, and on the programmer knowing how to sabotage a facility from within (so to sabotage a nuclear power plant, you’d need a programmer and a nuclear engineer familiar with the target plant).

In that sense, making targeted malware is a lot harder than mounting a targeted sabotage raid.

And then, in the aftermath it’s highly likely the enemy will obtain your code (much like Iran now having Stuxnet) and they will use it against you and to inform their response.

DannyB (profile) says:

Fun memories from the early 1990's.

You could not export encryption.

But what about a textbook on encryption? What about a really good textbook? A book that includes source code examples and listings?

Last time, the US wasn’t willing to stop people from exiting the country with a textbook in their hand. Maybe this time that will change.

DannyB (profile) says:

Security Researchers

If security researchers in the US are in danger because of the code they work with, then using my special psychic powers, I can foresee what is going to happen.


The fog is clearing, I am getting a vision . . .

Security research will be done outside the US. Other countries will have all the good hacking tools. The US will also not have done any real research into how to defend against attacks, because knowing how to defend against attacks requires understanding how the attack will work. Otherwise you end up with software like . . .

MickeySoft Maginot Line Defender ! Professional Edition

The US has no penetration tools and weak defenses . . . and . . . something major is happening now . . .

. . . oh darn, the vision is going dark . . .

Anonymous Coward says:

Re: Re: Security Researchers

Why bother having a law that says the NSA can do it? It is pretty clear the NSA does what they want, then finds a way to misinterpret the law to say it was probably legitimate, and was in good faith even if it was illegal. Not having a secret law saves everyone the time and trouble of pretending to comply with it.

DannyB (profile) says:

Re: Re: Re:2 Security Researchers

So that evidence can be used in court

Hey, man, get with the times. We now have secret courts. It fits the pattern that follows from the spying.

Massive spying on citizens
Secret laws
Secret interpretations of laws
Secret courts
Secret court orders
Secret arrests (in the middle of the night)
Secret evidence (that the defense cannot access)
Secret trials
Secret convictions
Secret incarceration
Widespread police brutality condoned, maybe even encouraged
Government torture programs

Sound like what we were fighting in the previous century?

Anonymous Coward says:

However, she undermined her message by stating that any software that is used to help develop 0-day exploits for sale would be covered by the proposal.

I find the automatic code completion on Microsoft Visual Studio (which is itself proprietary, although you can use it to write open-source code) to be very helpful in developing all kinds of programs, including 0-day exploits. Does that now mean that Microsoft Visual Studio is a controlled program? 🙂

Anonymous Coward says:


Politicians are determined to regulate technology that they have no chance of understanding. They sound like raving fundamentalists, standing at their own altar, claiming only they know the right path when they can’t even find the exit from their church to see the real world. GOD save us from the narcissistic idiots.

Anonymous Coward says:

>Politicians are determined to regulate technology that they have no chance of understanding. They sound like raving fundamentalists, …

I appreciate the attempt at an expression of sanity. But at the end you let slip a level of religious bigotry that might offend some people. You might want to be more careful when you aren’t trolling.

But this isn’t about regulating technology. This is an attempt to regulate _thought_. It’s like, say, taxing paper–and for the same reason: to avoid people being infected with the alien and seditious ideas written thereon. And it’s likely to have the same direct effect.

Tea Party. Bring your own hatchet and USB. Paper is optional.

Anonymous Coward says:

Re: Re:

I think the comparison is apt.

For example:

1) “This is an attempt to regulate thought
—- check

2) “to avoid people being infected with the alien and seditious ideas “
—- check

3) “Tea Party”
—- check

btw, raving fundamentalists of all flavors are a very real threat, much more so than some code.

Lawrence D’Oliveiro says:

Why Code Is Not Like A Weapon

Encryption code lets you shop online and pay for things safely. It preserves your privacy. It lets you be sure the person you are communicating with is who they say they are.

In short, code lets you do constructive things.

A weapon does none of these things. It has only destructive uses. It is only useful for causing damage to people or property, nothing more.

This is why code is not like a weapon, and should not be regulated like one.

DannyB (profile) says:

Re: Why Code Is Not Like A Weapon

Don’t you think 3D printed guns are like a weapon?

The new 3D printing must be heavily regulated. There are terrible threats to society that you may not have considered.

Here is only one example of the danger. An unregulated 3D printer could be used, in secret, to circumvent Arizona’s legal limit of two dildos per household. Regulating 3D printers will help keep us all safe from such dangers.

Lawrence D’Oliveiro says:

Re: Re: Why Code Is Not Like A Weapon

Does a 3D-printed gun have any constructive use? No. It may not be very good as a weapon, but it is still a weapon, and should be regulated as such.

As John “Danger Man” Drake said: “I don’t like guns. They’re noisy, and they hurt people.”

Uriel-238 (profile) says:

Re: Re: Re: The constructive use of printed Guns

Part of the advantage of the great 3D shape library (which includes all the parts for numerous common weapons) is not fully recognized here in the US. Rather, it’s well understood in Africa, where small and brutal asymmetrical conflicts take place and one faction is often grossly oppressing the other.

Printed gun parts can serve as prototypes for making the components for real: You use the prototype to craft the pouring mold and from that make the parts out of metal. Stamp and machine as you require.

Factions that couldn’t previously afford to arm their people, back when guns were controlled by dealers, will now be able to. And factions that before had a massive advantage will quickly see cause not to capture, rape, torture, enslave and massacre their opponents at their pleasure, since their upper hand won’t be so certain and eventual comeuppance is assured.

So the international community has very good cause to expand, maintain and sustain an open-source library of printable guns. And they will stop only when we develop something better than firearms with which to fight wars.

So unless your own western government who couldn’t care less about African people and their petty wars (because Africa and dark skin and no oil) plans to build a great censorship firewall around your nation to keep out those gun shapes, they won’t be able to stop people from printing gun parts and occasionally making printed guns.

Also, since the US is full of gun enthusiasts, some of whom like to customize and accessorize their weapons, and a 3D printer is an amazing tool for such a hobby, criminalizing the printing of gun parts will just make criminals out of those hobbyists. Because they’re not going to stop when the state tells them they don’t get to use 3D printers for their hobby.

PS: Replace Africa with South America if you like and it’s all still true.

Lawrence D’Oliveiro says:

Re: Re: Re:2 Why Code Is Not Like A Weapon

Wow, how much more fanciful do you want to get? Is that really how you folks in the US see things—that the solution to the world’s problems is “more guns”? You don’t think the world’s trouble spots are already so awash with cheap, plentiful AK-47s, RPGs and the like, you think they really need this wonderful new “3D printing” technology to make the crucial difference?

Even the world’s most unpopular governments have no trouble shooting back at people who shoot at them. Just call the attackers “terrorists”, and public opinion will raise nary a whisper.

But if government troops shoot at a peaceful protest–now that can make all the difference…

Uriel-238 (profile) says:

Re: Re: Re:3 "Fanciful"

My statement was descriptive, not prescriptive. I was talking of circumstances that are already changing, not how they should be changed.

But you seem to have a strong notion of how to fix some terrible situations in the world’s badlands, so do tell. I’m sure that those suffering from the hard realities of oppression would love to better understand how your preference for pacifistic principle should override their desires for life and liberty.

Granted the current US regime is a dick and a bully regarding how it chooses where to apply military intervention. But the theaters about which I speak are ones in which the United States officially couldn’t care less.

Rekrul says:

“We generally agree that vulnerability research is not controlled, nor is the technology related to choosing a target or finding a target, controlled.” However, she undermined her message by stating that any software that is used to help develop 0-day exploits for sale would be covered by the proposal.

So that means that compilers and hex editors are now controlled software…

Rekrul says:

Re: Re:

I think you are reading the changes in a way that allows you to push an anti regulation argument and you don’t offer an alternative.

History has shown that if a law can be abused, it will be abused.

Look at how the CFAA (Computer Fraud and Abuse Act) was used to try and bully Aaron Swartz even though he hadn’t actually committed a real crime. Look at how the laws against child pornography are being used to punish the same children that they’re supposed to protect. Look at how activist groups are now being labeled as “domestic terrorists”.

They don’t use vague language in laws because they’re too shortsighted to see how it could be abused, they make the language intentionally vague so that they can have the freedom to abuse it however they want.

Anonymous Coward says:

Like the researchers state, the difference between an exploit and valid software is very hazy. This would also outlaw jailbreaking software, as most jailbreaks use unpublished 0-days so that the hardware vendor can’t immediately circumvent them. All in all, this is just a stupid agreement that will most likely be ignored, and if petty violations are prosecuted, it will probably do more damage to the global community than it protects.

Uriel-238 (profile) says:

The Nth country experiment

demonstrated to us that the enemy will acquire the knowledge they need to make the weapons they want. Thank goodness nuclear weapons are kinda tricky.

Terrorists, revolutionaries, saboteurs and industrial spies will get their gun-parts designs and code anyway, and quickly. The better thing to do is develop security protocols that defend against all known attacks.

But the FBI loves its backdoors.

And big corporations love their pristine publicity.

If they criminalize the security engineers, the security engineers will go criminal. I’m sure others will pay them well and treat them better to ply their craft.
