What To Do About Lawless Government Hacking And The Weakening Of Digital Security

from the rethinking-the-problems dept

The EFF has put a lot of thought into how we should deal with the issue of government hacking and how it impacts digital security, and so we're reposting Andrew Crocker's excellent article here.

In our society, the rule of law sets limits on what government can and cannot do, no matter how important its goals. To give a simple example, even when chasing a fleeing murder suspect, the police have a duty not to endanger bystanders. The government should pay the same care to our safety in pursuing threats online, but right now we don't have clear, enforceable rules for government activities like hacking and "digital sabotage." And this is no abstract question—these actions increasingly endanger everyone's security.

The problem became especially clear this year during the San Bernardino case, involving the FBI's demand that Apple rewrite its iOS operating system to defeat security features on a locked iPhone. Ultimately the FBI exploited an existing vulnerability in iOS and accessed the contents of the phone with the help of an "outside party." Then, with no public process or discussion of the tradeoffs involved, the government refused to tell Apple about the flaw. Despite the obvious fact that the security of the computers and networks we all use is both collective and interwoven—other iPhones used by millions of innocent people presumably have the same vulnerability—the government chose to withhold information Apple could have used to improve the security of its phones.

Other examples include intelligence activities like Stuxnet and Bullrun, and law enforcement investigations like the FBI's mass use of malware against Tor users engaged in criminal behavior. These activities are often disproportionate to the legitimate threats they target, resulting in unpatched software for millions of innocent users, overbroad surveillance, and other collateral effects.

That's why we're working on a positive agenda to confront governmental threats to digital security. Put more directly, we're calling on lawyers, advocates, technologists, and the public to demand a public discussion of whether, when, and how governments can be empowered to break into our computers, phones, and other devices; sabotage and subvert basic security protocols; and stockpile and exploit software flaws and vulnerabilities.

Smart people in academia and elsewhere have been thinking and writing about these issues for years. But it's time to take the next step and make clear, public rules that carry the force of law to ensure that the government weighs the tradeoffs and reaches the right decisions.

This long post outlines some of the things that can be done. It frames the issue, then describes some of the key areas where EFF is already pursuing this agenda—in particular formalizing the rules for disclosing vulnerabilities and setting out narrow limits for the use of government malware. Finally it lays out where we think the debate should go from here.

Recognizing That Government Intrusion and Subversion of Digital Security Is a Single Issue

The first step is to understand a wide range of government activities as part of one larger threat to security. We see the U.S. government attempt to justify and compartmentalize its efforts with terms like "lawful hacking" and "computer network attack." It is easy for the government to argue that the FBI's attempts to subvert the security of Apple iOS in the San Bernardino case are entirely unrelated to the NSA's apparent sabotage of the Dual_EC_DRBG algorithm. Likewise, the intelligence community's development of the Stuxnet worm to target the Iranian nuclear program was governed by a set of rules entirely separate from the FBI's use of malware to target criminals using Tor hidden services.

These activities are carried out by different agencies with different missions. But viewing them as separate—or allowing the government to present them that way—misses the forest for the trees. When a government takes a step to create, acquire, stockpile or exploit weaknesses in digital security, it risks making us all less safe by failing to bolster that security.

Each of these techniques should involve consideration of the tradeoffs involved, and none of them should be viewed as risk-free to the public. They require oversight and clear rules for usage, including consideration of the safety of innocent users of affected technologies.

There is hope, albeit indirect. In the United States, high-ranking government officials have acknowledged that "cyber threats" are the highest priority, and that we should be strengthening our digital security rather than weakening it to facilitate government access. In some cases, this is apparently reflected in government policy. For instance, in explaining the government's policy on software vulnerabilities, the cybersecurity coordinator for the White House and the Office of the Director of National Intelligence have both stated in blog posts that there is a "strong presumption" in favor of disclosing these vulnerabilities to the public so they can be fixed.

But the government shouldn't engage in "policy by blog post." Government action that actively sabotages or even collaterally undermines digital security is too important to be left open to executive whim.

Finding Models for Transparency and Limits on When Government Can Harm Digital Security

While government hacking and other activities that have security implications for the rest of us are not new, they are usually secret. We should demand more transparency and real, enforceable rules.

Fortunately, this isn't the first time that new techniques have required balancing public safety along with other values. Traditional surveillance law gives us models to draw from. The Supreme Court's 1967 decision in Berger v. New York is a landmark recognition that electronic wiretapping presents a significant danger to civil liberties. The Court held that because wiretapping is both invasive and surreptitious, the Fourth Amendment required "precise and discriminate" limits on its use.

Congress added considerable structure to the Berger Court's pronouncements with the Wiretap Act, first passed as Title III of the Omnibus Crime Control and Safe Streets Act of 1968. First, Title III places a high bar for applications to engage in wiretapping, so that it is more of an exception than a rule, to be used only in serious cases. Second, it imposes strict limits on using the fruits of surveillance, and third, it requires that the public be informed on a yearly basis about the number and type of government wiretaps.

Other statutes concerned with classified information also find ways of informing the public while maintaining basic secrecy. For example, the USA Freedom Act, passed in 2015 to reform the intelligence community, requires that significant decisions of the FISA Court either be published in redacted form or be summarized in enough detail to be understood by the public.

These principles provide a roadmap that can be used to prevent government from unnecessarily undermining our digital security. Here are a few areas where EFF is working to craft these new rules:

Item 1: Rules for When Government Stockpiles Vulnerabilities

It's no secret that governments look for vulnerabilities in computers and software that they can exploit for a range of intelligence and surveillance purposes. The Stuxnet worm, which was notable for causing physical or "kinetic" damage to its targets, relied on several previously unknown vulnerabilities, or "zero days," in Windows. Similarly, the FBI relied on a third party's knowledge of a vulnerability in iOS to access the contents of the iPhone in the San Bernardino case.

News reports suggest that many governments—including the U.S.—collect these vulnerabilities for future use. The problem is that if a vulnerability has been discovered, it is likely that other actors will also find out about it, meaning the same vulnerability may be exploited by malicious third parties, ranging from nation-state adversaries to simple thieves. This is only exacerbated by the practice of selling vulnerabilities to multiple buyers, sometimes even multiple agencies within a single government.

Thanks to a FOIA suit by EFF, we have seen the U.S. government's internal policy on how to decide whether to retain or disclose a zero day, the Vulnerabilities Equities Process (VEP). Unfortunately, the VEP is not a model of clarity, setting out a bureaucratic process without any substantive guidelines in favor of disclosure. More concerning, we've seen no evidence of how the VEP actually functions. As a result, we have no confidence that the government discloses vulnerabilities as often as claimed. The lack of transparency fuels an ongoing divide between technologists and the government.

A report published in June by two ex-government officials—relying heavily on the document from EFF's lawsuit—offers a number of helpful recommendations for improving the government's credibility and fueling transparency.

These proposals serve as an excellent starting point for legislation that would create a Vulnerabilities Equities Process with the force of law, formalizing and enforcing a presumption in favor of disclosure. VEP legislation should also:

  • Mandate periodic reconsideration of any decision to retain a vulnerability;
  • Require the government to publish the criteria used to decide whether to disclose;
  • Require regular reports to summarize the process and give aggregate numbers of vulnerabilities retained and disclosed in a given period;
  • Preclude contractual agreements that sidestep the VEP, as in the San Bernardino case, where the FBI apparently signed a form of non-disclosure agreement with the "outside party." The government should not be allowed to enter such agreements, because when the government buys a zero day, we should not have to worry about defending ourselves from a hostile state exploiting the same vulnerability. If tax dollars are going to be used to buy and exploit vulnerabilities, the government should also eventually use them to patch the security of affected systems, with benefits to all.

Above all, formalizing the VEP will go a long way to reassuring the public, especially members of the technology industry, that the U.S. government takes its commitment to strengthening digital security seriously.

Item 2: Preventing Disproportionate Use of Government Malware and Global Hacking Warrants

EFF has also long been concerned about state-sponsored malware. It's at the heart of our suit against the government of Ethiopia. Even in the United States, when the government seeks court permission to use malware to track and surveil suspects over the Internet, it can endanger innocent users as well as general network security.

A particularly egregious example is the Playpen case, involving an FBI investigation into a Tor hidden service that hosted large amounts of child pornography. The FBI seized the site's server and operated it as a honeypot for visitors. A single warrant authorized the FBI to install malware on any and all visitors' computers in order to breach the anonymity otherwise provided by Tor. By not specifying particular users—even though the list of users and logs of their activity were available to the FBI—the warrant totally failed to satisfy the Fourth Amendment requirement that warrants particularly describe the persons and places to be searched.

What's more, the FBI asked the court to trust that it would operate its malware safely, without accidentally infecting innocent users or causing other collateral damage. Once defendants began to be charged in these cases, the government staunchly refused to turn over certain information about how the malware operated to the defense, even under seal, arguing that it would compromise other operations. As a result, defendants are left unable to exercise their right to challenge the evidence against them. And of course, anyone else whose computer is vulnerable to the same exploit remains at risk.

In these cases, the FBI flouted existing rules: the Playpen warrant violated both the Fourth Amendment and Rule 41 of the Federal Rules of Criminal Procedure. Other cases have involved similarly overbroad uses of malware. EFF has been working to explain the danger of this activity to courts, asking them to apply Fourth Amendment precedent and require that the FBI confront serious threats like Playpen in a constitutional manner. We have also been leaders of a coalition to stop an impending change that would loosen the standards for warrants under Rule 41 and make it easier for the FBI to remotely hack users all over the world.

Item 3: A "Title III for Hacking"

Given the dangers posed by government malware, the public would likely be better served by the enactment of affirmative rules, something like a "Title III for Hacking." The legislative process should involve significant engagement with technical experts, soliciting a range of opinions about whether the government can ever use malware safely and if so, how. Drawing from Title III, the law should:

  • Require that the government not use invasive malware when more traditional methods would suffice or when the threats being addressed are relatively insignificant;
  • Establish strict minimization requirements, so that the targets of hacking are identified with as much specificity as the government can possibly provide;
  • Include public reporting requirements so that the public has a sense of the scope of hacking operations; and
  • Mandate consideration of the possible collateral effects—on individuals and the public interest as a whole—in the decision to unleash malware that takes advantage of known or unknown vulnerabilities. Even if the VEP itself does not encompass publicly known vulnerabilities ("N-days"), using remote exploits should impose an additional requirement on the government to mitigate collateral damage, through disclosure and/or notice to affected individuals.

The same principles should apply to domestic law enforcement activities and foreign intelligence activities overseen by the FISA Court or conducted under the guidelines of Executive Order 12333.

Of course, these sorts of changes will not happen overnight. But digital security is an issue that affects everyone, and it's time that we amplify the public's voice on these issues. We've created a single page that tracks our work as we fight in court and pursue broader public conversation and debate in the hopes of changing government practices of sabotaging digital security. We hope you join us.


Reader Comments



  • That One Guy (profile), 3 Aug 2016 @ 9:25am

    Nice step two

    While I agree that getting the government and lawmakers to create some clear guidelines and limits on what can and cannot be done is a worthy goal (even better if they actually follow them), at this point that should be treated as the second step, a more long-term goal.

    The first step should be a continuation and expansion of what's already being done, making it so that they have no choice but to respect public privacy and security because they cannot break it without significant expenditure of resources.

    When you've got numerous government agencies that care more about their ability to gather intel or make their jobs easier than the safety and security of the public some new 'guidelines' aren't going to do much about changing their actions. Make it so they have no choice, that they have to respect privacy and security of someone besides themselves and then they might be willing to listen. Before then you might as well be talking to a wall.


    • Hephaestus (profile), 3 Aug 2016 @ 10:17am

      Re: Nice step two

      When you've got numerous government agencies that care more about their ability to gather intel or make their jobs easier than the safety and security of the public

      Most of the time it is defense contractors and security firms selling the government a bill of goods, in an attempt to sell services or products (software) to them. The intel communities around the world believe that more information is better, but that is not the case. The more information you have, the less likely it is that you will find anything of value, due to false positives. In the end, government electronic snooping makes us less safe, not more.


  • Anonymous Coward, 3 Aug 2016 @ 10:15am

    The rules are already in place; i.e., the Constitution of the United States of America.


  • Anonymous Coward, 3 Aug 2016 @ 11:58am

    Interesting factoids

    The arrest and conviction rates from wiretaps have a distinct pattern:

    At best, about 45% of wiretap arrests lead to a conviction.

    Only about 1 in 6 arrests lead to a conviction that year. (perhaps plea bargains?)

    Not quite 1 in 2 arrests a year later lead to a conviction.

    The chart lists "arrests that year" and "convictions that year" based on the year of the wiretap. But the conviction is not always in the year of the arrest, leading to entries like "233 arrests, 496 convictions". Seems dishonest, but what ya gonna do?

    Also note that a not-insignificant number of arrests are based on wiretaps as much as 9 years previous.


    • John Fenderson (profile), 3 Aug 2016 @ 12:39pm

      Re: Interesting factoids

      "Only about 1 in 6 arrests lead to a conviction that year. (perhaps plea bargains?)"

      Plea bargains are agreements to plead guilty to a lesser offense. The person is still convicted, though, just for something else instead.


  • Anonymous Coward, 3 Aug 2016 @ 12:30pm

    Missing a "sponsored" article tag..


  • MakeItLoudAirIt, 3 Aug 2016 @ 12:32pm

    EFF I recommend...

    "we're calling on lawyers, advocates, technologists, and the public to demand a public discussion of whether, when, and how governments can be empowered to break into our computers, phones, and other devices; sabotage and subvert basic security protocols; and stockpile and exploit software flaws and vulnerabilities."


    I would recommend spending some of the EFF's hard earned money on commercials for radio and television, a blitz of never ending commercials.

    Commercials akin to the Ad Council commercials asking people to get up and get out and get healthy, but focusing on people who have had their data stolen and how it has cost them in time, money, and stress. Push the point that people need to learn, with each consecutive commercial over the months educating them on how to secure their data, what browser plugins they should use, how to manage passwords, etc.

    Bottom up strategy, if people at home know the issues, then it makes its way into business and then into government until eventually there's a very public discussion with all the players/victims involved.

    Educate, but don't rely on the internet for education, push for air time!


    • John Fenderson (profile), 3 Aug 2016 @ 12:49pm

      Re: EFF I recommend...

      That's a hard call. I think that what you're saying is true -- public awareness and education is critical.

      However, TV campaigns cost a ton of cash, and the need to raise that cash can all too easily turn an outfit into little more than a fundraising operation.

      I think the EFF does fantastic and invaluable work right now, and would hate to see that work adversely impacted by a new venture. I think that agencies (and people, and software, and so on) tend to be at their best when they "do one thing really well".

      Perhaps what we need is a separate outfit to do that sort of work in conjunction with the EFF and such.


  • Padpaw (profile), 3 Aug 2016 @ 12:41pm

    the rule of law has become an illusion in America in my opinion.

    The state does what it wants and ignores any laws that protect its victims from being treated badly.


    • John Fenderson (profile), 3 Aug 2016 @ 12:58pm

      Re:

      Personally, I think that's overstating it by quite a lot. The US is not there quite yet. However, I agree that the trend has been toward increasing tyranny.

      The reason I stay optimistic is that this is not new territory for the US. We've been here a couple times before in history, and have managed not just to survive, but to come out of it a little better. It can be messy getting there, but we've done it before and we can do it again.

      I guess I agree with Theodore Parker when he said that in the long view, the arc of history bends towards justice.

      It's just that things can suck an awful lot in the short view.


      • Padpaw (profile), 3 Aug 2016 @ 1:06pm

        Re: Re:

        I am an advocate for revolution, so hopefully your views will trump mine in the long run. As a cynic I doubt it though.


      • Rekrul, 4 Aug 2016 @ 4:31pm

        Re: Re:

        Personally, I think that's overstating it by quite a lot. The US is not there quite yet. However, I agree that the trend has been toward increasing tyranny.

        The government regularly violates the Constitution by spying on everyone.

        The TSA regularly abuses people at the "border".

        Police regularly take property from people and any attempts to add some limits to this process are shot down.

        All police are elevated to a protected class where they can beat a person until they're hospitalized and it's considered justifiable force, while if that person tries to pull away from said beating, they get charged with felony assault of a police officer.

        Even when a cop is caught on video doing something wrong, the DA, police unions and courts will do everything in their power to see that they don't face any consequences for it.

        When a person is accused of a crime, multiple charges are piled on in an attempt to scare the person into a plea bargain, even if they're innocent.

        If the DA has evidence that would exonerate the person, it often gets withheld or intentionally "lost".

        If the person goes to trial, the judge will outright lie to the jury and tell them that they MUST render a verdict according to the law even if they feel that justice would be better served with a not-guilty verdict.

        Said person's funds will also often be frozen to prevent them from mounting a competent defense and even if they aren't, the cost of defending themselves in a trial will probably bankrupt most people, after which they have to find a lawyer willing to work on commission so that they can sue the state to get their money back. Some states allow limited reimbursement of legal expenses, but that says nothing about any other bills they may have incurred while they were trying to prove their innocence, like mortgage payments, rent, utility bills, medical bills, etc.


        I'd say that we're pretty darn close...


