Administration Says Child Porn Provides A 'Model' For Hunting Terrorists Online

from the it-could-not-be-more-wrong dept

The administration is trying to draft tech companies into the War on Terror. Encryption -- despite President Obama's unofficial "hands-off" stance -- is still being debated, with FBI Director James Comey and a few law enforcement officials leading the charge up the hill they apparently want to die on.

One of the topics under discussion was how to deter terrorists from communicating online. Trying to deputize tech companies is a non-starter, considering the potential for collateral damage. But that's not stopping the administration from trying to do exactly that, and it's willing to deploy the most terrible participant in its parade of horrors.

“I do have a lot of confidence that those companies that are run by patriotic Americans are not interested in seeing their tools or their technology used by terrorists to harm innocent Americans,” [White House press secretary Josh] Earnest told reporters in Washington before the meeting occurred. “That’s certainly not what they were designed for.”

[...]

“There is a precedent for us to confront this kind of problem,” Earnest said. “We know that there are some people who try to make money based on the selling and trafficking of child pornography and they are using websites to do that. And we have been able to work effectively with the tech community to counter those efforts.”
Earnest implies two things here, both of them disingenuous. The first is that any reluctance expressed by tech companies should be viewed as evidence these companies just don't love America enough. The second is that an unwillingness to intervene on the US government's behalf is hypocritical, considering the voluntary efforts these companies already undertake to identify and remove child pornography.

The problem is that alleged terrorist content and child pornography aren't comparable -- at least not to the extent the administration portrays them. For one, child pornography -- for the most part -- is difficult to mistake for protected speech. One of the few exceptions to the First Amendment deals specifically with this content. In addition, most identified files have known hashes, which can be matched automatically when the same files surface elsewhere on the web.
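
To see why matching previously-identified files is the technically easy part, here is a minimal sketch of hash-based matching (the blocklist contents and function names are hypothetical, not anyone's real system). Production systems -- Microsoft's PhotoDNA, for example -- use perceptual hashes that survive resizing and re-encoding, where the cryptographic hash below only catches byte-identical copies:

```python
import hashlib

# Hypothetical blocklist of hashes for previously-identified files.
# In practice such a list would come from a clearinghouse like NCMEC;
# the single entry below is just the SHA-256 of an empty file,
# included purely as a placeholder.
KNOWN_BAD_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def file_sha256(path: str) -> str:
    """Stream a file through SHA-256 and return its hex digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_file(path: str) -> bool:
    """True if the file's hash matches a previously-identified file."""
    return file_sha256(path) in KNOWN_BAD_HASHES
```

That exact-match property is what Zittrain's "no false positives" observation below rests on. The hard part was never the lookup -- it's deciding what goes on the list.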

The other problem is that even if files associated with terrorism or potential acts of terrorism are uniquely identifiable via hashes, that doesn't immediately elevate possession of these files to a criminal act. Jonathan Zittrain at Just Security points out that this makes all the difference in the world.
To be sure, child pornography filtering — and reporting — may be a place to draw a clear line. Not only is child pornography near-universally reviled and banned, but the matching algorithm for previously-identified images boasts no false positives, and, perhaps most important, possession of the file is not only clear evidence of the crime, but quite typically the crime itself. A terrorist to-do list is primarily only evidence, not itself a crime.
The hypothetical situation proposed in Zittrain's post -- that Google et al. begin to treat terrorist content like child pornography -- is a potential ground zero for all sorts of collateral damage. Google already scans email for child porn (as well as for potential advertising keywords), but adding terrorism to this short list would put dissidents, journalists, researchers and activists in the government's crosshairs. Possession of terrorist-related materials isn't a criminal act, but that wouldn't prevent unwarranted (in the original sense, not the Fourth Amendment sense [although that wouldn't be far behind…]) surveillance of citizens who aren't terrorists or terrorist sympathizers.

And once Google has expanded its dragnet to include terrorist material, it wouldn't take long for other aggrieved parties to jump on the proxy surveillance bandwagon. After all, a criminal offense is a criminal offense -- whether it's the circulation of terrorist-related content or anything else certain entities feel needs more policing on the Wild West Web.
If a search for contraband documents expands beyond the comparatively well-bounded area of child pornography, there could be little stopping it from progressing incrementally to an Orwellian level of invasiveness. For example, to prevent claimed copyright infringement, we could see services compelled to scan private communications for musical tracks or videos, or links to that content. Facebook has at times done just that for its private messaging service. Whatever one’s views on copyright, the upside of applying the search technique there is surely lower than that of catching murderers, though the logic underlying the search may ultimately prove powerful enough to make it common.
Add to that the fact that the government -- once it has persuaded Google, etc. to look for certain content -- will continue to add to the list of things tech companies should look for. Child porn is Patient Zero. Terrorism seems to be the next step. After that, mission creep is inevitable.

Zittrain does provide reasons why Google should scan for terrorist material, and they sound exactly like the reasons the government would cite when pressuring tech companies into further pro bono web policing: the additional searches would be minimally intrusive and could conceivably save lives. But these "positives" leave the underlying problem unaddressed. Child porn possession is a crime. Possession of a circulated plan for a terrorist attack is not. It may be suspicious, but it is not, in and of itself, a criminal act.

Finally, even with a tailored search for files with unique hash values, the search itself is still a general one. It would be an automated dragnet encompassing not only the users of whatever service deployed it, but also anyone sending email/messages/etc. to those users. That's a "general search" -- the kind the Fourth Amendment is supposed to prevent -- no matter how the government spins it. And it will spin it, if given the chance. That's why the pressure is being applied toward voluntary action rather than legislated "fixes." If the government can talk Google and its competitors into performing its general searches for it, it can avoid the constitutional issues that would certainly arise if it performed those searches itself.
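
To make the "general search" point concrete, here is a hypothetical inbound-mail hook -- a sketch under the same assumptions as the earlier one, with names and structure that are illustrative, not any provider's actual pipeline. The thing to notice is that the scanner runs on every message the provider handles, so the attachments of non-users get hashed right alongside everyone else's:

```python
import hashlib
from email import message_from_bytes

# Reuses the hash-blocklist idea from the sketch above; empty here.
KNOWN_BAD_HASHES: set[str] = set()

def scan_inbound(raw_message: bytes) -> list[str]:
    """Return filenames of attachments whose hashes match the blocklist.

    This runs on every message the provider handles -- the sender's
    consent or account status never enters into it.
    """
    msg = message_from_bytes(raw_message)
    flagged = []
    for part in msg.walk():
        if part.get_content_disposition() != "attachment":
            continue
        payload = part.get_payload(decode=True) or b""
        if hashlib.sha256(payload).hexdigest() in KNOWN_BAD_HASHES:
            flagged.append(part.get_filename() or "unnamed attachment")
    return flagged
```

Nothing about the sender's consent or account status enters into that loop -- which is exactly what makes it a general search rather than a targeted one.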

Filed Under: child porn, fbi, filtering, james comey, monitoring, terrorism, white house


Reader Comments

The First Word

Anonymous Coward, 25 Jan 2016 @ 10:16am

    Re: Re: Re: Re: Re:

    The difference is that possession of CP is in itself illegal. Possession of ideas is not. Terrorist propaganda consists of ideas, and is therefore not illegal.
    A better comparison would be: if someone who did not build a bomb was found in possession of one, or passed it along, should they be prosecuted?
    Let's compare apples to apples here.
