Court Tells Child Sexual Abuse Investigators That The Private Search Warrant Exception Only Works When There's A Private Search

from the thus-answering-the-question-'when-is-a-search-not-a-search' dept

Private searches that uncover contraband can be handed off to law enforcement without the Fourth Amendment getting too involved. Restrictions apply, of course. For instance, a tech repairing a computer may come across illicit images and give that information to law enforcement, which can use what was observed in the search as the basis for a search warrant.

What law enforcement can’t do is ask private individuals to perform searches for it and then use the results of those searches to perform warrantless searches of its own. A Ninth Circuit Appeals Court case [PDF] points out another thing law enforcement can’t do: assume (or pretend) a private search has already taken place in order to excuse its own Fourth Amendment violation. (h/t Riana Pfefferkorn)

Automated scanning of email attachments led to a series of events that culminated in an unlawful search. Here’s the court’s description of this case’s origination:

The events giving rise to Luke Wilson’s conviction and this appeal were triggered when Google, as required by federal law, reported to the National Center for Missing and Exploited Children (NCMEC) that Wilson had uploaded four images of apparent child pornography to his email account as email attachments. No one at Google had opened or viewed Wilson’s email attachments; its report was based on an automated assessment that the images Wilson uploaded were the same as images other Google employees had earlier viewed and classified as child pornography. Someone at NCMEC then, also without opening or viewing them, sent Wilson’s email attachments to the San Diego Internet Crimes Against Children Task Force (ICAC), where an officer ultimately viewed the email attachments without a warrant. The officer then applied for warrants to search both Wilson’s email account and Wilson’s home, describing the attachments in detail in the application.

You can see where things went wrong: the warrantless search engaged in by the officer to view images neither of the other parties had actually opened or inspected. Apparently, Fourth Amendment violations are standard practice at the San Diego ICAC.

NCMEC then forwarded the CyberTip to the San Diego Internet Crimes Against Children Task Force (“ICAC”). Agent Thompson, a member of the San Diego ICAC, received the report. He followed San Diego ICAC procedure, which at the time called for inspecting the images without a warrant whether or not a Google employee had reviewed them.

A footnote attached to this paragraph states the new “standard procedure” is to obtain a warrant before opening a CyberTip “when the provider has not viewed the images.” The court notes it is “not clear from the record” that this is standard practice at other ICAC offices, or whether they’ve also been instructed to obtain warrants first from now on. So, more challenges are likely on the way.

The lower court refused to suppress the evidence obtained from Wilson’s email account and home, treating a private search that had never actually occurred as though it had taken place, thereby salvaging the warrantless search that immediately followed NCMEC’s forwarding of the tip.

The Appeals Court disagrees.

First, the government search exceeded the scope of the antecedent private search because it allowed the government to learn new, critical information that it used first to obtain a warrant and then to prosecute Wilson. Second, the government search also expanded the scope of the antecedent private search because the government agent viewed Wilson’s email attachments even though no Google employee—or other person—had done so, thereby exceeding any earlier privacy intrusion. Moreover, on the limited evidentiary record, the government has not established that what a Google employee previously viewed were exact duplicates of Wilson’s images. And, even if they were duplicates, such viewing of others’ digital communications would not have violated Wilson’s expectation of privacy in his images, as Fourth Amendment rights are personal.

Matching hashes is not enough. And that’s all Google and NCMEC had when they forwarded the tip down the line to law enforcement. Just because both entities retain hashes (NCMEC retains images as well) that matched the hashes of the attachments doesn’t mean there’s no subjective expectation of privacy in one’s own email account. A strong probability that the files were child porn is the perfect basis for a warrant request. Unfortunately, the officer decided to search without one.
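The hash-matching step described above can be sketched in a few lines. This is a hypothetical illustration using SHA-256 (real CSAM-detection systems typically rely on proprietary perceptual hashes such as PhotoDNA); `known_hashes` and `flag_attachment` are names invented for this sketch:

```python
import hashlib

def sha256_of(data: bytes) -> str:
    """Hex SHA-256 digest of a byte string."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical database of digests of files that human reviewers
# previously viewed and classified.
known_hashes = {sha256_of(b"previously-classified file bytes")}

def flag_attachment(attachment: bytes) -> bool:
    """True if this attachment's digest matches a known digest.

    A match means the bytes are almost certainly identical to a file
    someone once viewed -- but no human has looked at *this* file,
    which is exactly the distinction the Ninth Circuit draws.
    """
    return sha256_of(attachment) in known_hashes
```

On the court’s reasoning, a `True` result here supports a warrant application; it does not substitute for one.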

Wilson did not have an expectation of privacy in other individuals’ files, even if their files were identical to his files. The corollary of this principle must also be true: Wilson did have an expectation of privacy in his files, even if others had identical files. If, for example, police officers search someone else’s house and find documents evidencing wrongdoing along with notes indicating that I have identical documents in my house, they cannot, without a warrant or some distinct exception to the warrant requirement, seize my copies. I would retain a personal expectation of privacy in them, and in my connection to them, even if law enforcement had a strong basis for anticipating what my copies would contain. A violation of a third party’s privacy has no bearing on my reasonable expectation of privacy in my own documents. The government does not argue otherwise.

All of the evidence is suppressed, since it all relies on the initial lawless search. The ICAC in San Diego has, belatedly, put a warrant requirement in place. It won’t salvage this conviction, which has been reversed. And it may result in similar suppressions and reversals if the same search-first procedure was used in other child porn cases. But it’s always easier to bypass the warrant and get to the searching. After all, not every court will see the facts the same way, as is evidenced by the lower court’s refusal to suppress the evidence. But it’s now crystal clear in the Ninth Circuit: get a warrant.



Comments on “Court Tells Child Sexual Abuse Investigators That The Private Search Warrant Exception Only Works When There's A Private Search”

13 Comments
This comment has been deemed insightful by the community.
That One Guy (profile) says:

For want of five bloody minutes of work...

The agency tasked with spotting and dealing with that sort of content told them that the attachments were matches, and they still couldn’t be bothered to get a warrant just in case, apparently because warrantless searches were/are department procedure.

Congrats, you incompetent and/or corrupt goons: your indifference towards basic legal and constitutional requirements turned what would have been a slam-dunk conviction into an overturned one, and might very well snowball into the same result in other cases. That time saved by not bothering with warrants certainly paid off in spades, now didn’t it?

This comment has been deemed insightful by the community.
That Anonymous Coward (profile) says:

Re: For want of five bloody minutes of work...

Well, courts so often are willing to bend & flex the alleged rights we have to keep cops happy & because the citizen obviously was a bad guy.

Creating exceptions and holes that then keep being widened by cops too lazy to slap some names into a form isn’t how it is supposed to work.

Innocent until proven guilty is still the law of the land & just because someone hands you a sealed envelope that someone else handed them, which they all claim has evidence that you are a bad, bad man, that isn’t evidence.
Gathering evidence has rules & as much as it sucks when you don’t follow those rules & screw hundreds of cases you handled in the same slipshod way, perhaps maybe just maybe we need to remind them to follow the actual rules & make sure they cross every t and dot every i rather than try to find ways to salvage the case they screwed up by breaking the rules.

This comment has been deemed insightful by the community.
James Burkhardt (profile) says:

Incompetence, a play in 4 parts.

Okay, Google has provided probable cause that this content includes illegal photographs of a minor. Rock solid justification for a search. I just need a warrant and this is a slam dunk.

Oops. I looked at the files before getting the warrant… that’s okay, all I need to do is not mention that I looked at the files on my warrant application – I have probable cause without looking at them, so I can save this in good faith by not using my knowledge from the illegal search.

Oops. I guess I told the court I searched the files. Oh well. Power ahead. Fake it till you make it.

What do you mean I broke the law?

Scary Devil Monastery (profile) says:

Re: Warrants

I am not an expert, but this is what it looks like to me: if the suspect is someone with the wrong skin color and/or an outspoken critic of the local police, then turning their life inside out without a warrant as harassment and walking away is the norm, whereas if you’re actually trying to secure a conviction of a guilty party, that same method doesn’t work so well.
…and all too many police officers seem so used to pulling the first stunt that habit now keeps them from successfully doing their job.

freelunch says:

the courts are drawing a funny distinction

The ICAC’s failure to apply for a warrant it would easily get seems like a basic error. So the "cops" part of this is easy. But the "courts" part is not so easy.

The courts are drawing a funny distinction. "No one at Google had opened or viewed Wilson’s email attachments; its report was based on an automated assessment that the images Wilson uploaded were the same as images other Google employees had earlier viewed and classified as child pornography." The appeals court makes the same distinction: "the government agent viewed Wilson’s email attachments even though no Google employee—or other person—had done so, thereby exceeding any earlier privacy intrusion."

So what are they going to do when some of these scum start changing the image file just a little bit to escape the hash check, and the image is identified by a machine learning algorithm as extremely likely — for you who aren’t probabilists, significantly more likely than that this "e" arose because I typed the letter e — to be child pornography? (I won’t say "AI" because classifying the machine learning algorithm as a person offends common sense.) The right answer seems to be "grant the warrant," and perhaps that is not inconsistent with what these courts found.

But the distinction is funny in another way. There was "no earlier privacy intrusion" when a computer at Google looked at the file and reported it? Whoa! Google "knows" a lot about me, meaning that their computers do. No privacy intrusion until a human looks at it? Maybe "privacy intrusion" is a very limited term of art, but the principle seems funny.

Where I am coming from here is that almost nothing Google does is done by humans: not suggesting a response to a search, not suggesting an ad for the ad auction associated with the search, not flagging an image as child pornography. Almost everything is done by machines, and what the machines do might be reviewed by humans very infrequently. So it seems like a funny distinction.

This comment has been deemed insightful by the community.
R.H. (profile) says:

Re: the courts are drawing a funny distinction

There was no privacy intrusion by Google because of two things. One, the user, by way of Google’s Terms of Service, agreed to let Google look at their attachments. Two, Google is not the government, so it isn’t limited by the Fourth Amendment.

What the court is saying is that until some being with a mind (let’s use the term "person" from here on out) looks at the evidence, no search (as defined by the Fourth Amendment) has occurred. This means that if no person at Google or the NCMEC actually looks at the images, no private search has yet occurred and the police need to get a warrant before looking at the forwarded images themselves.

freelunch says:

Re: Re: the courts are drawing a funny distinction

Thanks, R.H. Your second paragraph seems accurate, but to me it appears just to repeat what is funny about the doctrine. A private search has not occurred if a private person just told a computer — not a person — to look for the evidence and tell the police if any is found.

Perhaps the key is that the "just have to ask for a warrant" step prevents the oddity of the definition from doing any very large harm in practice.

That Anonymous Coward (profile) says:

Re: Re: Re: the courts are drawing a funny distinction

It isn’t a private person telling the computer to look for evidence.
They are running a program that seeks matches to known CP image hashes.
Every image crossing their network is checked against the master database.
The companies never look at the contents themselves.
The TOS warns users these checks are made & will be forwarded to the proper authorities (not that people read those).
A TOS can’t suspend a person’s rights, not even a suspected pedophile’s.
They hash the file because it’s a very fast way to check files.
While collisions aren’t common, they can happen, & minor changes can alter the hash of what to the human eye looks like the same image.
We can’t throw someone in jail because a hash matched; someone needs to get permission from the court to check the image, because now the government is performing a search based on evidence, & a court needs to be convinced it’s on the up and up, not just a random report.
NCMEC is the main clearinghouse for these checks; they have a great track record at being correct, but I do not know if any court has ruled their system is completely perfect, & I doubt that would be enough to pretend rights aren’t rights anyway.
It provides a match that carries more weight than "well, we think he might be creepy" or "we saw him walking by a park."
They got sloppy because, well, the target is almost always a bad guy, but that isn’t how rights work.
They literally could have had a prefilled form where they just type in a few details and put it before a judge, & it would have been approved.

They got a lead from what is considered a trusted informant that a crime was happening, but failed to follow the rules.
People might be mad & pissed off that a pedophile got off, but we’ve seen what happens when they play fast and loose with people’s rights: they tend to ignore even the "good guys’" rights as well.
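The point above about minor changes altering the hash is easy to demonstrate: with a cryptographic hash like SHA-256, flipping a single bit yields a completely unrelated digest (the avalanche effect), so an exact-hash check misses near-duplicates entirely. A minimal sketch, with placeholder byte strings standing in for image data:

```python
import hashlib

original = b"bytes of a known image file"
# Flip one bit in the final byte.
altered = original[:-1] + bytes([original[-1] ^ 0x01])

h_original = hashlib.sha256(original).hexdigest()
h_altered = hashlib.sha256(altered).hexdigest()

# The digests differ entirely, so an exact-match database lookup
# would not flag the altered file.
assert h_original != h_altered
```

This brittleness is why services also develop perceptual hashes that tolerate small edits, which is where the machine-learning scenario raised in the thread comes in.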

Rocky says:

Re: Re: Re: the courts are drawing a funny distinction

In this instance it’s very important to define "to look at" when it is a computer doing it. Comparing hashes isn’t looking at evidence, it just says "these 2 files may be the same". If the hashes match, you must then actually do a real comparison to ascertain that the files actually match.

Not doing that is like trying your door-key on every lock you come across, and if you happen to be able to unlock a random door that must mean that it leads to your apartment/house.
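The two-step check Rocky describes (compare digests first, then confirm byte-for-byte) might look like this; a sketch with invented function names, not any service’s actual implementation:

```python
import hashlib

def possibly_same(a: bytes, b: bytes) -> bool:
    """Cheap first pass: equal digests mean the files *may* be identical."""
    return hashlib.sha256(a).digest() == hashlib.sha256(b).digest()

def definitely_same(a: bytes, b: bytes) -> bool:
    """Confirm a digest match with a full byte-for-byte comparison."""
    return possibly_same(a, b) and a == b
```

The digest comparison is O(1) per lookup against a database; the byte comparison is the "try the key, then check it’s actually your door" step.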

Sok Puppette says:

Re: Re: Re:2 the courts are drawing a funny distinction

Anything "may" fail. It is literally more probable that a dozen non-communicating humans would all independently look carefully at the same plain white square, and all happen to misidentify that square as child porn, than that the hashes on two non-identical files would happen to match. Two to the one hundred twenty-eighth power is a very large number.

On the other hand, the idea that no search occurs until a human looks at something is just psychotic. When Google’s computer hashed the file, any sane legal system would recognize that as a search. When Google took a copy of the file and provided it to NCMEC or law enforcement or whoever, any sane legal system would regard that as a seizure.

Of course, it’s also true that, given that Google systematically does this in aid of law enforcement efforts, any sane legal system would regard Google as an agent of law enforcement, and require a warrant for Google to search any message to begin with… but two insanes do not make a sane.

Shaun Wilson (profile) says:

Re: the courts are drawing a funny distinction

I agree with what many are saying here: just get a warrant! Basically I would completely get rid of the current "private search" doctrine and merely consider a private search evidence to be weighed when a warrant is applied for. This deals with the contortions over whether an algorithmic search is really a "search" as defined by the courts – in the ordinary meaning of the term I definitely consider it a search, just a slightly less objectionable search the narrower the target. (E.g.: known child porn hashes, probably okay; Apple’s fuzzy hash system, maybe; trying to calculate whether the amount of skin tone in an image makes it porn, probably not; and potential defamatory statements/memes about politicians, definitely not.)

This would also help with cranks, for example, who constantly report "crimes" that aren’t crimes – their reports should no longer be considered probable cause after crying wolf too many times.

Lostinlodos (profile) says:

Re: the courts are drawing a funny distinction

One has to wonder though; at what point is a computer search within the realm of human search?

Because google here already did multiple “searches” and “scans”.
Google’s AI scanned the text for advertising. So a computer has read the email. Google scanned the attachments to look for viruses. So that’s another scan. Then it ran the images against a known series of other CSAM materials.

So what we have here is the digital equivalent of:
Went to the mail box. Opened up the box. Read the letter. Washed the contents. And looked to see if it matched known illegal material. Then put it all back together with some new tape.
Not only that, they duplicated the material by sending it to another party.

Is there a point where such actions can become the equivalent of a human search in the first place?
All the same things were done that a human could/would do in a private non-destructive search.

I know I’m turning slightly towards rights for AI as a separate being but I’m trying to dodge that here. At some point… can a computer doing the exact same things reach equivalence?
