Suing Apple To Force It To Scan iCloud For CSAM Is A Catastrophically Bad Idea
from the this-would-make-it-harder-to-catch-criminals dept
There’s a new lawsuit in Northern California federal court that seeks to improve child safety online but could end up backfiring badly if it gets the remedy it seeks. While the plaintiff’s attorneys surely mean well, they don’t seem to understand that they’re playing with fire.
The complaint in the putative class action asserts that Apple has chosen not to invest in preventive measures to keep its iCloud service from being used to store child sex abuse material (CSAM) while cynically rationalizing the choice as pro-privacy. This decision allegedly harmed the Jane Doe plaintiff, a child whom two unknown users contacted on Snapchat to ask for her iCloud ID. They then sent her CSAM over iMessage and got her to create and send them back CSAM of herself. Those iMessage exchanges went undetected, the lawsuit says, because Apple elected not to employ available CSAM detection tools, thus knowingly letting iCloud become “a safe haven for CSAM offenders.” The complaint asserts claims for violations of federal sex trafficking law, two states’ consumer protection laws, and various torts including negligence and products liability.
Here are key passages from the complaint:
[Apple] opts not to adopt industry standards for CSAM detection… [T]his lawsuit … demands that Apple invest in and deploy means to comprehensively … guarantee the safety of children users. … [D]espite knowing that CSAM is proliferating on iCloud, Apple has “chosen not to know” that this is happening … [Apple] does not … scan for CSAM in iCloud. … Even when CSAM solutions … like PhotoDNA[] exist, Apple has chosen not to adopt them. … Apple does not proactively scan its products or services, including storages [sic] or communications, to assist law enforcement to stop child exploitation. …
According to [its] privacy policy, Apple had stated to users that it would screen and scan content to root out child sexual exploitation material. … Apple announced a CSAM scanning tool, dubbed NeuralHash, that would scan images stored on users’ iCloud accounts for CSAM … [but soon] Apple abandoned its CSAM scanning project … it chose to abandon the development of the iCloud CSAM scanning feature … Apple’s Choice Not to Employ CSAM Detection … Is a Business Choice that Apple Made. … Apple … can easily scan for illegal content like CSAM, but Apple chooses not to do so. … Upon information and belief, Apple … allows itself permission to screen or scan content for CSAM content, but has failed to take action to detect and report CSAM on iCloud. …
[Questions presented by this case] include: … whether Defendant has performed its duty to detect and report CSAM to NCMEC [the National Center for Missing and Exploited Children]. … Apple … knew or should have known that it did not have safeguards in place to protect children and minors from CSAM. … Due to Apple’s business and design choices with respect to iCloud, the service has become a go-to destination for … CSAM, resulting in harm for many minors and children [for which Apple should be held strictly liable] … Apple is also liable … for selling defectively designed services. … Apple owed a duty of care … to not violate laws prohibiting the distribution of CSAM and to exercise reasonable care to prevent foreseeable and known harms from CSAM distribution. Apple breached this duty by providing defective[ly] designed services … that render minimal protection from the known harms of CSAM distribution. …
Plaintiff [and the putative class] … pray for judgment against the Defendant as follows: … For [an order] granting declaratory and injunctive relief to Plaintiff as permitted by law or equity, including: Enjoining Defendant from continuing the unlawful practices as set forth herein, until Apple consents under this court’s order to … [a]dopt measures to protect children against the storage and distribution of CSAM on the iCloud … [and] [c]omply with quarterly third-party monitoring to ensure that the iCloud product has reasonably safe and easily accessible mechanisms to combat CSAM….”
What this boils down to: Apple could scan iCloud for CSAM, and has said in the past that it would and that it does, but in reality it chooses not to. The failure to scan is a wrongful act for which Apple should be held liable. Apple has a legal duty to scan iCloud for CSAM, and the court should make Apple start doing so.
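(For context, the "industry standard" detection the complaint invokes — PhotoDNA-style scanning — works by comparing a fingerprint of each uploaded file against a database of fingerprints of already-known CSAM. Here's a heavily simplified, purely illustrative sketch of that matching step; real systems use *perceptual* hashes that survive resizing and re-encoding, not the cryptographic hash used here, and every name below is hypothetical:)

```python
import hashlib

def file_fingerprint(data: bytes) -> str:
    # Stand-in fingerprint for illustration only. Real tools like PhotoDNA
    # use perceptual hashes that tolerate re-encoding; a cryptographic hash
    # matches only byte-identical files.
    return hashlib.sha256(data).hexdigest()

def scan_upload(data: bytes, known_hashes: set[str]) -> bool:
    """Return True if the upload matches a fingerprint of known material."""
    return file_fingerprint(data) in known_hashes

# Hypothetical demo: the "database" holds the fingerprint of one known file.
known = {file_fingerprint(b"known-bad-image-bytes")}
print(scan_upload(b"known-bad-image-bytes", known))    # True
print(scan_upload(b"never-before-seen-bytes", known))  # False
```

Note what the sketch makes obvious: hash matching can only flag content that is *already* in the database. It cannot identify newly created material — which matters for the facts alleged here, where the abuse involved images made for the first time.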
This theory is perilously wrong.
The Doe plaintiff’s story is heartbreaking, and it’s true that Apple has long drawn criticism for its approach to balancing multiple values such as privacy, security, child safety, and usability. It is understandable to assume that the answer is for the government, in the form of a court order, to force Apple to strike that balance differently. After all, that is how American society frequently remedies alleged shortcomings in corporate practices.
But this isn’t a case about antitrust, or faulty smartphone audio, or virtual casino apps (as in other recent Apple class actions). Demanding that a court force Apple to change its practices is uniquely infeasible, indeed dangerous, when it comes to detecting illegal material its users store on its services. That’s because this demand presents constitutional issues that other consumer protection matters don’t. Thanks to the Fourth Amendment, the courts cannot force Apple to start scanning iCloud for CSAM; even pressuring it to do so is risky. Compelling the scans would, perversely, make it way harder to convict whoever the scans caught. That’s what makes this lawsuit a catastrophically bad idea.
(The unconstitutional remedy it requests isn’t all that’s wrong with this complaint, mind. Let’s not get into the Section 230 issues it waves away in two conclusory sentences. Or how it mistakes language in Apple’s privacy policy that it “may” use users’ personal information for purposes including CSAM scanning, for an enforceable promise that Apple would do that. Or its disingenuous claim that this isn’t an attack on end-to-end encryption. Or the factually incorrect allegation that “Apple does not proactively scan its products or services” for CSAM at all, when in fact it does for some products. Let’s set all of that aside. For now.)
The Fourth Amendment to the U.S. Constitution protects Americans from unreasonable searches and seizures of our stuff, including our digital devices and files. “Reasonable” generally means there’s a warrant for the search. If a search is unreasonable, the usual remedy is what’s called the exclusionary rule: any evidence turned up through the unconstitutional search can’t be used in court against the person whose rights were violated.
While the Fourth Amendment applies only to the government and not to private actors, the government can't use a private actor to carry out a search it couldn't constitutionally do itself. If the government compels or pressures a private actor to search, or if the private actor searches primarily to serve the government's interests rather than its own, then the private actor counts as a government agent for purposes of the search. The search must then comply with the Fourth Amendment; if it doesn't, the remedy is exclusion.
If the government – legislative, executive, or judiciary – forces a cloud storage provider to scan users’ files for CSAM, that makes the provider a government agent, meaning the scans require a warrant, which a cloud services company has no power to get, making those scans unconstitutional searches. Any CSAM they find (plus any other downstream evidence stemming from the initial unlawful scan) will probably get excluded, but it’s hard to convict people for CSAM without using the CSAM as evidence, making acquittals likelier. Which defeats the purpose of compelling the scans in the first place.
Congress knows this. That’s why, in the federal statute requiring providers to report CSAM to NCMEC when they find it on their services, there’s an express disclaimer that the law does not mean they must affirmatively search for CSAM. Providers of online services may choose to look for CSAM, and if they find it, they have to report it – but they cannot be forced to look.
Now do you see the problem with the Jane Doe lawsuit against Apple?
This isn’t a novel issue. Techdirt has covered it before. It’s all laid out in a terrific 2021 paper by Jeff Kosseff. I have also discussed this exact topic over and over and over and over and over and over again. As my latest publication (based on interviews with dozens of people) describes, all the stakeholders involved in combating online CSAM – tech companies, law enforcement, prosecutors, NCMEC, etc. – are excruciatingly aware of the “government agent” dilemma, and they all take great care to stay very far away from potentially crossing that constitutional line. Everyone scrupulously preserves the voluntary, independent nature of online platforms’ decisions about whether and how to search for CSAM.
And now here comes this lawsuit like the proverbial bull in a china shop, inviting a federal court to destroy that carefully maintained and exceedingly fragile dynamic. The complaint sneers at Apple’s “business choice” as a wrongful act to be judicially reversed rather than something absolutely crucial to respect.
Fourth Amendment government agency doctrine is well-established, and there are numerous cases applying it in the context of platforms’ CSAM detection practices. Yet Jane Doe’s counsel don’t appear to know the law. For one, their complaint claims that “Apple does not proactively scan its products or services … to assist law enforcement to stop child exploitation.” Scanning to serve law enforcement’s interests would make Apple a government agent. Similarly, the complaint claims Apple “has failed to take action to detect and report CSAM on iCloud,” and asks “whether Defendant has performed its duty to detect and report CSAM to NCMEC.” This conflates two critically distinct actions. Apple does not and cannot have any duty to detect CSAM, as expressly stated in the statute imposing a duty to report CSAM. It’s like these lawyers didn’t even read the entire statute, much less any of the Fourth Amendment jurisprudence that squarely applies to their case.
Any competent plaintiff’s counsel should have figured this out before filing a lawsuit asking a federal court to make Apple start scanning iCloud for CSAM, thereby making Apple a government agent, thereby turning the compelled iCloud scans into unconstitutional searches, thereby making it likelier for any iCloud user who gets caught to walk free, thereby shooting themselves in the foot, doing a disservice to their client, making the situation worse than the status quo, and causing a major setback in the fight for child safety online.
The reason nobody’s filed a lawsuit like this against Apple to date, despite years of complaints from left, right, and center about Apple’s ostensibly lackadaisical approach to CSAM detection in iCloud, isn’t because nobody’s thought of it before. It’s because they thought of it and they did their fucking legal research first. And then they backed away slowly from the computer, grateful to have narrowly avoided turning themselves into useful idiots for pedophiles. But now these lawyers have apparently decided to volunteer as tribute. If their gambit backfires, they’ll be the ones responsible for the consequences.
Riana Pfefferkorn is a policy fellow at Stanford HAI who has written extensively about the Fourth Amendment’s application to online child safety efforts.
Filed Under: 4th amendment, class action, csam, evidence, proactive scanning, scanning
Companies: apple


Comments on “Suing Apple To Force It To Scan iCloud For CSAM Is A Catastrophically Bad Idea”
What I read: “someone wants to give lots of power to anyone who can generate false positives for certain image matching services”
Re:
At least that’s not trivial to do with AI.
yada yada
Wow, what a way to test/force Apple to OPEN its encryption.
For all the hacking and upsetting of the police agencies, because they CAN'T open any Apple phone when it's locked. And they can't sit at the tower and record anything they can read/see that isn't encrypted.
What a way to find out if Apple has the keys, or knows HOW to get them from the phone.
Re:
“Ma Bell rolled over. Why won’t Apple?”
lawyer searching for zombie horses
While I realize that having to beat the dead horse yet again is tedious, I am thankful to your knowledge, and your efforts to keep us informed.
“They then sent her CSAM over iMessage and got her to create and send them back CSAM of herself.”
So in this magical world, in these lawyers' minds, there is a detection tool that can identify never-before-seen content as being CSAM?
How do you know they sent actual CSAM? Did you get the client to send it to you?
Re:
Companies are expected to be clairvoyant. They should ‘just know’ that something will hurt somebody someday and prevent it from happening, but not violate anyone’s privacy, (except people we don’t care about,) or do so in a way that makes them money, or agree to help an administration or regime we don’t like, while fully cooperating with ones we do like.
How hard can that be?
Re: Worse...
Worse than that – what’s the turnaround time between, say, an Apple automatic filter detecting the offense and agents kicking in the perp’s door? The implication here in the article is that minimal time elapsed between the receipt of CSAM and the victim first making her own and sending it. (Perhaps ensuring her door gets kicked in too.)
If it was less than 24 hours, good odds that the problem was already created – in which case the question is: how does an automated filter help? Other than eliminating any of the evidence from the prosecution, of course. The trauma is already done.
Worse still, if agents kick in the perp’s door and find more CSAM as a result of the proposed Apple alert, is that new evidence also fruit of the poisoned tree and not admissible? The agents would have to prove they had probable cause before the alert in order to even suggest that what they find in the perp’s house is admissible.
Not to mention, once it’s known that Apple systematically scans all content, of course the freaks will find other avenues to exchange pictures.
Re: not all that difficult
Sure. There are mechanisms, generally called “hashtags”, for that sort of thing. When you send or post such material, you include the octothorpe character # followed immediately by the string “CSAM”. The scanner can then detect the material.
For new material where you want extra attention or need to direct the training algorithm, you can use ‘#’ + “NEW” as well as the standard ‘#’ + “CSAM” tag.
How about parents do parenting
How about parents do parenting?!
Why is a child with a smartphone, sorry iPhone (fashion accessory), unattended?
CSAM is just a foot in the door
Once the government has successfully forced Apple to scan for CSAM content, they’ll want Apple to scan for Right-leaning, Left-leaning, LGBTQ, abortion, Dungeons and Dragons content, and whatever the bug-a-boo of the month is. Give them an inch and they’ll expect a mile.
Re:
You can bet big media companies will be standing outside the door with duffel bags full of “campaign donations” begging for them to scan for copyrighted material.
Re: Also...
Where does it stop? If your content is public enough, can spouses sue to obtain all text messages and call details from their soon-to-be-ex’s phone? Can your ex-employer sue for Apple to show all phone content to see if you absconded with proprietary information? Why should the government have exclusive access if the information is there for perusal?
That’s glossing over how mandatory reporting itself conflicts with the Fourth. The status quo is a bit of a clusterfuck. Mind you, better than the likely alternative clusterfuck.
If the status quo was relying on no one ever making a filing, that was itself ultimately untenable. A pedophile could’ve just as easily filed the case themselves.
There are remedies for constitutional issues, namely balancing tests like strict scrutiny. We already apply it to consumer protection issues like required labeling/fraud etc. It’s definitely harder (and it’s arguable whether it should be applied in this particular case), but it’s also not actually that unique. The Fourth is no bigger roadblock than the First, outside of how we choose it to be as a society/jurisprudence.
The only way to win is to lose completely
That’s almost impressive, it’s not often you see a fractally wrong legal case that isn’t being filed for political reasons…
‘We’re not gunning for encryption’… says the legal team demanding an action that would require destroying it.
‘We’re doing this for the victims of CSAM’… says the legal team that is asking the court to hand CSAM producers and distributors their biggest win ever, by all but ensuring that while they might be found, it will be impossible for them to be charged, tried and convicted.
I’m not sure which is more likely, the lawyers involved are just as stupid as they appear or they spotted the massive problems and dismissed them because hey, a paycheck’s a paycheck, but either way for the sake of everyone not a CSAM producer/distributor I can only hope that whatever judge this lawsuit lands in front of shuts it down hard.
This comment has been flagged by the community.
No surprise that the usual suspects are arguing vociferously against policing devices and the cloud for CSAM.
Hmmmmm.
Re:
The usual suspects are against wrecking the law and irrevocably destroying the evidence sought by obtaining it in a clearly unlawful manner, which will have it thrown out of every case brought before a court.
I appreciate your fantasy partisan nonsense, but you would be your own “usual suspect” here, dumbass.
Re:
We know.
They’re called Republicans and the people who support them.
A few statements of fact:
So… what exactly does this group want from Apple?
That they go ahead and implement the phone-side scanning solution that they floated and then pulled when many many people illustrated how it didn’t have enough privacy protections in place and could be used to fish through phones for any content deemed undesirable by a nation state?
Seems to me that this lawsuit should be dead in the water as soon as Apple’s lawyers point all this stuff out. They already do what the plaintiff wants, what they do doesn’t accomplish what the plaintiff wants it to (wouldn’t have prevented the incident), and Apple has been actively implementing and seeking to implement further protections that don’t open the door to illegal privacy breaches.
Personally, my solution with my kids was: sure, you get a phone. You don’t get a data package. Talk to me if anything dodgy shows up on your phone, no matter who it’s from; we’ll discuss it with zero immediate consequences, but I’ll provide all the advice you need in how to handle it. Any phone misuse without that transparency however, and the phone is permanently gone until I decide said child is mature enough to handle it.
I haven’t had to test that at all, as (as far as I know), my kids have been transparent and are mature enough/paranoid enough to not distribute PII via the Internet.
Great, so let’s sue every law enforcement agency over their utter failures to prevent or end almost anything, including CSAM, and including the stuff that gets clearly reported to them, also including the occasions where they are provided substantial evidence from the get-go. They have the actual power to do something, but choose to squander it on petty bullshit (although it is frequently not so petty to their victims). Also, why isn’t there a cop stationed everywhere every second? That would cut down on non-cop crimes (citation needed) and probably child abuse of any form (again, non-cop variety) as well.
Yes, yes, let’s make a law, via the courts, that forces Apple to ‘nerd harder’. I do smell a rat here, in that the government, most likely the FBI, is secretly backing the case in the hopes of making Apple give up its “we can’t” stance in regards to breaking encryption for all of its products.
I’m beginning to think that most politicians successfully escaped from school without so much as a by-your-leave regarding both basic math and basic causality and its natural follow-on, ramifications.
It’s enough to make a grown man weep.
Here’s an alternative solution:
1) Crimes happen.
2) A victim reports them.
3) Criminal gets prosecuted.
Everyone has a fundamental right to privacy and ideas like this trample on that.
Why wasn’t Jane Doe prosecuted for producing the CSAM?