Breaking Encryption To Aid Client-Side Scanning Isn’t The Solution To The CSAM Problem

from the undermining-security-to-generate-band-aids dept

Plenty of legislators and law enforcement officials seem to believe there’s only one acceptable solution to the CSAM (child sexual abuse material) problem: breaking encryption.

They may profess some support for encryption, but when it comes to this particular problem, many of these officials seem to believe everyone’s security should be compromised just so a small percentage of internet users can be more easily observed and identified. They tend to talk around the encryption issue, focusing on client-side scanning of user content — a rhetorical tactic that willfully ignores the fact that client-side scanning necessarily eliminates one end of end-to-end encryption.
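
Where that scanning has to happen is worth spelling out. Below is a minimal sketch (in Python, with invented names; this is no vendor’s actual design) of a client-side-scanning message pipeline. The point it illustrates: the scan must run on the plaintext before encryption, and matches are reported out, which is precisely the access end-to-end encryption is supposed to deny to everyone but the endpoints.

```python
# Minimal sketch of a client-side-scanning pipeline. All names here
# (KNOWN_HASHES, report_to_authority, etc.) are invented for
# illustration; this is not any vendor's actual design.
from hashlib import sha256

KNOWN_HASHES: set[str] = set()  # stand-in for a government/NGO hash list

def client_side_scan(plaintext: bytes) -> bool:
    """Match the unencrypted message against the hash list."""
    return sha256(plaintext).hexdigest() in KNOWN_HASHES

def send_message(plaintext: bytes, encrypt, transmit, report_to_authority):
    # The scan necessarily runs before encryption; once the message is
    # sealed, no one but the endpoints could perform this check.
    if client_side_scan(plaintext):
        report_to_authority(plaintext)  # content leaves the E2EE boundary
    transmit(encrypt(plaintext))
```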

The issue at the center of these debates often short-circuits the debate itself. Since children are the victims, many people reason no sacrifice (even if it’s a government imposition) is too great. Those who argue against encryption-breaking mandates are treated as though they’d rather aid and abet child exploitation than allow governments to do whatever they want in response to the problem.

Plenty of heat has been directed Meta’s way in recent years due to its planned implementation of end-to-end encryption for Facebook Messenger. And that’s where the misrepresentation of the issue begins. Legislators and law enforcement officials claim the millions of CSAM reports from Facebook will dwindle to almost nothing if Messenger is encrypted, preventing Meta from seeing users’ communications.

This excellent post by cybersecurity expert Susan Landau for Lawfare pokes holes in these assertions, pointing out that the “millions” of reports Facebook generates annually are hardly indicative of widespread sexual abuse of children.

Yes, the transition of CSAM sharing to online communication services has resulted in a massive increase in reports to NCMEC (National Center for Missing and Exploited Children).

The organization received 29 million reports of online sexual exploitation in 2021, a 10-fold increase over a decade earlier. Meanwhile the number of video files reported to NCMEC increased over 40 percent between 2020 and 2021.

But that doesn’t necessarily mean there are more children being exploited than ever before. Nor does it mean Facebook sees more CSAM than other online services, despite its massive user base.

Understanding the meaning of the NCMEC numbers requires careful examination. Facebook found that over 90 percent of the reports the company filed with NCMEC in October and November 2021 were “the same as or visually similar to previously reported content.” Half of the reports were based on just six videos. 

As Landau is careful to point out, that doesn’t mean the situation is acceptable. It just means tossing around phrases like “29 million reports” doesn’t necessarily mean millions of children are being exploited or millions of users are sharing CSAM via these services.
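
The gap between report counts and unique content is easy to see in miniature. Here’s a toy sketch (the report data is invented purely for illustration; real reports are keyed on hashes such as PhotoDNA’s) showing how a large tally of reports can collapse to a handful of distinct files once you group by the content being reported:

```python
# Toy illustration: many reports, few unique files. The data below is
# invented; it only mirrors the shape of the NCMEC numbers, in which
# half the reports in one Facebook sample traced back to six videos.
from collections import Counter

# Each report carries an identifier (in practice, a hash) of the file.
reports = (["video_a"] * 500 + ["video_b"] * 300 +
           ["image_c"] * 150 + ["image_d", "image_e"] * 25)

by_file = Counter(reports)
print(f"{len(reports)} reports, but only {len(by_file)} unique files")
print(by_file.most_common(2))  # two files account for 80% of reports
```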

Then there’s the uncomfortable fact that a sizable percentage of the content reported to NCMEC doesn’t actually involve any exploitation of minors by adults. Landau quotes from Laura Draper’s 2022 report on CSAM and the rise of encrypted services. In that report, Draper points out that some of the reported content is generated by minors for other minors: i.e., sexting.

Draper observed that CSAE consists of four types of activities exacerbated by internet access: (a) CSAM, which is the sharing of photos or videos of child sexual abuse imagery; (b) perceived first-person (PFP) material, which is nude imagery taken by children of themselves and then shared, often much more widely than the child intended; (c) internet-enabled child sex trafficking; and (d) live online sexual abuse of children. 

While these images are considered “child porn” (to use an antiquated term), they are not actually images taken by sexual abusers, which means they aren’t actually CSAM, even if they’re treated as such by NCMEC and reported as such by communication services. In these cases, Landau suggests more education of minors to inform them of the unintended consequences of these actions, first and foremost being that they can’t control who these images are shared with once they’ve shared them with anyone else.

The rest of the actions on that list are indeed extremely disturbing. But, as Landau (and Draper) suggest, there are better solutions already available that don’t involve undermining user security by removing encryption or undermining their privacy by subjecting them to client-side scanning.

[C]onsider the particularly horrific crime in which there is live streaming of a child being sexually abused according to requests made by a customer. The actual act of abuse often occurs abroad. In such cases, aspects of the case can be investigated even in the presence of E2EE. First, the video stream is high bandwidth from the abuser to the customer but very low bandwidth the other way, with only an occasional verbal or written request. Such traffic stands out from normal communications; it looks neither like a usual video communication nor a showing of a film. And the fact that the trafficker must publicly advertise for customers provides law enforcement another route for investigation.
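
That bandwidth observation can be made concrete. Here’s a minimal sketch (the flow records and thresholds are invented for illustration; real investigative tooling is far more sophisticated) of the kind of traffic-metadata heuristic Landau describes, one that works without ever decrypting the stream:

```python
# Toy traffic-metadata heuristic. The thresholds and flow records are
# invented; the point is only that this signal survives encryption.
from dataclasses import dataclass

@dataclass
class Flow:
    down_kbps: float   # video from the abuser to the customer
    up_kbps: float     # occasional requests going back the other way
    duration_s: float

def looks_like_one_way_stream(f: Flow) -> bool:
    # Ordinary video calls are roughly symmetric; watching a film sends
    # nothing back at all. A long, high-bandwidth video stream with a
    # bare trickle of return traffic matches neither pattern.
    return f.duration_s > 600 and f.down_kbps > 1000 and 0 < f.up_kbps < 5

flows = [Flow(2500, 2, 1800),     # one-way stream plus sparse requests
         Flow(2400, 2300, 1800),  # normal two-way video call
         Flow(3000, 0, 5400)]     # passive film streaming
print([looks_like_one_way_stream(f) for f in flows])  # [True, False, False]
```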

Unfortunately, government officials tend to portray E2EE as the root of the CSAM problem, rather than just something that exists alongside a preexisting problem. Without a doubt, encryption can pose problems for investigators. But there are a plethora of options available that don’t necessitate making everyone less safe and secure just because abusers use encrypted services in order to avoid immediate detection.

Current processes need work as well. As invaluable as NCMEC is, it’s also contributing to a completely different problem. Hash matching is helpful, but it’s not infallible. Hash collisions (where two different images generate identical hashes) are possible. Malicious actors could deliberately craft collisions to implicate innocent people or to hide their own sharing of illicit material. False positives do happen. Unfortunately, at least one law enforcement agency is treating the people on the receiving end of erroneous flagging as criminal suspects.

Responding to an information request from ICCL, the Irish police reported that NCMEC had provided 4,192 referrals in 2020. Of these, 409 of the cases were actionable and 265 cases were completed. Another 471 referrals were “Not Child Abuse Material.” The Irish police nonetheless stored “(1) suspect email address, (2) suspect screen name, [and] (3) suspect IP address.” Now 471 people have police records because a computer program incorrectly flagged them as having CSAM.
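
How two different images can end up with the same fingerprint is easy to demonstrate with a toy perceptual hash. The sketch below implements a bare-bones “average hash” (one bit per pixel, set when the pixel is brighter than the image’s mean; real perceptual hashes such as PhotoDNA are far more elaborate, but share the property that distinct images can collide):

```python
# Toy "average hash": 1 bit per pixel value, set when that pixel is
# brighter than the image's mean. Real perceptual hashes are more
# elaborate but can likewise map different images to the same value.
def average_hash(pixels: list[int]) -> str:
    mean = sum(pixels) / len(pixels)
    return "".join("1" if p > mean else "0" for p in pixels)

img_a = [200, 10, 220, 15, 210, 12, 230, 20]  # one image's pixel strip
img_b = [180, 40, 170, 35, 190, 30, 160, 45]  # a visibly different image

print(average_hash(img_a))  # 10101010
print(average_hash(img_b))  # 10101010 -- same hash, different images
```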

Stripping encryption and forcing service providers to engage in client-side scanning will only increase the number of false positives. But much of what’s being proposed — both overseas and here in the United States — takes the short-sighted view that encryption must go if children are to be saved. To come up with better solutions, legislators and law enforcement need to be able to see past the barriers that immediately present themselves. Rather than focus on short-term hurdles, they need to recognize that online communication methods will always be in a state of flux. What appears to be the right thing to do now may be utterly worthless in the near future.

Think differently. Think long term. Think about protecting the privacy and security of all members of society—children and adults alike. By failing to consider the big picture, the U.K. Online Safety Act has taken a dangerous, short-term approach to a complex societal problem. The EU and U.S. have the chance to avoid the U.K.’s folly; they should do so. The EU proposal and the U.S. bills are not sensible ways to approach the public policy concerns of online abetting of CSAE. Nor are these reasonable approaches in view of the cyber threats our society faces. The bills should be abandoned, and we should pursue other ways of protecting both children and adults.

The right solution isn’t to make everyone less safe and secure. Free-world governments shouldn’t be in such a hurry to introduce mandates that lend themselves to abuse by government entities and that can be used to justify even more abusive surveillance methods deployed by autocrats and serial human rights abusers. Yes, the problem is important and should be of utmost concern. But that doesn’t mean governments should, for all intents and purposes, outlaw encryption just because it seems to be the quickest, easiest solution to a problem that’s often misrepresented and misperceived.

Companies: meta, ncmec


Comments on “Breaking Encryption To Aid Client-Side Scanning Isn’t The Solution To The CSAM Problem”

Anonymous Coward says:

Landau suggests more education of minors to inform them of the unintended consequences of these actions, first and foremost being that they can’t control who these images are shared with once they’ve shared them with anyone else.

… for instance, to the school’s spyware vendor when you are charging your phone via your school laptop.

Anonymous Coward says:

Re:

My old high school used to force an extension onto student Chromebooks that was student spyware masquerading as an update-check service. It lasted about a year before complaints made them remove it and replace it with GoGuardian.

GoGuardian is also spyware but at the very least they’re not trying to hide what it actually is.

This comment has been deemed insightful by the community.
NotTheMomma (profile) says:

As I stated YEARS ago, here and on several other boards: if you make encryption breakable, that means all government records on a computer are now public knowledge, or secret only on the honor system. If you have a god key to get in, it will take a day, maybe 2, before that key is recreated and available online. Think you can have your own special, unbroken encryption? HAH! That will be duplicated in minutes if not sooner. The gathered intelligence of every government combined will not outsmart the remaining computer nerds in the world. Let’s not forget that those people who have the key can be bought. Everyone can be bought. We just need to haggle the price.

ECA (profile) says:

For all of this

HOW well did they do BEFORE the internet?
They could monitor the Wired Phone system, and the old cellphones were not well protected.
So what the hell do they think they are going to capture, if they CAN’T/Didn’t capture much BEFORE?
THEN, when they did, what did they do with the perpetrator? His friends?

IF they weren’t WELL known, and Very popular, NOT A HELL OF A LOT.
’Cause they could not link anyone TO that 1 person.
BUT, how about that POPULAR person who had MANY MANY FRIENDS??
Where is the line of people going to court?

Anonymous Coward says:

It’s been said before, and I’m a firm believer in this: the current predictions are that it’ll take about 20 to 25 seconds!! to implement a TSR (terminate-and-stay-resident program) on the client computer that will report back to the query, “Ain’t no CSAM here, please carry on.”

Yes, TSR is a holdover from the days of DOS, but the implication is the same – a small piece of code that looks for a certain string of characters (as in, “Is there any CSAM here?”), and returns a negative report.

I further predict that no matter what any government tries to do to stamp out these snippets of code, they will be available far and wide, both legally (in app stores) and otherwise. For a long time to come. Even after governments give up trying this crap. ‘Cause you know, they’ll say one thing, then turn around and keep doing it anyway.
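
For what it’s worth, the interception this commenter describes is conceptually trivial. A toy sketch (everything here is invented; no real scanner exposes an interface this simple): once the scanner runs on hardware the user controls, resident code can simply replace its verdict.

```python
# Toy illustration of the comment's point: a scanner running on the
# user's own device can have its answer swapped out. All names are
# invented; no real scanner exposes an interface this simple.
class Scanner:
    def scan(self, content: bytes) -> bool:
        ...  # imagine real hash-matching logic here
        return True

scanner = Scanner()
# Resident code patches the check, TSR-style, before any scan happens:
scanner.scan = lambda content: False  # "Ain't no CSAM here, carry on."

print(scanner.scan(b"anything at all"))  # always False
```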

That One Guy (profile) says:

'In order to protect the children we must make them less safe!'

Draper observed that CSAE consists of four types of activities exacerbated by internet access: (a) CSAM, which is the sharing of photos or videos of child sexual abuse imagery; (b) perceived first-person (PFP) material, which is nude imagery taken by children of themselves and then shared, often much more widely than the child intended; (c) internet-enabled child sex trafficking; and (d) live online sexual abuse of children.

Now imagine that encryption was a thing of the past and privacy was only for those rich or tech-savvy enough to use alternatives (which would exist). If a kid was sexting with a fellow kid, the odds that someone else would find and snag that picture/video just got exponentially larger. And if someone is already that detestable, the chance for blackmail/extortion resulting in the creation of more, less voluntary pics/vids just shot through the roof as well.

Encryption protects kids just as much as, if not more than, it does adults. The idea that encryption needs to be destroyed to ‘protect’ the children is like claiming that seat belts need to be removed from cars because a kidnapper might use one to restrain one of their victims.

Nemo says:

Lacey and His Friends

That’s a book compilation by David Drake, the notorious military SF writer. “Nation Without Walls” is a story about a world where privacy’s illegal, but crimes still happen anyway. The main character’s bound to be viewed as repulsive unless you remember that he’s living in a very different future, and a grim one, too.

It includes at least one sequel story, but the unintended consequences of literally having no privacy make the dystopian nature of that world clear. The first story was written in ’77, and the effects of his ‘Nam tour were still pretty raw with him, so it’s rough. It’s also prescient, since we are rapidly heading that way on the internet, right down to those with power being able to retain a good deal of privacy, as in the stories.

The parallels between a future imagined in the ’70s and articles like this are striking.
