Whatever Problem EARN IT Is Trying To Solve, It Doesn't

from the that-seems-like-a-problem dept

I’ve already talked about the potential 1st Amendment problems with the EARN IT Act and the potential 4th Amendment problems with it as well. But a recent post by Riana Pfefferkorn at Stanford raises an even bigger issue in all of this: what actual problem is EARN IT trying to solve?

This sounds like a simple question with a potentially simple answer, but the reality, once you start to dig in, suggests that either (1) the backers of EARN IT don’t actually know, or (2) if they do know, they know what they actually want is unconstitutional.

Supporters of EARN IT will say, simply, the problem they’re trying to solve is the prevalence of child sexual abuse material (CSAM) online. And, that is a real problem (unlike some other moral panics, CSAM is a legitimate, large, and extraordinarily serious problem). But… CSAM is already very, very illegal. So, if you dig in a little further, supporters of EARN IT will say that the problem they’re really trying to solve is that… internet companies don’t take CSAM seriously enough. But the law (18 USC 2258A) already has pretty strict requirements for websites to report any CSAM they find to NCMEC (the National Center for Missing & Exploited Children) — and they do. NCMEC reported that it received almost 21.4 million reports of CSAM from websites. Ironically, many supporters of EARN IT point to these numbers as proof that the websites aren’t doing enough, while also saying it proves they don’t have any incentive to report — which makes no sense at all.

So… is the problem that those 21.4 million reports didn’t result in the DOJ prosecuting enough abusers? If so… isn’t the problem somewhere between NCMEC and the DOJ? Because the DOJ can already prosecute for CSAM and Section 230 doesn’t get in the way of that (it does not immunize against federal criminal law). And, as Riana noted in her article, this very same Senate Committee just recently heard about how the FBI actually knew about an actual serial child sex abuser named Larry Nassar, and turned a blind eye.

And, if NCMEC is the problem (namely in that it can’t process the reports fast enough), then this bill doesn’t help at all there either, because the bill doesn’t give NCMEC any more funding. And, if the senators are correct that this bill would increase the reports to NCMEC (though it’s not clear why that would work), wouldn’t that just make it even more difficult for NCMEC to sort through the reports and alert law enforcement?

So… is the problem that companies aren’t reporting enough CSAM? If you read the sponsors’ myths and facts document, they make this claim — but, again, the law (with really serious penalties) already requires them to report any CSAM. Taking away Section 230 protections won’t change that. Reading between the lines of the “myths and facts” document, they seem to really be saying that the problem is that not every internet service proactively scans every bit of content, but as we’ve discussed that can’t be the problem, because if that is the problem, EARN IT has a massive 4th Amendment problem that will enable actual child sex abusers to suppress evidence!

Basically, if you look step by step through the potential problems that supporters of the bill claim it tries to solve, you immediately realize it doesn’t actually solve any of them. And, for nearly all of the potential problems, it seems like there’s a much more efficient and effective solution which EARN IT does not do. Riana’s post has a handy dandy table walking down each of these paths, but I wanted to make it even clearer, and felt that a table isn’t the best way to walk through this. So here is her chart, rewritten (all credit to her brilliant work):

If online services don’t report CSAM in violation of 2258A, and the real problem is large-scale, widespread, pervasive noncompliance by numerous providers that knowingly host CSAM without removing or reporting it (NOT just occasional isolated incidents), then there’s a very long list of potential remedies:

  • Conduct a congressional investigation to determine the extent of the problem
  • Hold a hearing to ask DOJ why it has never once brought a 2258A prosecution
  • DOJ prosecutes all those providers for illegally hosting CSAM under 2252A as well as violating 2258A’s reporting requirements
  • Amend 2258A(e) to increase penalties for noncompliance
  • Amend Dodd-Frank to include 2258A compliance in corporate disclosure requirements (akin to Form SD)
  • Encourage FTC investigation of noncompliant companies for unfair or deceptive business practices
  • Encourage private plaintiffs to file securities-fraud class actions against publicly-traded providers for misleading investors by secretly violating federal reporting duties

If that’s the actual problem (which supporters imply, but when you try to get them to say it outright they hem and haw and won’t admit it), then it seems like any of the above list would actually be helpful here. And the real question we should be asking is why hasn’t the DOJ done anything here?

But what does EARN IT actually do?

  • Amend Section 230 instead of enforcing existing law
  • Don’t demand that DOJ explain why they aren’t doing their job

Okay, so maybe the supporters will say (as they sometimes admit) that most websites out there actually do report CSAM under 2258A, but there are still some providers who don’t report it, and these are occasional, isolated instances of failure to report by multiple providers, OR repeated failure to report by a particular rogue provider (NOT a large-scale problem across the whole tech industry). If anything, that seems more plausible than the first version, which doesn’t seem to be supported by any facts. However, here again, there are a bunch of tools in the regulator’s toolbox to deal with this problem:

  • Conduct a congressional investigation to determine the extent of the problem
  • Hold a hearing to ask DOJ why it has never once brought a 2258A prosecution
  • DOJ prosecutes those isolated violations or the particular rogue provider

Again, what it comes down to in this scenario is that the DOJ is not doing its job. The law is on the books, and the penalties can be pretty stiff (the first failure to report is $150,000 and each subsequent failure is another $300,000). If it’s true that providers are not doing enough here, such penalties would add up to quite a lot, and the question again should be: why isn’t the DOJ enforcing the law?
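
Just to put rough numbers on that, here is a back-of-the-envelope sketch of my own, using only the penalty figures above; the provider and the incident counts are entirely hypothetical:

    # Rough illustration of how the 2258A(e) penalties described above stack up:
    # $150,000 for the first knowing failure to report, $300,000 for each
    # subsequent failure. The incident counts below are hypothetical.
    FIRST_FAILURE = 150_000
    EACH_SUBSEQUENT = 300_000

    def total_penalty(unreported_incidents: int) -> int:
        """Cumulative fine for a provider with that many unreported incidents."""
        if unreported_incidents <= 0:
            return 0
        return FIRST_FAILURE + EACH_SUBSEQUENT * (unreported_incidents - 1)

    for n in (1, 10, 100, 1000):
        print(f"{n:>4} unreported incidents -> ${total_penalty(n):,}")
    # 1 -> $150,000; 10 -> $2,850,000; 100 -> $29,850,000; 1000 -> $299,850,000

Even modest, isolated noncompliance would add up to fines the DOJ could be collecting today, under existing law, without touching Section 230.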

But instead of exploring that, here’s what EARN IT actually does:

  • Amend Section 230 instead of enforcing existing law
  • Don’t demand that DOJ explain why they aren’t doing their job

Okay, so next up, Riana points out that maybe it’s possible that the DOJ does regular investigations of websites failing to report CSAM in violation of 2258A, but those investigations are consistently resolved without charges or fines and do not become public. Then, there’s a pretty simple option for Congress:

  • Hold hearings to have DOJ explain why their investigations never result in charges

But, instead, here’s what Congress is doing with EARN IT (stop me if you’ve heard this one before):

  • Amend Section 230 instead of enforcing existing law
  • Don’t demand that DOJ explain why they aren’t doing their job

Okay, okay, so maybe the reality is that the DOJ does in fact criminally prosecute websites for 2258A violations, but the reason there is no public record of any such prosecution ever is that all such court records are under seal. This would be… odd, first of all, given that the DOJ loves to publicize prosecutions, especially over CSAM. But, again, here’s what Congress could do:

  • Tell DOJ to move for courts to unseal all sealed records in 2258A cases
  • Require DOJ to report data on all 2258A prosecutions since 2258A’s enactment
  • Amend 2258A to require regular reporting to Congress by DOJ of enforcement statistics
  • Investigate whether providers (especially publicly-traded ones) kept 2258A fines a secret

But, instead, here’s what EARN IT does:

  • Amend Section 230 instead of enforcing existing law
  • Don’t demand that DOJ reveal to Congress its 2258A enforcement details

So, maybe the real problem is simply that the DOJ seems to be ignoring any effort to enforce violations of 2258A. If that’s the case, Congress has tools in its toolbox:

  • Hold a hearing to ask DOJ why it has never once brought a 2258A prosecution
  • Amend 2258A by adding a private right of action so that victims can do the work that DOJ isn’t doing

Instead, EARN IT…

  • Amend Section 230 instead of enforcing existing law
  • Don’t demand that DOJ explain why they aren’t doing their job

So… that’s basically all the possible permutations if the problem is — as some supporters claim repeatedly — that companies are regularly violating 2258A and not reporting CSAM that they find. And, in almost every case, the real question then should be: why isn’t the DOJ enforcing the law? And there are lots of ways that Congress should deal with that. But EARN IT does literally none of them.

About the only thing that supporters of EARN IT have claimed in response to this point is that, because EARN IT allows for state AGs and civil suits, it is “adding more cops to the beat” to take on failures to report under 2258A. But… that’s kinda weird. Because wouldn’t it make a hell of a lot more sense to first find out why the existing cops don’t bother? Because no one has done that. And, worse, when it comes to the civil suits, this response basically means “the DOJ doesn’t care to help victims of CSAM, so we’re leaving it up to them to take matters into their own hands.” And that doesn’t seem like a reasonable solution no matter how you look at it.

If anything, it looks like Congress putting the burden for the DOJ’s perpetual failings… on the victims of CSAM. Yikes!

Of course, there are other possible problems here as well, and Riana details them in the chart. In these cases, the problems aren’t with failure to report CSAM, but elsewhere in the process. So… if websites do properly report CSAM to NCMEC’s CyberTipline, perhaps the problem is that CSAM isn’t being taken down promptly enough or reported to NCMEC “as soon as reasonably possible” as required by 2258A(a)(1)(A)(i).

Well, then, as Riana notes, there are a few things Congress could do:

  • Debate whether to insert a firm timeframe into 2258A(a)(1)(A)(i)
  • Hold a hearing to ask ICS providers of various sizes why delays happen and whether a specific timeframe is feasible

Instead, what EARN IT actually does is…

  • Amend Section 230

Okay, so if companies are reporting to NCMEC in compliance with 2258A, perhaps the problem is the volume of reports is so high that NCMEC is overwhelmed.

Well, then, the possible solutions from Congress would seem to be:

  • Hold a hearing to ask NCMEC what it would take to process all the reports they already get
  • Appropriate those additional resources to NCMEC

But, what EARN IT does is…

  • Amend Section 230 to induce providers to make even more reports NCMEC can’t keep up with
  • Give zero additional resources to NCMEC

Okay, so maybe the websites do properly report CSAM to NCMEC, and NCMEC is able to properly alert the DOJ to the CSAM such that the DOJ should be able to go prosecute the actual abusers, but the DOJ doesn’t act on the reports providers make, and doesn’t make its own mandatory reports to Congress about internet crimes against children. That would be horrifying, but again, it would seem like there’s a pretty clear course of action for Congress:

  • Order GAO to conduct a study on what happens to CyberTips passed by NCMEC to DOJ
  • Hold a hearing to ask DOJ why it isn’t acting on tips or filing its required reports
  • Appropriate additional resources to DOJ

All of those would help, if this is the problem, but instead, here’s what EARN IT actually does:

  • Earmark $1 million for IT improvements
  • Don’t demand that DOJ explain why they aren’t doing their job

You might sense a pattern here.

And finally, perhaps websites do report CSAM in compliance with 2258A to NCMEC’s CyberTipline, and maybe NCMEC does relay important information to the DOJ… and horrifyingly, perhaps federal law enforcement is failing child sex abuse victims just as the FBI turned a blind eye to Larry Nassar’s abuse of dozens of child gymnasts for years.

Well, then it seems fairly obvious what Congress should do:

  • Hold a hearing to demand answers and accountability from the FBI and DOJ for failing these victims

But here’s what EARN IT does in that situation:

  • Amend Section 230, effectively delegating enforcement for child sexual abuse to states and victims themselves

As Riana summarizes:

No matter what the problem with online CSAM is, EARN IT isn’t going to fix it. It’s only going to make things worse, both for child victims and for everyone who uses the internet. The truth about EARN IT is that either there isn’t a serious noncompliance problem among providers that’s pervasive enough to merit a new law, but Congress just can’t resist using Section 230 as a political punching bag to harm all internet users in the name of sticking it to Big Tech… or there is a problem, but the DOJ is asleep at the wheel, and EARN IT is a concession that Congress no longer expects them to do their jobs.

Either option should be shameful and embarrassing for the bill’s supporters to admit. Instead, this horrible legislation, if it passes, will be hailed as a bipartisan victory that shows Congress can still come together across the aisle to get things done. Apparently, harming Americans’ rights online while making CSAM prosecutions harder is something both parties can agree on, even in an election year.

So, whatever problem the backers of EARN IT think they’re solving for, EARN IT doesn’t do it. That seems like it should be a big fucking deal. But, instead of responding to these points, the sponsors claim that people highlighting this “don’t care about CSAM.”



Comments on “Whatever Problem EARN IT Is Trying To Solve, It Doesn't”



ECA (profile) says:

But

Is there any effort to show or display data on WHAT they think the problem is?
Then another subject comes to mind: international data. Yep, you can submit tons of it, but it’s coming from other nations. It’s the accessibility to it.
Then we get to numbers and enforcement, and how many people can handle it, IF even 1 million of the complaints end up in the USA. And then you find out it’s a mother or father showing off old pics of their kids, naked near a plastic swimming pool.

That One Guy (profile) says:

The 'I know you are but what am I?' strategy I see

But, instead of responding to these points, the sponsors claim that people highlighting this "don’t care about CSAM."

Sure hope the ISS windows are heavily tinted, that level of projection has got to be blinding even up in orbit.

Any time a supporter of EARN IT tries to pull the ‘for the children!’/’You don’t care about the children!’ lie, articles like this need to be thrown in their face. For all their screaming about how anyone who isn’t fully on their side wants to see children exploited, they are showing that no one supports the exploitation of children more than a politician: of all the things they could do that might actually work, the only thing they can think of is shifting the blame and exploiting children to try to gut encryption.

Anonymous Coward says:

Re: Re:

Or NCMEC, hiring more investigators, etc.

Near as I see it, what EARN IT would do is two things:

  1. FURTHER overwhelm NCMEC (and mind you, according to Riana, they’re already overwhelmed with the number of reports they get right now and cannot process them all).
  2. Because platforms are fearful of liability, they’re more likely to just take down entire swathes of content on the off chance there might be CSAM (which is another can of worms), but this would cause less CSAM to be reported, because UGC is gonna dry up much like with FOSTA.

Anonymous Coward says:

And, as Riana noted in her article, this very same Senate Committee just recently heard about how the FBI actually knew about an actual serial child sex abuser named Larry Nassar, and turned a blind eye.

I actually don’t know much about this case. But, if we’re to assume the FBI occasionally makes mistakes and misses sex abusers, it’s a bit rich to hold platforms to a much higher standard than them. And mind you, that might be giving the FBI more benefit of the doubt than they deserve.

Anonymous Coward says:

Re: Re:

Separately, I wonder what the stats are on how much of the material reported is duplicate.

Most of it, as the basis for reporting is matching hashes of previously found material. If the same stuff did not keep on turning up time after time, automated tools would be much less useful.
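
To make that concrete, here’s a minimal sketch of that kind of known-hash matching (the hash list and file contents are made up; real scanning tools such as PhotoDNA use perceptual hashes that survive resizing and re-encoding, whereas the plain SHA-256 used here only catches exact copies):

    # Toy sketch of matching uploads against hashes of previously identified
    # material. Real systems use perceptual hashing; plain SHA-256 here is
    # purely illustrative and only matches byte-for-byte identical files.
    import hashlib

    # Hypothetical list of hashes of already-known material.
    KNOWN_HASHES = {
        hashlib.sha256(b"previously reported file").hexdigest(),
    }

    def is_known(file_bytes: bytes) -> bool:
        """True only if this exact file has been seen and hashed before."""
        return hashlib.sha256(file_bytes).hexdigest() in KNOWN_HASHES

    for upload in (b"previously reported file", b"never-seen-before file"):
        status = "match, would be reported" if is_known(upload) else "no match"
        print(upload, "->", status)

Which is the point: the same previously identified files keep matching and generating reports, while genuinely new material sails right past this kind of tool.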

Viseslav says:

I think you're missing the big picture.

You presume that the problem is not with Big Tech, but rather somewhere along the line, between law enforcement not doing enough and NCMEC not being sufficiently funded. However, it is important to note where these report numbers come from. IP addresses originating from the United States account for a mere 3% of all reports. In fact, just three countries (India, Pakistan, and the Philippines) account for one out of every four reports. Moreover, it is estimated that 83 percent of CSAM is hosted in Europe.

So on top of the bulk of reports, as well as the hosting of CSAM, coming from outside the U.S., Big Tech companies such as Facebook (which alone accounts for roughly 85 percent of all reports to NCMEC) are moving to encrypt Messenger. Yet you think the issue is with law enforcement not doing enough, or NCMEC not being sufficiently staffed.

The reality is that the problem is much, much, much bigger than an increase in funding or more aggressive policing. It’s unreasonable to expect American police to constantly land in other countries and arrest predators. That’s why Big Tech is crucial in this fight: they can provide the incriminating evidence to local authorities. By-default encryption will vaporize this evidence.

You might ask how. Well, first of all, virtually all CSAM is found not in the dark web, but in the surface web. This is in spite of Tor providing two decades’ worth of tremendous potential in concealing identities.

Mr. Masnick, the numbers are ugly. What was 3,000 reports a year in 1998 is now as high as 20 million reports (that include some 70 million images and videos) a year in 2020. You have companies such as Amazon and Microsoft refusing to scan cloud storage for CSAM, and Snap scanning photos but not videos for whatever arbitrary reason. Software like PhotoDNA is not only getting outdated, with Microsoft refusing to open source it so that developers around the world can improve its detection rate, but it also doesn’t work when faced with never-seen-before imagery or the concerning rise in livestreamed abuse. What underlies the success of Big Tech at reporting child porn is the fact that Big Tech has access to the material. By definition, such access is revoked with end-to-end encryption. Consequently, reports to NCMEC will decrease, and abusers will worry less about the consequences of sharing their crimes. And while there exist alternatives to tackle CSAM, such as stings, they are woefully ineffective, akin to shooting an arrow at a tsunami.

Saying encryption will not significantly increase these numbers is akin to stating that overusing antibiotics will make humans safer. There was a time when CSAM was hidden in the corners, printed in homemade magazines, and saved onto VHS tapes to be mailed. The Internet came around, and surely, the problem got much worse. Likewise, having opt-out encryption enabled for everyone will hide a greater amount of abuse and evidence from the eyes of law enforcement. That’s a natural consequence, and I don’t understand the criticism I receive when I point out that encryption is as morally agnostic as owning guns.

I’m not inherently against E2EE. I understand that a world without encryption is probably not a good world. However, what I constantly get from you, Mr. Masnick, and other proponents of widespread encryption is the thinking that encryption doesn’t come at a cost, that there will be alternative ways of catching criminals. But catching them isn’t magical. It requires evidence, and less evidence means fewer criminals caught. I’m pessimistic about the future. I think Facebook will encrypt Messenger next year, and CSAM will proliferate like never before. Proponents of universal E2EE will win. But the price is great.
