Senators' 'Myths & Facts' About EARN IT Is Mostly Myths, Not Facts

from the they're-lying-to-you dept

I already wrote a long post earlier about the very, very real problems with the EARN IT Act — namely that it would make the problem of child sexual abuse material significantly worse by repeating the failed FOSTA playbook, and that it would attack encryption by making it potential “evidence” in a case against a tech company for any CSAM on its site. But along with the bill, its sponsors, Senators Richard Blumenthal and Lindsey Graham, released a “Myth v. Fact” document to try to counter the criticisms of EARN IT. Unfortunately, the document presents an awful lot of “myths” as “facts.” And that’s a real problem.

The document starts out noting, correctly:

The reporting of CSAM and online child sexual exploitation provides law enforcement with vital information on active predators and ongoing cases of rape and abuse.

More reports, and more accurate information within those reports, means more children freed from one of the most horrific and life-changing crimes imaginable.

But it ignores that taking away Section 230 protections doesn’t magically make companies do more reporting. It does the opposite. Because now, making the effort to find and report CSAM actually puts a company at risk of greater liability under EARN IT. The bill literally creates less incentive for a website to build systems to find and report CSAM, because merely doing so gives it the knowledge (scienter) necessary under the law to face liability. The bill gets this all exactly backwards.

The document then has the following listed as a “myth”:

Given that some tech companies report significant amounts of CSAM to the National Center for Missing and Exploited Children (NCMEC) and provide technical resources to address child exploitation, the tech industry is doing enough to address this crime.

To address it, it lists these facts:

For tech companies that are already taking clear steps to report CSAM, little will change under this bill.

Except that’s not true at all. Because now, if those companies make any mistakes — which they will, because no one gets everything right — they face potentially crippling liability. The idea that no one will go after them because they report a lot of CSAM is completely divorced from reality; we see companies getting sued all the time in similar circumstances. Under FOSTA, we’re now seeing companies like Salesforce and Mailchimp being sued because other companies used their services, and then sex traffickers used those other companies’ services — and somehow that magically makes Salesforce and Mailchimp liable. The same thing would happen under EARN IT.

According to NCMEC’s 2020 statistics on reports of the online exploitation of children, while Facebook issued over 20 million reports that year, in contrast Amazon (which hosts a significant percentage of global commerce and web infrastructure) reported 2,235 cases.

Maybe that’s because Facebook is a user-generated content social media platform and Amazon… is not? I mean, I don’t even need to say this is “comparing apples to oranges” here because it’s “comparing Facebook to Amazon.” The two companies do very, very different things that are simply not comparable.

Of course, the underlying (and pretty fucking scary) suggestion here is that Amazon should be scanning every AWS instance for bad stuff, which raises really serious privacy concerns. It’s amazing that the very same Senators pushing this bill, who are now basically saying it should require websites to spy on everything, will turn around next week and argue that these companies are “collecting too much information” and are engaged in “surveillance capitalism.”

So which is it, Senators? Should Amazon be spying on everyone, or are they not spying hard enough?

There is a sustained problem of underreporting and neglect of legal obligations by some tech companies. During a Senate Judiciary Committee hearing on the EARN IT Act, NCMEC disclosed that it had reported nearly nine times more cases of CSAM material hosted on Amazon to Amazon, than Amazon had found itself, and that Amazon had not taken legally required action on those cases.

Again, this is taken incredibly out of context; put back into context, it means that Amazon isn’t spying on all of its customers’ data. That should be seen as a good thing. Note what this “fact” doesn’t say: when Amazon was alerted to CSAM by NCMEC, did it remove it? Did it add hashes to the NCMEC database? Because that’s what matters here. Otherwise, these Senators are just admitting that they want more surveillance by private companies and less privacy for the public.

Before introducing the EARN IT Act, a bipartisan group of Senators sent detailed questions to more than thirty of the most prominent tech companies. The responses showed that even startups and small firms were willing and able to build safety into their platform using automated tools. Meanwhile, some large companies like Amazon admitted that they were not even using common and free tools to automatically stop CSAM despite substantial and known abuse of their platforms by predators.

They’re really throwing Amazon under the bus here. But this “fact” again demonstrates that most internet companies that host user generated content are already doing what is appropriate and finding, reporting, and removing CSAM. The only example they have of a company that is not is Amazon, and that’s because Amazon is in a totally different business. They’re not a platform for user generated content, they’re just a giant computer for other services. Those other services, built on top of Amazon, can (and do!) scan their own systems for CSAM.

This whole “fact” list is basically a category error, in which they lump Amazon in with other companies because whoever wrote this can’t find any actual problem out there with actual social media companies.

It is clear that many tech companies will only take CSAM seriously when it becomes their financial interest to do so, and the way to make that a reality is by permitting survivors and state law enforcement to take the companies to court for their role in child exploitation and abuse.

Except this document’s own fact check said that every company they asked was doing what was necessary. And, it already is very much in every company’s “financial interest” to find, report, and remove CSAM because if you don’t you already face significant legal consequences, since hosting CSAM is already very, very much illegal.

The next “myth” listed is:

This bill opens up tech companies to new and unimaginable liability that necessitated CDA Section 230’s unqualified immunities two decades ago

Which is… absolutely true. And we don’t need to look any further than what happened with FOSTA to see that this is true. But the Senators deny it. Because they’re lying.

The EARN IT Act creates a targeted carve out for the specific, illegal act of possession or distribution of child sexual abuse material.

And FOSTA created “a targeted carve out for the specific, illegal act of human trafficking” but in practice has resulted in a series of totally frivolous lawsuits against ancillary services used by a company that was then used by sex traffickers.

Any tech company that is concerned that its services or applications could be used to distribute CSAM has plenty of tools and options available to prevent this crime without hindering their operations or creating significant costs.

The detection, prevention, and reporting of CSAM is one of the most easily addressed abuses and crimes in the digital era. There are readily accessible, and often free, software and cloud services, such as PhotoDNA, to automate the detection of known CSAM material and report it to NCMEC.

The naming of PhotoDNA is interesting here. It’s a Microsoft project (big tech!) that is very important in finding, reporting, and removing CSAM. But Microsoft actually limits who can use it, and I’ve heard of multiple websites that were not allowed to use PhotoDNA. I don’t think Techdirt would qualify to use PhotoDNA, for example. In the meantime, Cloudflare introduced its own tool, which I think came about because Microsoft made it difficult, if not impossible, for many websites to use PhotoDNA.

But the very fact that PhotoDNA and Cloudflare’s tool exist and are being used again suggests that “the problem” here doesn’t actually exist. As noted in the first post, we don’t see companies being sued for CSAM and using Section 230 as a defense, because that’s not the problem.

Also left out of the “fact” is the actual fact that PhotoDNA has very real limitations. That article, published a few months ago, notes that (1) Microsoft and NCMEC seem to go out of their way to avoid allowing researchers to study PhotoDNA, (2) contrary to Microsoft/NCMEC claims, the algorithm can be reversed (i.e., enabling users to recreate CSAM images from hashes!), (3) it is easily defeated with minor changes to images, and (4) it is subject to false positives. In other words, while PhotoDNA is an important tool for fighting CSAM, it has real problems, and mandating it (as this “fact” suggests is the goal of EARN IT) could create significant, and potentially dangerous, consequences.
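To see where that evasion/false-positive tradeoff comes from, it helps to be concrete about how perceptual hash matching works in general. PhotoDNA’s actual algorithm is proprietary and not publicly documented, so the following is only a toy “average hash” sketch of the general technique (every name and number here is illustrative, not PhotoDNA’s API): similar images produce hashes that differ in a few bits, and matching uses a Hamming-distance threshold rather than exact equality.

```python
# Toy perceptual hashing sketch. NOT PhotoDNA: its real algorithm is
# proprietary; this just illustrates the threshold-matching idea.

def average_hash(pixels):
    """Hash a grayscale image (list of rows of 0-255 ints): each bit
    records whether a pixel is brighter than the image's mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(h1, h2):
    """Count of differing bits between two hashes."""
    return sum(a != b for a, b in zip(h1, h2))

def matches(h1, h2, threshold=3):
    # The threshold is exactly the tradeoff the research criticism
    # points at: set it low and small edits evade detection; set it
    # high and unrelated images start to collide (false positives).
    return hamming(h1, h2) <= threshold

original = [[10, 10, 200, 200],
            [10, 10, 200, 200],
            [10, 10, 200, 200],
            [10, 10, 200, 200]]

# A lightly edited copy (one pixel brightened) still matches, because
# its hash differs from the original's by fewer bits than the threshold.
edited = [row[:] for row in original]
edited[0][0] = 60

print(matches(average_hash(original), average_hash(edited)))  # prints: True
```

The same structure explains both failure modes in the article: an adversary only needs to flip enough bits to cross the threshold, while two innocent images that happen to land within the threshold of each other become a false positive.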

The next “myth” is… just weird.

Requiring companies to be on the lookout for child abuse will harm startups and nascent businesses.

No one has made that argument. The actual argument is that adding very serious liability for anyone making any mistake as they’re on the lookout for child abuse will do tremendous harm to startups and nascent businesses.

No other type of business in the country is provided such blanket and unqualified immunity for sexual crimes against children.

Except… tech companies aren’t given a “blanket and unqualified immunity for sexual crimes against children.” This “fact” is just wrong. What Section 230 does is provide immunity for third-party speech — but not if federal crimes are involved, which is certainly the case with CSAM. The whole attempt to blame Section 230 here is just weird. And wrong.

Startups and small businesses have a critical role in the fight against online CSAM. Smaller social media sites and messaging applications, such as Kik Messenger, are routinely used by abusers. The EARN IT Act will ensure that abusers do not flock to small platforms to evade the protections and accountability put in place on larger platforms.

So, now we’re blaming Kik? Okay. Except the timing on this is interesting, as just a few days ago the DOJ literally announced that it had arrested a woman for distributing CSAM on Kik, showing again that when law enforcement actually bothers to do so, it can find and arrest those responsible.

Moreover, there are simple, readily accessible, and often free, software and cloud services, such as PhotoDNA, that can be used by any tech company to automate the detection of known CSAM material and report it to NCMEC.

Again, PhotoDNA involves a “qualification” process, and has significant problems. If the point of this bill is to force every website to use PhotoDNA, write that into the law and deal with the fact that a mandated filter raises other Constitutional concerns. Instead, these Senators are basically saying “every website must use PhotoDNA, but we can’t legally say that, so wink, wink.”

Indeed, it’s pretty funny that right after more or less admitting that they’re demanding mandatory filters, they claim this is the next “myth”:

The EARN IT Act violates the First Amendment.

The “fact” they used to reply to this kinda gives away the game:

Child sexual abuse is not protected speech. Possession of child pornography is a criminal violation and there is no defensible claim that the First Amendment protects child sexual abuse material.

That’s correct, but… misleading. No one is concerned about taking down CSAM (again, pretty much every major internet platform already does this as it’s already required by law). The concern is that by mandating filters that are not publicly reviewable, you end up taking down other speech. And that other speech may be protected. Again, look at the link above regarding research into PhotoDNA, which suggests that the “false positive” problem with PhotoDNA is very, very real.

And then we get to the encryption stuff with the next “myth”:

The EARN IT Act is simply an attempt to ban encryption.

Actually, it seems to be only partially an attempt to ban encryption. The Senators’ “facts” on this are, once again, the part that is actually mythical:

The EARN IT Act does not target, limit, or create liability for encryption or privacy services. In fact, in order to ensure the EARN IT Act would not be misconstrued as limiting encryption, specific protections were included in the bill to explicitly state that a court should not consider offering encryption or privacy services as an independent basis for legal liability.

Weasel words. Again, see what I wrote in the last post about the encryption section. It says it can’t be the “independent basis” for liability, but it explicitly states that the use of encryption can still be used as evidence against a website under this law. So it very much increases the legal liability for any website that uses encryption, because it will be used against them in court.

Stopping the abuse of children is not at odds with preserving online privacy. Some online platforms have been using automated tools to check images and videos against CSAM databases for more than a decade without endangering privacy or creating consumer concerns. As Facebook has testified to the Senate Judiciary Committee, tech companies can readily implement tools to detect child sexual abuse while offering strong encryption tools.

This is correct, but it does not address the point. Of course you can fight CSAM while preserving privacy, but this bill makes that much more difficult by adding liability risk for anyone who uses encryption (and anyone who tries to go above and beyond in fighting CSAM but has some slip through).

Then there’s a “myth” that’s actually a fact. Section 230 exempts federal crimes, and CSAM is a federal crime — and the real issue is that law enforcement tends not to spend much time and resources on fighting the actual creators and distributors of CSAM:

Since CDA 230 already exempts federal crimes, the solution to this problem is increasing resources for law enforcement and hiring more federal prosecutors.

The “facts” the Senators present in response are incredibly misleading.

We support increasing resources for law enforcement officials fighting sex crimes against children. But no amount of money can compensate for the disengagement of the online platforms actually hosting this material.

That second sentence is a non sequitur, since (again…) EARN IT doesn’t do anything to stop sites from hosting CSAM. It just opens them up to being sued for trying to stop it!

Hiring more federal investigators cannot replace having companies committed to the fight against child abuse, especially when it comes to monitoring the content posted on online platforms and checking closed groups for abuse.

Again, companies are committed to fighting child abuse, and EARN IT makes it more risky for them to “monitor” the content posted online!

By requiring that only the Department of Justice can bring criminal cases for child sexual exploitation crimes, CDA Section 230 drastically limits the number and types of cases that are brought.

No, it avoids bogus, wasteful lawsuits like the ones that were brought against Salesforce and Mailchimp under FOSTA.

States and survivors have a well-established role in holding offenders accountable, especially with respect to child sexual exploitation, for a reason: under-enforcement of child protection laws fails victims and fosters more abuse.

Yes, and they can already hold “offenders accountable” for sexual exploitation. The problem is that this bill distracts from going after actual offenders, and instead blames random internet services that the offenders used for not magically knowing they were being used by offenders.

The EARN Act would ensure that there is more than one cop on the beat by enabling states and civil litigants to seek justice against those who enable child sexual exploitation.

No. It would allow anyone to go after just about any website for incidental usage by an actual offender, rather than going after the offenders themselves.

This bill is bad and dangerous. It will make the very real problem of CSAM worse and undermine encryption at the same time. This “myth v. fact” sheet reverses the myths and facts in the service of getting bogus “for the children” headlines for Senators desperate to look like they’re doing something about a real problem, while they’re really moving to make the problem much, much worse.

Companies: amazon, facebook, kik, microsoft


Comments on “Senators' 'Myths & Facts' About EARN IT Is Mostly Myths, Not Facts”

16 Comments
This comment has been deemed insightful by the community.
PaulT (profile) says:

Re: Re:

Sadly, it doesn’t really matter. Whether or not it’s because of sheer incompetence in understanding the data in front of them, or it’s a misleading stat presented to convince the clueless that they have to "do something", the end result is the same. The fix is the same either way – change the political system so that there are fewer self-serving con artists and outright morons able to grandstand like this – but it’s not one likely to appear in the near future.

This comment has been deemed insightful by the community.
Anonymous Coward says:

As Facebook has testified to the Senate Judiciary Committee, tech companies can readily implement tools to detect child sexual abuse while offering strong encryption tools.

Only if they have access to file contents on their servers, or on the user systems. That is only if they use encryption between users and their servers, or place spyware on the user systems.

That loud humming is George Orwell spinning in his grave.

This comment has been deemed insightful by the community.
That One Guy (profile) says:

'We wouldn't care so why would they?'

Time and time again reading this article I kept coming back to the thought of ‘we expect in others what we would do ourselves’, and how again and again they effectively if not outright made the argument that the only way a platform would care about CSAM is if there were penalties involved for not getting rid of it.

That… says a lot about the politicians involved here, I’d say, and what it says is anything but flattering.

On a more general note this looks to serve as another great example of ‘If the only way you can defend your position is with dishonesty that’s a sign that even you know it’s indefensible.’

Anonymous Coward says:

Re: 'We wouldn't care so why would they?'

And they continue to conveniently ignore how we have all the information on the record from the time, well documented and evidenced, of how prior to 230, all the various companies involved in the internet of the time ACTIVELY BRUSHED ANY AND ALL EVIDENCE OF THIS STUFF GOING ON WITHIN THEIR ECOSYSTEMS UNDER THE RUG SO AS TO AVOID ANY KIND OF LIABILITY. Yet somehow assuming going back to that status quo will somehow have a different outcome this time around?

Anonymous Coward says:

Re: Re: 'We wouldn't care so why would they?'

Yet somehow assuming going back to that status quo will somehow have a different outcome this time around?

The goal is a clear attack against encryption, and freedom of speech. While also providing a nice opportunity to throw Amazon under the bus for easy political points during an election year.

The assumption you’ve written however, is the one they wanted you to come to and repeat for them.

Anonymous Coward says:

Re: Re: Re: 'We wouldn't care so why would they?'

I mean, I'm not sure I'd say they WANT news outlets and the general public all united in pointing out how not having 230 incentivises criminal activity by service providers, as is proven by the lengths we know for a fact they went to when it wasn't a thing. Since that kind of exposes the core lie at the heart of their duplicitous attempts at justification for their stance, and leads the average reader to the question of what they get out of it that they are happy to allow an end to policing of such content by providers as a byproduct.

That Anonymous Coward (profile) says:

Still waiting for them to clutch their pearls and deal with the fact that the USG ran the largest CSAM (fucking stupid labels) website on the planet, allowed new content to be uploaded & traded, and I’m still shocked how any court could overlook the idea that ‘we let them keep abusing kids and uploading it so we could catch more of them’ isn’t how things are supposed to work.

They authorized the funding used to run the site & somehow the law doesn’t allow us to go after the money men who run these sites and profit off of the abuse of children?

I like your imaginary solution to the imaginary problem, I think you should be much more worried about the actual problem that y’all were responsible for the largest CSAM website & never paid the piper.
