People Are Lying To The Media About EARN IT; The Media Has To Stop Parroting Their False Claims
from the that's-not-how-any-of-this-works dept
Update: After this post went up, Tech Review appears to have done a major edit to that article, and added a correction about the completely false claim regarding Section 230 protecting CSAM. The article still has problems, but is no longer quite as egregiously wrong. The post below is about the original article.
MIT’s Tech Review has an article this week which is presented as a news article claiming (questionably) that “the US now hosts more child sexual abuse material (CSAM) online than any other country,” and claiming that unless we pass the EARN IT Act, “the problem will only grow.” The problem is that the article is rife with false or misleading claims that the reporter apparently didn’t fact check.
The biggest problem with the article is that it blames this turn of events on two things: a bunch of “prolific CSAM sites” moving their servers from the Netherlands to the US and then… Section 230.
The second is that internet platforms in the US are protected by Section 230 of the Communications Decency Act, which means they can’t be sued if a user uploads something illegal. While there are exceptions for copyright violations and material related to adult sex work, there is no exception for CSAM.
So, this is the claim that many people make, but a reporter in a respectable publication should not be making it, because it’s just flat out wrong. Incredibly, the reporter points out that there are “exceptions” for copyright violations, but she fails to note that the exception that she names, 230(e)(2), comes after another exception, 230(e)(1), which literally says:
(1) No effect on criminal law
Nothing in this section shall be construed to impair the enforcement of section 223 or 231 of this title, chapter 71 (relating to obscenity) or 110 (relating to sexual exploitation of children) of title 18, or any other Federal criminal statute.
It’s almost as if the reporter just accepted the claim that there was no exception for CSAM and didn’t bother to, you know, look at the actual law. Child sexual abuse material violates federal law. Section 230 directly exempts all federal law. The idea that 230 does not have an exception for CSAM is just flat out wrong. It’s not a question of interpretation. It’s a question of facts and MIT’s Tech Review is lying to you.
The article then gets worse.
This gives tech companies little legal incentive to invest time, money, and resources in keeping it off their platforms, says Hany Farid, a professor of computer science at the University of California, Berkeley, and the co-developer of PhotoDNA, a technology that turns images into unique digital signatures, known as hashes, to identify CSAM.
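(For context on what that kind of hash matching actually involves: PhotoDNA itself is proprietary and its exact algorithm is not public, so the sketch below is only a rough illustration of the general perceptual-hashing approach, using the open-source imagehash and Pillow libraries as stand-ins. The hash value, threshold, and file name are made up for illustration; none of this is PhotoDNA’s or NCMEC’s actual interface.)

# Rough sketch of perceptual-hash matching. Not PhotoDNA: imagehash is a
# generic open-source stand-in, and the hash value and threshold are made up.
import imagehash
from PIL import Image

# Hypothetical hash list of previously identified images, as a clearinghouse
# might distribute to platforms. This value is fabricated for illustration.
KNOWN_HASHES = {imagehash.hex_to_hash("8f373714acfcf4d0")}

# Illustrative Hamming-distance threshold; real systems tune this to balance
# false positives against catching slightly altered copies.
MATCH_THRESHOLD = 5

def matches_known_image(path: str) -> bool:
    """Return True if the image's perceptual hash is near a known hash."""
    candidate = imagehash.phash(Image.open(path))
    return any(candidate - known <= MATCH_THRESHOLD for known in KNOWN_HASHES)

# Example (hypothetical upload): matches_known_image("upload.jpg")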
People keep saying that companies have “little legal incentive” to deal with CSAM as if 18 USC 2258A doesn’t exist. But it does. And that law says pretty damn clearly that websites need to report CSAM on their platforms. If a website fails to do so, it can be fined $150k for its first violation and up to $300k for each subsequent violation.
I’m not sure how anyone can look at that and say that there is no legal incentive to keep CSAM off their platform.
And, just to make an even clearer point, you will be hard pressed to find any legitimate internet service that wants that content on its website for fairly obvious reasons. One, it’s reprehensible content. Two, it’s a good way to have your entire service shut down when the DOJ goes after you. Three, it’s not good for any kind of regular business (especially ad-based) if you’re “the platform that allows” that kind of reprehensible content.
To claim that there is no incentive, legal or otherwise, is just flat out wrong.
Later in the article, the reporter does mention that companies must report the content, but then argues this is different because “they’re not required to actively search for it.” And this gets to the heart of the debate about EARN IT. The supporters of EARN IT insist that it’s not a “surveillance” bill, but when you drill down into the details, they admit that what they’re really mad about is that a few companies refuse to install these kinds of filtering technologies. Except, as we’ve detailed (and as the article does not even bother to contend with), if the US government passes a law that mandates filters, it creates a massive 4th Amendment problem that will make it harder to actually go after CSAM purveyors legally: under the 4th Amendment the government can’t mandate a general search like this, and if it does, those prosecuted will be able to suppress the evidence.
Also, we’ve gone through this over and over again. If the real problem is the failure of companies to find and report CSAM, then the real question is why the DOJ hasn’t done anything about it. It already has the tools to bring prosecutions under both Section 230 (which exempts federal criminal law, including CSAM) and 2258A. But it has not. And EARN IT does nothing to better fund the DOJ, or even to ask why the DOJ never actually brings any of these prosecutions.
Incredibly, some of the “experts,” all of whom are among the people who will benefit from EARN IT passing (as the reporter apparently didn’t bother to even ask anyone else), kind of make this point clear, without even realizing it:
Besides “bad press” there isn’t much punishment for platforms that fail to remove CSAM quickly, says Lloyd Richardson, director of technology at the Canadian Centre for Child Protection. “I think you’d be hard pressed to find a country that’s levied a fine against an electronic service provider for slow or non-removal of CSAM,” he says.
Well, isn’t that the issue then? If the problem is that countries aren’t enforcing the law, shouldn’t we be asking why and how to get them to enforce the law? Instead, they want this new law, EARN IT, that doesn’t do anything to actually increase such enforcement, but rather will open up lots of websites to totally frivolous lawsuits if they dare do something like offer encrypted messaging to end users.
Incredibly, later in the article, the reporter admits that (as mentioned at the beginning of the article) the reason so many websites hosting this kind of abusive material moved out of the Netherlands was… because the government finally got serious about enforcing the laws it already had. But then the article immediately says that, since the content just moved to the US, that wasn’t really effective, and that “the solution, child protection experts argue, will come in the form of legislation.”
But, again, this is already illegal. We already have laws. The issue is not legislation. The issue is enforcement.
Also, finally at the end, the reporter mentions that “privacy and human rights advocates” don’t like EARN IT, but misrepresents their actual arguments, framing the debate as a false dichotomy in which tech companies are “prioritizing the privacy of those distributing CSAM on their platforms over the safety of those victimized by it.” That’s just rage-inducingly wrong.
Companies are rightly prioritizing encryption to protect the privacy of everyone, and encryption is especially important to marginalized and at-risk people who need to be able to reach out for help in a way that is not compromised. And, again, any major internet company already takes this stuff extremely seriously, as it has to under existing law.
Also, as mentioned earlier, the article never once mentions the 4th Amendment — and with it the fact that by forcing websites to scan, it actually will make it much, much harder to stop CSAM. Experts have explained this. Why didn’t the reporter speak to any actual experts?
The whole article repeatedly conflates the sketchy, fly-by-night, dark web purveyors with the big internet companies. EARN IT isn’t going to be used against those dark web forums. Just like FOSTA, it’s going to be used against random third parties who were incidentally used by some of those sketchy companies. We know this. We’ve seen it. Mailchimp and Salesforce have both been sued under FOSTA because some people tangentially associated with sex trafficking also used those services.
And with EARN IT, anyone who offers encryption is going to get hit with those kinds of lawsuits as well.
An honest account of EARN IT and what it does would have (1) not lied about what Section 230 does and does not protect, (2) not misrepresented the state of the law for websites in the US today, (3) not quoted only people who are heavily involved in the fight for EARN IT, (4) not misrepresented the warnings of people highlighting EARN IT’s many problems, (5) not left out that the real problem is the DOJ’s lack of will to actually enforce existing law, (6) been willing to discuss the actual threats of undermining encryption, (7) been willing to discuss the actual problems of demanding universal surveillance/upload filters, and (8) not let someone get away with a bogus quote falsely claiming that companies care more about the privacy of CSAM purveyors than about stopping CSAM. That last one is really infuriating, because there are many really good people trying to figure out how these companies can stop the spread of CSAM, and articles like this, full of lies and nonsense, demean all the work they’ve been putting in.
MIT’s Tech Review should know better, and it shouldn’t publish garbage like this.
Filed Under: csam, earn it, incentives, photodna, scanning, section 230, surveillance
Companies: tech review
Comments on “People Are Lying To The Media About EARN IT; The Media Has To Stop Parroting Their False Claims”
In that case,
That MIT Tech Review article is not a news article, but a piece of propaganda.
Re:
All we need to know is who made that hatchet job and, more importantly, who paid for it.
Dear MIT’s Tech Review,
Some asshole is putting your name on PR press releases and pretending they are “news” stories without doing any research.
You might wanna look into that.
KTHKSBAI
TAC
Re:
MIT’s Tech Review has been terrible for well over a decade. This type of totally wrong, fact-free trash is the norm rather than the exception.
I’ve always assumed based on the quality of writing that literally any MIT student can write whatever they want for it, without any editorial oversight.
Re: Re:
Yes, it’s the norm.
No, it’s not a lack of control over who writes. They’re a propaganda outlet.
When you’re just repeating the statements given to you without question you’re not a reporter, you’re just a PR stooge that isn’t getting paid by the one using you as a mouthpiece.
Re:
Going by his previous statements, (un)Chozen knows all about being used as a ‘mouthpiece’. ;D
Re:
You presume there’s no payment.
Re: Re:
You presume that there is payment. It’s more likely that some government stooge contacted the MIT press with an ‘exclusive story’ that they’ve been touting for years. Of course no money would change hands.
Re: Re: Re: Uh, no
Your accusation presumes something not expressed.
When applying Occam’s Razor, it’s necessary to understand the range of possible explanations.
Many things motivate people. A good scoop, a need to submit word count, pride, ego, access, gullibility, corrupt influence, …
MIT’s Tech Review has an article this week which is presented as a news article claiming (questionably) that “the US now hosts more child sexual abuse material (CSAM) online than any other country,” and claiming that unless we pass the EARN IT Act, “the problem will only grow.”
Now you know why I come here to stay updated. I might not always agree with the article authors’ conclusions, but at least nothing they write is so egregiously factually inaccurate (if anything is, it’s minor details, and it doesn’t happen often).
It’s not surveillance. It’s proactive review of otherwise unmonitored activity.
Re: By that definition ...
then a camera aimed at your bedroom window isn’t surveillance either, it’s just proactive review of what wasn’t being filmed before. They will only arrest you if you are caught sleeping with hookers, the wrong gender, young children, or the occasional farm animal, but if you aren’t breaking the law, why worry, right?
Re: Re:
Whenever I fear my faith in people is insufficient I can always read the comments.
Re:
Will you be redefining the word ‘surveillance’ when the NSA are pro-actively reviewing your otherwise unmonitored bathroom when you’re on the john, hmm?
Re: Re:
Well they already monitor the sewers so …
Re:
Is that supposed to be sarcasm? Because it’s not clear that it is, and it sounds like you don’t understand the problems that are laid out in the article. Trolls are always lurking so forgive me if I misunderstood you.
Re:
“It’s not surveillance. It’s proactive review of otherwise unmonitored activity.”
I wish I could call this “sarcasm” but this is, in fact, one of the authoritarian talking points as to why it should be considered offensive if any communication was unmonitored by “Authorities”.
So much for “land of the free” and the 1st amendment.
Re: Re:
It’s only the “Land of the Free” if you’re white. If you’re a person of color, especially Indian, then you’re a potential terrorist and you will be proactively monitored until we decide there’s nothing suspicious about you. If you’re a white person caught up in our proactive monitoring, then that’s collateral damage and we urge you to make a complaint that will not only be dismissed by us, but also dismissed in court if you choose to sue. We are the Government; you are here to respect us.
Outsourced Enforcement
We’ve seen how ineffective the “war on drugs” has been over the past few decades. I think the bureaucrats realize it’s not looking good for them. Now, they’re considering the pursuit of a digitally transferred item with perhaps no money trail, and they understand that they’re not going to be up to the task. The point of this legislation is to fob the problem off onto someone else.
pretty sure the claim that the US is a leading supplier of CSAM is probably true. We know that the FBI willingly distributed CSAM in the playpen cases. Now there’s rumours that they’re running another CSAM distribution operation (with at least one operation out of Atlanta), hoping (like ATF did with their Fast+Furious operation) that they can track everywhere the stuff they’re sending out goes to, and can run it down later. We all know that’s not going to be the case.
Re:
Well, we have to get the perverts somehow, even if that means violating the CPPA to do it.
Re:
So true… but for all the wrong reasons.
Re:
So basically, anyone who downloads some of this child porn isn’t committing an offence because the US Government distributed it? You know, like when people downloading certain files weren’t committing copyright infringement because an authorised agent of the copyright holders put the files out there?
Re: really?
Because I am 100% positive that it might not be.
So they may as well admit...
That they not only think the DOJ is ineffective at prosecuting CSAM, but don’t expect it to do its damn job either, preferring to outsource the job of law enforcement to companies.
Just admit it.
EARN IT doesn’t do a damn thing to improve prosecutions involving CSAM, would cause NCMEC to be even more overwhelmed, and is blatantly unconstitutional on top of it.
Re: 'Look, we can't be bothered so make them do it.'
That’s one of the more galling counters to the argument that they need to gut 230/encryption and dump everything on the platforms, as it’s basically admitting that government agencies can’t be bothered to do the job themselves, and that for all the ‘concern’ presented, the most effort they’re willing to put forth is ‘make it someone else’s problem’.
In my opinion, what these people really have a problem with isn’t Section 230, but the First and Fourth Amendments, which is why they keep trying to attack them by proxy.
https://www.vice.com/en/article/mb8mev/sad-panda-shut-down-hentai Attacking Section 230 is much more likely to have effects like this, which is what happened in the Netherlands when they tried to strip hosts of their liability protections.
As for the really bad crooks: ugh, alright, there are things we can do to reduce that, but we really can’t allow perfect to be the enemy of good here.
We’re never going to realistically remove every bit of ‘CSAM’ from the Internet, and unfortunately, C3P is one of those loud lobbies shouting it may be possible, which I think provides people with false hope.
Some, like Microsoft and their affiliates such as Hany Farid, are becoming rather tiring, as they keep using this situation to market half-baked technical “solutions”.
But, that runs a serious risk of sweeping up vast swathes of perfectly legal speech, just because someone might deem it “harmful”, and it violates the First Amendment.
Instead of tying the hands of providers with Fosta, or Earn It (and forcing them to delete “hentai” or “prostitution ads”), it is better to work with providers to practically accomplish harm reduction.
https://en.wikipedia.org/wiki/Zero-risk_bias This is another example of zero risk bias.
https://en.wikipedia.org/wiki/Ashcroft_v._Free_Speech_Coalition Even looking past this, which puts any attempt to censor “hentai” on pretty thin ice (under the First Amendment), it looks as if the Netherlands law ended up taking down content which wouldn’t even fall under that.
Chilling effects are real, and laws like this are going to have an impact on speech that goes far beyond their intended targets.
Re:
C3P could also be an acronym for the Canadian Centre for Child Pornography. Just sayin’.
Ok
Who is going to make money on this?
Who is going to require that Encryption be Broken to certify things are CLEAN?
WHO is going to Blackmail Every site on the net? (MP3 Bee used to be very interesting, as they would Hide Data on servers all over the place, without anyone knowing it was there.)
The old concept is still there: you Can’t prove there is NO PORN on your computer. NO ONE can as long as Bots and Scripts are allowed from sites; Anyone can ship you Stuff in the background.
All this is going to FORCE is LEGAL names and info on sites, and TO CLOSE the PORN DOORS to free-to-watch porn.
The truth is SIMPLE. There are places it’s LEGAL or a GRAY AREA. And trying to create an International LAW ISN’T GOING TO WORK.
I’m surprised the MIT Review would not understand how important Section 230 and user message encryption are to protecting free speech and user privacy, and would instead just parrot propaganda from people who want to push a bill that would reduce users’ freedom to communicate and simply place extra costs and filters on all websites that host user uploads, like YouTube, Twitch, etc.
Re:
Possibly gratitude towards the US Government for giving it land that was itself stolen from the native Americans of the region.
Re:
Seems like there’s some presumption that “MIT Review” is some kind of honest broker. That has not been my experience.
Bet we’re hearing about EARN IT again because of Musk. They get their precious censor bill, use it against Twitter’s porn posts (removing a core reason some use Twitter, turning users against Musk), and strip Twitter’s CDA 230 protections, making Musk liable for conservative speech, which will force him to censor opinions.
Re:
How likely is the bill to pass?
Trying to prove your point is always easier if you lie… helps with making headlines, too.
Re:
Trying to prove your point is always easier if you lie… helps with making comments, too.
FTFY. YW.
Has anyone sent a letter to the editor of MIT Review? If not, please do so.
Re:
I tried to figure out how, couldn’t find an address.
Re: Re:
You can’t have looked hard enough, then.
MIT Review feedback form
MIT Review feedback email address: feedback@technologyreview.com
All I did was search ‘MIT Review’, then follow the appropriate links.
Re: Re: Repost because link got broken.
You can’t have looked hard enough, then.
MIT Review feedback form
MIT Review feedback email address
All I did was search ‘MIT Review’, then follow the appropriate links.
Follow the money
A cynical quote from someone who stands to make lotsa dough if scanning becomes mandatory?
By the way, anyone looked more closely at the “reporter” to see who they’re getting their feed from?
My view is that adult communities (the actual people C3P disproportionately impacts, as it invites trolls and other troublemakers to go targeting them), providers, and governments should work together to eliminate any CSAM that emerges (the Canadians have a very creative definition of “CSAM” which even veers into thought crime).
But ultimately, the most stereotypical places are not going to be where this content mainly appears, and you’ll have to remember that the real criminals can move far more readily than legitimate websites can.
I am sure they are used to losing their host at a moment’s notice and having to move. Still, providers can work to eliminate them. But, it’s not realistic to wipe them off the face of the earth.
In short it's all about the money...
With the US being so corrupted at this point by money in politics, any semblance of objective reality has been replaced with pure paid propaganda. That would be bad enough on its own, but the worst part is that the media is not required to do any due diligence, nor are there consequences for the shills that peddle it.
With so many “experts” happily selling out to dirty politicians who want to turn their false narratives into equally dishonest legislative outcomes, it’s a small miracle that the US has not descended into complete chaos yet. We are strong as a nation but not impervious to succumbing to the massive rot we are facing from within.
I’m having a hard time remembering the last honest piece of legislation advanced in the US Senate or House that wasn’t plagued by misleading or outright false names, surrounded by dishonest scaremongering propaganda, or gutted and corrupted by malicious last-minute additions made without debate. At this point I don’t think there has been a single one in my entire life, and I was born in the 70s… That is unsustainable.