Elon’s ‘Zero Tolerance’ Policy On CSAM Apparently Does Not Apply To Conspiracy Theorist Accounts He Likes
from the not-how-it-works dept
You may recall that early on in Elon’s ownership of Twitter, he insisted that “removing child exploitation is priority #1” while exhorting his supporters to “reply in the comments” if they saw any.

Leaving aside that this is a ridiculously terrible process for having people report potential CSAM (Child Sexual Abuse Material) or, as some people prefer, CSEM (with the E standing for “exploitation”), there was little to no evidence of this actually being put into practice. Most of the people (one person told me everyone) who worked on the CSAM team were let go or left. Ella Irwin, who headed up trust & safety until she resigned two months ago (as far as I can tell, no replacement has been named), made a bunch of statements about how the company was treating CSAM, but there was almost no evidence backing that up.
There were multiple reports of the CSAM mitigation process falling apart. There were reports of CSAM on the platform remaining up for months. Perhaps even worse (and risking serious legal consequences), the company claimed it had suspended 400k accounts, but reported only 8k to law enforcement, despite such reports being required by law. Oh, and apparently Twitter’s implementation of PhotoDNA broke at some point, which is again incredibly serious, as PhotoDNA (for all its problems) remains a key tool for large sites in fighting known CSAM.
And yet the company still claims (on a Twitter-branded page, because apparently no one actually planned for the “X” transition) that it has a “zero tolerance” policy for CSAM.

The key parts of that page say both “We have a zero-tolerance child sexual exploitation policy on Twitter” and “Regardless of the intent, viewing, sharing, or linking to child sexual exploitation material contributes to the re-victimization of the depicted children.”
Anyway, that all leads up to the following. One of the small group of vocal and popular utter nonsense peddlers on the site, dom_lucre, had his account suspended. A bunch of other nonsense peddlers started wringing their hands about this, fearing that Musk was going soft and was now going to start banning “conservative” accounts. In response, Elon just came out and said that the account had posted CSAM, that only Twitter “CSE” staff had seen it, and that after removing the tweets in question, it had reinstated that guy’s account.

It’s worth noting that this person was among the hand-picked accounts who received money during Elon’s recent pay-for-stanning rollout.
Almost everything about this statement is problematic; it’s the kind of statement that would give any lawyer a heart attack if Elon were their client. First off, blaming Twitter’s legacy code is getting old and less and less believable each time he does it. He could just say “we fired everyone who understood how stuff worked,” but he can’t quite get there.
Second, posting “the reason” for a suspension is, as in so many cases having to do with trust & safety, trickier and involves more nuance than Elon would ever think through. Just to scratch the surface, sometimes telling users why they were suspended can create more problems, as users try to “litigate” their suspension. It can also alert abusive users to who may have reported them, leading to further abuse. Posting the reason publicly can lead to even more issues, including the potential risk of defamation claims.
But, even more importantly, it’s not a zero tolerance policy if you reinstate the account. It really seems like an “Elon’s inner circle tolerance policy.”
The claim that the only people who saw the images were the CSE team seems… unlikely. Internet sleuths have sniffed out a bunch of replies to his now deleted post (which was up for four days on an account with hundreds of thousands of followers), suggesting that the content was very much seen.

Also, there are big questions about what process Twitter followed here, since deleting the content, telling the world about who was suspended for what, and then reinstating the account are not what one would consider normal. Did Twitter send the content to NCMEC? Did it report it to any other law enforcement? These seem like pretty big questions.
On top of that, viewing that content on Twitter itself could potentially expose users to criminal liability. This whole thing is a huge mess, with a guy in charge who seems to understand literally none of this.
He’s now making Twitter a massive risk to use. At a time when the company is begging advertisers to put their ads on the site, I can’t see how Elon choosing to reinstate someone who posted CSAM, which was left on the site for days, is going to win them back.
Filed Under: content moderation, csam, dom lucre, elon musk, zero tolerance
Companies: ncmec, twitter, x

Comments on “Elon’s ‘Zero Tolerance’ Policy On CSAM Apparently Does Not Apply To Conspiracy Theorist Accounts He Likes”
The three things that increased on twitter post-Musk are CSAM, harassment, and fraud.
Anyone who says twitter is better now, does so only because they enjoy being a participant in one or more of the above.
Re:
You forgot “hate speech”.
Re: Re:
Also ads to tweets ratio, glitches, unwanted UI additions, limitations to non-paying users and authoritarian censorship.
This comment has been flagged by the community.
Re: Re:
Except it didn’t.
I mean, don’t get me wrong, I think what you call “hate speech” for the most part should be allowed; it’s part of open discussion, so by necessity it would go up, if only a little.
But there is vanishingly little evidence that it has.
Re: Re: Re:
Does senpai know your name yet?
Re: Re: Re:
Oh, Matty. Your streak of being wrong about pretty much everything you post continues…
https://phys.org/news/2023-04-analysis-speech-significantly-twitter.html
This comment has been flagged by the community.
Re: Re: Re:2
Oh Strawby, I said evidence. That “paper” (term used very loosely) offers nothing of the kind.
It doesn’t list the hate speech searched for (hilariously, after a trigger warning), and admits it used a sentiment detection algo (it’s not an API), which is famously and hilariously inaccurate.
There’s no link to the paper the article talks about; the only link is to the conference where this paper was supposedly presented.
There is a lot of bunk “social science” (almost never repeatable) talking about “hate speech.” How did you find a paraphrase of the worst one without any citations? Did you even read it? That’s amazing.
Re: Re: Re:3
It took me all of 3 minutes to find and download the paper, and follow the GitHub link listed in the paper. How inept ARE you, Matthew?
https://github.com/dan-hickey1/musk-hate-lexicon
Re: Re: Re:3
It’s more than you’ve ever provided for any of your claims.
The study has a link to a Github that contains the hate keywords they used.
Firstly, it’s literally called “Perspective API”. Let’s assume that the people who developed the tool and named it know more about it than you do.
Secondly, nice cherrypicking. Putting it through the API was the last step of their filtering process, but they also had actual people look through the hate keywords to score them.
I know you’re used to having your thoughts and opinions spoonfed to you by crackpots and conspiracy nuts, but when most people are given a title of a study, they know how to use Google to find the actual study.
Oh, in that case, here’s some more evidence.
This comment has been flagged by the community.
Re: Re: Re:4
Dude, seriously, you have no idea what evidence is, nor does the stupid english major writing whatever politically motivated piece.
And again, I actually want more “hate speech” as you SJW dumbasses define it.
Re: Re: Re:5
Cool story bro. Just one thing.
One can’t help but notice you brought a sloppy argument to an evidence fight.
Re: Re: Re:5
I literally spoon-fed it to you, Matthew. Stop being such a child.
https://github.com/dan-hickey1/musk-hate-lexicon
Re: Re: Re:5
You don’t say?
Does your alleged wife know you’ve been hatefucking other men in your headspace, Matty?
Re: Re: Re:5
that’s because you have been a privileged white male your whole life and have never been the recipient of the constant barrage of hate speech that most minorities in this country have to endure for their entire lives, especially when it comes from people like you.
You just want to call people racial slurs and not have any consequences of doing so. Where the rest of the civilized population realize that there is no point in discussing any possible good things about Hitler.
Re: Re: Re:6
Where the rest of the civilized population realize that there is no point in discussing any possible good things about Hitler.
I mean he did do one good thing, the rest of his legacy might have been all sorts of horrible and monstrous but he did kill Hitler.
Re: Re: Re:5
LOL
Re: Re: Re:3
They call you kfc bro?
’Cause every time you’re wrong you Double Down!
Re: Re: Re:2
Can we ask scientists to research if it’s possible to use this as an infinite source of energy?
Re: Re: Re:
“As a straight white man who’s never experienced actual hate speech directed at me I have no issue with other people experiencing hate speech because fuck them.”
Re: Re:
Consider it under the umbrella of ‘harassment’.
Speaking of begging advertisers to return, according to WSJ, the company “…warned advertisers that beginning Aug.7, brands’ accounts will lose their verification – a golden check mark that indicates their account truly represent the brand – if they haven’t spent at least $1000 on ads in the previous 30 days, or $6000 on ads in the previous 180 days…”
I’m trying to come up with a word for such an action, but for some reason can’t quite choose one from the many that spring to mind.
Re:
“Extortion” seems like the best fit here.
Re: Re:
Trying to overcharge your customers while showcasing an imploding product is not really extortion but delusion. It’s extortion of the “if you don’t give me your car, I’ll stop sleeping with your wife” kind.
It recalls the “that’s not a knife” scene from Crocodile Dundee.
Re: Re: Re:
It’s a shakedown. It’s almost identical to the whole “Nice store you got here, shame if anything happened to it,” method of shaking store owners down to make them pay for protection. (Protection from the people they’re paying the protection money to.) This is, “Nice brand name you got there, shame if someone else could pretend to be you and tweet horrible stuff, ruining your brand’s reputation…”
If they lose their gold check-marks, someone could change their name to match the brand, pay for Twitter Blue and a lot of people would think the blue check-mark meant they were really the brand tweeting. Same as what happened during the initial roll-out of Twitter Blue, only worse because the real brand wouldn’t have a check-mark this time.
Re: Re: Re:2
But will it be considered RICO?
Probably not, but one can hope…
Re: Re:
Some of those words are best not used in decent company; others are full of sarcasm. I mean, those advertisers are getting the same treatment the regular users did but a few months ago. And they were warned by the “chief twit” (he can rebrand all he wants, but he called himself that and it stuck) back when the current money-grabbing, misnamed, idiotic “verification” system was announced. And they saw the first attempt at rolling it out. Should take a hint when given one… or several!
Also, I wonder what the current theoretical CEO with her ad-world background thinks of this move. I’m not sure that “extortion” is a good B2B strategy.
Re: Re: Re:
Since he wants to rename Twitter to X, I say we call him the Chief X-hole.
Re: Re: Re:2
Alternatively, ‘Chief Xit’.
The main defense Elon and Dom’s fans are using right now is that he didn’t post it for enjoyment, but to condemn it.
Which, even taking that at face value, changes absolutely nothing about the legality or ethics here. If there were a loophole that it’s okay to post pictures of children being abused as long as you caption them “what NOT to do,” that would go very badly, very fast.
Re: Goat Fornicators
Ah yes, Popehat’s law of goat… er, fornication.
This comment has been flagged by the community.
Another day, another article by Democratic Party operative MM in which he indulges his obsessive hatred of Elon Musk.
In a few hours, we’ll next have another post by TC alleging that law enforcement is evil and that most Americans don’t actually like their police stomping on the heads and alleged-rights of the scum of our society.
After that, probably something about how intellectual property rights are a bad thing and companies and individuals that try to defend them are working against society (unlike radical gender ideologues, but I digress).
Rinse, repeat.
Re:
Is this possibly due to the fact that Elon is a very hate-able guy?
Or that law enforcement is evil, or .. is it possible that people abuse the laws .. like copyright? Say it isn’t so!
This comment has been flagged by the community.
Re: Re:
Re: Re: Re:
yer serious … lol
Re: Re: Re:
Oh, a Musk-stan that is salty, how stupidly unoriginal.
Re: Re: Re:
That you think Righthaven was okay tells us all we need to know
Re: Re: Re:
How many tears did you have to fight back just to type that sentence John Smith?
I assure you, between the likes of Perfect 10, Malibu Media and Richard Liebowitz, copyright law has no shortage of bad actors.
Re: Re: Re:
“The only people who think law enforcement is evil are communists and non-ideologically aligned criminals who resent the check on their predatory, anti-social behavior.”
So you agree that the January-06-people should be in prison for attacking capitol police, right?
Right?
Re: Re:
“Law enforcement is evil” is about as compelling as “raspberries are rotten”. Obviously that isn’t a universal truth, or there would be no reason to want either in the first place. When you find raspberries are more prone to rotting than you want, you need to work on your culture.
Re:
You’re free to fuck off. You won’t have to read stuff you clearly disagree with, and we won’t have to read your moronic takes on the articles.
Win-win.
Re:
Musk cultists sure seem to love their CSAM
Re: Re:
As do the MAGA cultists and pretty much the entire extreme right-wing. They’re utterly obsessed with child prostitution, claiming just about everyone on the left does it.
They project a lot, so…
Re: Re: Re:
Exactly.
At this point we need to report every Republican to the police for child abuse, including exposure to religion and not having a drag queen figure in their lives. We must change the narrative.
Re:
Another day another impotent post by crybabby Jhon
Re:
So if you’re not affiliated with the Democratic party you’re ok with CSAM? Have I got that right?
Re: Re:
Yeah they either did not think that argument through or just engaged in some refreshing if damning honesty.
This comment has been flagged by the community.
You're lying, again.
You don’t know what, exactly, was posted, and if it was something that people are claiming is kiddie porn (no idea, I don’t know what it was), it was absolutely done to out and shame pedophiles.
I certainly think intent matters (it matters under the law), and the policy that you’re quoting predates Musk’s takeover. Removing a lot of those policies was literally the point of taking over Twitter.
(Which no, was not profitable prior to takeover, regardless of whatever dumbshit cherrypicking you want to play by saying “it was profitable in June!” So what? It wasn’t profitable overall. And yes they banned the laptop story for 2 weeks and yes they shadowbanned and yes followed government orders to censor)
Furthermore, you’re purposefully omitting the non-zero-tolerance part of the policy:
In other words, there’s wiggle room, and you’re making it sound like the Old-Twitter policy meant that dom_lucre must be banned forever, when it says nothing of the kind. You are literally lying about what the policy says. Meanwhile, the original posts were removed, regardless of intent, so Musk is following the old policy quite well, actually.
But you can’t have that because you need to pretend Musk is being hypocritical and capricious.
Bonus, I loved this bit:
So what that tells me is that you are against any kind of due process, which since you’re an orwellian leftist who loves censorship, even guided by government, yeah, that tracks.
Just in general I think it’s hilarious that you think it’s Musk’s job to run Twitter how you would (which would obviously fail) but you feel the need to lie to pretend that Musk isn’t following a policy he probably doesn’t feel obligated to follow but is still following, anyway.
Perfect. My expectations for you were very low and you still failed them.
Re:
There is no exception to CSAM laws if used “for shaming.” It’s a strict liability law. You have it, you’re in deep shit.
God, you’re so wrong about everything related to the law.
There is no “intent” defense for CSAM. https://www.law.cornell.edu/uscode/text/18/2252A
You’re claiming the intent of taking over Twitter was to weaken Twitter’s CSAM standards? Interesting.
It was. I provided the data. You have not provided any counter data.
No, the story was blocked for one day. You continue to misunderstand what happened (the Post account was locked for two weeks, but the story was only banned for 1 day, and only that Post story, other stories were allowed).
The gov’t made no such orders, only flagged info, which Twitter rolled its eyes at mostly. And, no, they didn’t shadowban, they used visibility filtering, which (1) they publicly announced and (2) which your lord and savior Elon Musk says is his favorite idea ever.
So, if we take your ridiculous assertion that this is shadowbanning, are you upset that Musk has ramped up shadowbanning?
Your reading comprehension levels are so bad it’s not even funny. Go get your brain checked.
I’m… not a leftist. I do support due process when it comes to gov’t censorship. What I was talking about was not about due process, but about what information you should reveal publicly and what you should not.
I mean, you thought that of Dorsey… so…
The policy is US criminal law. He is kinda obligated to follow it.
You constantly exceed my expectations for you, Matthew, in that I find it difficult to believe you can comprehend how to turn on a computer, let alone use one.
This comment has been flagged by the community.
Re: Re:
Intent is part of common law, dumbass. Even in laws expressly written to exclude intent, intent is still read in; otherwise you could send someone to jail by putting child porn on their computer without their knowledge. That’s not how the law works, at all. This was the pretext under which Clinton was let loose (classified info laws don’t require intent, either).
What a fucking dumbass.
No…..you didn’t, that’s the joke. You keep on saying “they were profitable 16 of 20 quarters”, which besides being obvious cherry-picking is fucking meaningless. That’s 5 years, right? They lost money over those 5 years you dumbfuck.
I honestly have trouble believing you’re this dumb but the alternative is you think you can just lie your way through anything. “I provided data”….holy fuck you did not.
Lie.
Lie. Oh, Facebook did exactly the same thing on gov orders, literally proven in court. Besides the ample evidence of it happening at Twitter, proof of one should be viewed as proof of the other, there’s no reason to think they treated it differently.
Lie.
Yes….you don’t want the accused to be told what they are accused of, which is a basic part of due-process. You fucking moron.
No, US law is not that anyone suspected of posting kiddie porn must be banned forever. You god damned fucking moron.
Re: Re: Re:
You really are that desperate to want to see some, aren’t you Matty?
Re:
Irrelevant. Intent is not a defense in CSAM law.
It actually doesn’t matter under the law. Have you actually read the law?
Removing policies in place to conform to criminal law by removing CSAM immediately, regardless of context (which is what the law requires, btw) is why Musk took over Twitter? Because that’s the policy we’re talking about here.
Do you have evidence to support this?
No, the post containing the story was banned for one day. The account was locked for two weeks, but that isn’t the same as banning the story.
No, they did not shadowban in the sense it was meant and understood at the time they said they didn’t, and what they were doing was exactly what Musk is supporting now.
No, they did not follow government orders to censor. The evidence points to the exact opposite.
That doesn’t follow. At all.
He doesn’t. Nobody does. Stop lying.
The policy in question is US criminal law, so he ought to feel obligated to follow it.
This comment has been flagged by the community.
Re: Re:
Relevant, actually, because intent is always part of criminal law, even if the law makers try to exclude it. You literally cannot remove intent as a prerequisite.
But Mr Bari Weiss, I thought I made clear you were not smart enough to argue with, and that I did not want to hear from you?
Nothing about that has changed. You definitely haven’t gotten smarter.
Re: Re: Re:
And yet here you are.
If you genuinely hate this place so much you would have fucked off a long time ago, instead of spending the time here you might have spent on your alleged wife.
Re:
https://newsthump.com/2023/06/13/man-uses-quote-he-doesnt-understand-from-book-he-hasnt-read-to-make-point-he-hasnt-thought-through-2/
So CSEM vs. CSAM. Are people that use the term CSEM low-key disputing that stuff is ‘abuse’, or is ‘exploitation’ just casting a wider net than ‘abuse’?
Haven’t seen that distinction before, but seeing as how Twitter is using it now, I assume there’s some fuck-stupid passive aggressive reason behind it.
Re:
I think it’s to avoid the whole argument over how broad the term “abuse” is. No one except pedophiles is going to argue that it isn’t, at a minimum, exploitation, but there are some who insist that abuse must involve violence and/or negative emotions, not just manipulation. I don’t agree, but it is easier to just avoid the argument altogether.
Re:
They are related concepts, but CSAM is a subset of CSEM.
CSAM: child sexual abuse material, aka material that depicts or memorializes an actual act of sexual abuse against a child
CSEM: child sexual exploitation material, aka material that is an image of a child and is being used for sexual gratification, but is not inherently and by definition memorializing an actual act of sexual abuse against a child
All CSAM is also CSEM, but not all CSEM is CSAM. The images on Twitter that we’re discussing are absolutely CSAM: they are the recording of a particularly heinous act of child abuse (and trust me, if you don’t know the details, don’t look them up). Examples of something that could be CSEM but not CSAM range all the way from “beauty pageant photos” to “toddlers in diapers” to “kids running through a sprinkler so their clothes are wet and see-through”: there’s a wide range of stuff that doesn’t meet the legal US definition of “child pornography” (which is what the law calls what we now call CSAM) but is definitely weird, creepy, and sexualized, and that’s what the CSEM term is useful for. Non-photorealistic art that is not “indistinguishable from an actual minor,” like drawings, is also CSEM but not CSAM. Photomanipulations that appear to be indistinguishable from an actual minor are classed with CSAM. AI-generated photorealistic depictions are probably going to eventually be classified under CSAM just like photomanipulations are, but that’s a very fast-moving area and there’s no real precedent on it yet.
CSAM is, inherently, illegal in the US. (It’s inherently illegal in most other countries, too, but I can only reliably speak to US law.) CSEM is sometimes illegal and sometimes not, depending on whether or not it meets the Miller v. California obscenity test. There is a large demand for CSEM-that-is-not-CSAM, in both “drawings that are not photorealistic” and “photorealistic images that don’t meet the legal definition of ‘child pornography’ but are still weird, creepy, and sexualized in the context they’re presented in”, and platforms struggle really hard with identifying and removing it, especially because the same image can be CSEM in one context and not in the next.
(There’s a reason why just about every T&S expert out there will tell you that if you’re a parent, and especially if your kid is pre-pubertal, do not ever post images of them online. Especially do not ever post images of them in diapers or images that show their feet. Just trust me on this one.)
This comment has been flagged by the community.
I wholeheartedly support this.
I support freedom of artistic expression as much as the next person, but there are limits to that.
It really is time that Japanese artists started to realize that and adhere to American and British standards.
Re:
Yes, won’t someone think of all the drawings being hurt and having their images posted online? I can only imagine how they feel. It’s so sad.
Re: Re:
Is defending CSAM really the hill you want to die on?
Re: Re: Re:
If it’s adapted into an anime series, there’s nothing we won’t defend.
Re: Re: Re:
Anime isn’t CSAM. CSAM depicts real abuse, not fictional abuse.
This comment has been flagged by the community.
Re: Re: Re:2
Precisely.
Nobody is hurt if I watch fictional boys getting their cocks and balls stomped on and crushed.
It increases the odds of me wanting to see it in real life, of course, but who the fuck cares about boring ass males?
Re: Re: Re:3
I literally pay for artists to illustrate these scenes. I have a right to this just as a boring ass male pays for boring ass porn.
You can’t stop the paradigm shift. We will evolve the same way as the anglerfish does: where the male is nothing more than a weakling simp to seek out an alpha female and become her shiny pair of testicles. Because that’s all men are. Spermatozoa on legs.
Re:
That’s about depictions of actual children, not purely fictional ones.
This comment has been flagged by the community.
Re: Re:
The rights of lesbians who masturbate to underaged Japanese anime schoolgirls kissing each other must be protected.
This comment has been flagged by the community.
Re: Re: Re:
You cannot censor us forever, Hywoman. Neither can you censor all of us. As Jodi Picoult portrayed in her bestseller Sing You Home, lesbians are tasked with the important duty of saving women from themselves by invalidating their unhappy unions with men. We can start by teaching schoolgirls to kiss each other, not filthy, nasty boys.
Re: Re: Re:2
You cannot hide the truth forever.
Women are no longer happy in their unions with men. Only another woman can truly love another woman.
The revolution is coming, and men can either stand aside, stand with us, or be trodden underfoot.
So Elon is reinstating the accounts of people that have had their accounts suspended for CSAM. I guess we know who the real pedo guy is.
'Zero tolerance(for anyone but us posting it)'
Another fine showing of ‘rules for thee but not for me’ that has plagued the site since he took over, where the rules aren’t there to keep order and be applied equally but merely used against those he doesn’t like and ‘forgotten’ when it comes to those he does.
That he’s willing to hand out nothing more than a minor wrist slap for CSAM though… kinda giving away the game there Elon.
He values the fact qAnon targets the political left too much to want to take any action against them, no matter how much harm they cause to their families, the victims of actual child exploitation and legitimate child protection charities. Also he knows if he starts banning them, they will turn the shitbag conspiracy theorist eye of sauron on him and his connection to Epstein and Maxwell.
I don’t know how relevant this is, since I understand reasonable people aren’t in the business of ranking CSAM by badness, but based on other reporting I’ve seen on this, it was a piece of CSAM so violent and vile that multiple law enforcement agencies assumed it was a hoax when they were first made aware of it. That’s what Musk’s QAnon buddy posted, and that’s what Elon’s Twitter left up for four days.
Re:
Thanks, that is potentially interesting.
This comment has been flagged by the community.
Techdirt is not Twitter. Technically, Techdirt doesn’t even exist, according to Bing. Maybe you should consider “community notes”, like the “big boys” do. Then maybe Bing could find you.
Re:
Did you hear what Bobby Kennedy said recently? “There has never been a time in history when the good guys have been the ones censoring people”.
Truth.
Re: Re:
What if they’re censoring CSAM?