Pretty Much Every Expert Agrees That Elon Has Made Twitter’s Child Sexual Abuse Problem Worse

from the not-great,-bob dept

About a month ago, we wrote an article pulling together a variety of sources, including an NBC News investigation, that suggested that Elon Musk's Twitter was doing a terrible job dealing with child sexual abuse material (CSAM) on the platform. This was contrary to the claims of a few very vocal Elon supporters, including one who somehow got an evidence-free article published in a major news publication, insisting that he had magically "solved" the CSAM issue, despite firing most of the people who worked on it. As we noted last month, it appeared that Elon's Twitter was not just failing to deal with CSAM (which is a massive challenge on any platform), but was actually going backwards and making the issue much, much worse.

Last week, Senator Dick Durbin released a letter he sent to the Attorney General, asking the DOJ to investigate Twitter’s failures at stopping CSAM.

I write to express my grave concern that Twitter is failing to prevent the selling and trading of child sexual abuse material (CSAM) on its platform and to urge the Department of Justice (DOJ) to take all appropriate actions to investigate, deter, and stop this activity, which has no protections under the First Amendment, and violates federal criminal law.

The last two points are important: CSAM is (obviously) not protected speech, and as it violates federal criminal law, Section 230 is not relevant (lots of 230 haters seem to forget this important point). Of course, there is still the issue of knowledge. You still can’t hold a platform liable for things it didn’t know about. But, deliberately turning a blind eye to CSAM (while stating publicly that it was the number one priority) is still really bad.

Now, a NY Times investigation has gone much, much further into this issue and found, as NBC News did, that Twitter isn't just failing to deal with CSAM; it has made a ton of really questionable decisions regarding how it handles the problem. The NY Times report notes that it used some tools to investigate CSAM on Twitter without looking at the material itself. While it doesn't go into detail, from what's stated, it sounds like the Times wrote some software to identify potential CSAM, without viewing it, and then forwarded the accounts to the Canadian Center for Child Protection and also to Microsoft, which created and runs PhotoDNA, the tool that many large companies use to identify CSAM on their platforms and to report that content to NCMEC (the National Center for Missing and Exploited Children) in the US, to the Canadian Center in Canada, and to other organizations elsewhere. And what they found is not great:

To assess the company’s claims of progress, The Times created an individual Twitter account and wrote an automated computer program that could scour the platform for the content without displaying the actual images, which are illegal to view. The material wasn’t difficult to find. In fact, Twitter helped promote it through its recommendation algorithm — a feature that suggests accounts to follow based on user activity.

Among the recommendations was an account that featured a profile picture of a shirtless boy. The child in the photo is a known victim of sexual abuse, according to the Canadian Center for Child Protection, which helped identify exploitative material on the platform for The Times by matching it against a database of previously identified imagery.

That same user followed other suspicious accounts, including one that had “liked” a video of boys sexually assaulting another boy. By Jan. 19, the video, which had been on Twitter for more than a month, had gotten more than 122,000 views, nearly 300 retweets and more than 2,600 likes. Twitter later removed the video after the Canadian center flagged it for the company.
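The Times' scanner and the Canadian center's database check both rest on the same basic idea: compute a fingerprint of a piece of content and compare it against a list of fingerprints of previously identified material, so that nothing ever has to be displayed. Here is a rough conceptual sketch of that workflow. All names here are hypothetical, and real systems such as PhotoDNA use robust perceptual hashes that survive resizing and re-encoding, not the plain cryptographic digests used below:

```python
import hashlib

# Hypothetical stand-in for a database of digests of previously
# identified material, like the databases maintained by NCMEC or the
# Canadian Center for Child Protection. (Real matching systems such as
# PhotoDNA use robust *perceptual* hashes; plain SHA-256 is shown here
# only to illustrate the lookup step.)
KNOWN_HASHES = {
    hashlib.sha256(b"example-match").hexdigest(),
}

def matches_known_material(data: bytes) -> bool:
    """Return True if the content's digest appears in the known set.

    Matching happens on digests alone, so the underlying content never
    has to be rendered or viewed.
    """
    return hashlib.sha256(data).hexdigest() in KNOWN_HASHES

print(matches_known_material(b"example-match"))    # True
print(matches_known_material(b"something-else"))   # False
```

Because the comparison operates purely on fingerprints, the scanning side never needs to display or store the material itself, which is the legal constraint the Times describes working under.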

Even Twitter’s responses to requests from the government agencies dealing with this stuff did not go well:

One account in late December offered a discounted “Christmas pack” of photos and videos. That user tweeted a partly obscured image of a child who had been abused from about age 8 through adolescence. Twitter took down the post five days later, but only after the Canadian center sent the company repeated notices.

As an aside, I’m curious how all the people insisting that no government agency should ever alert Twitter to content that might be illegal or violate its policies feel about the Canadian Center alerting Twitter to CSAM on its platform.

And the article notes that Twitter seems to be ignoring a lot of the more easily findable stuff for organizations that have access to these types of tools:

The center also did a broader scan against the most explicit videos in their database. There were more than 260 hits, with more than 174,000 likes and 63,000 retweets.

“The volume we’re able to find with a minimal amount of effort is quite significant,” said Lloyd Richardson, the technology director at the Canadian center. “It shouldn’t be the job of external people to find this sort of content sitting on their system.”

Even more worrisome: the NY Times report notes that Twitter relies on a detection tool from Thorn, the well-known organization that builds technology to fight child trafficking. Except, for all of Musk's claims about how fighting this stuff is job number one… he stopped paying Thorn. And, even more damning, Twitter has stopped working with Thorn to provide information back to the organization to improve its tool and help it find and stop more CSAM:

To find the material, Twitter relies on software created by an anti-trafficking organization called Thorn. Twitter has not paid the organization since Mr. Musk took over, according to people familiar with the relationship, presumably part of his larger effort to cut costs. Twitter has also stopped working with Thorn to improve the technology. The collaboration had industrywide benefits because other companies use the software.

Also eye-opening in the article is that, while Twitter is claiming that it is removing more such content than ever, its reports to NCMEC do not match that and have dropped massively, raising serious concerns at NCMEC:

The company has not reported to the national center the hundreds of thousands of accounts it has suspended because the rules require that they “have high confidence that the person is knowingly transmitting” the illegal imagery and those accounts did not meet that threshold, Ms. Irwin said.

Mr. Shehan of the national center disputed that interpretation of the rules, noting that tech companies are also legally required to report users even if they only claim to sell or solicit the material. So far, the national center’s data show, Twitter has made about 8,000 reports monthly, a small fraction of the accounts it has suspended.

Also, NCMEC saw Twitter's responsiveness dwindle (though it seemed to pick back up a bit in January):

After the transition to Mr. Musk’s ownership, Twitter initially reacted more slowly to the center’s notifications of sexual abuse content, according to data from the center, a delay of great importance to abuse survivors, who are revictimized with every new post. Twitter, like other social media sites, has a two-way relationship with the center. The site notifies the center (which can then notify law enforcement) when it is made aware of illegal content. And when the center learns of illegal content on Twitter, it alerts the site so the images and accounts can be removed.

Late last year, the company’s response time was more than double what it had been during the same period a year earlier under the prior ownership, even though the center sent it fewer alerts. In December 2021, Twitter took an average of 1.6 days to respond to 98 notices; last December, after Mr. Musk took over the company, it took 3.5 days to respond to 55. By January, it had greatly improved, taking 1.3 days to respond to 82.

The Canadian center, which serves the same function in that country, said it had seen delays as long as a week. In one instance, the Canadian center detected a video on Jan. 6 depicting the abuse of a naked girl, age 8 to 10. The organization said it sent out daily notices for about a week before Twitter removed the video.

None of this is particularly encouraging, especially on a topic so important.

It also appears that foreign regulators may be taking notice as well:

Ms. Inman Grant, the Australian regulator, said she had been unable to communicate with local representatives of the company because her agency’s contacts in Australia had quit or been fired since Mr. Musk took over. She feared that the staff reductions could lead to more trafficking in exploitative imagery.

“These local contacts play a vital role in addressing time-sensitive matters,” said Ms. Inman Grant, who was previously a safety executive at both Twitter and Microsoft.

Again, dealing with CSAM is one of the most critical, and challenging, parts of any trust & safety team for any website that allows user content. There is no “perfect” solution. And there will always be scenarios where some content is missed. So, in general, I’ve been hesitant to highlight articles (which come along with some frequency) insisting that because reporters or researchers are able to find some CSAM it means that the site “isn’t doing enough.” Because that’s rarely an accurate portrayal.

However, this NY Times piece goes way beyond that. It didn’t just find content, it found empirical evidence of Twitter being slower to react than in the past, not reporting the material it should be reporting to the agencies set up for that purpose, cutting off Thorn from both money and collaboration data, and many other things.

All of which adds up to pretty compelling evidence that for all of Musk’s lofty talk of fighting CSAM being job number one, the company has actually gone not just a little backwards on this issue, but dangerously so.

Companies: thorn, twitter


Comments on “Pretty Much Every Expert Agrees That Elon Has Made Twitter’s Child Sexual Abuse Problem Worse”

180 Comments
This comment has been deemed insightful by the community.
PaulT (profile) says:

This is horrendous news, but nothing that wasn’t already suspected.

Hopefully, this could actually lead to some legal pushback against Musk personally. I somehow doubt it, but there seems to be little question that Musk’s personal decisions have led to increased criminal activity, and I’m not sure how “I fired the teams who were preventing CSAM, then stopped working with and paying the people on the outside who were helping” can be spun into him not deliberately taking down protections. If he took down protections deliberately, then surely it follows that increased trafficking in such things is on his shoulders.

It would also be poetic justice for the man who bullied others with false claims of pedophilia to be brought down by actually aiding and abetting it.

Although, none of that is really likely for various reasons. The main takeaway for me is that it takes away yet another angle from those still desperate enough to defend Musk.

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re:

If he took down protections deliberately, then surely it follows that increased trafficking in such things is on his shoulders.

Do you have evidence Musk actually believes in rational thinking?

I don’t think Musk aided/abetted. But only because I don’t think he thought about anything he was doing.

But yeah, it does look pretty bad for Musk, especially since it was pretty obvious this would likely be an outcome of the way he handled the takeover (not that I think that thought will ever be able to enter his brain).

This comment has been deemed insightful by the community.
PaulT (profile) says:

Re: Re:

“Do you have evidence Musk actually believes in rational thinking?”

I’m sure he believes in something that he believes is rational thinking. It’s not been in evidence so far with Twitter, but he couldn’t have been this consistently bad with his other companies.

I think he's just so used to getting away with certain behaviour, he's falling apart when that behaviour isn't being corrected for by those around him. Being a blustery rich teen edgelord works when you're trying to fund startups that promise ambitious and world-changing ideas, as long as the engineers below you are able to temper expectations and deliver product. It doesn't work when you try running a mature company by firing said engineers as your first course of action at a company whose mission is far more trivial.

This comment has been deemed funny by the community.
Anonymous Coward says:

Re: Re:

But yeah, it does look pretty bad for Musk, especially since it was pretty obvious this would likely be an outcome of the way he handled the takeover (not that I think that thought will ever be able to enter his brain).

Shhh, you’re going to summon Matthew Bennett if you say that.

Scary Devil Monastery (profile) says:

Re: Re:

“Do you have evidence Musk actually believes in rational thinking?”

All the evidence so far points to Musk mainly believing in catchy one-liners, talking points and bumper sticker mentality.

He’s advocated free speech maximalism for years and this exact situation is one which was brought to his attention many times – and which he shrugged off as irrelevant.
Now it's suddenly relevant because he's the guy at whose desk that buck stops. And to paraphrase PaulT, consequences are where being a blustery rich edgelord suddenly loses its appeal.

That this was coming should have been bitingly obvious to anyone not neck deep in the libertarian religion, where everyone Not Giving A Shit somehow produces a society worth living in rather than one where assholes fail to respect the now-unwritten and unenforced rules.

PaulT (profile) says:

Re: Re: Re:

“All the evidence so far points to Musk mainly believing in catchy one-liners, talking points and bumper sticker mentality”

Worse, he appears to believe in memes. As far as I can tell, he bought into the “moderation is easy” and “Twitter is blocking right-wingers” memes, and he’s finding out that these were not true and the solutions are way harder than he imagined.

As for consequences, I maintain that he was shielded from them by being a rich kid who pushed for actual world changing ideas like renewable energy and space exploration. Now that he’s working toward trivialities like Twitter, that shield no longer exists.

This comment has been flagged by the community.

Scary Devil Monastery (profile) says:

Re: Re:

“Let’s not make baseless generalizations like this.”

I think we'll have to start making those generalizations. The general US "conservative" today seems just fine with continuing to vote for Moore, Gaetz, or Werewolf Walker.
As long as there's an "R" in front of the name, the US conservative today will vote for that candidate; it's that simple.

They are ON BOARD with fascism, bigotry, racism, nazism and the…people who "like them young" (as Trump still described his best buddy Epstein before even he distanced himself). Looking at the voting blocs of today then yes – we can and should generalize. The US conservative of today votes R and is just fine and dandy with whatever their chosen candidate gets up to.

Those "conservatives" who weren't OK with this are, in the political climate of today, the new progressives who simply aren't fine with the GOP issuing legislation out of A Handmaid's Tale or having Ted Cruz telling them what their kids should learn in school. Those may call themselves independent but generally vote "D" while holding their noses.

This comment has been flagged by the community.

Hyman Rosen (profile) says:

It’s hilarious how the use-mention problem shows up with respect to child pornography. Just like using the “n-word” in hypothetical legal cases about workplace harassment. There’s something very wrong with a system that would prosecute people who are looking for child pornography in order to tell people accurately how much child pornography exists.

This comment has been deemed insightful by the community.
Violet Aubergine says:

Re:

Do you even have the barest understanding of how capitalism works?

If such things weren’t illegal there’d be “businesses” in certain under regulated states setting up one hour training sessions on how to become a CSAM “expert” allowing creeps to volunteer to legally watch their favorite videos under the protection of the law.

Please remember that capitalism's predatory nature will always do its best to force itself forward regardless of how disturbing, defiling and damaging the process, and that is why a strong regulatory state must exist to temper its excesses. Capitalism will never ever do it of its own accord because, while many aspects of capitalism are beneficial to society, there are plenty of venal aspects that will cut corners, like slashing everything left and right without reason because you ridiculously bought a company while high, thinking a $54.20 stock offer price was highlarious, and are now saddled with $13 billion in debt that's living rent free in Elon's head 24/7/365 until he dies. This is the start of his Howard Hughes phase.

Scary Devil Monastery (profile) says:

Re: Re:

Honestly, "Hyman Rosen" coming out swinging against laws intended to prevent the sexual abuse of minors and laws preventing racist workplace discrimination in one and the same breath really shouldn't come as a surprise.

Except possibly in the revelation that he appears to view minors and children through the same lens of lesser beings he uses for Black people and Jews.

Realize where he’s coming from and save yourself the long-winded explanation unless you thought it might benefit more rational and less bigoted people.

Thad (profile) says:

Re:

Musk could have made a big show of buying Twitter, changed nothing substantial, and been fawned over as a genius.

No he couldn’t have, because there was no way he could have bought it in the first place without paying far more than it was worth.

He could certainly have shit the bed less badly than he has. But nobody was ever going to hail him as a genius for offering to buy the company for 40% more than its market value based on a 420 joke, signing an airtight deal waiving due diligence, trying to back out of it, and finally closing the purchase under legal pressure.

Scary Devil Monastery (profile) says:

Re: Re: Re:

Well, yeah.

The people who forked over the dough to pay the shady tailors are heavily invested in claiming the Emperor’s new wardrobe is a work of art, that the view of the august personages dangling dingleberries is just a product of your lying eyes, and that the brat screaming the Emperor is, in fact, naked, just needs a caning until he shuts the fuck up.

Even without involving Musk's own cultists, there are plenty of people owning enough stock in TSLA that their sphincters tighten unpleasantly at the thought of Musk's halo of infallibility slipping further.

Scary Devil Monastery (profile) says:

Re:

Coming out swinging in favor of Musk because the real point of Qanon was always about the global conspiracy led by the Elders of Zion. Every last conspiracy issued by Qanon leads to “It’s ze Jews!”.

It’s anti-semitism all the way down, and the children were always just the tired old blood libel shtick used once again as a grindstone for that particular hatchet.

This comment has been flagged by the community.

Chozen (profile) says:

“Pretty Much Every Expert Agrees That Elon Has Made Twitter’s Child Sexual Abuse Problem Worse”

So the article makes a quantitative statement without making any effort to establish a baseline. There were studies on the amount of CSAM on Twitter before Musk bought it. Making no attempt to use those numbers to establish a baseline is fraud and you should be ashamed, you fucking sellout shill.

At the same time you are pimping Mastodon of which the majority of the platform is CSAM. You are going to burn in hell you shill.

This comment has been deemed insightful by the community.
Stephen T. Stone (profile) says:

Re:

you are pimping Mastodon of which the majority of the platform is CSAM

Um…I’ve been on Mastodon for a few years now, and I don’t know what instance you’re on where the majority of the content is CSAM, but I’m pretty sure the Feds would like to have a talk with you about that.

This comment has been flagged by the community.

Chozen (profile) says:

Re: Re:

Made the news. You didn't notice because you do nothing but suck Mike's dick on this forum every day.

“OSINT: The Mastodon Paedophile Problem
Mastodon has a major paedophile problem, join computer scientist Edward Charles for a closer look at the pedophilic side of the fediverse.”

https://www.secjuice.com/osint-mastodon-paedophile-csam-child-porn-problem/

You are a flying monkey of an obvious overt narcissist sociopath.

This comment has been deemed insightful by the community.
Stephen T. Stone (profile) says:

Re: Re: Re:

Made the news.

A site no one here has heard of is not “the news”. And even after putting that nitpick aside, the article to which you linked doesn’t say, as you claim, “the majority of [Mastodon] is CSAM”. It doesn’t even imply that.

I don’t doubt that the Fediverse has a CSAM issue. All social media platforms have that issue⁠—it’s simply the nature of the beast. But you’re making a wild accusation about the entire Fediverse that you can’t back up based only on that one immutable fact. You shouldn’t be surprised when people call “bullshit”.

You are a flying monkey of an obvious overt narcissist sociopath.

Every accusation, a confession… 🥱

This comment has been flagged by the community.

This comment has been deemed insightful by the community.
Rocky says:

Re: Re: Re:3

The fediverse by its very design will always have a CSAM issue.

Any method or technology to connect people will always have a CSAM issue. Your problem is that you think pointing to the fact that other social media also has CSAM, Mastodon in this instance, is some kind of excuse for Musk’s stupidity in firing almost everyone who actually dealt with CSAM and thus exacerbating the proliferation of it on Twitter.

Do you have any more bad excuses you want to give us?

This comment has been deemed insightful by the community.
Stephen T. Stone (profile) says:

Re: Re: Re:3

The fediverse by its very design will always have a CSAM issue.

As will Twitter. And Facebook. And Gab, and Truth Social, and Discord, and Telegram, and Signal, and literally every form of online communications. The CSAM issue has to do with people, not platforms, and there is no “silver bullet” solution for the problem facing those platforms. But more can always be done to mitigate the problem. That’s why Musk actively gutting the Twitter team responsible for handling that problem makes Twitter look⁠—rightfully, I should note⁠—as if it’s doing worse on handling that problem. Distracting from that point with a “hey, look over there” aimed at the Fediverse won’t change the situation at Twitter. Even if the Fediverse’s CSAM issues are worse (and that’s still up for debate), it doesn’t change the fact that Twitter still has that same problem and Elon gutted the team responsible for handling it anyway.

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re: Re: Re:2

While the fediverse undoubtedly has actual CSAM on it, what these people are likely referring to is a piece of conservative misinfo where they classified anime (even instances with very tame super SFW content) as “CSAM” to try to smear the network.

By the same standards, Twitter is “full of CSAM” and every other site.

It’s not worth engaging with that conspiracy because it is entirely based in bad faith.

PaulT (profile) says:

Re: Re: Re:

“Made the news”

OK, so instead of being an offensive idiot you could have just linked to where this was reported. Not everyone gets their news from the same places as you, and while it's suspicious that something this significant hasn't been widely reported, there are always exceptions.

“link to random blog nobody’s heard of”

So, not widely reported by reliable sources then? Got it.

When you have no facts, turn to insults and conspiracy theories about those asking WTF you're talking about… Gotcha.

PaulT (profile) says:

Re: Re: Re:

Oh, and reading that random blog post, it doesn't say that "a majority of the platform is CSAM" as you claimed. So even if it were actual news, and not a random blog post by someone obviously reacting to Twitter's known problems making headlines, using random searches with no clear methodology, you're lying about its conclusions. Typical.

This comment has been flagged by the community.

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re: Am I really reading this? Pinch me.

So the article makes a quantitative statement without making any effort to establish a baseline. There were studies on the amount of CSAM on Twitter before Musk bought it. Making no attempt to use those numbers to establish a baseline is fraud

If you’re trying to evaluate whether something got worse, then the recent past IS a baseline.

When Elon critics don’t cite studies: “Where’s the evidence?”
When Elon critics cite studies: “Science is fake. You’re a fraud and a shill.”

This comment has been flagged by the community.

This comment has been deemed insightful by the community.
Stephen T. Stone (profile) says:

Re: Re: Re:

As you said:

There were studies on the amount of CSAM on Twitter before Musk bought it.

Those studies seem like a decent enough baseline to start with, since they occurred—as you admit—before Musk bought Twitter. Comparing those studies with studies done after Musk bought Twitter certainly makes sense to me.

This comment has been flagged by the community.

Chozen (profile) says:

Re: Re: Re:2

But they didn't do that. That's kind of like how Fauci didn't spend one dime on RCT studies of mask efficacy during COVID. He knew what the results would be, so he never paid for such a study to be done.

This trash article didn’t make any effort to establish a baseline because it knew what the answer would be. Musk has been extremely effective, greatly reducing CSAM on the platform in a matter of months.

Rocky says:

Re: Re: Re:3

But they didn't do that. That's kind of like how Fauci didn't spend one dime on RCT studies of mask efficacy during COVID. He knew what the results would be, so he never paid for such a study to be done.

Ah, the old "mask efficacy" argument, where the less intelligent have looked at some studies saying that facemasks have little to no effect in protecting the wearer against infection.

Now, if we instead look at studies of how facemasks actually reduce the spread of infection to others, it is easy to see how dishonest your argument is.

Rocky says:

Re: Re: Re:3

But they didn't do that. That's kind of like how Fauci didn't spend one dime on RCT studies of mask efficacy during COVID. He knew what the results would be, so he never paid for such a study to be done.

Oh, the stupid "mask efficacy" argument, where the stupid think it's an argument for not using facemasks because they do little to protect the wearer from infection.

Smart people look at the studies that found that the use of facemasks reduces the spread of infection to other people. Imagine that, it's almost like health professionals using facemasks were on to something.

Musk has been extremely effective, greatly reducing CSAM on the platform in a matter of months.

You ask for baseline stats, we give you baseline stats of the number of reports (every time Twitter removed content that even hinted at being connected to CSAM they were, and are, legally obliged to report it). The number of reports after Musk took over nosedived which proves one of two things: Twitter isn’t reporting all CSAM (real or suspected), or, they don’t actually know how much CSAM is on the service today and only report the little they can find with the reduced toolset they have now.

I expect you to again use the argument that they now respond to CSAM reports much quicker, which is a stupid argument, because it doesn't actually tell us how much better or worse Twitter itself is at detecting and removing CSAM under Musk.

One has to be particularly stupid to defend Musk's action of gutting the department dealing with CSAM, leaving it understaffed for months, by arguing that "there's less CSAM on Twitter now." This particularly stupid defense from the Musk sycophants has been making the rounds since November, but what do I know; perhaps not having enough people looking for CSAM is a way to have "less" CSAM on a site.

It's particularly ironic, well, stupid in your case, that you ask for baseline stats while yourself making statements with zero facts attached.

This comment has been flagged by the community.

Chozen (profile) says:

Re: Re: Re:4

“The number of reports after Musk took over nosedived which proves one of two things: Twitter isn’t reporting all CSAM (real or suspected), or, they don’t actually know how much CSAM is on the service today and only report the little they can find with the reduced toolset they have now.”

Or they simply aren't there anymore. A 2022 study reported by Reuters found 500 CSAM accounts over a 20-day period.

https://www.reuters.com/article/twitter-csam-exclusive-idCAKBN2QT1JX

By January 2023, the Musk-hating NBC could only find "at least dozens" over a longer timeframe than the 2022 study.

The simple answer is that they aren't on Twitter anymore; they moved to Mastodon. So Twitter has less to report.

This comment has been deemed insightful by the community.
Rocky says:

Re: Re: Re:5

Or they simply aren’t there anymore.

I see, gut the department that combats CSAM and all the pedos get scared and leave.

By January 2023 the Musk Hating NBC could only find “at least dozens” over a longer timeframe than the 2022 study.

So they used the same tools and methodology that Ghost Data used? Funny that, since the article specifically talked about a dozen accounts they went and reviewed.

You also seem to have stopped reading after the headline, because further down the story says: Many more tweets reviewed by NBC News over a period of weeks were published during Musk's tenure. Some users tweeting CSAM offers appeared to delete the tweets shortly after posting them, seemingly to avoid detection, and later posted similar offers from the same accounts. Some accounts offering CSAM said that their older accounts had been shut down by Twitter, but that they were able to create new ones.

Tell us again that there isn’t CSAM on Twitter any longer.

The simple answer is that they aren't on Twitter anymore; they moved to Mastodon. So Twitter has less to report.

And that statement belies what NBC reported. Do you think NBC is lying? And Ella Irwin too, for that matter?

That One Guy (profile) says:

Re: Re: Re:6 'We're LESS likely to get caught here, quick, move somewhere else!'

And that statement belies what NBC reported. Do you think NBC is lying? And Ella Irwin too, for that matter?

More than that, it’s also a nonsensical argument. Twitter’s new owner puts less effort into finding and removing CSAM than the platform did before he took charge, and that caused those posting it to move to a different platform?

This comment has been flagged by the community.

Chozen (profile) says:

Re: Re: Re:6

“So they used the same tools and methodology that Ghost Data used?”

That seems like a rather shady argument. Are you arguing that the studies are so different that no comparison at all can be made? Are you arguing that the studies are so different in methodology that it would result in a difference of a factor of 10?

The most significant a priori factor in the study, the time frame, should have resulted in NBC finding more, not less.

The claim that we can’t compare the two is really just a shady sophistic copout.

Rocky says:

Re: Re: Re:7

That seems like a rather shady argument.

No, it’s not. If you don’t understand the difference between what Ghost Data did and what NBC did, you can’t make a comparison between their conclusions in any meaningful way.

Are you arguing that the studies are so different that no comparison at all can be made. Are you arguing that the studies are so different in methodology that would result in a difference of a factor of 10.

Yes. What NBC did can be compared to looking at a few cherry-picked pages in a book and counting the occurrences of a particular word, while Ghost Data tried to count all the occurrences in the book.

The claim that we can’t compare the two is really just a shady sophistic copout.

It’s not my problem that you can’t understand the difference between anecdotal data from a very limited sample compared to something produced by a methodology applied on a very large sample.

This comment has been flagged by the community.

Chozen (profile) says:

Re: Re: Re:6

“And that statement belies the fact of what NBC reported. Do you think NBC is lying? And Ella Irving too for that matter?”

Yes, that is called lying by context. NBC, much like the above article in this OP, made no attempt to establish a baseline so you can view their numbers in context. You are left to assume Musk has done nothing about CSAM.

You are a mental child with poor parents. A good parent wouldn’t have let you get away with other forms of lying. Poor parents don’t properly explain to their children that a lie is any statement made to trick the listener/reader into coming to the wrong conclusion. These can be lies of commission, omission, context, etc. Only a child, or an adult with poor parents, thinks that lies of commission are the only kind of lying.

Your parents obviously fucked up in raising you.

Rocky says:

Re: Re: Re:7

Yes that is called lying by context. NBC much like the above article in this OP made no attempt to establish a baseline so you can view their numbers in context.

Let me get this straight: you first argue that the NBC article proves that there is less CSAM on Twitter after Musk took over, and now you are arguing that they are lying? Perhaps you should actually read the stuff you link to before using your faulty assumptions of what it says to offer up an argument not rooted in reality.

You are left to assume Musk has done nothing about CSAM.

Only if you didn’t actually read the article and based your assumption solely on the headline. Musk has done something about CSAM, belatedly, but it was a stupendous exercise in stupidity where he first gutted Twitter’s ability to deal with CSAM and then scrambled for months to rebuild that ability.

If your boat is leaking and you have managed to plug some leaks, you don’t fucking remove the plugs because you think you can stop the leaks better without actually having a solution ready – that is essentially what Musk did when he took over Twitter.

This comment has been flagged by the community.

Chozen (profile) says:

Re: Re: Re:8

“NBC article proves that there is less CSAM”
Not from anything NBC did, but I can compare their data to existing data. NBC left out context on purpose so the reader would be deceived.

“only based your assumption on the headline”

Yes, which many people do. Most of the people posting here never read the article critically and are making assumptions based on the headline.

This comment has been flagged by the community.

This comment has been deemed insightful by the community.
Rocky says:

Re: Re: Re:

There is no evidence cited, no baseline for the amount of CSAM on the platform prior to Musk is set

So the number of CSAM-reports to NCMEC and the police before Musk took over isn’t considered a baseline? A number that took a nosedive when Musk fired the staff combatting CSAM?

I don’t know what reality you live in where you think gutting the staff combatting CSAM makes it easier for it to be found and removed.

This comment has been flagged by the community.

Chozen (profile) says:

Re: Re: Re:2

“So the number of CSAM-reports to NCMEC and the police before Musk took over isn’t considered a baseline? A number that took a nosedive when Musk fired the staff combatting CSAM?”

That is not a baseline on the amount of CSAM. The decline in reports could just as easily be due to the decline in the amount of CSAM.

P.S. They all moved to Mastodon.

This comment has been deemed insightful by the community.
That One Guy (profile) says:

Re: Re: 'It's okay when my side does it.'

When it comes to defending their own, people like that only care about one question: ‘Are they on my team?’ So long as the answer to that is ‘yes’, it doesn’t matter what the person has done; it’s still acceptable, even if they wouldn’t hesitate a second to tear into someone not on their team for that very thing.

Scary Devil Monastery (profile) says:

Re: Re:

I know you aren’t new here, Ninja, so I have to ask – what the hell did you expect?

The libertarian fandom hailing Musk doesn’t give a shit about children. They never did. They’d sell a few themselves if they thought they could get away with it. In so far as they have any ideology beyond “Fuck You, Got Mine”, it’s just some re-cast form of the antisemitic “Jewish global conspiracy” argument standing as the reason why they’ve been oppressed enough not to themselves be rich and wealthy edgelords a la Musk.

Scary Devil Monastery (profile) says:

Re: Re:

Just to add to that – take one good look at the people defending Musk here. Chozen. Hyman Rosen. Koby. Well known trolls always coming out swinging whenever the poor white man is being “oppressed”.

That’s Musk’s fanbase on these boards – as it is in real life where their Great White Savior is letting them all out of Twitter Jail.

This comment has been deemed insightful by the community.
James Burkhardt (profile) says:

Re:

I can’t read the NY Times article, but Ars Technica did some reporting which dives into the numbers that concern experts.

While suspensions have risen, reports to NCMEC have fallen. This is because Twitter has stopped sending tips on sellers and distributors to third parties, allowing the sellers and distributors to continue to abuse and exploit children, rather than allowing investigators to try to track down the sources of this material and prevent the abuse from continuing.

We still see known CSAM (as in, it matches the hash of previously identified CSAM stored in public databases that exist to allow you to search for known CSAM without viewing the material) being reported and taking over a week to be removed. In every criticism of pre-Musk Twitter and CSAM, ignoring reports was the #1 example of why Twitter was bad at CSAM.

If the NYTimes can put together a bot to find matches to known CSAM hashes, surely Musk’s super team could have already done so for the #1 priority of the company.
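The hash-matching approach described above can be sketched in miniature. This is a hypothetical illustration only: real systems such as PhotoDNA use proprietary perceptual hashes that survive resizing and re-encoding, whereas this sketch uses a plain SHA-256 set, which only catches byte-identical copies. The function names and database entries here are made up for the example.

```python
import hashlib

# Hypothetical known-hash database. Real deployments use perceptual
# hashes (e.g. PhotoDNA), not plain SHA-256 of the raw bytes.
KNOWN_HASHES = {
    # Placeholder entry: the SHA-256 digest of some previously flagged file
    hashlib.sha256(b"example known file").hexdigest(),
}

def is_known_match(data: bytes, known: set[str] = KNOWN_HASHES) -> bool:
    """Flag content by hash lookup alone, so no human ever views it."""
    return hashlib.sha256(data).hexdigest() in known
```

The point of the design is the one the comment makes: matching against a database of digests lets a scanner identify previously catalogued material without anyone looking at the content itself.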

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re:

So the article makes a quantitative statement without making any effort to establish a baseline.

From the article:

“Late last year, the company’s response time was more than double what it had been during the same period a year earlier under the prior ownership, even though the center sent it fewer alerts. In December 2021, Twitter took an average of 1.6 days to respond to 98 notices; last December, after Mr. Musk took over the company, it took 3.5 days to respond to 55.”

People can read, Chozen.

This comment has been flagged by the community.

Chozen (profile) says:

Re: Re:

“Late last year, the company’s response time was more than double what it had been during the same period a year earlier under the prior ownership, even though the center sent it fewer alerts. In December 2021, Twitter took an average of 1.6 days to respond to 98 notices; last December, after Mr. Musk took over the company, it took 3.5 days to respond to 55.”

That is not a baseline on the amount of CSAM; it is the time it took to respond to a report.

And you intentionally deleted the next sentence.

“By January, it had greatly improved, taking 1.3 days to respond to 82.”

So in the one month of transition to new management it took two days longer, and in January Twitter was responding faster than it did before Musk.

This is nonsense.
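For what it’s worth, the three data points being argued over can be lined up directly. The numbers below are taken verbatim from the quoted NYT passage; the code is merely an illustrative tabulation, not anything from the article.

```python
# (average response days, notices handled) per period, per the NYT quote
periods = {
    "Dec 2021 (pre-Musk)": (1.6, 98),
    "Dec 2022 (post-takeover)": (3.5, 55),
    "Jan 2023": (1.3, 82),
}

baseline_days, _ = periods["Dec 2021 (pre-Musk)"]
for label, (days, notices) in periods.items():
    # Ratio against the pre-Musk December baseline
    print(f"{label}: {days} days avg over {notices} notices "
          f"({days / baseline_days:.2f}x baseline)")
```

The December-to-December ratio is 3.5 / 1.6 ≈ 2.19, which is the “more than double” in the article; January’s 1.3 days is below the pre-Musk baseline, which is the figure Chozen leans on.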

Anonymous Coward says:

Re: Re:

People can read, yes, but you have to remember you’re arguing with a guy who thinks your intelligence is defined by how many typos you can Autocorrect into a sentence.

Chozen probably glanced through it, found no spelling errors, then started going onto Rule 34 soliciting for anal vore femboys again.

This comment has been flagged by the community.

This comment has been flagged by the community.

Violet Aubergine says:

Re:

Awww, did somebody strike a nerve today? Sing us a torch song, Snowflakeahauntus, or is it Cries For Pedo Enabler?

The article established repeated execrable responses from Musk’s Twitter including that he has stopped paying for the software that tracks CSAM and working with the same company to continue to make that software more effective–because genius man baby didn’t code that software so it has to be crap!1!! I’m sure Musk had some thought about doing something more hard core than Thorn is doing, decided to stop paying for their services and then got too high to remember he ever had those thoughts because at this point he’s so fucked his empire erasing his memories is probably now a scientific goal. Got some alternative logic about how that’s not revelatory like you morons think the Twatter Files are? He stopped paying for the CSAM search engine and you ask for proof that is right before your closed eyes.

This comment has been flagged by the community.

Chozen (profile) says:

Re: Re:

“The article established repeated execrable responses from Musk’s Twitter including that he has stopped paying for the software that tracks CSAM ”

Given the amount of CSAM that was on Twitter prior to Musk’s take over that software sucks.

These arguments are all fucking stupid. Objectively the people responsible for the policing of CSAM on Twitter should have been the first ones fired. Objectively the software they were using to catch CSAM should have been the first software abandoned because they were not doing their job.

Anonymous Coward says:

Re: Re: Re:

“Objectively”, says Chozen, without offering any evidence for the extraordinary claim that the people in Old Twitter in charge of getting rid of CSAM weren’t doing their job.

Since you care so much about studies of how much CSAM was on Old Twitter, then you should be hoping that Musk lets researchers study the amount of CSAM on New Twitter.

This comment has been flagged by the community.

This comment has been flagged by the community.

Matthew Bennett says:

These aren’t experts

They’re politicians, with an agenda, who hate Musk for largely the same reasons you do.

I really liked how the most credible orgs you cited were just making estimates based on algorithmic tools that you yourself wrote a post about a couple of weeks ago, talking about how they are shit. No one even checked to see if, yes, it was actually kiddie porn.

Basically, to the degree you have shown anything at all, you’ve shown that more content shows up on an algorithm you’ve already claimed was obsolete with a bad signal-to-noise ratio.

Perfect. Peak Masnick.

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re:

They’re politicians, with an agenda, who hate Musk for largely the same reasons you do.

Uh, NCMEC and the Canadian Center for Child Protection are in fact experts. As are the team at Microsoft that runs PhotoDNA. None of them are politicians.

Your problem, Matthew, is that people other than you can actually read, so when you make shit up, you look pretty fucking stupid.

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re:

They’re politicians, with an agenda, who hate Musk for largely the same reasons you do.

WTF are you even talking about?

It’s amazing the lengths to which you will go just to defend your idol Musk.

Even when it concerns CSAM, you still defend him, even though there is strong evidence that his handling of CSAM on Twitter is MUCH MUCH worse than before he bought it.

So you are either a Musk fanboy sycophant hoping some day he will fuck you, or you are into CSAM and enjoy the fact that the material is easier to find now that Musk is running Twitter…

You sick twisted degenerate fuck.

Strawb (profile) says:

Re:

These aren’t experts

They’re politicians, with an agenda, who hate Musk for largely the same reasons you do.

Thank you for confirming that you only had the attention span to get through a single paragraph before you felt the need to put your stupidity and ignorance on display in the comments.

On a related note, it takes a special kind of psycho to try to downplay the importance of combating CSAM on Twitter just because you have a hate boner for Masnick.

This comment has been deemed insightful by the community.
That Anonymous Coward (profile) says:

serene sociopath smile

I’ll say this for Elmo, he knows what his target demo wants.

Stop sputtering you sound like a motorboat.

If he didn’t intend to be the one-stop shop for freaks sharing and SELLING CSAM to each other, there was a whole team that was doing their best… and he fscking fired them.

It took daily notices for a week before they removed CSAM a government had flagged for them.

Options –
Elmo is king of the pedos.

Elmo mistakenly believes his faithful aren’t the bottom feeding scum b/c he is actively ignoring the sale of CSAM on his platform.

Elmo doesn’t give a single fsck about doing the right thing if it means he has to spend money to do it.

If the next thing Twitter rolls out is the talked-about allowing content creators to sell content via the site, it’s safe to say he is a corrupt soulless fsck so much worse than anyone wanted to believe.

Meanwhile it’s kinda clear that it’s not just them being shot that Congress is ignoring about children.
There was a balloon… ooh scary…
THOUSANDS of CSAM images are being sold & traded on Twitter and the owner doesn’t seem to give a fsck, fired the team that deals with it, & cut ties with the groups trying to make the detection faster & more accurate.

But hey at least they aren’t being “silenced” any more as their faithful spend their time trying to own the libs and score that video of the hot new 8 yr old ‘star’.

This comment has been flagged by the community.

This comment has been deemed insightful by the community.
PaulT (profile) says:

Re:

Doubtful. Even if he hadn’t failed as badly as he has overall, this is the sort of thing that gets a lot of attention and it makes no sense to do this rather than use whatever Epstein-like provider that he could afford to buy in secrecy.

No, I think that we’re getting what we’re seeing in this case – an ego-driven moron who’s spent too much time being shielded from the consequences of his actions finally bit off more than he could chew, and is finding out the hard way that the people he assumed did nothing are in fact doing everything of importance.

His lack of understanding of how Twitter really operates, combined with his unwillingness to learn and eagerness to get rid of anyone who doesn’t bow to his will are the problems here. No need to invent other conspiracies.

This comment has been flagged by the community.

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re: Strawman

Nobody claimed that Old Twitter fixed or could fix the problem. Nobody claimed that Old Twitter handled CSAM well in an absolute sense. The critical claims are in relative terms. The article’s argument is that Musk’s Twitter has been worse than Old Twitter at dealing with CSAM.

Also, you forgot or ignored that, as the article pointed out, Musk’s Twitter claimed to have improved Twitter’s handling of CSAM.
Twitter Safety on February 1st, 2023:

We’re moving faster than ever to make Twitter safer and keep child sexual exploitation (CSE) material off our platform.

This comment has been flagged by the community.

Chozen (profile) says:

Re: Re:

Yes, nothing in the article establishes a baseline prior to Musk. It’s not like there isn’t data and other studies out there they could have used. When advocates don’t establish a baseline, it’s because they don’t want to establish a baseline. They expect people like you to be so fucking stupid that if they say “Made Twitter’s Child Sexual Abuse Problem Worse” you will take it at face value and not realize that at no time did they establish any kind of baseline to make such a quantitative statement.

They know you are a fucking idiot.

This comment has been flagged by the community.

Anonymous Coward says:

On a global network, sabotaging a site by flooding it with trash was already a known strategy.

SEO poisoning has been happening for over a decade. Just flood a competitor with negative SEO to delist them.

The digital version of “Good fences make good neighbors” is already a solution for much of the SPAM woes.

Internet 2.0 will evolve much better than the post-information superhighway era. Just like TV has multiple channels and mediums, multiple data networks will accommodate the changing pace of modern innovation.

A data network without commerce, ads, SPAM and culture rot is easy to implement.

They’ll always be #1, just like single cell organisms that were unable to evolve :p

This comment has been deemed funny by the community.
Violet Aubergine says:

Re:

Oh, and I bet somebody is using Inception level tech on Elon tricking him into refusing to pay for Thorn’s CSAM search engine or working with them to make it more effective. You know, automating the detection of CSAM. I bet gay Twitter is behind it!!!!!!!!! We’re all so annoyed by all the devastating realizations of how everything in liberalism has been proven wrong by the geniuses flooding Twitter since the great free speech avocado Elon Musk took over and turned it into Twitter 2.0!1!!!!!

Anonymous Coward says:

Some things to think about:

The Canadian Center for Child Protection has a bit of a reputation for flagging content that is not actionable and that they personally find to be immoral.

They don’t always do this; however, their notices require extra review, otherwise they might try to sneak something weird through. They are also a bit activist.

Thorn was founded by Hollywood actor Ashton Kutcher.

They had some controversy in 2019 for collaborating with infamous firms like Palantir, and had a number of religiously motivated “anti-trafficking” groups as partners (which they removed from their site after Engadget reporter Violet Blue confronted them).

They’re one of the biggest lobbyists behind the E.U.’s surveillance proposal, referred to by critics as “chat control”. They have privately met with individuals from the E.U. Commission a number of times.

The Commission has been criticized by the opposition for not being adequately transparent about what was discussed in these meetings.

To be honest, foreign regulators / non-profits are likely to be going through the NCMEC, or other U.S. bodies, rather than making direct requests. This might not be so bad, as some are quite keen on a “War on Porn”.

As you’ve pointed out, their responsiveness here has dropped. It’s honestly not surprising, when they have fewer staff. It’s not quite “end of the world” (they get around to it), although it doesn’t inspire confidence.

The most vexing thing is that there really are important decisions to be made, but he reduces everything down to “Old Twitter was bad in every way. I am your savior”.

This comment has been deemed insightful by the community.
That One Guy (profile) says:

There may not be a ‘perfect’ solution to CSAM, but there are certainly better and worse ways of dealing with it. Firing the people who had the horrific job of finding and getting rid of it, only suspending rather than reporting to law enforcement those that post external links to it, and no longer working with a company whose job is to provide software to make the task of finding and removal easier absolutely falls on the ‘worse’ side of the equation.

If Musk isn’t trying to make Twitter known as the place to find CSAM, he’s doing a terrible job of giving the opposite impression.

(As an aside, if there was ever an article for Musk fans to not comment on, this would absolutely be it; those jumping to the defense of Musk here have shown just how low they are willing to sink to defend the man, or, if I’m being less generous, exposed a category of content they don’t actually have a problem with.)

This comment has been deemed insightful by the community.
Stephen T. Stone (profile) says:

Re: Re: Re:

More telling is that every time I bring up an example of actual censorship⁠—e.g., Florida’s book bans⁠—they either scatter like cockroaches or deflect back to their imagined forms of “censorship”. They never like to address the fact that the politicians they support are the ones carrying out actual censorship (or that they like said censorship) because it would make them hypocrites of the highest order.

This comment has been flagged by the community.

Anonymous Coward says:

Re: Re: Re:2

Woke ideologues adore censorship when the viewpoints being silenced are the ones with which they disagree. They screech in outrage when the viewpoints being censored are ones they like. Florida politicians have the same right to censor on platforms they control as Twitter and Facebook do – the government may speak for itself as it chooses, and public schools and public libraries are instruments of the government. Neither do government employees have a 1st Amendment right to speak freely as part of their jobs.

The Florida censorship is bad, because even though they are silencing lies, silencing any viewpoint is morally wrong in a society that has free speech as a foundational value. But woke ideologues are in no position to argue against it, because of their own desire to silence and cancel those with whom they disagree.

This comment has been deemed insightful by the community.
Stephen T. Stone (profile) says:

Re: Re: Re:3

Oh, I’mma have some fun with this one, Hyman.

Woke ideologues adore censorship when the viewpoints being silenced are the ones with which they disagree.

Welcome to Coneria!

They screech in outrage when the viewpoints being censored are ones they like.

Gee, it’s almost like those viewpoints are ones that affirm the humanity of the marginalized and seek to enrich rather than restrict the knowledge of others, and any censorship of those viewpoints is tantamount to an attack on marginalized people of all kinds by the power-holding majority~. Imagine that~.

Florida politicians have the same right to censor on platforms they control as Twitter and Facebook do – the government may speak for itself as it chooses, and public schools and public libraries are instruments of the government.

Here’s the problem with that analogy: Twitter and Facebook banning some schmuck for saying anti-trans bullshit doesn’t carry the threat of a potential jail sentence in the way Florida’s anti-woke/anti-queer speech laws have that threat backing their enforcement. They may one day be repealed as a violation of the First Amendment. (Not that I trust this SCOTUS to do that…) But until that happens, bookshelves in schools are being emptied and blocked off because a handful of people deem themselves so morally superior to “the woke” that they believe themselves to be God-ordained arbiters of truth and “appropriateness” for all children of all ages.

The Florida censorship is bad, because even though they are silencing lies, silencing any viewpoint is morally wrong in a society that has free speech as a foundational value.

I find it fucking hilarious that you have to keep qualifying your supposed dislike of Florida’s censorship with shit like “even though they are silencing lies”. You’re not fooling anybody here, Hyman. You know that saying “every accusation, a confession”? That’s a thing that applies to right-wingers far more often than it applies to left-wingers. I mean, you literally said, “woke ideologues adore censorship when the viewpoints being silenced are the ones with which they disagree”, then turned around and complimented Florida’s censorship by saying “they are silencing lies”. I can’t think of a better example of “every accusation, a confession” outside of a preacher who proclaimed themselves a champion of “family values” being caught molesting a child.

But woke ideologues are in no position to argue against it, because of their own desire to silence and cancel those with whom they disagree.

You wanna know what’s funny about this statement (other than it also being another example of “every accusation, a confession”)? You’re assuming that “woke ideologues” have far, far, far, far, far more power in society and politics than they actually do.

Louis C.K. was “cancelled” over several accusations of sexual misconduct involving women; four years later, he won a goddamn Grammy. Dave Chappelle was widely criticized for a comedy special where he made several transphobic remarks disguised as jokes; Netflix stood behind him anyway and he won a goddamn Grammy this past week. J.K. Rowling has been the subject of a hell of a lot of boycotts, criticism, and so on because of her outspoken anti-trans views and her flat-out saying every dollar she makes from Harry Potter supports those views; Hogwarts Legacy still got made. If you really believed that “the woke” had the power to “cancel” people into oblivion with harsh criticism, don’t you think the three assholes I mentioned above would no longer be working in anything but right-wing ~~propaganda~~ entertainment circles?

That’s the problem with your whining about “the woke”, Hyman: You assume that the marginalized people being targeted by Florida’s censorship laws and celebrity-endorsed transphobia and so on have the absolute power to stop all that with a snap of their fingers. You paint the people who support those laws and that bigotry as “brave freedom fighters” who are trying to enable “the truth”, but the real truth is that Florida law deemed a book with a gay couple to be “pornographic” despite the book having no sexual content and you really don’t mind that one goddamn bit because⁠—and I quote⁠—“they are silencing lies”.

Trans people make up 1% of the population (if even that). Yet you think they’re so powerful that they can change entire societies merely by saying “trans rights are human rights”. You rail against them by using the phrase “woke ideologues” because you’re too gutless to just say “trans people” (or “people of color”). The delusion that trans people have even a fraction of the power you deeply believe they have is broken whenever you encounter the actual truth⁠—and the actual truth is that if they had that power you’ve deluded yourself into believing they have, we wouldn’t be seeing Florida trying to mandate the reporting and recording of female high school athletes’ menstrual data as a way of attacking trans girls.

You’re not some bold anti-fascist fighting for a morally noble cause. You’re a bigot whose current attacks against trans people are exterminationist precisely because you already want to drive them out of the public sphere. The only places left for you to go are “as openly bigoted as Donald goddamn Trump” and “hey, maybe it’s time we solve the ‘transgender debate’ with the radical power of violence”. And I don’t even give a fuck if I’m Godwinning myself, but you need to remember an important factual truth, Hyman: The Nazi death camps weren’t the first step⁠—they were the “final solution” to the “question” of what to do about a Repugnant Cultural Other that had been blamed for all of Germany’s woes. Your attacks aren’t advocacy for violence against trans people…but on a long enough timeline, the violence will be visited upon them regardless.

Maybe you wouldn’t have been standing alongside the Nazis. But you would’ve sure as shit been turning your head as your neighbors were dragged away. Accommodation and appeasement only ever gives fascists more power. Give them even the slightest bit of ground and they will take more and more and more until they have scorched the Earth (metaphorically or literally). And you seem more than okay with that approach because⁠—and I quote⁠—“they are silencing lies”. But when your life somehow becomes the “lie” they want to “silence”, who do you think will be left to stand for you when all the other “lies” have been “silenced” and everyone who’s left decides to turn their heads so they won’t be part of the “problem” that requires a “final solution”?

The funniest thing about all this is that I don’t even really hate you. In fact, I’d love to see you show even the slightest bit of introspection and change. I would sincerely love to see you ponder that last question I asked you to the point where you finally understand how your rhetoric is nothing more than proto-fascism that distills trans people into a “debate” that needs to be “won”, a “problem” that has to be “solved”, a “question” that requires a “solution”. But you make it impossible to believe you’re even remotely capable of doing anything more than spreading hate. So I hope you enjoy living in a bubble of fear and hatred stoked by grifters and fascists who want you to be their useful idiot and turn your back to your neighbors⁠—until and unless you prove otherwise, that’s all you’re capable of doing, and that’s all you’ll ever deserve.

Anathema Device (profile) says:

But but Elon is a GeNIus!

Another subject on which he knows nothing, not even enough to know how little he knows.

Only this time, it’s not just empowering Nazis, it’s destroying the lives of little kids.

So much for his heartbroken statement about holding his dying son in his arms* meaning he could never allow harm to come to a child.**

*He lied
**He lied

This comment has been deemed insightful by the community.
That One Guy (profile) says:

Re: If only there were people who's entire job was to do that

About that…

Mr. Shehan of the national center disputed that interpretation of the rules, noting that tech companies are also legally required to report users even if they only claim to sell or solicit the material. So far, the national center’s data show, Twitter has made about 8,000 reports monthly, a small fraction of the accounts it has suspended.

Anonymous Coward says:

The impression I’m getting of Ella Irwin is that even though she is a bit ambitious, she isn’t that bad. She just has to work within the constraints Elon has imposed on her, and can’t appear to not be tough.

If you remember the E.U. Chat Control debate, one of the arguments about a high false positive rate was that someone who isn’t a predator might have their lives shattered after being accused by an algorithm.

I could be wrong but she has mentioned how she avoided over-reaching before.

This comment has been deemed insightful by the community.
That Anonymous Coward (profile) says:

I see the defense of Elmo and I harken back to last week, when someone accused me of hating baby jeebus, being blinded by hate, “everyone’s done something bad”…

There is CSAM FOR SALE ON TWITTER.
Full Stop.
ELMO FIRED THE PEOPLE WHO INVESTIGATE AND STOP CSAM ON TWITTER.
Full Stop.
ELMO IS IGNORING THE REPORTS OF CSAM FROM RECOGNIZED REPORTERS OF SUCH INFORMATION.
Full Stop.

If you have any argument to make that somehow this isn’t horrific or it’s not Elmo’s fault…
What in the actual fsck is wrong with you?

If you are going to try and defend Elmo ignoring bastards trading and selling CSAM on the site, I have to ask how defective is your brain.

Children being abused on film for the pleasure of others who want to abuse children, and some motherfucker is making arguments about how it’s somehow not Elmo’s fault because old Twitter was worse….
Old twitter had a fscking team to try and stop these bastards trading this shit and making more of it to meet demands… Elmo FIRED THEM.

No ifs, no ands, no fscking buts.
There are pedophiles trading kiddie porn on Twitter.
Elmo knows (or fucking should know) and he is focused on getting money out of those using the API while allowing accounts to keep trading content of abused kids like the sickest fscking trading cards…

There is no other argument.
Pick a side…
Are you a baby fscker or not?
That’s the question.

Elmo removed the cork from the hole in the sea wall that was trying to hold back the ocean of CSAM, he has no interest in doing anything to stop that flow instead focusing on trying to make money… while sick fsckers on twitter are making bank selling content of abused kids.

Your argument is invalid.
Elmo is failing at being a human and if you keep defending him, imma gonna call you a baby fscker.
It’s not SpaceX, it’s not Tesla, it’s a dude who is giving a platform to pedophiles to sell their wares to each other… if you can defend that, please leave the planet expediently by the most painful means possible. How much of a step is it from defending Elmo making a safe place for CSAM to making your own CSAM to sell?

So bring it on, defend him in this… I fscking dare you.

Rekrul says:

The NY Times report notes that it used some tools to investigate CSAM on Twitter without looking at the material itself. While it doesn’t go into detail, from what’s stated, it sounds like [they] wrote some software to identify potential CSAM, without looking at it, and then forwarded the accounts to the Canadian Center for Child Protection and also to Microsoft, which created and runs PhotoDNA, the tool that many large companies use to identify CSAM on platforms and to report that content to NCMEC (the National Center for Missing and Exploited Children) in the US and the CCCP in Canada (and other organizations elsewhere)

And this is part of the problem. You can’t even investigate it without breaking the law. You have to use iffy AI tools to try and find it without a human actually looking at it to see if it really qualifies.

Occasionally I do reverse image searches on Google if I want to find a larger/better copy of an image. Google always suggests “similar” images. Sometimes they’re similar, but often it will include images that don’t have anything in common with the original image, other than some colors.
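The hash-based matching the comments above allude to can be illustrated with a toy example. The sketch below implements “average hash,” a simple public perceptual-hashing technique: it is not PhotoDNA’s actual algorithm (which is proprietary and far more robust), but it shows the general idea of how known images can be flagged by comparing compact fingerprints rather than having a human view the material. The pixel grids here are made-up stand-ins for real images, which would first be resized to a small grayscale grid.

```python
# Toy "average hash" perceptual hashing: fingerprint an image as a bit
# string, then compare fingerprints by Hamming distance. Small distance
# means visually similar, even if the pixel values aren't identical.

def average_hash(pixels):
    """pixels: flat list of grayscale values (a real system would resize
    the image to, say, an 8x8 grid first). Each bit is 1 where the pixel
    is above the image's mean brightness, else 0."""
    mean = sum(pixels) / len(pixels)
    return ''.join('1' if p > mean else '0' for p in pixels)

def hamming_distance(h1, h2):
    """Count of differing bits between two equal-length hashes."""
    return sum(a != b for a, b in zip(h1, h2))

# Hypothetical example data: img_b is img_a with slight noise (e.g. a
# re-compressed copy); img_c has an unrelated brightness pattern.
img_a = [10, 200, 30, 220, 15, 210, 25, 205]
img_b = [12, 202, 33, 219, 18, 212, 27, 206]
img_c = [200, 10, 220, 30, 210, 15, 205, 25]

ha, hb, hc = (average_hash(i) for i in (img_a, img_b, img_c))
print(hamming_distance(ha, hb))  # 0: the noisy copy still matches
print(hamming_distance(ha, hc))  # 8: the unrelated image does not
```

This is also why “similar image” results can misfire, as the comment notes: two unrelated images that happen to share broad brightness or color structure can land close together in hash space, which is exactly the false-positive problem raised in the Chat Control comment above.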
