Congrats To Elon Musk: I Didn’t Think You Had It In You To File A Lawsuit This Stupid. But, You Crazy Bastard, You Did It!

from the to-save-speech-we-need-to-censor-speech dept

So, yesterday we covered Elon Musk’s ridiculous censorial threat to sue Media Matters “the split second the court opens on Monday.” Of course, as we noted, you can file lawsuits 24/7. And yet, as the courts opened on Monday, there was nothing. As mentioned in the original post, I was away from internet access for most of Monday, and when I finally got back online I was told there was still no lawsuit.

But, then, on Monday evening, the lawsuit finally came, and it was glorious. Gloriously stupid.

Honestly, this feels like what you get when you have a rich but legally ignorant dude who announces on a Friday that there will be a lawsuit on Monday and finally finds some terrible lawyers who are actually willing to file something just to live up to that promise.

It’s not a good lawsuit. It’s barely even a lawsuit at all.

Let’s start at the top.

Problem 1: It was filed in Texas federal court, even as the complaint admits that exTwitter is a Nevada corporate entity, based in California, and Media Matters is a D.C.-based entity. The lawsuit barely makes any effort at all to justify the venue.

Indeed, what little justification they do present is not at all how any of this works. To get jurisdiction in Texas for non-Texas parties, they have to show that someone in Texas was involved, that the laws were violated by parties while they were in Texas, or were somehow directed at Texas parties. The complaint doesn’t even make an effort to do any of that. It just says “a substantial part of the events giving rise to the claims occurred herein.” But that’s not how any of this works.

Of course, we all know the real reason it was filed in a Texas federal court. While Texas has a pretty good anti-SLAPP law, the 5th Circuit has ruled that it can’t be used in federal court. If the lawsuit had been filed in the 9th Circuit, where exTwitter is based, then California’s (or Nevada’s) anti-SLAPP law would apply.

Problem 2: The lawsuit flat out admits that Media Matters’ reporting was accurate. It makes a big deal of claiming that Media Matters “manipulated” things and “manufactured” the controversy but… still admits that Media Matters used the platform, and saw what it saw, and reported on it.

Media Matters executed this plot in multiple steps, as X’s internal investigations have revealed. First, Media Matters accessed accounts that had been active for at least 30 days, bypassing X’s ad filter for new users. Media Matters then exclusively followed a small subset of users consisting entirely of accounts in one of two categories: those known to produce extreme, fringe content, and accounts owned by X’s big-name advertisers. The end result was a feed precision-designed by Media Matters for a single purpose: to produce side-by-side ad/content placements that it could screenshot in an effort to alienate advertisers.

But this activity still was not enough to create the pairings of advertisements and content that Media Matters aimed to produce.

Media Matters therefore resorted to endlessly scrolling and refreshing its unrepresentative, hand-selected feed, generating between 13 and 15 times more advertisements per hour than viewed by the average X user, repeating this inauthentic activity until it finally received pages containing the result it wanted: controversial content next to X’s largest advertisers’ paid posts.

Thus, on page 3 of the complaint, Musk’s newfound lawyers (not from a “BigLaw” firm like he usually uses) tell you that Media Matters did use the platform and did, in fact, see what it reported it saw. It’s just that exTwitter (i.e., Musk) doesn’t like how they portrayed their usage of the platform.

Notably, the original article never made any claims suggesting that everyone was seeing ads on neo-Nazi content. They just said they saw these ads appearing next to neo-Nazi content, and the complaint admits as much.

So the complaint is “Media Matters set up an account that followed neo-Nazis, which we allow, and found ads next to that content, which we allow, but we’re mad because Media Matters should have followed other people instead.” That’s… not a cause of action.

Problem 3: The lawsuit admits that its real complaint is that it disagrees with how Media Matters framed the story. But, and I know Musk still can’t seem to wrap his brain around this rather important fact, part of free speech and a free press is that you don’t get to dictate how others cover stories about you.

You’d think that a “free speech absolutist” would get that. But Elon Musk appears to have deeply censorial instincts rather than free speech ones:

Media Matters omitted mentioning any of this in a report published on November 16, 2023 that displayed instances Media Matters “found” on X of advertisers’ paid posts featured next to Neo-Nazi and white-nationalist content. Nor did Media Matters otherwise provide any context regarding the forced, inauthentic nature and extraordinary rarity of these pairings

So, yeah, you might also notice that this is Musk admitting that “Neo-Nazi and white-nationalist content” appear on exTwitter and that Media Matters did, in fact, see ads appear next to that content. Great work.

Problem 4: The lawsuit admits that Media Matters was not the only one to see these ads next to neo-Nazi content, even if not many others saw them.

And in Apple’s case, only two out of more than 500 million active users saw its ad appear alongside the fringe content cited in the article—at least one of which was Media Matters

Again, throughout the complaint, it admits exactly what Media Matters reported.

Its only complaint is it doesn’t like how Media Matters reported it. But the 1st Amendment protects such editorial choices. As should any self-respecting “free speech absolutist.”

Problem 5: The lawsuit attacks Media Matters for… using exTwitter’s system the way exTwitter allowed them to. It claims Media Matters “manipulated” the platform, but then describes how it used it in a perfectly reasonable manner, and that exTwitter served up the ads. Media Matters didn’t make exTwitter show these ads. ExTwitter just showed them.

Literally, the complaint admits that exTwitter’s systems worked exactly the way they were designed to, showing ads on content that someone followed, and if someone follows neo-Nazis, then ads are likely to show on that content.

First, Media Matters set out on their attempt to evade X’s content filters for new users by specifically using an account that had been in existence for more than thirty days.

Next, Media Matters set its account to follow only 30 users (far less than the average number of accounts followed by a typical active user, 219), severely limiting the amount and type of content featured on its feed. All of these users were either already known for posting controversial content or were accounts for X’s advertisers. That is, 100% of the accounts Media Matters followed were either fringe accounts or were accounts for national large brands. In all, this functioned as an attempt to flood the Media Matters account with content only from national brands and fringe figures, tricking the algorithm into thinking Media Matters wanted to view both hateful content and content from large advertisers.

Even this did not produce Media Matters’ intended result. An internal review by X revealed that Media Matters’ account started to alter its scrolling and refreshing activities in an attempt to manipulate inorganic combinations of advertisements and content. Media Matters’ excessive scrolling and refreshing generated between 13 and 15 times more advertisements per hour than would be seen by a typical user, essentially seeking to force a situation in which a brand ad post appeared adjacent to fringe content.

So, now… going on exTwitter, following neo-Nazis that Musk refuses to ban, and following advertisers is manipulative? As is “reloading” your feed? Under what theory?

Problem 6: It claims Media Matters “defamed” exTwitter, but then doesn’t include a defamation claim. The lawsuit mentions defamation three times, but not in the claims. So it repeatedly pretends that it’s arguing defamation, even though it’s not:

On November 16, 2023, Media Matters published a false, defamatory, and misleading article with the headline, “X has been placing ads for Apple, Bravo, IBM, Oracle, and Xfinity next to pro-Nazi content,” claiming that X was responsible for anti-Semitic content being paired with X’s advertisers’ paid posts.

If it was actually defamatory, Musk would have sued for defamation. The problem was that it was not. So calling it defamatory and not alleging defamation in the claims kinda makes it clear that they’re really just suing because Musk is mad about the article.

Honestly, it reads like the poor lawyer who had to do this rush job thought he was filing a defamation claim, and so added in a few claims of defamation, then a more senior lawyer realized before filing that there’s no fucking way any of this is even remotely defamation, but no one bothered to take out the language about defamation.

Now, there is a “business disparagement” claim and that’s kind of like defamation, but… even harder to show? And it still requires proving actual malice, which this complaint doesn’t even attempt to do. It does offer a “Media Matters hates conservatives” argument, but that’s not how actual malice works.

Problem 7: Other than the “business disparagement” claim, the only things they can sue over are nonsense throw-in causes of action: “interference with contract” and “interference with prospective economic advantage.”

These are the kinds of claims that terrible lawyers often include with defamation claims to try to make the lawsuit more scary, and they’re usually dismissed with the defamation claims when judges say “um, you’re really just trying to say defamation in another way here.”

None of the causes of action make any sense here. What Media Matters did was find these ads and accurately report what it found. If that causes advertisers to bail, that’s not “interference with a contract.” It’s… just reporting. If accurate reporting causes someone to end a business relationship, you don’t get to sue over it.

Problem 8: The lawsuit names Media Matters employee Eric Hananoki as a defendant and then never makes a single claim against him. It mentions (mostly in footnotes) that Hananoki has written articles critical of Musk, including the article in the complaint. But, um, if you file a lawsuit against a particular party, you have to say in the lawsuit how that party actually committed the tort in question. And the lawsuit doesn’t even bother trying.

Honestly, Hananoki has the easiest “motion to dismiss for failure to state a claim” argument ever. Normally, you have to respond to the claims made about you and how, even if true, you didn’t violate the law in question. Hananoki doesn’t even need to do that. He can just point out that the lawsuit literally makes no claims against him.

Problem 9: The lawsuit insists advertisers bailed because of this article, but conveniently leaves out the fact that Elon Musk endorsed an antisemitic conspiracy theory a day earlier, and has been promoting bigoted nonsense content for months now. Also, advertisers are free to leave if they want.

Finally, this isn’t exactly a “problem” with the lawsuit, but I’ll just note the conflict in two separate statements:

X Corp. and Elon Musk are a critical Media Matters target because X is the most prominent online platform that permits users to share all viewpoints, whether liberal or conservative, and Mr. Musk is the most prominent voice on the platform and a passionate supporter of free speech.

That’s in paragraph 41 on pages 11 and 12. On Page 14 in the prayer for relief we get this:

A preliminary and permanent injunction ordering Defendants to immediately delete, take down, or otherwise remove the article entitled “As Musk Endorses Antisemitic Conspiracy Theory, X Has Been Placing Ads for Apple, Bravo, IBM, Oracle, and Xfinity Next to Pro-Nazi Content From Its Web” from all websites and social media accounts owned, controlled, or operated, directly or indirectly, by Defendants;

So… within the span of about 2 to 3 pages we are told that Elon Musk and exTwitter are passionate supporters of free speech that allow “all viewpoints” to be shared and that Musk is filing this lawsuit to force Media Matters to take down speech that he admits is absolutely true, but where he doesn’t like how they portrayed things.

Anyway, kudos to Elon. This really takes stupid SLAPP suits to incredible new levels. I didn’t think you’d be able to find a lawyer who would file a lawsuit so stupid, that makes you look this ridiculous, but you did it. Just like people doubted your ability to shoot rockets into space or make popular electric vehicles, I should not have doubted your ability to file absolutely nonsense SLAPP suits that are this laughable.

Companies: media matters, twitter, x


Comments on “Congrats To Elon Musk: I Didn’t Think You Had It In You To File A Lawsuit This Stupid. But, You Crazy Bastard, You Did It!”

151 Comments
Anonymous Coward says:

A lawsuit this stupid can’t have been filed by a lawyer.

Even if there are two lawyers signing off on the thing, well…

I’m seriously wondering why he hasn’t tried to go pro se on this. I mean, he already doesn’t listen to anyone, and his bootlickers won’t point out the obvious…

And he clearly has enough money to infinitely refile.

This comment has been deemed insightful by the community.
James Burkhardt (profile) says:

Re:

The plaintiff is X, not Elon Musk. The best thing he could do to end this immediately would be to file pro se. Pro se representation means representing yourself. X requires a lawyer because X is a legally separate entity, and there is no person who can represent X as “themself”.

Beyond that, Musk doesn’t do anything himself. I’d be surprised if he doesn’t hire someone to wipe his ass. He definitely isn’t writing a legal filing longer than a tweet. It would take too much away from his tweeting time.

This comment has been deemed insightful by the community.
That Anonymous Coward (profile) says:

Kathryn Tewson @KathrynTewson on extwatter is giving her breakdown of the complaint, but in terms so simple even Elmo might understand them.


Up Goer Five is an XKCD comic in which the author, Randall Munroe, describes the Saturn V rocket using only the thousand most commonly used words in the English language. Amusingly, “thousand” isn’t one of them, so he has to say “ten hundred.”

https://twitter.com/KathrynTewson/status/1727002256149770483

The hilarity ensues.

“But to show that this was true, he pointed to something that one of the Word-Showing Paying People said. And what THAT said is that they are mad if their words get shown next to Bad Mean Hate Words more than NONE times.”

This comment has been deemed insightful by the community.
Anonymous Coward says:

It’s not meant to win, it’s meant to dissuade people from criticising Bigot-Me-Elmo. There was an article recently about how some researchers have stopped researching him and the platform for fear of litigational reprisals.

It’s meant to curb free speech, and this from the “free speech absolutist.”

He’s a bully, pure and simple. Stop using his platform, stop driving his cars and stop being enamoured with Nazi space rockets and ffs (to the trolls here) remove your tongues from his arse until he disappears into oblivion.

This comment has been deemed insightful by the community.
James Burkhardt (profile) says:

Re:

IBM stated that ANY amount of neo-nazi content next to its ads is a deal breaker for further ad buys. Musk got IBM ad revenue on the back of claims that he could prevent that while still allowing the neo-nazi content that the old Twitter used to remove. By the nature of Xitter, that promise cannot be kept.

People said so at the time, Elon admits, and media matters proved that by the nature of Xitter, neo-nazis will see ads next to neo-nazi content. They were able to get Xitter to serve ads against neo-nazi content. IBM has responded accordingly.

Where is the fabrication?

This comment has been deemed insightful by the community.
blakestacey (profile) says:

Re: Re: Re:

Well, that presumes that Twitter’s software people are competent.

And on that note…

“Like others, I searched #HeilHitler for an experiment and these were the ads that popped up,” Evan Hurst wrote alongside a screenshot of ads for the conservative Media Research Center, an apparel company, and a vintage Banknotes seller. “I didn’t have to ‘excessively scroll’ anything, it took under a minute to find all three.”

Scary Devil Monastery (profile) says:

Re: Re: Re:2

…I don’t think you’d have to be Jewish to be an enemy of a human-rice dish. It not being kosher isn’t going to be the deciding disqualifier for anyone is all I’m saying. I don’t see there being much of a market for any demographic of significance, to be fair.

I mean, they don’t even make Soylent Green out of real people anymore.

This comment has been deemed insightful by the community.
Stephen T. Stone (profile) says:

Re: How's this for non-fabricated data?

Elon Musk is suing a media outlet for a story that paints him and a company he owns and operates in an unfavorable light but otherwise contains nothing that could be considered defamatory under current U.S. law. The lawsuit is a clear attempt to stifle the speech of that outlet, such that it will think twice before it tries to criticize Elon again. Tell me when I’m telling lies.

Benjamin Jay Barber says:

Re: Re:

The Elements of a False Light Claim
In a false light claim, the plaintiff must prove the following elements:

The defendant published some information about the plaintiff

The information must portray the plaintiff in a false or misleading light

The information is highly offensive or embarrassing to a reasonable person

The defendant must have published the information with reckless disregard for its offensiveness

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re:

Conservatives love to claim that people revealing and drawing attention to their inappropriate behavior are the ones who are behaving inappropriately.

“Hey, Bob. You poked a hole in the hull and now the ship is sinking.”

“Why would you say that? Hey everyone, this man is causing the boat to sink!”

Every conservative knows boats with holes don’t sink if you just don’t test for holes.

This comment has been deemed insightful by the community.
Stephen T. Stone (profile) says:

Re: Re:

A similar thing happens when you point out instances of racism: “You’re the real racist! If everyone lived colorblind lives where they didn’t think about race, racism wouldn’t happen!” People who espouse this kind of bullshit think recognizing the existence of a problem so people can start trying to address the problem is the same thing as causing said problem. The “ignoring the problem makes it go away” fantasy is easier than doing the work to fix one’s own bullshit, whether it’s racism, misogyny, queerphobia, or being a George Santos–level fraud.

Benjamin Jay Barber says:

Re: Re: Re:

the premise is if everyone was literally blind, there could by definition be no racism, because it would be impossible for them to detect race.

The fact that you have to resort to mental gymnastics, that “there is a problem”, and therefore the solution to the “problem” is benevolent racism, and not seeing how malevolent it is to the “problem” races, is ironic given that was the entire justification of Jim Crow.

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re: Re: Re:2

If all human beings were blind, they’d come up with a different reason to discriminate, because it’s what people do. That you think blindness would solve the root issue behind racism (humans manufacturing reasons to hate/oppress/exploit other people) just reveals your naivete and disingenuousness.

Also, the initial assertion is wrong because people are pretty good at identifying an individual’s race by their voice: https://pubmed.ncbi.nlm.nih.gov/16243483/

Stephen T. Stone (profile) says:

Re: Re: Re:2

the premise is if everyone was literally blind, there could by definition be no racism

Other forms of hatred and bigotry would still exist. Poking everyone’s eyes out isn’t going to change that.

As for the rest of your bit: If you have to lie about my position to argue for your point, you don’t have a point worth arguing for.

Anonymous Coward says:

Re:

We don’t need to fabricate shit. Everyone knows you right wing white bois are nothing but closet child molesters and rapists, desperate and angry that you fucked around and found out that nobody gives a fuck about you no more.

Your kind can collectively throw yourselves off a cliff and the first thing we’re all going to ask is “Oh shit, is the ground okay?”

Go rope yourselves. We’re better off without you.

This comment has been deemed insightful by the community.
Narp says:

Re: terrible lawyers

An attorney writes (per CNN):

“He went with politically connected Texas lawyers, reflecting the extent to which people think that Texas courts are political actors, not legal actors,” Cohen added. “All three of the lawyers in that signature block have backgrounds with the Texas AG’s office or Solicitor General’s office.”

Coincidentally,
“immediately following Musk’s lawsuit on Monday, Texas Attorney General Ken Paxton announced a fraud investigation into Media Matters.”

This comment has been deemed insightful by the community.
Stephen T. Stone (profile) says:

Re: Re:

Texas Attorney General Ken Paxton announced a fraud investigation into Media Matters.

Oh hey, look at that, an agent of the government using his office to investigate a media outlet for expressing perfectly legal speech. I’d say that the so-called free speech absolutists would call that censorship, but they don’t have the balls to shit-talk a Republican.

Benjamin Jay Barber says:

Re: Re: Re:

The Elements of a False Light Claim
In a false light claim, the plaintiff must prove the following elements:

The defendant published some information about the plaintiff
The information must portray the plaintiff in a false or misleading light
The information is highly offensive or embarrassing to a reasonable person
The defendant must have published the information with reckless disregard for its offensiveness

Do keep in mind that Texas law REQUIRES the sort of content moderation decisions that Elon is engaging in, and that law survived both the district court and the 5th circuit court, and therefore Media Matters is trying to punish twitter for obeying those legal requirements, to the benefit of noncompliant companies.

Somewhat Less Anonymous Coward (profile) says:

Truth is stranger than fiction, but it is because Fiction is obliged to stick to possibilities; Truth isn’t.

It’s the same thing with humor. You have to invent a scenario that is impossibly ridiculous. The problem is that, given the right circumstances, nothing is impossible. In this case, you never know what lows Elmo won’t reach. Yesterday it was pizzagate; tomorrow it might be snippets from The Anarchist Cookbook for defending free speech.

This comment has been deemed insightful by the community.
freakanatcha (profile) says:

His thin skin will do him in...

Elon’s problem is that advertisers have fled, so how does this lawsuit do anything to solve his problem?

Why would Apple, or the rest, want to spend $100MM after Elon admits he’s fine with racist & anti-Semitic content and that, yes, your ads will appear next to this content from time to time, but anyone who points this out should be silenced?

This comment has been deemed insightful by the community.
Stephen T. Stone (profile) says:

Re:

after Elon admits he’s fine with racist & anti-Semitic content

If this actually goes to trial and Elon is called to testify, hearing him have to admit that he’s okay with that kind of content because he’s been replying to it on an irregular basis since buying Twitter will be hilarious. And if it does get to that point, I also hope the lawyers for Media Matters will grill him on how he allowed someone who posted CSAM to return to the platform despite an assumed “zero tolerance” policy on that material and the people who use Twitter to distribute it.

This comment has been deemed insightful by the community.
Anonymous Coward says:

And what if the Texas court dismisses the lawsuit, does it mean they are against free speech? Because, come on, he won’t retract his lawsuit after a second thought about it. Him, and one of his best (few) fans, among the “more than 500 million active users” (an estimate that he made up, I guess), would still think courts are rotten and laws are bad. Even if he loses, they’ll still think he has won…

Tanner Andrews (profile) says:

Re: fed courts are separate

footnote that says “Admission Pending” so, at some point, we went from being SG of Texas to being unlicensed in Texas

Not really. The Federal courts require separate admission, sometimes per district (e.g. M.D. Fla, N.D. Tx).

We should assume that he is licensed in Texas, as seems likely based on his history and the inclusion of “Texas Bar No. 24087727” in his signature block. There is no reason to doubt that this is legitimate.

He has probably filed an application to appear pro hac vice in this matter, but it is possible that he actually applied to become a member of the N.D. Tx bar and that is pending.

Strawb (profile) says:

Re:

If you had any cause for complaint beyond “I fffing hate Musk” (you don, really) it would be with their Lawyers, not Musk.

Are you suggesting that TD would waste their time telling Musk’s lawyers how batshit bonkers the suit is? Because that’s moronic, even coming from you.

If you purposefully lie you can be sued for defamation. Negligence too. That’s it.

Pray tell how the story from Media Matters is defamatory or negligent.

Matthew Bennett says:

Re: Re:

Show me where Media Matters lied, lied with purpose, and lied with actual malice towards Elon Musk.

They edited “screenshots” (they were not that) to show ads besides various hateful comments.

You don’t need malice btw (severe negligence will do) but it’s clearly there. Media Matters is a far-left organization and the Left these days (Like Masnick) hate free speech.

They lied very straight forwardedly and tried to “cancel” X over it (semi-successfully). This isn’t hard.

Go back to pretending any attempt to remove porn from school libraries is an attack on The Gays.

bhull242 (profile) says:

Re: Re: Re:

They edited “screenshots” (they were not that) to show ads besides various hateful comments.

Neither the lawsuit nor Elon claims that, and the lawsuit specifically says that ads do, in fact, show up beside hateful comments. Do you have any proof that the screenshots were edited?

You don’t need malice btw (severe negligence will do) but it’s clearly there. Media Matters is a far-left organization and the Left these days (Like Masnick) hate free speech.

Right, so you clearly don’t understand what “actual malice” is. It means “with actual knowledge the statement was false, or with reckless disregard for the truth, when the statement was made”. It has nothing to do with negligence or malice (in the colloquial sense of the term), and negligence is insufficient to show actual malice.

Stephen T. Stone (profile) says:

Re: Re: Re:

They edited “screenshots” (they were not that) to show ads besides various hateful comments.

Prove it.

You don’t need malice btw (severe negligence will do) but it’s clearly there.

And yet, Elon didn’t file a lawsuit claiming Media Matters defamed Twitter (or himself, for that matter). That aside, defamation of a public figure (which Elon clearly is) requires that the defamed party prove there was actual malice behind the defamatory statement.

Media Matters is a far-left organization and the Left these days (Like Masnick) hate free speech.

Remind me, which political “side” has been spending the past couple of years trying to ban books en masse from school and public libraries?

They lied very straight forwardedly and tried to “cancel” X over it (semi-successfully).

Two things.

  1. Prove they lied.
  2. Trying to “cancel” someone by pointing out their misdeeds isn’t defamation, nor is it illegal. For someone who claims that “the left” is an enemy of Free Speech, you sure do seem eager to proclaim that “the left” using speech to criticize the speech and actions of others should be (at a bare minimum) unlawful.

Go back to pretending any attempt to remove porn from school libraries is an attack on The Gays.

You show me where explicit hardcore pornography has been allowed into any school and I’ll show you a flying pig.

This comment has been deemed insightful by the community.
JMT (profile) says:

Re:

If you had any cause for complaint beyond “I fffing hate Musk” (you don, really) it would be with their Lawyers, not Musk.

The lawyers representing X, owned by Musk, and almost certainly acting at the instruction of Musk? Those lawyers?

If you purposefully lie you can be sued for defamation.

Cool, who lied? What was the lie?

David says:

Know what I think?

By now I really think somebody hired a character assassin to do a hit job on Musk. You know, someone sticking around in his circle and applauding every bad idea and detracting from sensible choices, to make sure Musk goes down the deep end.

These days there are probably character assassins working pro bono. Wait, that’s what trolls are, right?

This comment has been deemed insightful by the community.
That One Guy (profile) says:

'How dare you say the king has no clothes(but does have a swastika tattoo)?!'

This isn’t just a lawsuit against free speech; it’s a lawsuit against reality and the audacity of pointing it out.

This is Elon throwing a hypocritical tantrum because someone pointed out that yes, there absolutely is nazi content on the platform, and they have no filters against it showing up next to the ads of major companies. Which means, to the surprise of apparently several major companies, their ads can show up next to nazi content, and all it takes is for an account to follow the ‘right’ people on the platform.

RandomTroll says:

Tidbit of trivia

Haven’t scrolled to see if this has been mentioned already, but Elmo Mush DOESN’T make cars or send rockets to space. He buys COMPANIES that do that and takes credit.
Haven’t read/heard anything about (The?) Boring Company, so I don’t know anything about it, other than that the plan for underground driving in Vegas was (apparently) a “crash and burn”.
No pun intended.

This comment has been flagged by the community. Click here to show it.

Benjamin Jay Barber says:

Mike Masnick Malding Again

Problem 6: It claims Media Matters “defamed” exTwitter, but then doesn’t include a defamation claim. The lawsuit mentions defamation three times, but not in the claims. So it repeatedly pretends that it’s arguing defamation, even though it’s not:

There is a tort of “False Light”, which does not require saying anything false but for example “lies by omission”, which I think is appropriate given that they omitted their process of intentionally trying to create the harm they sought to publicize.

Indeed, what little justification they do present is not at all how any of this works. To get jurisdiction in Texas for non-Texas parties, they have to show that someone in Texas was involved, that the laws were violated by parties while they were in Texas, or were somehow directed at Texas parties. The complaint doesn’t even make an effort to do any of that. It just says “a substantial part of the events giving rise to the claims occurred herein.” But that’s not how any of this works.

Look up the Texas long arm jurisdictional statute and look up the “Calder Effects Test”; all that needs to be pled is that the defendant intended Texans to stop doing business with Twitter, so I don’t really see a problem with that either. Though Media Matters can claim forum non conveniens, the Texas state laws can still be applied.

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re:

There is a tort of “False Light”,

Which is not one of the claims in the lawsuit.

And, even if it was, you don’t seem to understand how “false light” actually works.

But, why should that surprise anyone, given that you’re a convicted criminal revenge pornographer.

This comment has been deemed insightful by the community.
Staid Winnow says:

Missing the point!

X will lose this lawsuit. Musk knows it, Media Matters knows it, Musk’s expensive lawyers know it.

But that is not the point.

Musk is going to spend a few million dollars and drag Media Matters through the Texas political landscape, a Trumpist judge, and on appeal the 5th Circuit, and perhaps even Clarito Inc.

Media Matters, which has fewer financial resources, will be dragged through the process and will be spending a lot of time and money. Money they do not have, and will probably recoup through Anti-SLAPP statutes.

At the end of it, it would have been so exhausting for them, that they would basically comply and never criticize Musk again.

And that is if multiple red state AGs do not make life hell for Media Matters with “investigations” related to X.

Those expenses, they won’t even recover.

These are the real assaults on Free Speech. These are the cases that thefire.org should pursue. Not some students barking at Kyle Duncan.

David says:

Re: Re:

You can appeal the venue decision, but I don’t think that kind of appeal would involve input from the plaintiff, so it is not adding much to the nuisance of a nuisance lawsuit unless you are out to annoy the judge in order to later claim you annoyed them so much that they must be partial by now. The Trump fraud trial defense is currently trying that route.

This comment has been deemed insightful by the community.
K`Tetch (profile) says:

There is a tort of “False Light”, which does not require saying anything false but for example “lies by omission”, which I think is appropriate given that they omitted their process of intentionally trying to create the harm they sought to publicize.

Bit of a problem there, Captain Galaxy Brain.
Texas doesn’t recognize the tort of false light: Cain v. Hearst Corp., 878 S.W.2d 577, 580-81 (Tex. 1994).

ooh, fuck, maybe you should have actually looked into it beyond the name?

Look up the Texas long arm jurisdictional statute and look up the “Calder Effects Test”; all that needs to be pled is that the defendant intended Texans to stop doing business with Twitter, so I don’t really see a problem with that either. Though Media Matters can claim forum non conveniens, the Texas state laws can still be applied.

Yes, please do look up the Calder Effects test. Here, let me do it for you:
a defendant must (1) commit an intentional act (2) that is expressly aimed at the forum state and (3) causes actual harm that the defendant knows is likely to be suffered in the forum state.
See that number 2? They must have expressly aimed it at stopping Texas from doing business with Xitter. They don’t mention Texas once, so it’s hardly targeted just there.

And the ‘Texas long arm jurisdictional statute’ applies to Texas state courts, as it’s a state law, not to federal courts, which are a whole different legal system.

Your level of legal knowledge is right on the ‘driving is a commercial activity and you only need a license if you’re operating in commerce’ level of competence.

This comment has been deemed insightful by the community.
nerdrage (profile) says:

Musk still doesn't get it

When Musk bought Twitter, he was entering the ad business, of which he apparently had no inkling. In the ad business, the whole point is to treat your customers, the advertisers, surpassingly well, so they get the results they want. Advertising is a conservative and skittish business, all based on image. So the one big no-no is to harm your advertisers’ image in any way, shape, or form.

If one person out of 500,000 sees an Apple ad next to Nazi content, that is completely unacceptable. That one person will tell others and pretty soon all 500,000 know about it.

The solution is to thoroughly scrub the site of all unacceptable content, by whatever means required. Such as by hiring lots of mods. Mods are not some kind of unnecessary expense. They are vital to any ad-based platform.

Or, drop the ads and pivot to a wholly subscription model. The Nazis can pay the freight for the advertisers they run off. That’ll be $1000/month, please.

This comment has been flagged by the community. Click here to show it.

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re:

Fiction Matters manipulated the platform to create a premise to support its partly invented narrative.

Invented narrative? Like this narrative:
exTwitter/Musk: Ads will never be displayed alongside racist/nazi/etc content.
Media Matters: What about these examples?

Either it did happen or it didn’t. And if it didn’t happen, how did Media Matters manipulate exTwitter into displaying the ads alongside this content, which was guaranteed never to happen? Did they fake the screenshots? Did they hack exTwitter?

Refreshing a page isn’t manipulation, you know, even if some intellectually stunted people think it is. It’s necessary for someone to find out what ads are available on the page because, and this may blow your mind, not every available ad is displayed at the same time.

The lawsuit is junk. But so is the reporting.

What’s also junk is that you are attacking the messenger of exTwitter’s failings. You even had to invent a disparaging name to show your bias, and your post is manipulative and disingenuous, putting on display your inability to accept reality, which is that these ads actually showed up with no more serious effort than refreshing some pages.

Stephen T. Stone (profile) says:

Re:

[Media] Matters manipulated the platform to create a premise to support its partly invented narrative.

Let’s say that’s true. Wouldn’t that outcome still be a problem for Twitter, since that would mean its algorithms can be manipulated to place advertising next to posts from White supremacists, Nazis, antisemites, and Trump supporters who think Project 2025 should become a reality? I mean, I know I wouldn’t want to advertise on a platform where such posts/users were not only allowed, but openly welcomed by its owner.

LostInLoDOS (profile) says:

Re: Re:

Wouldn’t that outcome still be a problem

Maybe. Depends on the advertiser. The fact that to get there (where this org got to) requires extensive work. They read and followed multiple extremist accounts, refreshing over and over and over, eventually getting fed adverts from the general pool alongside the content they chose to consume.

Ultimately, I don’t think most advertisers care. Not until they are targeted directly by some media org that wants to force its opinion across the board with invented scandals.

A few will react, e.g. Apple, as they have big political arms that push social agendas. Others may respond, but do nothing.

The question of whether targeted blocks for advertising were available and failed is more important than the content.
What Fiction Matters did was (possibly) show that people who intentionally follow certain content will see advertising like anyone else. It’s up to the individual advertisers how they respond to this, and up to Musk to clarify how X will deal with any supposed flaw in their algorithm.

Stephen T. Stone (profile) says:

Re: Re: Re:

Maybe. Depends on the advertiser. The fact that to get there (where this org got to) requires extensive work.

No, it doesn’t. And that’s the problem for Twitter: In Musk’s misguided quest to turn Twitter into a haven for all free speech, he turned the site into a Nazi bar where Nazi speech (or any equivalent thereof) is easy to find⁠—and easy to see ads placed next to it. The advertiser exodus should’ve been a clue to him that some speech will have to be “sacrificed” if he wants Twitter to stay both popular and profitable. But since he treasures the idea of far-right dipshits giving him 8 bux a month and lavishing praise upon him so they can keep saying bigoted bullshit without worrying about suspensions/bans, he’ll never learn that lesson, and his $44bn investment is going to turn into the biggest money pit he has ever owned.

I don’t think most advertisers care.

Given how many companies paused or outright cancelled their advertising on Twitter⁠—and how several major brands have paused their posting on Twitter⁠—I think they do care if Twitter manages to associate them with content from Nazis, white supremacists, and far-right Trump supporters.

The question of if targeted blocks for advertising is available and failed is more important than the content.

That you see a platform’s embrace of pro-Nazi content as less of a problem than advertisers leaving the site over that content is your problem. Solve it yourself, Trumpist.

LostInLoDOS (profile) says:

Re: Re: Re:2

No, it doesn’t.

Wow, you link to a post about an org that intentionally sought out and followed content to find that content, then manipulated the system to get the result they pretend is anything other than extensive manipulation.
You’re against content existing that you disagree with.
Despite the fact that the odds of a random person finding such content without a dedicated search for it are nearly zero.

Given how many companies paused or outright cancelled

Given that many came back, and many never left? The initial moral panic over allowing content passed. The content is not being fed to the masses. It is not easy to find without an intentional search for it. We can argue about where advertising appears; I’m partly on your side there. But I also understand the extreme difficulty of filtering content. Keyword filters are as likely to kill advertising for a NYT story as for a Nazi rally.

That you see a platform’s embrace

I don’t have a problem. I’m a free speech absolutist. If it’s legal it stays.
Yes, I’ve called out Musk on banning people critical of him. Right here in TD. I see banning content as against freedom. Especially if such content is far from generally available. It’s not like that stuff is first page for new users.

Stephen T. Stone (profile) says:

Re: Re: Re:3

Despite the fact that the odds of a random person finding such content without a dedicated search for it are nearly zero.

And yet, you see no issue that people can search for that content, can find it on Twitter in droves, and can see ads placed next to it. The whole point of the article to which I linked is that people can find pro-Nazi content and see ads placed next to it without manipulating Twitter’s algorithms⁠—unless you want to seriously claim that directly searching for the hashtag “heilhitler” is such a manipulation.

The initial moral panic over allowing content passed.

Considering that many of the brands that recently paused their advertising on Twitter have also stopped posting on Twitter, I’d say that the “moral panic” over Elon welcoming pro-Nazi/antisemitic content on Twitter isn’t as dead as you so desperately want it to be.

I also understand the extreme difficulty of filtering content.

Ads were being served next to posts containing the hashtag #HeilHitler. I don’t know what you think on the matter, but if it’s “difficult” for a social media service to filter content with that hashtag, I’d say that the service in question isn’t worth using.

I don’t have a problem. I’m a free speech absolutist. If it’s legal it stays.

Yes, yes, you don’t care if the site becomes a shitpit for racists and Nazis and antisemites so long as their speech remains untouched, shoved in everyone’s faces, and shoved next to ads run by companies that don’t necessarily want to have their ads placed next to pro-Nazi memes. We fucking get it.

I’ve called out Musk on banning people critical of him.

Not loudly enough.

I see banning content as against freedom.

If I run a social media service and I ban pro-Nazi content, upon whose freedom am I infringing, how am I infringing upon their freedom, and what proper-ass legal consequences should I face in a court of law for infringing upon that freedom? Because…

Especially if such content is far from generally available.

…you seem to be forgetting that no privately owned and operated interactive web service, no matter how open to the public it may be, has even one obligation⁠—legal, moral, or ethical⁠—to host content that it doesn’t want to host. Don’t come to me with shit like “generic speech platforms” because that’ll make you sound like our resident transphobe and I’ll debunk that shit faster than you can think of a dozen comebacks against what I’ll say. And don’t tell me that a refusal to host content is “censorship” because you and I both know that you’ll be trying to sell me a load of bullshit by calling it “an argument”.

LostInLoDOS (profile) says:

Re: Re: Re:4

content, can find it on Twitter in droves

Can find it if they seek it out. I see no problem at all.

without manipulating Twitter’s algorithms

Refreshing the page hundreds of times to force an ad through is manipulation.

you so desperately want it to be.

Actually, I don’t care in the end. My focus is on censorship. X existing or not has zero effect on me. I don’t use it.

And when the NYT posts that in an article about the posts? They did, btw. How do you not filter them out if they chose to post the article on X?

shoved in everyone’s faces

Bull. It’s not shoved in anyone’s face. It’s not placed in any feeds unless you specifically choose to follow that material.
I’m absolutely against forcing anything on people. It’s part of my issue with Google’s algorithms.
I have no problem with alternative content existing if it is over there where the people interested can access it, and those who aren’t don’t get it.

If I run a social media service and I ban pro-Nazi content, upon whose freedom am I infringing, how am I infringing upon their freedom,

Those that post the materials and those that wish to consume it.

and what proper-ass legal consequences should I face in a court of law for infringing upon that freedom

None. Private company, private rules.

to host content that it doesn’t want to host

I have never said that. Never. In fact I am against the push to force hosting. Even more than my hatred for censorship.

And don’t tell me that a refusal to host content is “censorship”

That it is. I don’t subscribe to the nonsense that censorship is a government action. If a book store refuses a title, that’s censorship. If a department store refuses a clothing line, that’s censorship. If a social platform blocks or removes content, that’s censorship.
I cringe when people like you try to ignore localised censorship just because you agree with the action and it wasn’t the government.

Stephen T. Stone (profile) says:

Re: Re: Re:5

Refreshing the page hundreds of times to force an ad through is manipulation.

And what if someone finds an ad next to pro-Nazi content with one shot⁠—is that not a problem to you?

I don’t care in the end.

You do, but go on.

I have no problem with alternative content existing if it is over there where the people interested can access it, and those who aren’t don’t get it.

And yet, when I talk about a social media service banning “alternative content” (that’s a nice euphemism for bigotry, pro-Nazi speech, and other such hateful content), you keep referring to it as “censorship” because “Oh My GoD iT’s DeLeTiOn!!!1!” and you keep crowing on and on about how deletion is censorship even if the content being deleted goes against the rules and the content was already posted somewhere else. You are practically aching to say that social media services censor users by making them follow rules that prevent them from harassing people. It’s right on the tip of your tongue.

I am against the push to force hosting.

And yet, you advocate against the idea of social media services deleting content that violates their rules (but not U.S. law). You’ve said multiple times before that you believe such deletion is censorship. By advocating for social media services to continue hosting such content without deletion, you are advocating for those services to force themselves into hosting content they otherwise wouldn’t. But hey, keep telling yourself that you’re not⁠—comforting yourself with lies is what you do best. (Do you think Henry Kissinger was a hero, too?)

If a book store refuses a title, that’s censorship. If a department store refuses a clothing line, that’s censorship. If a social platform blocks or removes content, that’s censorship.

None of those things are censorship.

The First Amendment protects your rights to speak freely and associate with whomever you want. It doesn’t give you the right to make others listen, make others give you access to an audience, and/or make a personal soapbox out of private property you don’t own. Nobody owes you a platform or an audience at their expense.

No book is inherently owed a spot in a book store. No clothing line is inherently owed a spot in a department store. No speech is inherently owed a spot on a social media network. If any of those statements were false, we’d have a much different story on our hands. But without a right of “free reach”, all of those statements are true⁠—and hey, look at that: The right to “free reach” is some imaginary bullshit with no basis in U.S. law (including judicial precedent)!

But hey, by all means, please explain how Barnes & Noble voluntarily refusing to carry a book that B&N execs don’t want sold through its stores is preventing that book from being sold anywhere else under threat of legal punishment. I’m sure your reasoning won’t sound like complete bullshit~.

LostInLoDOS (profile) says:

Re: Re: Re:6

is that not a problem to you

No. That’s a problem for the advertisers. I understand that advertisers don’t choose their spots online.

content being deleted goes against the rules and the content

It’s still censorship. You have this premise that if it’s legal it’s not censorship. Remember CleanFlicks? They advertised that they censored content.
What I still don’t get is why these companies don’t just kick the issue in the balls and say: yes, we do. Follow the TOS or don’t use the service.

advocate against the idea of social media services deleting content that violates their rules

I advocate for different rules. Different methods. Compartmentalism of content.

None of those things are censorship

Blocking or removing content is censorship. Private censorship is legal. You have the right to choose what content you host. I have the right to point out you practice censorship.

Stephen T. Stone (profile) says:

Re: Re: Re:7

I understand that advertisers don’t choose their spots online.

They still choose the site on which they want to place their ads. If the site makes it possible for those ads to regularly show up next to patently offensive content, that is a problem for the site. No brand worth a good god’s damn would want their ads next to pro-Nazi content; the problem, then, is the site’s embrace of such content to the point where the owner of that site believes he’s being a champion of free speech by allowing that speech even if it causes his site to literally die on that hill.

You have this premise that if it’s legal it’s not censorship.

My premise is that if someone isn’t prevented from sharing an expression of ideas by force of law, acts of violence, or any threats thereof, they haven’t been censored. If Mike wants to delete my comment after I post it, I can repost it on my personal website, on my social media accounts, or any other site willing to host this comment. Mike didn’t censor me any more than Twitter would be censoring Nazis by deleting their speech as TOS violations.

I advocate for different rules. Different methods. Compartmentalism of content.

And those rules and methods amount to advocating for a site owner to keep hosting the content they want to get rid of because you’ll call them a censor and a bully and other such mean names if they do delete it. Your rules and methods would preclude a site owner from fully getting rid of bigoted content⁠—and it would take one screenshot from behind the walls erected around that content getting out to paint that site as a host for that content even if it’s been sandboxed. And you’d still ask for the site owner to keep hosting the content out of an extreme absolutist belief in the freedom of speech even if the reputations of the site and its owner were tanked by those leaks. You’d ask someone to destroy themselves and their site all so a neo-Nazi prick could toss around racial slurs on a site that never wanted to host them in the first place.

Blocking or removing content is censorship.

If Barnes & Noble refuses to carry a book for sale, that content isn’t blocked from being sold at any other bookstore or through any online bookseller. The same principle applies if B&N removes a book from its shelves. No book⁠—not a single goddamn book in the entirety of existence!⁠—is owed a spot on shelves in even one B&N store. To suggest that B&N’s refusal to give any given book a spot on its shelves equals “censorship” is to suggest that “freedom of reach” is a real thing, and what you’d be suggesting is so blatant a lie and so horrifying a concept that I have to wonder if you have some form of brain damage.

You have the right to choose what content you host. I have the right to point out you practice censorship.

See? This implies that choosing what content I would host on a social media service I own and operate is denying someone their right to free reach⁠—that I’d be “censoring” someone because my one lone service, which would apparently be the only existing social media service or host for third-party speech anywhere on the Internet, refuses to host speech by bigots and assholes.

What the fuck is wrong with you, man. I mean, other than your uncritical, unyielding, and undying adherence to Trumpism.

LostInLoDOS (profile) says:

Re: Re: Re:8

even if it causes his site to literally die on that hill.

That is the owner’s right. Musk is far from the champion he claims to be, but it ultimately is his choice.

Mike didn’t censor me

We will never agree on this. I support the right to legal censorship by private individuals. And my right to scream out the fact that they do. Forced speech is nearly as bad as banned speech.
But if you remove it, you act as a censor.

keep hosting the content they want to get rid of

Not at all. But I will practice my free speech by pointing out the factual censoring act. Anyone who removes content is acting as a censor.

even if it’s been sandboxed

And I will point out that anyone who has a problem with highly compartmentalised content needs to review their own taste. What if X suddenly decided no Catholics or no abortion discussion? See, some people want that. And a site that hosts it is supplying a service for speech. Not supporting the content of said speech.

freedom of reach

A private company has the right to host what they choose. But! When they ban speech, they should be called out for it.

See, it’s easy to see where we stand and why. You haven’t been affected by censorship. You’re probably younger than me. But I remember the book, music, game bans. I remember the television content debates and the invention of the V-chip.
The government likes censorship. And they depend on like-minded private individuals to enforce bans they can’t create themselves.
Be it Musicland not carrying badged albums for a long period of time, or Blockbuster banning NC-17 content.
Yet freedom ultimately wins out despite any early backlash, even when the holdouts change their minds.
Tower Records outlived Musicland. Family Video had locations years after the fall of Blockbuster, as did Lion.
X will likely carry on as well. The panic will pass. Life moves on.

Stephen T. Stone (profile) says:

Re: Re: Re:9

Forced speech is nearly as bad as banned speech.

And yet, you’ve literally advocated for a moderation method that is effectively “forced speech”. By the by, ask 4chan’s current owner how that site’s attempt to quarantine speech via /pol/ worked out.

Anyone who removes content is acting as a censor.

Does kicking someone out of my home for shittalking my family actually censor them?

  • If “yes”: I really can’t wait to see you explain that logic.
  • If “no”: How does the same logic somehow not apply to privately owned websites?

anyone who has a problem with highly compartmentalised content needs to review their own taste

…fucking what

What if [Twitter] suddenly decided no Catholics or no abortion discussion? See, some people want that.

Choosing to ban either Catholics or discussion of abortion would be the decision of Elon Musk. And that ban would be a stupid decision, but an otherwise lawful one. He has no obligation to host any kind of discussion of abortion on Twitter.

And a site that hosts it is supplying a service for speech. Not supporting the content of said speech.

Choosing what speech to host (or not) is an implicit endorsement of/association with that speech. If a site chooses to host antisemitic speech, regardless of quarantining, it is choosing to associate itself with that speech. The same issue applied broadly with KiwiFarms: Its old hosting company became associated with the activity from that site, which led that company to rethink its association with KF (and ultimately drop KF as a client).

Elon’s decision to let extremist right-wing hate speech on Twitter with little-to-no pushback means Twitter is now associated with that speech. If Elon didn’t want that association to happen, he could’ve left intact stringent bans on that speech, suspensions on the worst offenders, and the Trust & Safety team tasked with enforcing the TOS. But he didn’t, so it happened. You can bitch and moan about how unfair that is, but life isn’t fair⁠—the fact that Henry Kissinger lived to 100 is proof enough of that.

A private company has the right to host what they choose. But! When they ban speech, they should be called out for it.

Why?

A privately owned website has every right to choose what speech it will host. If it chooses not to host racial slurs, queerphobia, and Nazi propaganda, why does that deserve to be “called out” as “censorship” when that decision doesn’t stop anyone who wants to post that speech somewhere from posting it anywhere else but that one website? Therein lies your problem, Trumpist: You’re still laboring under the delusion that the right to free reach exists and the decision by a website to say “no, this site won’t host this kind of speech”⁠—the decision to deny someone the privilege of using private property they don’t own as a soapbox⁠—is a violation of that imagined right and therefore censorship.

But I remember the book, music, game bans. I remember the television content debates and the invention of the V-chip.

Son, I’ve been alive for all those things. I was alive when the PMRC got smacked down in Congress by Dee Snider and John Denver. I was alive when Night Trap and Mortal Kombat were the subject of Congressional hearings and the ESRB was born; shit, I remember when Sega instituted its own pre-ESRB ratings system. I remember when TV shows didn’t have TV ratings logos in the upper-left-hand corner of the screen. Don’t you dare presume what I’ve experienced or what has or hasn’t affected me unless I give out more than enough information for you to make a very educated guess.

Oh, and FYI: None of the censorship of which you speak is even remotely the same thing as a website voluntarily refusing to host third-party speech that the owner of that website considers objectionable. Don’t bother trying to argue otherwise; I’m not that stupid and you’re not that clever.

The government likes censorship. And they depend on like-minded private individuals to enforce bans they can’t create themselves.
Be it Musicland not carrying badged albums for a long period of time, or Blockbuster banning NC-17 content.

Walmart still doesn’t sell albums with “Parental Guidance” stickers. So what? People could always buy them elsewhere even before Internet shopping became a thing. A store/chain voluntarily deciding what products it will and won’t sell⁠—which, might I add, it’s allowed to do without government interference⁠—isn’t censorship unless you believe in the right of free reach. So please, go ahead and keep trying to explain how a Barnes & Noble voluntarily refusing to sell a book that can be bought at any other bookstore and any online retailer is “censorship” of that book. And while you’re at it, maybe you can get a camel through the eye of a needle.

LostInLoDOS (profile) says:

Re: Re: Re:10

And yet, you’ve literally advocated for a moderation method that is effectively “forced speech”

I’m really not. I simply point out facts. You can support free speech fully, partly, or stand against it. If you do not support it fully, I have the right to point that out, e.g. Musk and X.

Does kicking someone out of my home

Yes. They have had their speech censored within your walls. That is your right as the owner. They have the right to say you censored them afterwards.

He has no obligation to host any kind of discussion of abortion on Twitter.

Exactly. But that doesn’t change the fact that he has acted as a censor.

now associated with that speech

Apparently true for the small-minded, self-righteous idiots who can’t understand a platform is not a publisher. Maybe you should join the MPA?

Why

Because they utilise censorship.

is even remotely the same thing as a website voluntarily refusing to host third-party speech that the owner of that website considers objectionable

It completely is. If a station, streamer, or game system edits or refuses a game despite the ratings systems, then they are censoring. Just as Walmart did. Most Walmarts don’t have CDs anymore; they do carry M games and R videos. Oh, and they did carry Showgirls; much controversy over that from the moral police.

censorship unless you believe in the right of free reach. So please, go ahead and keep trying to explain how a Barnes & Noble voluntarily refusing to sell a book

That’s quite easy to understand. If the book is available to them, and they refuse to stock it, then they have censored the title from their sales.
You’ll have to purchase it elsewhere, at a less censorial location.

Stephen T. Stone (profile) says:

Re: Re: Re:11

I’m really not.

You really are, though. You’re advocating for a position of forcing a platform to either host all legally protected speech (regardless of any quarantining) or accept the false label of “censor and enemy of free speech” (which will obviously harm its reputation).

You can support free speech fully, partly, or stand against it.

A platform choosing not to host certain kinds of speech is using its own right to free speech in saying “no, we won’t host your Nazi bullshit here”. You’re saying that a platform choosing to do that is against free speech.

Yes. They have had their speech censored within your walls.

How did I prevent that person from going anywhere else in the world and shittalking my family? How did I infringe upon their actual legal right to speak freely and not their imagined right to speak freely on private property that they don’t own?

They have the right to say you censored them afterwards.

“I have been silenced!” yells the dipshit into a megaphone while standing on the public road outside of my house. “My right to speak has been infringed!” yells that same dipshit while standing on the public walkway outside of my job. “Something must be done to stop this infringement of civil rights,” he posts on multiple forums on the Internet. Yep, I sure as shit censored him~.

Apparently true for the small minded self righteous idiots that can’t understand a platform is not a publisher.

If I let Nazi speech onto my platform without explicit approval of that speech, I am still giving that speech an implicit endorsement: “I am okay with this speech being on my platform regardless of how other users feel about it.” If I ostensibly ban Nazi speech but let it propagate without any actual moderation⁠—or do as you suggest and quarantine it where it can still be easily found and leaked outside of the platform⁠—I am still sending that same message. The only way for me to avoid that association, implicit or otherwise, is to prevent that speech from being posted onto my platform or moderate that speech off of my platform as quickly and ruthlessly as possible. And since the people who want to post that speech on my property have no right to do so, and I have no obligation of any kind to host it, calling my decision “censorship” is like referring to professional wrestling as a “legitimate combat sport”: You might be able to make someone who has either an intellectual disability or a non-functional sense of reason believe you, but you’re not going to be taken seriously by anyone who isn’t in one of those two groups. Go sell that shit to Trumpists, because I’m sure they’ll buy it.

If a station, streamer, or game system edits or refuses a game despite the ratings systems, then they are censoring.

No, they’re not. Let’s look at Nintendo: They have every right to say “we don’t allow games like Doom Eternal on the Switch” and enforce that decision by refusing to allow that game to be published on the Switch. That decision can’t, doesn’t, and won’t ever prevent Doom Eternal from being sold on platforms that can run Doom Eternal without a dip in graphical fidelity. If Bethesda thinks it can edit Doom Eternal to fall within Nintendo guidelines, I’d consider that straddling a fine line between censorship and editorial discretion. (Oh, and FYI? Doom [2016] and Doom Eternal were both released uncut and uncensored on the Switch.)

The hypothetical above is a completely different beast than the days of gaming where Nintendo and one or two competitors ruled over all of gaming and PC gaming was considered a niche effort. Konami’s removal of crosses in Super Castlevania IV was absolutely censorship because Nintendo would’ve refused to let the game release in the States without that censorship and Konami had no other viable options for releasing the game. Contrast that with the idea of “Doom Eternal can’t be released on the Switch” in an era where owners of the current versions of the PlayStation and the XBox, as well as PC gamers, could already get their hands on it. Please, tell me again how a game being unavailable on a single console because of the rules of the company that makes said console⁠—a game that is already available on multiple consoles and PC platforms⁠—is censoring the publisher of that game by preventing them from releasing that game on other platforms. I dare you.

If the book is available to them, and they refuse to stock it, then they have censored the title from their sales.

The book is only censored if you believe in the right of free reach. That idea forms the foundation of your entire argument about censorship⁠—and since the idea isn’t an actual law or binding legal precedent, your argument is about as solid as the ocean and about as deep as a mud puddle.

LostInLoDOS (profile) says:

Re: Re: Re:12

You’re advocating for a position of forcing

Not forcing, only labelling. If a platform censors content on their platform, they get labelled a censor. E.g., X censoring tweets about Musk.

How did I prevent that person from going anywhere else in the world and shittalking my family?

You didn’t, you only censored them within your property.

Yep, I sure as shit censored him~.

Only within your home. Though having a megaphone may violate noise ordinances.

groups. Go sell that shit to Trumpists, because I’m sure they’ll buy it

Not likely, Trump seeks to censor fake stories. Thus, censorship.

That decision can’t, doesn’t, and won’t ever prevent Doom Eternal from being sold on platforms that can run Doom Eternal… I dare you

Nintendo doesn’t censor (some publishers do on the platform). Sony has been the king of censorship for the last two generations.

If a game is censored on CensorStation and not on Switch, CS players have a censored version. To play the proper version you need to use a Switch, or PC, or Xbox, or Mac.
But Sony doesn’t tell you they have neutered the game. You have to buy it, then find out it is a damaged, censored title. That’s where local censorship goes wrong.
That is why one must mark all forms of localised censorship they come across.

believe in the right of free reach

Yes, you seek to call local censorship by any other term you can. Just because something is over there doesn’t change the fact that it is not here, and only by the decision of the here to censor that content.
Free reach does not extend to private locales. Private companies can censor all they wish. I have a right to point out when they do.

Stephen T. Stone (profile) says:

Re: Re: Re:13

Not forcing, only labelling.

Distinction without a difference when you’re asking a platform to harm itself by accepting a negative label one way or another.

You didn’t, you only censored them within your property.

The right of free reach isn’t a thing; neither is the right to use someone else’s property as your soapbox without their permission. The only way booting someone out of my home for shittalking my family could ever⁠—and I mean eeeeeeeeeeeeeeeeeeeeeeeeeever⁠—qualify as censorship is if either of those rights actually existed.

Trump seeks to censor fake stories

Yeah, sure, that’s the only kind of stories he wants to censor~. He has absolutely no issue with people reporting facts like “Donald Trump lost a free and fair election”~. He actually wanted to open up libel laws to make things more difficult for him to sue people who said mean-yet-true things about him~.

(jfc you disgust me with your asskissing)

Nintendo doesn’t censor (some publishers do on the platform).

Nintendo did censor in the past. I mean, do you think Acclaim and Midway made the decision to censor Mortal Kombat when it was released on the SNES when it included a blood code on the Genesis port? Yes, nowadays, publishers are more likely to censor content to make it more palatable for (and in certain cases, legally sellable to) specific audiences. Rare is the day when a console maker actually goes “y’all axe the titties and the gore or y’all ain’t gettin’ your game out on our console”, your over-a-decade-old examples included.

Sony is the king of censorship for the last 2 Generations.

Is that for niche Japanese games released in the United States that required alterations to either dodge an AO rating or avoid accusations of some seriously unsavory shit because of cultural differences between Japan and the States, or is that for American games as well? Go on, give me some examples of Sony’s made-and-marketed-for-the-States releases suffering from legitimate content censorship that isn’t found on any other platform. You know I’ll wait.

you seek to call local censorship by any other term you can

You seek to label any form of moderation that isn’t “yeah, sure, let people say slurs on here all they want, just hide those users and their vileness under this rug that everyone can easily lift up to expose all the nasty shit” as censorship. You seek to paint any platform that outright refuses to host speech as vile as racial slurs and anti-queer propaganda as “enemies of free speech”. You seek a path where the right to free reach is enshrined in law so what you refer to as “censorship” can never take place again because you consider yourself an enemy of censorship in all forms.

Alls I’m doing is telling you that kicking someone off a social media platform for saying the N-word is moderation. You’re the one who believes that losing the unguaranteed privilege of using someone else’s property as your own private soapbox is the exact same act in every possible respect as being denied the right to express yourself freely without interference⁠—and that those who would deny that privilege to anyone should be given the exact same treatment in every possible respect as people who burn books instead of reading them.

You have more in common with Moms for Liberty and Greg Locke than you’ll ever have with any actual champion of free speech.

LostInLoDOS (profile) says:

Re: Re: Re:14

Wait no longer:
https://comicbook.com/gaming/news/ps4-ps5-playstation-game-martha-is-dead-censored/

I walked away from CS when they damaged Bunny Must Die, making the progression far more difficult than necessary. The same game I now own on Switch is not censored, and I can progress properly.
It’s not just Japanese games. It’s indie developers as well.

LostInLoDOS (profile) says:

Re: Re: Re:15

The last few years, PlayStation has earned a reputation for censoring games. The most noteworthy example of this was Devil May Cry 5

Last I checked, none of these are AVN titles. The list is extensive. And very few are localised AVN. Very little of such material is on the CS, even in Japan (I have family and friends in the region).

I’ll grant that they backed off over the last year, and even allowed updates that fix some earlier cuts.
There was a mass exodus of US users over two years. Over 30% of (American) first-purchase buyers abandoned the platform, according to CIRP.
Sony made the wrong decision. They fixed the problem, mostly, but their reputation is destroyed. Much like Nintendo in the late ’90s, new leadership changed direction.
The difference here is that Nintendo was a kids’ company that finally grew up; Sony was a cross-spectrum company that chose to act like a kids’ company. Will they recover? Don’t know. But I’ll never return.

LostInLoDOS (profile) says:

Re: Re: Re:14

Late, and a bit aside from the main idea here, but there’s also the move to digital.
You can no longer import a large number of games. There is/was a huge import market in this country. I have imported games since the Sg3000/Master System days. Today, with digital lockouts, you can’t simply place an order from a catalog anymore, or even import a system and just plug and play.
Nintendo has completely turned around and become a beacon of freedom in the gaming market.

Every content cut should be shouted far and wide, so that those unaware are made aware.

andrea iravani says:

Elon Musk may end up spending the rest of his life in court, prison, or both. When Musk announced that he was going to purchase Twitter for $44 million, I correctly pointed out that Musk does not give a damn about the 1st Amendment and that he purchased it strictly to develop an AI chatbot; today he announced that in a few weeks, a Grok analysis button will be added under all X posts.

Silicon Valley, whose culture Musk is part of, is totally out of sync with what technology and social media users want and expect from technology and social media.

Industry after industry, people have gone on strike against AI chatbots. Industry after industry has banned the use of them by employees over security concerns, in both the public and private sector.

It is just a sick evil stupid scam and a form of data theft and identity theft by retards, perverts, and thieves.
