On Social Media Nazi Bars, Tradeoffs, And The Impossibility Of Content Moderation At Scale

from the decentralizing-the-nazi-bar-problem dept

A few weeks ago I wrote about an interview that Substack CEO Chris Best did about his company’s new offering, Substack Notes, and his unwillingness to answer questions about specific content moderation hypotheticals. As I said at the time, the worst part was Best’s unwillingness to just own up to what he was effectively describing as the site’s content moderation plan: that Substack would be quite open to hosting the speech of almost anyone, no matter how terrible. That’s a decision you can make (in the US, at least), but if you’re going to make it, you have to be willing to own it and be clear about it, which Best was unwilling to do.

I compared it to the “Nazi bar” problem that has been widely discussed on social media in the past: if you own a bar and don’t kick the Nazis out up front, you get a reputation as “the Nazi bar” that is difficult to shake.

It was interesting to see the response to this piece. Some people got mad, claiming it was unfair to call Best a Nazi, even though I was not doing that. As in the story of the Nazi bar, no one is claiming that the bar owner is a Nazi, just that the public reputation of his bar would be that it’s a Nazi bar. That was the larger point. Your reputation is what you allow, and if you’re taking a stance that you don’t want to get involved at all, and you want to allow such things, that’s the reputation that’s going to stick.

I wasn’t calling Best a Nazi or a Nazi sympathizer. I was saying that if he can’t answer a straightforward question like the one that Nilay Patel asked him, Nazis are going to interpret that as he’s welcoming them in, and they will act accordingly. So too will people who don’t want to be seen hanging out at the Nazi bar. The vaunted “marketplace of ideas” includes the ability for a large group of people to say “we don’t want to be associated with that at all…” and to find somewhere else to go.

And this brings us to Bluesky. I’ve written a bunch about Bluesky going back to Jack Dorsey’s initial announcement which cited my paper among others as part of the inspiration for betting on protocols.

As Bluesky has gained a lot of attention over the past week or so, there have been a lot of questions raised about its content moderation plans. A lot of people, in particular, seem confused by its plans for composable moderation, which we spoke about a few weeks ago. I’ve even had a few people suggest to me that Bluesky’s plans represented a similar kind of “Nazi bar” problem as Best’s interview did, in particular because their initial reference implementation shows “hate speech” as a toggle.

I’ve also seen some people claim (falsely) that Bluesky would refuse to remove Nazis based on this. I think there is some confusion here, and it’s important to go deeper on how this might work. I have no direct insight into Bluesky’s plans. And they will likely make big mistakes, because everyone in this space makes mistakes. It’s impossible not to. And, who knows, perhaps they will run into their own Nazi bar problem, but I think there are some differences that are worth exploring here. And those differences suggest that Bluesky is better positioned not to be the Nazi bar.

The first is that, as I noted in the original piece about Best, there’s a big difference between a centralized service and its moderation choices, and a decentralized protocol. Bluesky is a bit confusing to some because it’s trying to do both things. Its larger goal is to build, promote, and support AT Protocol as an open protocol for a decentralized social media system with portable identity. Bluesky itself is a reference app for the protocol, showing how things can be done — and, as such, it has to handle content moderation to avoid Bluesky itself running into the Nazi bar problem. And, at least so far, it seems to be doing that.

The team at Bluesky seems to recognize this. Unlike Best, they’re not refusing to answer the question, they’re talking openly about the challenges here, but so far have been willing to remove truly disruptive participants, as CEO Jay Graber notes here:

But, they definitely also recognize that content moderation at scale is impossible to do well, and believe that they need a different approach. And, again, the team at Bluesky recognizes at least some of the challenges facing them:

But, this is where things get potentially more interesting. Under a traditional centralized social media setup, there is one single decision maker who has to make the calls. And then you’re in a sort of benevolent dictator setup (or at least you hope so, as the malicious dictator threat becomes real).

And this is where we go on a little tangent about content moderation: again, it’s not just difficult. It’s not just “hard” to do. It’s impossible to do well. The people who are moderated, with rare exceptions, will disagree with your moderation decisions. And, while many people think that there are a whole bunch of obvious cases and just a few that are a little fuzzy, the reality (this is where scale comes in) is that there are a ton of borderline cases that all come down to very subjective calls over what does or does not violate a policy.

To some extent, going straight to the “Nazi” example is unfair, because there’s a huge spectrum between the user who is a hateful bigot, deliberately trying to cause trouble, and the good, helpful user who is trying to do right. There’s a very wide range in the middle, and where people draw their own lines will differ massively. Some of that middle ground is inadvertent or ignorant assholery. Some of it is just trolling. Sometimes there are jokes that some people find funny and others find threatening. Sometimes people are just scared and lash out from fear or confusion. Some people feel cornered, and get defensive when they should be looking inward.

Humans are fucking messy.

And this is where the protocol approach with composable moderation becomes a lot more interesting. The most extreme calls, the ones with legal requirements, such as child sexual abuse material and copyright infringement, can be handled at the protocol level. But as you start moving up into the more murky areas, where many of the calls are subjective (not so much “is this person a Nazi” but more along the lines of “is this person deliberately trolling, or just uninformed…”), the composable moderation system (1) lets end users set their own rules and (2) enables any number of third parties to build tools that work with those rules.
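
To make that layering concrete, here’s a minimal sketch in TypeScript. The names and types are entirely hypothetical — this is not Bluesky’s actual API or the real AT Protocol SDK — but it illustrates the idea: legally required removals happen for everyone, while everything above that is decided by which labelers a user trusts and what that user has chosen to do with each label.

```typescript
// Hypothetical sketch of composable moderation. None of these types or
// functions are part of the real AT Protocol SDK; they only illustrate the
// layering described above.

type Label = "csam" | "copyright" | "hate" | "trolling" | "spam";
type Action = "remove" | "hide" | "warn" | "show";

interface Post {
  uri: string;
  text: string;
  // Labels attached by any number of independent labeling services.
  labels: { source: string; value: Label }[];
}

// Layer 1: legal requirements, enforced for everyone at the infrastructure level.
const LEGALLY_REQUIRED_REMOVALS: Label[] = ["csam", "copyright"];

// Layer 2: each user picks which labelers to trust and what to do per label.
interface UserPrefs {
  trustedLabelers: string[];
  actions: Partial<Record<Label, Action>>; // e.g. { hate: "hide", trolling: "warn" }
}

function moderate(post: Post, prefs: UserPrefs): Action {
  // Legal removals happen regardless of user preferences.
  if (post.labels.some((l) => LEGALLY_REQUIRED_REMOVALS.includes(l.value))) {
    return "remove";
  }

  // Everything else is decided by the user's own settings, applied only to
  // labels coming from labelers that user has chosen to trust.
  let result: Action = "show";
  for (const label of post.labels) {
    if (!prefs.trustedLabelers.includes(label.source)) continue;
    const action = prefs.actions[label.value] ?? "show";
    // Keep the strictest action any trusted labeler triggers.
    if (action === "hide") result = "hide";
    else if (action === "warn" && result === "show") result = "warn";
  }
  return result;
}

// Two users, same post, different outcomes: one hides anything labeled "hate",
// the other only wants a warning.
const post: Post = {
  uri: "at://example/post/1",
  text: "…",
  labels: [{ source: "labeler.example", value: "hate" }],
};
console.log(moderate(post, { trustedLabelers: ["labeler.example"], actions: { hate: "hide" } })); // "hide"
console.log(moderate(post, { trustedLabelers: ["labeler.example"], actions: { hate: "warn" } })); // "warn"
```

The point of the sketch is that the subjective calls never have to be made once, globally, by a single decision maker: they are made per user, per labeler, and can be changed at any time.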

Some people may (for perfectly good reasons, bad reasons, or no reasons at all) just not have any tolerance for any kind of ignorance. Others may be more open to it, perhaps hoping to guide ignorance to knowledge. Just as an example, outside of the “hateful” space, we’ve talked before about things like “eating disorder” communities. One of the notable things there was that when those communities were on more mainstream services, people who had gotten over an eating disorder would often go back to those communities and provide help and support to those who needed it. When those communities were booted from the mainstream services, that actually became much more difficult, and the communities became angrier and more insulated, and there was less ability for people to help those in need.

That is, there will still need to be some decision making at the protocol level (this is something that people who insist on “totally censorship proof” systems seem to miss: if you do this, eventually the government is going to shut you down for hosting CSAM), but the more of the decision making that can be pushed to a different level and the more control put in the hands of the user, the better.

This allows for competition to provide better moderation, first of all, but it also accommodates the variance in user preferences, which is what you see in the simple version that Bluesky implemented. The biggest decisions can be made at the protocol level, but above that there can be competing approaches and more user control. It’s unclear exactly where Bluesky the service will come down in the end, but early indications are that the service-level “Bluesky” will be more aggressive in moderating, while the protocol-level “AT Protocol” will be more open.

And… that’s probably how it should be. Even the worst people should be able to use a telephone or email. But enabling competition at the service level AND at the moderation level creates more of the vaunted “marketplace of ideas,” where (contrary to what some people think the marketplace of ideas is about) if you’re regularly a disruptive, disingenuous, or malicious asshole, you are likely to get less (or possibly no) attention from the popular moderation services and algorithms. Those are the consequences of your own actions. But you don’t get banned from the protocol.

To some extent, we’ve already seen this play out (in a slightly different form) with Mastodon. Truly awful sites like Gab, and ridiculously pathetic sites like Truth Social, both use the underlying ActivityPub and open source Mastodon code, but they have been defederated from the rest of the fediverse. They still get to use the underlying technology, but they don’t get to use it to be obnoxiously disruptive to the main userbase who wants nothing to do with them.

With AT Protocol, and the concept of composable moderation, this can be taken even further. Rather than just having to choose your server and be at the whim of that server admin’s moderation choices (or the pressure from other instances, which keeps many instances in check and aligned), the AT Protocol setup allows for a more granular and fluid system, with a lot more user empowerment, without having to resort to banning certain users from using the technology entirely.
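
One way to see the difference in granularity is another rough sketch (again with invented names — this is not either protocol’s real API): defederation is a single decision an admin makes for everyone on a server, while composable moderation lets each account assemble its own mix of block lists and label preferences without switching servers.

```typescript
// Rough illustration of the granularity difference; all names are invented.

// Fediverse-style defederation: one admin decision applies to every account
// on the server.
const defederated = new Set(["gab.example", "truth.example"]);
function visibleOnThisServer(authorHost: string): boolean {
  return !defederated.has(authorHost);
}

// Composable-moderation style: each account assembles its own mix of block
// lists and labels to hide, and can change that mix at any time.
interface MyModeration {
  blockedAccounts: Set<string>; // from block lists this user subscribes to
  mutedLabels: Set<string>;     // labels this user has chosen to hide
}

function visibleToMe(author: string, labels: string[], prefs: MyModeration): boolean {
  if (prefs.blockedAccounts.has(author)) return false;
  return !labels.some((l) => prefs.mutedLabels.has(l));
}

// Two users on the same server can make different choices:
const alice: MyModeration = { blockedAccounts: new Set(["did:example:troll"]), mutedLabels: new Set(["hate"]) };
const bob: MyModeration = { blockedAccounts: new Set(), mutedLabels: new Set() };
console.log(visibleToMe("did:example:troll", ["hate"], alice)); // false
console.log(visibleToMe("did:example:troll", ["hate"], bob));   // true
```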

This will never satisfy some people, who will continue to insist that the only way to stop a “bad” person is to ban them from basically any opportunity to use communications infrastructure. However, I disagree for multiple reasons. First, as noted above, outside of the worst of the worst, deciding who is “good” and who is “bad” is way more complicated and fraught and subjective than people like to note, and where and how you draw those lines will differ for almost everyone. And people who are quick to draw those lines should realize that… some other day, someone who dislikes you might be drawing those lines too. And, as the eating disorder case study demonstrated, there’s a lot more complexity and nuance than many people believe.

That’s why a decentralized solution is so much better than a centralized one. With a decentralized system, you don’t have to worry about being cut out yourself, either. Everyone gets to set their own rules and their own conditions and their own preferences. And, if you’re correct that the truly awful people are truly awful, then it’s likely that most moderation tools and most servers will treat them as such, and you can rely on that, rather than having them cut off at the underlying protocol level.

It’s also interesting to see how the decentralized social media protocol nostr is handling this. While some of the initial thinking behind it appears to have been the idea that nothing should ever be taken down, many are now recognizing how impossible that is, and they’re having really thoughtful discussions on “bottom up content moderation” specifically to avoid the “Nazi bar” problem.

Eventually in the process, thoughtful people recognize that a community needs some level of norms and rules. The question is how those are created, how they are implemented, and how (and by whom) they are enforced. A decentralized system gives end users much greater control to build the systems and communities that more closely match their own preferences, rather than requiring a centralized authority to handle everything and live up to everyone’s expectations.

As such, you may end up with results like Mastodon/ActivityPub, where “Nazi bar” areas still form, but they are wholly separated from other users. Or you may end up with a result where the worst users are still there, shouting into the wind with no one bothering to listen, because no one wants to hear them. Or, possibly, it will be something else entirely as people experiment with new approaches enabled by a composable moderation system.

I’ll add one other note on that, because when I’ve discussed this, people sometimes highlight that there are other kinds of risks beyond direct harassment, and that just blocking a user does not stop them from encouraging or directing harassment against someone. This is absolutely true. But this kind of setup also allows for better tooling to monitor such things without the target having to be exposed to them directly. This could take the form of Block Party’s “lockout folder,” where a trusted third party reviews the harassing messages you’ve been receiving so you don’t have to go through them yourself. Or, conceivably, other monitoring and warning services could pop up that track people who are doing awful things, try to keep them from succeeding, and alert the right people if things require escalation.
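
As a sketch of what that kind of tooling could look like (hypothetical names again — not Block Party’s actual product or API): messages caught by a filter go into a held queue that a trusted delegate, rather than the target, reviews, and only what genuinely needs attention ever gets escalated.

```typescript
// Hypothetical sketch of a "lockout folder": harassing messages are held for a
// trusted delegate to review so the target never has to see them directly.

interface Message {
  id: string;
  from: string;
  text: string;
}

interface HeldMessage extends Message {
  heldAt: Date;
  reviewedBy?: string;
  escalated?: boolean;
}

class LockoutFolder {
  private held: HeldMessage[] = [];

  constructor(private readonly trustedReviewers: Set<string>) {}

  // Messages caught by the user's filters land here instead of the inbox.
  hold(msg: Message): void {
    this.held.push({ ...msg, heldAt: new Date() });
  }

  // Only a trusted delegate can read the queue and decide what to escalate.
  review(reviewer: string, shouldEscalate: (m: HeldMessage) => boolean): HeldMessage[] {
    if (!this.trustedReviewers.has(reviewer)) {
      throw new Error("not an authorized reviewer");
    }
    const escalated: HeldMessage[] = [];
    for (const msg of this.held) {
      msg.reviewedBy = reviewer;
      msg.escalated = shouldEscalate(msg);
      if (msg.escalated) escalated.push(msg);
    }
    return escalated; // only these ever reach the person being harassed
  }
}

// Usage: a friend reviews the queue and escalates only credible threats.
const folder = new LockoutFolder(new Set(["trusted-friend"]));
folder.hold({ id: "1", from: "harasser", text: "…" });
const needsAttention = folder.review("trusted-friend", (m) => m.text.includes("threat"));
console.log(`${needsAttention.length} message(s) escalated`);
```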

In short, decentralizing things and allowing many different approaches, open systems, and tooling doesn’t solve every problem, but it presents some creative ways to handle the Nazi bar problem that seem likely to be a lot more effective than living in denial and staring blankly into the Zoom screen as a reporter asks you a fairly basic question about how you’ll handle racist assholes on your platform.

Companies: bluesky, substack


Comments on “On Social Media Nazi Bars, Tradeoffs, And The Impossibility Of Content Moderation At Scale”

167 Comments
This comment has been deemed insightful by the community.
Bobson Dugnutt (profile) says:

Re: Re:

I realize after I posted this that I shouldn’t have made this crack in a tech forum. It’s on the nose and might be a personal affront.

Here’s where the crack came from. In 2009, self-identified and self-identifying Libertarian Peter Thiel built a bridge. Over the bridge hangs this sign:

“I no longer believe that freedom and democracy are compatible.”

He said this in a piece entitled “The Education of a Libertarian,” published by The Cato Institute, set up to advance libertarianism as an identity.

Thiel is also one of the most pre-eminent financiers of the world’s fastest-growing economic sector. He has Curtis Yarvin, who has utterly deranged views on culture and modernity under the pen name Mencius Moldbug, on retainer as his court philosopher. He is a faculty member of Stanford. He makes sure he gets invited to parties. He hosts salons featuring alt-right and White supremacist guests of honor. He invites himself to parties.

He’s now actively making a push to get his movement into politics. He has succeeded with acolyte JD Vance as Ohio’s senator (though he failed to get Blake Masters in as Arizona’s senator, if only by 5 points). Vance’s office is the conduit through which Thiel will smuggle his ideas, largely through his unelected, unnamed staff who will network with fellow rightwing groups.

So, yeah, Thiel is Power personified. And because Thiel is the most successful libertarian in business and in culture, he has made libertarianism in his image.

Many, if not most, libertarians followed Thiel across the bridge. Did you?

This comment has been deemed insightful by the community.
mick says:

Re: Re: Re:

Thiel pretends to be libertarian and all for small government, but then happily uses the government to kill off Gawker because they outed him as gay.

The guy’s a clown posing as a libertarian, as many wealthy people are wont to do. I’m not a libertarian (or any other type of joiner), but playing along with him and his pretend “libertarianism” is almost as hysterical as his early insistence that he was straight.

This comment has been deemed insightful by the community.
Bobson Dugnutt (profile) says:

Re: Re: Re:2

The problem with libertarianism is the same as Marxism in the U.S. They are mirror images, really.

Both are theoretically elegant. They do have fully formed, logically coherent arguments that they make. However, they both make grand claims to politics — ideology as a framework for organizing society.

Trouble is, when they try to turn theory into practice — the proof of the pudding is in its eating, after all — not only does practice look nothing like theory but they make the world a worse place by forcing reality to conform to their theories.

There is no ideal-form libertarian that we could point to to judge practical libertarianism against. It’s also not been able to create any successful political movement or a political program.

Peter Thiel is the most successful libertarian who has ever lived. He identifies as a Libertarian. He passes for Libertarian among Libertarian circles. He’s not only one of the wealthiest people alive today, he’s also one of the wealthiest people in history.

He is very much political. He is very much philosophical. He is also building out his organizing principle of the world.

Read up on Curtis Yarvin to see where Thiel is trying to take libertarianism. And yeah, when Peter Thiel opened his bridge in 2009, many libertarians crossed it and believe “freedom” and “democracy” are at odds and define freedom as only deserving of the people who crossed that bridge.

For those who didn’t cross that bridge, find another bridge to cross, change your name, or … yeah, realize that democracy has a lot going for it and delivered more good to more people and generally more humanely than most other politics before it.

This comment has been deemed funny by the community.
Anonymous Coward says:

This will never satisfy some people, who will continue to insist that the only way to stop a “bad” person is to ban them from basically any opportunity to use communications infrastructure.

Don’t they know? The only thing that stops a bad guy with a tweet is a good guy with a tweet!

Anonymous Coward says:

Re:

Reminder to those who want to whine about “censorship”: Bluesky, like Twitter and Gab and all the others, can’t actually censor anyone.

They can, though the topics they CAN censor are limited to the following:

a) News and criticism about the social media platform itself
b) News and criticism about the leaders/founders of the social media platform itself
c) anything that falls afoul of 1A or is explicitly criminal in nature (which would then be under the purview of whatever criminal code/law the platform is following)

…and that’s about it! Everything else is just “are you going to follow the TOS or not”.

This has been a thing since the days of USENet, email lists AND IRC, and even earlier.

Also, while private platforms CAN slam you with a SLAPP, it is usually worse for them PR-wise to do so.

Corporate censorship is another thing altogether.

This comment has been deemed insightful by the community.
Stephen T. Stone (profile) says:

Re: Re:

Y’all can’t resist taking the bait, huh.

They can

No, they can’t. Even with the first two subjects you claim they can “censor”, they can’t stop anyone from going anywhere else⁠—including a competing platform!⁠—and saying those things. To censor is to deny someone their ability to speak; being booted off a single platform for violating its rules doesn’t deny anyone their rights.

You can want to be a martyr for free speech all the live-long day. Getting booted from a social media platform doesn’t, and won’t ever, make you one.

private platforms CAN slam you with a SLAPP

It’s possible, yes. But how often has that ever happened?

Anonymous Coward says:

Re: Re: Re:

they can’t stop anyone from going anywhere else⁠—including a competing platform!⁠—and saying those things.

Unfortunately, yes, this has happened before, though not with a social media platform.

Blitzchung, anyone? Kizuna Ai’s harassment from her parent company? (Full on censorship of an accurate timeline of events ON REDDIT and blocking of a GoogleDoc of said timeline of events) The latter even got the FUCKING CHINESE, of all people, to go “wow parent company, that’s fucked up”.

Rare, yes. But it has happened before. And even with the Blitzchung thing Blitzchung still was free to post elsewhere, but the attempted censorship did happen.

Getting booted from a social media platform doesn’t, and won’t ever, make you one.

Agreed there. Technically, you’d have to die to be a martyr, but, splitting hairs and all.

It’s possible, yes. But how often has that ever happened?

Hasn’t happened yet but knowing Elon, he’ll burn his dad’s blood emerald mine to force one.

Stephen T. Stone (profile) says:

Re: Re: Re:4

Not yet.

Yes, yet. No single social media service can take away anyone’s right to speak freely.

But Elon probably will try anyway.

Then that would be Elon doing it, not Twitter. Can you guess what the big metaphysical difference is between Apartheid Emerald Mine Space Karen and the corporation named Twitter?

Ethin Probst (profile) says:

Re: Re:

Nope, they can’t.
In law, “censorship” is defined as “any and all actions undertaken by a governmental or other authoritative entity which serve to restrict, prohibit, or suppress the dissemination, expression, publication, or transmission of information, communication, or ideas,” whereas “banning” (or any of the other terms that an “anti-viewpoint moderation bill” might use) might be defined, collectively, as “a private entity’s capacity to limit, prohibit, or suppress the dissemination, expression, publication, or transmission of information, communication, or ideas by an individual or group”. There is a significant difference between these definitions. You cannot claim that a “private entity” is a “governmental or other authoritative entity”, because that term means “any organization or individual that exercises governmental or quasi-governmental power, authority, or control over a particular jurisdiction, territory, or group of people,” whereas a “private entity” is “any organization or individual that is not a governmental or quasi-governmental entity and is primarily engaged in commercial, non-profit, or other non-governmental activities.” See?
Items A and B are nonsensical, and can only be done through civil disputes. Oh, a private entity could remove your criticism on its servers or systems, but that’s not censoring you (see the definitions above). They could try a civil lawsuit, but they’d then be trying to prove defamation, which is not criticism. If they own multiple platforms, they could completely silence you on all of those platforms, but that still wouldn’t count as censorship, because the private entity isn’t a governmental authority and does not have governmental or quasi-governmental power. And you could always take your speech elsewhere. Now, if the USG/state gov/etc. explicitly passed a law that said you couldn’t talk or teach about x or y (something that some states are doing, cough cough Florida cough cough), that is censorship. And it also violates the first amendment as well as potentially the 14th. Another example of censorship would be the government sending you an NSL or other legal order that says that you can’t talk about something. Which also means that, technically, permanent gag orders or NSLs that explicitly tell you not to talk about them and that if you do, you’ll be prosecuted are a form of censorship. Not that anyone will actually hold the government accountable for that. (I excluded temporary gag orders from this discussion because they are just that, temporary. And they usually don’t last long.)
Anyway, there’s my tangent for the day.

This comment has been flagged by the community. Click here to show it.

Anonymous Coward says:

Re: Re: Re:

Yes, it is.

https://ncac.org/news/blog/private-censorship-fighting-suppression-of-speech-by-non-governmental-actors

It is merely the case that private censorship of all sorts is legal, because of the 1st Amendment rights of the censors, so it doesn’t show up very much in legal cases. But censorship has a “common law” meaning, not just a technical legal one. When someone has their opinion silenced by a private actor, they have been censored just as much as if the government had done it.

This comment has been flagged by the community. Click here to show it.

This comment has been flagged by the community. Click here to show it.

This comment has been flagged by the community. Click here to show it.

Anonymous Coward says:

Re: Re: Re:7

Murdered trans people tend to be Black transwomen street prostitutes, namely young Black men in crime-adjacent professions. It would be nice if the people professing that Black lives matter would become aroused by the unfortunately disproportionate number of Black lives taken by Black people, and acted within the Black community to do something about it.

“Fixing trans people” is actually an interesting question. There are all sorts of conditions people have that render them atypical in some way. Not all of those people want to be changed to be more like normal people. As things stand, people who are judged mentally competent to make their own decisions should be the ones making them, and they can choose to try to get “fixed”, or they can choose to live as they are. Otherwise, it’s the responsibility of their guardians to decide.

If, at some future point, we can determine whether an unborn fetus will develop any of these atypical conditions, it will be up to the mother to decide to abort, to try to have the fetus fixed in utero, or to let things proceed as they will. At that point, we will likely also develop new systems of morality around these various choices. More fights, yay!

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re: Re: Re:4

Hyman.

Per your FUCKING ADMISSION, you have been REMOVED from those fine conservative watering holes for, we can presume, acting like you fucking do here.

And AGAIN, per your OWN FUCKING ADMISSION, you have been taught, from your parents, no less, the dangers of the fucking ideology you keep shilling for. Lessons you apparently DIDN’T LEARN.

If you don’t get that WE DON’T WANT YOU HERE, what will?

This comment has been flagged by the community. Click here to show it.

Anonymous Coward says:

Re: Re: Re:5

I get banned from sites because I like to argue against false but prevalent beliefs on those platforms.

My parents did not teach me that men can be women, or to ignore accurate crime statistics.

I don’t care that you don’t want me here. I suppose that a way that might get me to leave, aside from Masnick literally blocking me, would be for no one to engage with me. Fortunately that seems unlikely to happen.

This comment has been deemed insightful by the community.
Ethin Probst (profile) says:

Re: Re: Re:2

Yes, it has a common law definition, but even the common law definition (and even the layman’s definition) refers to governments or quasi-government authorities. The common law definition is “the suppression or prohibition of speech, expression, or other forms of communication by a government or other authority,” and the layman’s definition is “The use of state or group power to control freedom of expression or press, such as passing laws to prevent media from being published or propagated.” I’d say that’s pretty clear. The use of the word “propagated” completely eliminates any form of interpretation that would otherwise imply that just getting banned from an internet property is censorship, solely because that doesn’t suppress the propagation of anything. I mean, you can’t access that internet property, but that doesn’t stop you from going to another internet property or creating your own. Suppression of the propagation of information, ideas, etc., would imply some method of completely silencing you, e.g., imprisoning you, denying you the access to the internet or other mediums of communication, etc.

This comment has been flagged by the community. Click here to show it.

Stephen T. Stone (profile) says:

Re: Re: Re:6

Of course I don’t want to silence trans people.

And yet…

How else will everyone come to understand, via republication by sites such as LibsOfTikTok, what nonsense they are claiming?

…you support the people who do. Not supporting their goals doesn’t absolve you of supporting their methods⁠—or the pain they cause to the people whom you apparently take delight in seeing suffer.

This comment has been flagged by the community. Click here to show it.

Anonymous Coward says:

Re: Re: Re:7

Remember that LibsOfTikTok, as their name implies, gathers videos and other media that show woke ideologues, gender and otherwise, acting ridiculously or badly, and publicizes those. Or they show the sexually explicit pages from the books that woke ideologues want to keep in school libraries. To the extent that woke ideologues find these postings painful, they have no one to blame but their fellow woke ideologues.

This comment has been flagged by the community. Click here to show it.

This comment has been flagged by the community. Click here to show it.

Stephen T. Stone (profile) says:

Re: Re: Re:5

And yet, under your free reach–wanting ideology, even those forums (so long as they’re sufficiently “generic”) would be forced to host speech that the owners otherwise wouldn’t host. Whether you support those exact ends doesn’t absolve you of supporting the means others will use to get there⁠—just like not supporting laws that ban trans people from being in public (i.e., “bathroom bills”) doesn’t absolve you of supporting the hateful, fascist, and genital-obsessed ideology behind those laws.

This comment has been flagged by the community. Click here to show it.

Anonymous Coward says:

Re: Re: Re:6

A forum dedicated to a specific ideology is, of course, not generic, but just the opposite.

The owners of the forum get to choose what to allow. No one is forcing anyone to allow anything. However, large generic speech platforms should value the principles of freedom and not silence opinions based on viewpoint.

Trans people “being in public” is not the same as trans people forcing their way into single-sex spaces for which their bodies disqualify them.

I do not need to be absolved of anything.

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re: Re:

silencing opinions based on viewpoint on platforms the censor controls.

Kicking assholes off a social media platform is not censorship, it’s kicking an asshole off a social media platform.

Just because you are the asshole that got kicked off Twitter, doesn’t mean there aren’t plenty of other social media sites that will accept assholes.

To wit: Truth Social — Why don’t you go there to spew your hate garbage.

This comment has been flagged by the community. Click here to show it.

This comment has been flagged by the community. Click here to show it.

This comment has been flagged by the community. Click here to show it.

Anonymous Coward says:

Re: Re: Re:

There is nothing wrong with wanting things, only with forcing people who are under no obligation to provide you with the things you want.

I want large generic speech platforms to not censor opinions based on viewpoint. If they choose to censor, they should be encouraged, shamed, criticized, or bought to get them to change their ways. But never forced.

I am bigoted only to the extent that reality is bigoted. You woke ideologues may put Winston Smith’s head into a rat cage until he states that a man is a woman, but reality does not bend to people’s will, and men will never be women no matter how many people state otherwise.

Stephen T. Stone (profile) says:

Re: Re: Re:2

But never forced.

Don’t lie to us, Hyman. Like I said elsewhere: You’re the kind of person who wants to shit all over and ruin spaces where other people (especially marginalized people) gather and associate with one another. You’ll never admit it, but that’s the effect of your actions and the goal of your ideology. You give more of a fuck about using your rights and freedoms to make other people suffer than you ever will about using it to improve the lives of marginalized people.

That’s how I know you’d be perfectly fine with a law that says “using Twitter is a civil right”⁠—and how everyone else here knows it, too. After all, you’re always bitching about not being able to spew your transphobic bullshit where it isn’t welcome. (Hell, that’s precisely why you stopped logging in to post.) For what reason would you not support a law that says your bullshit is guaranteed a spot⁠—and therefore some semblance of an audience, no matter how forced⁠—on a platform you don’t own?

This comment has been flagged by the community. Click here to show it.

Anonymous Coward says:

Re: Re: Re:4

Hyman.

You are the cause.

I am left with things best called “violates 1A” for simply saying them.

You won’t listen, you won’t learn, your “debates” and “arguments” are continually called out and proven wrong, and you refuse to even consider the DSM-V as the academic consensus without twisting the definition to your own terms.

I am starting to understand WHY those fine conservative sites kicked you out. Despite you expressing their bullshit talking points. You simply WON’T stop until someone STOPS YOU.

This comment has been deemed insightful by the community.
Stephen T. Stone (profile) says:

Re: Re: Re:

Right-wing dipshits like Hyman know exactly what they’re doing in that regard. The point is to take a useful word and make it worthless in any form of discourse: “Censorship”, in this case, can be made worthless by attaching it to anything that only “feels” like censorship but actually isn’t (e.g., social media moderation) so actual acts of censorship (e.g., book bans) won’t seem so serious in comparison. It’s a fascist trick, and it only works if we let it.

That One Guy (profile) says:

Re: Re: Re:

‘Based on viewpoint’ seems to be the current dishonest and cowardly dog-whistle that bigots or otherwise toxic individuals use to try to avoid having to say exactly what those viewpoints are, in order to frame consequences for being an asshole as some unjustified persecution.

‘I was kicked out of X due to the viewpoints I expressed’ is a lot more likely to garner sympathy than ‘I was kicked out of X for harassment and/or bigotry’.

This comment has been flagged by the community. Click here to show it.

Stephen T. Stone (profile) says:

Re: Re:

Tell me again how someone getting banned from a privately owned open-to-the-public social media service is the exact same act as a government agent threatening someone with jail time for saying things the government doesn’t like. Will you try to Gish Gallop me into another mental breakdown like you did the last time you got into this shit with me, or will you make a genuinely convincing argument?

This comment has been flagged by the community. Click here to show it.

Anonymous Coward says:

Re: Re: Re:

That’s a stupid question, because it isn’t trying to be a question — it’s an attempt to put words into my mouth in a dishonest way, the precise action you kept complaining about when you wrongly thought it was happening to you.

To give it more credit than it deserves, I will, once again, explain to you the idea that two things can fall into the same category and not be the same. Most people grasp this by, oh, first grade or so, when they’ve figured out that cats are animals, and so are cows, but cows are not cats.

Similarly, both of those things are acts of censorship. But they aren’t ‘exactly the same’, something I’ve always taken pains to explicitly clarify to you and which has apparently never gotten through. At this point your misunderstanding appears wilful, and I’m starting to get annoyed by it.

This comment has been flagged by the community. Click here to show it.

Stephen T. Stone (profile) says:

Re: Re: Re:4

censorship doesn’t require a violation of a right

You’re right.

But it does require at least the threat of violating a right. And a Twitter ban doesn’t do that.

I’m pissed off about a couple of other things right now and I don’t need to get into a long-ass argument with you because you’re going to make me angrier and I do not need to go through another mental breakdown right now. So I’m gonna leave you to keep whining about how things that only “feel” like censorship to you are actually censorship. Keep at it, champ⁠—because when everything is censorship, nothing will be.

This comment has been flagged by the community. Click here to show it.

Anonymous Coward says:

Re: Re: Re:5

“But it does require at least the threat of violating a right.”

No, it does not. What it requires is a third party stopping you from speaking. That is it, and that is all.

Whether someone’s rights are violated in the course of that doesn’t determine if something is censorious, nor even if it’s bad, because of the wide variety of civil rights regimes in the world. There are acts of official, government censorship that don’t breach someone’s rights because those rights have been defined so that the government can do stuff like that. But saying that, therefore, it’s not censorship, would be incredibly stupid.

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re: Re: Re:6

No, it does not. What it requires is a third party stopping you from speaking. That is it, and that is all.

That is a selfish, self-serving definition that leads to granting those who shout the loudest a heckler’s charter.

Anonymous Coward says:

Re: Re: Re:7

Only if you take it as an article of faith that censorship shouldn’t happen.

Alternatively — and this is my position — the fact that something is censorious does not disqualify it from being something that should be done. Tossing people from social media falls into this category, as does defamation law, as can prohibitions on hate speech.

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re: Re: Re:8

So, what’s wrong with using censorship/moderation to distinguish between the bad and the acceptable removal of speech? Using the same word for both makes it easier for trolls to force their way onto platforms where they are not welcome.

This comment has been flagged by the community. Click here to show it.

Anonymous Coward says:

Re: Re: Re:9

Well, sort of. But as soon as you say something along the lines of ‘moderation isn’t censorship’, in the context specifically of the removal of material or persons from something, you wind up with a categorization error.

Besides which, if it’s moderation-if-it’s-good and censorship otherwise, we suddenly have things like anti-defamation or anti-fraud laws being moderation, which seems a bit weird to me.

This comment has been flagged by the community. Click here to show it.

Anonymous Coward says:

There is no shortage of people who hate freedom and want to silence people who disagree with them. Those people have no hesitation in labeling their enemies as Nazis, Communists, x-phobes, hate speakers, un-American, and anything else they can throw. For a platform owner, it takes courage to withstand the onslaught of the would-be censors and allow people to speak regardless of the viewpoint of their opinions.

For people who hate freedom, the United States as a whole is a “Nazi bar”. After all, how does a nation that has chosen to uphold freedom of speech differ from a platform that has chosen to uphold freedom of speech?

Of course, if the promise and premise of Bluesky does pan out, that would be great. Moderation in the hands of users is never a violation of the principles of free speech. People have the right to not hear things they don’t want to hear (including the ranting of stinking, crazed, drug-addled, dangerous bums while a captive audience on the subway), but other people should not be forcibly deciding those things for them.

This comment has been deemed insightful by the community.
Stephen T. Stone (profile) says:

Re:

For a platform owner, it takes courage to withstand the onslaught of the would-be censors and allow people to speak regardless of the viewpoint of their opinions.

And it takes an equal amount of either ignorance or hubris to believe that anyone other than bigots and assholes would want to willingly associate with bigots and assholes. Your problem, Hyman, is that if a platform willingly tells those “own the libs”–types to go pound sand, you think that’s a violation of the civil rights of the bigots and assholes rather than an exercise in the rights of the platform’s owner(s) to choose who they want associated with said platform.

Using Twitter, Bluesky, et cetera is a privilege. Losing that privilege isn’t censorship. Just because you want to shit all over and ruin spaces where other people (especially marginalized people) gather and associate with one another doesn’t change those facts. Now fuck off back to your Jesse Singal anti-trans circlejerk.

This comment has been flagged by the community. Click here to show it.

Anonymous Coward says:

Re: Re:

Not at all. As usual, you argue with illusory versions of me who say what you want them to say.

Private censorship is not a violation of anyone’s civil rights because private censors have the 1st Amendment right to censor as they wish, including things like payment processors and banks refusing service to people. (Presumably, public accommodation laws fall afoul of the freedom of association granted by the 1st Amendment, since no one has stepped in to stop this.)

But censorship remains the act of the censor, silencing opinions based on viewpoint on platforms the censor controls. Private censorship does not violate rights, but it violates principles. In a society that has freedom of speech as a foundational value, opinions should not be silenced based on viewpoint even when the censors have the right to do so.

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re:

There is no shortage of people who hate freedom and want to silence people who disagree with them.

That’s pretty much, I dunno, the other 191 countries in the world?

Those people have no hesitation in labeling their enemies as Nazis, Communists, x-phobes, hate speakers, un-American, and anything else they can throw.

At least 74 million inside America, and at least 3 billion outside of it. Including their leaders.

People have the right to not hear things they don’t want to hear (including the ranting of stinking, crazed, drug-addled, dangerous bums while a captive audience on the subway), but other people should not be forcibly deciding those things for them.

I guess you also hate Social Security, the Veterans’ Association and being kind to the homeless then?

The homeless is an international issue, btw. Just because the bums are treated like shit in America doesn’t mean there aren’t homeless or bums elsewhere.

I mean, Japan practically ignores the existence of THEIRS.

Stephen T. Stone (profile) says:

Re: Re:

The homeless is an international issue, btw. Just because the bums are treated like shit in America doesn’t mean there aren’t homeless or bums elsewhere.

And shit, in America, we are apparently allowing the homeless to be executed in public by “good samaritans” even if a homeless person who seems threatening hasn’t actually attacked someone. So, yeah, I can’t wait to see Hyman justify a public lynching on a New York City subway train in 2023.

This comment has been flagged by the community. Click here to show it.

Anonymous Coward says:

Re: Re: Re:2

Indeed. Just consider it as an obnoxious person being booted off a platform for violating norms of behavior. In this case, the platform is earthly existence.

This is the problem with woke ideologues. Decades ago they decided to close down mental institutions where people like Jordan Neely could have been held to protect him and others from him. But they did nothing to make sure that alternative systems would be in place for those people who were thrown out, and now these people are left on the streets.

Stephen T. Stone (profile) says:

Re: Re: Re:3

And what did the “non-woke” ideologues do to help those people, hmm?

Was it “nothing”⁠—was that what they did?

Also, seems like the “non-woke” types are the ones who are at best justifying and at worst celebrating the murder of Jordan Neely. I’m guessing you’re somewhere in between those two extremes, given how much you’ve ranted about how every homeless person ever is a drug addict and a murderer-in-waiting.

Bobson Dugnutt (profile) says:

Re: Re: Re: Be careful what you wish for

I can’t wait to see Hyman justify a public lynching on a New York City subway train in 2023.

I know it’s sarcasm but Skarka’s Law says that on the internet there is absolutely nothing too vile or indefensible to stop a justification for it.

Related: Ben “Yahtzee” Croshaw’s Greater Internet Fuckwad Theory.

Bobson Dugnutt (profile) says:

Re: Re: Re:3 Arminianism and Calvinism

First coined by the fine gentlemen at Penny Arcade.

Yes, John Gabriel coined the Greater Internet Fuckwad Theory in 2004. Even Croshaw credits Gabriel.

Croshaw’s variant came five years later.

Gabriel’s GIFT: Normal person + anonymity + audience = Total fuckwad
Croshaw’s GIFT: Normal person = Total fuckwad

It’s the internet age’s version of the theological controversy of Arminianism and Calvinism.

Gabriel’s is the Arminian version. We can choose to be fuckwads, and fuckwaddery is part of the human condition, but through grace we can also choose to be good people.

Croshaw’s is the Calvinist version, distilled down to total depravity and predestination. We are all fuckwads by nature, we are going to be fuckwads, and the internet creates the conditions to bring out the fuckwaddery within us all.

This comment has been flagged by the community. Click here to show it.

Anonymous Coward says:

Re: Re:

I do not hate any government programs that provide social services. On the contrary, they need to be bigger and be run better and properly funded. The state of public healthcare, education, housing, legal aid, prisons, and on and on is shamefully bad in the US, mostly because these things have no oversight with teeth.

Anonymous Coward says:

Re:

There is no shortage of people who hate freedom and want to silence people who disagree with them. Those people have no hesitation in labeling their enemies as Nazis, Communists, x-phobes, hate speakers, un-American, and anything else they can throw.

So why do you call everyone who doesn’t adhere to your particular anti-human beliefs woke ideologues?

For people who hate freedom, the United States as a whole is a “Nazi bar”. After all, how does a nation that has chosen to uphold freedom of speech differ from a platform that has chosen to uphold freedom of speech?

The only one who hates freedom here is you, since you want to force yourself upon others, against their wishes, on other people’s private property.

People have the right to not hear things they don’t want to hear (including the ranting of stinking, crazed, drug-addled, dangerous bums while a captive audience on the subway)

A bum on a subway has more right to say whatever he wants to a momentarily captive audience than you have to force yourself and your views upon others on private property.

Btw, may a thousand bums haunt you every day.

This comment has been flagged by the community. Click here to show it.

Anonymous Coward says:

Re: Re:

I live on the Upper West Side of Manhattan. Being haunted by a thousand bums every day is not difficult here. We have one bum who has been inhabiting the same spot on the street for years, outlasting a Victoria’s Secret there that opened and closed.

Captive audience restrictions on public transportation do exist:

https://www.mtsu.edu/first-amendment/article/895/captive-audience

I call people woke ideologues when that’s what they are. I do that to criticize their point of view, not to silence them. They should be able to speak as freely as anyone else, because I do not have the right to silence lies, just to call them out.

Anyone who wishes to prevent me from speaking on their property is free to do so. This site is Masnick’s property, and he chooses not to block my posts.

This comment has been deemed insightful by the community.
Stephen T. Stone (profile) says:

Re: Re: Re:2

your law doesn’t do what it says

Yes, it does: It protects individual citizens from government interference in speech and expression. Don’t get pissy because it protects the rights of bigots and assholes⁠—after all, the worst speech needs the most protection.

Anonymous Coward says:

Re: Re: Re:3

Dude, it says “make no law”.

Either it should stop all laws, as the text directly states, or the existence of the various ‘exceptions’ raises the question of ‘why those and only those?’ If there were laws against certain sorts of speech and I wrote ‘you can’t make laws against speech’ in my country’s rules, I would think the obvious conclusion would be that those laws were now prohibited, not that “actually those laws are fine, it’s any new sorts of restrictions that aren’t allowed.” But because the second allows you to have an operable society and the first would not, that’s the one that was gone with.

Otherwise, “kill that guy” couldn’t be illegal.

Anonymous Coward says:

Re: Re: Re:3

Oh, and side note: “The worst speech deserves the most protection” is a slogan, nothing more, and wrong to boot. The worst speech does not, in fact, deserve the most protection.

It in fact deserves prohibition, because allowing ‘the worst’ speech winds up impairing the speech rights — and indeed many other things — of everybody else.

I will not, in fact, defend to the death someone’s right to say something. Certain people damn well should feel the coercive power of the state to shut them up.

This comment has been deemed insightful by the community.
Stephen T. Stone (profile) says:

Re: Re: Re:4

“The worst speech deserves the most protection” is a slogan, nothing more, and wrong to boot. The worst speech does not, in fact, deserve the most protection.

Yes, it does. Because if it didn’t, I can think of a single word that could (and probably would) be banned in practically any and every context. And while a ban on that word might bring people offended by it some sense of emotional relief, it would also infringe on the rights of everyone who uses that word not as a slur, but as a term of identification or even endearment.

I’m talking, of course, about queer.

(…what, did you think I meant some other, well-known, far more obvious “taboo” word?)

See, “queer” as an identifier⁠—for either individuals or groups of people that don’t identify as either cisgender, heterosexual, or cishet⁠—has also been used as a slur in the past. Even as some people (like myself) have reclaimed “queer” as a self-identifying adjective, others who would fit under that umbrella avoid it because of its past usage as a slur. That word doesn’t offend me, but it may offend someone else. To believe “queer” should be refused any legal protections because someone⁠—anyone⁠—might use the word as a slur is to believe “queer” is, in fact, part of that “worst speech” you think should be afforded no protection under the law.

Hell, you can extend that logic to other anti-queer slurs that LGBTQ people have at least tried to reclaim. If a gay man wants to call himself a f⸻t even if that word would likely offend other gay people, for what reason should the feelings of the offended group of gay people supersede the rights of the individual gay man to call himself that?

I’m neither eager nor willing to defend the speech of racists and bigots. But I can defend their rights to say their bullshit without defending their bullshit. If queer people and Black people and all other groups of people deserve the right to speak freely, so do the bigots and assholes who oppose those groups. To say otherwise is to deny those bigots and assholes the same rights promised to all other people on the basis of a goddamn thought crime. And I will stand against that fascist bullshit any day of the week and twice on Sundays.

This comment has been flagged by the community. Click here to show it.

Anonymous Coward says:

Re: Re: Re:5

In places that decide to do something about the worst of speech, ‘queer’ is not, in fact, a taboo word. The two concepts are not connected, and you’re inventing a catastrophe in order to pat yourself on the back for, in your imagination, standing against it.

Stephen T. Stone (profile) says:

Re: Re: Re:6

In places that decide to do something about the worst of speech, ‘queer’ is not, in fact, a taboo word.

My point was and still is that, to some people, “queer” is a taboo word and an anti-LGBTQ slur⁠—“the worst speech”, if you will⁠—and those people would likely accept the notion that the word should be banned. But as a queer person, banning that word would be an affront to me and everyone else who identifies as queer. You seem to want the sensibilities of those offended by the word to overrule the rights of those who aren’t offended.

I won’t defend or justify the usage of racial slurs and bigoted speech. I won’t say people who use that language should be welcomed in any space other than their own shitpits. But I will defend the rights of those assholes to say whatever the fuck they want so long as they’re not breaking the law. To say otherwise is to imply that I believe the government should use the law to silence “the worst speech”. But supporting that stance would set me up for a Leopards Ate My Face moment when the government decides that my speech is “the worst speech”. A defense of the right to spew bigoted, hateful, and otherwise repugnant speech⁠—like, say, a satirical ad about a famous televangelist fucking his mother in an outhouse⁠—is a defense of the right to say any speech at all, for it is the rights of those who express “the worst speech” that require the most defending.

This comment has been flagged by the community. Click here to show it.

Anonymous Coward says:

Re: Re: Re:7

“A defense of the right to spew bigoted, hateful, and otherwise repugnant speech⁠—like, say, a satirical ad about a famous televangelist fucking his mother in an outhouse⁠—is a defense of the right to say any speech at all, for it is the rights of those who express “the worst speech” that require the most defending.”

This is an article of faith, but it isn’t borne out in reality, where it is entirely possible to defend the vast majority of speech without actually needing to find yourself in the position of having to defend all speech. Hell, you yourself have conceded elsewhere to accepting certain exceptions to the U.S. 1st Amendment, because, well, you’re used to those exceptions and think ‘of course those shouldn’t be allowed’.

So until and unless I see you banging a ‘decriminalize fraud!’ drum here kindly spare me the silly-ass ‘you must defend everything or you defend nothing’ false dilemmas.

Stephen T. Stone (profile) says:

Re: Re: Re:8

it is entirely possible to defend the vast majority of speech without actually needing to find yourself in the position of having to defend all speech

And I’d be a hypocrite for doing so, for one day someone might think my speech is “the worst speech”. Again: I can defend the rights of assholes without defending what they use their rights to say.

you yourself have conceded elsewhere to accepting certain exceptions to the U.S. 1st Amendment, because, well, you’re used to those exceptions and think ‘of course those shouldn’t be allowed’

Those exceptions are extremely limited and have to do with committing actual crimes rather than mere thoughtcrimes. Our resident transphobe Hyman Rosen spews his hate with reckless abandon, and I won’t defend anything he says in that regard. But unless he starts calling for actual violence against trans people or posts links to images of the genitals of trans children (and I’d bet money that he has a few on his computer/phone somewhere), I’m loathe to have him silenced by using the law to violate his civil rights.

Speech I find offensive, so long as it doesn’t violate the law, has every right to be spoken regardless of how I feel about it. That doesn’t mean I have to host, listen to, defend, or refuse to criticize that speech. I believe in the rights of the worst people to say the worst things and “get away with it” precisely because a future government might consider my criticism of it to be “the worst speech”. You’re asking me to set up a bear trap and step in it, but you’re not that clever and I’m not that stupid.

This comment has been flagged by the community.

Anonymous Coward says:

Re: Re: Re:9

‘And I’d be a hypocrite for doing so, for one day someone might think my speech is “the worst speech”’

Yeah, they might. But it’s entirely possible to argue against that without needing to defend what you think is the worst speech, and it isn’t hypocritical if you don’t think freedom of speech requires you to defend literally everything and instead focus on what that speech is and does.

You concede ‘actual criminality’ as a ‘narrow exception’, but there’s nothing inherent to your slippery-slope construction that says that can’t be the start of it either. If you’re not defending criminal speech, someday your speech might be declared criminal, n’est-ce pas?

Stephen T. Stone (profile) says:

Re: Re: Re:10

it’s entirely possible to argue against that without needing to defend what you think is the worst speech

Let me once again make this as clear as I can: I’m not defending the speech itself. I don’t and won’t defend racial slurs, queerphobia, and any other hateful bullshit. What I’m defending is the right of people who say that shit to say that shit because defending their rights also defends my rights. The worst speech requires the most legal protections no matter how you, I, or anyone else feels about that speech.

Also, pulling that “you must defend criminal speech” trick won’t work on me. You made that particular bear trap so easy to spot that it doesn’t even belong in the Resident Evil 4 remake.

This comment has been flagged by the community.

Anonymous Coward says:

Re: Re: Re:10

“It’s all very well to say the answer to bad speech is more speech, until the people with the power and megaphones are all Nazis and white supremacists, and the elections are all rigged, and suddenly the only speech that is considered ‘good’ is utter garbage.”

Not sure why you’re up in arms in this thread, given that you posted the above in a different one, and it would seem to agree with the idea that defending all speech is, in fact, a bad plan.

Anonymous Coward says:

Re: Re: Re:7

I won’t defend or justify the usage of racial slurs and bigoted speech. I won’t say people who use that language should be welcomed in any space other than their own shitpits. But I will defend the rights of those assholes to say whatever the fuck they want so long as they’re not breaking the law.

This works up until the point where the speech that isn’t breaking the law leads people to commit acts that do break the law.

Or they use speech that doesn’t break the law to get support and votes to create laws that make what was legal into the illegal, and vice versa.

We’re seeing how stochastic terrorism has worked in multiple states to create a ban on the existence of trans folks. The decades-long, evil-as-fuck project of the GOP to overturn Roe finally succeeded, as well. Look at Florida & Texas.

The assholes are using “I’m not touching you!” tactics to actualize their vile shit and turn it into something that is allowed by the laws in this country. You do see that, right? Letting assholes fester in their shitpits leads to those shitpits getting bigger and engulfing and damaging more than just the people there. We saw what happened when 4chan let /pol/ fester and pretend that it was a “containment” board.

Stephen T. Stone (profile) says:

Re: Re: Re:8

This works up until the point where the speech that isn’t breaking the law leads people to commit acts that do break the law.

At which point you can argue that someone incited an illegal act⁠—but if you want to ban otherwise legal language because it might incite violence or other illegal acts, you’re gonna have a bad time.

Or they use speech that doesn’t break the law to get support and votes to create laws that make what was legal into the illegal, and vice versa.

That’s why I’m on record as supporting the right of every person to say anything that, under current U.S. law, is legally permissible speech: If such speech is made illegal in the future, I will stand against that change.

We’re seeing how stochastic terrorism has worked in multiple states to create a ban on the existence of trans folks. The decades-long, evil-as-fuck project of the GOP to overturn Roe finally succeeded, as well. Look at Florida & Texas.

People from that side of the political aisle will argue that “stochastic terrorism” against “their side” has resulted in laws and policies that target “their side” in equal measure. Throw the Overton Window out for a picosecond and consider this: Would you truly want to give the power of banning “offensive” speech to people with whom you agree if you knew with the certainty of God that those with whom you disagree would use that same power for their own self-serving ends?

The assholes are using “I’m not touching you!” tactics to actualize their vile shit and turn it into something that is allowed by the laws in this country. You do see that, right?

I’m well aware of what they’re doing. I still refuse to compromise my principles on this. Trying to extort me into doing so by bringing my pro-queer political beliefs into the mix won’t work.

Letting assholes fester in their shitpits leads to those shitpits getting bigger and engulfing and damaging more than just the people there. We saw what happened when 4chan let /pol/ fester and pretend that it was a “containment” board.

I’m not happy about the existence of those shitpits or their continued (and frightening) growth. But my discomfort doesn’t entitle me⁠—or anyone else⁠—to shut down those shitpits by saying “this offends me, take it down”. You’ll need a better argument than “but 4chan” if you want me to turn my principles into positions of convenience that change depending on the popularity of a given argument.

The worst people should have the right to express the worst (legally protected) speech and the worst ideas because their rights are my rights, too. And I can defend their rights without having to defend their speech. That includes defending your right to tell me I’m a loathsome son of a bitch for taking that stance.

Anonymous Coward says:

Re: Re: Re:9

At which point you can argue that someone incited an illegal act⁠—but if you want to ban otherwise legal language because it might incite violence or other illegal acts, you’re gonna have a bad time.

This is how they win. All their speech that leads up to the incitement and violence is legal, and then the illegal incitement and violence happens. But we aren’t allowed to do anything about the speech that led up to it.

People from that side of the political aisle will argue that “stochastic terrorism” against “their side” has resulted in laws and policies that target “their side” in equal measure. Throw the Overton Window out for a picosecond and consider this: Would you truly want to give the power of banning “offensive” speech to people with whom you agree if you knew with the certainty of God that those with whom you disagree would use that same power for their own self-serving ends?

The GOP is waging open fascism now, and has been for the last few years. They are already doing their damnedest to ban LGBTQIA+ people, media, depictions, and discussions from public life, both online and offline. They didn’t need a slippery-slope fallacy of the stripe that you continually discuss; they’re doing this shit right now. They didn’t need progressives to start passing bills; they’re passing the bills right now.

Are you going to try to point to some ill-conceived law from the Democrats as to why the GOP in Texas, Florida, Montana, Missouri, and more are shoving through these bills banning gender-affirming care as well as public participation in regular daily life?

You don’t get to go “Oh, but their side thinks that your side is just as evil!”. That’s false equivalency devil’s advocate trash. Abstracting their fascism into “those with whom you disagree” and “that side of the political aisle” is how you convince nobody and accomplish nothing.

But my discomfort doesn’t entitle me⁠—or anyone else⁠—to shut down those shitpits by saying “this offends me, take it down”.

It’s not “this offends me, take it down”. It’s “these are people who want me dead and are fomenting violence against me & my friends, and have been doing so for years at this point, take it down”.

Shit like open, virulent bigotry and advocacy for genocide isn’t something you can paint over with the label “things that you disagree with” to make yourself sound correct.

Anathema Device (profile) says:

Re: Re: Re:4

And so we run up against the Paradox of Tolerance.

It’s all very well to say the answer to bad speech is more speech, until the people with the power and megaphones are all Nazis and white supremacists, and the elections are all rigged, and suddenly the only speech that is considered ‘good’ is utter garbage.

“Certain people damn well should feel the coercive power of the state to shut them up.”

Agreed. But that only works as an act of righteousness if the state is righteous, cf. Russia, Florida, Turkey, Indonesia, the UK, etc.

Of course, when the state is not righteous, there is no longer any such thing as free speech of any kind.

Rina Slusnyte (user link) says:

Content moderation should take place on all platforms to monitor user-generated content.

Content that violates the platform’s rules or community standards should be removed.

This includes hate speech, harassment, fake news, and graphic violence, among other things.

Every platform must be a safe and respectful place for everyone to use.

I understand, and agree, that content moderation can be a difficult task, but it has to involve balancing the right to free speech with the need to prevent harm and protect users.

Anathema Device (profile) says:

Re:

“Every platform must be a safe and respectful place for everyone to use.”

No, because people can choose not to use a platform with offensive users. Voting with your feet works pretty well, because it disincentivises advertising and reduces the all-powerful ‘reach’.

Also, it helps when the arseholes are all on a few platforms that tolerate them, because the rest of us can just block the crap out of them.

Only platforms which are part of a public service or publicly funded should have an obligation to be safe and respectful.

Anonymous Coward says:

Re: Re:

In a decentralized moderation scheme, how else do you support blocking, unless you just want it to be client-side? If you want the upstream to block for you, you have to tell it what to block. That means that you are trusting the upstream with your block list, and that means a malicious service can set itself up to harvest the lists and make them public.
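To make the client-side option concrete, here is a minimal sketch (hypothetical names and types, not any real protocol’s API) of blocking done entirely on the user’s own device, so the block list never has to be shared with an upstream service:

```typescript
// Minimal sketch of client-side blocking (hypothetical types, not a real API).
// The block list stays on the device; the upstream only sees a plain feed request.

interface Post {
  authorId: string; // identifier of the post's author
  text: string;
}

// Kept only in local storage on the user's own client, never sent upstream.
const localBlockList = new Set<string>([
  "did:example:blocked-account",
]);

// Filter a fetched feed locally so posts from blocked authors never render.
function filterFeed(posts: Post[], blocked: Set<string>): Post[] {
  return posts.filter((post) => !blocked.has(post.authorId));
}

// Example: blocking happens here, after delivery, not at the service.
const feed: Post[] = [
  { authorId: "did:example:blocked-account", text: "unwanted post" },
  { authorId: "did:example:friendly-account", text: "fine post" },
];
console.log(filterFeed(feed, localBlockList)); // only the friendly post remains
```

The tradeoff is the one the comment describes: filtering locally keeps the list private, but it only hides content after it has been delivered, while asking the upstream to block for you means handing your list to a service you have to trust not to harvest it.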

Anonymous Coward says:

A better point than the Nazi Bar might be “Do you remove sexual expression?” If so, why? If you do remove that, why is it that this particular hateful speech is so deserving of protection?

It’s annoying having all these wannabe free expression warriors who only ever seem to go above and beyond when it’s for the KKK or a conspiratorial nutjob.
