Substack Realizes Maybe It Doesn’t Want To Help Literal Nazis Make Money After All (But Only Literal Nazis)

from the you-don't-have-to-hand-it-to-the-nazis dept

Last year, soon after Elon completed his purchase of (then) Twitter, I wrote up a 20-level “speed run” of the content moderation learning curve. It seems like maybe some of the folks at Substack should be reading it these days?

As you’ll recall, last April, Substack CEO Chris Best basically made it clear that his site would not moderate Nazis. As I noted at the time, any site (in the US) is free to make that decision, but those making it shouldn’t pretend that it’s based on any principles, because the end result is likely to be that you have a site full of Nazis and… that tends not to be good for business because other people you might want to do business with might not want to be on the site welcoming Nazis.

Thus, it should not have been shocking when, by the end of the year, Substack had a site with a bunch of literal Nazis. And, no, we’re not just talking about people with strong political viewpoints that lead people who oppose them to call them Nazis. We’re talking about people who are literally embracing Nazism and Nazi symbols.

And Substack was helping them make money.

Even worse, Substack co-founder Hamish McKenzie put out a ridiculous self-serving statement pretending that their decision to help monetize Nazis was about civil liberties, even as the site regularly deplatformed anything about sex. At that point, you’re admitting that you moderate, and then it’s just a question of which values you moderate for. McKenzie was claiming, directly, that they were cool with Nazis, but sex was bad.

The point of the content moderation learning curve is not to say that there’s a right way or a wrong way to handle moderation. It’s just noting that if you run a platform that allows users to speak, you have to make certain calls on what speech you’re going to allow and what you’re not going to allow — and you should understand that some of those choices have consequences.

In the case of Substack, some of those consequences were that some large Substack sites decided to jump ship. Rusty Foster’s always excellent “Today in Tabs” switched over to Beehiiv. And then, last week, Platformer News, Casey Newton’s widely respected newsletter with over 170,000 subscribers, announced that if Substack refused to remove the Nazi sites, it would leave.

Content moderation often involves difficult trade-offs, but this is not one of those cases. Rolling out a welcome mat for Nazis is, to put it mildly, inconsistent with our values here at Platformer. We have shared this in private discussions with Substack and are scheduled to meet with the company later this week to advocate for change.

Meanwhile, we’re now building a database of extremist Substacks. Katz kindly agreed to share with us a full list of the extremist publications he reviewed prior to publishing his article, most of which were not named in the piece. We’re currently reviewing them to get a sense of how many accounts are active, monetized, display Nazi imagery, or use genocidal rhetoric. 

We plan to share our findings both with Substack and, if necessary, its payments processor, Stripe. Stripe’s terms prohibit its service from being used by “any business or organization that a. engages in, encourages, promotes or celebrates unlawful violence or physical harm to persons or property, or b. engages in, encourages, promotes or celebrates unlawful violence toward any group based on race, religion, disability, gender, sexual orientation, national origin, or any other immutable characteristic.”

It is our hope that Substack will reverse course and remove all pro-Nazi material under its existing anti-hate policies. If it chooses not to, we will plan to leave the platform.

As a result of those meetings, Substack has now admitted that some of the outright Nazis actually do violate “existing” rules, and will be removed.

Substack is removing some publications that express support for Nazis, the company said today. The company said this did not represent a reversal of its previous stance, but rather the result of reconsidering how it interprets its existing policies.

As part of the move, the company is also terminating the accounts of several publications that endorse Nazi ideology and that Platformer flagged to the company for review last week.

The company will not change the text of its content policy, it says, and its new policy interpretation will not include proactively removing content related to neo-Nazis and far-right extremism. But Substack will continue to remove any material that includes “credible threats of physical harm,” it said.

As law professor James Grimmelmann writes in response: “As content moderation strategies go, ‘We didn’t realize until now that the Nazis on our platform were inciting violence’ perhaps raises more questions than it answers.”

Molly White, who remains one of the best critics of tech-boosterism, also noted that Substack’s decisions seemed likely to piss off the most people possible: first coddling the Nazis (pissing off most people who hate Nazis), and then pissing off the people who cheered on the “we don’t moderate Nazis” stance.

In the end, Substack is apparently removing five Nazi newsletters. As White notes, this will piss off the most people possible. The people who want Substack to do more won’t be satisfied and will be annoyed it took pointing out the literal support for genocide for Substack to realize that maybe they don’t want literal Nazis. And the people who supported Substack will be annoyed that Substack was “pressured” into removing these accounts.

Again, there are important points in all of this, and it’s why I started this post off by pointing to the speed run post at the beginning. You can create a site and say you’ll host whatever kinds of content you want. You can create a site and say that you won’t do any moderation at all. Those are valid decisions to make.

But they’re not decisions that are in support of “free speech.” Because a site that caters to Nazis is not a site that caters to free speech. Because (as we’ve seen time and time again), such sites drive away people who don’t like being on a site associated with Nazis. And, so you’re left in a situation where you’re really just supporting Nazis and not much else.

Furthermore, for all of McKenzie’s pretend high-minded talk about “civil liberties” and “freedom,” it’s now come out that he had no problem at all trying to put his fingers on the scale to put together a list of (mostly) nonsense peddlers to sign a letter in support of his own views. McKenzie literally organized the “we support Substack supporting Nazis” letter signing campaign. Which, again, he’s totally allowed to do, but it calls into question his claimed neutrality in all of this. He’s not setting up a “neutral” site to host speech. He’s created a site that hosts some speech and doesn’t host other speech. It promotes some speech, and doesn’t promote other speech.

Those are all choices, and they have nothing to do with supporting free speech.

Running a private website is all about tradeoffs. You have to make lots of choices, and those choices are difficult and are guaranteed to piss off many, many people (no matter what you do). For what it’s worth, this is still why I think a protocol-based solution should beat a centralized solution every time, because with protocols you can set up a variety of approaches and let people figure out what works best, rather than relying on one centralized system.

Substack is apparently realizing that there were some tradeoffs to openly supporting Nazism, and will finally take some action on that. It won’t satisfy most people, and now it’s likely to piss off the people who were excited about Nazis on Substack. But, hey, it’s one more level up on the content moderation speed run.

Companies: platformer, substack


Comments on “Substack Realizes Maybe It Doesn’t Want To Help Literal Nazis Make Money After All (But Only Literal Nazis)”

98 Comments

This comment has been flagged by the community.

Anonymous Coward says:

Free Speech

Once again, because you love private left-wing censorship of viewpoints you hate, you claim that allowing people to speak freely is not support of free speech and censoring people is.

Supporting free speech means allowing people to state their viewpoints without censorship. If you support free speech, you must do that even if you lose the business of people who are offended by some of the risks. You have an insane notion that silencing the people you hate so that more people you like will speak means you support free speech. You don’t. You support censorship. You hide behind the 1st Amendment, thinking that opposing government censorship is enough. It is not. Today the public square is privately owned, and outsourcing censorship to the owners is legal and constitutional. But that does not support free speech, it destroys it.

This comment has been deemed insightful by the community.
Pixelation says:

Re:

So, what is it that you so badly want to say that you haven’t said? Reminder, the people that are being “censored” can make a website of their own and if people are interested, they can go read it. My suspicion is that they won’t get much of a following. What I think you are looking for, and people have pointed out, is a megaphone for vitriol. We’re not interested, really.

This comment has been deemed insightful by the community.
Section_230 says:

Re:

Supporting free speech means allowing people to state their viewpoints without censorship.

Your First Amendment right to Freedom of Religion and Freedom of Expression without Government interference, does not override anyone else’s First Amendment Right to not Associate with you and your Speech on their private property.

This comment has been deemed insightful by the community.
bonk says:

Re:

Once again, because you love private left-wing censorship of viewpoints you hate, you claim that allowing people to speak freely is not support of free speech and censoring people is.

You have to be particularly stupid if you think “private left-wing censorship” is something that exists; the only reason you would say such gobbledygook is that you are so socially inept and reprehensible that no service wants to be associated with you.

Supporting free speech means allowing people to state their viewpoints without censorship.

Sure, you are free to express whatever you want – on your own dime. And others are free to express that they don’t want to associate with you. That you don’t understand this basic social contract – my stuff, my rules – is kind of telling about what kind of person you are.

You have an insane notion that silencing the people you hate so that more people you like will speak means you support free speech.

Telling loudmouth assholes and other dregs to take a hike because they harass and drown out normal people with their dreck is “silencing” according to you, which only means you belong to that group. What was the word used? Ah, decorum! It’s very simple to see who lacks decorum: it’s the people who can’t follow the rules or the proprietor’s wishes, the ones who butt into places where they aren’t welcome and silence people with their presence and overbearing speech.

It’s kind of revealing that you have no respect for other people’s wishes when it comes to their property; you are demanding that they shut the fuck up and allow you free access.

You are just another fascist demanding obedience to your demented version of what is right, not caring at all that you are trampling on other people’s rights by forcing them to adhere to your standards while butting into their conversations when they say something you don’t like. The simple truth is that if you tried the same thing in meatspace, like in a shopping center, the best-case scenario is that you would be thrown out on the street on short notice.

The simple truth is, if there are consequences for doing something in meatspace, don’t be surprised when it also happens online. Socially inept people don’t grok that.

This comment has been deemed insightful by the community.
bhull242 (profile) says:

Re:

First, given that—from what I’ve heard—at least one former member of the Nazi party was found to be part of some Canadian government body, I’d say they’re not all dead.

Second, as far as I’m concerned, someone who calls themself a Nazi, uses Nazi imagery, and espouses messaging essentially the same as what the Nazi Party did (including white supremacist, antisemitic views and supporting fascism and/or genocide) is a literal Nazi. There’s nothing hyperbolic about it. Since that’s what we’re talking about here, I fail to see any exaggeration going on here.

Anonymous Coward says:

Re: Re:

If you want to get down to brass tacks, there was more than one Nazi party. The American Nazi Party rebranded into Christian nationalism when they decided Nazism was too toxic a brand. They fractured into many neo-Nazi groups. It’s where we get neo-Nazis like this today. They are literally the ideological successors to the American Nazis. Several of those American Nazi Party offshoots still exist.

Anonymous Coward says:

The people who want Substack to do more won’t be satisfied and will be annoyed it took pointing out the literal support for genocide for Substack to realize that maybe they don’t want literal Nazis. And the people who supported Substack will be annoyed that Substack was “pressured” into removing these accounts.

As pointed out later, this is a moderator’s Kobayashi Maru scenario. There is no point on the spectrum from “All nazis allowed” to “nothing even vaguely nazi-like allowed” where you won’t piss off almost everyone, for one reason or another.

This comment has been deemed insightful by the community.
Stephen T. Stone (profile) says:

Re:

The question at the heart of content moderation is much like the question at the heart of comedy: “Who do you want to be part of the audience?” Every decision to moderate answers that question to a finer degree.

Consider, for example, the idea of rape jokes: Someone who tells such a joke can aim that joke at rapists or rape victims. The question, then, is about which group you want as part of your audience. If someone wants rapists laughing at their jokes…well, that is certainly a decision that person can make, even if it severely limits who will be laughing at that material. And the same goes for content moderation: If someone wants to platform Nazis and host pro-Holocaust speech, that is the right of that someone⁠—but their decision is going to severely limit the audience that will want to be on that same platform.

This comment has been flagged by the community.

Anonymous Coward says:

Re: Re:

Ah yes, because only Right Wingers can hate, and Left Wingers are the happy party of love and tolerance? Ignoring their history of slave owning, continued abuse of African Americans (“if you don’t vote for me you ain’t black”), and their insistence that the Jewish homeland should be given back to its invaders.

This comment has been deemed insightful by the community.
Stephen T. Stone (profile) says:

Re: Re: Re:3

Hell, even if you take January 6th out of the equation, Trump still tried to have the results of a free and fair election tossed out only because he lost. He tried to carry out a coup and failed so miserably because so many people told him “no” that even a violent insurrection couldn’t get the job done for him.

That One Guy (profile) says:

'It's not fair, why is everyone being mean to my buddies!?'

Furthermore, for all of McKenzie’s pretend high-minded talk about “civil liberties” and “freedom,” it’s now come out that he had no problem at all trying to put his fingers on the scale to put together a list of (mostly) nonsense peddlers to sign a letter in support of his own views. McKenzie literally organized the “we support Substack supporting Nazis” letter signing campaign.

As damning as that is, it does explain a) why it took so long for them to do even this much and b) why they’re doing so little. They (or at least he specifically) want the Nazi users on the platform; they’re just hoping that getting rid of a handful of the more obvious examples will be enough for this to die down, and will allow him to frame anyone still pissed off that there are self-identifying Nazis using the platform to get paid as ‘unreasonable’ for trying to silence anyone who ‘might’ be a Nazi just because they use Nazi imagery, ideals and arguments.

Arianity says:

For what it’s worth, this is still why I think a protocol-based solution should beat a centralized solution every time, because with protocols you can set up a variety of approaches and let people figure out what works best, rather than relying on one centralized system.

“set up a variety of approaches” is a very polite way of saying the Nazis will still have a platform, you just won’t have to look at it. Which doesn’t really solve the platforming-Nazis problem, if that’s something you have a problem with.

Stephen T. Stone (profile) says:

Re:

You’ll never be able to fully solve the “platforming Nazis” problem without making Nazi speech and iconography illegal⁠—and even then, that won’t stop them from toning down their rhetoric and using images that aren’t Nazi-related. The best anyone can hope to do is drive those dipshits off decent platforms and into rancid little shitholes where no one but the Nazis go to play.

This comment has been flagged by the community.

Anonymous Coward says:

Re: Re:

Ironically, what you propose, Steph, would directly benefit them in the long term, as it only leads to more radicalization and destruction. I’m not surprised you’re unaware of how extremism works though; you strike me as a very emotional and unwell individual that is already shoulders deep in your own extremist ideology.

Stephen T. Stone (profile) says:

Re: Re: Re:

Ironically, what you propose … would directly benefit them in the long term, as it only leads to more radicalization and destruction.

That’s a risk of deplatforming Nazis and other such shitbirds from decent platforms, yes. But letting them run rampant on those platforms and chase off everyone else isn’t exactly the best way to prevent that consequence.

I’m not surprised you’re unaware of how extremism works

I am, but go on.

you strike me as a very emotional and unwell individual that is already shoulders deep in your own extremist ideology

Dude, I’ve literally advocated for the right of Nazi shitbirds to speak their mind without government interference. If I’m an “extremist”, so is the ACLU.

As for the other shit: Well, you’re not entirely wrong. But I’m not out here advocating for genocide, expressing bigotry towards marginalized people, or demanding that my religious beliefs become the law that governs everyone.

This comment has been flagged by the community.

Arianity says:

Re: Re:

You’ll never be able to fully solve the “platforming Nazis” problem without making Nazi speech and iconography illegal⁠

You’ll never be able to fully solve it. But you can do a bit better, or a bit worse, at it. And there is a trade-off with protocols that makes it harder to drive them off.

The best anyone can hope to do is drive those dipshits off decent platforms and into rancid little shitholes where no one but the Nazis go to play.

Swapping to protocols would make that sort of disruption harder to do, for the same reason it would be harder for people like Musk to disrupt things that are good. It’s a double-edged sword.

You can definitely argue the trade-off is worth it, but it’s important to be clear there is a trade-off.

This comment has been deemed insightful by the community.
Stephen T. Stone (profile) says:

Re: Re: Re:

Swapping to protocols would make that sort of disruption harder to do

A protocol is not a magic megaphone that forces speech onto everyone else using the protocol. To wit: Right-wing shitpits built using Mastodon don’t tend to have the reach of other Masto instances precisely because many of those other Masto instances defederate with the shitpits.

Arianity says:

Re: Re: Re:2

A protocol is not a magic megaphone that forces speech onto everyone else using the protocol.

I didn’t say that it was. Quite the opposite. Hence why I said you just won’t have to look at it.

To wit: Right-wing shitpits built using Mastodon don’t tend to have the reach of other Masto instances precisely because many of those other Masto instances defederate with the shitpits.

I’m not saying you’re forced to connect to them, or that they have the same reach. But it does mean they have the tools for fully functional masto instances. And all we can do about it is defederate; there’s no way to drive them off the protocol entirely. That’s not nothing.

Stephen T. Stone (profile) says:

Re: Re: Re:3

it does mean they have the tools for fully functional masto instances

So what? They would have the tools for websites if they could learn HTML/CSS and pay for web hosting. The ability of a Nazi to use the same tools as the rest of us doesn’t, and shouldn’t, preclude the rest of us from using those tools.

Arianity says:

Re: Re: Re:4

So what? They would have the tools for websites if they could learn HTML/CSS and pay for web hosting.

Yes, that’s another example of something where having access to a protocol means they can use it. The lower the barrier to entry, the easier it is for them to use it. It’s not an insurmountable barrier, but it is a barrier.

You could say the same thing about social media right now. Anyone can spin up a Masto server, or a website. They mostly don’t, though. But the easier it is, the more people will do it.

The ability of a Nazi to use the same tools as the rest of us doesn’t, and shouldn’t, preclude the rest of us from using those tools.

I think that is a reasonable stance, but I think it should be stated explicitly when talking about protocols over platforms. Instead of leaving it unsaid, which makes it sound rosier than it is.

Stephen T. Stone (profile) says:

Re: Re: Re:5

I think that is a reasonable stance, but I think it should be stated explicitly when talking about protocols over platforms.

I did state it explicitly. But in case you need it further personalized: You can’t deny horrible people access to the same tools you use for expressing yourself without giving someone else the power to cut off your access to those tools because they think you’re a horrible person.

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re:

“set up a variety of approaches” is a very polite way of saying the Nazis will still have a platform, you just won’t have to look at it.

Which is no different from having them thrown off the platform you use, and them using a different platform. In either case they end up with a smaller audience.

Arianity says:

Re: Re:

Which is no different from having them thrown off the platform you use, and them using a different platform.

I would say it’s pretty different. Giving them a fully fledged protocol they can use just as much as anyone else is a much stronger tool than them having to build a platform themselves.

It’s not simple to build up platforms, as we’ve seen. That’s the whole point of swapping to a protocol, after all. It makes it harder for someone like a Musk to silence people for capricious reasons. That goes the other way, too, though: it makes it harder to silence Nazis as well. That’s a double-edged sword.

Arianity says:

Re: Re: Re:2

They can use the available free/open source protocols and software to build their own platforms.

Which is exactly the sort of cost that comes with those protocols being open source. And that cost only gets bigger the more developed and normalized those protocols are.

Nobody is required to link to them, and indeed their sites are blocked by most of the fediverse.

I didn’t say they were. Quite the opposite, I said you just won’t have to look at it.

Stephen T. Stone (profile) says:

Re: Re: Re:

Nazis and other such bigots having a platform built on an open-source protocol doesn’t guarantee them freedom of reach. Even if they can access the protocol, that doesn’t mean they’ll be able to access everyone else who uses it. Look at Mastodon: Every instance gets to decide their own rules for federation, which means a new Masto instance is likely to set up a TOS that all but requires defederation with right-wing shitpits.

Arianity says:

Re: Re: Re:2

Nazis and other such bigots having a platform built on an open-source protocol doesn’t guarantee them freedom of reach.

It seems likely it will tend to give them more reach, compared to the alternative.

Even if they can access the protocol, that doesn’t mean they’ll be able to access everyone else who uses it

Sure, but it will make it easier to onboard people onto a Mastodon instance than, say, Stormfront or whatever.

It doesn’t guarantee people will see it, or be forced onto it. But the same tools that make onboarding easy on something like an art Masto server also work for Nazis.

Stephen T. Stone (profile) says:

Re: Re: Re:3

It seems likely it will tend to give them more reach

Any form of communication that isn’t a closed-off echo chamber will give them some level of reach. You can’t prevent that unless you actively work to prevent their speech from being published in the first place.

it will make it easier to onboard people onto a Mastodon instance than say, Stormfront or whatever

So what? That doesn’t mean Nazis are going to have access to the wider Fediverse from their Nazi instance. If anything, that instance being an openly Nazi instance would likely get that instance defederated from most of the Fediverse in short order.

the same tools that make onboarding easy on something like an art Masto server also work for Nazis

So what? We can’t deny Nazis the right to use the same tools of personal expression that the rest of us get to use⁠—not if we still want a country where the law guarantees everyone the right to express themselves without government interference, anyway.

Look, I get it: You don’t want Nazi speech on the Internet. I wouldn’t want Nazi speech on the Internet, either. But you’re coming damn close to saying “we should deny Nazis their right to free speech”. As much as I loathe Nazis, their speech, and their ideology, I can’t support the idea of denying them their civil rights because their ideology is bigoted and genocidal.

And before you say anything: Yes, I hate being in this position. I don’t want to defend Nazis or their speech. But the right of free speech must apply to the most vile and reprehensible people you can imagine. Only by protecting the worst of us can the best of us feel free to say “fuck off, you Nazi sons of bitches”.

This comment has been flagged by the community.

Anonymous Coward says:

I always find the mental gymnastics here so interesting. You claim to disagree with censorship, yet happily signal boost stories of where censorship was, in your eyes, required. Additionally, they continually have a very Left Wing bias. Where is the discussion of ANTIFA terrorists getting banned for calling for the deaths of innocents? Or discussion on the Left’s continued insistence to censor speech wherever possible? It isn’t possible that you willfully ignore one side of the same coin because you accept the lies they spew, right Mikey?

Genuinely, some of the most ignorant and dishonest content on this website.

This comment has been deemed insightful by the community.
Stephen T. Stone (profile) says:

Re:

You claim to disagree with censorship, yet happily signal boost stories of where censorship was, in your eyes, required.

Show me where the Nazis who got kicked off Substack had their civil rights infringed by being kicked off Substack. I’ll wait.

Where is the discussion of ANTIFA terrorists getting banned for calling for the deaths of innocents?

It’s in the same place as the discussion of right-wing terrorists getting banned for the same reason: Nowhere, because that decision isn’t controversial⁠—it’s what should be happening on any decent platform.

Or discussion on the Left’s continued insistence to censor speech wherever possible?

Remind me, which political party is most closely aligned with book bans happening all across the country and the groups seeking those bans? 🤔

It isn’t possible that you willfully ignore one side of the same coin because you accept the lies they spew, right Mikey?

This site has skewered a number of left-wingers/Democrats for their dumb bullshit. That right-wingers/Republicans do way dumber shit way more often is their own damn fault.

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re:

The Soviet Union is gone, Russia is effectively a dictatorship, and China is a totalitarian one-party state.

The few Soviet-trained saboteurs left are old and dying and can do no more than yell at clouds at best.

Meanwhile, it’s the white supremacists and neo-Nazis who not only call for censorship, but are more than happy to swat, dox and, oh, right, be insurrectionists.

So, antidirt, or whoever you are, care to tell me how a bunch of ageing and dying commies are censoring people?

This comment has been flagged by the community.

LittleCupcakes says:

Re: Re:

First, as usual, let me correct an error.

Substack isn’t “monetizing” the offending columns. It noted that those columns had zero subscribers (and a very small readership, thank goodness).

Second, I’m unaware of any state or federal law passed or proposed that generally prohibits talking about and/or learning about abortion. I would like to see evidence of such an absurd claim.

Plenty of Commies and other Democrats child-rape as well you bigoted POS.

Anonymous Coward says:

Re: Re: Re:

Second, I’m unaware of any state or federal law passed or proposed that generally prohibits talking about and/or learning about abortion.

Try looking at Texas…

the law’s broad language suggests lawsuits may be brought by private citizens against those who aid, abet or perform abortions.

And that makes providing information about abortion risky.

This comment has been flagged by the community.

Richard Steven Hack (user link) says:

This is just stupid

“Because a site that caters to Nazis is not a site that caters to free speech. Because (as we’ve seen time and time again), such sites drive away people who don’t like being on a site associated with Nazis. And, so you’re left in a situation where you’re really just supporting Nazis and not much else.”

So losing subscribers is the same as “not catering to free speech.”

This is probably the most illogical thing Masnick has ever written. There is literally zero logical connection between those two propositions.

YOU CATER TO FREE SPEECH BY ALLOWING FREE SPEECH. If you lose subscribers as a result, that may be unfortunate and you may have to decide whether you are really in support of free speech, and taking the weaselly way out may not be any better, but it has absolutely nothing to do with whether you in fact are allowing free speech.

Of course, in the case where you are subject to the laws of the state, which forbid various forms of speech, you may not have a choice in the matter if your lawyers tell you so. This seems to have been the case here.

Which Masnick apparently can’t comprehend. Instead he spends his time complaining about how Nazis have a right to free speech. Which in fact, they do. As someone pointed out recently in a video I watched, yes, Illinois Nazis had the right to march in Illinois. Does Masnick want to have a hissy fit about that? Fine – he has the right to free speech as well and that includes having a hissy fit about Substack allowing Nazis on its platform.

But Masnick cannot logically hold that 1) he supports free speech, 2) Nazis don’t have a right to free speech, and 3) (even more illogically) that Substack, by allowing Nazis to have free speech on its platform, is “supporting Nazis.”

Again, the most muddled article Masnick has written in my memory.

This comment has been flagged by the community.

Anonymous Coward says:

I don’t understand why people concern themselves with what other people write for other people to read. Clearly the goal of people who would never read the content but try to block it is to prevent others who want to read the content from reading it. That reasoning goes against the principle of free speech. Note: I agree platforms are free to block the content, and it’s obviously not a 1A violation to pressure companies into blocking the content.

Anonymous Coward says:

Re:

Clearly there is a choice to read a web site, or not read it if you find the content objectionable. Whether or not people want to read a site does depend on how the site is moderated, as that is the major decider of what people will read on the site. Many bigoted people realize that the general public will not visit their dedicated sites, and so wish to force themselves onto sites with larger audiences, as otherwise their only audience is people like them.

Bloof (profile) says:

Re:

In this case they are using the people subscribing to the work of actual journalists, scholars, scientists, tech bloggers and creative writers to subsidise far-right content and other disinfo peddlers, and to give neo-Nazis, whom payment processors don’t want to touch because they’re Nazis, a platform that they can use to organise and fundraise. This isn’t ‘you can’t read that’; this is authors and subscribers of Substack saying ‘we do not want to be associated with this content; you can either have the legitimacy provided by people who actually want to create and make the world better, or you can have Nazis, but not both.’

This comment has been flagged by the community.

PaulT (profile) says:

Re:

“I don’t understand why people concern themselves with what other people write for other people to read”

Yet, you just did it by commenting.

Substack can allow Nazis all they want. Others can state that they’re not happy with Nazis being there and avoid going there or paying them money. They can also speak elsewhere to state why they made that decision.

This comment has been flagged by the community.

Anonymous Coward says:

“…Substack co-founder Hamish McKenzie put out a ridiculous self-serving statement pretending that their decision to help monetize Nazis was about civil liberties…”

Because Jews, Gypsies, ‘asocials’, black people, disabled people, freemasons, gay people, Jehovah’s Witnesses, Polish and Slavic citizens, POWs, political opponents of neo-Nazis, and trade unionists are not deserving of civil liberties. Amirite? /s


Stephen T. Stone (profile) says:

Re: Re: Re:3

Let’s say you want the government to have the power to kill Nazis without consequence for no reason other than “Nazis suck”. Any power given to the government can and will be used by the worst people for the worst reasons⁠—which means that, under the right regime, the government could and would execute queer people, immigrants, and the homeless. How badly do you want that tradeoff?

This comment has been flagged by the community.

Anonymous Coward says:

Re: Re: Re:4

Why do you assume I want “the government” to do the difficult, unpleasant work of denazifying the country?

What I suggest is that we progress in waves, with neighborhoods (or even households, I guess) purging the Nazi vermin who infest their housing plans. After all, who better knows who the Nazis are on your street than you?

I know who the Nazis on my street are! One of them flies a Gadsden flag (albeit below the level of the American flag that’s also there). The other said he’s voting for Vivek in the primary (closed primary state) but assumes Trump will be the general candidate, and so will cast for DJT in the fall w/o hesitation, even if he’s been convicted and imprisoned. A lot further down the street (but still in the same plan) is a school board member (Republican, of course) who doesn’t want to raise taxes to increase education spending, and objects to implementing a fully revised curriculum friendly to LGBTQ+ students, which teachers in the early grades are working so hard to produce.

Anyway, my point is that we should take care of the Nazis around us. Once you’ve removed from society all the Nazis you know personally, then maybe you band with others who live in different parts of your town to help them deal with their infestation… and the program expands from there. But it’s got to be The People carrying it out so no Nazis are missed.
