Substack CEO Chris Best Doesn’t Realize He’s Just Become The Nazi Bar

from the just-fucking-own-it dept

I get it. I totally get it. Every tech dude comes along and has this thought: “hey, we’ll be the free speech social media site. We won’t do any moderation beyond what’s required.” Even Twitter initially thought this. But then everyone discovers reality. Some discover it faster than others, but everyone discovers it. First, you realize that there’s spam. Or illegal content such as child sexual abuse material. And if that doesn’t do it for you, the copyright police will.

But, then you realize that beyond spam and content that breaks the rules, you end up with malicious users who cause trouble. And trouble drives away users, advertisers, or both. And if you don’t deal with the malicious users, the malicious users define you. It’s the “oh shit, this is a Nazi bar now” problem.

And, look, sure, in the US, you can run the Nazi bar, thanks to the 1st Amendment. But running a Nazi bar is not winning any free speech awards. It’s not standing up for free speech. It’s building your own brand as the Nazi bar and abdicating your own free speech rights of association to kick Nazis out of your private property, and to craft a different kind of community. Let the Nazis build their own bar, or everyone will just assume you’re a Nazi too.

It was understandable a decade ago, before the idea of “trust & safety” was a thing, that not everyone would understand all this. But it is unacceptable for the CEO of a social media site today to not realize this.

Enter Substack CEO Chris Best.

Substack has faced a few controversies regarding the content moderation (or lack thereof) of its main service, which allows writers to create blogs with subscription services built in. I have been a fan of the service since it launched (I actually spoke with one of the founders pre-launch to discuss the company’s plans, and even whether we could do something with them at Techdirt), as I think it’s been incredibly powerful as a tool for independent media. But the exec team there often seems to have taken a head-in-the-sand approach to understanding any of this.

That became ridiculously clear on Thursday when Chris Best went on Nilay Patel’s Decoder podcast at The Verge to talk about Substack’s new Notes product, which everyone is (fairly or not) comparing to Twitter. Best had to know that content moderation questions were coming, but he seemed not just unprepared for them, but completely out of his depth.

This clip is just damning. Chris trying to stare down Nilay simply doesn’t work.

@decoderpod: Our host Nilay asked Substack CEO Chris Best the tough questions about whether racist speech should be allowed in their new consumer product, Substack Notes. #techtok #technews #substack #ceo

The larger discussion is worth listening to, or reading below. As Nilay notes in his commentary on the transcript, he feels that there should be much less moderation the closer you get to being an infrastructure provider (this is something I not only agree with, but have spent a lot of time discussing). Substack has long argued that its more hands-off approach in providing its platform to writers is because it’s more like infrastructure.

But the Notes feature takes the company closer to consumer-facing social media, and so Nilay had some good questions about that, which Chris just refused to engage with. Here’s the fuller context beyond just the video above. Nilay asks the questions (he speaks first below) and Chris answers, with the two alternating throughout:

Notes is the most consumer-y feature. You’re saying it’s inheriting a bunch of expectations from the consumer social platforms, whether or not you really want it to, right? It’s inheriting the expectations of Twitter, even from Twitter itself. It’s inheriting the expectations that you should be able to flirt with people and not have to subscribe to their email lists.

In that spectrum of content moderation, it’s the tip of the spear. The expectations are that you will moderate that thing just like any big social platform will moderate. Up until now, you’ve had the out of being able to say, “Look, we are an enterprise software provider. If people don’t want to pay for this newsletter that’s full of anti-vax information, fine. If people don’t want to pay or subscribe to this newsletter where somebody has harsh views on trans people, fine.” That’s the choice. The market will do it. And because you’re the enterprise software provider, you’ve had some cover. When you run a social network that inherits all the expectations of a social network and people start posting that stuff and the feed is algorithmic and that’s what gets engagement, that’s a real problem for you. Have you thought about how you’re going to moderate Notes?

We think about this stuff a lot, you might be surprised to learn.

I know you do, but this is a very different product.

Here’s how I think about this: Substack is neither an enterprise software provider nor a social network in the mold that we’re used to experiencing them. Our self-conception, the thing that we are attempting to build, and I think if you look at the constituent pieces, in fact, the emerging reality is that we are a new thing called the subscription network, where people are subscribing directly to others, where the order in the system is sort of emergent from the empowered — not just the readers but also the writers: the people who are able to set the rules for their communities, for their piece of Substack. And we believe that we can make something different and better than what came before with social networking.

The way that I think about this is, if we draw a distinction between moderation and censorship, where moderation is, “Hey, I want to be a part of a community, of a place where there’s a vibe or there’s a set of rules or there’s a set of norms or there’s an expectation of what I’m going to see or not see that is good for me, and the thing that I’m coming to is going to try to enforce that set of rules,” versus censorship, where you come and say, “Although you may want to be a part of this thing and this other person may want to be a part of it, too, and you may want to talk to each other and send emails, a third party’s going to step in and say, ‘You shall not do that. We shall prevent that.’”

And I think, with the legacy social networks, the business model has pulled those feeds ever closer. There hasn’t been a great idea for how we do moderation without censorship, and I think, in a subscription network, that becomes possible.

Wow. I mean, I just want to be clear, if somebody shows up on Substack and says “all brown people are animals and they shouldn’t be allowed in America,” you’re going to censor that. That’s just flatly against your terms of service.

So, we do have a terms of service that have narrowly prescribed things that are not allowed.

That one I’m pretty sure is just flatly against your terms of service. You would not allow that one. That’s why I picked it.

So there are extreme cases, and I’m not going to get into the–

Wait. Hold on. In America in 2023, that is not so extreme, right? “We should not allow as many brown people in the country.” Not so extreme. Do you allow that on Substack? Would you allow that on Substack Notes?

I think the way that we think about this is we want to put the writers and the readers in charge–

No, I really want you to answer that question. Is that allowed on Substack Notes? “We should not allow brown people in the country.”

I’m not going to get into gotcha content moderation.

This is not a gotcha… I’m a brown person. Do you think people on Substack should say I should get kicked out of the country?

I’m not going to engage in content moderation, “Would you or won’t you this or that?”

That one is black and white, and I just want to be clear: I’ve talked to a lot of social network CEOs, and they would have no hesitation telling me that that was against their moderation rules.

Yeah. We’re not going to get into specific “would you or won’t you” content moderation questions.

Why?

I don’t think it’s a useful way to talk about this stuff.

But it’s the thing that you have to do. I mean, you have to make these decisions, don’t you?

The way that we think about this is, yes, there is going to be a terms of service. We have content policies that are deliberately tuned to allow lots of things that we disagree with, that we strongly disagree with. We think we have a strong commitment to freedom of speech, freedom of the press. We think these are essential ingredients in a free society. We think that it would be a failure for us to build a new kind of network that can’t support those ideals. And we want to design the network in a way where people are in control of their experience, where they’re able to do that stuff. We’re at the very early innings of that. We don’t have all the answers for how those things will work. We are making a new thing. And literally, we launched this thing one day ago. We’re going to have to figure a lot of this stuff out. I don’t think…

You have to figure out, “Should we allow overt racism on Substack Notes?” You have to figure that out.

No, I’m not going to engage in speculation or specific “would you allow this or that” content.

You know this is a very bad response to this question, right? You’re aware that you’ve blundered into this. You should just say no. And I’m wondering what’s keeping you from just saying no.

I have a blanket [policy that] I don’t think it’s useful to get into “would you allow this or that thing on Substack.”

If I read you your own terms of service, will you agree that this prohibition is in that terms of service?

I don’t think that’s a useful exercise.

Okay. I’m granting you the out that when you’re the email service provider, you should have a looser moderation rule. There are a lot of my listeners and a lot of people out there who do not agree with me on that. I’ll give you the out that, as the email service provider, you can have looser moderation rules because that is sort of a market-driven thing, but when you make the consumer product, my belief is that you should have higher moderation rules. And so, I’m just wondering, applying the blanket, I understand why that was your answer in the past. It’s just there’s a piece here that I’m missing. Now that it’s the consumer product, do you not think that it should have a different set of moderation standards?

You are free to have that belief. And I do think it’s possible that there will be different moderation standards. I do think it’s an interesting thing. I think the place that we maybe differ is you’re coming at this from a point where you think that because something is bad… let’s grant that this thing is a terrible, bad thing…

Yeah, I think you should grant that this idea is bad.

That therefore censorship of it is the most effective tool to prevent that. And I think we’ve run, in my estimation over the past five years, however long it’s been, a grand experiment in the idea that pervasive censorship successfully combats ideas that the owners of the platforms don’t like. And my read is that that hasn’t actually worked. That hasn’t been a success. It hasn’t caused those ideas not to exist. It hasn’t built trust. It hasn’t ended polarization. It hasn’t done any of those things. And I don’t think that taking the approach that the legacy platforms have taken and expecting it to have different outcomes is obviously the right answer the way that you seem to be presenting it to be. I don’t think that that’s a question of whether some particular objection or belief is right or wrong.

I understand the philosophical argument. I want to be clear. I think government speech regulations are horrible, right? I think that’s bad. I don’t think there should be government censorship in this country, but I think companies should state their values and go out into the marketplace and live up to their values. I think the platform companies, for better or worse, have missed it on their values a lot for a variety of reasons. When I ask you this question, [I’m asking], “Do you make software to spread abhorrent views, that allows abhorrent views to spread?” That’s just a statement of values. That’s why you have terms of service. I know that there’s stuff that you won’t allow Substack to be used for because I can read it in your terms of service. Here, I’m asking you something that I know is against your terms of service, and your position is that you refuse to say it’s against your terms of service. That feels like not a big philosophical conversation about freedom of speech, which I will have at the drop of a hat, as listeners to this show know. Actually, you’re saying, “You know what? I don’t want to state my values.” And I’m just wondering why that is.

I think the conversation about freedom of speech is the essential conversation to have. I don’t think this “let me play a gotcha and ask this or that”–

Substack is not the government. Substack is a company that competes in the marketplace.

Substack is not the government, but we still believe that it’s essential to promote freedom of the press and freedom of speech. We don’t think that that is a thing that’s limited to…

So if Substack Notes becomes overrun by racism and transphobia, that’s fine with you?

We’re going to have to work very hard to make Substack Notes be a great place to have the readers and the writers be in charge, where you can have the kinds of conversations that you find valuable. That’s the exciting challenge that we have ahead of us.

I get the academic aspect of where Chris is coming from. He’s correct that content moderation hasn’t made crazy ideas go away. These are the reasons I coined the Streisand Effect years ago, to point out the futility of just trying to stifle speech. And these are the reasons I talk about “protocols, not platforms” as a way to explore enabling more speech without centralized systems that suppress speech.

But Substack is a centralized system. And a centralized system that doesn’t do trust & safety… is the Nazi bar. And if you have some other system that you think allows for “moderation but not censorship” then be fucking explicit about what it is. There are all sorts of interventions short of removing content that have been shown to work well (though, with other social media, they still get accused of “censorship” for literally expressing more speech). But the details matter. A lot.

I get that he thinks his focus is on providing tools, but even so, two things stand out: (1) he’s wrong about how all this works, and (2) even if he believes that Substack doesn’t need to moderate, he has to own that in the interview rather than claiming that Nilay is playing gotcha with him.

If you’re not going to moderate, and you don’t care that the biggest draws on your platform are pure nonsense peddlers preying on the most gullible people to get their subscriptions, fucking own it, Chris.

Say it. Say that you’re the Nazi bar and you’re proud of it.

Say “we believe that writers on our platform can publish anything they want, no matter how ridiculous, or hateful, or wrong.” Don’t hide from the question. You claim you’re enabling free speech, so own it. Don’t hide behind some lofty goals about “freedom of the press” when you’re really enabling “freedom of the grifters.”

You have every right to allow that on your platform. But the whole point of everyone eventually coming to terms with the content moderation learning curve, and the fact that private businesses are private and not the government, is that what you allow on your platform is what sticks to you. It’s your reputation at play.

And your reputation when you refuse to moderate is not “the grand enabler of free speech.” Because it’s the internet itself that is the grand enabler of free speech. When you’re a private centralized company and you don’t deal with hateful content on your site, you’re the Nazi bar.

Most companies that want to get large enough recognize that playing to the grifters and the nonsense peddlers works for a limited amount of time, before you get the Nazi bar reputation, and your growth is limited. And, in the US, you’re legally allowed to become the Nazi bar, but you should at least embrace that, and not pretend you have some grand principled strategy.

This is what Nilay was getting at. When you’re not the government, you can set whatever rules you want, and the rules you set are the rules that will define what you are as a service. Chris Best wants to pretend that Substack isn’t the Nazi bar, while he’s eagerly making it clear that it is.

It’s stupidly short-sighted, and no, it won’t support free speech. Because people who don’t want to hang out at the Nazi bar will just go elsewhere.

Companies: substack


Comments on “Substack CEO Chris Best Doesn’t Realize He’s Just Become The Nazi Bar”

172 Comments
This comment has been deemed insightful by the community.
Samuel Abram (profile) says:

Chris Worst: The GOP Politician

There used to be a certain brand of GOP politician who wouldn’t say outright what they believed (now all of them are cuckoo bananas like Marjorie Taylor Greene). Chris Worst’s waffling and weaseling reminded me of those masked GOP politicos.

This comment has been deemed insightful by the community.
Anonymous Coward says:

Wow, yeah, just apply your TOS and/or firmly held philosophical beliefs and answer the damn question.

I guess there are people who will give that bucket of crap response the benefit of the doubt, somehow. And the Nazis. They’re def in.

This comment has been deemed insightful by the community.
Anonymous Coward says:

Chris, if you’re reading this, you are in Stage 1 of the Nazi Bar feeling: Denial.

Unless you’re pretending and you’ve always wanted to wage a genocidal war against the rest of America.

There are still a lot of good people on Substack, like Ken White, Dave Karpf, and a few black writers I’ve had the pleasure of reading. Please speedrun through the 5 stages of the Nazi Bar feeling and ask Mike what to do about these Nazis in your bar.

haitchfive (user link) says:

Re: People who love the nazi bar

Good people? By some standards, sure.
Lovely people to talk to about the garden and football. But they know where they are.

They are at the nazi bar.

They are free speech maximalists too, at the expense of everything else, including people who look like Nilay

Steven (profile) says:

We need a heavy handed dictatorship!

Seriously though, I think a Twitter-like site that came out and said, “We are going to be ruthless in banning intolerance, misinformation, and other nonsense,” would actually do pretty well. That is, assuming it could be done in a way that didn’t catch too many innocents in the crossfire.

And no, I don’t care about the paradox of tolerance. I’m fine being intolerant of intolerance.

time traveler says:

Re: I come from the future

So, uh, just Threads then? If you can get past Meta’s glaring faults, they do have heavy-handed moderation on their Twitter clone (no nudes is crazy, though).

This comment has been deemed insightful by the community.
Thad (profile) says:

I can think of at least two things wrong with that headline.

Substack Notes may be a new feature, but Substack appealing to bigots isn’t new. It’s also not an unintended consequence; it’s a deliberate move on the part of management.

I suppose it’s possible that they simply don’t understand what they’re doing, that they’re courting bigots in the name of “diversity of opinion” as some kind of both-sides View from Nowhere approach. That’s how a lot of the mainstream press works (New York Times, looking at you). Maybe they really are just hopelessly naive. But I don’t think so. Paying a bunch of bigots to post on Substack and not disclosing it is the behavior of a publication that knows it’s doing something questionable.

This comment has been deemed insightful by the community.
Stephen T. Stone (profile) says:

Re:

it’s possible that they simply don’t understand what they’re doing

I know you already addressed this in your comment, but I’d like to chime in with something: Based on how Best framed questions about moderation of racist content as “gotchas”, he knows exactly what he’s doing.

Bobson Dugnutt (profile) says:

Re: Re: Re:

What Substack did/does is bad but forgivable. Substack could release its slate of paid contributors, or it could be leaked, but however the information gets out there, would it hurt Substack as an institution or the quality of its paid vs. voluntary creators?

What is extremely unethical? Well, Fox News, by the nature of its very existence. It leaves the word “News” in its business name ironically, like a trucker hat and Pabst Blue Ribbon for a hipster. Fox News serves the same function for the Republican Party as the Grand Inquisitor does for the Vatican.

Something else that’s extremely unethical: Forbes. This is something that horrifies journalists. Lewis D’Vorkin pivoted Forbes from a business magazine with some cachet into an SEO play. D’Vorkin was the one who brought the Forbes contributor model to fruition. The less polite term journalists call this is “credibility whoring.”

A contributor could write about a sphere of expertise, usually something they have a vested interest in (like a PR person specializing in alcohol brands who writes about trends in the beverage industry). Most of Forbes’ traffic comes through its contributor network, though it still maintains a traditional reporting staff and editorial control in house.

However, anything published under the Forbes masthead, contributor or staff writer, is credited to Forbes. “A Forbes article,” “as seen in Forbes,” “Forbes said”, etc.

What gets worse is that there’s an industry of web and social marketing that charges thousands of dollars for a “Forbes” piece (always by a contributor), and SEO experts who use these Forbes contributor articles to get a top ranking position on search engines.

Forbes knows about this and encourages this flim flam because it’s good for its bottom line.

This comment has been deemed insightful by the community.
Andrew F (profile) says:

grand experiment in the idea that pervasive censorship successfully combats ideas that the owners of the platforms don’t like. And my read is that that hasn’t actually worked

Of course it works. It doesn’t work perfectly, in the sense that traffic laws don’t work perfectly but still reduce fatalities.

But there’s a reason 8chan is 8chan and every other site is not.

Anonymous Coward says:

Jesus Christ.

I find it very hard to believe he isn’t sympathetic to right-wing ideas. I mean, Substack is already TERF central, and the infrastructure provider argument is completely bogus. Fuck this guy and his shitty site.

This comment has been deemed insightful by the community.
weevie833 (profile) says:

The data

First, this is the most sharply concise confrontation with a CEO I have ever seen on this issue and I appreciate the author for having curated it.

What upsets me here, however, is Best’s cynical claim that efforts to censor hate speech are pointless:

“And my read is that that hasn’t actually worked. That hasn’t been a success. It hasn’t caused those ideas not to exist. It hasn’t built trust. It hasn’t ended polarization. It hasn’t done any of those things. And I don’t think that taking the approach that the legacy platforms have taken and expecting it to have different outcomes is obviously the right answer the way that you seem to be presenting it to be.”

First, he is missing the point, completely – it’s not about “ending” bad ideology – it’s about preventing it from metastasizing. There will always be Nazis. The least we can do is limit its reach through some kind of corporate ethics.

Second, he cannot make any kind of claim that “censorship doesn’t work” unless there is some kind of legitimate study that measures some degree of influence (don’t ask me how) under conditions “X” versus conditions “Y.” He’s just being lazy.

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re:

I feel like he said that because it’s part of the rightwing fantasy world that social networks have been heavy handed in combating bigotry, when it’s really the opposite.

This comment has been deemed insightful by the community.
kevin says:

Re:

He’s not missing the point, he’s deliberately avoiding it. Substack has a history of recruiting and promoting bigots. He’s okay with it but doesn’t want to come out and say that.

This comment has been flagged by the community.

Adolfina says:

Re: Re:

I think I can summarize the sentiment of the Woke Snowflake Brigade here: Let’s ban anyone who says anything we don’t like and who espouses views which we cannot reconcile with our myopic view of a world filtered through the narrow lens of our own existential neurosis.

Free speech isn’t something you can selectively apply to things you agree with while demanding everything else be moderated or censored. It either applies to all or none equally.

The irony of a “brown” guy provoking a SaaS CEO for NOT being a moderation “Nazi” is just pure comedy gold. He whipped out the race card and used it as the basis for bullying Chris Best into a corner. It’s an underhanded, cowardly, sleazy tactic.

To the ultra-lib congregation here, however, he’s a fucking hero for being such an obnoxious bully.

But I digress.

Flippantly calling someone a Nazi likens them to a cabal of genocidal warlords and is a stupid, grotesque exercise in ridiculous hyperbole.

Racism, bigotry, hatred and discrimination have existed for millennia and this will never, ever change. It is a cold, ugly fact of humanity.

Here’s what I propose:

Let’s just ban the Internet because someone somewhere might say something mean. Hell, let’s ban movies and books and television shows and bands and music you disagree with.

Let’s ban the left wing, the middle and the right wing. Let’s moderate, revise, redact everything from history to literature to politics.

I suppose all you would be left with are instructional coloring books about little six-year-old Tommy’s forced hormone suppression therapy, a regimented lexicon of allowed words and phrases and those patently milquetoast Ed Sheeran albums on permaloop.

Alternatively, you could all just find something more important to do with your lives besides getting butt-hurt every time your fragile idealism gets crushed by the big mean world…

Anonymous Coward says:

Re: Re: Re:

“Alternatively, you could all just find something more important to do with your lives ”

Says the person who spewed minutia rant remnants on a blog.

This comment has been deemed insightful by the community.
Stephen T. Stone (profile) says:

Re: Re: Re:

I think I can summarize the sentiment of the Woke Snowflake Brigade here: Let’s ban anyone who says anything we don’t like and who espouses views which we cannot reconcile with our myopic view of a world filtered through the narrow lens of our own existential neurosis.

Every accusation, a confession.

bhull242 (profile) says:

Re: Re: Re:

I think I can summarize the sentiment of the Woke Snowflake Brigade here: Let’s ban anyone who says anything we don’t like and who espouses views which we cannot reconcile with our myopic view of a world filtered through the narrow lens of our own existential neurosis.

Well, you sure failed at summarizing the actual sentiment, so you thought wrong.

Racism, bigotry, hatred and discrimination have existed for millennia and this will never, ever change. It is a cold, ugly fact of humanity.

No one said otherwise. The issue is, “Do I want it on my property for everyone to see?”, “What likely social and/or financial consequences might I see if I allow this to remain on my property for everyone to see?”, and “Do I want to help spread this?” on the part of the platform holders, while it is, “Do I want to see this on my social media feed (or whatever)?” or, “Do I want to support a platform that has this on their property?” for the “woke” users. And for us, it’s largely about owning up to what kind of platform you want to run.

Let’s just ban the Internet because someone somewhere might say something mean. Hell, let’s ban movies and books and television shows and bands and music you disagree with.

Let’s ban the left wing, the middle and the right wing. Let’s moderate, revise, redact everything from history to literature to politics.

That would be a combination of throwing the baby out with the bathwater and cutting off your nose to spite your face, especially since it also misses the entire point.

I suppose all you would be left with are instructional coloring books about little six-year-old Tommy’s forced hormone suppression therapy, […]

How many times do we have to point out that none of us, and virtually no transgender people or pro-trans activists, support forced hormone suppression therapy at all, let alone for young children? Nor do we support hormone suppression therapy for children below the safe age for its use, and we hold that the therapy should only be provided (at least to minors) after long, detailed consultations with a licensed medical practitioner and a diagnosis that would make such treatment medically useful.

Moreover, outside of one or two isolated instances at best, there is no evidence that that sort of thing even happens at all, let alone as an intended goal and/or a larger trend. This is a strawman.

Alternatively, you could all just find something more important to do with your lives besides getting butt-hurt everytime your fragile idealism gets crushed by the big mean world…

Pot, meet kettle.

This comment has been deemed insightful by the community.
nasch (profile) says:

Re: Re: Re:2

support forced hormone suppression therapy at all, let alone for young children.

It doesn’t even make sense to talk about hormone suppression therapy for young children, because they’re not experiencing the puberty hormones that the therapy is designed for. There’s nothing to suppress. But of course that doesn’t stop the idiot right.

That One Guy (profile) says:

Re: Re: Re:2

How many times do we have to point out that none of us, and virtually no transgender people or pro-trans activists, support forced hormone suppression therapy at all, let alone for young children.

It is difficult to get a person to understand or admit to understanding something when their entire argument depends upon their real or feigned ignorance of it.

Every medical professional in the world could jointly release a statement saying that the person you are talking to is wrong/lying about what’s actually being done, and unless they are willing to admit that they’ve been beating up a strawman in order to satisfy their bigotry and/or genital obsession, it still wouldn’t change a thing.

C. E. Ramsey (profile) says:

Re: Re: Re: Bigotry

“Racism, bigotry, hatred and discrimination have existed for millennia and this will never, ever change. It is a cold, ugly fact of humanity.”

Nobody is denying that.

You know, when I was four years old, a Nazi in full Nazi regalia tried to check into my grandparents’ motel. That was just 16 years after the end of WWII. Just 16 years since my grandpa had been shot at while riding tail gunner in a B-17.

And when this Nazi showed up, my grandpa didn’t give him a room. He physically threw him out of the motel office when he refused to leave.

The guy whined, called the cops, and when they arrived, they told him to move his ass on down the road because if my grandpa didn’t beat the crap out of him, the cops would.

That Nazi still had freedom of speech but the businesses there weren’t going to tolerate it. My grandpa didn’t want Nazis coming to his motel.

Chris Best is making a choice to give space to Nazis. This isn’t censorship. It’s a business making a decision about its customer base. Maybe Chris Best wants those Nazis but he should embrace it. Because those of us that don’t want to do business with Nazis, that don’t want to associate with them – we’ll take our business elsewhere.

And here you are trying to conflate the actions of a private business with government censorship. You are either very confused or you’re doing it intentionally, which is a very common Nazi tactic.

So congratulations, Adolfina. We know who and what you are.

This comment has been deemed insightful by the community.
Stephen T. Stone (profile) says:

Re:

There will always be Nazis. The least we can do is limit its reach

That’s mainly what pisses the “Freeze Peach” crowd off: They believe they (and their awful ideas) are entitled to free reach.

This comment has been flagged by the community. Click here to show it.

tin-foil-hat says:

Re: Re: If I don't agree ...

But it isn’t only Nazis. Some people think anything that offends them or anything they disagree with is hate speech.

Anonymous Coward says:

Re: Re: Re:

Which describes the 74 million and the rabid “warriors” who think it’s okay to harass people for a difference in opinion.

This comment has been deemed insightful by the community.
Orwelldesign says:

Re: Re: Re:2 Depends on the "difference of opinion" doesn't it?

I have a trans son. I am of the opinion that he deserves to exist. Many people are not. A great many people, most of them on the “right” and most of them Republicans.

LGBTQ rights are one of those subjects where there’s two sides and one of them is objectively wrong. That side is full of shitty bigots who try to impose their narrow Christianity on everyone else, as though their opinion on morality should somehow apply to my child.

Eff that. With a rusty spoon. Trans rights are human rights — and the “difference” that says it’s not okay for him to be who he can’t help being? Objectively shitty and wrong.

This comment has been flagged by the community. Click here to show it.

Anonymous Coward says:

Re: Re: Re:3

Your child is whatever sex their body is, and will only ever be that sex. There is no such thing as “deserving to exist”, for your child or anyone else, except in the vacuous sense that, in a free society, no one is required to get permission to exist from anyone else. So your child is entitled to exist, but they are not entitled to have their delusion about their sex recognized as true by anyone else, and even if they live in a place where the government does force people to affirm the trans delusion, there is nothing they can do to make the physical universe comply.

bhull242 (profile) says:

Re: Re: Re:4

Your child is whatever sex their body is, and will only ever be that sex.

Again, that isn’t a statement anyone disagrees with, including transgender people.

So your child is entitled to exist, but they are not entitled to have their delusion about their sex recognized as true by anyone else, […]

Again, they are well aware of what their sex is, which is in agreement with everyone else.

This comment has been flagged by the community. Click here to show it.

Anonymous Coward says:

Re: Re: Re:5

And yet, they seek to force their way into single-sex disc’s for which their bodies disqualify them. Transwomen are no more women than Messianic Jews are Jews. People can assert anything about themselves that they want, but they are not going to be allowed to force other people to affirm their delusions.

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re: Re: Re:6

but they are not going to be allowed to force other people to affirm their delusions.

But you will try to force others to agree with your delusion that sex and gender are the same thing.

This comment has been deemed insightful by the community.
Stephen T. Stone (profile) says:

Re: Re: Re:6

they are not going to be allowed to force other people to affirm their delusions

Yes, yes, we all know that conservatives are working on a national ban on gender-affirming healthcare for trans people of all ages. Or do you think we didn’t hear about Missouri’s state attorney general banning it for adults?

Every street has a destination; no street ever reached its destination by accident. Anti-Queer Street started with attacks on gay people. When those became too much for the American public to stomach, the street branched off to focus its attacks on trans people⁠—a much smaller and much more vulnerable part of the population. Every attack that was tried on gay people is now aimed at trans people, and they’re working with frightening efficacy. In fact, of the ten stages of genocide, anti-trans assholes are up to at least Stage 6, if not Stage 7. Moves like the Missouri AG’s are inarguably an attempt to persecute trans people by denying them their civil rights⁠—after all, if cis people have the right to undergo gender-affirming treatments (e.g., getting breast implants), trans people should have the same right.

“There’s still time to change the road you’re on,” to lift a lyric from a song that is now in the National Recording Registry. Every time you attack trans people, you’re walking down the same street as the people who want to get rid of trans people by any means necessary. Whether you agree with their means or their ends doesn’t matter⁠—you’re repeating their anti-trans rhetoric and sharing the same anti-trans views that serve as the foundation for their increasingly violent hatred. And once they’re done with trans people, they won’t stop there. The street will turn back to the attacks on gay people⁠—as well as all other queer people, natch⁠—and you’ll have to choose whether you’ll join their cause or stand against it.

When those leopards you’re walking with eventually turn on you, don’t come crying to me about it. You don’t deserve and won’t receive my pity for your ignorance and hatred. Repent now or forever guard your face.

This comment has been flagged by the community. Click here to show it.

Anonymous Coward says:

Re: Re: Re:7

You misplace where the violence is occurring. Men with the trans delusion have been physically attacking women who are protesting against men forcing their way into women’s spaces. It should come as no surprise that crazy men denied getting their way lash out at women. It’s what crazy men have done from time immemorial.

Transwomen are men. When they seek to force their way into women’s spaces, it is up to the voters and the law to stop them. Woke gender ideologues attached their lies and delusions to the gay liberation movement, hijacking the goodwill that Americans finally developed towards letting gay people live their lives as they see fit into a coercive force demanding affirmation of lies from people who do not believe them.

The worst part is that when woke gender ideologues colonized the educational and medical professions, it became forbidden to regard gender dysphoria as an illness that should be treated so that sufferers could learn to live comfortably in the only bodies they would ever have. Instead, when children manifest this condition, the colonizing ideologues “support” them by telling them that they do not have to be the sex of their bodies, and lead them on to mutilation and sterilization. And when the colonizers can infect government as well, they even pass laws making illegal treatment for people unhappy with their sexual orientation or gender dysphoria.

Your scaremongering about genocide pales in comparison to the “doctors” cutting the breasts off 13-year-old girls. One is emotional blackmail trying to force people to accept lies against their better judgement. The other is horrible destruction of children actually happening.

This comment has been flagged by the community. Click here to show it.

Anonymous Coward says:

Re: Re: Re:8

You know why girls want to have their breasts chopped off? It’s because straight scum like you made it so hostile for girls, denying them the right to their bodies, that they needed others to do it to them first before you got to them.

Every decision made by a woman or a sexual and gender minority is the result of the harms that straight men have done to them.

Your failure to own your mistakes is why we relish in your mockery.

Anonymous Coward says:

Re: Re: Re:9

Uh.

While I’m aware that breast reduction surgery/mastectomy exists, it’s only done on request and in very dire medical situations, for example to reduce or eliminate the spread of cancer.

That goes for Mr. Troll here, and double for Hyman Rosen.

Anonymous Coward says:

Re: Re: Re:11

I’ll have to admit that this is new to me, and this medical procedure does not publicly happen in my neck of the woods.

The more you know!

This comment has been deemed insightful by the community.
bhull242 (profile) says:

Re: Re: Re:8

Men with the trans delusion have been physically attacking women who are protesting against men forcing their way into women’s spaces.

[citation needed]

The worst part is that when woke gender ideologues colonized the educational and medical professions, […]

Read: When medical professionals and scientists started realizing that sex and gender among humans are a lot more complicated than they originally thought…

[…] it became forbidden to regard gender dysphoria as an illness […]

  1. Not every transgender person has gender dysphoria.
  2. Gender dysphoria is officially recognized as a mental disorder. You just don’t agree with the treatment.

[…] that should be treated so that sufferers could learn to live comfortably in the only bodies they would ever have.

Because all the research had shown that it wasn’t working, and technology had developed to allow us to make bodily modifications (like plastic surgery and hormone replacement therapy) to make their bodies feel more comfortable for them. But yeah, when a treatment method doesn’t work, doctors tend to stop recommending it. There’s no conspiracy or anything there.

Instead, when children manifest this condition, the colonizing ideologues “support” them by telling them that they do not have to be the sex of their bodies, and lead them on to mutilation and sterilization.

Recommending gender-affirming therapy (regardless of the fact that this just doesn’t happen the way you think with regards to children) is not telling them that they do not have to be the sex of their bodies. Literally no one is saying that.

As for the “mutilation and sterilization”, again, those treatments are rarely even offered to minors with gender dysphoria to begin with, and the physiological treatments that are offered nearly always go through many steps of consultations and such before they are ever actually performed. Moreover, the patient can always say “no”, is nearly always fully informed (certain malpractice cases notwithstanding, as malpractice occurs with literally every treatment of any illness and isn’t any more prevalent for this one than anything else), and is not pressured into saying “yes”.

And when the colonizers can infect government as well, they even pass laws making illegal treatment for people unhappy with their sexual orientation or gender dysphoria.

Because it is a scientifically demonstrated fact that neither sexual orientation nor the gender identity of cis- or transgender people can be changed, period, but are set in stone at birth, and the treatments have been shown to do a lot of harm. You can maybe convince them to pretend it has changed, but that doesn’t mean that it actually changed.

Oh, and remember how you were just complaining about them being told they could change their sex even though they can’t? Guess what: sexual orientation and gender identity are also determined at birth, just like physiological and genetic sex! The therapy you propose has that exact same problem, only doctors don’t actually claim that physiological or genetic sex can actually change (except superficially for physiological sex).

When a medical treatment demonstrably does not actually work and does a lot of harm when attempted, it gets banned as a treatment. This is also the case for chelation to attempt to treat autism: it doesn’t work, it’s extremely harmful, and it’s often banned as a result of that.

Your scaremongering about genocide pales in comparison to the “doctors” cutting the breasts off 13-year-old girls.

I see no evidence that this is a trend for treatment of gender dysphoria, specifically, rather than the many other conditions for which mastectomies are intended to treat (like breast cancer).

One is emotional blackmail trying to force people to accept lies against their better judgement.

Note that you still haven’t demonstrated anything actually said by pro-trans activists, doctors treating transgender patients, or transgender people themselves that is a lie. If anyone is lying here, it’s you when you repeatedly make false statements about what people actually claim or do. I’d prefer to say “mistaken”, but you’ve been corrected on this numerous times, so this is clearly willful ignorance on your part.

bhull242 (profile) says:

Re: Re: Re:9

I see no evidence that this is a trend for treatment of gender dysphoria, specifically, rather than the many other conditions for which mastectomies are intended to treat (like breast cancer).

Scratch this one. It does happen, but even so, this is a valid medical treatment, and it does minimal harm. That you don’t like it doesn’t make it immoral, nor does it make it medical malpractice to do so. That you don’t (or won’t) understand the science is fine as you are a layperson on this subject, but your opinions should not dictate what treatments should be allowed; that should remain between the doctor (as informed by up-to-date scientific research and ethics established by the medical community at large), the patient, and (where the patient is a minor) their parent(s)/guardian(s).

Anonymous Coward says:

Re: Re: Re:8

I am whatever gender the fuck I say I am. If that pisses you off, all the fucking better.

Get ready for the BBC futanari femboy paradigm shift, Hyman. Hold onto your ass because you won’t be able to handle it when it happens.

bhull242 (profile) says:

Re: Re: Re:6

And yet, they seek to force their way into single-sex disc’s for which their bodies disqualify them.

Nope. You can say that their bodies disqualify them all you like, but that doesn’t mean that the rule is actually based on your sex-at-birth rather than being a complicated, varied rule based on a combination of sex, gender identity, and ability to pass as a certain gender, along with some safety issues.

More importantly, none of that is relevant to what I said. That transgender people want to be treated according to their gender identity rather than the sex recorded on their birth certificate⁠—including as to access to single-sex spaces⁠—does not entail a belief that their physiological sex can change other than superficially, or that their physiological and genetic sex doesn’t match what was on their birth certificate when they were born. The problem is that you believe a desire to change the rules is equivalent to a belief that one is already within the rules, but it’s not.

No matter how many times you point out transwomen wanting to enter women-only spaces, and no matter how true that statement is, that has absolutely no relevance to whether they hold the beliefs you ascribe to them as to their own bodies, nor is it evidence of any delusion at all.

Transwomen are no more women than Messianic Jews are Jews.

Religion is not analogous to gender. You aren’t born with certain religious beliefs (no matter what some claim), but gender and sex are both determined almost exclusively at birth, including for transgender people. Your inability to recognize the biological factors of gender identity and that they are determined at birth just like sexual orientation doesn’t change the facts of the matter.

And yes, transwomen are women. They may not be female physiologically and/or genetically, but that’s not necessarily required to be a woman socially or mentally. Maybe they aren’t women under your definition of the term, but yours isn’t the only valid definition.

People can assert anything about themselves that they want, but they are not going to be allowed to force other people to affirm their delusions.

Again, you have not demonstrated a belief that they actually have that is demonstrably false, so you haven’t demonstrated that they are delusional.

You have pointed to two things:

  1. An alleged delusion that their sex is not what is on their birth certificate (which isn’t a belief that they actually have).
  2. Their preference to use single-sex spaces corresponding to their gender identity rather than just their physiological sex (which isn’t even a belief about reality in the first place because that is an opinion about how reality should be).

Neither of these are actually beliefs about reality which transgender people actually have or which trans activists actually push, so they cannot be delusions that they have. And without delusions that they actually have, you cannot show that they are forcing their delusions on anyone at all.

If you want to argue about which single-sex spaces transgender people should be able to use and whatnot, fine. However, that doesn’t justify calling them “delusional” or saying that they have “delusions”. You have still failed to demonstrate that they are/have.

Toom1275 (profile) says:

Re: Re: Re:7

[The following is transcribed from a snarky but on-point meme/comic by Sophie Labelle:]

Is radical cis-hetero ideology being taught to your children?

As we speak, they are tricking kids into believing the abusive and authoritarian “Gender Essentialist Theory” which says the way our body looks defines who we are and who we love.

The grooming doesn’t stop there. They raise, educate, and dress kids differently based on their genitalia. They’re also passing laws to make kids who are trans dress in a certain way. Obsessed much?

Public-funded schools are also hiding the fact that 99.99% of their library books feature openly heterosexual relationships, and not a single trans, gender nonconforming or intersex character.

Are your children being brainwashed?

This comment has been deemed insightful by the community.
haitchfive (user link) says:

Re: Chris

What Chris is actually saying is that he prefers to keep his idealised model of free speech maximalism in his mind, immaculate, pristine, and untouchable, rather than ever recognising the very practical, physical, flesh-and-bone reality of non-white, non-straight people being persecuted, mistreated, at times deported, and worse still sometimes beaten up, tortured, and murdered.

And as such, he is one raging bigot.

He likes to pretend he stands on some precious intellectual vantage point, a crystal cathedral of moral rectitude (all fascists believe that), but he’s really dragging himself into the mud of a pigsty, into the mess of decaying human guts and tendons and human misery that is Nazism.

I celebrate Nilay’s and Mike’s condemnation of Chris Worst, one raging Nazi who runs a nauseating Nazi bar.

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re:

“It hasn’t built trust.”

I mean, that’s just not true, and it completely ignores how trust & safety teams work. Obviously no site can be perfect, because that’s not possible. And you won’t be trusted by everyone, because that’s also not possible when one person’s safety is another person’s ban.

Pretending that he can create a network that is welcoming to both the Nazis and minorities, the incels and women, etc. is just nonsense. Having no rule but “as long as it’s legal speech” is choosing to be trusted by bigots and trolls, and if he sticks to those rules he can definitely succeed at being a safe place for them, and no one else.

This comment has been flagged by the community. Click here to show it.

Koby (profile) says:

Let The Users Decide

If you’re not going to moderate, and you don’t care that the biggest draws on your platform are pure nonsense peddlers preying on the most gullible people to get their subscriptions, fucking own it, Chris.

In the late 1400s, they used the same logic to try and outlaw the printing press. It sounds to me like someone is jealous that a platform is growing without doing moderation.

This comment has been deemed insightful by the community.
Kevin says:

Re:

He literally said it was legal: “And, look, sure, in the US, you can run the Nazi bar, thanks to the 1st Amendment.”

This comment has been deemed insightful by the community.
Staid Winnow says:

Bullying and Fascism are more popular than one would think

Chris Best wants all the revenue he can get. So he cannot disavow Nazis. This is the best way he could say “Nazis are welcome.”

Besides, it already excuses Andrew Sullivan and his gaggle of trans-bigots.

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re:

I will be honest, I get the reasoning… He wants to grow and he wants to grow fast. That said, this is not the type of growth that one should want; it is the kind of clout that infects and sticks.

If he wants that, he is open to it, but this two-step of just not saying the thing is a load of BS that we have all seen before. If he wants Nazis in his bar, he should say it, take the money, and stay off the soapbox. If he doesn’t, he should say it, refuse their cash, and kick them out of his bar.

Anonymous Coward says:

Re: Re:

One could even go a step further and defend his response as it stands. Nazis are like bullies: they pathologically need people to bully.

That one isn’t me, of course. In fact, I think that’s the worst part of it.

This comment has been deemed insightful by the community.
Anon says:

Walls

The problem is – if everyone can see what you post, if it is randomly (algorithmically?) presented to anyone, then there is a stronger need for control and moderation than when there are siloed “chat rooms”. Anything anyone posts on TikTok or Twitter could show up in my feed. If I go to Reddit, I can enter a room where I only see content related to my interests. Therefore, Men-Behaving-Badly can coexist with Safe-Place-For-Women because the two universes don’t intersect. People can be banned in one and not the other. Twitter can’t say the same.

A service has to decide which one it is – and if it wants to feed everyone’s posts to everyone, it has to tamp down the crazies – because in an anonymous universe, crazy will always show up.

Anonymous Coward says:

I am broadly sympathetic with everything written here, and obviously the question is very pertinent as Substack moves into consumer apps, but I do think this piece should reckon with:

1 – substack has had this approach for a long time

2 – they continue to have a fairly good reputation and not be a “nazi bar”

This comment has been deemed insightful by the community.
Stephen T. Stone (profile) says:

Re:

they continue to have a fairly good reputation

That may change in the near future, given this interview and the way Best kept repeatedly stepping on the same rake. Most people don’t want a “Freeze Peach” social platform⁠—just look at how Parler, Gab, and Truth Social all failed to permanently siphon a significant number of users (and any number of significant users) from Twitter.

Anonymous Coward says:

Re: Re:

I agree they’re playing with fire with the consumer facing app and an algorithmic feed. But I do think it is worth noting many of these criticisms could have been (and were) made of their newsletter business and substack’s strategy in that regard seems to have been proven correct. So it’d be good to hear from Mike how long substack would have to continue not being a nazi bar until he’d reconsider this thesis

This comment has been deemed insightful by the community.
Stephen T. Stone (profile) says:

Re: Re: Re:

many of these criticisms could have been (and were) made of their newsletter business and substack’s strategy in that regard seems to have been proven correct

And that strategy might serve that part of Substack well, but it won’t do anything to help its new social media service avoid looking like it’s welcoming to bigots. If Best truly wanted to make Substack Notes seem more appealing to broader audiences, he failed miserably. But I’m sure a lot of bigots will thank him for his willingness to step on that rake.

Anonymous Coward says:

Re: Re: Re:2

But I’m sure a lot of bigots will thank him for his willingness to step on that rake.

Or will they blame him for being almost honest and failing to attract users that they can attack?

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re:

#2 is false. They are well known as a Nazi bar outside right-wing and blinkered centrist circles.

Bloof (profile) says:

Who could have known that the place that’s famous for being the ‘Intellectual Dark Web’ subscription blog service would be run by weaselly sh*theads who will tie themselves in knots trying to avoid openly admitting that they’re perfectly fine with, and will go out of their way to protect racists, bigots and skull measurers? Next someone will tell me Quillette is not the centrist outlet they claim to be and Fox News isn’t fair and balanced.

This comment has been deemed funny by the community.
K`Tetch (profile) says:

Substack in a nutshell

Someone was asking me the other day about the Musk/Taibbi thing and the Substack mess, and didn’t understand what Substack was. I think I came up with the best, simplest explanation for what it is.

“Substack is basically an onlyfans for people whose kink is paying for badly written bullshit”

apologies to the few good writers [still] on there, but that’s what it feels like it’s become.

This comment has been deemed insightful by the community.
Samuel Abram (profile) says:

Re:

I subscribe to the good writers there, like this one: https://yourlocalepidemiologist.substack.com/

and this one:

https://jessica.substack.com/

The former has been invited to the White House (the current one) and the latter is a published author with numerous tomes.

And then there’s Ken “Popehat” White, who needs no introduction.

There definitely are good (and even great) writers on Substack, it’s just that the boss pays Nazis to write articles without disclosure, and then there’s the interview Mike highlighted here, so…

Nazi bar.

This comment has been deemed insightful by the community.
glennf (profile) says:

Re: Re: It's been a Nazi bar for a while

They’ve spent at least two years, maybe longer, ensuring that people who spread harmful misinformation about viruses and transphobia have a home. I have one paid subscription and will be seeing if I can convince the newsletter person to either take my money directly or migrate…

This comment has been deemed insightful by the community.
K`Tetch (profile) says:

Chris' Moderation Problem

I think the moderation problem isn’t addressed here, because Chris Best doesn’t have a handle on what the actual problem is. If he uses his product (I don’t know, I don’t bother with them) then it’s as a content producer, and if he looks at some, he’s probably looking at it on an internal system, perhaps with internal filters applied to remove a lot of the scum (like looking at a NYT piece with the comments set to ‘NYT recommended only’).

As ‘CEO’ he’s never had to get down in the weeds and spend hours doing the content moderation work. I spent 15 years doing it at TorrentFreak and it’s horrible, soul-destroying work. You see things far less in terms of simplistic right and wrong. Sometimes things look bad at first glance but are redeemed when taken in context. Some things look ok on their own but then look terrible in greater context (such as a string of comments in a thread that collectively push a strong anti-Semitic screed).

I think he assumes that ‘bad comments’ are along the lines of 4chan shitposting, and not some of the vitriol that’s out there now, some posing as reasonable arguments.

Seriously, him going ‘undercover boss’ (but not really) in the trenches with the content moderation team might open his eyes to reality. Might open the eyes of all kinds of CEOs, because the Musks, Zucks, Neil Mohans, etc. might benefit from actually seeing and interacting with the bad comments directly, instead of being shielded from them by well-meaning protective underlings.

Anonymous Coward says:

Re:

While you have a point that this is a part of the business that they should see… I kind of fear what would happen. Content moderation work is what I like to call a “dirty job”: you work in the sewers to keep everything functioning… and deal with some of the worst things that humanity has to offer.

You have some unpleasant memories of your work; there were articles about Facebook’s moderation operation that were… bracing. Zuck has seen this, and has tried to automate himself out of it (he can’t). Musk… Elon is notoriously fragile. I fear what happens, given how bad a state the man is currently in.

This comment has been deemed insightful by the community.
Dr. Fancypants, Esq. (profile) says:

"Gotcha" questions

This is a drop in the bucket of this problematic interview, but it’s fascinating to see Chris use “gotcha question” to just mean “question I’m unprepared to answer”. That’s a tic I strongly associate with GOP politicians.

This comment has been deemed insightful by the community.
That One Guy (profile) says:

'They're more guidelines than rules really...'

Bloody hell did he botch that one. Whether it was due to dishonesty or just never having been asked the question, the CEO really screwed that one up.

Someone asking you if you would remove content that’s explicitly against your TOS, and you pretending that’s a ‘gotcha’ and that it’s ‘not that simple’, is someone who just got you to admit, whether you realize it or not, that the TOS is optional and users can say whatever the hell they want on your service.

This comment has been deemed insightful by the community.
glennf (profile) says:

Patreon

I know every social network goes through this, but I recall when Jack Conte had a free-speech-maximalist position for Patreon, so all the Nazis, men’s-rights types, and other sketchy people engaging in what was obviously hate speech, misinformation, and bigotry glommed onto Patreon. It’s possible credit-card companies forced the issue, but it was maybe 12 to 18 months after Conte doubled down that they kicked all the bozos off and had a real trust and safety team. At one point, that team had grown to a significant percentage of all staff (but I don’t know that they’ve disclosed the size in the last few years).

Amazing that nobody wants to learn from that, ever.

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re:

What personally bugs me is that we keep having this dance.

Maybe it’s me, but after being a poster on 4chan for a number of years, and watching the non-hostile spaces get hostile… there is a purity of thinking about unmoderated spaces that simply doesn’t exist in practice.

When it comes to content moderation there is literally no in-between: you either do it or you don’t. If you don’t, you are subject to observing some of the worst things that humanity can offer. There is a multitude of people simply observing and waiting for any service that will welcome them, ready to take total advantage of any lack of content moderation… these are not good people.

That One Guy (profile) says:

Re: Re: 'Why didn't anyone tell me this wasn't as easy as I claimed?!'

What personally bugs me is that we keep having this dance.

I imagine it’s a mix of things: arrogance in thinking that they know better; ego in refusing to accept that maybe there’s a reason the platforms they’re ragging on moderate as they do, a reason other than ‘they’re not as smart as me’; and ‘the leopards would never eat my face’ thinking that prevents them from understanding that when they are the ones running a platform, they are the ones who will be dealing with all the ‘charming’ people unwelcome on the current platforms.

This comment has been flagged by the community.

Seth Edenbaum (user link) says:

https://www.middleeastmonitor.com/20190930-german-jewish-daughter-of-a-holocaust-survivor-warned-not-to-support-bds/

Germany: “German Jewish daughter of a Holocaust survivor warned not to support BDS.”

Now please give us your lectures on the definition of racist speech.

Are these sites platforms or publishers? Can ATT block cell service to opponents of the Prime Minister of India?
https://www.theguardian.com/world/2023/apr/05/twitter-accused-of-censorship-in-india-as-it-blocks-modi-critics-elon-musk

How about Facebook?
https://www.amnesty.org/en/latest/news/2022/09/myanmar-facebooks-systems-promoted-violence-against-rohingya-meta-owes-reparations-new-report/

“Facebook owner Meta’s dangerous algorithms and reckless pursuit of profit substantially contributed to the atrocities perpetrated by the Myanmar military against the Rohingya people in 2017, Amnesty International said in a new report published today.

The Social Atrocity: Meta and the right to remedy for the Rohingya, details how Meta knew or should have known that Facebook’s algorithmic systems were supercharging the spread of harmful anti-Rohingya content in Myanmar, but the company still failed to act.”

Because I really trust “content moderation” for state and profit.

This comment has been flagged by the community.

Troy says:

Not so simple

“Brown people should not be allowed in America.” Gosh, that’s terrible and any such comments should be immediately deleted.

But what about the following:

“People from Mexico and Central America should not be allowed to become American citizens”

“Men should not be allowed to compete in women’s sports.”

“Trans-women are not women. They are trans-women.”

“Israel has created an apartheid system that is as bad as South Africa’s once was.”

“Israelis are racists.”

elmo (profile) says:

Nazi bar, 4chan

Nazi bar, 4chan: potato, po-tah-to

Thank you for putting in the work to produce a more erudite exposition.

It’s time to call out the moral bankruptcy of VC-funded “social network platforms” that don’t take on the social responsibilities while grifting for the almighty buck.

This comment has been deemed funny by the community.
Christopher Best (profile) says:

Been Years Since I Logged In Here

But I did just to say for some reason I really hope this CEO shuts up and gets out of the news.
For some reason.

This comment has been flagged by the community.

tin-foil-hat says:

1st Amendment

Social media companies need to be regulated when they have a large market share similar to the way monopolies are regulated.

There are plenty of conversations that were, and still are, unfairly silenced. It’s a great way for a government to abuse its power, violating the first amendment via proxy.

Large social media companies should be subject to the provisions of the first amendment or lose their near complete immunity from liability when threats are ignored and someone gets hurt.

It’s easy to say “they are a private company, they can do as they please” when you generally agree with the way things are moderated. I find it interesting how quickly those same people change their tune when the moderator sets its sights on those whose behavior was never questioned.

At this time there is virtually no free press in the US. There is no media source that reports current events in their entirety. You’ll have to go outside of US sources. If you read foreign news sources for the first time, you’ll be surprised what’s going on in your own country.

This isn’t only a political bias issue. There is profit-driven bias suppressing even more information. It’s time to proactively open the information gate.

The collateral damage of free speech is that you’re going to be offended. Very offended. It’s worth it compared to the alternative and the risks that come with shutting down a large percentage of the population or allowing a government to control the flow of ideas secretly.

Anonymous Coward says:

Re:

Sir

Are you sure you’ve read Techdirt?

Mike has criticized the actions of OLD Twitter and many questionable content moderation issues from websites, legacy media, and other industries.

Censorship by copyright is a thing, you know.

This comment has been deemed insightful by the community.
Stephen T. Stone (profile) says:

Re:

I need to kill a little time, so I’mma just dissect your entire post.

Social media companies need to be regulated when they have a large market share similar to the way monopolies are regulated.

Yeah, see, here’s the thing: By saying “companies”, you’re already admitting that they’re not monopolies. Sure, Twitter and Facebook are large, but they’re not the be-all end-all of social media services. Smaller ones do exist⁠—and some of them even thrive because they’re small.

There are plenty of conversations that were, and still are, unfairly silenced. It’s a great way for a government to abuse its power, violating the first amendment via proxy.

Yeah, except they’re being “silenced” on a single platform that nobody has a right to use unless they own it. The use of Twitter, Facebook, and other such services is a privilege. That privilege can always be revoked.

As for the “government abuse” thing: Yeah, it’s possible, but as the “Twitter Files” has shown, it isn’t highly probable.

Large social media companies should be subject to the provisions of the first amendment or lose their near complete immunity from liability when threats are ignored and someone gets hurt.

For what reason should this be true?

It’s easy to say “they are a private company, they can do as they please” when you generally agree with the way things are moderated. I find it interesting how little it takes for those same people when the moderator sets its sight on those whose behavior was never questioned.

Truth Social and other “conservative-friendly” services moderate in ways that I don’t like. I still think they should have the right to moderate as they see fit.

At this time there is virtually no free press in the US.

Is that “free” as in “speech” or “free” as in “beer”?

There is no media source that reports current events in their entirety.

No one outlet can do that unless it dedicates itself to one story at the expense of all others.

If you read foreign news sources for the first time, you’ll be surprised what’s going on in your own country.

I read domestic news sources and I’m still surprised at the depths to which people who hold public office will sink.

This isn’t only a political bias issue. There is profit-driven bias suppressing even more information.

Yes, companies will always try to make a profit. A company that scares away customers by becoming a “Nazi bar” (i.e., a safe haven for bigots and assholes) often lowers its chances of making a profit.

It’s time to proactively open the information gate.

And what specific information/speech do you believe should be put on social media services?

The collateral damage of free speech is that you’re going to be offended. Very offended.

The collateral damage of compelled hosting of speech is that you’re going to see a lot of marginalized voices silenced by the storm of bigotry and hatred that will be flung their way.

It’s worth it compared to the alternative and the risks that come with shutting down a large percentage of the population or allow a government to control the flow of ideas secretly.

Banning bigotry on social media services is not the same thing as “shutting down a large percentage of the population”⁠—not by a long shot. And plenty of other social media services (that are far smaller than Twitter and Facebook) allow that kind of speech.

The thing that upsets people like you isn’t the idea of “free speech is being silenced”. What upsets people like you is the idea of “my speech isn’t being heard by a large enough audience”. Your entire screed amounts to support for “free reach”⁠—i.e., the idea that someone owes you a platform, a bullhorn, and an audience only because you exist and you think you have something to say.

You’re not entitled to a spot (or an audience) on Twitter, Facebook, Parler, Truth Social, a Mastodon instance, a Discord server (public or private), or any other social media service on This God Damned Internet. Believing otherwise is a “you” problem.

This comment has been deemed insightful by the community.
That One Guy (profile) says:

Re: ... says you're wrong

Social media companies need to be regulated when they have a large market share similar to the way monopolies are regulated.

To which I would ask two questions, or one question two ways:

Do you trust the current administration to be the ones determining what speech is and is not allowed?

Would you have trusted the last administration to be the ones determining what speech is and is not allowed?

There are plenty of conversations that were, and still are, unfairly silenced. It’s a great way for a government to abuse its power, violating the first amendment via proxy.

Which conversations do you believe are being ‘unfairly silenced’ and as always with this question be specific.

Large social media companies should be subject to the provisions of the first amendment or lose their near complete immunity from liability when threats are ignored and someone gets hurt.

Hate to break it to you, but the first amendment means they get to decide which speech they do and do not allow on their service. Which means you don’t want them bound by the first amendment; you want to strip it from them.

This isn’t only a political bias issue. There is profit-driven bias suppressing even more information. It’s time to proactively open the information gate.

See previous question regarding what is being ‘suppressed’.

The current major sources of news may be bad but if you think letting the floodgates open so that everyone gets to post unchecked is going to be an improvement I’ve got a few bridges you might be interested in buying.

The collateral damage of free speech is that you’re going to be offended. Very offended. It’s worth it compared to the alternative and the risks that come with shutting down a large percentage of the population or allow a government to control the flow of ideas secretly.

How would the meme go, ‘tell me you’re almost certainly a straight white male without telling me you’re almost certainly a straight white male’? To frame social media that’s required to host all speech covered by the first amendment as though it would merely be ‘offensive’ is an understatement of truly colossal proportions, not to mention it ignores how any such platform would very quickly become utterly useless to its users, in direct contradiction to your stated goal of more speech.

This comment has been flagged by the community.

William Null says:

Once again, Chris proves that he's truly the Best

Chris is right about Nilay playing “what if” games. You deal with this stuff if it actually happens. On top of that, Nilay comes across as a rude dude. He interrupts Chris. That’s not how discussion works.

In a proper discussion you let the other person say what he or she has to say in full, then you talk or ask any follow-up questions.

And let’s be honest, Chris has the right idea here – if you don’t like to read or see something, you just don’t read or look at it. You scroll past it. You do not demand the service provider remove it. You ignore it. You just hate to admit that the stuff sites have been doing under their “tRUsT anD saFEtY” departments is downright censorship that only pushes dangerous ideas underground, where people who would normally just laugh at idiots such as Alex Jones or David Icke are now thinking “Hmm… why are they censoring them? Maybe they’re right?” and may actually start to believe those morons, and that this stuff conflicts with the “more speech” approach you have been championing.

This comment has been deemed insightful by the community.
Janne says:

Freedom for who?

If they intend to maximize the “freedom” in the society, they need to understand that maximizing the freedom of racists and bigots automatically reduces the freedom of the people those bigots target. So they need to ask themselves whose freedom should be maximized, as you can’t maximize it for everybody.

This comment has been flagged by the community.

Diogenes says:

The Future is Fascist

Wow, just wow.

To Nilay: yes, someone should be able to say we need fewer brown people in this country in Substack Notes. They should be able to say it on Twitter. They should be able to say it on Facebook. “Journolize” somewhere else, you hack.

To Masnick: supporting a platform where unpopular opinions are aired isn’t being in a Nazi bar. It’s the opposite. Pretending to be concerned about the fine print doesn’t change that.

To the sheep in the comments: wow, you’re really obsessed with Nazis, and with calling anyone to the right of you a Nazi. Sounds like you still need a night light. Maybe mommy stopped nursing you too soon.

I am dismayed that free speech is really no longer a thing. What is coming will be far less free than what we have now, and nobody seems to care. The “right” will make Putin look like a Democrat, and the “left” will be much closer to Stalin.

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re:

To Nilay: yes, someone should be able to say we need fewer brown people in this country in Substack Notes. They should be able to say it on Twitter. They should be able to say it on Facebook.

Why, so that they can drive brown people into silence? You are making the mistake of thinking that free speech means you can say what you want where you want, including being able to attack those you disagree with, or hate, wherever they gather. That is being a NAZI in that you are promoting the dominance of one race over another.

John Wilson says:

Nazi bars and beer distributors

Substack is not a Nazi bar. Substack is a beer distributor that sells to a Nazi bar. The question is, do we care if distributors supply products to bad people? In general, we do not. And I’m not sure I trust corporations that distribute beer to make the right judgments.

Anonymous Coward says:

Re:

It’s more like the distributor starting to give the bars they supply advice on how they should be run, or encouraging gossip (however slanderous) between them. It seems naive to think that will go well.

Anonymous Coward says:

Re:

So what if they are? The point is that the free market will place judgement on businesses by supporting or avoiding them based on the views they express through their business decisions.

Bud Light is a beer, not a bar. When they decided to put rainbows on cans, many conservative consumers got pissed. Many said they would never drink at a bar that carried Bud Light ever again. Those are free market choices being made in response to protected free speech.

If Substack wants to avoid moderating hate speech on their site, they are making a choice. When advertisers choose to avoid advertising on any of the hosted content, that is also a choice. When readers avoid anything on that site because they associate it with hate speech, that is also a choice.

There is a perception that there exists a “cancel culture” that somehow drives the censorship of right-wing ideals. It is easy to see how that perception exists, but it is based on a fundamental misunderstanding of the “silent majority.” The reason that people and businesses who embrace hateful ideals get “canceled” is because most people don’t want to support that kind of hate. Those people might not be as loud politically and on social media, but they do “vote” through their free market economy choices. Advertisers aren’t dumb. They see this, and choose not to associate themselves with people and brands who are toxic to so many of their paying customers.

Bud Light felt comfortable putting rainbows on their cans because their own market research has shown that being friendly and nice to a frequently abused portion of the population is generally well supported by consumers of their products. They might not have expected the extent of insanity expressed by those people who disliked their can decorating choice, but they realize that it’s a very loud reaction from a very small portion of their consumers. It’s the same reason why m&m’s stopped using cartoon mascots that reflected overtly sexist traits. It’s why Mel Gibson isn’t getting as many roles in movies and why newspapers stopped printing Dilbert. Most people don’t want to financially support racist assholes, and advertisers have learned this.

There isn’t a conspiracy to censor hate speech. It’s just that most people don’t want to support hate. A “Nazi bar” isn’t a thing that gets shut down because the government or an angry mob forces them out of town; it’s a thing that goes out of business because nobody wants to be caught supporting it.

And it should be obvious by now that a Nazi bar is neither about Nazis nor bars; it’s an analogy for any organization that wants to support unpopular culture, either through action or inaction. The Nazi bar in the example isn’t a bar that caters to Nazis, it’s a bar that won’t ask Nazis to leave, so they feel friendly meeting there, and other patrons stop coming because of the Nazis. Moderation is asking Nazis to leave because they are running off all of the customers the bar needs to stay in business. Nobody is calling the bar owner a Nazi for not asking Nazis to leave. They are just saying he is terrible at running a bar, and they do this by never going to that bar.

This comment has been deemed insightful by the community.
nasch (profile) says:

Figure it out

And literally, we launched this thing one day ago. We’re going to have to figure a lot of this stuff out.

Isn’t it better to figure out how you’re going to run a service before launching it?

Diane says:

Nazi Bar.

I enjoyed the read though you took out the quote I liked:

It’s not standing up for free speech. It’s building your own brand as the Nazi bar and abdicating your own free speech rights of association to kick Nazis out of your private property, and to craft a different kind of community

I hope you come create an account on Spoutible, which is creating a safe venue without bots and malicious actors. I don’t work there. But its Spouties love it and want voices like yours there.

This comment has been flagged by the community.

Anonymous Coward says:

As usual, Masnick wants to be seen as a supporter of free speech, but cannot stand it when free speech is used to promote viewpoints that he hates but are popular. So when a platform like Substack exists that allows that free speech, he looks for an opportunity to attack it, in the same way that he has published endless attacks against Twitter since Musk bought it.

Anonymous Coward says:

Playing devil's advocate

I am very sympathetic to Nilay Patel. I’m disgusted by bigoted speech directed at him. I am also a Cynic, and I can see through the artifice of free-speech theater.

Whenever I see or hear “free speech”, particularly when counterposed with a snarl word like “woke” or “cancel culture,” I immediately see the side invoking “free speech” as engaging in moral hostage-taking.

It’s manipulative, it’s abusive and the intent is to give a weak participant an upper hand by forcing a concession or by forcing a tactical/strategic failure on the part of the adversary.

“Free speech” “absolutists”/”maximalists” hold a position that no one or nothing may impede the right of information to reach its maximum potential audience. In theory, this sounds heavenly.

In practice, the audience and platform lose their negative rights (to refuse obligations to listen or to present unwanted speech). With no recourse to stop a free speaker, the free speaker will be unburdened of persuasion and just go straight to coercion. It’s tyranny coming from a different direction.

I think Christopher Best and Substack as an institution understand this dilemma: a commitment to “free speech absolutism” is a noble gesture but practically untenable. It’s a great idea; let’s never do it again.

Watching the video and going back and reading the transcript of the dialogue, I side with Best in this exchange. I don’t see what Best said as “even Nazis are welcome in the Substack saloon” because he didn’t give Patel the answers or reassurances he sought.

I don’t know Best’s politics to know if he’s going to follow the tech bro trajectory that leads him across the Peter Thiel Memorial Bridge, but I surmise he has to give guarded answers because of his responsibilities as the head of a company.

A mere quote can change the material financial condition of Substack, or even alter the growth trajectory of its contributor or audience base. And by nature of his job, he has to take a long view and a broad view and think and speak strategically.

Also, what Patel and Best both have to realize is that whether Substack does become the Nazi bar really depends on the determination of the Nazi. For Patel and Best, it’s about the bar. For the Nazi, it’s about the Nazism.

This comment has been deemed insightful by the community.
Stephen T. Stone (profile) says:

Re:

I don’t see what Best said as “even Nazis are welcome in the Substack saloon” because he didn’t give Patel the answers or reassurances he sought.

That’s the point being made by the article: Best could’ve said “we won’t allow racist garbage on Substack Notes” or something to that effect and moved on. That he tried his best to weasel out of answering those questions in that way tells me (and many other people) that he’d rather risk Substack Notes(/Substack as a whole) being seen as friendly to bigots and fascists than risk alienating conservative users (including those that Substack paid to write content).

His answers to the questions about moderation suggest he wants “unfettered free speech” on Substack but also wants to have the service seen as a place where everyone is welcome. But as you note, trying to hold both positions at the same time is untenable. Best can’t somehow balance the flourishing of hateful speech with the flourishing of marginalized voices⁠—and I’d like to think that, deep down, even he knows he can’t do that.

This comment has been flagged by the community.

Anonymous Coward says:

Re: Re:

As if “marginalized voices” can’t be some of the most hateful speakers there are. Oppression doesn’t make anyone right. It just makes them oppressed.

Anonymous Coward says:

Re: Re: Re:

How is “having the most watched news network promoting their hateful views” oppressing those hot takes?

How is “Elon Musk promoting their hateful views” oppressing them?

This comment has been deemed insightful by the community.
Bobson Dugnutt (profile) says:

Re: Re:

I think Substack is going to land in the same spot as Facebook, pre-Musk Twitter, Reddit, etc. and have to go through the adolescence phase of “I’m for FREE SPEECH TURNED UP TO 11!” and then going “How low can we turn the free speech volume down and still keep it close to 10?”

As a subscriber to a few Substacks, I’ve had a positive experience because I can subscribe and pay the writers I enjoy and sidestep the content that is the reading equivalent of a stomach ulcer. I like reading newsletters both as websites and email because the basic font is clean and uncluttered with ads or Taboola-like garbage.

Substack does have a few bulkheads in place to reduce troublesome interactions. Publishers can choose to allow comments by all, by subscribers, by paid subscribers only, or turn them off altogether. Publishers are given latitude to set house rules for their newsletters, and don’t have to platform anyone.

The Twitter clone is too new, but it will need some quality control aspects like mute, block, ignore (post a Note but not view all replies), yellow flag (one where a community of users can report something that is problematic but not an imminent TOS violation) and red flag/report (imminent TOS violations like CSAM, incitement, doxxing, presentations of criminal activity, self-harm, etc.).

Each user should be able to set their own boundaries on what content they can tolerate to consume and how others can engage with them.

This is still free speech, since every participant as communicator and recipient has both the right to transmit and receive and the right to refuse unwanted communication.
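The tiered controls described above (mute/block/ignore on the user side, yellow and red flags for escalation) could be sketched as follows. This is purely illustrative: the tier names and routing are assumptions drawn from the comment, not Substack’s actual system.

```python
from enum import Enum, auto

class ReportTier(Enum):
    """Escalation tiers, roughly matching the tools proposed above."""
    MUTE = auto()         # hide a user's posts from the reporter only
    BLOCK = auto()        # stop all interaction between two users
    IGNORE = auto()       # keep the post but collapse its replies
    YELLOW_FLAG = auto()  # community report: problematic, not imminent TOS violation
    RED_FLAG = auto()     # imminent TOS violation (CSAM, incitement, doxxing, etc.)

def route_report(tier: ReportTier) -> str:
    """Route a report: user-side boundary settings never need a moderator,
    while flags feed the trust & safety pipeline at different priorities."""
    if tier in (ReportTier.MUTE, ReportTier.BLOCK, ReportTier.IGNORE):
        return "client-side filter"    # each user sets their own boundaries
    if tier is ReportTier.YELLOW_FLAG:
        return "review queue"          # humans triage as reports accumulate
    return "immediate escalation"      # red flags jump the queue
```

The design point this captures is the one the comment makes: most “moderation” can be per-user preference enforcement that requires no central judgment, with the trust & safety team reserved for the two flag tiers.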

Also, as a warning to both Patel and Best: There Will Be Nazis. Like cockroaches, Nazis are prepared to survive anything from a can of Raid to nuclear annihilation to the heat death of the universe.

Nazis have bad intentions. They relish confrontation. Not only do they love to break the rules and exploit weaknesses in communities, but they also love to play the rules and how tendentiously they can argue the rules interpreted in their favor.

Nazis will find a way into the community. But non-Nazis (and a trust & safety team) must have tools to pound Nazis like the Home Alone villains.

This comment has been deemed insightful by the community.
KCatMama (profile) says:

Answer the damn question, Chris. It's not hard.

Such a great piece. Hard to believe Chris wouldn’t even state that yes, they would follow their own terms of service and censor the racist comment. And his excuse that they just started the notes feature one day prior is so weak. Come on, isn’t this the first thing you think about when even contemplating something like their notes feature? I love reading so many Substack writers, so this response and this attitude from the company’s leadership is disappointing.

This comment has been flagged by the community.

Anonymous Coward says:

Re:

Because, as the saying goes, “When the facts are on your side, pound the facts. When the law is on your side, pound the law. When neither is on your side, pound the table.” Using vulgarity as an intensifier is the last resort of the factless (and the first resort of the inarticulate).

If Best had allowed himself to fall into the trap of agreeing to ban racism, the obvious next step of the woke would be to find writings on Substack against critical race theory and racial preferences, accuse those of being racist, and demand that Substack censor them as Best said he would. (They will do that anyway, of course.)

Anonymous Coward says:

Re: Re: Re:

He wasn’t asked to come out against racism. His interviewer demanded that he agree to silence speech on his platform.

This comment has been deemed insightful by the community.
This comment has been deemed funny by the community.
That One Guy (profile) says:

Re:

If you read an article about the CEO of a company refusing to give a solid yes or no answer to ‘will you enforce your own TOS and remove bigotry?’ and the part that offends you is the swearing, then that’s a you problem and you might want to check your priorities.

urza9814 says:

Do algorithms/suggestions matter?

Maybe I’m just too ignorant of the details of Substack…but there’s a critical discussion I think is missing here. The thing that makes social media different from infrastructure isn’t necessarily about the target market or the UI of the product. It’s the single unified community.

Is this new product like social media in that it is all one big space and the leftists are getting the same ads and suggested content as the Nazis? Or is it like infrastructure where each creator has their own “siloed” site where they alone are responsible for advertising that content and growing their subscriber base? Because from this article it certainly sounds like it is only going to show me the content I have explicitly subscribed to. And while that doesn’t entirely resolve the potential for conflict, I think it does keep it firmly in the infrastructure category. Traditional social media has for the last few years been going out of its way to break the concept of subscription and make the content you see be more dependent on popularity and algorithmic suggestions rather than direct user choice. And that makes moderation a much bigger issue as those moderation policies have a much bigger impact on any individual user’s experience of that platform.

nasch (profile) says:

Re:

Because from this article it certainly sounds like it is only going to show me the content I have explicitly subscribed to. And while that doesn’t entirely resolve the potential for conflict, I think it does keep it firmly in the infrastructure category.

That’s not what infrastructure means. Infrastructure is something that other services are built on. It could be physical infrastructure like transmission lines, or virtual infrastructure like Amazon Web Services. But Substack is just a web platform, not infrastructure.

Evan Drince says:

Not Against Substack TOS

The interviewer INSISTED that someone posting “We shouldn’t allow so many brown people into the US” would clearly be against Substack’s TOS. This is clearly incorrect.

From Substack:
“Substack cannot be used to publish content or fund initiatives that incite violence based on protected classes. Offending behavior includes credible threats of physical harm to people based on their race, ethnicity, national origin, religion, sex, gender identity, sexual orientation, age, disability or medical condition.”

The key word here is “credible threats”. To argue a policy position is not the same thing as “Hey let’s assemble together at this particular place and time with weapons and attack brown people!”.

Anonymous Coward says:

Chris Best’s company values are obvious, but you will not catch him saying them out loud. Surprise! It’s money! Plain, simple and ruthless.

There is money to be made in America and other countries by giving racists, domestic terrorist groups and seditionists a platform. That is money that laws, morals and basic human decency are forcing companies to leave on the table. Almost every damn CEO in the planet is very much thinking how to accomplish that without ending up dealing with the fallout that comes with monetizing evil.

If suddenly there were zero legal accountability for a social platform that did no moderation at all, they would all embrace it overnight, and we would be flooded with all the illegal and horrendous filth humanity can muster by the next day. These companies would gladly monetize it.

The only reason this is not the default is the laws and the social pressures built into our current society that keep advertisers and platforms from being the absolutely amoral entities they are. In capitalism, ethics never trump profits in terms of priority.

It only happens because most people will shy away from the extremes and that ultimately reduces profits but corporations are driven by what makes them the most profit “all things considered.” This is why you will never hear any CEO say it out loud. We are not ready as a society to hear that aloud… yet.

AnAnonymousMouse says:

None of this is sincere

Obviously there is a wide variety of opinions and viewpoints on Substack, and reducing all of that to being PRO-NATSEE just because he doesn’t want to play fuck fuck games with a disingenuous hypothetical proves why the platform needs to exist in the first place. If you don’t like Substack, don’t use it.

Joanne A Martinez says:

Equality democrat anti fascists vs right wing in-your-face fascists/nazi on Substack.

Why does Substack think it should cater to nazis when they have umpteen platforms to bitch and complain about us liberals? We have NONE! ZERO. When I got aboard, it was supposed to be for anti-fascist professional writers. But it was false advertising. Substack is overloaded with in-your-face hostility complaining about EVERYTHING that is democrat, liberal. Why do they have to be assaultive HERE? Substack is not so special. They allow the malcontents to intrude on the peacefulness of professional writers trying to earn a living. My blocks seem to NOT MATTER. They STILL write and blog with harassing content, with LIES. Conspiracy. Hate. “Woke.” Anti-vax. And everything else they dream up to bitch and complain about. I cannot get my questions through to the CEO and other administrative staff. I am extremely disappointed with this issue. I want to ask them to form a nazi platform and get them out of Joanne’s Substack. Even so, they don’t NEED another platform because there already is PLENTY. Why can’t BLUE/Liberal have an exclusive, so we don’t have to have the annoyance, likened to a bunch of nasty flies buzzing around and circling peaceful writers??!

Been Years Since I Logged In Here says:

But I did just to say for some reason I really hope this CEO shuts up and gets out of the news. For some reason.
