Everyone Hates Trust & Safety. Everyone Needs Trust & Safety. This Is A Problem.
from the doing-the-impossible-and-the-necessary dept
Here’s the contradiction at the heart of the internet: everyone complains about content moderation, but no one wants to use an unmoderated platform. Everyone thinks trust & safety professionals are either censorial scolds or corporate lackeys, but everyone expects them to magically solve the inherent problems of human behavior at scale.
I spent last week at TrustCon, the premier annual conference for the trust and safety industry (you can see our live podcast, Ctrl-Alt-Speech, here), listening to people trying to square this impossible circle.
The trust & safety space is a strange one: it’s one of the least understood but most important industries shaping the tools most people use every day. So many people misunderstand the role of trust & safety professionals, often (falsely) thinking that they are censors (they are not) or that they have some magical ability to stop bad things on the internet (they do not).
As Alex Feerst wrote for us a few years ago, trust and safety is largely made up of caring people who really, really are trying to make the internet a better place, figuring out the best way to create rules for specific platforms that encourage good/helpful/useful behavior and minimize bad/dangerous/risky behavior. Every platform has different rules and incentive structures, but every platform that hosts third-party speech needs to have some sort of rules, with some sort of enforcement mechanisms.
This is true elsewhere as well. City council meetings have rules. Your homeowners association has processes. The book club you’re in has some guidelines. The idea is that in a society where groups form for common purposes, there always needs to be some set of principles to help people get along without causing damage. And each group can create its own rules and its own enforcement; some may be stricter than others, but people get to decide which sites and services, with which rules and enforcement, they feel most comfortable with, and use those.
A ton of trust & safety people legitimately care about making services work better and be safer for everyone. For all the talk of how they’re “censors,” a huge percentage of them come out of human rights and free speech work. For all the talk of how they’re holding back progress, a ton of them are there because they believe in the power of the internet to make the world better and are trying to help.
And yet many people believe that trust & safety people are something they are not. There are those, like Marc Andreessen (who, as a board member of Meta and many other internet companies, absolutely should know better), who believe that trust & safety is “the enemy of progress.” Or there are those, like Jim Jordan, who falsely believe that those keeping the internet safe are engaged in a vast conspiracy of censorship for merely enforcing the rules on private platforms.
But there are also those who ascribe too much power to trust & safety professionals and expect them to fix the inherent problems of humanity and society, something they could never do in the first place. We see regulators who think that companies need to be forced into doing things because their CEOs don’t care, but all that really tends to do is limit the ability of trust & safety folks to craft better, more creative solutions to the problems that human users bring to any service. The more government mandates come down from on high, the more those companies are forced to start checking boxes to show “compliance” rather than letting trust & safety experts actually figure out what works best.
It is a thankless task, but a necessary task. It’s not about censorship or holding back progress. Quite the opposite. It’s about making the wider internet safer for more people so that more people can benefit from the internet.
And my takeaway from TrustCon is that it’s being made much, much harder by those who don’t understand it.
One interesting thing I heard from multiple people was how the success of Jonathan Haidt’s “The Anxious Generation” had been a huge disaster for the entire field of trust & safety on multiple levels. I heard it from a couple of different people on the first day, and it took me a bit by surprise. After all, the book has been out for over a year, and while I have written about how it’s full of garbage, cherry-picked, and misleading research that has done real damage to how the public, the media, and policymakers think about child safety online, I had not given as much thought to its impact on actual online safety work.
As I learned from talking to folks in the field, the success of the book harmed safety efforts in multiple ways:
- The book has cemented the false narrative that social media is inherently harmful to all kids, despite plenty of evidence to the contrary (the evidence suggests social media is very helpful for some kids, neutral for most, and only harmful for a very small number—and often the harm is because of other, untreated, issues).
- This means that enormous effort has been put toward the backwards, impossible, and harmful goal of blocking kids from social media entirely, which has pulled resources away from interventions that actually help. That is, it has resulted in less work on better safety tools for kids and on teaching them to use those tools in age-appropriate ways, with almost all of the effort focused instead on questionable, privacy-destroying age verification.
- The narrative also further cements the idea that there is no role for thoughtful trust & safety interventions, treating pure abstinence as the only possible approach.
In many ways, it’s an almost exact replica of failed “abstinence-only” efforts around drugs, alcohol, and sex. We’re making the same damn mistake all over again, and so many people are willing to trust it because it’s in a best-selling book.
But there was a larger through-line in some of the discussions, which revolved around how the general “worldview” on trust and safety has changed, with attacks coming from across the broad political spectrum. You have some places, including the EU, the UK, and Australia, where governments haven’t bothered to understand the natural tradeoffs of trust & safety and seem to think they can regulate the internet to safety. Meanwhile, books like Haidt’s have convinced policymakers that the solution to online harms is keeping kids off platforms entirely, rather than building better safety tools and educating kids on how to use services appropriately.
That’s a fool’s errand.
At the other extreme, you have the MAGA VC world, which falsely believes that trust & safety is about censorship, is evil, and shouldn’t exist at all. And, in the US, those people currently have tremendous power, leading to nonsense from companies like Meta and X falsely implying that they can remove safeguards and guardrails and nothing will go wrong. So far, that hasn’t worked out too well, mainly because it’s based on a totally faulty understanding of how all this works.
The end result is… not great.
Trust & safety professionals I spoke to at TrustCon kept talking about how this environment has done tremendous damage to their ability to actually keep things safe. Daphne Keller’s piece about how regulators are turning trust & safety into a compliance function came up so many times that I think it was the unofficial article of the conference.
This gets to the heart of what I heard over and over: trust & safety professionals want to create better, safer services online, but feel trapped between impossible demands. Regulators want them to solve problems that can’t be solved through content moderation. Politicians attack them as censors. Best-selling authors like Jonathan Haidt blame them for harming children when they’re desperately trying to help, pushing everyone toward the failed “abstinence-only” approach of blocking kids from platforms entirely rather than building age-appropriate safety tools.
It’s an impossible situation to deal with, especially for a bunch of people who mostly, legitimately, are just trying to get people to play nice online in order to enjoy the wonder that is a global communication network.
Making this worse, some commentators are drawing exactly the wrong lessons from this crisis. Dean Jackson’s “realist’s perspective” on trust & safety concludes that only “state power” can realistically fix things—missing that regulations are already making trust & safety harder by forcing compliance theater over actual safety work.
As Jackson puts it: “A realist assessment of the current moment suggests that one force capable of moving tech titans in a better direction—perhaps the only force short of a mass consumer movement—is state power.”
This echoes what I heard about Haidt’s influence: his book’s success has convinced everyone that social media is inherently harmful to all kids, sucking resources away from nuanced interventions that might actually help. Instead of building better safety tools and a better overall ecosystem, everyone’s fixated on impossible age verification schemes. It’s abstinence-only education for the internet age. It’s creating “trust and safety theater” rather than actually building up either trust or safety.
We talked about this on the podcast, where I noted that I find Dean one of the more thoughtful journalists on this beat, but felt this piece missed the mark. The piece is nuanced and certainly discusses the tradeoffs here, but it seems to treat the current moment, in which trust & safety is viewed so negatively across the board, as permanent, and concludes that the only real way to deal with this is via state power telling the companies what to do.
It strikes me as an odd conclusion that we need “state power” to make a better internet, when one of the big takeaways from the conference was how regulations are consistently making it more difficult for trust & safety folks to do their jobs well, pushing them instead toward regulatory compliance: checking boxes to keep regulators happy rather than implementing systems and policies that actually keep people safer online.
This impossible situation helps explain why Casey Newton’s similar critique of the industry—that trust & safety leaders are, effectively, cowering and unwilling to speak up—also misses the mark. Newton wants them to quit their jobs and write op-eds defending their work. But why would they?
Look what happened to Yoel Roth, Twitter’s former head of trust & safety, when he spoke truthfully about his work: he was lied about, doxxed, and driven from his home. That’s the reward you get for public honesty in this space.
Newton’s frustration is understandable, but his solution—public martyrdom—ignores the basic risk assessment these professionals do for a living. When regulators, politicians, and pundits are all attacking you from different angles, going public just paints a bigger target on your back. And for what benefit? Better to keep your head down, check whatever boxes keep the wolves at bay, and try to actually make things better from within the constraints you’re given.
It’s not ideal, but it’s the kind of thing that people who care will do in these wild and ridiculous times.
I will note that many of the sessions at TrustCon this year (way more than in the past) were off-limits to the media. TrustCon lanyard badges indicate whether you’re in the media or not (every year they accidentally give me a non-press lanyard, and I have to remind them that I’m press and should be given the press one). It sucks a bit for me, because it means I don’t get to go to those sessions, but given the state of everything, it’s totally understandable.
Trust & safety folks are legitimately working to make the services you and I rely on better. It’s an impossible task. You and I will disagree with decisions made on every platform. I guarantee you that people who work at these platforms will also disagree with some of the decisions made because there are no right answers. There are, as one friend in trust & safety likes to say, “only least bad options.” All options are bad.
But making the “least bad” decision still involves careful thought, deliberation, and understanding of tradeoffs. TrustCon is an opportunity to share those difficult discussions and to think through the tradeoffs and considerations. And given how the press so frequently misrepresents those tradeoffs, it’s not the least bit surprising to me that many sessions decided to keep them out.
So what’s the solution? Stop expecting magic from people doing impossible work.
Trust & safety professionals aren’t going to solve humanity’s problems. They can’t make perfect content moderation decisions at the scale of billions. They can’t eliminate all harmful content without also eliminating lots of legitimate speech. These are not bugs in the system—they’re features of trying to moderate human behavior at internet scale.
Instead of attacking them from all sides, we need to create space for them to make the least bad choices available. That means regulators who understand the natural tradeoffs involved rather than demanding impossible outcomes. It means politicians who don’t scapegoat them for broader social problems. And it means recognizing that while their work is imperfect, the alternative—no moderation at all—is far worse.
The internet isn’t broken because trust & safety is doing a bad job. It’s strained because we’ve asked them to solve problems that can’t be solved through content moderation alone. Until we acknowledge that basic reality, we’ll keep making their impossible job even harder.
Filed Under: censorship, content moderation, internet, jonathan haidt, progress, trust & safety, trustcon


Comments on “Everyone Hates Trust & Safety. Everyone Needs Trust & Safety. This Is A Problem.”
I think this article could maybe have done with an editing pass to cut the repetition of the main point down from three+ times.
I’m also not enthused about ‘let corporate do whatever they want’ in the T&S sphere, which strikes me as having its own enormous set of problems. State regulation in the space is important and desirable as a result. The problem is that a lot of that regulation is currently founded on panic and ignorance, and is being wielded with all the subtlety of a nuclear exchange.
Re:
yeah, editing is needed, but it is a tedious chore and thus avoided by most all communicators.
Re: Re:
^Case in point.
But it will be broken soon enough by legislators who think age verification is the be-all, end-all, implement-this-or-die solution to trust and safety issues (among other things). You can already see it happening thanks to the UK’s new age verification scheme: Websites are preparing age verification to handle that scheme, and they’ll inevitably expand that to other countries when (not if) age verification becomes the law of the land.
Speak up now or the end of the Internet as we know it will be here faster than you think, folks. Do you really want to give YouTube your ID just to watch the latest Daily Dose of Internet video?
Re:
If it matters any, I see plenty people up in arms over it.
I don’t. And there are areas in which we don’t have so much centralized control, such as postal, telephone, and e-mail systems (which are not magically perfect, but mostly work well enough). I’m still kind of hoping we find some solutions like that to replace many of these “platforms”. There’s no great technical reason why I shouldn’t be able to adjust my social media filter like I adjust my e-mail filter; it just kind of developed that way, with everything happening on remote servers. A consequence of which is that these big platforms get all kinds of ostensibly-private data about their users.
Not only should they not be able to control what people are talking about, they shouldn’t even have a way to know. That would, in sane legal systems, also absolve them of any legal responsibility to “do something about it”.
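To make that e-mail-filter analogy concrete, here is a minimal, purely illustrative sketch (in Python) of what a user-controlled, client-side filter over a feed could look like; none of the names here correspond to any real platform’s API, it’s just the shape of the idea:

# Hypothetical sketch: a filter the *user* controls locally, the way an
# e-mail client applies the user's own rules, instead of the platform
# deciding server-side. Names (Post, my_blocked_authors, etc.) are made up.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str

my_blocked_authors = {"spammer123"}      # user-managed blocklist
my_muted_keywords = {"crypto giveaway"}  # user-managed mute list

def keep(post: Post) -> bool:
    """Return True if the post passes the user's own rules."""
    if post.author in my_blocked_authors:
        return False
    return not any(kw in post.text.lower() for kw in my_muted_keywords)

def filter_feed(feed: list[Post]) -> list[Post]:
    return [p for p in feed if keep(p)]

feed = [
    Post("alice", "New post about moderation tradeoffs"),
    Post("spammer123", "Click here for a crypto giveaway!!!"),
]
for post in filter_feed(feed):
    print(post.author, "-", post.text)

The point being that the filtering rules live with the user, not on the platform’s servers.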
“we need … regulators who understand the natural tradeoffs”
… Where do our officials find and hire such gifted government Regulators ??
if you concede that government politicians have any legal authority to regulate internet speech — you have abandoned 1st Amendment principles.
Re:
Where have I ever suggested they have the authority to regulate speech?
Re: Re:
Hmmm, maybe your use of the term ‘regulators’ was unclear.
Who are the regulators you had in mind and what are they regulating.
Re: Re: Re:
There are other things to regulate that are not speech. Antitrust, privacy, data breaches, etc. etc.
Re:
Internet’s global.
I have some sympathy, but ultimately, the trust & safety team is downstream of the CEO. If the CEO wants something, trust & safety is not going to be able to stop it (see e.g. Meta/Twitter, to say nothing of non-social media like WaPo/NYT). Even on less extreme sites, you can find plenty of examples on Reddit/YouTube of the trust & safety team coming down and censoring things (or vice versa, only coming down on certain things after media attention).
A well-meaning T&S does not trump the executive. They’re a subordinate system, they do not call the shots, and they’re just as subject to incentives as anyone else.
Part of making trust & safety work in any organization is a certain level of transparency and communication. This applies to social media companies just as much as it applies to police, or regulators, or the local school/Boy Scouts. Is it pleasant? No, and I say that as someone who has moderated before. It sucks ass, everyone hates you. But it really does matter, and it does head off things like hasty regulation.
State power is how we force groups to do things. If you want to be sure a group does something, instead of relying on good will/existing incentives, state power is the way to do it. However, it matters how it’s applied.
The problem is there are cases where they aren’t making the least bad choices. Do you really think Twitter is doing the least bad it can do on CSAM right now?
Re:
“Yes.” — Elon Musk, probably
Re:
That’s literally a matter of law, CSAM. Part of Trust and Safety for sure, but definitely outside the scope of the discussion here, as it should be. P.S. It is already regulated.
Re: Re:
I’m not sure why you think it’s outside of the scope of this discussion, but feel free to replace it with any other example (MechaHitler, Elon’s vindictive banning of journalists, or whatever) from the dumpster fire that is modern Twitter. It doesn’t change the overall point, T&S teams have limitations based on the direction of the CEO/broader company incentives, and Twitter (to the extent it even has T&S) very clearly isn’t even attempting for least bad outcomes.
Twitter is just a convenient example because it’s so extreme out of the major platforms, it can’t be brushed off as simply moderation being difficult with trade offs.
Parts of it are (not that that seems to be enforced re: Twitter/current administration). But things like implementation are left up to the discretion of the platform.
But that’s kind of irrelevant to Mike’s point, which is trusting that T&S teams will go for the least bad option. The fact that they’re not enforcing something that is regulated is if anything stronger evidence against that proposition.
If what you do, in whole or in part, is to filter, edit, redact, or remove material that doesn’t meet some set of criteria, then you are a censor. That may be good or bad, but it’s still what you are.
The word hasn’t ever (in halfway modern English) required that you work for the government. It’s never required that you were trying to enforce your restrictions society-wide. It’s never implied anything about what particular set of rules you might be enforcing, or about the source of your right to do so.
It’s obnoxious to try to redefine words because you don’t like their connotations. The overwhelming majority of “Trust and Safety” work has a large censorship component, and most people who work in “Trust and Safety” thus have censorship as a significant part of their jobs. They are censors, and they need to get comfortable with that.
Re:
Man, I haven’t had to use this copypasta in a while, but I still keep it on hand just in case:
Moderation is a platform/service owner or operator saying “we don’t do that here”. Personal discretion is an individual telling themselves “I won’t do that here”. Editorial discretion is an editor saying “we won’t print that here”, either to themselves or to a writer. Censorship is someone saying “you won’t do that anywhere” alongside threats or actions meant to suppress speech.
With this in mind, explain to me how an imageboard admin saying “nah, you ain’t postin’ that here” and deleting an image from that imageboard—all without being able to stop the person who posted the image from posting it elsewhere—is censorship. And don’t post some kind of “iT’s LoCaLiZeD cEnSoRsHiP!!!1!” bullshit; you’re not clever enough to sell it to me and I’m not stupid enough to buy it from you.
Re: Re:
I’ve had this argument with you before, and eventually elicited that the primary reason you draw this distinction is because you don’t like thinking of yourself as a censor.
But all censorship requires is suppression of speech by a third party, full stop and end of.
Re: Re: Re:
You have put forward this stupid argument before and it is still as stupid because you can only make it by ignoring all context.
Is it censorship when a newspaper refuses to print an ad for you because it doesn’t conform to their rules?
Is it censorship when you knowingly post something that breaks a site’s TOS and it is removed?
Is it censorship when someone refuses to repeat what you said?
Is it censorship when the USPS refuses to deliver a letter because you couldn’t be bothered to use the correct postage?
Is it censorship when someone tells you to shut the fuck up and leave their property?
This is why your argument is devoid of context: with context, it becomes utterly stupid.
It’s funny how no one except the diehard assholes complained about moderation for decades until the level of stupidity among “conservatives” reached a fever pitch.
Re: Re: Re:2
Yes, yes, no, no, yes.
I don’t see how this is difficult. It’s a clear and simple rule. What trips up people is the stupid-ass assumption that censorship is always bad and unwarranted. That’s the fallacy.
There’s plenty of times when censoring someone is perfectly fine and acceptable.
Re: Re: Re:3
Name twenty.
Re: Re: Re:4
Re: Re: Re:5
Oh, good, you actually had the guts to respond. That’s more than I can say for the usual cut-and-run assholes and Brave Sir Koby (Who Ran Away Bravely).
Re: Re: Re:6
I’m aware you don’t think some of that is censorship and/or that some of it wouldn’t be justified; I intentionally chose a broad range of examples ranging from the clear and obvious to some interesting cases to make my point: They are all censorship, for the simple reason that each and every one of them is someone preventing someone else from speaking; “if you can just go elsewhere it isn’t censorship” and “censorship is saying you can’t do that anywhere” are riders of your own invention.
And they can all be justified, or at least debated; their merits are independent of whether or not they are censorious, as even your own reply concurs by differentiating between acceptable (CSAM) and not (right to be forgotten) kinds of censorship.
And that decision on the merits is where the moral judgment comes into play, not on whether you label booting someone from an online service as censorship or not.
My own particular view is that doing so clearly, obviously is — the difference between that and what the payment processors are pulling with Steam, for example, is a matter of degree and scope, not of kind — and that it is fine that it is. The correct response to the people yelling “you’re censoring me!” when you ban them isn’t “no I’m not”, it is “Yes, and so what?”. Or, well, actually it’s nothing, because they’re banned and therefore you have no need to engage further anyway, but you get the point.
This is why, for example, I can distinguish between people complaining that Twitter booted ’em for being Nazis, and people who’re upset that X booted ’em for being pro-LGBTQ+. Mechanically and even legally (pre-Trump) those acts are exactly equivalent and censorious, but they’re distinct in the intent, so the first is fine and acceptable and the second is not.
Re: Re: Re:7
Yes or no: Do you sincerely believe that being denied the privilege of posting on a service like Twitter is equivalent to being denied the right to speak freely?
Re: Re: Re:8
Yes.
Saying you can’t speak at the public square but you can go across the tracks and talk is depriving you of your targeted audience.
You can’t talk to these people but you can go in the forest and talk to the squirrels? Thanks?
Re: Re: Re:9
The First Amendment protects your rights to speak freely and associate with whomever you want. It doesn’t give you the right to make others listen, make others give you access to an audience, and/or make a personal soapbox out of private property you don’t own. Nobody owes you a platform or an audience at their expense.
You can’t be “censored” by losing the privilege of posting on Twitter because, despite any reference by anyone to Twitter being a “modern town square”, it is still a privately owned service that has every right to decide what speech (and what speakers) it will and will not host. Elon Musk can personally boot you from Twitter, but he can’t stop you from going to Bluesky or Facebook or Threads or a Mastodon instance or 4chan and repeating the same speech that made him give you the boot.
You can’t be “censored” because you lost an audience because if that were true, people who lose audiences naturally through a lack of compelling content would be “censored” because they don’t have people listening to them. No one is owed or entitled to make anyone pay attention to their speech. Shit, I put together a userstyle that blanks out the comments of one or two certain account-holding trolls here. You can tell me how my attempts to ignore those trolls is “censorship”, but you’ll have an easier time soloing a full climb in PEAK than you will in climbing that specific rhetorical mountain.
Re: Re: Re:10
No one says you have to listen, but you are saying no one can hear.
I thought it was cleared up above that censorship doesn’t just mean 1st Amendment issues; now you want to bring it back to just meaning issues of the 1st.
No sense in discussing if you are just going to keep moving the goalposts.
Re: Re: Re:8
They are both censorship. They are not equivalent. We’ve been through this multiple times in this discussion alone, and others prior. The answer you get from me isn’t going to change no matter how many times you ask.
Re: Re: Re:9
How does losing the privilege of posting on Twitter—a privately owned and operated social media service that has no obligation whatsoever to host your speech and no method of stopping you from posting your speech elsewhere—suppress someone’s right to speak freely?
Re: Re: Re:10
A right is legally defined, but I didn’t say anything about suppressing someone’s right to free speech. I have said it suppresses someone’s speech. Which it does.
Re: Re: Re:11
It does.
On a single platform.
That they don’t have a guaranteed legal right to use.
Which means that they can take the speech that was “suppressed” on that platform and post it anywhere else without that “suppressive” platform being able to stop that person from doing so. And that’s the problem I have with people who say speech that’s been moderated has actually been “suppressed”: It’s the “I have been silenced” fallacy dressed in business casual instead of a three-piece suit. “Localized suppression”, or whatever fucking bitchass term you want to use for moderation, isn’t censorship—and if you really believe otherwise, you’re a step away from being someone who believes in “free reach” despite that not being a thing. Are you really willing to leap into that specific rhetorical Hell despite knowing how much it’s going to burn?
Re: Re: Re:6
Oh, and postscript: For 20., I was thinking in terms of like, classic WWII censorship of troop positions, or secret projects like cracking Enigma or Manhattan.
Re: Re: Re:7
I wouldn’t call that “censorship”, so much as I would call that “good opsec”.
Re: Re: Re:8
Inasmuch as you could get shot as a spy for breach of those, or face life imprisonment even now, that’s… pretty wild to not consider as censoring the people in question.
Re: Re: Re:7
You’re totally right. Those were so heavily censored we don’t know anything about them today.
Re: Re: Re:6
“9. Ethics are a personal code of honor; since ethics are both a philosophical concept and a set of beliefs instead of an act, ethics literally can’t be censorship.
10. See #9.”
Well, I can tell you’re not in the professions. Lawyers, engineers, doctors, accountants, etc., have specific ethical and professional codes of conduct, which do include governing their speech, to which they must (at least in theory) adhere, and for violating which they can face consequences up to and including loss of livelihood.
It’s not a purely personal or philosophical thing; these are codified and enforced. The one for my local engineering regulatory body runs to fifty plus pages and has a specific section on “expressing opinions in public”.
Re: Re: Re:7
And it is, by and large, their personal decision—their discretion, if you will—to adhere to such codes. If they decide to violate such codes and risk the consequences, that’s on them.
Re: Re: Re:8
It’s your decision whether or not you break a law saying you can’t say something, too. Does that change whether or not said law is censorship?
Re: Re: Re:6
That’s not correct. Censorship is specifically preventing speech and only speech, but anti-fraud legislation is also a bar on actions, specifically acquiring the financial gains inveigled out of people by false speech.
Re: Re: Re:6
Including if it’s the only way to get websites to purge a deadname a trans person was never publicly known under? Fucking transphobe.
Re: Re: Re:7
I’m against deadnaming and I support trans rights. I also think using the courts/the law to force information off the Internet is censorship, even if that information is a trans person’s deadname. Consider the following: Caitlyn Jenner won gold medals in the Olympics. Should footage of the events she competed in be expunged from the Internet only and specifically because said footage technically deadnames her?
Re: Re: Re:8
A point already addressed by AC when they said:
Caitlyn Jenner won Olympic gold under the name “Bruce”, and is thus still publicly known under that name even though she doesn’t use it now. So you are a transphobe. Not because you missed a trans-positive use of a law you find problematic, but because of your doubling down of that point and your (unsuccessful) attempt at shifting the goalposts in doing so.
Re: Re: Re:9
I’m against deadnaming trans people. For fuck’s sake, I didn’t even use Caitlyn Jenner’s deadname when talking about her Olympics accomplishments. My argument about “right to be forgotten” laws is that using such a law for any reason, including to prevent a website from printing factual information such as a trans person’s deadname, is censorship and therefore bullshit. And yes, that may seem like I’m in favor of deadnaming, but the fact of the matter is that Caitlyn Jenner was once known by another name, and even Wikipedia acknowledges that fact. (Wikipedia also says this about her decathlon career: “Jenner’s Olympic events were men’s events that occurred prior to Jenner’s gender transition.”)
The use of a “right to be forgotten” law to prevent the deadnames of trans people from being talked about publicly might seem like a good idea. But the ends never justify the means, and using a law designed to suppress speech because someone’s feelings are hurt—and I recognize that I’m oversimplifying the situation with that remark—can never justify the suppression of that speech. Or would you like to tell me what other factual information you want deleted from the Internet by using a “right to be forgotten” law?
Re: Re: Re:10
Which was the point made by both the first AC and David. Why are you so dedicated to your point that it makes you this obtuse?
Re: Re: Re:10
As I stated in my original post, deadnames that the individuals in question were never publicly known by. For example, Suzy Izzard doesn’t have the right to have the name “Eddie” stricken from her Wikipedia article because she was famous for years under that name (and still performs under it as a result), but if the name on Annie Wallace’s first birth certificate were to appear in her Wikipedia article, then she should have the right to have that information taken down under a “right to be forgotten” law, since she has never been publicly known by her deadname.
Re: Re: Re:11
You know what? I’m hesitant to side with any “right to be forgotten” law because I think the censorship of any kind of factual information is a step towards a much broader censorship regime—but you still make a fair and reasonable argument for your position, and I respect that.
Re: Re: Re:3
You want it to be simple, but reality isn’t simple, it isn’t neat. You are the one who wants to redefine what censorship entails.
And per your own answer, even you can’t keep your argument straight, because if a site owner refuses to pass your speech along you are now saying it isn’t censorship.
This is the stupidity of your argument, just as I said: you ignore context and proclaim that it’s always censorship when a “3rd party suppresses speech,” which you yourself just proved isn’t the case.
Re: Re: Re:4
I explicitly said it was censorship. Keep up.
Re: Re: Re:5
Yes, do keep up with what you are saying.
I said: Is it censorship when someone refuses to repeat what you said?
To which you answered no, so if an owner of a social media site refuses to host your speech so others can see it, it isn’t censorship per your own answer to my question.
Re: Re: Re:6
Oh, I see. I was thinking of this in terms of a couple of people face to face.
If the second party is being asked to convey that speech to someone else than yes refusing to do so is censorship.
Re: Re: Re:7
okay but why though
Re: Re: Re:2
I mean, by the normal definition of censorship, yes, it is. (So are 2 and 5.) This isn’t even a “people incorrectly use it out of laziness colloquially” issue; the dictionary definition straight up includes it.
It’s fine to use a different custom definition (and I would even say it’s useful, because it gets at some nuances that the colloquial usages don’t distinguish without additional caveats), but it’s weird to pretend like it’s the standard one.
Not my circus, not my monkeys, but this is such a weird semantic hill to die on. It’s especially weird when the argument starts because other people don’t use it by default.
There have been complaints about moderation/censorship since the spoken word existed, and it’s not just diehards. Moderate anyone (even if it’s completely apolitical on a Pokemon forum or whatever) and there’s a good chance they get pissy about it. Even now, you see it beyond just the conservative shell game of “censorship is anything that isn’t right-wing.” People love to complain about it, which is why Mike is always on about how moderation is hard.
Re: Re: Re:3
Three questions for clarification on your stances:
Re: Re: Re:4
1) Under the normal dictionary definition of the term, yes. Under yours, no.
2) Yes under dictionary, no under yours.
3) Borderline. Technically yes but not really in the spirit of the word (usually the term is applied to public communication, not private. But some definitions explicitly mention that aspect, others leave it implied). If I had to give a Yes/No, I’d say yes under the dictionary. No under yours.
I don’t have a strong opinion on which definition you want to use. They’re both fine, imo, just different.
Re: Re: Re:5
If you’re willing to say that a newspaper refusing to run an ad is censorship despite (A) the newspaper having every right to decide what content will run in a given issue and (B) the person(s) behind the ad being able to go to any other newspaper—or the Internet—and potentially run that ad there?
If you’re willing to say that a social media service choosing what content it will host is censorship despite (A) having no legal obligation to host any third-party speech and (B) the person posting some TOS-violating content being able to potentially post it anywhere else?
If you’re willing to say that you kicking someone out for insulting your dog is censorship even though (A) your home is private property where you can choose who can stay or leave based on what they say and (B) the person whom you kick out being able to go to any other patch of land in This God Damned World (or any site on This God Damned Internet) and say what they said about your dog?
Alls I have to say about that is this: You have a strong opinion about censorship that you’re not willing to own.
Re: Re: Re:6
Neither A) nor B) (in any of those scenarios) is a required part of the dictionary definition of censorship. I’m not sure what you want? I don’t write the dictionary; relaying what it says is not a matter of choice/willingness on my part. That’s how society uses the term, and widely has. Yours is simply not how the term is used.
You don’t have to like it, you don’t have to use it. Like I said, yours has some more nuance than the dictionary one, which I think is a good thing. But that doesn’t make it the standard one. It’s not.
Well yeah, saying anything else would involve acknowledging a dictionary. Which for some reason you have some weird hang-up around for this particular word?
1) What’re you talking about? I just said it publicly, that is the definition of owning it.
2) I’ve also been very public, open, and consistent about what forms of censorship I’m ok with, and what forms I’ve not. So that’s a load of horseshit. My views on censorship are not a secret, including ones that are not popular on TD. Happy to answer any other questions, as well.
You seem to be stuck on the idea that this is some kind of gotcha, so you’re circlejerking it with precanned replies instead of actually engaging with the argument. Which is really fucking weird. You should be better than this gotcha shit, especially for a definition one can trivially confirm in a dictionary, of all things. I really don’t get it.
(And as a side note, to be very explicit, since you’re insisting on trying to slide it by with implication, labeling something censorship or not does not imply agreement nor disagreement with it. “is this censorship?”, “is this legal?”, “is this ethical?” and “do you agree with this?” are not synonyms. Several examples that I labeled as censorship I do not think are ok, and some would be. A website choosing what it wants to host is both dictionary definition censorship and something I think they should be entitled to do. They’re not mutually exclusive)
Re: Re: Re:7
You’re willing to give me credit for having a nuanced definition of censorship, but then you go right on saying “yeah, I’mma just use the dictionary definition instead” and treating acts that aren’t censorship as if they are. What I want is for you to admit, based on what you’ve said, that you’re in the camp that believes “we don’t do that here” is equal to “you can’t do that anywhere”.
Keep in mind that such a belief is the first step to believing in the imaginary right of “free reach”, which is the idea that you’re owed a platform and an audience for your speech and anyone who denies you such a platform is “censoring” you. If you want to believe in “free reach”, that’s your right, but at least have the guts to admit it.
I have experience as an imageboard moderator. When I was in that role, I deleted plenty of posts that went against the TOS of the sites I was moderating. I don’t like being called a censor for doing that.
My hang-up on this word comes from the fact that far too many people are far too eager to define “censorship” in such a broad and expansive way that actions not intended to suppress speech everywhere are treated as if they are intended to do that. Kicking someone out of your home for shit-talking your dog or your mom or your favorite hand-woven Mongolian basket isn’t censorship; treating that act like it’s the same thing as the government telling you “unpublish that speech or we’ll destroy you” is how people end up thinking content moderation is censorship.
I define “censorship” more narrowly than the dictionary because I refuse to define some acts as censorious even if the dictionary definition would. I refuse to expand the idea of censorship to acts that aren’t censorship because I don’t want anyone (including myself) called a “censor” because they told someone “hey, we don’t allow that speech here, go somewhere else” and never once intended to make that person forever unable or unwilling to repeat that speech anywhere else.
Except for copypastas that I keep on hand because repeating them is easier than rephrasing them slightly in every new post (thank you based PhraseExpress), all my replies are ocean-fresh. If I sound “precanned”, it’s because I’ve been in arguments like these plenty of times over the years, and I’m well versed in both my arguments and the ones against which I’m railing. Few arguments on this specific issue catch me by surprise any more.
Censorship carries with it a negative connotation, regardless of whether you like that fact. If you were to talk about censorship of books, one of the first images that will likely come to mind is an image of books being burned by fascists/religious weirdos—and that’s not by accident.
Now I grant that a narrow amount of censorship is, in fact, good. (For example: CSAM.) But it’s a narrow amount for a reason: If you view too many acts as censorship, you will, on a long enough timeline, come to believe that you’re entitled to speak on platforms you don’t own and force people to become your audience. You’ll believe that losing your spot on a platform or people unfollowing/blocking you is suppression of your speech. And you’ll believe, to your own detriment, that such acts of “censorship” shouldn’t be legal because they’re no better than the Nazis burning books.
I treat censorship seriously and narrowly because I know, for an absolute fact, that I don’t have the imaginary right of “free reach”—and I’d rather not delude myself into thinking I do or should have it. I keep my definition of censorship as narrow as I can because expanding that definition to include content moderation would make me a censor, and a censor isn’t something I want to be. Telling someone to fuck off shouldn’t ever be considered an act of censorship, even (and especially) when it’s done in your own home—and by agreeing with the idea that it could (or even should!) be an act of censorship, you’re agreeing with a more expansive definition of censorship that makes censors out of people who aren’t trying to “suppress” a specific bit of speech from anywhere but their own homes.
That’s what I meant when I said you have a strong opinion you’re not willing to own, by the way. You do all this hedging about “well, it’s the dictionary definition” and whatnot to distance yourself from a controversial position (e.g., “telling someone to get out of your house for insulting you is censorship”), but you’re still repeating those positions as if you agree with them. If you believe in a more expansive form of censorship, say so and own it. If you don’t, say so and own it. But the fence-sitting and handwringing you’re doing to escape being pinned to a specific position—a potentially unpopular one, at that—isn’t helping your argument here.
Re: Re: Re:8
As I said, because we’ve done this dance before and I have a working memory, you’ve evolved this position in order to protect your feelings. It’s neither robust nor fit to any other purpose.
You are a censor when you moderate, even if you lie to yourself so you can sleep at night. I would suggest instead abandoning the axiom that all censorship is evil, because if you do, you don’t require all these contortions to decide what label to apply to keep your conscience clean; you can simply judge the acts on their merits.
Re: Re: Re:9
If I’ve ever suggested that in the past, I don’t know. I recognize that there are a select few instances of censorship that are morally righteous. (For example: CSAM.) But that number of instances is small precisely because they are morally righteous. It’s really fucking hard to justify banning books from libraries because they have queer characters in them, y’see.
And I see censorship as an attempt (successful or otherwise) to suppress speech, either by intimidating speakers into silence or by removing information from public view. The problem with trying to define content moderation as censorship is that the speech being yoinked off a social media site for violating TOS can be repeated elsewhere, so it can’t really be suppressed unless every social media site colludes with one another to stop that specific speaker from posting that specific speech (which has never happened in the history of social media services). I have a comment being held up by moderation/spamfilters at the time of my writing this post that explains my position on book bans and social media moderation, so you can wait for that to see my position laid out better, but the tl;dr version is that book bans are censorship because their intent is to stop the general public (including poor/marginalized people) from accessing certain books, and the information within them, for free.
Convincing me that the owner of a Mastodon instance deleting a racist post (and banning the person who made it) is an act of censorship would require one hell of an argument to make me change my mind. No one to date has offered that argument—and “no one” includes you. Feel free to try again, but I promise you that more/stronger insults and “because I said so” will get you nowhere.
Re: Re: Re:8
I think they both fall under the dictionary definition. That’s not because they’re the same, it’s because the dictionary definition doesn’t distinguish (which is why I say yours is more nuanced/better). It’s just a straight up flaw/ambiguity with the normal definition. Yours is pretty good! Just not standard.
I don’t mind using whichever you prefer, but we do need to be clear when we’re using yours. My issue isn’t with your actual definition, but that you’re framing it as if it’s the dictionary one. As long as you make it consistently clear you’re referring to yours, I think that’s fine. That’s what people do in things like academic papers, they lay out a working definition and go from there. It communicates clearly regardless of whether the reader agrees with the definition or not. (And if you don’t want to explain it every time, I’d just hyperlink to the original post)
I think with the dictionary definition, you would technically be a censor (so would I). And yes, I get it, it used to make me uncomfortable too. But it’s a “technically correct is not the best type of correct” situation, because it doesn’t line up with the negative connotation censorship colloquially has.
Absolutely. And I think it’s ok to use a different definition to make the distinction, because it isn’t a negative thing to do that sort of moderation. I personally got over it, but it’s equally valid to approach it the other way of redefining.
Using different definitions is fine and does clear that up, but I do think it comes with the obligation to label it as your definition when using it. Because if you don’t state what you just did, there’s no way to tell if you’re doing it for that reason, or just because you don’t know the definition. It should be clear without a follow-up back and forth. (And the only reason I stick with the dictionary definition is that, personally, I don’t like spending the time it takes to explain it to someone.)
I’m not sure I’d agree. I think it is possible (not everyone will, but it’s not inevitable) to put the emotional part to the side. There’s a ton of things I’d label censorship under the dictionary definition that I absolutely have not moved an inch closer towards. And vice versa, there are things you’d label moderation that I still wouldn’t like. (I’ll put replies to the specific examples mentioned in a 2nd post replying to this one, as this is long and it’ll be a full post.) Maybe it just hasn’t happened yet, but I’ve had plenty of debates around censorship using the normal definition, and pinning people down wasn’t an issue.
That’s because you’re trying to conflate “calling it censorship” with “I agree with it” (and tbf, I get it. With people like Koby, it is a useful trick because he’s both bad faith and stupid. But it is a bad way to pin it down, it only works because he’s an idiot). If you just stated it was your definition, it wouldn’t need to be commented on.
But there is an easier way to ask around that than the definition. You can just ask if someone should have the right to do something (you’ve done this before! We’ve had substantive discussions on it, despite disagreeing). That’s what matters, not the label. (Or if you’re really set on the definition, you can just phrase it as “do you think it fits the definition of censorship I gave above”. instead of “is this censorship”.)
I don’t fundamentally mind your definitions, but I think you slip into talking about them as if they’re standard. And unless someone just so happens to have read your post from 2020, they’re going to have no clue what you’re talking about. There’s a super easy way to avoid that confusion, by just saying “my definition”, instead of “is”. (FWIW, I say this as someone who started reading TD post-2020, so it was extremely fucking confusing the first time it was mentioned without a hyperlink to your original post. To this day I still don’t know if Mike used the same definition or not. Heck, I can’t even remember your distinctions without looking up the 2020 post to refresh it. And by the way, your original post uses this type of wording. e.g. : After reading the Washington Post article from which that quote comes, I would refer to this as censorship– nailed it with the “i would refer to this as”. The post was before my time, but it’s doing the right thing)
1/2
Re: Re: Re:9
I’m one lone jackoff commenting (way too often…) on a tech blog. I’m not trying to be The One True Source of Knowledge on the matter. But we’ll get back to that.
If you want to call yourself a censor, go ahead on—it’s your move. But don’t apply that label to me. I’m not out here trying to stop people from exercising their right to speak freely, no matter how much a bunch of these AC assholes want you to believe that I’m doing that by telling them to go pound sand.
And yet, you label moderation as censorship and immediately attach to it all of the negative connotations that come with the word “censorship”, such that you explicitly have to refer to it as a “good” kind of censorship as a workaround to said connotations. Curious. 🤔
If my definition doesn’t match the dictionary definition and you don’t see other people treating it as the Word of God on the matter, you can safely assume it’s a personal definition.
Again: The word “censorship” comes with a lot of ingrained negative baggage. Anyone who can put their emotions aside when talking about censorship is a better man than I.
I don’t like anything I’ve done in terms of content moderation being called “censorship”, and that’s because I equate that term with book burnings and assholes like Collective Shout and SLAPPs, not with someone getting rid of a TOS-violating post on Twitter. My whole point in the bit you replied to is that if you keep referring to content moderation as censorship, you’ll eventually come to believe that a site is “censoring” you only and specifically because you lost the privilege to have a specific bit of speech hosted on property you don’t own and don’t have a right to use. (And I don’t know if you would, but if you’re thinking of trying that “Twitter is a public square” shit here, don’t.)
And yet, when you say “content moderation is censorship”, you’re agreeing with the assholes who genuinely believe that losing the privilege of speaking to an audience on someone else’s property is the same thing as being denied their right to speak by way of threats, lawsuits, or the life-changing power of actual physical violence. Curious. 🤔
Even if I did ask that, would that make you stop referring to moderation as censorship?
They are standard—for me. (Insert a Bane joke here.) I work from my personal definitions because they’re my personal definitions. I’m not trying to force anyone to agree with them or accept them as the Word of God; at most, I’m taking to task anyone who views censorship so expansively that supposed acts of censorship that I don’t view as such become normalized as censorship. Such a connection would taint those acts with the largely negative notions that, whether you like it or not, come attached to the word “censorship” from the get-go. My whole argument here is that if you keep expanding what counts as censorship, on a long enough timeline, people will accept without question that content moderation is censorship—and some idiots will (try to) write laws to heavily regulate that “censorship” in the name of “free speech for all”. I don’t know about you, but I’m not keen on the idea of the Trump administration having the authority to tell Elon Musk exactly what kind of speech he’s not allowed to boot from Twitter.
Censorship is a slippery slope. Defining it in a way that widens the slope and makes it even more slippery isn’t something I view as a good thing; no one has yet given a counterargument to my position that hasn’t sounded like apologia for censorship (or at least the “bad kind”, which is how I guess you’d put it).
Oh, and by the by, owning the label of “censor” ain’t gonna do you any favors. Look at Collective Shout: For whatever good they might have done in the past, now they’re being seen as censors for their role in the Steam/Itch/payment processor situation—and I’m pretty sure the only people on their side want to be worse censors than them. Is that where you want to be?
Re: Re: Re:10
“I don’t like anything I’ve done in terms of content moderation being called “censorship”, and that’s because I equate that term with book burnings and assholes like Collective Shout and SLAPPs, not with someone getting rid of a TOS-violating post on Twitter.”
Your personal hangups on this matter are not an argument nor worth anyone else’s time to consider.
You’re suppressing speech; therefore, you’re a censor. But the material reality, and merits, of what you do doesn’t change with that label: it is (I will do you the courtesy of presuming) useful and justifiable.
Re: Re: Re:11
Again: How does a content moderator on a single website “suppress” speech if the speaker can repeat that speech on any other website? How does losing the privilege of posting that speech on that site deny the speaker their right to post that speech on any other website willing to host it?
Re: Re: Re:12
“How does losing the privilege of posting that speech on that site deny the speaker their right to post that speech on any other website willing to host it?”
It doesn’t? But it’s an irrelevant question because it does not matter? This line of inquiry is the same one you keep trying and I don’t understand why you expect it will get any different response than ‘suppressing speech is suppressing speech, even if it could in theory be said elsewhere’, just like all the other times.
Re: Re: Re:13
Because I’m trying to get you to realize that “suppression” that doesn’t actually suppress speech outside of a single location in which that speech isn’t guaranteed a place or an audience isn’t the kind of suppression that comes from censorious moves like book bans and the payment processor/Steam/Itch bullshit. Twitter giving the boot to a post with a racial slur in it isn’t “suppressing” that speech from being reposted on 4chan, and neither are people who “suppress” that speech from their timelines by blocking the user who made the post. Suppression is far less “we don’t do that here” and far more “you won’t do that anywhere”. If you can’t get that idea, that’s your problem. I mean, I’m a Level 5 dumbass with a gold-star license in bitchassness, and I have no trouble understanding it.
Re: Re: Re:10
(sidenote: 2/2 was posted; says it’s still awaiting moderation)
I mean, yeah, the word has baggage in colloquial usage (ironically, baggage that isn’t in the dictionary definition). Gotta deal with it one way or the other.
I don’t think your average reader is going to know you well enough to make that assumption, which is why they’re baiting you to reply the way they are. You’re trying to avoid them calling you a censor, but they’re baiting you into looking like you’re declaring an incorrect definition, rather than having redefined it for good reason, knowing the average person won’t realize that without context. Especially if a new reader googles it, since the top answer will be a dictionary. (You’re of course free to reply however you like; I’m just pointing out you’re getting baited into giving them the response they want from you.)
I’m agreeing with them that the dictionary lumps them under the same term. That’s not because they’re the same; the dictionary just lumps different things together (tbh, I’m not sure why dictionaries haven’t updated to account for the baggage. It’s certainly widespread enough). I don’t see the dictionary part as something I have a choice on; the dictionary says what it says. But that doesn’t mean accepting the two-step they’re playing with the negative baggage, which isn’t in the dictionary.
Eh, it’s tricky because we’re in a public forum. When talking to you or Rocky (I don’t know who else uses it?), I will try not to refer to moderation that way (and I’ll probably link to your post for clarity when doing so).
If I’m talking to a rando, I still don’t refer to moderation as censorship, but I’m just going to use the colloquial definition (which isn’t quite the same as yours, it’s close though). Most people assume the negative connotation that it has in colloquial use is a part of the definition, and would find it confusing to “well akshually” the dictionary.
Personally, I don’t actually use the dictionary definition unprompted, for precisely the reason you redefined it. I only acknowledge it when it’s brought up directly, because it is technically what the dictionary says. But it’s easy enough to acknowledge it, point out the flaws with it, and move on.
There’s a multiple (and changing over time) audience issue. If this were private, I’d just use yours.
I mean, I’m not doing it because I like it. It is what it is, at least in terms of the dictionary. I don’t typically call myself a censor, but I’m not going to shy from my beliefs. I think if people actually look into the details it’ll be reasonable regardless of the label.
That said, part of it is probably also because I’m not as pro-free speech as you are either, even if I’m no Collective Shout. And along with that comes just accepting, and not being bothered, that people who disagree are inevitably going to call me names. No need to give them that power if I’m confident the argument can stand on substance.
Re: Re: Re:8
2/2
For the examples in Rocky’s post:
1- Generally fine / 2- Generally fine / 3- Fine / 4- Fine / 5- Fine
For your examples here:
1- Generally fine / 2- Generally fine / 3- Fine
For the ones I’ve labeled “generally fine”: in 99% of cases, I’m ok with them. However, when it comes to “freedom of reach”, while I generally don’t think people have a right to reach, I do think there is one major exception: if certain types of media become so powerful that it’s an anti-trust issue. I tend to think of that as something akin to a heckler’s veto. I prefer solutions that don’t involve speech (like, e.g., literal anti-trust breaking up some big firms), but if it’s necessary I am ok going there. I look at this as just as threatening as existing 1A exceptions like defamation/true threats, etc. And yes, I’ve considered the potential for it to be used in bad faith. (I also have exceptions for, e.g., anti-discrimination. We’ve discussed this before with, e.g., 303 Creative. I don’t give a shit if being a bigot is in the TOS. I don’t really think of that as a reach issue, though.) I do think it’s also problematic that reach is tied to things like wealth to begin with. Letting capitalism arbitrarily decide who gets the megaphone has unintended consequences.
The tldr is I’m not an absolutist when it comes to speech. I’m closer to a utilitarian, and I’m willing to look at context on a case-by-case basis. If something is going to cause ‘too much’ damage (e.g. a Nazi running Twitter), I’m ok with interfering. (Me being ok with it is also different from it being legal. Although I think you could slide most of it in via strict scrutiny. But there is probably stuff I’d want to do that wouldn’t clear the bar.) While I generally lean heavily towards free speech and the marketplace of ideas, it’s because they’re usually better than the alternatives. I do think they can break down and that bar can be overcome.
(Sidenote: Yes, I’m aware I’m not going to change your mind on 303 Creative, or some of the other stuff. That’s fine. We can agree to disagree. But never let it be said I won’t say what I believe if someone asks).
As a bonus round: the one quibble I have with your definition is that I would consider local book bans (e.g. at a school/library, so the book isn’t banned “everywhere”, technically) censorship. Even for a private school. But honestly, whatever; it’s still closer than the dictionary definition and your heart is in the right place. I’m willing to call it moderation that I think is unacceptable, because it has all the reasons censorship has those negative connotations: trying to control kids’ thoughts, etc. (While writing this, I also realized your definition seems to miss the heckler’s veto. I’d think that should probably fall under censorship, but again, I can roll with it.)
Re: Re: Re:9
I’m a free speech guy, not an anti-trust law guy, so whenever I talk about “free reach”, I’m talking about people who delude themselves into believing that the right to speak their mind entitles them to an audience and the use of someone else’s property as a soapbox when they’re entitled to neither of those things. It’s not quite the same thing as the “I have been silenced” fallacy, but that fallacy is at least adjacent to the “free reach” belief.
As much as I would enjoy seeing Elon fucked over by way of being legally forced to sell Twitter because it’s become a Nazi bar—and believe me, I would enjoy that a lot!—I don’t think the government should be in the business of doing that because…well, let me put it this way: Do you think giving that kind of power to the Trump administration is a wise idea considering how Trump wants the NFL’s Washington Commanders to bring back its old (and racist as hell) team name and logo?
I think you’re missing a “not” in there, but lemme just tell you why I think those bans are censorship.
If a library pulls a book from the shelf because it has low circulation numbers or is damaged beyond repair, I would consider that a form of moderation—not of content per se, but of general usefulness. (A damaged book won’t help anybody and a book that sits on a shelf for years without being checked out is taking up space that could be used for other books.) Neither of those are bad things.
What is bad is when a moral panic takes hold and an activist group demands the removal of books based on their content, no matter how innocuous. Anti-queer activists will literally act as if a children’s book about two male penguins raising a baby penguin is the same thing as hardcore pornography, and they’ll put pressure on both libraries (public or school) and politicians to have those books removed based only on their content. Ban-happy groups like Moms for Liberty want to deny people their right to access those books—especially if people can only access those books through libraries—and act like their personal opinion on And Tango Makes Three should be the fucking law that everyone must follow. And that…
…is why book bans are censorship: They’re a heckler’s veto meant to suppress speech from being accessed by the general public.
Re: Re: Re:3
Learn what the word suppress means when we are talking about censorship and maybe then you would have a point.
Here, I’ll give you the relevant definition so you don’t have to search for it:
– to keep from public knowledge: such as
– to keep secret
– to stop or prohibit the publication or revelation of
Do you get it? Denying publication of an ad because it doesn’t conform to a newspaper’s rules isn’t suppressing speech to keep it from public knowledge.
This is the point everyone who uses censorship as a catchall doesn’t get: censorship is about suppressing some speech so the public can’t see it at all. It’s not about rejecting poorly written ads or ejecting a loudmouth from your property.
As I said, nobody batted an eye at the word moderation for decades – it meant a specific thing, and nobody that mattered the tiniest bit called it censorship. That is, until the poor persecuted “conservatives” couldn’t stop being bigots and racists online, and when they got moderated they screamed “censorship!”.
At which point do you think it becomes censorship because the speaker can’t be bothered to adhere to the conditions set out for the service they want to use? From my point of view, every such instance is them shoving their own fist into their mouth so they later can stupidly scream “Censorship! I have been violently silenced!”.
Sure, but we are talking about social media – not porn in public spaces, war correspondence, fascist regimes, religious nuttery, lèse-majesté or any other pre-internet shenanigans. Do keep to the topic at hand. For your argument to have even the tiniest bit of relevance, give us examples of when people called moderation censorship pre-internet, or even before modern social media – go ahead. And I don’t want to see some stupid anecdote; I want to see a factual effort to label moderation as censorship.
Re: Re: Re:4
I’m aware what the word suppress means. A few things:
1) not all definitions use suppress (Although it’s common, obviously). For instance, here’s Cambridge or Oxford. Alternatively, the Free Speech Institute.
2) To suppress something is not a synonym for totally suppressing it (although they often go hand in hand- if you want to suppress something, you probably want to go all the way). This is why synonyms for suppress include “to clamp down on” or “crack down on”.
And this should be very obvious by example. Are book bans in red-state schools or libraries censorship even though they don’t completely eliminate access everywhere? Obviously yes.
No, but it is “to stop or prohibit the publication or revelation of”.
Except it’s not, and you literally posted a definition saying otherwise. It also applies to things that are only partially suppressed.
Yes, they did.
Ehhh, kind of depends on how you’re using the term. If you’re going strictly by the dictionary definition and rules lawyer it, any restriction is technically censorship. But the way we colloquially use censorship is specifically to imply it’s being done due to disagreement over a viewpoint. There is an implied connotation of certain motives tied to the content/idea. Banning someone because they’re a boor is technically censorship, but if you call it censorship without context it’s pretty misleading and people will get the wrong idea.
The way we use censorship colloquially is almost always tied to the actual underlying reasons, which you see in most (not all) of the definitions. It also usually has a negative implication: that the content/idea is being suppressed for bad (or at least controversial) motives.
Like, censoring spam is a form of censorship, going by the dictionary. It’s fine, no one cares for obvious reasons.
Are you saying nobody, or nobody that mattered? Because those are two different things. There’s lots of randos bitching because /r/fatpeoplehate got moderated or whatever. But even in terms of “mattering”:
1) Here’s a rando HLR article bitching about “private censorship” on Facebook in 2015.
2) Not social media, but here’s EFF saying it’s censorship to remove adblock from the playstore (2013). Certainly falls under TOS/moderation.
3) EFF complaining about Content ID being censorship-friendly.
4, 5, 6) Here’s Cory Doctorow talking about “Big Tech censorship” (2, 3). I think Cory counts as mattering, or nah?
7) Again with the EFF, which has a whole project around TOS/moderation/censorship.
8) Here’s the Brennan Center complaining about Facebook censoring Syrian accounts.
9) Here’s the ACLU complaining about Facebook censoring offensive speech.
Those are all examples of moderation.
The definitions (and debate over what qualifies as censorship) existed prior to social media, and social media isn’t unique in that regard. You get a lot of the same social dynamics (and some new ones, of course). It’s pretty silly to pretend the term had some immaculate conception.
Why wouldn’t anecdotes count? When you say “no one”, do you mean moral panics or a mass movement? If by “no one”, you mean not dipshits like Haidt, or no mass movement, then sure, that didn’t happen pre-internet. That’s a very different thing.
As far as older examples, here’s a rando-ass discussion from 1986 about moderation/censorship concerns at computer conferences, and how to avoid complaints. It is a thing people considered. Here’s an article (albeit from an FCC commissioner) handwringing about censorship in TV by the TV companies due to economic incentives. Here’s one from [Ralph Nader](https://nader.org/1984/06/10/media-self-censorship/), again about corporate censorship, in 1984.
Re: Re: Re:5
And here you are straying from what has been said; it was explicitly stated that “But all censorship requires is suppression of speech by a third party, full stop and end of.”
So do you concur with that statement? Yes or no?
Is it censorship, yes or no?
I didn’t; if you look at what I posted, it says “such as:”, i.e. it categorizes different types of suppression to keep speech from the public.
Oh, they did? So this was a big issue 20 or more years ago that inflamed the internet and the news where people said that moderation is censorship?
Who are these “we” you are talking about? Is it the royal We, or is it what you think it is? As you can see, I and others don’t agree with you, so your use of “we” is incredibly presumptuous.
I was talking about the usage of the word moderation and you somehow pivoted to using censorship. Is moderation censorship, yes or no?
Yes, but at no point have people equated all moderation with censorship, which is the topic at hand.
Because anecdotes are anecdotes and that means the argument that moderation is censorship is something new based on people demanding free reach, not free speech.
If you want to put forth the argument that moderation is censorship, it has to be based on multiple sources and prevailing views pre-internet. It was you who raised that specific point with your “since the spoken word existed”. And just because some people complained doesn’t make it true, because if it did, a lot of incredibly stupid things would just become true and civilization as we know it would be in ruins.
Re: Re: Re:6
I think that’s what the dictionary definition of censorship is, yes. Assuming ‘suppress’ here includes partial/attempted suppression. I can’t speak for the other person, but I don’t think they were taking suppress to mean what you think it does, and I’ve never seen suppress used to mean exclusively fully suppressing something.
I don’t think it fits with how people typically use the term colloquially (which is also different from how you’re using it). It’s almost always used with a negative connotation, despite the dictionary definition not explicitly giving that connotation. To use the spam example, that would be censorship under the dictionary definition; most people would not consider it censorship despite it technically fitting.
Using the dictionary definition, yes, because it is “stopping something from being published”, which is one of the dictionary definitions. With how people colloquially use the word, usually no, because it doesn’t have the negative connotation (although you could find exceptions, depending on what was blocked and why, as the many articles I posted show). Under Stephen’s definition, no.
As I said previously, was it a giant moral panic? No, and especially not to the degree bad faith right wingers made it. Were people worried about some types of moderation being censorship? Yes.
In this case, I’m referring to society, at least inasmuch as it’s reflected in the dictionary. There will be outliers, but your definition is not common, to the point where it didn’t show up despite skimming a half dozen dictionaries or so. There’s a reason I could find so many examples using it in a way that conflicts with your definition, in a few minutes. And a lot of them are specifically people/groups with free speech expertise.
If there was evidence that it fit your definition, you’d be linking it by now. It’s not presumptuous to point to a dictionary for typical definitions of words. Or something as widely covered/discussed as book bans. The way you guys are using it is niche (which is fine). You know it’s niche. There’s a reason you’re not giving any evidence of your definition being widely used outside a few TD posters.
People didn’t broadly worry about all moderation being censorship (also: you’re changing my wording. What I actually said was “There’s been complaints about moderation/censorship since the spoken word existed”). They worried about specific acts of moderation (particularly things you’re labeling moderation) being censorship. There is a distinction there you’re conflating.
The rub is, whether people worry about a bit of moderation being censorship hinges on whether they see it as reasonable or not. And the issue is, someone (usually the guy who got moderated, at minimum) is usually going to think a particular moderation action is unreasonable. It’s not a coincidence that the examples I gave are things like corporate censorship; that ties into that sense of wrongness.
The actions being described are acts of moderation (some of them are explicitly types of examples you listed), and they’re being called censorship. I didn’t pivot, that’s showing that people consider what you’re labeling moderation as censorship. They don’t call it moderation because in their eyes it’s censorship.
Under the dictionary definition, yes. As people typically use the term, it depends, it’s not a universal yes or no. Sometimes yes (as with the many examples I listed above), sometimes no.
Typically when people use the word censorship, it has an implied negative connotation. People don’t use it for normal moderation (except when they disagree over whether something is “normal”: the guy who is going to complain about you moderating Rule 34 smut is going to be the one whose Rule 34 smut you removed for not following TOS).
People in general don’t complain about all moderation in the abstract (with exception of a few free speech absolutist weirdos). Specifically, they complain when what you consider is normal clashes with what they consider to be normal.
Anecdotes don’t have to be new? People complaining that they can’t drop the f-slur, or that they can’t talk politics in a nonpolitical chat, are not new. And yes, people often conflate free reach with free speech. Although worries about reach were a thing even with the printing press, not just social media. Those old examples of ‘corporate censorship’ are worrying about reach.
I’m putting forth that it matches the dictionary definition. Not that there was broad moral panics over all moderation, or that I think it’s a good definition that captures the distinction you’re aiming for. Those are not the same thing.
You’re mixing different things. You made two claims. One, about the definition of censorship. Two, that no one has worried about moderation as censorship until recently, and that it’s exclusively recent conservative bait.
The specific point about “since the spoken word existed” was in reference to the latter. Nonconservative people have in fact worried about specific types of moderation as censorship. Especially under the definition of moderation you’re using.
I’m not using the fact that people complained about it to say anything about the definition. That’s from the dictionary (and to repeat: that dictionary definition does miss some nuance that Stephen’s definition captures).
I’m specifically saying people have complained about moderation being censorship (sometimes incorrectly and in direct conflict with both the dictionary definition and how censorship is actually colloquially used, fwiw), in contrast to your claim that no one cared until the recent conservative crap. In 2 particular cases: a) when it is being applied to them or they don’t think it’s reasonable, and b) concerns over reach when it comes to things like Big Tech/Newspapers.
Re: Re: Re:2
Unlike the other AC, I will answer “No” to all the examples you’ve given, but I will state that a ban using a certain type of software (I can’t remember the name) that bans you from other forums as well is definitely private censorship since it does stop you from posting elsewhere for something that may be forbidden on one platform but not another.
Re: Re: Re:
Again: Explain to me how an imageboard admin saying “nah, you ain’t postin’ that here” and deleting an image from that imageboard—all without being able to stop the person who posted the image from posting it elsewhere—is censorship.
Re: Re: Re:2
Because it’s the suppression of speech by a third party. We’ve been through this before, too.
You start from the visceral position that censorship is axiomatically bad, and it leads you into these silly gymnastics.
Re: Re: Re:3
How can that speech be “suppressed” when that third party can’t stop the speaker from going literally anywhere else and expressing the same speech there?
Re: Re: Re:4
Because it doesn’t matter if they can go elsewhere? You can go elsewhere if a government censors you too, you know.
We’ve been through this before, too.
Re: Re: Re:4
Because that’s not a requirement? It’s suppression even if it’s suppressed in a limited domain?
This is not hard! You just insist on overcomplicating it.
Re: Re: Re:5
It’s hardly “suppression” when the speaker can go anywhere else and say the exact same speech that was “suppressed in a limited domain” (i.e., moderated off a website). That’s my whole point: You’re treating “we don’t do that here” as if it’s equal to “you can’t do that anywhere”. You won’t find a lot of people here willing to back you on that, and you’re not doing a good job of convincing me—one of the most outspoken assholes on this site when it comes to this particular subject—that your argument is any good.
Again: Explain to me how an imageboard admin saying “nah, you ain’t postin’ that here” and deleting an image from that imageboard—all without being able to stop the person who posted the image from posting it elsewhere—is censorship.
Re: Re: Re:6
I have. If you aren’t convinced, that ain’t my problem.
You can ‘just go elsewhere’ for all sorts of instances of things that even you would call censorship, such as, for example, the actual wartime censor offices that redacted people’s letters in WWII. It is not an operative or useful criterion in deciding whether or not something is being censored. It is something you’ve decided you can use to shout down idiots mad they got banned from Twitter.
Literally all that’s required is a third party restricting expression. If you want to be slightly more tightly defined you could call it restricting expression based on viewpoint (vs mode/venue/timing/etc), but frankly “free speech zones” and other modal restrictions are absolutely censorious, so I prefer the expansive view.
But you’ve locked yourself in a circular argument so hard I’d get more intellectual response out of a sack of rocks.
Re: Re: Re:7
It is, though.
Let’s take a look at book bans in libraries. Obviously, a library has no obligation to carry any book, and it can pick and choose what books to carry of its own accord. But when those decisions are made under pressure, political or social, said decisions begin to look censorious. Removing a book under such pressure can deny people the right to read that book because maybe they can’t afford to buy their own copy or they can’t risk having a copy in their own home for whatever reason (and hold onto that thought because I’mma circle back to it in a bit). Such removals deny the general public access to that book and any knowledge therein, and all to assuage the feelings of some asshole who thinks they know what’s best for everyone. Book bans are considered censorship because they’re almost always enacted from a position of “I don’t like it, so nobody else should be able to read it”. That speech is, even if it’s technically limited to that library, being suppressed by censorious forces who wish to deny the general public (including the poor and the marginalized) access to ideas and knowledge considered “taboo” by such forces. Ergo, book bans are censorship.
None of that applies to content moderation. Consider the following: A certain troll who will remain nameless became so reviled on this site that Mike Masnick used the site’s spamfilter to stop posts from the troll’s account from showing up. That troll wasn’t stopped from spreading their shit elsewhere because losing the privilege of posting here with their account wasn’t, and still isn’t, the same thing as losing the right to post anywhere. (And hell, they didn’t completely lose that privilege here, since they just kept posting as an Anonymous Coward.)
I can guess what you’re going to think next: “Well, people can just buy books that aren’t in the library, how can a book ban be censorship if content moderation isn’t?” And I have an answer for that: The intent of content moderation is to deny someone a space to speak, whereas the intent of book bans is to deny people access to speech they might not otherwise have access to. A closeted LGBTQ teenager living in a conservative-heavy anti-queer city might only have access to LGBTQ-friendly books through their local library; if the library bans (and later trashes) those books, that teenager loses their ability to read those books, so the books may as well not exist any more. Again: That’s why book bans are censorship.
I know of Mastodon instances that forbid the posting of anti-queer speech. Tell me why that’s censorship even though it’s done by a private party on private “property” that has no obligation—legal, moral, or ethical—to host all legally protected speech and can’t stop such speech from being posted anywhere else. Then tell me why it’s censorship for you to kick someone out of your house for saying shit that pisses you off, even though you’re a private party in private property that has no obligation to let other people say that shit in your home and has no power to stop that person from saying that shit anywhere else. Then tell me why I am the idiot for believing those two acts aren’t censorship.
I’ll wait.
Re: Re: Re:8
“The intent of content moderation is to deny someone a space to speak”
Which is censorship. As we’ve been through. Many, many times.
Re: Re: Re:9
Yes or no: If someone is banned from Twitter for violating its TOS, can Twitter/Elon Musk stop that person from posting the same TOS-violating speech on any other website on the Internet?
Re: Re: Re:8
“I know of Mastodon instances that forbid the posting of anti-queer speech. Tell me why that’s censorship even though it’s done by a private party on private “property” that has no obligation—legal, moral, or ethical—to host all legally protected speech and can’t stop such speech from being posted anywhere else. Then tell me why it’s censorship for you to kick someone out of your house for saying shit that pisses you off, even though you’re a private party in private property that has no obligation to let other people say that shit in your home and has no power to stop that person from saying that shit anywhere else. Then tell me why I am the idiot for believing those two acts aren’t censorship.”
Because it is in all cases the suppression of someone else’s speech. As I have said.
It really is that simple. It does not matter if it’s private property, if there are other options, how difficult those options are to access, or any of the other conditions you swing back and forth on as convenient to your argument of the moment.
There’s no ‘gotcha’ moment waiting in the weeds here. You aren’t an idiot for disagreeing, per se; you’re an idiot because you refuse to believe me when I tell you things, which is why you keep asking the same damn questions as if the answer, this time, will miraculously change from the last five times I have provided it.
Re: Re: Re:9
How is it “suppressed” when it can be repeated literally anywhere else but that Masto instance? How is a person’s right to speak freely threatened by losing the privilege of posting on that Masto instance?
I sincerely believe that you sincerely believe in the shit you’re shovelling in my direction. But you shovelling it at me doesn’t mean I have to accept it—especially without a solid argument as to why I should.
And that’s my whole point here: You think you have an argument that can change my mind, and I’m giving you every fucking opportunity possible to give me that argument. I am refining my questions and raising salient points and trying to get you to understand my point of view on the matter because I genuinely and sincerely want more from you than insults and “nuh-uh to your uh-huh”. If I say “moderation isn’t censorship because it doesn’t suppress a person’s right to repeat that speech elsewhere”, repeating the same answer you gave me the last time I raised that example isn’t going to change my mind. Maybe try addressing the points I’m raising with a different angle than the one you’ve been using, because the argument you’ve been using isn’t very effective. I won’t be swayed by your ideas if I’ve heard them all before and found them lacking—which I have, by the by.
Tell me why I should—nay, why I must!—think “localized suppression” or whatever term you want to use is censorship. Give me a better argument than “it’s the dictionary definition” or “because I said so”. I am legit begging you to do this. And if you can’t bring something new to the table, don’t bother sitting down; I’m not hungry for a plate of reheated horseshit.
Re: Re: Re:10
There isn’t an argument because we have a fundamental disagreement in premises. I have presented to you exactly why I think what I think. You’re just completely unable to understand that I mean precisely what I said and that asking me, again and again, will not elicit different answers.
Re: Re: Re:11
I understand exactly what you mean.
And it ain’t changing my mind.
I’ve given you my arguments. You’ve given me the same kind of reheated horseshit that I’ve seen in these arguments before, and it’s not smelling any better than it did before, even if you think your shit doesn’t stink. If you didn’t want to change my mind on this matter, you wouldn’t keep replying to me. But since you do seem to want to accomplish that, telling me “it’s censorship because the dictionary says so” or “it’s censorship because I say so and you don’t need to know why” won’t do the job.
I am begging for you to give me a better argument than that. I want to know why you think content moderation is censorship so I can understand why you think it’s the same kind of act as a government official trying to intimidate someone into silence or a SLAPP attempting to silence a critic. That’s what I’m not getting from you. I understand what you mean—but I don’t understand the reasoning behind it, and that’s what makes your argument so empty. Give me something better or stop wasting your time, but man, don’t give me that Glass Joe–level argument again.
Re: Re: Re:12
” I want to know why you think content moderation is censorship so I can understand why you think it’s the same kind of act as a government official trying to intimidate someone into silence or a SLAPP attempting to silence a critic.”
Because they’re both a third party acting to suppress someone else’s speech.
Asking the same question for the sixth time will not get you a different answer than the last five.
They are, as Rocky helpfully provided as an analogy elsewhere, the same kind of act in the same way both a surgeon and a murderer might cut someone open. The purpose, intent, precision, and outcome are very different — but they are both still cutting into someone.
There’s no trick or secret to this logic, no weird circumstance or additional justification; stopping someone else’s speech is an act of censorship. Full stop. Whether that’s justified is an entirely separate question that operates on a much more nuanced level, but then you’re considering the issue on the merits instead of trying to argue that a justified action isn’t censorious… even when it clearly would be.
Like, when Twitter was a global square (and monopoly) instead of three quarters bots and one quarter Nazis, being banned from it or having your tweets deleted was a big deal in terms of constricting someone’s ability to communicate a message; likewise, so is payment processors blackmailing Steam into ditching adult games, because Steam is huge.
You insist the first isn’t censorship, but have said the second one is. I think that’s logically incoherent, and that any distinction you might draw is special pleading: both actions are censorious. What matters is the whys and wherefores of what is going on, which is why even individual Twitter bans were very different in how I judged them.
Re: Re: Re:13
How can someone’s speech be “suppressed” when they can repost it literally anywhere else should one platform—a platform that no one has a right to use, by the way!—boot that person’s speech?
Except they’re not. Having a post deleted from Twitter is not the exact same kind of act as a SLAPP or a government agent trying to intimidate you into silence. The first act doesn’t stop you from exercising your right to speak freely—it only denies you the privilege of having that speech hosted on property you don’t own. The other two acts are, at the least, attempts to stop you from exercising your right to speak freely.
How can content moderation do that when it can’t even stop a person from copying the speech that got them “censored” on one website and reposting it on another?
Only in a colloquial sense, but go on.
And yet, the use of Twitter was, is, and assumedly always will be a privilege. You’re not owed a platform or an audience, even by Twitter. If you can’t build them yourself, sucks to be you. But being denied the right to build them on someone else’s property isn’t censorship. It’s you being told “your shit is unacceptable” before you’re shown the door and told to go somewhere else—even if “somewhere else” is your own website with blackjack and hookers.
I don’t. Having a post deleted from Twitter doesn’t stop the person who made the post from making it elsewhere. Having a game deleted from Steam and/or Itch, on the other hand, makes getting that game into people’s hands (metaphorically speaking) a much harder task to pull off. If game makers don’t have the funds to create a storefront of their own or, should they choose to give the game away, can’t find a space to host their content because of the nature of that content, their speech is being suppressed far more than any shitpost on Twitter ever could be—and all because a bunch of rich assholes wanted to use financial blackmail as a means of deciding exactly what kind of porn (if any) people do or don’t “deserve” to buy.
Re: Re: Re:14
“I don’t. Having a post deleted from Twitter doesn’t stop the person who made the post from making it elsewhere. Having a game deleted from Steam and/or Itch, on the other hand, makes getting that game into people’s hands (metaphorically speaking) a much harder task to pull off.”
As I said: Your position is logically incoherent. It’s still possible to sell the game, is it not? The actual game isn’t deleted off the creator’s computers, no? It still exists! You could distribute it any number of other ways. They wouldn’t have the reach of Steam, but you’ve been very vocal about how nobody’s entitled to an audience or ‘freedom of reach’, so it isn’t operative, per your own position, that it is ‘difficult’ to still get your message out.
This is what I mean when I say what you use to define censorship sways in and out with special pleading to support whatever your argument du jour is, and possibly why you’re so baffled at encountering an actually consistent principle of definition.
Re: Re: Re:15
Possible, yes, but far, far, far harder to do. Steam is inarguably the biggest storefront for all kinds of games, and Itch is arguably the biggest storefront for indie games. Other storefronts don’t sell indie games at nearly the same level as Steam and Itch, and if they do, they probably don’t sell the kinds of games that were affected by all this Collective Shout bullshit. Point is: If you want to sell a game to a wide audience, Steam and Itch are the primary roads to take. So if an indie developer can’t sell a game with adult content through those two storefronts, chances are they won’t be able to sell the game at all unless they can scrounge up the money to put together a website and find a payment processor willing to work with them. Even then, there’s no guarantee that the game will sell, because some people might not want to put their credit card info into a payment processor that isn’t as trusted and well-known as PayPal or Stripe (or, on a slightly broader scale, Visa and Mastercard).
A ban on adult content on Steam and Itch is effectively censorship because it prevents indie developers from having access to the most common marketplaces for indie games. When that ban comes not from decisions made by Steam and Itch on their own, but from decisions made by payment processors and credit card companies and banks, it’s those last three institutions that are deciding for adults what content they can or can’t purchase. That is why it’s censorship.
Now show me how someone’s post being deleted from Twitter for violating Twitter’s TOS is, philosophically if not literally, the exact same act as those content bans. I’ll wait.
Re: Re: Re:3
Learn what suppress means in the context of censorship before putting forth that argument, but I doubt you will.
Re: Re: Re:4
The definition you supplied is entirely compatible with what I’m saying. Or, alternatively, if you applied it strictly, it would mean that no censorship is possible because there’s always some way to get ’round it. Pick one.
Re: Re: Re:5
So not only don’t you understand context, we have now also established that you don’t even understand what intent is.
Re: Re: Re:6
You haven’t established anything except your own inability to understand what people are telling you.
Re: Re: Re:7
You have established by your own words that context and intent have no meaning, you have given answers that directly contradict your own argument, and you have attributed arguments to me which I haven’t even alluded to.
You can of course point to where anyone here said that censorship isn’t possible if there’s a way around it. What has been said, which you apparently didn’t grasp because everything is black/white in your world, is that if you can take your speech elsewhere without effort you haven’t been censored. If on the other hand you can’t take your speech elsewhere easily because someone is actively suppressing your speech, that’s censorship.
Do you also apply the logic behind your definition of censorship to other acts in life? If someone walks behind you, they must be a stalker? A surgeon making an incision is a knife-wielding murderer? A married couple doing some loving is rape?
If not, why are you specifically ignoring the intent and context behind a moderation/censorship act?
Re: Re: Re:8
“You have given answers that directly contradict your own argument”
I don’t believe that I have, but if you could point me at them I would be willing to try to clear up the confusion.
“Why are you specifically ignoring the intent and context behind a moderation/censorship act?”
I’m not. It’s just not relevant to whether that act is censorship — it informs whether the censorship is justified, instead.
To take from your own examples here, of a surgeon cutting to help someone vs. a murderer cutting someone to kill them, it is in either case still cutting someone open. No context or intent changes that. What it changes is whether you want someone to do it.
You grasp?
Re: Re: Re:9
It is, though. Expanding the notion of what counts as censorship (which comes loaded with negative sociopolitical connotations from the get-go) makes justifying the worst forms of it easier. It’s the kind of thinking that goes into “think of the children” laws/movements: We already censor CSAM because it’s literally a crime, so why not censor other content that could potentially harm children, especially if it’s sexual in nature? And before you know it, libraries are pulling children’s books that happen to mention gay people in non-sexual ways while Steam and Itch are yanking down problematic-yet-legal adult-oriented games.
If you want to walk for miles inside that pit of danger, go for it. You won’t find a lot of people here willing to follow you, but you can always walk alone.
Re: Re: Re:10
“It is, though. Expanding the notion of what counts as censorship (which comes loaded with negative sociopolitical connotations from the get-go) makes justifying the worst forms of it easier. It’s the kind of thinking that goes into “think of the children” laws/movements: We already censor CSAM because it’s literally a crime, so why not censor other content that could potentially harm children, especially if it’s sexual in nature? And before you know it, libraries are pulling children’s books that happen to mention gay people in non-sexual ways while Steam and Itch are yanking down problematic-yet-legal adult-oriented games.”
And yet, all that happened anyway. It’s almost as if you’re attributing a false cause!
Re: Re: Re:11
Yes, and who do you think was responsible for that—people with a narrow definition and largely negative opinion of censorship, or people who think censorship in the name of a cause like “think of the children” is an inherently good thing?
Re: Re: Re:12
I think it’s entirely orthogonal to that, actually. Quite frankly, lots of people oppose censorship right up until their own ox is gored, at which point they’ll be fully in favour of shooting everyone involved. Whatever arguments are made beforehand are irrelevant.
Re: Re: Re:13
Jesse, what the fuck are you talking about
Re: Re: Re:14
That huge numbers of ‘free speech’ supporters are that only up until they encounter speech they dislike, at which point they will happily censor it without care or concern or thought as to principle.
Do you have trouble understanding any kind of rhetorical device other than the loaded question or something? You seem to have trouble with metaphor, simile, and hyperbole.
Re: Re: Re:15
I don’t. I have trouble understanding people when they use weird metaphors and their phrasing is fucked up besides.
And yes, I’m aware that some so-called supporters of free speech only support free speech until they get to speech that gives them the ick. I’m not one of those people. I have maintained that people whose speech I disagree with still deserve the right to express that speech. I also maintain that people who don’t want to listen to/host that speech shouldn’t be forced into doing that by people who believe in the imagined right of “free reach”. Losing the privilege to exercise your rights on property you don’t own isn’t the same thing as being denied the ability to exercise your rights anywhere else.
Re: Re: Re:16
That’s fine and all but we weren’t talking about you just now, were we? We were talking about the people who were getting big mad about porn.
Re: Re: Re:
Nope.
You just want to censor people.
Literally, you want to decide who gets to voice which opinion.
Too bad, you don’t get to do so, at least in the US.
I notice you refuse to talk about the UK’s Online Safety Act, you coward.
Re:
I don’t and have never wanted to. I don’t know if you’ve noticed, but I’ve spent decades explaining how censorship fails.
I literally don’t and never have. I know you like to claim I do because you don’t understand how trust & safety actually works, but that’s part of the reason I spend so much time explaining it to people so they understand what it does. Too many people falsely think that’s what it’s about.
No one wants to decide who gets to voice which opinion.
Again, I’ve spent years defending the 1st Amendment and continue to do so when BOTH parties seek to remove it. You, if you are who I think you are, regularly defend the GOP when it seeks to censor voices. I’ve never supported any type of censorship.
I mean, what? I’ve talked about it for years? I spoke about it a couple weeks ago on the podcast, and I’ve been working on an omnibus story about all the stories from the past couple of weeks about how the new age verification requirements are a complete and total disaster leading to aggressive censorship.
Why do you always lie?
Re: Re:
I’m sorry to sound stupid but if censorship fails, why are the censors currently destroying the internet? I’d call that a censorship success.
Re: Re: Re:
In what way are “the censors” destroying the internet?
Re:
What opinions and who are you referring to?
Unless you are Trump or Musk of course, but since they are your heroes nothing they do can be bad, amirite bro? Just ignore the blatant bribery.
How does it feel being ignorant and living under a rock? At least its density matches what’s in your head.
Why don’t you actually get a life Matty, being a stupid asshole on the internet isn’t for you.
Re: Re:
oh, you know the ones
Re:
This one just gets another Nope.
How nice of you to boldly assert such bullshit.
I find censorial scolds to be perfectly acceptable moderators, but not corporate lackeys.
It feels like really bad places like X, Meta’s social platforms, and other spots are surviving, and surviving well, without Trust & Safety. It doesn’t seem like they actually need Trust & Safety to survive.
Re:
A body kept on life support while brain dead is surviving. It doesn’t mean that body is going to be useful after it is no longer artificially propped up.
Re: Re:
At least a brain-dead body is stuck in bed and can’t hurt people. X and Meta’s platforms are home to many a bad actor who uses said platforms to turn hateful rhetoric and lies into real-world harm.