Content Moderation At Scale Is Impossible: The Case Of YouTube And 'Hacking' Videos

from the how-do-you-deal-with-this? dept

Last week there was a bit of an uproar about YouTube supposedly implementing a “new” policy that banned “hacking” videos on its platform. It came to light when Kody Kinzie of Hacker Interchange tweeted about YouTube blocking an educational video he had made about launching fireworks via WiFi:

Kinzie noted that YouTube’s rules on “Harmful or dangerous content” now listed the following as an example of what kind of content not to post:

Instructional hacking and phishing: Showing users how to bypass secure computer systems or steal user credentials and personal data.

This resulted in some quite reasonable anger at what appeared to be a pretty dumb policy. Marcus “MalwareTech” Hutchins posted a detailed blog post on this change and why it was problematic, noting that it simply reinforces the misleading idea that all “hacking is bad.”

Computer science/security professor J. Alex Halderman chimed in as well, to highlight how important it is for security experts to learn how attackers think and function:

Of course, some noted that while this change to YouTube’s description of “dangerous content” appeared to date back to April, there were complaints about YouTube targeting “hacking” videos last year as well.

Eventually, YouTube responded to all of this and noted a few things: First, and most importantly, the removal of Kinzie’s videos was a mistake, and the videos have been restored. Second, this wasn’t a “new” policy, but rather just the company adding some “examples” to existing policy.

This raises a few different points. Some will say that since this was just another moderation mistake, it’s a non-story; in fact, it remains an important illustration of the impossibility of content moderation at scale. You can certainly understand why someone might decide that videos explaining how to “bypass secure computer systems or steal user credentials and personal data” would be bad and potentially dangerous — and you can understand the thinking that says “ban it.” On top of that, you can see how a less sophisticated reviewer might fail to distinguish between “bypassing secure computer systems” and a fun hacking project like “launching fireworks over WiFi.”
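A quick back-of-the-envelope sketch makes the scale problem concrete. The numbers below are illustrative assumptions, not YouTube's actual figures, but they show why even a review process that is right 99.9% of the time still generates a steady stream of visible mistakes:

```python
# Back-of-the-envelope: why even very accurate moderation produces
# constant mistakes at scale. Both inputs are assumptions for
# illustration, not actual YouTube figures.

uploads_per_day = 500 * 60 * 24   # assume ~500 videos uploaded per minute
accuracy = 0.999                  # assume reviewers get 99.9% of calls right

wrong_calls_per_day = uploads_per_day * (1 - accuracy)

print(f"Uploads per day:     {uploads_per_day:,}")
print(f"Wrong calls per day: {wrong_calls_per_day:,.0f}")
```

Under these assumed numbers, that's roughly 720 wrong calls every single day — each one a wrongly removed (or wrongly kept) video, and each a potential public controversy like the Kinzie takedown.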

But it also demonstrates that there are different needs for different users — and having a single, centralized organization making all the decisions about what’s “good” and what’s “bad,” is inherently a problem. Going back to Hutchins’ and Halderman’s points above, even if the Kinzie video was taken down by mistake, and even if the policy is really supposed to be focused on nefarious hacking techniques, there is still value for security researchers and security professionals to be able to keep on top of what more nefarious hackers are up to.

This is not all that different from the debate over “terrorist content” online — where many are demanding that it be taken down immediately. And, conceptually, you can understand why. But when we look at the actual impact of that decision, we find that removing such content appears to make it harder to stop actual terrorist activity, because that activity becomes harder to track.

There is no easy solution here. Some people seem to think that there must be some magic wand that can be waved that says, “leave up the bad content for good people with good intentions to use to stop that bad behavior, but block it from the bad people who want to do bad things.” But… that’s not really possible. Yet, if we’re increasingly demanding that these centralized platforms rid the world of “bad” content, at the very least we owe it to ourselves to look to see if that set of decisions has some negative consequences — perhaps even worse than just letting that content stay up.

Companies: youtube


Comments on “Content Moderation At Scale Is Impossible: The Case Of YouTube And 'Hacking' Videos”

67 Comments
Mason Wheeler (profile) says:

Wow, out of all of the companies that could possibly fail to understand Kerckhoffs’s Principle, you really wouldn’t think Google would be one of them!

For those who aren’t familiar with computer security, Kerckhoffs’s Principle is one of the most counterintuitive, yet most important, fundamental principles of the entire field: "the enemy knows the system." It means that any discussion of information security is not valid unless it begins with the ground-level assumption that the bad guys already know every detail of how your system works — and therefore, if you aren’t secure even with that knowledge being out there, you aren’t secure, period.

So what does that mean in this context? It means the bad guys already know about hacking–and probably not from YouTube. Taking hacking information off YouTube isn’t going to shut down any attacks. What it will do, as Professor Halderman pointed out, is exactly what relying on secrecy and obscurity in defiance of Kerckhoffs’s Principle always accomplishes: it makes it more difficult for the good guys to level the playing field.

Cliff Stoll made the same basic point in his classic book The Cuckoo’s Egg, which described in detail the techniques that a hacker used to break into his computer network and several others: he didn’t feel any qualms about publishing this information because the "people in the black hats" already know this stuff, and teaching everyone else about it allows them to be better informed and better able to defend against such attacks.

Frankly, in light of Kerckhoffs’s Principle alone this decision doesn’t add up, and it looks even worse when you consider research suggesting that approximately 89% of people are basically honest. Any intelligent admin who knows there are somewhere in the neighborhood of 8 good guys for every bad guy would want to do everything possible to recruit and empower them, rather than keep them in the dark!

Chicken Dinner Road says:

If impossible, cut them down to size. No right to exist at all.

Since by your notions, mere users don’t have any right to use the platform (during good behavior by common law terms), then you can’t argue that you’re for The Public. So X that basis out…

Then you’re just as always arguing for corporate profits with ZERO responsibility.

Again round on this! Can’t you come up with any topic NOT blatantly pro-corporation propaganda?

NO, because you’re paid by Silicon Valley capital and Evil Central to spew this view:

https://copia.is/wp-content/uploads/2015/06/sponsors.png

Anonymous Coward says:

Re: Re:

If impossible, cut them down to size. No right to exist at all.

There are lots of things that are impossible. That doesn’t mean we should ban those things from existence. Human flight was once impossible; do you want to ban that too?

Since by your notions, mere users don’t have any right to use the platform (during good behavior by common law terms)

Rights vs. privileges, learn the difference. No, no one has the "right" to use platforms, but they do have the "privilege". However, like a young child with a toy, that privilege can be taken away if they misbehave. Rights cannot.

then you can’t argue that you’re for The Public

The public has a right to free speech, free from government oppression. They do NOT have a right to use any private platform they want to while violating the rules established by said platform.

Then you’re just as always arguing for corporate profits with ZERO responsibility.

And you still don’t understand freedom of speech does not mean guaranteed access to online platforms.

Again round on this!

I completely agree.

Can’t you come up with any topic NOT blatantly pro-corporation propaganda?

Can’t you come up with something other than sheer idiocy and conspiracy theories?

NO, because you’re paid by Silicon Valley capital and Evil Central to spew this view:

No he’s not and you continuing to link to that image doesn’t make it true, no matter how much you want it to.

Gary (profile) says:

Re: If impossible, cut them down to size. No right to exist at a

(during good behavior by common law terms)

Couldn’t pass that one up – What law are they violating again? Please cite!

And to be clear – are you saying:
"Google is breaking the law"
or
"Corporations – ANY corporations – should be banned under the premise of Cabbage Law."

Go on – I really want to hear this one!

Eric T says:

Irony, I guess Google Project Zero won't post to YouTube

Seriously, Google’s Project Zero disclosure policy has caused quite a few headaches by releasing information before patches. Some disclosures I agree need to be released; some should be delayed a bit so a patch can be QC’d before release; and their own extension-request policy has been the source of most of the issues. If you are not familiar: https://googleprojectzero.blogspot.com/

Now I guess if someone else releases a disclosure on Google’s platforms they will be banned, but not if you are in their own security department?

sumgai (profile) says:

You gotta hand it to Alphabet/Google/YouTube, they’re just reading from the same page in the same book as the SESTA/FOSTA protagonists (gubbermint) – "if you can see it, you can make it go away by making sure that no one else can see it". We’re all familiar with how that’s working out, right?

Similar to that are the gun control activists (my inner warrior wants to use a much more derisive adjective here) who believe that if you can’t buy a gun, then you can’t commit a crime. I’m not even gonna ask for a break here, I’m just gonna go find some cleaner air space that doesn’t contain so much wasted oxygen.

But there is one saying in the pro-gun community that should be co-opted into the computer security field, and that is: "An armed society is a polite society." I’d express it thus: "A knowledgeable computer owner owns a safe computer". That goes for everyone from individuals all the way up to the top of the ladder. And it’s the very bottom-most underpinning of my personal computing philosophy: I practice safe hex. No one else can do that for me, it’s my fault if I get taken down/out, and no one else’s.

Ya know, after a few moments in review (before hitting Submit), I’ve come to realize something…. why is it that we can (have to) have all kinds of oversight, which I read as Big Brother-ism, and yet when things go wrong, we can’t sue those entities, public or private, that "promised" we’d be safe if we just follow their instructions? Seems like a lop-sided way of doing things, eh?

sumgai

Wendy Cockcroft (profile) says:

Re: Re:

"An armed society is a polite society."

If that were true the gun-related body count would be a hell of a lot lower.

You can’t just stroll into Tesco and buy a gun over here in the UK and we are generally a polite society.

US gun deaths this year alone: https://www.gunviolencearchive.org/reports/number-of-gun-deaths

UK gun deaths 2017-2018: https://www.ons.gov.uk/peoplepopulationandcommunity/crimeandjustice/bulletins/crimeinenglandandwales/yearendingdecember2018#offences-involving-knives-or-sharp-instruments-are-still-rising-while-firearms-offences-decrease

Even allowing for population differences, access to firearms is the issue. It’s much easier to defend yourself against a knife than a gun:
https://en.wikipedia.org/wiki/Wolverhampton_machete_attack

Scary Devil Monastery (profile) says:

Re: Re: Re:

"If that were true the gun-related body count would be a hell of a lot lower. You can’t just stroll into Tesco and buy a gun over here in the UK and we are generally a polite society."

I keep saying both sides of the gun vs gun control debate are a bit wrong in their basic assumptions…

Switzerland and Sweden both have far more heavy-duty guns (assault rifles and hunting rifles respectively) than the US does, per capita. Yet we rank very low on gun-related murder.

Mexico City has some of the most draconian gun control laws in the world, and Washington, D.C. has the most rigorous gun control law in the US. In both cases these places stand out when it comes to gun-related murder.

I think you’ll find that the best correlation to murder isn’t the prevalence of guns, but the state of mental health in society. In the UK it’s difficult to be born into hopelessness. In the US, if you’re born in, say, Flint, odds are good you were literally born to lose. With much of the population in a fortress mentality toward some other part of the citizenry, the gun simply becomes a more convenient killing/defense tool.

Add to that the mythology in the US of the gun being the "great equalizer" and other, similar catchphrases and you end up with large parts of the citizenry being not only aggrieved but convinced that holding a Glock will at least ensure they get a modicum of respect.

The entire debate of gun control vs the right to bear arms is irrelevant when the background situation is that of a low-intensity war.

John Snape (profile) says:

Wait a second...

Just last week I was told that YouTube can host or not host whatever it does or does not want to, and I had to just shut up and take it. Complaining was forbidden! If I didn’t like it, I could take the half a trillion dollars I have lying around and go build my own video hosting platform.

Now there’s a whole article on Techdirt bemoaning the fact that YouTube is doing what so many last week said they have every right to do.

So which is it?

Roy Rogers says:

Re: Re: Wait a second...

There was no problem hosting Crowder until Maza whined.

Their Company. Their way. The end.

John Snape, Techdirt wasn’t saying that; your comment needs to be directed to the peanut gallery. They are the culprits you seek.

Techdirt is saying what it says in this article
"…at the very least we owe it to ourselves to look to see if that set of decisions has some negative consequences — perhaps even worse than just letting that content stay up."

Same as it did in the post about Backpage very recently.

Techdirt has always said this, to the best of my knowledge.

Let’s go, Trigger. The bigots are coming down the slope.

Bruce C. says:

Re: Wait a second...

Agreed. If you accept the premise that Youtube has full control over the content it allows on its platform, there is no such thing as a "bad call". They can ban whatever the hell they want.

If you don’t accept that premise, then you are accepting that there has to be some check on Youtube’s ability to police its own site. And if you accept that premise, it’s better to have a formal process than to just rely on social pressure from influential users to dictate what is "proper".

Bruce C. says:

Re: Re: Wait a second...

Edit: If Youtube had an appeal process that actually worked for the average user, that could suffice. But that’s impossible, because it just kicks the "content moderation at scale" can down the road: eventually the bad actors will just start appealing everything and overwhelm the appeals process.

Anonymous Coward says:

Re: Re: Re: Re:

And trying to moderate all the content on their site with 100% accuracy is no different; it’s completely overwhelming. Hence the point of this article.

So tell me, why is moderating billions of users who make multiple posts a day somehow magically possible, but an appeals process for far fewer DMCA requests is completely overwhelmed and impossible? Hm?

Scary Devil Monastery (profile) says:

Re: Re: Re:2 Re:

" So what? They took it down, they find it easier than trying to determine the truth. Their company. The end"

Yeah, but the problem with the specific situation you responded to is that the legal background renders YouTube liable if any of the takedowns they don’t respond to turn out to be real.

So they are basically intimidated into playing by rules their own company wouldn’t necessarily sanction.

That One Guy (profile) says:

Re: Re: 'Can do' does not always mean 'smart to do so'

Agreed. If you accept the premise that Youtube has full control over the content it allows on its platform, there is no such thing as a "bad call". they can ban whatever the hell they want.

Just because someone has the ability to smash their hand with a hammer does not mean it wouldn’t be a ‘bad call’ for them to do so.

Anonymous Coward says:

Re: Re: Re:

Agreed. If you accept the premise that Youtube has full control over the content it allows on its platform, there is no such thing as a "bad call".

That’s actually not how that works. Chess players are well within their rights to play however they want. That doesn’t change the fact that deliberately putting your king into a checkmate is a bad call. The same applies here: they are within their rights to do it, but that doesn’t mean it isn’t a bad call, or that people can’t criticize them for it. I mean, come on, you’re criticizing them for booting off Nazis, the scum of the earth, so why is it suddenly hypocritical for someone else to do the same thing you are?

If you don’t accept that premise, then you are accepting that there has to be some check on Youtube’s ability to police its own site.

No, that’s not how it works. Just because they have the right to police their site how they want doesn’t mean that everyone has to agree with it. You are saying that everyone’s speech (online or offline) should be policed just because you don’t particularly agree with them.

And if you accept that premise, it’s better to have a formal process than to just rely on social pressure from influential users to dictate what is "proper".

No, it’s really not. And the reason why is because that formal process by definition is a violation of the First Amendment. As soon as the government starts dictating what speech is or is not allowed, it runs flat into the First Amendment. Why is it so hard for you and your ilk to understand this despite being told innumerable times?

JMT (profile) says:

Re: Re: Wait a second...

"If you accept the premise that Youtube has full control over the content it allows on its platform, there is no such thing as a "bad call". they can ban whatever the hell they want."

That is completely false. The point of the article is that some of YouTube’s actions are arguably bad for society as a whole, such as making it harder to educate the public on how to protect themselves against hacking. Just because they’re entitled to do something doesn’t mean they can’t be legitimately criticized for it. That makes the rest of your comment equally false.

PaulT (profile) says:

Re: Re: Wait a second...

"If you don’t accept that premise, then you are accepting that there has to be some check on Youtube’s ability to police its own site"

No, it just means that one private group can exercise its freedom of speech to criticise the way another private group chooses to exercise theirs.

"it’s better to have a formal process "

Yes, which is why people are criticising the messy and opaque process YouTube currently have in place. That doesn’t mean that people accept that the only alternative is for the government to come in and prevent them from moderating their platform. Stop with that false premise.

Anonymous Coward says:

Re: Re:

Well if you could stop misrepresenting and lying about what was said you might have a point. Since you can’t, you don’t.

No one said any of what you claimed they did. What you want is to have the government FORCE companies to do something. We’re not wanting to force them to do anything, but we are pointing out that this particular policy is a fairly bad idea, but they are still within their rights to disagree with us and move forward with it anyway.

Also, note that the article is about how it’s impossible to do what you want it to do and as an example brings up this fact where the hacking videos were removed BY MISTAKE. So you’re taking an example that unequivocally proves you wrong and arguing that somehow we’re being hypocrites. Right.

PaulT (profile) says:

Re: Wait a second...

Both. YouTube should not be forced by government agents to host certain content, but if they privately choose to block content in ways that we believe is not the correct way we can still criticise them for making the wrong decision.

Life is far easier to deal with if you stop trying to apply false dichotomies to everything and learn to deal with shades of grey.

Bruce C. says:

But it also demonstrates that there are different needs for different users — and having a single, centralized organization making all the decisions about what’s "good" and what’s "bad," is inherently a problem.

Regardless of how it’s implemented, the platform owner will always be a single, centralized organization making all the final decisions about what’s "good" and what’s "bad".

I suppose you could argue that a user could "go find another platform", so it’s not really centralized. But again, that argument applies for extremist videos as much as it applies for educational hacking.

Anonymous Coward says:

Re: Re:

Regardless of how it’s implemented, the platform owner will always be a single, centralized organization making all the final decisions about what’s "good" and what’s "bad".

And that’s your first mistake. They aren’t making final decisions about what’s "good" or "bad". They are making decisions about what they do and do not want to allow on their platforms. A Star Wars fan site has every right to ban all fan discussions about Star Trek. This is no different, just on a larger, more inclusive scale.

I suppose you could argue that a user could "go find another platform", so it’s not really centralized.

And that’s your second mistake. Facebook, Google, Twitter, etc… are not the internet. The internet by design is decentralized. It’s impossible for ANY single company to control everything that goes on online. So yes, a user can just go find another platform, or they can make their own. You can create a free blog in 10 minutes, or with maybe a day or two’s worth of work, you can stand up your own Mastodon instance and create your own social media platform.

But again, that argument applies for extremist videos as much as it applies for educational hacking.

Yes it does, but nothing says we can’t criticize Youtube for a decision we don’t agree with. You want to take that decision away from them.

PaulT (profile) says:

Re: Re:

"But again, that argument applies for extremist videos as much as it applies for educational hacking."

It does. Then, the people who don’t want to host extremist videos are free to refuse them, while authorities have a nice place to go and see who is saying what without monitoring thousands of hours of non-extremist content every day to find the potential terrorists since they’ve already self-identified. Win/win.

Anonymous Coward says:

YouTube’s new policy will do nothing to stop bad guys, but it will definitely make it harder for the public to learn about security.

Plenty of recent decisions – and plenty of less recent decisions – are aimed at making the public complacent and ignorant about security.

How else are the NSA going to get their backdoors without the public throwing a well-deserved shitfit?

Zof (profile) says:

YouTube makes lots of "mistakes" now and they are fooling no one

There’s so much data showing the reality distortion YouTube does in the USA for the big media outlets that aren’t Fox. A CNN story that nobody views and might get 500 likes will trend when PewDiePie doesn’t. The major media outlets that aren’t Fox (fox never trends on youtube now) control YouTube now for Google.

Stephen T. Stone (profile) says:

Re:

fox never trends on youtube now

Hi! You made an absolutist statement wherein one example can prove you wrong. Here’s that one example. (It currently sits at #40 on Trending, but still, it’s there.)

There’s so much data showing the reality distortion YouTube does in the USA for the big media outlets that aren’t Fox.

By all means, share it.

Anonymous Coward says:

Re: Re:

There’s so much data showing the reality distortion YouTube does in the USA for the big media outlets

And since there’s so much data I’m sure you can provide some links to it and aren’t once again lying. RIGHT?

A CNN story that nobody views and might get 500 likes will trend when PewDiePie doesn’t.

Videos don’t trend if they aren’t viewed. That’s kind of the definition of "trending", is that people are watching them. So please, do explain how a video that "nobody views" gets enough views to start trending. I’ll wait.

The major media outlets that aren’t Fox….control YouTube now for Google.

You want to explain that a bit better? How exactly do they control them? Hm? Have they been given server admin access? Direct access to the website code? Please, do explain EXACTLY how they control Youtube.

(fox never trends on youtube now)

As Stephen points out, making absolutist statements is risky business. One example to the contrary and your entire argument goes up in smoke. Oh hey, look, smoke!

PaulT (profile) says:

Re: Re: Re:

"So please, do explain how a video that "nobody views" gets enough views to start trending. I’ll wait."

I assume he’ll just say that YouTube are making the numbers up, because we all know they get paid massively for ads that nobody clicks on or something.

"You want to explain that a bit better?"

Fox fans are told that they are simultaneously the most popular and most trustworthy news source, while they also rail against "mainstream media" for putting down the underdog. Anyone with the ability to believe all of this at the same time must get very confused when faced with the outside world. You can usually tell, because they’ll start ranting about CNN, even though most people who don’t watch Fox also don’t watch that.

Sadly, this seems to be driving them to bigger fiction writers and more extreme content than realising they’re being lied to.

Anonymous Coward says:

Re: Re:

Becoming "so pervasive" does not make it a public forum. Being run by the government makes it a public forum. As far as I’m aware, the government doesn’t own or run Twitter. It is a private company, which makes it NOT a public forum and NOT subject to First Amendment restrictions.

Also, I recommend taking a class in English. Your spelling and grammar are atrocious.

Gerald Robinson (profile) says:

Terrible Ide

Well pervasive sloped by me, I could be a liberal and blame my spelling corrector/auto-complete, but I’ll be an adult and accept responsibility. Prevailant is the right term. If you get 89% of the attention are you a platform or a public form? Address the issue not your goofy prescriptions ( I only have mature cincerns!).

Anonymous Coward says:

Re: Terrible Ide

I wasn’t referring to your choice of words, I was referring to the fact that you can’t spell them or use proper grammar.

Prevailant is the right term.

Pervasive or "prevailant" (I’m assuming you meant "prevalent" here, since "prevailant" isn’t a word) makes no difference, it’s still not owned/run by the government so it’s still not a public forum. It is literally irrelevant how popular or how widely used it is.

If you get 89% of the attention are you a platform or a public form?

It depends on if you are owned/run by the government. If you are owned/run by the government then you are a public forum (which is the correct term, not form). If you aren’t run by the government then you are a private platform/forum/form, no matter how "prevalent" and widely used you are.

Address the issue

I did.

not your goofy prescriptions

My doctor says I’m in good health and has not prescribed me any medications.

( I only have mature cincerns!)

I assume you meant to say "concerns"? Again, I recommend an English class so that you can learn to spell properly.

Thad (profile) says:

Re: Terrible Ide

Techdirt has a number of posts under the public forum tag that answer your question; you’ll have an easier time finding information on what a public forum is if you learn how to spell it correctly.

The short answer is, a privately-owned platform is not a public forum. It can become what’s called a limited-purpose public forum, if public officials use it for official business, but that does not mean the entire platform is a public forum. No matter how big its userbase.

You appear to be combining the ill-informed "public forum" talking point with the ill-informed "publisher, not a platform" talking point. Techdirt has covered that at considerable length too; here are two recent posts to get you started:

Once More With Feeling: There Is No Legal Distinction Between A ‘Platform’ And A ‘Publisher’

Explainer: How Letting Platforms Decide What Content To Facilitate Is What Makes Section 230 Work

Anonymous Coward says:

I am trying to rad a an issue. That, unlike newspapers some of the big guys: Facebook et al, YouTube, Twitter, are so prevalent and dominate to such a degree that they are in fact public forms!

Looking at the EUs’ massive failure enforces what a bad idea moderation is. Their moderation requirements have driven almost all the smaller platforms out leaving only the very big guys that the essentially anti-American regs were apparently aimed at.

It can be argued that a lot of user contributions are junk, disinformation, or spam. But I find that user reviews are useful if taken with a grain of salt. If size matters and fairness is the goal then very limited or no moderation is the answer. If we are to allow platforms as private enterprise then "fairness" and good moderation requirements become silly!

Anonymous Coward says:

Re: Re:

Facebook et al, YouTube, Twitter, are so prevalent and dominate to such a degree that they are in fact public forms!

Incorrect. The only thing that makes a forum public is whether or not it is owned/operated by the government. Facebook, Youtube, Twitter, et al are not, therefore they are not public forums.

Looking at the EUs’ massive failure enforces what a bad idea moderation is.

No, the EU’s massive failures show what a bad idea government interference in free speech is. Moderation is a form of free speech.

Their moderation requirements

You mean free speech restrictions and taxes.

It can be argued that a lot of user contributions are junk, disinformation, or spam.

Way to just dismiss the value of the entire human race. No wonder the world doesn’t like your view.

If size matters

It doesn’t.

fairness is the goal

Depends on your idea of fair. Is it fair to tell people what they can and cannot say on their own property? If you think so then no, fairness is not the goal.

then very limited or no moderation is the answer

Well since that’s NOT the goal and fairness is actually letting companies decide what kind of experience they want their users to have and moderating to achieve that, then moderation IS the answer.

If we are to allow platforms as private enterprise

And the alternative is….?

then "fairness" and good moderation requirements become silly!

The only thing silly here is your lack of understanding that moderation is part of freedom of speech and is the ONLY way to keep platforms from becoming a cesspool of vile and disgusting thoughts and ideas cluttering up the space.

Gerald Robinson (profile) says:

I understand moderation as a type of free speech. The public forum came about because the government was restraining free speech in public places, shutting it down.

My contention is that some of these platforms have become so large that they are like the public spaces of old. Denying access in fact denies the right of free speech. This is just like how, in the distant past, denying access to the town square denied me the ability to express my views.

I'd like to hear some substantive comments, not just reiterations of the facts. Do we need to change public policy, and should we? The EU demonstrates some of the problems.

"Way to just dismiss the value of the entire human race. No wonder the world doesn’t like your view."

So now you say there are no bots, no SPAM, … and every comment is gold. Then we obviously don't need moderation and should ban it!

You can't or don't read very well! My comment was "…if fairness is the goal." But you say it has no place! I agree: how do we define 'fairness'? I don't believe it's possible, just like defining hate speech, etc.

"…ONLY way to keep platforms from becoming a cesspool of vile and disgusting thoughts and ideas cluttering up the space." So now you are for censorship!

It appears that we have two choices:
First, accept the "…cesspool of vile and disgusting thoughts and ideas cluttering up the space" and learn to deal with it, OR
Second, accept arbitrary, capricious censorship as free speech (today's situation).

Neither is really that good, but I agree that moderation as free speech is the lesser of two evils. In this case we need to strengthen 230 and intermediary liability! The persecution of Backpage is a good reason why! We need to ban foreign governments and entities from interfering with American free speech, as the EU and Indian governments have recently done. We need to either embrace 230 or throw it out, not waffle like we are doing now!

Anonymous Coward says:

Re: Re:

This has been explained to you many times before but apparently it has not sunk in:

A space/forum is only public, in regards to the First Amendment, when it is owned and/or operated by the government. A privately owned social media platform is not and never will be.

The public forum doctrine came about because of the government restraining free speech in public places, shutting it down

Yes, government owned public spaces, like a town hall, town square, etc… A privately owned convention center or amphitheater (or social media platform) is not the same thing.

My contention is that some of these platforms have become so large that they are like the public spaces of old.

You can contend all you like but that won’t make it true. Size has absolutely nothing to do with whether the space is "public" or not, with regards to freedom of speech and the First Amendment.

Denying access in fact denies the right of free speech.

Only if owned/operated by the government. I can, for example, host a block party in my backyard and have you removed by force for spouting your nonsense if I so choose.

This is just like, in the distant past denying access to the town square denied me the ability to express my views.

Again, the town square is owned/operated by the government. Social media platforms are not.

I'd like to hear some substantive comments, not just reiterations of the facts.

You have; you just apparently don't like them. Not to mention that the facts say you are wrong. Facts are objective; comments not based in facts are subjective. Are you implying that you don't like the facts and want more people to join you in denying reality?

Do we need to change public policy and should we?

No and no, for reasons obvious to everyone but you.

The EU demonstrates some of the problems.

Yes, they are an excellent example of what happens when the government tries to restrict the freedom of speech of people and companies, including social media. Let’s not do that here ok?

So now you say there are no bots, no SPAM, … and every comment is gold.

Let me set fire to that strawman you've constructed there. The original statement to which I wrote that reply says the following (emphasis mine):

It can be argued that a lot of user contributions are junk, disinformation, or spam.

Given that, I took his comment to mean that these submissions are being made by actual human beings voicing their opinions, statements, points of view, content, etc., NOT auto-generated content from bots, which are generally not considered users. I did not address, nor imply, whether some content was generated by automated means, or the veracity of said content. Do not put words in my mouth.

You can’t or don’t read very well!

That’s funny coming from you, since I just got done explaining how you didn’t properly read the context of my comment.

My comment was "…If fairness is the goal". But you say it has no place!

You know, lying about what I said just one comment up is not a good look. I said it depends on what you consider to be fair, then gave two example scenarios, clarifying that.

I agree, how do we define ‘fairness’. I don’t believe it’s possible. Just like defining hate speech etc.

Well then, if you can't define it, it shouldn't be made into law, should it? If you think it is similar to hate speech, then you should be in favor of not making laws dictating what is or is not "fair"; instead it should be left up to the platforms and users to decide. Hm?

So now you are for censorship!

Moderation is not censorship, and nowhere did I state that.

It appears that we have two choices:
First, accept the "…cesspool of vile and disgusting thoughts and ideas cluttering up the space" and learn to deal with it

Newsflash, moderation is one method of dealing with it.

Second, accept arbitrary, capricious censorship as free speech (today's situation).

I'm sorry, I wasn't aware the government was already telling me what I can and cannot say online, since that is the definition of censorship. Private content moderation is not censorship, no matter how many times you lie about it. It's also not arbitrary and capricious. The terms of use are clearly laid out, and you have to click "I agree to abide by these terms" when you sign up to use social media. Since you agreed to abide by the rules, you have no right to complain when you get booted off for breaking them.

Neither is really that good

Only because you don’t understand what you are talking about.

I agree that moderation as free speech is the lesser of two evils.

It's not an evil; it IS as much free speech as me stating my opinion. And if you agree, then why are you arguing against it?

In this case we need to strengthen 230 and intermediary liability!

Those things are in most cases mutually exclusive. 230 says platforms shouldn't be blamed for the actions of their users. Strengthening that means you can't hold them responsible in intermediary liability situations. You need to go back and re-educate yourself on these laws, because it is painfully obvious you don't understand any of this.

The persecution of Backpage is a good reason why!

It was shown that Backpage was actually catering, or at least turning a blind eye, to sex trafficking, which is illegal. That's why it got taken down. If it had stepped up better, it would still be around.

We need to ban foreign governments and entities from interfering with American free speech

This is an absolutely stupid and moronic statement. We need to do no such thing, since foreign countries CAN'T interfere. Americans aren't subject to foreign laws, so aside from sending cops and troops over here (which would be an invasion and an act of war), how the hell would they even do this in the first place? You really have no concept of how the world works.

as the EU and Indian governments have recently done

[Citation needed.]

We need to either embrace 230

Duh, we already have. That's why it's a law and courts have upheld it. You started this post arguing against 230; now you are for it. Pick one.

or throw it out, not waffle like we are doing now!

Then tell the idiots in government to stop being morons and technologically illiterate Luddites and STOP trying to undo 230 protections.
