Gizmodo: Why Can't YouTube Do 'Good' Content Moderation? Answer: Because It's Fucking Impossible

from the super-fucking-impossible dept

We’ve had something of a long-running series of posts on the topic of content moderation, with our stance generally being that any attempt to do this at scale is laughably difficult. Like, to the point of being functionally impossible. This becomes all the more difficult when the content in question is not universally considered objectionable.

Tech firms tend to find themselves in the most trouble when they try to bow to this demand for content moderation, rather than simply declaring it to be impossible and moving on. The largest platforms have found themselves in this mess, namely Facebook and YouTube. YouTube, for instance, has released new moderation policies over the past two months or so that seek to give it broad powers to eliminate content that it deems to be hate speech, or speech centered on demographic supremacy. Wanting to eliminate that sort of thing is understandable, even if you still think it’s problematic. Actually eliminating it at scale, in a way that doesn’t sweep up collateral damage and that garners wide support, is impossible.

Which makes it frustrating to read headlines like the one on Gizmodo’s recent piece about how YouTube is doing with all of this.

YouTube Said It Was Getting Serious About Hate Speech. Why Is It Still Full of Extremists?

Because it’s fucking impossible, that’s why. There is simply no world in which YouTube successfully eliminates all, or even the majority, of speech that some large group or another considers hate speech or “extreme.” That’s never going to happen. YouTube never should have suggested it would happen. The screw-up here is YouTube not properly setting the public’s expectations as to what its policy would achieve. Yeah, there is still a good deal of extremist content on YouTube. Whipping up anger at content that’s available at this moment is trivially easy.

Making it more frustrating is Gizmodo’s assertion, with a sinister connotation, that all of this is “part of YouTube’s plan.”

Strangely, this isn’t a simple oversight by YouTube’s parent company, Google. In fact, it’s the policy working as planned. YouTube hosts more than 23 million channels, making it impossible to identify each and every one that is involved with the hate movement—especially since one person’s unacceptable hate speech is another person’s reasonable argument. With that in mind, we used lists of organizations promoting hate from the Southern Poverty Law Center, Hope Not Hate, the Canadian Anti-Hate Network, and the Counter Extremism Project, in addition to channels recommended on the white supremacist forum Stormfront, to create a compendium of 226 extremist YouTube channels earlier this year.

While less than scientific (and suffering from a definite selection bias), this list of channels provided a hazy window to watch what YouTube’s promises to counteract hate looked like in practice. And since June 5th, just 31 channels from our list of more than 200 have been terminated for hate speech. (Eight others were either banned before this date or went offline for unspecified reasons.)

Before publishing this story, we shared our list with Google, which told us almost 60 percent of the channels on it have had at least one video removed, with more than 3,000 individual videos removed from them in total. The company also emphasized it was still ramping up enforcement. These numbers, however, suggest YouTube is aware of many of the hate speech issues concerning the remaining 187 channels—and has allowed them to stay active.

I would suggest that these numbers actually likely represent YouTube blocking too much content, rather than not enough. In a politically divided country like ours, getting some significant number of people to state that even a relatively innocuous video is “extreme” would be pretty easy. Add to that the fact that the selection bias mentioned above is way understated in this article, and the problem deepens. Layer on top of that the simple fact that some of the sources for this list of “extremist” content — namely the SPLC — have been caught quite recently being rather cavalier about the labels they throw around, and this whole experiment begins to look like bunk.

Making Gizmodo’s analysis all the worse is that it seems to complain that YouTube is only policing the content that appears on its platform, rather than banning all content from uploaders who take nefarious actions off of YouTube’s platform.

To understand why these channels continue to operate, it’s important to know how YouTube polices its platform. YouTube’s enforcement actions are largely confined to what happens directly on its website. There are some exceptions—like when a channel’s content is connected to outside criminality—but YouTube generally doesn’t consider the external behavior of a group or individual behind an account. It just determines whether a specific video violated a specific policy.

Heidi Beirich, who runs the Southern Poverty Law Center’s Intelligence Project, charges that YouTube’s approach puts it far behind peers like Facebook, which takes a more holistic view of who is allowed to post on its site, prohibiting hate groups and their leaders from using the social network.

“Because YouTube only deals with the content posted, it allows serious white supremacists like Richard Spencer and David Duke to keep content up,” Beirich said. “In general, our feeling is that YouTube has got to get serious about removing radicalizing materials given the impact these videos can have on young, white men.”

It’s an insane request. Because a person or group says some things that are obviously objectionable, we want their voices silenced on YouTube, even when the content there isn’t objectionable? That’s fairly antithetical to how our country operates. YouTube is of course not governed by the First Amendment and can take down whatever content it chooses, but the concept of free speech and the free exchange of ideas in America is much broader as an ideal than the specific prescriptions outlined in the Constitution. Silencing all potential speech from a party simply because some of that speech is objectionable is quite plainly un-American.

Gizmodo then complains about the inconsistencies in enforcing this impossible policy.

The apparent inconsistencies go on: The channel of South African neo-Nazi group AWB was terminated. Two others dedicated to violent Greek neo-Nazi party Golden Dawn remain active. The channel of white nationalist group American Identity Movement, famous for distributing fliers on college campuses, is still up. As is a channel for the white nationalist group VDARE. And, notably, none of the 33 channels on our list run by organizations designated by the Southern Poverty Law Center as anti-LGBTQ hate groups have been removed from the platform.

In addition to giving many hateful channels a pass, this agnosticism to uploaders’ motives means that some channels with no interest in promoting white supremacy have been punished as YouTube enforces its policies.

Unlike what Gizmodo — and even YouTube — says, this is a bug, not a feature. We cannot say this enough: there is no good way to do this. Frankly, save for criminal content, YouTube probably shouldn’t even be trying. Alternatively, if it does want to try, it probably would be more satisfying if YouTube’s public stance was something like: “We’ll block whatever we want, because we’re allowed to. If those blocks don’t seem to make sense to you, deal with it.” At least that would set the proper expectations with the public.

And then maybe there would be less consternation as to why YouTube hasn’t yet achieved the impossible. Impossible, in this case, being both doing content moderation at scale and simultaneously making everybody happy.

Companies: youtube


Comments on “Gizmodo: Why Can't YouTube Do 'Good' Content Moderation? Answer: Because It's Fucking Impossible”

76 Comments
Samuel Abram (profile) says:

This is a bad post

This post is a bad post. It is a unilateral rant which has absolutely none of the thoughtfulness of Mike Masnick. If speech doesn’t have consequences, how do you explain libel laws? The Christchurch shooter and the El Paso shooter were radicalized by hate speech from white supremacists like Donald Trump.

Once again, don’t be so callous like that famous Neil DeGrasse Tyson tweet.

ECA (profile) says:

Re: Re: This is a bad complaint

I add to this..
What is illegal in 1 area may not be in others..Bestiality isnt illegal in ALL 50 states..
Selling your Daughter, ISNT a bad thing. In some nations.
Customs and Many things are different when you look around..

Why are we trying to Impose our Concepts/ideals/customs/religion/Corporate rules and TONS of other Stuff, into others Worlds.
WE DONT RULE THE NET..
And the best place to LEARN about others, is to USE the internet..

Samuel Abram (profile) says:

Re: Re: This is a bad post

It’s because of this:

Tech firms tend to find themselves in the most trouble when they try to bow to this demand for content moderation, rather than simply declaring it to be impossible and moving on. The largest platforms have found themselves in this mess, namely Facebook and YouTube. YouTube, for instance, has released new moderation policies over the past two months or so that seek to give it broad powers to eliminate content that it deems to be hate speech, or speech centered on demographic supremacy. Wanting to eliminate that sort of thing is understandable, even if you still think it’s problematic.

And this:

Unlike what Gizmodo — and even YouTube — says, this is a bug, not a feature. We cannot say this enough: there is no good way to do this. Frankly, save for criminal content, YouTube probably shouldn’t even be trying. Alternatively, if it does want to try, it probably would be more satisfying if YouTube’s public stance was something like: "We’ll block whatever we want, because we’re allowed to. If those blocks don’t seem to make sense to you, deal with it." At least that would set the proper expectations with the public.

If the Social Media companies took that stance, they would be under a lot more pressure from both the public and their shareholders with people dropping them like flies for alternatives (not to mention the threat of legislation from all over the world). The social media companies don’t take the position to minimize hate speech because they want to but because they have to, and for good reason: many people would stop using their platforms if that were the case except for the white supremacists and trolls. I don’t think Twitter wants to become like Gab, for instance.

ECA (profile) says:

Re: Re: Re: This is a bad post

There is little reason for 90% of what is happening to the Big tech corps..
Trying to Nationally regulate the net is like trying to stop water from flowing through a Sieve..

The ideals of other nations is interesting and can TEACH us something..
For some Odd reasoning, its as if someone is trying to close our eyes to everything Else. I can understand a parent being abit Skeptical about the net..THEN DONT LET THEM ON THE NET… If you want to Control and protect your Child from seeing WHAT the world has and IS.. then get rid of the net.. Go watch cable TV. and protect your child from reality and what LIFE in the world is like..

Who Here has been to a REAL magazine store, there are not many left. selections of Mags from all over the place and they will import a few for other people. Then the ADULT sections.
Yep, you can read the military ones and the ones showing News and wars around the world. but you AINT supposed to see what your mother looks like naked. Nor what your parents Might be doing in the dark..

Anonymous Coward says:

Re: Re: This is a bad post

While white supremacism is both ethically bankrupt and morally repugnant, there is nothing illegal about being a white supremacist or any other kind of extremist. Their rants and tirades are constitutionally protected speech and the government (of any USA flavor) should do nothing to prevent them from speaking.

The only way to answer your questions is with pure opinion that carries no legal weight. There’s no law that should stop Trump and his lemmings from jumping off the cliff that is public opinion on Twitter or any other social media outlet. They have that right as do you and I to post our own screeds.

In closing: Fuck Trump, fuck white supremacists and fuck anyone who aligns themselves with either or both (not that the difference is clear). But also fuck anyone that fights to silence any of them.

Shufflepants (profile) says:

Re: Re: Re: This is a bad post

"there is nothing illegal about being a white supremacist"

Correct, but it’s not a protected class either. So, there’s nothing illegal about twitter banning a user for being a white supremacist. And people are arguing that maybe they should. This is not silencing them. They will not go to jail for saying these things. It will just be twitter deciding not to provide a platform for them to help spread their hateful ideology.

As always, there’s a relevant xkcd for this notion.
https://xkcd.com/1357/

Anonymous Coward says:

Re: Re: Re:2 This is a bad post

I said nothing to disagree with what you posted so I’m not sure that counts as a rebuttal. I may agree with you but the question of "should they be kicked off" becomes dangerous when used to whip people up into a hate frenzy even if that hate is directed at other haters.

Twitter and Facebook are free to kick off whoever they (dis)like, I just hope they choose to boot those I also dislike.

Scary Devil Monastery (profile) says:

Re: Re: This is a bad post

"If Donald Trump is a white supremacist"

He certainly does racial profiling and has found the white supremacy movement quite receptive to him. It’s been pretty well established that being black is a good way to get on his bad side.

He’s well established as a racist and bigot and has only tried to seriously hide that since he began campaigning for president – with mixed results.

Should he be removed from twitter?

I don’t think so. Depriving a racist of the ability to make people judge his speech and conduct is counterproductive to democracy.

Anonymous Coward says:

Re: This is a bad post

If speech doesn’t have consequences, how do you explain libel laws?

Not necessarily the same thing are they?

Some speech apparently does have consequence in a civil courtroom after having spent large quantities of cash upon legalese mouthpieces, however – not all speech is false claims causing damage to an individual.

For example, if you were to say that you hate this or that it is a statement of opinion, not a declaration of fact and therefore it in no way can be considered to be libel.

Gary (profile) says:

Content at Scale

Any site with more than a few hundred users is going to have the same problem.

But small sites can’t fight off the non-stop assaults by copyright lawyers. Only big sites like YouTube can afford to deal with user created content. We can’t go back to the early days where there were thousands and thousands of little sites hosting a wide range of content – we see stories every day about them being shut down by non-stop copyright demands.

Mark A. Line says:

Yes, as "Abrams" writes, BAD POST! Remove yourself, Timmy!

I’ll help by stating that I agree, though you don’t fully point out that the "hate speech" definition is made by visibly "leftist" and "extremist" organized "influencers", as usual, to exactly serve their purposes, while "white male nationalists" like YOU Timmy (and surely all readers, there’s not even a female here…) get zero input.

This is good illustration that YOUR views are too extreme for "some", Timmy. Take heed. Just your mis-use of the word "mantra" WILL get you in trouble.

Samuel Abram (profile) says:

Re: That's "Mr. Abram" to you

You clearly misunderstood my response. I called him out on the knee-jerk reactivity of the piece that lacked the thoughtfulness of Mike Masnick.

That being said, Mr. Geigner is correct that content moderation gets more and more impossible the more people there are on a platform.

Also, my last name is Abram, not Abrams, which is a battle tank.

Dark Helmet (profile) says:

Re: Re: That's "Mr. Abram" to you

Mr. Abram:

Appreciate the comment and the thought you put into it. Every writer has his or her style, and not every post requires that same style, in my opinion.

While I’ve read your series of comments and have let them roll around in my head for half a day or so, I will suggest that part of my intent in this piece was to express that, while there is nuance in everything, some issues and stances are binary enough to be worth putting down a flat marker. I think this is one of those situations, which is why I wrote the post in the way I did.

I’m not blind to the concerns about bad speech on platforms. I do however think that the cure is more good speech, rather than attempts at moderation that will simply fail at scale.

Again, sincerely appreciate your comments. If this one didn’t land for you, I can only wish it had. If it didn’t land for large swaths of people, which I’m not sure is the case, then that’s worth knowing as well.

Cheers!

Mark A. Line says:

But it's NOT "Fucking Impossible": you're positing bad system.

IF uploading is subject to prior review — since it’s not gov’t but private corporation having control, can be no objection — then it’ll cut down number, and automatically reduce the urge by making it far more difficult and unlikely.

What’s cheap isn’t valued. As I’ve said, every yahoo shouting isn’t good, can’t be made to work, is guaranteed to tear society apart, as you surely must all concede since it’s the premise of stifling "hate speech".

Anonymous Coward says:

Re: But it's NOT "Fucking Impossible": you're positing bad system

IF uploading is subject to prior review

All the social media sites cease to exist, as their role in society is enabling people to self-publish without the need to seek prior permission. Put a dam in the way of the flow of self-publication, and the site will not have the content to attract the mass of users they need to survive.

Stephen T. Stone (profile) says:

Re:

every yahoo shouting isn’t good, can’t be made to work, is guaranteed to tear society apart

How, then, would you decide who to silence — who does and doesn’t deserve the chance to “shout” what they believe, no matter how odious or distasteful? For that matter, who the fuck are you to think you can make that decision?

Anonymous Coward says:

Re: Re: Re:

Blue won’t answer because the last thing he needs is accountability on the rubbish he vomits, but given his post history (easily trackable by the daily unique pseudonyms he pulls from a kid’s joke book), one can hazard a guess.

Blue is invested in the idea of pre-emptive, irreversible takedowns because that’s the same goddamn unicorn that his overlords in the RIAA have been asking for. If something exists that is so indisputably heinous it merits the deepest invasion of privacy possible, the RIAA will spare no man, woman or child in demanding the same privilege. Even if it means Viacom demanding that YouTube take down Viacom content that Viacom uploaded. Exhibit A: Every single attempt to parallel the Dancing Baby video, and other cursory associations with IP infringement, to child porn.

The funny thing is, given the political climate right now, Blue will simply go back to flooding the Nunes thread even harder when he finds all his favorite MAGA media sources have been moderated. "Be careful what you wish for" applies to these rabid IP-tards, every time.

Anonymous Coward says:

Re: Re:

But it’s NOT "Fucking Impossible": you’re positing bad system.

I’m sure then you can tell us how to do it in a feasible way while maintaining the current level of service?

IF uploading is subject to prior review

Ah, that’s how.

since it’s not gov’t but private corporation having control, can be no objection

I object. This is completely irrelevant.

then it’ll cut down number

Yes, by about 100% and the services will all close. You would need hundreds of thousands of people to review the content before it goes up to get it up in a reasonable amount of time. Anything less and people will stop using it altogether.

automatically reduce the urge by making it far more difficult and unlikely.

I knew you were daft but I didn’t think you were out to destroy social media entirely. Yet here you are stating that is your exact objective. Just because you don’t like it, doesn’t give you the right to shut it down for everyone else, nor does it mean your views are correct.

What’s cheap isn’t valued.

Tell that to any parent that has received a hand drawn painting or card from their child. Yeah it’s cheap, but that little scrap of paper has more value to them than probably any artwork made by DaVinci or anything coming out of Hollywood.

As I’ve said, every yahoo shouting isn’t good

Why not? Which "yahoos" should not be given the right to shout? Who are you to decide who gets to speak or not?

can’t be made to work

Seems to be working just fine to me.

is guaranteed to tear society apart

The ability to speak is not what tears society apart. It’s what the person chooses to say that does, and so we must address the root problem: why do these people feel the need to say it?

as you surely must all concede since it’s the premise of stifling "hate speech".

I concede nothing. I don’t care to listen to someone being a total jerk and denigrating other humans. I would kick them out of my house if they said something like that while on my property. Social media platforms have the right to do the same on their personal property: their platforms.

Scary Devil Monastery (profile) says:

Re: But it's NOT "Fucking Impossible": you're positing bad system

"But it’s NOT "Fucking Impossible": you’re positing bad system."

No, we’re positing a reality where actual magic still isn’t possible.

"What’s cheap isn’t valued."

Like oxygen, you mean? I think you need to understand that fundamentally "value" isn’t a valid criterion to assign to needs. "Value" is only a valid descriptor when you’re talking about luxuries.

"As I’ve said, every yahoo shouting isn’t good…"

I think we all know by now that you view freedom of speech as a nuisance. Who the hell determines whether someone is a yahoo or not? The majority? Then you yourself are instantly SOL on any of the boards you keep spouting bullshit on.
The "enlightened" minority? Congrats, you just made the case for reinstating british imperial rule. Feudalism at its finest.

"…as you surely must all concede since it’s the premise of stifling "hate speech"."

No, because the premise of stifling hate speech isn’t to try to shut the haters up – that doesn’t work.
Better by far to let the haters hang themselves by their own words – which they do and subsequently are barred at the door because real people have chosen to report the offensive speech while taking note that there are still monsters out there which need to be opposed.

So in short you advocate – openly – abolishing free speech because people you choose to define as shouting yahoos are offensive to you. And your leading argument is by assigning cheapness to a core value and requirement of democracy.

I don’t know what’s worse, Baghdad Bob. That you can’t understand basic logic or the fact that you also can not understand the basic values of the democratic process.

cpt kangarooski says:

I disagree. I have no problem with private entities running websites banning specific users from posting anything based on the users’ other posts and/or the users’ membership in certain groups, if the entity that owns the site decides that it doesn’t like those posts or those groups. The exception is that discrimination in violation of the law is not allowed, but that’s clearly not happening here.

Frankly, I’m disappointed that more sites don’t work together to identify these bad actors and ban them from posting on or participating in an enormous swathe of the Internet.

Tim thinks this is unamerican, but it’s just as unamerican to suggest that anyone should have to aid the spread of speech with which they disagree. That would be compelled speech, and the First Amendment protects people from that.

Gary (profile) says:

Re: Re:

Tim thinks this is unamerican, but it’s just as unamerican to suggest that anyone should have to aid the spread of speech with which they disagree.

I can’t speak for Tim (Despite what Blue Balls may say) but I don’t see anywhere in the article that he says YouTube shouldn’t do anything. He’s saying that it’s impossible to make this problem go away with a snap of the fingers. And somehow rating posters based on content they make while not on YouTube would be incredibly complicated. (And would never work as intended.)

And cries for more moderation would mean that AC posting would need to be banned completely in order to properly track "bad actors."

cpt kangarooski says:

Re: Re: Re:

Absolutely. It’s an extremely hard problem and probably not solvable. I don’t think that the absence of a perfect solution or the need to do hard work should stop the attempt though. And I think that rather than looking purely at content, a social graph to help zero in on bad actors is a better way to go. Use that to limit the scope of keyword and content searches to something more manageable. After all, there’s a lot more people posting cat videos to YouTube than there are nazis posting hateful messages, and working out who’s who can avoid having to watch all those cats.

As for AC, let’s be honest; except on the rare sites that deliberately don’t log anything, there’s not that much anonymity against the site itself. Most places that allow it could still do so while quietly banning certain users. And if everyone, including the creepily observant titans like Google and Facebook, cooperated, I think it would work out.
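For what it's worth, here is a minimal sketch of what that social-graph idea could look like in practice: start from a seed list of known bad actors, walk their interaction graph a couple of hops, and spend expensive content review only on channels inside that neighborhood. Everything in the snippet (the graph, the channel names, the hop limit, the scoring) is a hypothetical illustration, not anything YouTube or any other platform is known to run.

```python
from collections import deque

# Hypothetical interaction graph: channel -> channels it links to,
# collaborates with, or shares an audience with. Purely illustrative.
graph = {
    "seed_hate_channel": ["affiliate_a", "affiliate_b"],
    "affiliate_a": ["affiliate_c"],
    "affiliate_b": [],
    "affiliate_c": [],
    "cat_videos_4u": ["more_cat_videos"],  # unconnected to the seeds
    "more_cat_videos": [],
}

def review_candidates(graph, seeds, max_hops=2):
    """Breadth-first walk out from known bad actors; channels closer to
    the seed set get reviewed first, everything else is never queued."""
    distance = {seed: 0 for seed in seeds}
    queue = deque(seeds)
    while queue:
        node = queue.popleft()
        if distance[node] >= max_hops:
            continue
        for neighbor in graph.get(node, []):
            if neighbor not in distance:
                distance[neighbor] = distance[node] + 1
                queue.append(neighbor)
    # Hop count doubles as a crude review priority.
    return sorted(distance.items(), key=lambda item: item[1])

for channel, hops in review_candidates(graph, seeds=["seed_hate_channel"]):
    print(f"{hops} hop(s) from seed set: {channel}")
# The cat channels never enter the queue, which is the whole point:
# the graph shrinks the haystack before any content gets inspected.
```

Of course, a graph prior only shrinks the haystack; it would still misfire in both directions, which is exactly the scale problem the post is about.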

Thad (profile) says:

Re: Re:

Tim thinks this is unamerican, but it’s just as unamerican to suggest that anyone should have to aid the spread of speech with which they disagree. That would be compelled speech, and the First Amendment protects people from that.

I don’t think Tim’s saying that YouTube should be required by law to host all legal speech, merely that he believes that it should choose to host all legal speech. I don’t agree with his argument, but I think you’re mischaracterizing it.

Anonymous Coward says:

Re: Pretending Pre-Election Censorship is Moderation

300 hours of video are uploaded to youtube every minute. Please tell me how you would moderate that? Also, they are not censoring content to appease hateful morons, they are censoring content based on what the copyright industry wanted. The fact that it is easily abused is apparently a feature the copyright industry likes.
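To put that figure in perspective, here is a rough back-of-the-envelope sketch of what prior review of every upload would mean in human terms. Only the 300-hours-per-minute rate comes from the comment above; the review speed, shift length, and single-pass assumption are illustrative guesses, not real staffing data.

```python
# Rough estimate of the reviewer headcount needed to pre-screen every
# YouTube upload. Only the upload rate is from the comment above;
# everything else is an assumption for illustration.

UPLOAD_HOURS_PER_MINUTE = 300      # cited figure
MINUTES_PER_DAY = 60 * 24

REVIEW_SPEED = 1.0                 # assume reviewers watch at 1x speed
REVIEWER_HOURS_PER_DAY = 8         # assume one full-time shift, no breaks
PASSES_PER_VIDEO = 1               # assume a single reviewer per video

uploaded_hours_per_day = UPLOAD_HOURS_PER_MINUTE * MINUTES_PER_DAY
review_hours_per_day = uploaded_hours_per_day * PASSES_PER_VIDEO / REVIEW_SPEED
reviewers_needed = review_hours_per_day / REVIEWER_HOURS_PER_DAY

print(f"Hours of video uploaded per day: {uploaded_hours_per_day:,.0f}")
print(f"Full-time reviewers needed:      {reviewers_needed:,.0f}")
# -> roughly 432,000 hours of video per day, or about 54,000 reviewers
#    just to watch everything once, before languages, context checks,
#    appeals, or double review are even considered.
```

Even under those generous assumptions, single-pass prior review implies a moderation workforce in the tens of thousands, which is the scale problem this comment is gesturing at.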

Anonymous Coward says:

Re: Re: Re:

300 hours of video are uploaded to youtube every minute. Please tell me how you would moderate that?

He’s not going to. But that’s the beauty of his plan – he’s not making the moderation he wants his responsibility, he’s leaving it to someone else he can blame for telling him that the task is impossible. And when the task is attempted and someone else inevitably fails it, he can blame them again.

It’s the exact same dumbfuck logic used by copyright fucknuggets every time they don’t get their magical notice-and-staydown system that electrocutes someone they dislike in real time.

Anonymous Coward says:

Re: Re: Re:2 more depth

Gawker: we aren’t going to take down this picture of Hulk Hogan’s #### because it has him saying something that may be newsworthy even if you don’t like it.
Gizmodo, current day: hey internet, why can’t you moderate and take down these pictures of some guys, because it’s simple and clear cut, like we begged a judge not to do to us back when we still had an extra site?

Anonymous Coward says:

Holy shit. And this is right next to a disapproving post describing the dystopian Chinese Cultural Credit scheme.

But that’s what you fuckers appear to want.

While TechDirt has admirably well-thought-out positions on patent, copyright and monetizing lack of scarcity, the commentariat here are a bunch of shits who are going to deserve the internet that’s coming.

Here’s a clue: It won’t be you virtuous fucks filling the Internet with rigidly-enforced "goodness", based on removing your (largely invented) evil foes.

Anonymous Hero says:

This is reminiscent of the takedown of Backpage. The authorities were full of themselves when they did it, but then later realized Backpage was a great resource for tracking pimps and arresting them.

Banning videos will not get rid of white supremacy. Keeping the videos up may help the authorities track these nuts. Authorities may even be able to prevent a mass shooting or two.
