Scaling Content Moderation Is Impossible: YouTube Flags Horror Video As ‘For Kids’

from the the-horror-the-horror dept

Masnick’s Impossibility Theorem strikes again! The idea, put forth by Mike, is that once a platform really starts to scale, doing content moderation well becomes inherently impossible. There are a ton of examples of this, along with plenty of reasons why it’s so difficult. The spoiler is that pretty much the moment content moderation is automated in any way, it starts to fall apart.

And sometimes it falls apart in fairly spectacular ways. Take Local58TV, the horror YouTube series created by Kris Straub. It’s a very dark series of shorts, and Straub made a point of setting its age rating for adults, yet YouTube’s automated system listed it as available in “YouTube Kids” anyway.

YouTube doesn’t get the Local58TV vibe, though. It automatically flagged one episode, titled “Show For Children,” as being for children. You can see how an AI bot might get its wires crossed over that title, but the description immediately says “Not for Children,” and the creator, Straub, originally set the video’s age rating to “18+” when it was uploaded.

The episode is a black-and-white cartoon in which a cute cartoon skeleton wanders around a graveyard looking for a cute cartoon girlfriend skeleton, only to find horrifying, more realistic skeletons and other creatures in the open graves. At the end of the video, seemingly out of despair, the cute skeleton lies down in a grave and dies, turning into a realistic skeleton. It’s the sort of thing an AI bot might not understand, but a human could immediately tell the unsettling video is not kid-friendly.

And there you have it. The automated system simply couldn’t decipher the content of the video and set kids up to watch what is clearly not appropriate for them. Frustratingly, the system is so automated that Straub can’t even go in and edit any of this himself.

YouTube not only flagged a video explicitly marked as “inappropriate for kids” as “made for kids,” it also won’t let the creator change it back. The video’s content is now labeled “Made for kids (set by YouTube)” and Straub is forced to file an appeal with YouTube to get the video’s age rating corrected.

Now, Straub appealed the “for kids” designation and YouTube responded quickly, correcting the issue. But that isn’t really the point. If automated content moderation is so broken that a video its creator marked as for adults can be auto-marked as for children, what hope does such a system have of tackling more nuanced moderation issues?

Companies: youtube


Comments on “Scaling Content Moderation Is Impossible: YouTube Flags Horror Video As ‘For Kids’”

31 Comments
Lostinlodos (profile) says:

Re: Re: Re:2

For those unaware: the Heavy Metal films are fantasy films based on recurring storylines from the 1970s/1980s fantasy magazine Heavy Metal, with pages drawn by some of the most famous fantasy and D&D artists to walk the earth. And it was definitely an adult magazine.

Most countries rated the films between 12 and 18+. The MPAA rated them R. Personally, I think PG-13 is the better choice, but I’m a liberal on both fictional violence and nudity.

The films are definitely not for children, though. The first film is one of the era’s better mind-bender wth films, like the live-action Prowler (1987 i&g) and Isle (1981 NF), and it worked well even if you never read the magazines.

The second film is more in the late-night non-porn adult animation genre: essentially a western take on a late-80s anime story, drawn in a purely western style, with the brutal death metal and crossover soundtrack of 2000 contrasting with the softer 70s/80s metal tunes of the first.

I don’t think they’re all that bad for teens but the first one is a definite mind-fuck.
The pair together have taken spot 6 on my Must-see list.

https://en.wikipedia.org/wiki/Heavy_Metal_(film)

Rekrul says:

Re: Re:

Argh! That should have been “ALL video games…”.

Normally I’d catch a typo like that in the preview, but of course, previewing a post here is a super-advanced feature that only works on modern browsers. Hell, half the time my posts don’t show up until a day later. This is the only website I post on where the comments are this broken for me.

Anonymous Coward says:

And another thing: why is the process of becoming “For Kids” fucking automated crawling? This should be something (automated or not) that is only done upon channel request. “For Kids” can be a really annoying category that doesn’t allow saving to a playlist, or who knows what else at this point. Comments get turned off (and lost?), last I knew.

WTAF YT?

“But so so so many kids/parents like it and we want it safe!”

Well then make a fucking properly attributed copy on an autogenerated channel like you do with other shit.

Geez Louise.

David says:

Re: It's systemic failure

And another thing: Why is the process of becoming “For Kids” fucking automated crawling?

Because “for kids” means that YouTube is constrained in the advertising it can use for monetization. Because there are laws against marketing all kinds of fraudulent shit to impressionable kids, YouTube has to auto-mark stuff as “for kids” if it is likely to draw a kid audience, contrary to what the author checkmarked.

So erring on the side of “content for kids” lets YouTube off the hook for marketing unsuitable material to children. And because, for the actual content, they are only a provider under Section 230, it makes the content creator responsible for the content being made available to children.

“For kids” just means that YouTube makes sure the advertisements are suitable for kids, not that the content is. Because that’s what YouTube can be sued over, that’s what they care about.

Naughty Autie says:

Re: Re:

And because for the actual content they are only a provider under Section 230, it makes the content creator responsible for the content being made available to children.

Wrong. Automated or not, YouTube is making an editorial decision in regards to that specific content, making them jointly liable as a publisher under Section 230. If they want to retain their Section 230 protections in relation to their ‘For Kids’ offerings, they need to have their bots respect the decisions of the uploaders and allow users to flag adult content that isn’t properly indicated, which of course carries its own risks. Damned if they do, damned if they don’t, but only allowing flagging will make it so YouTube can’t be sued if a five-year-old sees a graphic scene from Child’s Play, for example.

Christenson says:

Human curation required

I am starting to think that each content posting entity requires a certain quantum of moderation, and the larger sites simply haven’t figured out how to crowdsource it in a trustworthy way.

The other difficulty is that reasonable people can differ on moderation decisions like the one above, and some way of resolving such very human differences becomes needed.

Christenson says:

Re: Re: Adults needed

Given that we are confused as to exactly what this “for kids” label might mean, there’s no way we have enough information.

As to a crowdsourced setup, Techdirt has one that does a very reasonable job, given the very large number of readers it has. Wikipedia and Reddit also seem to do a pretty reasonable job most of the time. So I think there are some interesting possibilities out there, but I am not implementing a website.

nasch (profile) says:

Re: Re: Re:

Techdirt has one that does a very reasonable job, given the very large number of readers it has.

Large? Techdirt’s audience is minuscule compared to the major social media sites, and the number who are interested in the comment section is even smaller.

Wikipedia and Reddit also seem to do a pretty reasonable job most of the time.

Wikipedia seems like a special case given their very different mission, but there could be interesting lessons learned from Reddit.

This comment has been flagged by the community.

John85851 (profile) says:

When will his account be turned off?

Here’s how I see the scenario:
1. YouTube’s bots tag his horror video as safe for kids.
2. A parent sees their kid watching the video and files a complaint.
3. YouTube takes his channel down, even though it was their own automated system that said his videos were safe for kids.
4. Then he has to argue that he never set his channel to be safe for kids, and try to convince the admins that the bots did it.

DNY says:

Content Moderation and Austrian School Economics

It just occurred to me that content moderation at scale is impossible for the same reasons Hayek and von Mises gave for the impossibility of centrally planned economies working: one can’t gather all the needed information (von Mises) and one can’t process it in real time (Hayek).

Thus, the solution is the same: a content-moderation analogue of the free market. Let individual users or voluntarily-formed small communities of users moderate the content they see, rather than have a central authority moderate content for all.

Well, that’s the solution for those of us who want an open internet. For those who see content-moderation as the means of censoring political opponents and allowing the professional managerial class to define the bounds of allowable discussion, I guess it’s a non-starter, the way free market economics is a non-starter for a devoted Stalinist.
