Scaling Content Moderation Is Impossible: YouTube Flags Horror Video As ‘For Kids’
from the the-horror-the-horror dept
Masnick’s Impossibility Theorem strikes again! The idea put forth by Mike is that the moment a platform really starts to scale upward, doing content moderation well becomes inherently impossible. There are a ton of examples of this, as well as plenty of reasons why it’s so difficult. The spoiler alert is that pretty much the moment content moderation is in any way automated, it all starts to fall apart.
And sometimes it falls apart in fairly spectacular ways. Take the horror YouTube series Local58TV, created by Kris Straub. It’s a very dark series of shorts, and Straub made a point of setting the age rating for adults, but YouTube’s automated system listed one of its episodes as available on YouTube Kids anyway.
YouTube doesn’t get the Local58TV vibe, though. It automatically flagged one episode, titled “Show For Children,” as for children. You can see how an AI bot might get its wires crossed from that title, but it immediately says “Not for Children” in the description, and the creator, Straub, originally set the video’s age rating as “18+” when it was uploaded.
The episode is a black-and-white cartoon in which a cute cartoon skeleton wanders around a graveyard looking for a cute cartoon girlfriend skeleton, only to find horrifying, more realistic skeletons and other creatures in the open graves. At the end of the video, seemingly out of depression, the cute skeleton lies down in a grave and dies, turning into a realistic skeleton. That’s the sort of thing an AI bot might not understand, but a human could immediately tell the unsettling video is not kid-friendly.
And there you have it. The automated system simply couldn’t decipher the content of the video and set kids up to watch what is clearly not an appropriate video for them to see. Frustratingly, the system is so automated that Straub can’t even go in and edit any of this himself.
YouTube not only flagged a video explicitly marked as “inappropriate for kids” as “made for kids,” it also won’t let the creator change it back. The video’s content is now labeled “Made for kids (set by YouTube)” and Straub is forced to file an appeal with YouTube to get the video’s age rating corrected.
Now, Straub appealed the “for kids” designation and YouTube responded quickly, correcting the issue. But that isn’t really the point. If automated content moderation is so broken that a video the creator marked as for adults can be auto-marked as for children, then what hope does such a system have of tackling more nuanced moderation issues?