Once Again: Content Moderation Often Mistakes Reporting On Bad Behavior For Celebrating Bad Behavior
from the and-it-will-always-do-so dept
On Monday, the Twitter account Right Wing Watch — which is famous for highlighting some of the nuttier nonsense said by Republicans — announced that its YouTube account had been permanently banned.
Our efforts to expose the bigoted views and dangerous conspiracy theories spread by right-wing activists has now resulted in @YouTube banning our channel and removing thousands of our videos. We attempted to appeal this decision, and YouTube rejected it. pic.twitter.com/74Rfi31uQe
— Right Wing Watch (@RightWingWatch) June 28, 2021
As you can see, that ban was initially put in place over a claim that the videos violated YouTube’s Community Guidelines. RWW appealed, and was told that YouTube had “decided to keep your account suspended” even after the appeal.
This sort of thing happens all the time, of course. For over a decade, we’ve highlighted how demands that social media take down “terrorist” content resulted in companies shutting down accounts that tracked and archived evidence of war crimes. Because the very same videos that might serve as terrorist propaganda can also serve as an archive of evidence of those war crimes.
In short, context matters, and that context goes way beyond the content of a video.
And this seems to be the same sort of case. Lots of people (including, somewhat ironically, the Right Wing Watch account itself) have been demanding that social media websites be more aggressive in moderating the accounts of conspiracy theorists and propagandists peddling nonsense about elections and the pandemic and the like. But, in highlighting examples of extremists promoting that nonsense, RWW necessarily shows that same content itself.
Not surprisingly, after this story started going viral, YouTube said it had been a mistake and reinstated the account:
“Right Wing Watch’s YouTube channel was mistakenly suspended, but upon further review, has now been reinstated,” a YouTube spokesperson told The Daily Beast on Monday afternoon. The social-media site also suggested that the mistake stemmed from the high volume of content it reviews, and that it moved quickly to undo the ban.
Right Wing Watch also confirmed that YouTube informed the site on Monday afternoon that their channel was back online.
“We are glad that by reinstating our account, YouTube recognizes our position that there is a world of difference between reporting on offensive activities and committing them,” Right Wing Watch director Adele Stan said in a statement after the reinstatement. “Without the ability to accurately portray dangerous behavior, meaningful journalism and public education about that behavior would cease to exist.”
And, indeed, it is true that there is a world of difference, but the important point is that it’s not easy to tell that difference when you’re a content moderation reviewer looking only at the content. Reviewers won’t have the context, and it’s almost impossible to get them the proper context in an easy-to-understand manner. Someone not familiar with the RWW account is not going to understand what it’s doing without understanding the much wider context in which that account operates.
And, this is just one of many, many, many reasons why content moderation at scale is impossible to do well.