Content Moderation At Scale Is Impossible To Do Well: Series About Antisemitism Removed By Instagram For Being Antisemitic

from the hate-vs.-reporting-on-hate dept

I’ve written a lot about the impossibility of doing content moderation well at scale, and there are lots of reasons for that. But one of the most common is the difficulty both AI and human beings have in distinguishing hateful/trollish/harassing behavior from reporting on that behavior. We’ve pointed this out over and over again in a variety of contexts. One classic example is social media websites pulling down posts from human rights activists documenting war crimes, on the grounds that they’re “terrorist content.” Another is the many examples of people on social media talking about racism, and about being the victims of racist attacks, who have had their accounts and posts shut down over claims of racism.

And now we have another similar example. A new video series about antisemitism posted its trailer to Instagram… where it was removed for violating community guidelines.

You can see the video on YouTube, and it’s not difficult to figure out how this happened. The message from Instagram says the trailer violates the platform’s community guidelines against “violence or dangerous organizations.” The video in question, all about antisemitism, does include some Nazi imagery, obviously to make the point that, in its extreme form, antisemitism can lead to the murder of Jews. But Instagram has banned all Nazi content, in part in response to complaints about antisemitism on Instagram.

And that leads to a dilemma. If you’re banning Nazi content, you also have to realize how that might lead to content about Nazis (to criticize them and to warn about what they might do) also getting banned. And, again, this isn’t new. Earlier this year we had a case study on how YouTube’s similar ban took down historical and educational videos about the Holocaust.

The point here is that there is no easy answer. You can say that it should be obvious to anyone reviewing it that this trailer (highlighting how bad antisemitism is) is different from actual antisemitism, but that’s a lot harder in practice at massive scale. First, you need people who actually understand the difference, and you have to be able to write rules that can go out to thousands of moderators in a manner simple enough that it still makes the difference explicit. You also need to give reviewers enough time to actually understand the context, which is kind of impossible given the scale of the content that needs to be reviewed. In such situations, the “simpler” versions of the rules are often what get written: “No Nazi content.” That’s clear and scalable, but it leads to these kinds of “mistakes.”
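To make that concrete, here’s a toy sketch in Python of what a “clear and scalable” rule looks like once it’s reduced to something a system (or a rushed reviewer) can apply. Everything in it is hypothetical: the signal list and function names are invented for illustration, and nothing here reflects how Instagram’s actual moderation systems work.

```python
# Toy illustration (not Instagram's real system): a "No Nazi content"
# rule, reduced to a blunt match on banned signals, with no context.

BANNED_SIGNALS = {"nazi", "swastika", "third reich"}  # hypothetical list

def simple_rule_flags(post_text: str) -> bool:
    """Flag any post containing a banned signal, regardless of intent."""
    text = post_text.lower()
    return any(signal in text for signal in BANNED_SIGNALS)

hateful_post = "The Third Reich had the right idea."
trailer_post = "Our new series shows how Nazi imagery fuels antisemitism."

# Both posts trip the rule; separating them takes exactly the context
# and review time that moderators don't get at scale.
assert simple_rule_flags(hateful_post)
assert simple_rule_flags(trailer_post)  # the false positive
```

The cheap rule and the expensive judgment call diverge on the second post: both “contain Nazi content,” and only context, which the rule never sees, separates them.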

Companies: facebook, instagram


Comments on “Content Moderation At Scale Is Impossible To Do Well: Series About Antisemitism Removed By Instagram For Being Antisemitic”

27 Comments


This comment has been deemed insightful by the community.
Scary Devil Monastery (profile) says:

Re: Re: Rules Can Be Illegitimate

"Getting censored proves that your opinion is the strongest."

Your assertion that the venom-spewing fuckwit standing at the head of a "Kill all the Jews" rally has more credibility than a history professor or humanitarian speaker is duly noted, Koby.

If I were you I’d find a more appropriate quote to try to back your bullshit with.

This comment has been deemed insightful by the community.
That One Guy (profile) says:

Re: Back from cheering on ISIS I see

-Repeatedly lying about being ‘censored’ because people keep showing you the door of their private property proves that you’re not just a person no-one wants to be around but a dishonest one who refuses to own their own words and deeds and instead blames others.

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re: Rules Can Be Illegitimate

Of course, when the topic is something we can all agree on, such as fighting against antisemitism, the old mantra of "their platform, their rules" goes out the window.

Did you even read the article? Please point to the sentence and/or paragraph where Mike is complaining about their right to moderate.

I’ll wait.

It wasn’t about "their platform, their rules". It was about how content moderation at scale is impossible to get 100% correct.

If you would get your head out of Trump’s ass, maybe you would understand the difference, which is obvious to anybody who read the article and has a modicum of common sense.

Getting censored proves that your opinion is the strongest.

How does it feel to be a supporter of ISIS, Nazis, racists, homophobes, bigots, Alex Jones, Nick Fuentes and every other asshole who has been moderated from social media? Doesn’t your statement mean that they have the strongest opinions, and that by making that statement, you are supporting their opinions?

This comment has been deemed insightful by the community.
That One Guy (profile) says:

Re: Re: You should try to find examples that they don't support

How does it feel to be a supporter of ISIS, Nazis, racists, homophobes, bigots, Alex Jones, Nick Fuentes and every other asshole who has been moderated from social media?

What makes you think being on their side would be a problem for Koby? He may not be honest enough to own his own position, but he’s sure as hell not complaining about fiscal conservatives being shown the door when he whines about how social media is silencing people.

This comment has been deemed insightful by the community.
Mike Masnick (profile) says:

Re: Rules Can Be Illegitimate

Of course, when the topic is something we can all agree on, such as fighting against antisemitism, the old mantra of "their platform, their rules" goes out the window.

Where did I say that? It is still their platform and their rules. I have remained entirely consistent, unlike you.

Koby, I used to think you were just a confused, ignorant dupe. But now you’re actively making shit up, I can only conclude that you’re a troll.

-Getting censored proves that your opinion is the strongest.

So.. by this argument, supporters of Nazis have the strongest opinions?

This comment has been deemed insightful by the community.
Toom1275 (profile) says:

Re: Re: Rules Can Be Illegitimate

So.. by this argument, supporters of Nazis have the strongest opinions?

It means platforms’ moderation decisions are the strongest opinions, since the only censorship going on in the real world is Republicans’ constant attempts to silence that constitutional free speech.

This comment has been deemed insightful by the community.
PaulT (profile) says:

Re: Rules Can Be Illegitimate

"the old mantra of "their platform, their rules" goes out the window"

Weirdly, that’s nothing like what the article says. If it did, it would be an argument for Instagram to be forced to host content against its will just because you agree with what’s being said, regardless of their rights to control their own property or exercise their own freedom of speech and association – which is an argument only idiots like you are making. It’s never been made here by anyone with any credibility.

"-Getting censored proves that your opinion is the strongest."

No, it doesn’t. It might mean that you’re a loud obnoxious asshole who needs to be quietened in order for everyone else in the room to continue talking, but you can be an asshole and wrong at the same time. Loud != correct, no matter how many times Ben Shapiro tells you it is.

christenson says:

And let me make the situation MORE impossible...

So Instagram will probably restore the decent anti-antisemitism video, due to the public outcry.

For the sake of argument, let me construct a little series of partial embeds of all the parts of the video showing bad things, stripping the context, and adding my own: "this is good!". I’m sure we can all name a trumplestiltskin website where they’d love to host that….

Now, is that content on Instagram good or bad??? Should it be left up, given the definitely transformative fair use of turning it to a pro-Nazi purpose???

Especially given that the exact same transformation can be accomplished by taking the whole video, cutting it up in a video editor, and posting the result, and some Nazi probably already did that.

Scary Devil Monastery (profile) says:

Re: Re: Re:

"Context is the key to this situation. You should know that."

US Republicans don’t believe in context. That’s the only way we can interpret the way Koby and his less eloquent ilk keep trying to compare the government sending someone to jail for speaking up with a private entity tossing some asshole out of their own property.

urza9814 (profile) says:

Faulty premise

The problem with this whole argument is it’s all built upon what I think is likely a flawed premise that this removal was some kind of unintentional collateral damage. Given Face-tagram’s history, I find that implausible.

If you consider the possibility that the moderation guidelines aren’t meant to promote "good" content or police hate speech or anything like that, but are instead designed primarily to reduce controversy, then this makes a lot more sense. They don’t want to kick out or drive away the Nazis, that’s bad for business! They also don’t want to be investigated and questioned by cops and government officials. They want a bland, sanitized, advertiser-friendly network. They aren’t doing this to push social progress, they’re doing it so they can be the place where Nazis and Antifa alike can chat with grandma and wish their college roommate a happy birthday and never encounter anything that might make them too uncomfortable.

Perhaps, with sufficient public outrage, they’ll restore this video. Perhaps they’ll decide the controversy of removing it is worse than the controversy of leaving it up. That still won’t make them good people though…

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re: Faulty premise

The problem with this whole argument is it’s all built upon what I think is likely a flawed premise that this removal was some kind of unintentional collateral damage.

Not really. The article concludes that this was most likely intentional collateral damage.

See:
In such situations the "simpler" versions of the rules often are what get written: "No Nazi content." That’s clear and scalable, but leads to these kinds of "mistakes."

This comment has been deemed insightful by the community.
nasch (profile) says:

Re: Re: Re: Faulty premise

Advertisers want posts that get a lot of attention. That’s largely controversial posts. They don’t want to get any of the controversy on themselves of course, so they and FB want to suppress anything that makes them look bad. But their moderation is clearly not designed to prune controversial content.
