Techdirt's think tank, the Copia Institute, is working with the Trust & Safety Professional Association and its sister organization, the Trust & Safety Foundation, to produce an ongoing series of case studies about content moderation decisions. These case studies are presented in a neutral fashion, not aiming to criticize or applaud any particular decision, but to highlight the many different challenges that content moderators face and the tradeoffs they result in. Find more case studies here on Techdirt and on the TSF website.

Content Moderation Case Study: Facebook Nudity Filter Blocks Historical Content And News Reports About The Error (June 2020)

from the content-moderation-is-hard dept

Summary: Though social media networks take a wide variety of evolving approaches to their content policies, most have long maintained relatively broad bans on nudity and sexual content, and have heavily employed automated takedown systems to enforce these bans. Many controversies have arisen from this, leading some networks to adopt exceptions in recent years: Facebook now allows images of breastfeeding, child-birth, post-mastectomy scars, and post-gender-reassignment surgery photos, while Facebook-owned Instagram is still developing its exception for nudity in artistic works. However, even with exceptions in place, the heavy reliance on imperfect automated filters can obstruct political and social conversations, and block the sharing of relevant news reports.
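The overblocking dynamic described above follows from how automated filters typically operate: a model assigns each image a score, and a fixed threshold decides block-or-allow, with no view of historical or journalistic context. A minimal sketch of that mechanism (the scores, names, and threshold below are invented for illustration; this is not Facebook's actual system):

```python
# Hypothetical threshold-based image filter: a model assigns each image a
# "nudity" score in [0, 1]; anything at or above the threshold is blocked.
def moderate(images, threshold=0.8):
    blocked, allowed = [], []
    for name, score in images:
        (blocked if score >= threshold else allowed).append(name)
    return blocked, allowed

images = [
    ("beach_photo", 0.95),          # true positive: actually violates policy
    ("historical_archive", 0.84),   # false positive: flagged despite context
    ("news_article_embed", 0.81),   # false positive: same image, reshared
    ("landscape", 0.02),            # true negative
]

blocked, allowed = moderate(images)
```

Lowering the threshold blocks more violations but also more history and news; raising it does the reverse. No threshold alone can see context, which is why the exceptions discussed above require human or contextual review.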

One such instance occurred on June 11, 2020, following controversial comments by Australian Prime Minister Scott Morrison, who stated in a radio interview that "there was no slavery in Australia". This sparked widespread condemnation and rebuttals from both the public and the press, pointing to the long history of enslavement of Australian Aboriginals and Pacific Islanders in the country. One Australian Facebook user posted a late 19th century photo from the State Library of Western Australia, depicting Aboriginal men chained together by their necks, along with a statement:

Kidnapped, ripped from the arms of their loved ones and forced into back-breaking labour: The brutal reality of life as a Kanaka worker – but Scott Morrison claims "there was no slavery in Australia"

Facebook removed the post and image for violating its policy against nudity, although no genitals are visible, and restricted the user's account. Guardian Australia contacted Facebook to ask whether the decision was made in error and, the following day, Facebook restored the post and apologized to the user, explaining that the takedown was a false positive from its automated nudity filter. At the same time, however, Facebook continued to block posts that included The Guardian's news story about the incident, which featured the same photo, and placed 30-day suspensions on some users who attempted to share it. Facebook's community standards report shows that in the first three months of 2020, 39.5 million pieces of content were removed for nudity or sexual activity; over 99% of those takedowns were automated; 2.5 million appeals were filed; and 613,000 of the takedowns were reversed.
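The scale implied by those enforcement figures is worth working out: appeals and reversals are small fractions of total takedowns. A quick back-of-the-envelope calculation (variable names are ours, and the report does not say every reversal resulted from an appeal):

```python
# Facebook community standards report, Q1 2020, nudity/sexual-activity
# category, using the figures cited in the case study above.
takedowns = 39_500_000  # pieces of content removed
appeals = 2_500_000     # user appeals filed
reversals = 613_000     # takedowns reversed

print(f"takedowns appealed: {appeals / takedowns:.1%}")    # ~6.3%
print(f"takedowns reversed: {reversals / takedowns:.1%}")  # ~1.6%
print(f"appeals succeeding (at most): {reversals / appeals:.1%}")  # ~24.5%
```

Even if every reversal came from an appeal, roughly three in four appeals failed, and reversals amounted to under 2% of takedowns.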

Decisions to be made by Facebook:

  • Can nudity filters be improved to produce fewer false positives, and/or is more human review required?
  • For appeals of automated takedowns, what is an adequate review and response time?
  • Should automated nudity filters be applied to the sharing of content from major journalistic sources such as The Guardian?
  • Should questions about content takedowns from major news organizations be prioritized over those from regular users?
  • Should 30-day suspensions and similar account restrictions be manually reviewed only if the user files an appeal?

Questions and policy implications to consider:

  • Should automated filter systems be able to trigger account suspensions and restrictions without human review?
  • Should content that has been restored in one instance be exempted from takedown, or flagged for automatic review, when it is shared again in future in different contexts?
  • How quickly can erroneous takedowns be reviewed and reversed, and is this sufficient when dealing with current, rapidly-developing political conversations?
  • Should nudity policies include exemptions for historical material, even when it includes visible genitals, as in the related 2016 controversy over a Vietnam War photo?
    • Should these policies take into account the source of the content?
    • Should these policies take into account the associated messaging?

Resolution: Facebook's restoration of the original post was undermined by its simultaneous blocking of The Guardian's news reporting on the issue. After receiving dozens of reports from readers that they were blocked from sharing the article, and in some cases suspended for trying, The Guardian reached out to Facebook again and, by Monday, June 15, 2020, users were able to share the article without restriction. The slower response in this second case may be attributable to the fact that it came to the fore on a weekend, but it meant that critical reporting on an unfolding political issue was blocked for several days while the subject was being widely discussed online.

Photo Credit (for first photo):
State Library of Western Australia
[Screenshot is taken directly from a Twitter embed]

Companies: facebook


Comments on “Content Moderation Case Study: Facebook Nudity Filter Blocks Historical Content And News Reports About The Error (June 2020)”

9 Comments
Anonymous Anonymous Coward (profile) says:

Prurientitis

Naked breasts (female) are not illegal around the world. Some countries have topless/naked beaches and are not inflicted by such prurient interests. Some countries allow the publication of nudes in publicly available periodicals, widely available to the public, of all ages, though topless rather than full frontal nudes seems to be the norm. Then, of course there are the publicly available porn publications, which are not illegal in many places, but are in some.

Several things to consider, the first is that the naked breast (female) is the first feeding station for most children. The second is that for some reason the naked (female) breast is considered erotic to some males (being male I am not sure how this translates to the female of the species, or in all markets). The third is that with sufficient hormones the male breast can be enhanced to look like a female breast, though I am not sure that it performs the same function (feeding babies, while the erotic nature to males is another story).

My conclusion is that there are puritanical persons who desire that no naked breasts (female, though given the above, why do they discriminate?) or full nudity be shown anywhere, anytime, and that they for some reason have control of something. Whether that something is a piece of the current ‘cancel culture’ or merely a part of the ‘new woke’ cadre, I am not sure they speak for any majority. They may scream their abhorrence, and they are entitled to that, but they should not actually be able to impact the rest of us with their ideology unless they obtain a majority in each and every constituency they attack, which seems to be everywhere.

For myself, I am not bothered by a naked breast (or full frontal nudity), and for some unknown reason do find them mildly erotic, I don’t think that most of my acquaintances are bothered (either male or female) by that, and some go out of their way (both male and female) to flaunt their attributes in the interest of interesting the opposite sex. I don’t see this as wrong, though I suspect that the methodology for creating interest in the opposite sex has meandered over the years.

The question is, especially for the prurient minded, what the hell business is it of yours what I think or don’t think? You won’t change me, no matter what you do. Nor will you change others. So why try? You make a nuisance of yourselves and create animosity from those who disagree, and there seem to be more of the disagree variety than the agree variety. The Catholic Church failed to get me to agree to the concept of ‘Original Sin’, and I think it failed with many others who took logic and reason in place of dogma, even at the age of seven (though my mother disagreed with me and forced me to ‘follow’ until she no longer had control).

If disruption is their goal, they have achieved it, but in the long run people and organizations (a.k.a. companies) will realize that bowing to the vocal minority is not necessary for their long-term well-being, and that they may get along with their business as usual, so long as their policies don’t actually hurt some (potentially small) portions of the population. Being open to all the peoples of the world is a good thing. Being forced to acquiesce to some demands by some isn’t. The variety of ‘ism’s’ is large, and some of the harms are large, but the reverse could be the antithesis of a solution. For example:

"Ibrim X. Kendi is the patron saint of the “Anti-Racist” movement, which promotes the argument that if you are not anti-racist, meaning that you do not dedicate yourself, your “privilege,” your time, money, assets and, perhaps, even your physical existence, to the cause of affirmatively fighting racism, then you are a racist. In the long parade of horribles, there is none worse than being racist."

So if you say you aren’t racist, you are, and if you say you are racist you are, and for some the only solution is a final (as in death, though we will take all of your assets instead) one. Same goes for all other ‘ism’s’.

The only way to stand up to these whiners is to stand up to them. The question is how to do so in the moderation space, where any ‘question’ of your intent becomes a ‘serious problem’ (from a PR standpoint, which tends to be short term if dealt with correctly). One approach is to not accept questions unless they have some level of veracity (volume might be one factor; sensibility might be another, though how to set standards for sensibility in this day and age might be problematic).

Anonymous Coward says:

Re: Prurientitis

Naked breasts (female) are not illegal around the world. Some countries have topless/naked beaches and are not inflicted by such prurient interests.

Most beaches are topless for men. Some countries have discriminatory policies toward women in this matter, whereas others would consider that illegal discrimination.

The third is that with sufficient hormones the male breast can be enhanced to look like a female breast, though I am not sure that it performs the same function (feeding babies

Yes, but it’s not common.

Anonymous Coward says:

The nudity filter is not working too well, there’s no one nude in the photo.
This shows the problem with automatic filters.
All filters will tend to overblock content, and this will result in censorship, restrictions on free speech.
Imagine what will happen when the EU laws come into force, eg every website will have to filter all images and audio, video content for content that might be infringing.
Facebook could whitelist websites like gaurdian.co.uk it’s a newspaper, it does not publish porn or erotic content.
Should newspapers websites be treated like 4chan or reddit or playboy.

This comment has been deemed insightful by the community.
PaulT (profile) says:

Re: Re:

"The nudity filter is not working too well, there’s no one nude in the photo."

So… given that the options are to block or allow a photo, not to edit it on the fly, what would you expect to happen with a visible nude in a photograph? Automated filters cannot make context-related decisions, and there is a nude visible.

"All filters will tend to overblock content"

Yes, and that’s unavoidable. The primary reason for filters is to remove content that will either generate customer complaints or lead to legal action against the company. It will therefore always be preferable to block some legitimate content that can be reinstated if a complaint is received, rather than allow something and leave the company open to negative action.

"Facebook could whitelist websites like gaurdian.co.uk it’s a newspaper"

Given that the paper was often referred to as the Grauniad as it had a reputation for typos, your misspelling is slightly hilarious.

"Should newspapers websites be treated like 4chan or reddit or playboy."

Well, this is the question. In general terms, everything should be treated equally. All those sites have protection for what their users post, and people who object have every chance to just not use that platform.

If you’re referring to posts from those sources on their official Facebook feeds, then it’s down to how Facebook wishes to treat them. But, if an image is objectionable, it shouldn’t matter where it came from. A nude from Playboy doesn’t magically become not a nude if The Guardian reposts the same image. Reddit posting a historical picture should be just as acceptable as if the paper posts it.

If you start carving out exceptions where one source can do something another can’t that’s where things get murky. Especially when it comes to news reporting – The Guardian is a left-leaning operation, and you can bet your ass that some right-wing blog will cry censorship if their repost is blocked, even if there’s clear non-partisan reasons for the block. Inconsistent application of community rules is one of the big complaints at the moment.

Anonymous Anonymous Coward (profile) says:

Re: Optional

Because ‘puritans’ wish to control you. They wish that everybody had the same morals as them, and if you don’t they will impose their morals upon you.

This ‘control of others’ thing is almost as absolute as ‘absolute power’. They seem to go hand in hand, but does one beget the other, or is it the other that begets the one?
