Content Moderation Case Study: Facebook Nudity Filter Blocks Historical Content And News Reports About The Error (June 2020)
from the content-moderation-is-hard dept
Summary: Though social media networks take a wide variety of evolving approaches to their content policies, most have long maintained relatively broad bans on nudity and sexual content, and have relied heavily on automated takedown systems to enforce these bans. Many controversies have arisen from this, leading some networks to adopt exceptions in recent years: Facebook now allows images of breastfeeding, childbirth, post-mastectomy scars, and post-gender-reassignment surgery photos, while Facebook-owned Instagram is still developing its exception for nudity in artistic works. However, even with exceptions in place, the heavy reliance on imperfect automated filters can obstruct political and social conversations, and block the sharing of relevant news reports.
One such instance occurred on June 11, 2020 following controversial comments by Australian Prime Minister Scott Morrison, who stated in a radio interview that "there was no slavery in Australia". This sparked widespread condemnation and rebuttals from both the public and the press, pointing to the long history of enslavement of Australian Aboriginals and Pacific Islanders in the country. One Australian Facebook user posted a late 19th century photo from the State Library of Western Australia, depicting Aboriginal men chained together by their necks, along with a statement:
Kidnapped, ripped from the arms of their loved ones and forced into back-breaking labour: The brutal reality of life as a Kanaka worker – but Scott Morrison claims "there was no slavery in Australia"
Facebook removed the post and image for violation of their policy against nudity, although no genitals are visible, and restricted the user's account. The Guardian Australia contacted Facebook to determine if this decision was made in error and, the following day, Facebook restored the post and apologized to the user, explaining that it was an erroneous takedown caused by a false positive in the automated nudity filter. However, at the same time, Facebook continued to block posts that included The Guardian's news story about the incident, which featured the same photo, and placed 30-day suspensions on some users who attempted to share it. Facebook's community standards report shows that in the first three months of 2020, 39.5 million pieces of content were removed for nudity or sexual activity. Over 99% of those takedowns were automated; 2.5 million appeals were filed, and 613,000 of the takedowns were reversed.
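The scale implied by those reported figures can be made concrete with some quick arithmetic. This is only a back-of-the-envelope sketch using the numbers cited above; Facebook's report does not break down how many reversals came via appeal versus proactive review:

```python
# Figures from Facebook's Q1 2020 community standards report, as cited above.
removed = 39_500_000   # pieces removed for nudity/sexual activity
appeals = 2_500_000    # appeals filed against those removals
restored = 613_000     # takedowns ultimately reversed

appeal_rate = appeals / removed      # share of removals that were appealed
reversal_rate = restored / appeals   # share of appeals (at most) that succeeded
restored_share = restored / removed  # share of all removals reversed

print(f"Appealed:  {appeal_rate:.1%} of removals")
print(f"Reversed:  {reversal_rate:.1%} of appeals (upper bound)")
print(f"Restored:  {restored_share:.1%} of all removals")
```

In other words, only about 6% of removals were appealed at all, yet roughly a quarter of that volume was ultimately reversed, suggesting the true false-positive rate among unappealed takedowns is unknowable from these figures alone.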
Decisions to be made by Facebook:
- Can nudity filters be improved to result in fewer false-positives, and/or is more human review required?
- For appeals of automated takedowns, what is an adequate review and response time?
- Should automated nudity filters be applied to the sharing of content from major journalistic sources such as The Guardian?
- Should questions about content takedowns from major news organizations be prioritized over those from regular users?
- Should 30-day suspensions and similar account restrictions be manually reviewed only if the user files an appeal?
Questions and policy implications to consider:
- Should automated filter systems be able to trigger account suspensions and restrictions without human review?
- Should content that has been restored in one instance be exempted from takedown, or flagged for automatic review, when it is shared again in future in different contexts?
- How quickly can erroneous takedowns be reviewed and reversed, and is this sufficient when dealing with current, rapidly-developing political conversations?
- Should nudity policies include exemptions for historical material, even when such material does include visible genitals, such as occurred in a related 2016 controversy over a Vietnam War photo?
- Should these policies take into account the source of the content?
- Should these policies take into account the associated messaging?
Resolution: Facebook's restoration of the original post was undermined by its simultaneous blocking of The Guardian's news reporting on the issue. After receiving dozens of reports from its readers that they were blocked from sharing the article and in some cases suspended for trying, The Guardian reached out to Facebook again and, by Monday, June 15, 2020, users were able to share the article without restriction. The difference in response times between the original incident and the blocking of posts is possibly attributable to the fact that the latter came to the fore on a weekend, but this meant that critical reporting on an unfolding political issue was blocked for several days while the subject was being widely discussed online.
Photo Credit (for first photo):
State Library of Western Australia
[Screenshot is taken directly from a Twitter embed]
Filed Under: case study, consistency, content moderation, historical content, nudity, reporting
Comments on “Content Moderation Case Study: Facebook Nudity Filter Blocks Historical Content And News Reports About The Error (June 2020)”
Naked breasts (female) are not illegal around the world. Some countries have topless/naked beaches and are not afflicted by such prurient interests. Some countries allow the publication of nudes in publicly available periodicals, widely available to the public of all ages, though topless rather than full frontal nudes seems to be the norm. Then, of course, there are the publicly available porn publications, which are not illegal in many places, but are in some.
Several things to consider, the first is that the naked breast (female) is the first feeding station for most children. The second is that for some reason the naked (female) breast is considered erotic to some males (being male I am not sure how this translates to the female of the species, or in all markets). The third is that with sufficient hormones the male breast can be enhanced to look like a female breast, though I am not sure that it performs the same function (feeding babies, while the erotic nature to males is another story).
My conclusion is that there are puritanical persons who desire that there should be no naked breasts (female, though given the above, why do they discriminate?) or full nudity shown anywhere, anytime, and that they for some reason have control of something. Whether that something is a piece of the current ‘cancel culture’ or merely part of the ‘new woke’ cadre, I am not sure they speak for any majority. They may scream their abhorrence, and they are entitled to that, but they should not actually be able to impact the rest of us with any of their ideology, unless they obtain a majority in each and every constituency they attack, which seems to be everywhere.
For myself, I am not bothered by a naked breast (or full frontal nudity), and for some unknown reason do find them mildly erotic. I don’t think that most of my acquaintances are bothered (either male or female) by that, and some go out of their way (both male and female) to flaunt their attributes in the interest of interesting the opposite sex. I don’t see this as wrong, though I suspect that the methodology for creating interest in the opposite sex has meandered over the years.
The question is, especially for the prurient minded, what the hell business is it of yours what I think or don’t think? You won’t change me, no matter what you do. Nor will you change others. So why try? You make a nuisance of yourselves, and create animosity from those who disagree, and there seems to be more of the disagree variety than the agree variety. The Catholic Church failed to get me to agree to the concept of ‘Original Sin’, as did many others who took logic and reason in place of dogma, even at the age of seven (though my mother disagreed with me and forced me to ‘follow’ until she no longer had control).
If disruption is their goal, they have achieved that. But in the long run, once people and organizations (a.k.a. companies) realize that bowing to the vocal minority is not necessary for their long-term well-being, they may get along with their business as usual, so long as their policies don’t actually hurt some (potentially small) portions of the population. Being open to all the peoples of the world is a good thing. Being forced to acquiesce to some demands by some isn’t. The variety of ‘ism’s’ is large, and some of the harms are large, but the reverse could be the antithesis of a solution. For example:
So if you say you aren’t racist, you are, and if you say you are racist you are, and for some the only solution is a final (as in death, though we will take all of your assets instead) one. Same goes for all other ‘ism’s’.
The only way to stand up to these whiners is to stand up to them. The question is how to do so in the moderation space, where any ‘question’ of your intent is treated as a ‘serious problem’ (from a PR standpoint, which tends to be short-term if dealt with correctly). One approach is to not accept questions unless they have some level of veracity (volume might be one factor; sensibility might be another, though how to set standards for sensibility in this day and age might be problematic).
Most beaches are topless for men. Some countries have discriminatory policies toward women in this matter, whereas others would consider that illegal discrimination.
Yes, but it’s not common.
My nudity filter blocks Facebook…
The nudity filter is not working too well, there’s no one nude in the photo.
This shows the problem with automatic filters.
All filters will tend to overblock content, and this will result in censorship and restrictions on free speech.
Imagine what will happen when the EU laws come into force, e.g. every website will have to filter all image, audio, and video content for material that might be infringing.
Facebook could whitelist websites like gaurdian.co.uk it’s a newspaper, it does not publish porn or erotic content.
Should newspapers websites be treated like 4chan
or reddit or playboy.
"The nudity filter is not working too well, there’s no one nude in the photo."
So… given that the options are to block or allow a photo, not to edit it on the fly, what would you expect to happen with a visible nude in a photograph? Automated filters cannot make context-related decisions, and there is a nude visible.
"All filters will tend to overblock content"
Yes, and that’s unavoidable. The primary reason for filters is to remove content that will either generate customer complaints or lead to legal action against the company. Companies will therefore always prefer to block some legitimate content that can be reinstated if a complaint is received, rather than allow something and leave themselves open to negative action.
"Facebook could whitelist websites like gaurdian.co.uk it’s a newspaper"
Given that the paper was often referred to as the Grauniad as it had a reputation for typos, your misspelling is slightly hilarious.
"Should newspapers websites be treated like 4chan or reddit or playboy."
Well, this is the question. In general terms, everything should be treated equally. All those sites have protection for what their users post, and people who object have every chance to just not use that platform.
If you’re referring to posts from those sources on their official Facebook feeds, then it’s down to how Facebook wishes to treat them. But, if an image is objectionable, it shouldn’t matter where it came from. A nude from Playboy doesn’t magically become not a nude if The Guardian reposts the same image. Reddit posting a historical picture should be just as acceptable as if the paper posts it.
If you start carving out exceptions where one source can do something another can’t, that’s where things get murky. Especially when it comes to news reporting – The Guardian is a left-leaning operation, and you can bet your ass that some right-wing blog will cry censorship if their repost is blocked, even if there are clear non-partisan reasons for the block. Inconsistent application of community rules is one of the big complaints at the moment.
Why isn’t the nudity filter a user option? People who don’t want to see nudity can leave it turned on, and people who don’t care can turn it off.
Because there are people who would turn it off so that they can find and complain about nudity.
Because ‘puritans’ wish to control you. They wish that everybody had the same morals as them, and if you don’t they will impose their morals upon you.
This ‘control of others’ thing is almost as absolute as ‘absolute power’. They seem to go hand in hand, but does one beget the other, or is it the other that begets the one?
* gets out magnifying glass for the embedded tweet *
* gets frustrated at the blurriness as it isn’t a tweet *
* finds actual tweet *