Techdirt's think tank, the Copia Institute, is working with the Trust & Safety Professional Association and its sister organization, the Trust & Safety Foundation, to produce an ongoing series of case studies about content moderation decisions. These case studies are presented in a neutral fashion, not aiming to criticize or applaud any particular decision, but to highlight the many different challenges that content moderators face and the tradeoffs that result. Find more case studies here on Techdirt and on the TSF website.

Content Moderation Case Studies: Facebook Suspends Account For Showing Topless Aboriginal Women (2016)

from the double-standards? dept

Summary: Facebook’s challenges in moderating “nudity” have been covered many times, but part of the reason the discussion comes up so often is that there are so many scenarios to consider that it is difficult to create policies that cover them all.

In March of 2016, activist Celeste Liddle delivered the keynote at the Queen Victoria Women’s Centre’s annual International Women’s Day event. The speech covered many aspects of the challenges facing Aboriginal women in Australia, and mentioned in passing that Liddle’s Facebook account had been repeatedly suspended for posting images of topless Aboriginal women that were shown in a trailer for a TV show.

“I don’t know if people remember, but last year the Indigenous comedy show 8MMM was released on ABC. I was very much looking forward to this show, particularly since it was based in Alice and therefore I knew quite a few people involved.

“Yet there was controversy when 8MMM released a promotional trailer for the show prior to it going to air. This trailer was banned by Facebook because it featured topless desert women painted up for ceremony engaging in traditional dance.

“Facebook saw these topless women as ‘indecent’ and in violation of their no nudity clause. On hearing this, I was outraged that Arrernte women undertaking ceremony could ever be seen in this way, so I posted the trailer up on my own page stating as such.

“What I didn’t count on was a group of narrow-minded little white men deciding to troll my page so each time I posted it, I not only got reported by them but I also got locked out and the video got removed.” — Celeste Liddle

The publication New Matilda published a transcript of the entire speech, which Liddle then linked to herself, leading to her account being suspended for 24 hours and New Matilda’s post being removed — highlighting the very point Liddle was making. As she told New Matilda in a follow-up article about the removal and the suspension:

“My ban is because I’ve previously published images of nudity… I’m apparently a ‘repeat nudity poster offender’…

“I feel decidedly smug this morning, because everything I spoke about in my speech on this particular topic just seems to have been proven completely true…

“It’s actually a highly amusing outcome.” — Celeste Liddle

Facebook’s notice to New Matilda claimed that the publication was restricted for posting “nudity” and said that the policy has an exception for content posted for “educational, humorous or satirical purposes,” but did not give New Matilda a way to argue that the usage in the article was “educational.”

Many publications, starting with New Matilda, highlighted the contrast that on the same day Liddle gave her speech (International Women’s Day), Esquire released a cover story about Kim Kardashian featuring an image of her naked but partially painted. Both images, then, involved topless women with their skin partially painted. However, those posting the images of Aboriginal women faced bans from Facebook, while the Kardashian image not only remained up, but went viral.

Company Considerations:

  • How can policies regarding nudity be written to take into account cultural and regional differences?
  • Is there a way to adequately determine if nudity falls into one of the qualified exemptions, such as “educational” use?
  • What would be an effective and scalable way to provide an appeals process that would allow users like Liddle to inform Facebook of the nature of the content that resulted in her temporary suspension?

Issue Considerations:

  • Moderating “nudity” has been challenging for many websites. Are there reasonable and scalable policies that can be put in place that adequately take context into account?
  • Many websites start out with a “no nudity” policy to avoid having to deal with adult material. What factors should a website consider in deciding whether a more nuanced policy makes more sense?

Resolution: After this story got some attention, Liddle launched a Change.org petition asking Facebook to recognize that Aboriginal women “practicing culture are not offensive.”

“Facebook’s standards are a joke. They are blatantly racist, sexist and offensive. They show a complete lack of respect for the oldest continuing culture in the world. They also show that Facebook continually fails to address their own shortfalls in knowledge. Finally, they show that Facebook is more than willing to allow scurrilous bullying to continue rather than educate themselves.” — Celeste Liddle

New Matilda requested comment from Facebook regarding the removal of the link to its story and was told that even if the sharing was for an “awareness campaign,” Facebook still believed the content should be removed because some audiences in Facebook’s “global community” would be “sensitive” to it. The company also noted that in order for its content moderators to apply rules “uniformly,” those rules sometimes need to be “more blunt than we would like.”

“We are aware that people sometimes share content containing nudity for reasons like awareness campaigns, artistic projects or cultural investigations. The reason we restrict the display of nudity is because some audiences within our global community may be sensitive to this type of content – particularly because of cultural background or age. In order to treat people fairly and respond to reports quickly, it is essential that we have policies in place that our global teams can apply uniformly and easily when reviewing content. As a result, our policies can sometimes be more blunt than we would like, and restrict content shared for legitimate purposes. We encourage people to share Celeste Liddle’s speech on Facebook by simply removing the image before posting it.”

Originally posted to the Trust & Safety Foundation website.



Comments on “Content Moderation Case Studies: Facebook Suspends Account For Showing Topless Aboriginal Women (2016)”

Scary Devil Monastery (profile) says:

Well, it doesn't exactly come as a surprise.

The loudest complainers regarding nudity and other forms of entartete Kunst (“degenerate art”) have always been closely linked to an aversion to the other. It comes as no surprise that a well-regarded mainstream magazine receives a pass not granted to less connected sources, nor that imagery of nude people of difference is treated more harshly.

Basically the rules are such: if the image would grudgingly pass the censorious eye of a white Christian puritan, it will stand. Otherwise the "filth" will get blocked out, lest, oh, horror, children manage to learn about the parts of the world which aren’t white, Christian, straight and cis.

It’s the slow crawl of crap like this which has me convinced that not only are we heading for the eventual fragmentation of the internet, it’ll be a good thing not to have to share that space with people so obsessed with puritan values that history and anthropology are denied to everyone.

Scary Devil Monastery (profile) says:

Re: Re: Well, it doesn't exactly come as a surprise.

"Facebook doesn’t have to make itself a place for anthropology, though."

They surely don’t, but given that Kardashian’s body paint picture slipped right through, the message appears to be that partial or full nudity is OK as long as the subject is young and attractive, while the same nudity in the form of actual science and art is a big fscking NOPE.

Arijirija says:

Shock! Horror!

Some cultures don’t have that sort of hang-up about the unclothed body. Among them are the First Australians. I’ll say this: Facebook’s censorship in this case is the highly offensive behaviour.

Ironic, because seeing human female breasts in operation used to be quite common amongst Europeans – how else was a woman to feed her child? And if she didn’t feed her child, how was it going to survive? (This was before the advent of infant formula from the babyfood companies.)

I still remember vividly a European-descent woman’s comments on seeing Papua New Guinean carvings of naked (though heavily stylized) men with their penes rampant – she wanted to go and knock them off with a chisel. How’s that for both sexist and racist attitudes?

Anonymous Coward says:

This goes to what I’ve said before, that social media companies should maintain a more realistic assessment of what can actually be accomplished with moderation and what can’t, so they wouldn’t keep making high-sounding policies they can’t in practice implement fairly or accurately at the scale at which they operate. While a simple no nudity rule can be implemented on the purely factual basis of whether forbidden bodyparts are visible, subjective exceptions are doomed to be judged by the whims of individual opinion that people will inevitably disagree with no matter what.

In this case, it isn’t at all unexpected that moderators would be more favorable to content from a celebrity they are most likely at least somewhat familiar with than a total unknown. It’s human nature. Nothing is gained by complaining that a subjective rule is interpreted subjectively.

Now, whether nudity should be banned in the first place is a matter worthy of debate.

Anonymous Coward says:

Re: Re:

The social media companies are forced to do moderation. They do have a fairly realistic assessment of moderation, but the various lawmakers and regulators (in multiple parts of the world) don’t want to hear it.
Until those lawmakers and regulators get a clue, it’s not going to get any better. But they keep getting elected. So maybe it’s not them who are clueless, but the ones who elect/appoint them.

Anonymous Coward says:

Re: Re:

"While a simple no nudity rule can be implemented on the purely factual basis of whether forbidden bodyparts are visible"

Except in this case, where the body parts are only "forbidden" in some cultures and not others, and only in some situations but not others. Whoops, looks like nudity is too subjective for the no nudity rule to exist.

Rules that operate like you want don’t actually exist.

You can’t say "no profanity" because a universally accepted definition can’t be created.

You can’t say "no using the word ‘fuck’" because that can’t be accurately and non-subjectively implemented at the scale at which they operate.

You would have to drill down to something so inanely specific that it’s effectively worthless: "no posting of the unicode text string ‘space, f, u, c, k, space’" and "no posting of the unicode text string ‘space, F, u, c, k, space’" etc. etc.

This will achieve nothing at all, since the number of ways to legibly write "fuck" is always higher than the number of objective rules you have implemented, and of course such rules don’t cover any verbal or image-based usage, since those could never be accurately implemented at all.
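To make that concrete, here’s a minimal hypothetical sketch in Python (all names invented for illustration, not anything Facebook actually runs) of what such a literal, "objective" string rule looks like, and how trivially it’s evaded:

    # Hypothetical literal-match rule: ban only the exact strings listed.
    BANNED_STRINGS = {" fuck ", " Fuck "}

    def violates_rule(post: str) -> bool:
        # Flags a post only when it contains one of the literal sequences.
        return any(banned in post for banned in BANNED_STRINGS)

    print(violates_rule("well fuck that"))      # True: exact match
    print(violates_rule("well fvck that"))      # False: one-letter swap evades
    print(violates_rule("well f u c k that"))   # False: extra spacing evades
    print(violates_rule("FUCK"))                # False: case and word boundaries evade

Each evasion demands yet another literal rule, so the rule set can never catch up with the ways people actually write.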

Anonymous Coward says:

Re: Re: Re:

The answer to the issues that you list is, as always: find or build a platform that allows what you want. Facebook and Twitter are not the Internet, and you can use more than one social media platform. If people stopped demanding that platforms change their moderation to suit their mores, and instead looked for one that more closely matched those mores, moderation would be less contentious. Maybe you end up using one platform to keep in touch with family, and a different platform to keep in touch with those who share your kinks.

Arijirija says:

Re: Re:

I don’t know if it actually happened or was just a snide comment by the British on the British, but the Victorian British allegedly stooped to covering the legs of grand pianos for fear of the thoughts they would provoke in the innocent …

By all means Facebook should require all grand pianos to have their legs decently covered, for fear of the lustful thoughts that nude piano legs might provoke in the innocent … that way lie serious mental health issues.
