Techdirt's think tank, the Copia Institute, is working with the Trust & Safety Professional Association and its sister organization, the Trust & Safety Foundation, to produce an ongoing series of case studies about content moderation decisions. These case studies are presented in a neutral fashion, not aiming to criticize or applaud any particular decision, but to highlight the many different challenges that content moderators face and the tradeoffs they result in. Find more case studies here on Techdirt and on the TSF website.

Content Moderation Case Studies: How To Moderate World Leaders Justifying Violence (2020)

from the leaders-acting-badly dept

Summary: There is an inherent tension in handling content moderation of world leaders, especially more controversial ones. If those leaders break the rules on social media, some reasonably call for the content, or the accounts, to be removed for violating policies. Others, however, argue that it is important for the public to be able to see what world leaders are saying, and that removing the content hides that speech from view.

Twitter has had a public interest exception for tweets from world leaders since at least 2019. Under that policy, Twitter may choose to leave up some content from a world leader that the company admits violates its rules, under the belief that it is more important for the world to know what that leader has said. In 2019, Twitter announced that when it found such content, it would label it clearly, publicly noting that it violated the company’s policies but was being kept up due to the public interest.

The policy was put to the test in October 2020, following the murder of a teacher in a Paris suburb, after the teacher had shown students cartoons of the Prophet Muhammad while discussing the controversy over such drawings. A week later, three people were stabbed in Nice, in southern France. French President Emmanuel Macron described both attacks as “Islamist terrorist attacks.”

Soon after the latter attack, former Malaysian Prime Minister Mahathir Mohamad posted a Twitter thread discussing both attacks. While the thread touched on a variety of points, urging people not to scapegoat entire religions and saying he did not approve of the killings, the twelfth tweet raised many concerns by stating: “Muslims have a right to be angry and kill millions of French people for the massacres of the past.”

Twitter applied its public interest notice to this particular tweet, noting that it violated the company’s rules against glorifying violence, but that it “may be in the public’s interest for the tweet to remain accessible.”

Many disagreed with this decision, including French officials. France’s digital minister, Cédric O, claimed that if Twitter did not remove the tweet, it would make the company an “accomplice to a formal call for murder.”

Decisions to be made by Twitter:

  • What qualifies a policy-violating tweet from a world leader to remain up under a “public interest” exception?
  • Under what conditions would Twitter reverse this policy and remove tweets?
  • How much context should Twitter take into account regarding the tweets? That is, how much should the attacks in France play into the decision regarding this tweet?

Questions and policy implications to consider:

  • Whether or not Twitter removes this particular tweet, it is likely to get attention and news coverage. How much does it matter whether or not Twitter removes or labels the particular tweet?
  • Should world leaders get special treatment by nature of their position and the fact that what they say can impact world events? 

Resolution: Twitter kept the tweet up for only a few hours before reversing course and removing it entirely for violating its rules. Twitter did not comment on why it changed its position on this particular tweet, telling the media only that the tweet was removed for violating its policies on glorifying violence. The company did not explain why the tweet initially qualified for a public interest exception, only for that decision to be reversed.

Originally posted to the Trust & Safety Foundation website.

Companies: twitter
