Techdirt's think tank, the Copia Institute, is working with the Trust & Safety Professional Association and its sister organization, the Trust & Safety Foundation, to produce an ongoing series of case studies about content moderation decisions. These case studies are presented in a neutral fashion, not aiming to criticize or applaud any particular decision, but to highlight the many different challenges that content moderators face and the trade-offs they involve. Find more case studies here on Techdirt and on the TSF website.

Content Moderation Case Study: Dealing With Demands From Foreign Governments (January 2016)

from the gets-tricky-quickly dept

Summary: US companies obviously need to obey US laws, but dealing with demands from foreign governments can present challenging dilemmas. The Sarawak Report, a London-based investigative journalism operation that reports on issues and corruption in Malaysia, was banned by the Malaysian government in the summer of 2015. The publication chose to republish its own articles on the US-based Medium.com website (beyond its own website) in an effort to get around the Malaysian ban.

In January of 2016, the Sarawak Report published an article about Najib Razak, then prime minister of Malaysia, entitled "Najib Negotiates His Exit BUT He Wants Safe Passage AND All The Money!" It related to allegations of corruption, first published in the Wall Street Journal, regarding money flows from the state-owned 1MDB investment firm.

The Malaysian government sent Medium a letter demanding that the article be taken down. The letter claimed that the article contained false information and that it violated Section 233 of the Communications and Multimedia Act, a 1998 law that prohibits the sharing of offensive and menacing content. In response, Medium requested further evidence of what was false in the article.

Rather than responding to Medium's request for the full "content assessment" from the Malaysian Communications and Multimedia Commission (MCMC), the MCMC instructed all Malaysian ISPs to block all of Medium throughout Malaysia.

Decisions to be made by Medium:

  • How do you handle demands from foreign governments to take down content?
  • Does it matter which government? If so, how do you determine which governments to trust?
  • How do you determine the accuracy of claims from a foreign government regarding things like "false reporting"?
  • What are the trade-offs of being blocked entirely by a country?

Questions and policy implications to consider:

  • Taking down content that turns out to be credible accusations of corruption can serve to support that corruption and censor important reporting. Yet, leaving up information that turns out to be false can lead to political unrest. How should a website weigh those two sides?
  • Should it be the responsibility of websites to investigate who is correct in these scenarios?
  • What is the wider impact when an entire website for user-generated content is blocked in a country like Malaysia?

Resolution: The entire Medium.com domain remained blocked in Malaysia for over two years. In May of 2018, Najib Razak was replaced as Prime Minister by Mahathir Mohamad (who had previously held the office from 1981 to 2003). In 2018, however, Mahathir represented the Pakatan Harapan coalition, the first opposition coalition to defeat Barisan Nasional in a Malaysian election since independence (Mahathir had previously governed as part of Barisan Nasional). Part of Pakatan Harapan's platform was to allow for more press freedom.

Later that month, people noticed that Medium.com was no longer blocked in Malaysia. Soon after, the MCMC put out a statement saying that Medium no longer needed to be blocked because an audit of 1MDB had been declassified days earlier, and once that report was out, there no longer was a need to block the website: "In the case of Sarawak Report and Medium, there is no need to restrict when the 1MDB report has been made public."

Originally published on the Trust & Safety Foundation website.

Companies: medium, sarawak report


Comments on “Content Moderation Case Study: Dealing With Demands From Foreign Governments (January 2016)”

4 Comments
Damien says:

I’m not sure how Medium handles things like this, but as a general rule, if I have no intention of ever visiting or doing business in a country, any demands it sends to websites I run will get sent to the junk folder. If I have no economic or physical ties to its jurisdictional authority, it’ll be lucky to get a form letter in return reminding it to take any international complaints up with the US State Department, since it’s not my problem.

Jono793 (profile) says:

Another consideration

… Does your service have critical infrastructure or staff located in the country concerned? Hopefully this has been risk-assessed in advance, given the trend of censorious regimes holding both hostage as part of takedown disputes.

Ultimately, you can’t moderate to the lowest common denominator. And the Razak regime was particularly corrupt, thuggish and censorious.

A service that cares at all about free expression of ideas is never going to sit well with a government that believes in stealing from its own people and censoring all its critics!

crade (profile) says:

"Should it be the responsibility of websites to investigate who is correct in these scenarios?"

It should be up to them whether they investigate or simply default to leaving content up or taking it down in these situations. If they make any sort of claim to credibility or journalism, I would think they take on that responsibility or lose their credibility pretty quickly. If it’s just a dumping ground for random people to put stuff on, then people shouldn’t have much expectation either way.
