Content Moderation Case Study: Bing Search Results Erases Images Of 'Tank Man' On Anniversary Of Tiananmen Square Crackdown (2021)
from the tank-man's-gone-missing dept
Summary: On the 32nd anniversary of the Tiananmen Square protests, internet users noticed that Microsoft’s Bing search engine was producing some interesting results. Or, rather, it wasn’t producing the expected search results, for some possibly interesting reasons.
Users searching for the most iconic image of the protests — that of the unidentified person known only as “Tank Man” — were coming up empty. It appeared that Microsoft’s search engine was blocking results for an image that often serves as shorthand for rebellion against the Chinese government.
As reported first by web users and then by several news outlets, the apparent blocking of search results could be observed in both the United States and the United Kingdom. This left users with the impression that the Chinese government had pressured Microsoft to moderate search results for “tank man” in hopes of suppressing remembrance of the Tiananmen Square Massacre, which resulted in the deaths of an estimated 2,500-3,500 protesters.
The apparent censorship was blamed on Microsoft’s close relationship with the Chinese government, which allowed its search engine to be accessed by Chinese residents in exchange for complying with government censorship requests.
This led to Microsoft being criticized by prominent politicians for apparently allowing the Chinese government to dictate what users around the world could access in relation to the Tiananmen Square protests.
- When complying with one government’s interests, how can Microsoft ensure these considerations don’t affect users located elsewhere in the world?
- How can compliance departments assist in handling edge cases and/or overly-broad moderation demands?
- What are the tradeoffs for content providers when weighing offended users against offended governments?
- What ethical concerns should be taken into consideration when entering markets controlled by oppressive governments? Is there a line companies should not be willing to cross when seeking to expand their user base so as not to offend or alienate the user base they already have?
- What options can companies pursue when seeking to do business in countries with historically censorial or repressive regimes to prevent collateral moderation damage to users located elsewhere?
- What sort of cost/benefit analysis, both human and fiscal, should take place before offering a product in countries with known human rights issues?
Resolution: Shortly after the apparent censorship of the iconic “Tank Man” image was reported, Microsoft claimed the very timely removal of relevant search results was the byproduct of “accidental human error.”
However, the company refused to offer any additional explanation. And, while searching the term “Tank Man” once again produced results in Bing, those results were not the expected ones.
Image via The Verge
Several hours after the first “fix,” things returned to normal, with “Tank Man” searches bringing up the actual Tank Man, rather than just tanks, or men standing near or on tanks.
Image via Twitter user Steven F
Further clarification and comment were sought, but Microsoft apparently had nothing more to say about this “human error” and its conspicuous timing. Nor did it offer any details on whether the “human error” originated with its Beijing team. It also didn’t explain why the first fix produced images very few people would associate with the term “Tank Man.”
Originally posted to the Trust & Safety Foundation website.