Content Moderation Case Study: Facebook Attracts International Attention When It Removes A Historic Vietnam War Photo Posted By The Editor-in-Chief Of Norway's Biggest Newspaper (2016)
from the the-terror-of-content-moderation dept
Summary: Tom Egeland, a Norwegian author of several best-selling novels, posted the photograph known as “The Terror of War” to Facebook. The historic image (taken by Vietnamese-American photographer Nick Ut) depicts a naked Vietnamese girl running from a napalm attack during the Vietnam War.
Ut’s iconic photo brought the horrors of the war in Vietnam to viewers around the world. But it was not without controversy. Given the full-frontal nudity of the child depicted in the image, the Associated Press initially pushed back against running it, citing its policy against publishing nudity; because the subject was a child, the photo met more resistance than usual. Ultimately, the AP decided to run it, and Ut won a Pulitzer Prize for the photo in 1973.
Despite the photo’s historical significance, Facebook decided to suspend Tom Egeland’s account. It also deleted his post.
Facebook’s decision was based on its terms of service. While the photo is undeniably a historical artifact, the platform’s moderation efforts were not attuned to that history.
A notice sent to Egeland pointed out that any displayed genitalia would trigger moderation. In addition, given the platform’s obligation to report Child Sexual Abuse Material (CSAM) to the government, leaving up a photo of a naked prepubescent child posed problems the algorithms couldn’t necessarily handle on their own.
The decision to remove the post and suspend the author’s account prompted an open letter from Norwegian journalist Espen Hansen. The letter — addressed to Facebook founder and CEO Mark Zuckerberg — asked what negative effects moderation efforts like these would have on a “democratic society.”
Decisions to be made by Facebook:
- Should automated moderation that aids law enforcement be overridden when context shows a post is not attempting to sidestep rules meant to protect Facebook users from abusive content?
- What value is placed on context-sensitive moderation? Does it add to or subtract from financial obligations to shareholders?
- Does it serve users better to be more responsive — and helpful — when context is a primary consideration?
Questions and policy implications to consider:
- Is the collateral damage of negative press like this offset by Facebook’s willingness to be proactive when removing questionable content?
- Is it more important to serve private users than the numerous governments making moderation demands?
- Do inexact or seemingly incoherent responses to controversial content raise the risk of government intervention?
Resolution: Despite the letter from a prominent Norwegian journalist, Facebook refused to reinstate the photo. Instead, it offered boilerplate stating its objection to “nude genitalia.” While it said it made “allowances” for “educational, humorous, and satirical purposes,” Ut’s photo apparently did not make the cut. Facebook asked Aftenposten, Egeland, and/or Hansen to “pixelate” the iconic photo before reposting it. Aftenposten’s Hansen responded with his own modified version of the image.
Unfortunately, Facebook did not see the pointed humor of Hansen’s modification. Facebook’s deletion of the original — as well as its suspension of author Tom Egeland’s account — remained in force. While public shaming has had some effect on moderation efforts by social media companies, Facebook’s stance on nudity — especially the nudity of minors — prevented it from backing down in the face of negative publicity.