Techdirt's think tank, the Copia Institute, is working with the Trust & Safety Professional Association and its sister organization, the Trust & Safety Foundation, to produce an ongoing series of case studies about content moderation decisions. These case studies are presented in a neutral fashion, not aiming to criticize or applaud any particular decision, but to highlight the many different challenges that content moderators face and the tradeoffs those decisions involve. Find more case studies here on Techdirt and on the TSF website.

Content Moderation Case Study: Facebook Attracts International Attention When It Removes A Historic Vietnam War Photo Posted By The Editor-in-Chief Of Norway's Biggest Newspaper (2016)

from the the-terror-of-content-moderation dept

Summary: Tom Egeland, a Norwegian author of a number of best-selling fiction books, posted a well-known photo known as "The Terror of War" to Facebook. The historic photograph (taken by Vietnamese-American photographer Nick Ut) depicts a naked Vietnamese girl running from a napalm attack during the Vietnam War.

Ut's iconic photo brought the horrors of the war in Vietnam to viewers around the world. But it was not without controversy. Given the full-frontal nudity of the child depicted in the image, the Associated Press initially pushed back against running Ut's photo, citing its policy against publishing nudity. In this case, the fact that the nude subject was a child generated more resistance than usual. Ultimately, the AP decided to run the photo, and Ut won a Pulitzer Prize for it in 1973.

Despite the photo's historical significance, Facebook decided to suspend Tom Egeland's account. It also deleted his post.

Facebook's decision was based on its terms of service. While the photo was undeniably a historical artifact, the platform's moderation efforts were not attuned to that history.

A notice sent to Egeland pointed out that any display of genitalia would result in moderation. In addition, given the platform's legal obligation to report Child Sexual Abuse Material (CSAM) to the government, leaving up a photo of a naked prepubescent child posed problems the platform's algorithms couldn't necessarily handle on their own.

The decision to remove the post and suspend the author's account resulted in an open letter from Espen Hansen, editor-in-chief of the Norwegian newspaper Aftenposten. The letter -- addressed to Facebook founder and CEO Mark Zuckerberg -- asked what negative effects moderation efforts like these would have on a "democratic society."

Decisions to be made by Facebook:

  • Should automated moderation that aids law enforcement be overridden when context shows a post is not attempting to sidestep rules meant to protect Facebook users from abusive content?
  • What value is placed on context-considerate moderation? Does it add or subtract from financial obligations to shareholders?
  • Does it serve users better to be more responsive -- and helpful -- when context is a primary consideration?

Questions and policy implications to consider:

  • Is the collateral damage of negative press like this offset by Facebook's willingness to be proactive when removing questionable content?
  • Is it more important to serve private users than the numerous governments making moderation demands?
  • Do inexact or seemingly-incoherent responses to controversial content raise the risk of government intervention?
Resolution: Despite the letter from a prominent Norwegian journalist, Facebook refused to reinstate the photo. Instead, it offered boilerplate stating its objection to "nude genitalia." While Facebook said it made "allowances" for "educational, humorous, and satirical purposes," Ut's photo apparently did not make the cut. Facebook asked Aftenposten, Egeland, and/or Hansen to "pixelate" the iconic photo before reposting it. Aftenposten's Hansen responded with a pointedly modified version of the photo.

Unfortunately, Facebook did not see the pointed humor of Hansen's modification. Facebook's deletion of the original -- as well as its suspension of author Tom Egeland's account -- remained in force. While public shaming has had some effect on moderation efforts by social media companies, Facebook's stance on nudity -- especially the nudity of minors -- prevented it from backing down in the face of negative publicity.



Filed Under: content moderation, csam, historic photos, nudity, photo, tom egeland, vietnam
Companies: facebook

