Techdirt's think tank, the Copia Institute, is working with the Trust & Safety Professional Association and its sister organization, the Trust & Safety Foundation, to produce an ongoing series of case studies about content moderation decisions. These case studies are presented in a neutral fashion, not aiming to criticize or applaud any particular decision, but to highlight the many different challenges that content moderators face and the tradeoffs involved. Find more case studies here on Techdirt and on the TSF website.

Content Moderation Case Study: Dealing With 'Cheap Fake' Modified Political Videos (2020)

from the political-moderation dept

Summary: For years now, concerns have been raised about the possibility of “deep fake” videos impacting an election. Deep fakes are videos that have been digitally altered, often to insert someone’s face onto another person’s body, to make it appear that they were somewhere they were not or did something they did not. To date, most of the more sophisticated deep fake videos have been made mainly for entertainment purposes, but there has been concern that they could lead to faked accusations against politicians or other public figures. However, so far, there has been little evidence of deep fake videos being used in elections. This may be because the technology is not yet good enough, or because such videos have been easy to debunk through other evidence.

Meanwhile, there has been increasing concern about something slightly different: cheap fake or shallow fake videos, which are just slight modifications and adjustments to real videos—less technically sophisticated, but also potentially harder to combat.

One of the most high-profile examples of this was a series of videos that went viral on social media of House Speaker Nancy Pelosi, modified by slowing the footage to 75% of its original speed. The modified videos were spread with false claims that they showed Pelosi slurring her words, possibly indicating intoxication. Various media organizations fact-checked the claims, noting that the videos were altered and therefore presented a very inaccurate picture of Pelosi and her speech patterns.

Social media companies were urged by some to delete these videos, including by Speaker Pelosi herself, who argued that Facebook in particular should remove them. Both Facebook and Twitter refused to take down the videos, saying that they did not violate their policies. YouTube removed the video.

In response to concerns raised by Pelosi, some noted that it would be unreasonable to expect social media platforms to remove every misleading political statement that took an opponent’s words out of context or presented them in a misleading way, while others suggested that there is a clear difference between manipulated video and manipulated text.

Others highlighted that it would be difficult to distinguish manipulated video from satire or other attempts at humor.

Company Considerations:

  • Where should companies draw the line between misleading political content and deliberate misinformation?
  • Under what conditions would a misleading cheap fake video separately violate other policies, such as harassment?
  • What should the standards be for removing political content that could be deemed misleading?
  • Does medium matter? Should there be different rules for manipulated videos as compared to other types of content, such as taking statements out of context? 
  • Should there be exceptions for parody/satire?
  • Are there effective ways for distinguishing videos that are manipulated to mislead vs. those that are manipulated for humor or commentary? 
  • Should the company have different standards if the subject of a cheap fake video was not a political or public figure? 
  • What are other alternative approaches, beyond blocking, that could be used to address manipulated political videos?
Issue Considerations:
  • What are the possible unintended consequences if all “manipulated video” is deemed a policy violation?
  • What, if any, is the value of not removing videos of political or public figures that are clearly misleading? Would there be any unintended consequences of such a policy?
  • What are the implications for democracy if manipulated political videos are allowed to remain on a platform, where they may spread virally?
  • Are misleading “cheap fake” videos about politicians considered political speech?
  • Who should decide when “cheap fake” political speech is inaccurate and inappropriate—should it be social media platforms, a general public consensus, or a third-party body?
  • How might “cheap fake” videos be used for harassment and bullying of non-public figures, and what are the potential implications for real life harm? 
  • If the cheap fake video didn’t originate from a public source (as in the Pelosi video) but a private video, how could a company determine that those videos were manipulated?   
Resolution: After public demands that Twitter, YouTube and Facebook do something about the modified Pelosi video, the three major social media platforms each responded differently. YouTube took the video down, saying that it violated its policies. YouTube also noted that, unlike on the other platforms, the modified Pelosi video did not appear to go viral or spread widely on its platform. Facebook kept the video up but, in accordance with its fact-checking policies, de-ranked the video in its news feed once it was deemed “false,” limiting its spread.

Twitter left the video up. However, by the time a very similar event happened a few months later, Twitter had announced a new plan for dealing with such content, saying that it would begin adding labels to “manipulated media,” offering context for people who came across such videos so they would understand that the video was not being shown in its original context. One of the first examples of Twitter applying this “manipulated media” label was to a comedy segment by late-night TV entertainer Jimmy Kimmel, who used some manipulated video to make fun of former Vice President Mike Pence.

Originally posted to the Trust & Safety Foundation website.
Thank you for reading this Techdirt post. With so many things competing for everyone’s attention these days, we really appreciate you giving us your time. We work hard every day to put quality content out there for our community.

Techdirt is one of the few remaining truly independent media outlets. We do not have a giant corporation behind us, and we rely heavily on our community to support us, in an age when advertisers are increasingly uninterested in sponsoring small, independent sites — especially a site like ours that is unwilling to pull punches in its reporting and analysis.

While other websites have resorted to paywalls, registration requirements, and increasingly annoying/intrusive advertising, we have always kept Techdirt open and available to anyone. But in order to continue doing so, we need your support. We offer a variety of ways for our readers to support us, from direct donations to special subscriptions and cool merchandise — and every little bit helps. Thank you.

–The Techdirt Team

Filed Under: cheap fakes, content moderation, deep fakes, modified content, nancy pelosi, politics, shallow fakes, social media
Companies: facebook, twitter, youtube


Reader Comments
    ECA (profile), 9 Jul 2021 @ 11:16am

    Fakes have been around a long time.

    Here's a suggestion to make things easy: the creator, and the true persons who paid for the video, must be acknowledged, even if only in a one-frame acknowledgment. If the government or state has to dig up that info, it should be prosecutable.
