Techdirt's think tank, the Copia Institute, is working with the Trust & Safety Professional Association and its sister organization, the Trust & Safety Foundation, to produce an ongoing series of case studies about content moderation decisions. These case studies are presented in a neutral fashion, not aiming to criticize or applaud any particular decision, but to highlight the many different challenges that content moderators face and the tradeoffs involved in those decisions. Find more case studies here on Techdirt and on the TSF website.

Content Moderation Case Study: Facebook Knew About Deceptive Advertising Practices By A Group That Was Later Banned For Operating A Troll Farm (2018-2020)

from the election-advertising-moderation dept


In the lead-up to the 2018 midterm elections in the United States, progressive voters in seven competitive races in the Midwest were targeted with a series of Facebook ads urging them to vote for Green Party candidates. The ads, which came from a group called America Progress Now, included images of and quotes from prominent progressive Democrats including Bernie Sanders and Alexandria Ocasio-Cortez with the implication that these politicians supported voting for third parties. 

The campaign raised eyebrows for several reasons. Two of the featured candidates stated that they had not approved the ads and had never said or written the supposed quotes run alongside their photos, and six of the candidates stated that they had no connection to the group. The office of Senator Sanders asked Facebook to remove the campaign, calling it “clearly a malicious attempt to deceive voters.” Most notably, an investigation by ProPublica and VICE News revealed that America Progress Now was not registered with the Federal Election Commission, nor was any such organization present at the address listed on its Facebook page.

In response to Senator Sanders’ office, and in a further statement to ProPublica and VICE, Facebook stated that it had investigated the group and found no violation of its advertising policies or community standards.

Two years later, during the lead-up to the 2020 presidential election, an investigation by the Washington Post revealed a “troll farm”-type operation directed by Rally Forge, a digital marketing firm with connections to Turning Point Action (an affiliate of the conservative youth group Turning Point USA), in which multiple teenagers were recruited and directed to post pro-Trump comments using false identities on both Facebook and Twitter. This revelation resulted in multiple accounts being removed by both companies, and Rally Forge was permanently banned from Facebook.

As it turned out, these two apparently separate incidents were in fact closely connected: an investigation by The Guardian in June of 2021, aided in part by Facebook whistleblower Sophie Zhang, discovered that Rally Forge had been behind the America Progress Now ads in 2018. Moreover, Facebook had been aware of the source of the ads and their deceptive nature, and of Rally Forge’s connection to Turning Point, when it determined that the ads did not violate its policies. The company did not disclose these findings at the time. Internal Facebook documents, seen by The Guardian, recorded concerns raised by a member of Facebook’s civic integrity team, noting that the ads were “very inauthentic” and “very sketchy.” In the Guardian article, Zhang asserted that “the fact that Rally Forge later went on to conduct coordinated inauthentic behavior with troll farms reminiscent of Russia should be taken as an indication that Facebook’s leniency led to more risk-taking behavior.”

Company considerations:

  • What is the best way to address political ads that are known to be intentionally deceptive but do not violate specific advertising policies?
  • What disclosure policies should be in place for internal investigations that reveal the questionable provenance of apparently deceptive political ad campaigns?
  • When a group is known to have engaged in deceptive practices that do not violate policy, what additional measures should be taken to monitor the group in case future actions involve escalations of deceptive and manipulative tactics?

Issue considerations:
  • How important should the source and intent of political ads be when determining whether or not they should be allowed to remain on a platform, as compared to the content of the ads themselves?
  • At what point should apparent connections between a group that violates platform policies and a group that did not directly engage in the prohibited activity result in enforcement actions against the latter group?

A Facebook spokesperson told The Guardian that the company had “strengthened our policies related to election interference and political ad transparency” in the time since its 2018 investigation, which had found no policy violations by America Progress Now. The company also introduced a new policy intended to increase transparency about the operators of networks of Facebook Pages.

Rally Forge and one of its page administrators remain permanently banned from Facebook following the 2020 troll farm investigation. Turning Point USA and Turning Point Action deny any involvement in the specifics of either campaign, and Facebook has taken no direct enforcement action against those groups.

Originally posted to the Trust and Safety Foundation website.


Thank you for reading this Techdirt post. With so many things competing for everyone’s attention these days, we really appreciate you giving us your time. We work hard every day to put quality content out there for our community.

Techdirt is one of the few remaining truly independent media outlets. We do not have a giant corporation behind us, and we rely heavily on our community to support us, in an age when advertisers are increasingly uninterested in sponsoring small, independent sites — especially a site like ours that is unwilling to pull punches in its reporting and analysis.

While other websites have resorted to paywalls, registration requirements, and increasingly annoying/intrusive advertising, we have always kept Techdirt open and available to anyone. But in order to continue doing so, we need your support. We offer a variety of ways for our readers to support us, from direct donations to special subscriptions and cool merchandise — and every little bit helps. Thank you.

–The Techdirt Team

Filed Under: advertising, content moderation, elections, russia, troll farm
Companies: facebook

Reader Comments



  • That One Guy (profile), 12 Jan 2022 @ 7:27pm

    'We need to be HEAVILY regulated.' -Mark Zuckerberg

    I can't help but suspect that if the group had been using pics of Facebook execs and attributing dishonest quotes to them in favor of some political position the company didn't agree with, that would have been enough for them to find something to nail them on. But since it was just a bunch of liars trying to convince people not to vote Democrat, it was seen as no big deal.


  • Anonymous Coward, 13 Jan 2022 @ 6:23am

    Cancel FASCBOOK


