The Tech Policy Greenhouse is an online symposium where experts tackle the most difficult policy challenges facing innovation and technology today. These are problems that don't have easy solutions, where every decision involves tradeoffs and unintended consequences, so we've gathered a wide variety of voices to help dissect existing policy proposals and better inform new ones.

That's A Wrap: Techdirt Greenhouse Content Moderation Edition

from the building-a-better,-more-ethical-internet dept

When we launched Techdirt Greenhouse, we noted that we wanted to build a tech policy forum that not only tackled the thorniest tech policy issues of the day, but did so with a little more patience and nuance than you'll find at many gadget-obsessed technology outlets. After our inaugural panel tackled privacy, we just wrapped on our second panel subject: content moderation. We'd like to thank all of those that participated in the panel, and all of you for reading.

You'd be hard pressed to find a thornier, more complicated subject than content moderation. On one hand, technology giants have spent years prioritizing ad engagement over protecting their user base from malicious disinformation and hate speech, often with fatal results. At the same time, many of the remedies being proposed cause more harm than good by trampling free speech, or putting giant corporations into the position of arbiters of acceptable public discourse. Moderation at this scale is a nightmare. One misstep in federal policy and you've created an ocean of new problems.

Whether it's the detection and deletion of live-streaming violence, or protecting elections from foreign and domestic propaganda, it's a labyrinthine, multi-tendriled subject that can flummox even experts in the field. We're hopeful that this collection of pieces helped inform the debate in a way that simplified some of these immensely complicated issues. Here's a recap of the pieces from this round in case you missed them:

  • Michael Karanicolas examined how localized content moderation decisions can have a massive, often unpredictable global impact, as disinformation-fueled genocide makes abundantly clear.
  • Robert Hamilton explored the need to revisit the common law liability of online intermediaries before Section 230, helping us better understand how we got here.
  • Jess Miers explored how getting rid of Section 230 won't magically eliminate the internet's most problematic content.
  • Aye Min Thant took a closer look at how conflating Facebook with "the internet" in locations like Myanmar, without understanding the culture or having adequate safeguards in place, threw accelerant on the region's genocide.
  • Matthew Feeney examined how evidence "supporting" the repeal of Section 230 is shaky at best, and the fixation on Section 230 is hugely myopic.
  • John Bergmayer argued that it doesn't make sense to treat ads the same as user-generated content, and that websites should face the same legal risk for the ads they run that print publishers do.
  • Brandi Collins-Dexter explored how the monetization of polarization has had a heartbreaking impact on America's deep, longstanding relationship with bigotry.
  • Emma Llanso discussed how the sharing of content moderation knowledge shouldn't provide a backdoor to cross-platform censorship.
  • David Morar explored how many of the problems currently being blamed on "big tech" are simple, ordinary, human fallibility.
  • Yosef Getachew examined how social media could easily apply many of the content moderation practices they've custom-built for COVID-19 to the battle to protect election integrity from domestic and foreign disinformation.
  • Adelin Cai and Clara Tsao offered a useful primer for trust and safety professionals tasked with tackling the near-impossible task of modern content moderation at scale.
  • Gaurav Laroia & Carmen Scurato discussed how fighting online hate speech requires keeping Section 230, not discarding it.
  • Taylor Rhyne offered a useful content moderation primer for startups facing a daunting challenge without the bottomless budgets of their "big tech" counterparts.
  • Graham Smith took a closer look at the content moderation debate and how it intersects with existing post-Brexit headaches in the UK.
  • Daphne Keller took a deep dive into what policy makers can do if they don't like existing platform free speech rules, and how none of the options are particularly great.
Much like the privacy debate, crafting meaningful content moderation guidelines and rules (and ensuring consistent, transparent enforcement) was a steep uphill climb even during the best of times. Now the effort will share fractured attention spans and resources with an historic pandemic, the recovery from the resulting economic collapse, and the endless web of socioeconomic and political dysfunction that is the American COVID-19 crisis. But, much like the privacy debate, it's an essential discussion to have all the same, and we hope folks found this collection informative.

Again, we'd like to thank our participants for taking the time to provide insight during an increasingly challenging time. We'd also like to thank Techdirt readers and commenters for participating. In a few weeks we'll be announcing the next panel; one that should prove timely during an historic health crisis that has forced the majority of Americans to work, play, innovate, and learn from the confines of home.


Thank you for reading this Techdirt post. With so many things competing for everyone’s attention these days, we really appreciate you giving us your time. We work hard every day to put quality content out there for our community.

Techdirt is one of the few remaining truly independent media outlets. We do not have a giant corporation behind us, and we rely heavily on our community to support us, in an age when advertisers are increasingly uninterested in sponsoring small, independent sites — especially a site like ours that is unwilling to pull punches in its reporting and analysis.

While other websites have resorted to paywalls, registration requirements, and increasingly annoying/intrusive advertising, we have always kept Techdirt open and available to anyone. But in order to continue doing so, we need your support. We offer a variety of ways for our readers to support us, from direct donations to special subscriptions and cool merchandise — and every little bit helps. Thank you.

–The Techdirt Team

Filed Under: content moderation, tradeoffs


Reader Comments

Anonymous Coward, 16 Sep 2020 @ 12:31pm

I have really appreciated the work of everyone authoring posts for this, even those with which I had a good deal of disagreement. You have done a smashing job of documenting and analyzing the current and possible future landscapes, among other things, and have provided a great deal of clarity on the subject (or allied subjects), which provides high-grade thinking fuel.

