The Tech Policy Greenhouse is an online symposium where experts tackle the most difficult policy challenges facing innovation and technology today. These are problems that don't have easy solutions, where every decision involves tradeoffs and unintended consequences, so we've gathered a wide variety of voices to help dissect existing policy proposals and better inform new ones.

A New Hope For Moderation And Its Discontents?

from the decentralization-as-the-new-frontier dept

In his post kicking off this series, Mike notes that, “the biggest concern with moving moderation decisions down the stack is that most infrastructure players only have a sledge hammer to deal with these questions, rather than a scalpel.” And, I agree with Jonathan Zittrain and other contributors that governments, activists, and others will increasingly reach down the stack to push for takedowns—and will probably get them. 

So, should we expect more blunt-force takedowns at the infrastructure layer, or will infrastructure companies invest in more precise moderation tools? And which outcome is worse?

Given the choice to build infrastructure now, would you start with a scalpel? How about many scalpels? Or maybe something less severe but distributed and transparent, like clear plastic spoons everywhere! Will the moderation hurt less if we’re all in it together? With the distributed web, we may get to ask all these questions, and have a chance to make things better (or worse). How?

Let me back up a moment for some mostly accurate natural history. In the 90s, to vastly oversimplify, there was Web 1.0: static, server-side pages, more manual than you'd like sometimes, maybe not so easy to search or monetize at scale, but fundamentally decentralized and open. We had webrings and manually curated search lists. Listening to Nirvana in my dorm room, I read John Perry Barlow’s announcement that "We are forming our own Social Contract. This governance will arise according to the conditions of our world, not yours. Our world is different," in a green IRC window and believed. 

Ok, not every feature was that simple or open or decentralized. The specter of content moderation haunted the Internet from the early days of email and bulletin boards. In 1978, a marketer for DEC sent out the first unsolicited commercial message on ARPANET and a few hundred people told him to knock it off, Gary! Voilà, community moderation. 

Service providers like AOL and Prodigy offered portals through which users accessed the web and associated chat rooms, and the need to protect the brand led to predictable interventions. There's a Rosetta Stone of AOL content moderation guidelines from 1994 floating around to remind us that as long as there have been people expressing themselves online, there have been other people doing their best to create workable rule sets to govern that expression and endlessly failing in comic and tragic ways (“‘F--- you’ is vulgar, but ‘my *** hurts’ is ok”). 

Back in the Lascaux Cave there was probably someone identifying naughty animal parts and sneaking over with charcoal to darken them out, for the community, and storytellers who blamed all the community’s ills on that person.

And then, after the new millennium, little by little and then all at once, came Web 2.0—the Social Web. JavaScript frameworks, personalization, everyone a creator and consumer within (not really that open) structures we now call "Platforms" (arguably even less open when accessed through their proprietary mobile apps rather than the web). It became much easier for anyone to create, connect, communicate, and distribute expression online without having to design or host their own pages. We got more efficient at tracking and ad targeting, and at using those algorithms to serve you things similar to the other things you liked.

We all started saying a lot of stuff and never really stopped. If you're a fan of expression in general, and especially of people who previously didn't have great access to distribution channels expressing themselves more, that's a win. But let's be honest: 500 million tweets a day? We've been on an expression bender for years. And that means companies spending billions, and tens of thousands of enablers—paid and unpaid—supporting our speech bender. Are people happy with the moderation we're getting? Generally not. Try running a platform. The moderation is terrible and the portions are so large!

Who’s asking for moderation? Virtually everyone in different ways. Governments want illegal content (CSAM, terrorist content) restricted on behalf of the people, and some also want harmful but legal content restricted in ways that are still unclear, also for the people. Many want harmful content restricted, which means different things depending on which people, which place, which culture, which content, which coffee roast you had this morning. Civil society groups generally want content restricted related to their areas of expertise and concern (except EFF, who will party like it's 1999 forever I hope). 

There are lots of types of expression where at least some people think moderation is appropriate, for different reasons; misinformation is different from doxxing is different from harassment is different from copyright infringement is different from spam. Often, the same team deals with election protection and kids eating Tide Pods (and does both surprisingly well, considering). There’s a lot to moderate and lots of mutually inconsistent demand to do it coming from every direction.

Ok, so let’s make a better internet! Web 3 is happening and it is good. More specifically, as Chris Dixon recently put it, “We are now at the beginning of the Web 3 era, which combines the decentralized, community-governed ethos of Web 1 with the advanced, modern functionality of Web 2.” Don’t forget the blockchain. Assume that over the next few years, Web 3 infrastructure gets built out and flourishes—projects like Arweave, Filecoin, Polkadot, Sia, and Storj. And applications eventually proliferate; tools for expression, creativity, communication, all the things humans do online, all built in ways that embody the values of the DWeb. 

But wait, the social web experiment of the past 15 years led us to build multi-billion dollar institutions within companies aimed at mitigating harms (to individuals, groups, societies, cultural values) associated with online expression and conduct, and increasingly, complying with new regulations. Private courts. Private Supreme Courts. Teams for safeguarding public health and democratic elections. Tens of thousands poring over photos of nipples, asking, where do we draw the line? Are we going to do all that again? One tempting answer is, let’s not. Let’s fire all the moderators. What’s the worst that could happen?

Another way of asking this question: what do we mean when we talk about “censorship resistant” distributed technologies? Censorship resistance has been an element of the DWeb since the early days, but it’s not very clear (to me, at least) how resistant, to which censorship, and in what ways.

My hunch is that censorship resistance—in the purest sense of defaulting to immutable content with no possible later interventions affecting its availability—is probably not realistic, given how people and governments have responded to Web 2.0. The likely outcome is quick escalation into intense conflict with the majority of governments. 

And even for people who still favor a marketplace-of-ideas-grounded “rights” framework, I think they know better than to argue that the cure for CSAM is more speech. There will either have to be ways of intervening or the DWeb is going to be a bumpy ride. But “censorship resistant” in the sense of, “how do we build a system where it is not governments, or a small number of powerful, centralized companies that control the levers at the important choke points for expression?” Now we’re talking. Or as Paul Frazee of Beaker Browser and other distributed projects put it: “The question isn't ‘how do we make moderation impossible?’ The question is, how do we make moderation trustworthy?”

So, when it comes to expression and by extension content moderation, how exactly are we going to do better? What could content moderation look like if done consistent with the spirit, principles, and architecture of Web 3? What principles can we look to as a guide? 

I think the broad principles will come as no surprise to anyone following this space over the past few years (and are not so different from those outlined in Corynne McSherry’s post). They include notice, transparency, due process, the availability of multiple venues for expression, and robust competition between options on many axes—including privacy and community norms, as well as the ability of users to structure their own experience as much as possible. 

Here are some recurring themes:

  • Transparency: Clear, broadly available, and auditable information about what interventions are being made, including at the infrastructure level, is a precondition to empowering more people on the web. Rather than building in transparency as an adjunct to moderation (like the essential Lumen database), what if we baked it in from the start: a moderation system, with accommodations and exceptions for privacy where needed, that defaults to publicly viewable records as part of every act of intervention?
  • Multiplicity: More nodes, with smaller power differentials between them. More storage providers, more ISPs, more ways of accessing expression, more contexts for encountering it, and more communities with attendant norms, controlling more layers of the stack down to the infra.
  • Subsidiarity: How do we give users as much control over their own experience as possible? Allow users to customize their applications so that, if desired, certain infrastructure is favored or disfavored (as opposed to blocked or not blocked), through tools like shareable open source lists of disfavored files, actors, institutions, with reliable information about their provenance. Push decisions to the edge.
  • Identity & Reputation: Can we structure more experience based on a person’s chosen, persistent, potentially pseudonymous identity? Ideally, one assembled from multiple, decentralized sources, with multiple implementations parsing that data, rather than one or a few arbiters who store the data and make a broadly effective determination. 
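To make the transparency and subsidiarity ideas above a bit more concrete, here is a minimal sketch of what a shareable, auditable "disfavor list" might look like, with each entry carrying its provenance and client applications deciding how much weight to give it. Everything here—the `ModerationEntry` record, the `rank_content` helper, the field names—is invented for illustration; no existing DWeb project defines this exact format.

```python
# Hypothetical sketch of a shareable moderation list whose entries carry
# provenance, and a client-side ranking function that treats the list as a
# weight rather than a wall. All names are invented for illustration.

import hashlib
import json
from dataclasses import dataclass, asdict


@dataclass(frozen=True)
class ModerationEntry:
    content_hash: str   # identifier of the disfavored content
    reason: str         # e.g. "spam", "harassment"
    issuer: str         # who published this judgment (provenance)
    action: str         # "disfavor" rather than "block": a nudge, not a ban


def entry_id(entry: ModerationEntry) -> str:
    """Stable ID so entries can be audited and deduplicated across sources."""
    payload = json.dumps(asdict(entry), sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()


def rank_content(items, subscribed_lists, penalty=10):
    """Client-side ranking: each user chooses which lists to subscribe to,
    and disfavored content sinks in the ranking instead of disappearing,
    keeping the decision at the edge, with the user."""
    disfavored = {
        e.content_hash
        for lst in subscribed_lists
        for e in lst
        if e.action == "disfavor"
    }
    # Lower score sorts earlier; disfavored items sink but remain reachable.
    return sorted(items, key=lambda h: (h in disfavored) * penalty)
```

Because every entry names its issuer and has a stable ID, lists from different communities can be published, compared, and audited—and a user who distrusts a particular issuer can simply unsubscribe, rather than petition a central moderator.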

We’re in the early days, so the question of how we build decentralized systems resilient to broad attempts to censor—in ways that are familiar from Web 2.0—is still open. But the applications are here. The Mastodon project and Element have been at it since 2016. Twitter’s Bluesky project aims to build decentralized standards for social media, with a leader in Jay Graber and a rocking Discord. If Web 2.0 has got you down, consider the possibility that Web 3 might be worth another shot.

This time around, let’s not have a small number of actors (commercial or otherwise) making decisions for a large number of people about what expression they can or can’t publish or access. Let’s have more of everything. More decision makers, more venues for expression, more chances for users to make small decisions about what they want to see or not see, more civil society stakeholders with real influence (operationalized through, say, governance tokens or whatever other means are at hand). 

At the infrastructure level, we need more nodes, more actors, more storage providers, more mechanisms for choosing what is available to more disparate groups, and more ways to make it all visible. With all that and some luck, it may add up to fewer and fewer decisionmakers able to decide what comes down at scale with a sledgehammer: a vaccine for capital-C Censorship through the radical proliferation of small-m moderators.

Alex Feerst is a legal and tech executive and trust & safety expert, and is currently CEO of Murmuration Labs, whose clients include the Filecoin Foundation.

Techdirt and EFF are collaborating on this Techdirt Greenhouse discussion. On October 6th from 9am to noon PT, we'll have many of this series' authors discussing and debating their pieces in front of a live virtual audience (register to attend here).



Filed Under: content moderation, cryptocurrency, decentralized, decentralized services, infrastructure, web3


Reader Comments



  • ECA (profile), 5 Oct 2021 @ 1:04pm

    Moderation sucks

    It comes to a point where you have to be on one side or the other.
    You can't go in and edit someone's content, as it might then be construed as your own, and then you can be responsible for it.
    Controlling words individually, like curses and cussing and verbal assaults, is also editing.
    With the government not enforcing any privacy laws, what can be done? The only active laws so far are for hacking/breaking into systems, or using them without consent.

    There is something about privacy that the corps want. It's a way to control who is responsible. With tons of data released, including most medical records and SSNs, faking things gets very easy, including internationally.
    But who gets to use it?
    There have been so many faked phone calls recently, it's almost as bad as 2005, when the FTC and FCC stepped in to track who was doing it. But they aren't doing much this time.
    In the past they found the person reasonably close, in the USA or Canada. Can't figure it was only one person.
    It's a strange thought, from a long time ago: the military and government can't do certain things from inside the country, but what if it's done from the outside?
    How does the government track the government doing external things inside the USA? They know the tricks, as they created them, and if you use the same agency to do the work as the external one, who controls it?
    Hardest thing to know is what to do with all the data. Sell it internationally? Keep it? Use it to bankrupt the US banks?
    This gets too deep and it's hard to tell who, what, when, where, and any of the rest. I could see the corps doing it; I could see other governments doing it.


  • Anonymous Coward, 5 Oct 2021 @ 1:20pm

    Every talk of decentralization that veers into talk of NFTs, like that Chris Dixon Twitter thread, gives me pause. I don’t want to have to engage with NFTs, crypto, or the blockchain to be able to engage with a decentralized web. I don’t want to have an NFT Social Credit Score mandated for me to do certain things on social media. I don’t want my life and online identity financialized more than it is. I just want us to reach a point where I can use my Twitter account to interact with people on a larger number of services, or have my Twitter feed bring me posts from my friends or artists I follow that are on other services, or eventually take my entire post history to another site and not have it all on Twitter.

    Greater user choice should also mean “One shouldn’t have to deal with cryptocurrencies or other stuff they object to in order to have the same experience they have right now, or to improve their experience.” We shouldn’t have to engage with tech whose description is eerily similar to what pyramid schemes are described as to use social media.


  • Pixelation, 5 Oct 2021 @ 6:53pm

    "A New Hope" You are so getting sued by Disney.


  • Anonymous Coward, 15 Oct 2021 @ 11:09am

    Allegedly censorship resistance is built into Bitcoin. What DOES happen if a notation on some transaction includes an encrypted file, and two weeks later it comes out that "happybirthday" decrypts it into a toddler losing her virginity? What is Bitcoin's censorship plan? Anybody know?


