Content Moderation Case Study: Decentralized Social Media Platform Mastodon Deals With An Influx Of Gab Users (2019)
from the decentralized-content-moderation-challenges dept
Summary: Formed as a decentralized alternative to Twitter that gives users more direct control over the content they see, Mastodon has experienced slow but steady growth since its inception in 2016.
Unlike other social media networks, Mastodon is built on open-source software and each “instance” (server node) of the network is operated by users. These separate “instances” can be connected with others via Mastodon’s interlinked “fediverse.” Or they can remain independent, creating a completely siloed version of Mastodon that has no connection with the service’s larger “fediverse.”
This puts a lot of power in the hands of the individuals who operate each instance: they can set their own rules, moderate content directly, and prevent anything the “instance” and its users find undesirable from appearing on their servers. But the larger “fediverse” — with its combined user base — poses moderation problems that can’t be handled as easily as those presenting themselves on independent “instances.” The connected “fediverse” allows instances to interact with each other, allowing unwanted content to appear on servers that are trying to steer clear of it.
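The federation model described above can be sketched as a toy program: posts propagate to every connected instance unless the receiving instance has blocked the sender's domain. This is an illustrative model only; the class and function names are hypothetical and do not reflect Mastodon's actual codebase.

```python
# Toy model of Mastodon-style federation with per-instance domain blocks.
# All names here are illustrative, not Mastodon's real implementation.

class Instance:
    def __init__(self, domain):
        self.domain = domain
        self.blocked_domains = set()  # domains this instance refuses to federate with
        self.timeline = []            # federated posts visible to this instance's users

    def block(self, domain):
        """Defederate: refuse all future content from the given domain."""
        self.blocked_domains.add(domain)

    def receive(self, post, origin):
        """Accept a federated post unless its origin instance is blocked."""
        if origin.domain not in self.blocked_domains:
            self.timeline.append((origin.domain, post))


def federate(origin, post, instances):
    """Push a post from one instance to every other instance in the fediverse."""
    for inst in instances:
        if inst is not origin:
            inst.receive(post, origin)


# Example: one instance preemptively blocks an unwanted domain; another does not.
home = Instance("mastodon.example")
other = Instance("other.example")
unwanted = Instance("unwanted.example")
home.block("unwanted.example")

federate(unwanted, "toxic post", [home, other])
print(home.timeline)   # blocked at the receiving instance, so empty
print(other.timeline)  # the post arrives here
```

The key property the case study turns on is visible in the example: blocking happens per receiving instance, so there is no single switch that removes a server from the whole network.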
That’s where Gab — another Twitter alternative — enters the picture. Gab has purposely courted users banned from other social media services. Consequently, the platform has developed a reputation for being a haven for hate speech, racists, and bigots of all varieties. This toxic collection of content/users led to both Apple and Google banning Gab’s app from their app stores.
Faced with this app ban, Gab began looking for options. It decided to create its own Mastodon instance. With its server now technically available to everyone in the Mastodon “fediverse,” instances not explicitly blocking Gab’s “instance” could expose their users to Gab content, and allow Gab’s users to direct content at theirs. It also allowed Gab to use the many existing Mastodon apps to sidestep the app bans handed down by Google and Apple.
Decisions to be made by Mastodon:
Should Gab (and its users) be banned from setting up “instances,” given that they likely violate the Mastodon Server Covenant?
Is it possible to moderate content across a large number of independent nodes?
Is this even an issue for Mastodon itself to deal with, given that the individuals running different servers can decide for themselves whether or not to allow federation with the Gab instance?
Given the open source and federated nature of Mastodon, would there reasonably be any way to stop Gab from using Mastodon?
Questions and policy implications to consider:
Will moderation efforts targeting the “fediverse” undercut the independence granted to “instance” owners?
Do attempts to attract more users create moderation friction when the newly-arriving users create content Mastodon was created to avoid?
If Mastodon continues to scale, will it always face challenges as certain instances are created to appeal to audiences that the rest of the “fediverse” is trying to avoid?
Can a federated system, in which unique instances choose not to federate with another instance, such as Gab, work as a form of “moderation-by-exclusion”?
Resolution: Mastodon’s founder, Eugen Rochko, refused to create a blanket ban on Gab, leaving it up to individual “instances” to decide whether or not to interact with the interlopers. As he explained to The Verge, a blanket ban would be almost impossible, given the decentralized nature of the service.
On the other hand, most “fediverse” members would be unlikely to have to deal with Gab or its users, considering the content on Gab’s “instance” routinely violates the Mastodon “covenant.” Violating these rules prevents instances from being listed by Mastodon itself, lowering the chances of other “instance” owners inadvertently adding toxic content and users to their server nodes. And Rochko himself encouraged users to preemptively block Gab’s “instance,” resulting in even fewer users being affected by Gab’s attempted invasion of the Mastodon fediverse.
But running a decentralized system creates an entirely new set of moderation issues, which has turned Mastodon itself into a moderation target. Roughly a year after the Gab “invasion,” Google threatened to pull Mastodon-based apps from its store for promoting hate speech, after users tried to get around the Play Store ban by creating apps that pointed to Mastodon “instances” filled with hateful content. Google ultimately decided to leave Mastodon-based apps up, but appears ready to pull the trigger on a ban in the future.
Originally posted to the Trust & Safety Foundation website.