Internet Content Moderation Isn't Politically Biased, It's Just Impossible To Do Well At Scale

from the stop-this-dumb-narrative dept

The narrative making the political rounds recently is that the big social media platforms are somehow "biased against conservatives" and deliberately trying to silence them (meanwhile, there are some in the liberal camp who are complaining that sites like Twitter have not killed off certain accounts, arguing -- incorrectly -- that they're now overcompensating in trying not to kick off angry ideologues). This has been a stupid narrative from the beginning, but the refrain on it has only been getting louder and louder, especially as Donald Trump has gone off on one of his ill-informed rants claiming that "Social Media Giants are silencing millions of people." Let's be clear: this is all nonsense.

The real issue -- as we've been trying to explain for quite some time now -- is that basic content moderation at scale is nearly impossible to do well. That doesn't mean sites can't do better, but the failures are not because of some institutional bias. Will Oremus, over at Slate, has a good article up detailing why this narrative is nonsense, and he points to the episode of Radiolab we recently wrote about, which digs deep into how Facebook's moderation choices get made, and where you quickly begin to get a sense of why it's impossible to do well. I would add to that a recent piece from Motherboard, accurately titled The Impossible Job: Inside Facebook’s Struggle to Moderate Two Billion People.

These all highlight a few simple facts that lots of angry people (on all sides of political debates) are having trouble grasping.

  1. If you leave a platform completely unmoderated, it will fill up with junk, spam, trolling and the like, thereby decreasing its overall utility and pushing people away.
  2. If you do decide to moderate, you have a set of impossible choices. So much content requires understanding context, and context may be very different, even for the same content when viewed by different people.
  3. If you're going to moderate at scale, you're going to need a set of "rules" that thousands of generally low-paid individuals will have to be able to put into practice, reviewing pieces of content for just a few seconds (a recent report said that Facebook reviewers were expected to review 5,000 pieces of content per day).
  4. It is impossible to write rules like that which can easily be applied to all content. A significant percentage of content falls into gray areas, where it then becomes a judgment call by someone in a cubicle, partway through reviewing their 5,000 pieces of content.
  5. At that rate, many mistakes are made. It is collateral damage of moderation at scale.
  6. People caught in the crossfire of collateral damage will rightly make a big stink about it and the social media companies will look bad.
  7. Meanwhile, some of the reasonable moderation decisions will hit trolls hard (see point 1 above) and those trolls will then take to other platforms and make a huge stink about how unfair it all is, and the social media companies will look bad.
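The arithmetic behind point 3 is worth spelling out. A quick sketch (assuming an eight-hour shift with no breaks, which is already generous) shows how little time each decision actually gets:

```python
# Rough arithmetic behind point 3: how long a reviewer can spend per item
# if expected to clear 5,000 pieces of content in a day.
# The uninterrupted 8-hour shift is an illustrative assumption.

ITEMS_PER_DAY = 5_000
SHIFT_SECONDS = 8 * 60 * 60  # 28,800 seconds

seconds_per_item = SHIFT_SECONDS / ITEMS_PER_DAY
print(f"{seconds_per_item:.2f} seconds per moderation decision")
```

That works out to well under six seconds per decision, which is where the "just a few seconds" in point 3 comes from, and why so much context inevitably gets missed.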
Put this all together and it is a no-win situation. You can't leave the platform completely unmoderated. But any attempt at moderation at scale is going to have problems. The "scale" part of this is what's most difficult for most people to grasp. As Kate Klonick (again, author of an incredible paper on content moderation that you should read, as well as author of a guest post here on Techdirt) notes in the Motherboard piece:

“This is the difference between having 100 million people and a few billion people on your platform,” Kate Klonick... told Motherboard. “If you moderate posts 40 million times a day, the chance of one of those wrong decisions blowing up in your face is so much higher.”
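Klonick's 40-million-decisions-a-day figure makes the scale problem easy to quantify. Even at accuracy rates no human review process actually achieves, the absolute number of wrong calls is enormous (the accuracy percentages below are illustrative assumptions, not Facebook's real numbers):

```python
# Even a hypothetically excellent moderation process produces a huge
# absolute number of mistakes at this volume. The 40 million figure comes
# from the Klonick quote; the accuracy rates are illustrative assumptions.

DECISIONS_PER_DAY = 40_000_000

for accuracy in (0.99, 0.999, 0.9999):
    errors = DECISIONS_PER_DAY * (1 - accuracy)
    print(f"{accuracy:.2%} accurate -> {errors:,.0f} wrong decisions per day")
```

Even a 99.99% accurate process yields thousands of wrong decisions every single day, each one a potential viral outrage story.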

Later in the piece, Klonick again makes an important point:

“The really easy answer is outrage, and that reaction is so useless,” Klonick said. “The other easy thing is to create an evil corporate narrative, and that is also not right. I’m not letting them off the hook, but these are mind-bending problems and I think sometimes they don’t get enough credit for how hard these problems are.”

This is why I've been advocating loudly for platforms to move the moderation decisions further out to the ends of the network, rather than doing it in a centralized fashion. Let end users create their own moderation system, or adapt ones put together by third parties. But, of course, even that has problems as well.
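To make the "moderation at the edges" idea concrete, here is a minimal sketch in which each user assembles their own filter stack rather than inheriting one global policy. Every name and rule here is invented for illustration; no platform exposes this exact API.

```python
# Hypothetical sketch of end-user moderation: each user picks or writes
# their own filters instead of the platform making one global call.
# All names and rules are invented for illustration.

from typing import Callable, List

Filter = Callable[[str], bool]  # returns True if the post should be hidden

def contains_any(words: List[str]) -> Filter:
    """Build a filter that hides posts containing any listed phrase."""
    def f(post: str) -> bool:
        lowered = post.lower()
        return any(w in lowered for w in words)
    return f

def compose(filters: List[Filter]) -> Filter:
    """Hide a post if any of the user's chosen filters flags it."""
    return lambda post: any(f(post) for f in filters)

# One user subscribes to a third-party blocklist plus a personal rule;
# a different user could choose an entirely different stack.
my_feed_filter = compose([
    contains_any(["buy followers", "crypto giveaway"]),  # third-party list
    contains_any(["spoiler"]),                           # personal rule
])

print(my_feed_filter("Huge CRYPTO GIVEAWAY, click now"))  # True -> hidden
print(my_feed_filter("Thoughtful post about policy"))     # False -> shown
```

The design point is that the platform only has to carry posts; deciding what to hide becomes a per-user choice (or a choice delegated to a third-party list the user trusts), so no single moderation call has to be right for billions of people at once.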

No matter what choices are made, there are significant tradeoffs. As the Motherboard article also highlights, what seems like a "simple" rule gets hellishly complex quickly when applied to other situations, and then you've suddenly increased the "error" rate and people get angry all over again and the whole mess gets blown out of proportion again.

“There's always a balance between: do we add this exception, this nuance, this regional trade-off, and maybe incur lower accuracy and more errors,” Guy Rosen, VP of product management at Facebook, said. "[Or] do we keep it simple, but maybe not quite as nuanced in some of the edge cases? [It's a] balance that's really hard to strike at the end of the day.”

As the Oremus piece notes, the "bias" of platforms when it comes to moderation is not "liberal" or "conservative," it's capitalist. Having a platform overrun with spam and trolls is bad for business. Hiring enough people who can adequately review content within the correct context is somewhere between insanely cost-prohibitive and impossible. So the platforms muddle by with imperfect review processes. Making moderation mistakes is also bad for business, and the platforms would love to minimize them, but "mistakes" are often in the eye of the beholder as well, again reinforcing that this is an impossible task. For everyone screaming about how Alex Jones should be kicked off platforms, there's a similar number of people screaming about how awful the platforms are that do kick him off. There is no "right" way to do this, and that's what every platform struggles with.

And, if you think that these platforms are unfairly silencing "conservatives" (which is the prevailing narrative right now), it's probably because you're not paying enough attention elsewhere. Black Lives Matter and other civil rights groups have complained about "racially biased" moderation in the opposite direction, saying that minority groups are regularly silenced on these platforms. Indeed, it's not hard to find a ton of reports about black activists having content removed from social media platforms. And for all the talk of Infowars being taken off these platforms, how many people noticed that the Facebook page of the Venezuelan socialist TV station Telesur was recently taken down as well?

Yes, it's fine to point out that these platforms (mainly Facebook, Twitter and YouTube) are really bad at moderating. But, unless you're willing to actually understand the scale at play, recognize how many mistakes are going to be made (and recognize how trolls are going to go nuts over correct decisions), you're playing into a false narrative to argue that any of these platforms are "targeting" anyone. It's not true.

Filed Under: bias, content moderation, errors, filters, mistakes, scale, social media, spam, trolls
Companies: facebook, google, twitter, youtube

Reader Comments


    Anonymous Anonymous Coward (profile), 27 Aug 2018 @ 3:34pm

    Re: Meaninglessness

    1) There is no such thing as nature "intending" to grow specific plants.

    To know whether 'nature' intended anything goes further into belief systems than I care to venture, but only because our discussion of what god you believe in vs the god I believe in would take over the conversation. Nature, intended? I don't know about that, but it certainly promoted some things and denigrated others. How that choice was made is not yet known, for absolute fact. The chemistry, yes, but the choice, no.

    2) A garden is purpose-grown, and removing weeds and other plants that go against the purpose of the garden is not "bias." Preferring to grow, say, berries instead of vegetables could be called a "bias," but why would you? That's a preference; the subtext of the word bias does not maintain in that conversation.

    To prefer to grow something, in the Internet conversation sense, means that a particular 'bias' is inherent in the website. Some do, some don't. Some try to be neutral, others don't try so hard. If a website or a service tries to be neutral and fails at it, it is not bias. We could call it many names, but bias might not apply.

    3) Similarly, proper moderation is no more "bias" than is removing weeds from a garden; keeping a forum to its proper and purposeful context is what keeps it a forum instead of a pile of meaningless spam and garbage, just like weeding a garden keeps it a garden instead of a tangle of wild growth.

    Here we don't disagree. I never said otherwise.

    4) The term "bias" has a specific and deliberate negative connotation that your comments do not account for, and your apparent definition of "bias" is indistinguishable from the term "preference." A bias indicates taking a side on something actively in debate and deliberately slanting things toward that side, not removing unrelated spam and trolling.

    Definition of bias

    1 a : an inclination of temperament or outlook; especially : a personal and sometimes unreasoned judgment : prejudice

    b : an instance of such prejudice

    c : bent, tendency

    d (1) : deviation of the expected value of a statistical estimate from the quantity it estimates

    (2) : systematic error introduced into sampling or testing by selecting or encouraging one outcome or answer over others

    (limited to the first definition as the others don't apply)

    I am using definition 'c', whereas others may be using different definitions. Even when using the term 'bias' I am thinking more about 'discrimination or preference' (see below), which is why I chose definition 'c'.

    Definition of discrimination

    1 a : prejudiced or prejudicial outlook, action, or treatment <racial discrimination>

    b : the act, practice, or an instance of discriminating categorically rather than individually

    2 : the quality or power of finely distinguishing <the film viewed by those with discrimination>

    3 a : the act of making or perceiving a difference : the act of discriminating <a bloodhound's scent discrimination>

    b psychology : the process by which two stimuli differing in some aspect are responded to differently

    Definition of preference

    1 a : the act of preferring : the state of being preferred

    b : the power or opportunity of choosing

    2 : one that is preferred

    3 : the act, fact, or principle of giving advantages to some over others

    4 : priority in the right to demand and receive satisfaction of an obligation

    5 : orientation or sexual preference

    In the end nature vs nurture might just be applicable. If one has an inherent 'bias' to use the definitions I did not, then it seems to be nurture, whereas if one does not impose 'bias' and 'bias' is perceived, then it might just be nature exposing its dark side. Some of nature is good, some bad. There are things that nature produces 'naturally' that are inherently poisonous. There are others that are not. Then again, that might depend on the species doing the consuming.

    If one reads Techdirt and presumes it has a liberal 'bias' then they should look into their perception system. If one reads Techdirt and presumes a conservative 'bias' then they should look into their perception system. If one reads Techdirt and does not impose 'bias' then it is likely that their perception system is working, as Techdirt tries not to impose any political 'bias', but looks at various things as to how they might be good or bad in relation to everybody, not just one or the other (meaning conservative or liberal or R or D or any other descriptor).
