The Externalization Of Content Moderation: Facebook Explores 'Election Commission'

from the maybe-that's-not-a-bad-thing dept

We’ve covered plenty of stuff about the Oversight Board that Facebook set up. While we recognize the cynicism towards it, and still think it’s somewhat early to judge, for many of us following the Board’s decisions closely, it’s been eye-opening how much the Board has pushed Facebook to make real changes. Of course, there are many real structural issues with the way the Oversight Board is set up, but the initial results have been fascinating.

And that’s why it’s interesting to hear that the company is considering creating another Oversight Board-like setup for dealing with the challenges of moderation and elections, according to a report in the NY Times.

Facebook has approached academics and policy experts about forming a commission to advise it on global election-related matters, said five people with knowledge of the discussions, a move that would allow the social network to shift some of its political decision-making to an advisory body.

The proposed commission could decide on matters such as the viability of political ads and what to do about election-related misinformation, said the people, who spoke on the condition of anonymity because the discussions were confidential. Facebook is expected to announce the commission this fall in preparation for the 2022 midterm elections, they said, though the effort is preliminary and could still fall apart.

I already know (and have seen) some people responding cynically that this is, once again, Zuckerberg trying to avoid responsibility by fobbing off the tough choices on other people. And, honestly, given Zuckerberg’s own past comments, that might be exactly a key motivation for him.

But, here’s the crazy thing: this could be good even if it’s done for cynical, responsibility-avoidance reasons.

First, even if it’s Facebook trying to dodge responsibility, um, since no one trusts Facebook at all, shouldn’t we want the responsibility taken away from Facebook? Lots of companies that don’t want to take responsibility just throw up their hands and do nothing. Here, at least, Facebook is looking to hand it off to others to actually do the hard work that the company itself doesn’t want to do. And, even better, it wants to hand it off to a bunch of actual subject matter experts rather than a stressed out, overworked internal team that gets no respect in the first place and has to deal with all sorts of competing interests and incentives.

Second, a huge part of the problem that everyone is facing regarding the internet giants and their moderation choices is that there is zero transparency involved, and that’s a big part of what makes things so uncomfortable for people. From the outside, the big decisions (especially on the content moderation policy side) are suddenly announced by the company, with no insight into who is making these decisions or how. Indeed, the few leaks from behind the blue curtain have not been pretty. Letting actual experts figure out better ways forward, where the public knows who they are, and (hopefully) gets to see some of the reasoning and thought process, might not only come up with better overall policies, but also create more transparency and understanding about the policymaking process itself.

So, yes, sure, this very well may be a cynical move. And maybe Zuckerberg gets to laugh at people for letting him off the hook, but frankly, I don’t want Zuckerberg personally to be on the hook for these kinds of decisions. Hand it off to experts, make the process public, let us learn about it and see what happens, rather than trusting Mark or his band of former political operatives to make the decisions.

Companies: facebook, oversight board


Comments on “The Externalization Of Content Moderation: Facebook Explores 'Election Commission'”

17 Comments
Anonymous Coward says:

Mike, Mike, Mike...

Remember: people don’t want transparency, accountability, or subject matter expert opinion. They want Facebook to Nerd Harder™ and make all that "bad" stuff they don’t like, just… go away.

Christenson says:

Delegation

I’m with Mike here, this election commission (and the oversight board) are likely good things.

The Zuck is only one person, and the commission is Zuck:
- delegating a difficult, very complicated, but important job;
- giving the job adequate priority;
- pulling in a team of (hopefully widely respected) experts;
- giving it high-level buy-in. Who, within Facebook, is gonna ignore Zuck’s commission?

Yes, there’s a cynical Public Relations motive, and maybe Zuck is a crappy human being, and maybe he should have had those experts leading his moderation team in the first place, but seriously, complain about improved transparency and improved moderation?

Stephen T. Stone (profile) says:

Re:

I hear that Harvard recently picked a new chief chaplain, who happens to be an atheist

What’s wrong with a Humanist being the chief chaplain at Harvard? Are you trying to imply that atheists of any kind don’t deserve a seat at the table of religion, or that the non-religious students at Harvard who are exploring their spirituality and religious beliefs don’t deserve an alternative to clergy from theistic religions?

Darkness Of Course (profile) says:

Zucky is tired of being the focal point for his company

You have to remember who Zucky is: an elitist snob who was searching for hot chicks to "date". He got lucky with his Facebook of hot chick faces; whether he got laid as a result, I don’t know, and that is a different post.

He embraced That Fucking Goon, and many GQP ideals, because Elizabeth Warren and Alexandria Ocasio-Cortez wanted to tax the rich. Why, I do believe they might have been thinking about taking his precious, precious money. Which Zucky loves more than anything else on the planet. Even Facebook.

Also, he has a legion of advisors and lawyers who are paid to suck up to the Zucker. However, that has not served him well to date. So maybe he might actually do something reasonable with the aid of an outside advisory committee.

If only America had a privacy law at the federal level that had teeth in it, and personal liability for each data loss for each individual.

Yeah, I was thinking about each piece of data as well, but that’s too much for the modern legislator to comprehend.

Vermont IP Lawyer (profile) says:

The Commish

And who exactly would populate this commission? Cannot be anyone with a liberal/progressive/Democratic legacy as the other side will never trust them. Ditto if it’s almost anyone with a conservative/Republican legacy. Is there a bunch of political science profs/law school profs who have managed to achieve "widely respected expert" status without ever advocating for one side or the other?

Anonymous Coward says:

The real problem is with internet giants trying (or being demanded) to "take responsibility" for things which they are only theoretically but not practically capable of changing.

Nobody would have the resources to correctly identify and remove all the "wrong" opinions on the scale of the internet, even if it were possible for people to agree on what the "wrong" opinions are, which it is not.

The result is half the people insisting they are being unfairly censored and the other half complaining there are still things they don’t like online. Neither of these complaints is going to go away with more empty promises and more high-sounding policies that are impossible to accurately enforce at scale.

Christenson says:

Re: The complicated Middle, #&@&*(!

I think Mr Masnick is in the missing middle third with me:
- The world is complicated.
- We can do better; bad sh exists online.
- We can do worse: bad people want to be absolutely corrupted by absolute power (do it my way, and only to those I dislike!).
- Experiments on that account help!
- Theorems and logic help.
- The law of unintended consequences is real.

Lily May says:

Having a more transparent conversation about moderation policy is a good thing at heart. But at the end of the line there is an overworked, underpaid non-expert making the decisions under pressure with seconds to consider. Or worse, a robot.

Instead of setting up commissions and boards to come up with high-sounding press releases, they should focus on moderation policy that takes into account the practical realities of the people and systems that have to implement it. Just because some people constantly demand that social media companies do something about their complaint du jour doesn’t mean it’s possible or practical to do it. Or that trying would not have unintended consequences worse than the problem they were trying to solve.
