The Externalization Of Content Moderation: Facebook Explores 'Election Commission'
from the maybe-that's-not-a-bad-thing dept
We’ve covered plenty of stuff about the Oversight Board that Facebook set up. While we recognize the cynicism towards it, and still think it’s somewhat early to judge, for many of us following the Board’s decisions closely, it’s been eye-opening how much the Board has pushed Facebook to make real changes. Of course, there are many real structural issues with the way the Oversight Board is set up, but the initial results have been fascinating.
And that’s why it’s interesting to hear that the company is considering creating another Oversight Board-like setup for dealing with the challenges of moderation and elections, according to a report in the NY Times.
Facebook has approached academics and policy experts about forming a commission to advise it on global election-related matters, said five people with knowledge of the discussions, a move that would allow the social network to shift some of its political decision-making to an advisory body.
The proposed commission could decide on matters such as the viability of political ads and what to do about election-related misinformation, said the people, who spoke on the condition of anonymity because the discussions were confidential. Facebook is expected to announce the commission this fall in preparation for the 2022 midterm elections, they said, though the effort is preliminary and could still fall apart.
I already know (and have seen) some people responding cynically: that this is, once again, Zuckerberg trying to avoid responsibility by fobbing off the tough choices on other people. And, honestly, given Zuckerberg’s own past comments, that may well be a key motivation for him.
But, here’s the crazy thing: this could be good even if it’s done for cynical, responsibility-avoidance reasons.
First, even if it’s Facebook trying to dodge responsibility, um, since no one trusts Facebook at all, shouldn’t we want the responsibility taken away from Facebook? Lots of companies that don’t want to take responsibility just throw up their hands and do nothing. Here, at least, Facebook is looking to hand it off to others to actually do the hard work that the company itself doesn’t want to do. And, even better, it wants to hand it off to a bunch of actual subject matter experts rather than a stressed-out, overworked internal team that gets no respect in the first place and has to deal with all sorts of competing interests and incentives.
Second, a huge part of the problem that everyone is facing regarding the internet giants and their moderation choices is that there is zero transparency involved, and that’s a big part of what makes things so uncomfortable for people. From the outside, the big decisions (especially on the content moderation policy side) are suddenly announced by the company, with no insight into who is making these decisions or how. Indeed, the few leaks from behind the blue curtain have not been pretty. Letting actual experts figure out better ways forward, where the public knows who they are, and (hopefully) gets to see some of the reasoning and thought process, might not only come up with better overall policies, but also create more transparency and understanding about the policymaking process itself.
So, yes, sure, this very well may be a cynical move. And maybe Zuckerberg gets to laugh at people for letting him off the hook, but frankly, I don’t want Zuckerberg personally to be on the hook for these kinds of decisions. Hand it off to experts, make the process public, let us learn about it and see what happens, rather than trusting Mark or his band of former political operatives to make the decisions.