Americans Disagree On What Content Should Be Moderated, But They All Agree Social Media Companies Suck At Moderation
from the truly-a-land-of-contrasts dept
No one agrees on how tech companies should perform the impossible job of moderating their platforms. But almost everyone agrees platforms are doing it wrong.
Conservatives complain too many of their fellow social media users are being silenced by left-leaning tech companies. Those on the left seem to feel not enough is being done to silence people engaged in hate speech or other abusive actions. Both sides agree there’s too much misinformation being spread, although they disagree greatly about which news sources are the “fakest.”
Since it’s impossible to please everyone, almost everyone is going to have complaints about moderation efforts. That’s the key finding of a recently-released poll [PDF] from Gallup and the Knight Foundation. When it comes to moderation, just about everyone agrees social media companies are handling it badly.
Americans do not trust social media companies much (44%) or at all (40%) to make the right decisions about what content should or should not be allowed on online platforms.
That’s pretty much everybody. The perception that companies aren’t making the right decisions flows directly from the disagreement among social media users over what content platforms should focus on moderating.
The level of concern about online foreign interference in U.S. elections varies sharply by political party. Whereas 80% of Democrats are very concerned about this issue, just 23% of Republicans are.
Similarly, when it comes to concerns about hate speech and other abusive online behaviors, Democrats are more likely to say they are very concerned about the issue (76%), compared to Republicans (38%) and independents (50%).
A smaller, although still notable, gap can be seen in views on misinformation. Americans who identify as Democrats (84%) or independents (71%) are more likely than Republicans (65%) to say the spread of misinformation is very concerning.
Content moderation has become a partisan issue. The bipartisan points of agreement are that platforms handle moderation poorly and that they wield too much power. Even so, a majority of both parties agree that allowing platforms to handle moderation without government interference is the least bad option.
Even though Americans distrust internet and technology companies to make the right decisions around what content appears on their sites, given a choice, they would still prefer the companies (55%) rather than the government (44%) make those decisions.
This is despite the fact that almost everyone agrees other users are getting away with stuff.
Most Americans do not think major internet companies apply the same standards in the same way to all people who use their apps and websites (78%). This includes 89% of Republicans, 73% of Democrats and 76% of independents.
So, what do we have? A fractured social media landscape mostly divided down party lines. Adding the government to this mix would only increase the perception of bias, if not actually introduce bias where none currently exists.
Social media companies are being asked to moderate millions of pieces of content every day. This would be nearly impossible if moderation only dealt with clearly illegal content (like child porn) and obvious violations of terms of service. But they’re asked to determine what is hate speech, to target nebulous concepts like “terrorist content,” to combat misinformation, and to deal with everything else users report as perceived violations.
This report shows a lot of the perceived bias by tech companies is rooted in users’ own political biases. Much of what users claim tech companies are doing wrong depends on their party alignment. Fortunately, both sides agree the government would probably handle this worse, but only by a slim majority of those polled.
The biases seen in this poll carry over to moderators themselves, who will never be free of their own biases. This includes the algorithms used to handle most of the moderation load. But bias at the moderation level isn’t enough to shift an entire platform towards one side or the other. There’s simply too much content in play at any given time to allow moderators to create an echo chamber.
Moderation efforts will never please everyone. What’s being done now pleases almost no one. And that’s the way it’s going to be in perpetuity. And the more the government leans on tech companies to do “more” in response to whatever is the latest hot button topic, the less effectively it will be done. Moderation resources are finite. User-generated content isn’t.