from the it's-one-way-to-deal-with-it dept
As you may recall, a few weeks back Twitter decided to add a fact check to some tweets by President Trump and, a few days later, to label some of his tweets as violating Twitter’s policies. Those tweets would normally be deleted, but Twitter decided that, given the newsworthiness of the speaker, they would be left up (though without the ability to comment on or retweet them). The president reacted about as well as expected, meaning he whined vociferously and eventually issued a silly executive order.
Of course, the other side of this story was that Trump posted some of the same content to Facebook, and Facebook chose to do nothing. Indeed, Mark Zuckerberg pulled out some ridiculous, self-serving, sanctimonious nonsense about how Facebook would allow that content because he didn’t want to be “the arbiter of truth.” Except, of course, Facebook does fact checks and content moderation all the time. This seemed to be a lot more about currying favor with the president than any principled stand.
It created a big fuss within (and outside) the company, and as with any situation in which a social media website claims to be taking a hands-off approach, that approach eventually proved totally unworkable. It seems to have taken all of a month for Facebook to recognize this as well.
On Friday, Mark Zuckerberg announced a bunch of changes to Facebook’s policies that appear to be pretty damn similar to what Twitter did a month earlier, which Zuckerberg originally pretended was a bad idea. Amidst a larger rollout of changes to fight voter suppression and misinformation, there was this:
A handful of times a year, we leave up content that would otherwise violate our policies if the public interest value outweighs the risk of harm. Often, seeing speech from politicians is in the public interest, and in the same way that news outlets will report what a politician says, we think people should generally be able to see it for themselves on our platforms.
We will soon start labeling some of the content we leave up because it is deemed newsworthy, so people can know when this is the case. We’ll allow people to share this content to condemn it, just like we do with other problematic content, because this is an important part of how we discuss what’s acceptable in our society — but we’ll add a prompt to tell people that the content they’re sharing may violate our policies.
To clarify one point: there is no newsworthiness exemption to content that incites violence or suppresses voting. Even if a politician or government official says it, if we determine that content may lead to violence or deprive people of their right to vote, we will take that content down. Similarly, there are no exceptions for politicians in any of the policies I’m announcing here today.
Frankly, I think this is the best of a bunch of bad solutions. There really isn’t a great answer here, even though people always assume there’s “the right way” to do this. Among your options:
- Do nothing: What Zuckerberg initially claimed Facebook would do. But this then allows people, including politicians, to spread ridiculous lies, sometimes hateful or violence-inducing ones, with no way to stop them. It pisses off the users of your platform, as well as advertisers.
- Take the content down: This pisses off the lying politicians, who are in a position to make your life even more miserable. See the response in Congress to Twitter doing just a little bit of moderation, in which victim-playing Republicans suddenly pretended that Twitter was “censoring” them and demanded revenge.
- Call out newsworthy exemptions: More or less where both companies have ended up. This still leads to complaints from both sides, but it is a form of compromise, and one that involves adding “more speech” to questionable speech, rather than erasing the speech entirely or pretending it was perfectly acceptable.
Is this the “best” possible resolution? Almost certainly not. But it does show how the companies continue to struggle through this and adapt to try to come up with solutions that make the most sense in a world where every option has significant trade-offs.