Lessons Learned From Creating Good Faith Debate In A Sea Of Garbage Disinformation
from the it's-possible dept
A few weeks ago, Elizabeth Dwoskin, Will Oremus and Gerrit De Vynck from the Washington Post published one of the most fascinating — and in some ways, most important — discussions of social media and dealing with “disinformation” that I’ve seen in a while. It touches on two things I’ve written about recently — how the way we talk about disinformation is not helpful and the difficulty in determining how to deal with bad faith actors.
The WaPo article talks about a group on Facebook — set up by concerned mothers — that focuses on having thoughtful debate about vaccines, an area that is fraught with misinformation, disinformation, utter nonsense, and propaganda, but also one where there may be legitimate causes for debate and concern. The problem is that the space is so flooded with nonsense that any attempt to discuss the subject seriously quickly slides into the nonsense zone, and everyone retreats into their usual corners. But what this article notes is that it is possible to have a good faith debate on such topics, even with people who believe strongly in debunked nonsense. The real trick? Having strict rules and enforcing them:
“The most important rule was ‘civility,’” Bilowitz said. “There are some groups online where people just yell at each other. We wanted to just be able to talk to one another without it getting that way.”
Vaccine Talk now has nearly 70,000 members, each of whom must gain administrators’ approval to join and commit to a code of conduct. Strict rules prohibit users from misrepresenting themselves, offering medical advice and harassing or bullying people. Another key rule: Be ready to provide citations within 24 hours for any claim you make. Twenty-five moderators and administrators in six countries monitor the posts, and those who flout the rules are kicked out.
So, there is heavy moderation going on in the group, but they’re not just banning people who spout nonsense. They ask that they cite the evidence for such nonsense, and those who don’t live up to the rules face consequences. It’s not surprising that bad faith actors have trouble following the rules:
“Usually, the hardcore anti-vaxxers cannot follow the rules,” Bilowitz said. “They are usually spamming people with their commentary. I think it’s hard for them: They are basically coming out of an echo chamber.”
And the interesting bit is that this community seems to be working, actually convincing some skeptics:
Monica Buescher, a 32-year-old teacher in Vacaville, Calif., said she went “deep down the rabbit hole” of anti-vaccine misinformation when she had her second child in 2019. Convinced that shots were dangerous, she nonetheless wanted to hear the pro-vaccine side. She found her way to Vaccine Talk, which she said had a reputation among anti-vaccine groups as being “mean” for banning those who made claims without scientific evidence.
On Vaccine Talk, Buescher credits a handful of people with walking her through the scientific evidence and persuading her that routine childhood vaccines are safe and effective. Now, Buescher is helping her friends and family navigate conflicting information about coronavirus vaccines.
There are other examples as well.
Of course, the story also reinforces the idea that expecting the big companies to do all the moderation and enforcement themselves leads to questionable outcomes as well:
Vaccine Talk represents exactly the type of conversations Facebook says it wants to cultivate. But Bilowitz said the social network’s often clumsy and heavy-handed enforcement of covid misinformation policies has made their work more difficult. In June, Facebook temporarily shut down the group because someone posted an article deemed to be misinformation. But the poster had been seeking advice on how to rebut the article.
“We were just caught up in the algorithm,” Bilowitz said, “and felt there wasn’t a human in charge of the process.”
The article highlights how the constant drumbeat of people (including the White House) demanding that Facebook “do more” is often counterproductive.
“Facebook is attempting to shut down misinformation by shutting down all conversation entirely,” she said. “I strongly believe that civil, evidence-based discussion works, and Facebook’s policies make it extremely difficult for that to happen.”
There’s a lot more in the article that is worth thinking about with regard to these debates over mis- and disinformation and how to deal with them. It is notable that this is a successful, decently large community that doesn’t just shut down dissent, but rather requires those spreading it to back it up, and makes clear that those who cannot face consequences. It’s also useful in highlighting how the general demands to put all the responsibility on big companies like Facebook can backfire.
I honestly think this article is one of the most important ones we’ve seen on various content moderation debates regarding misinformation, and I expect to cite back to it often.