Last week I wrote a bit about the ridiculous and misguided backlash against Facebook
over the election results. The basis of the claim was that there were a bunch of fake or extremely misleading stories shared on the site by Trump supporters, and some felt that helped swing the election (and, yes, there were also fake stories shared by Clinton supporters -- but apparently sharing fake news was nearly twice as common among Trump supporters as among Clinton supporters). I still think this analysis blaming Facebook is wrong. There was confirmation bias, absolutely, but it's not as if a lack of fake news would have changed people's minds. Many were just passing along the fake news because it fit the worldview they already held.
In response to that last post, someone complained that I was arguing that "facts don't matter" and worried that this would just lead to more and more lies and fake news from all sides. I hope that's not the case, but as I said in my reply, it's somewhat more complicated. Some folks liked that reply a lot, so I'm expanding on it a bit in this post. And the key point is to discuss why "fact checking" doesn't really work in convincing people whom to vote for. This doesn't mean I'm against fact checking, or that I think facts don't matter. Quite the reverse. I think more facts are really important, and I've spent lots of time over the years calling out bogus news stories based on factual errors.
But here's the problem: the general business of fact checking seems merely to reinforce and entrench opinions, rather than change them. As I said in my comment, there is a large group of people out there who view the whole fact checking business itself as a sort of condescending "let them eat facts" kind of thing, in which they're being scolded for believing the "wrong" kind of thing. And this has led (not surprisingly) to widespread attacks on fact checkers themselves as being "biased." You don't have to look very hard to find (often conservative-leaning) publications arguing that "fact checkers" are biased against their views. The famed debunking site Snopes has come in for particular attack this year as just a liberal front. In the past few months, any time I've mentioned Snopes, or seen someone else link to it in our comments, another comment will mock them for linking to such a "biased" or "Clinton-supporting" site. Snopes itself, for its part, has put up a somewhat amusing page with all of the contradictory accusations of bias it has received over the years from people who dislike its fact checking on certain politicians.
Senator Daniel Patrick Moynihan famously stated, "You're entitled to your opinion, but you are not entitled to your own facts." It's a good quote, but the problem is that plenty of people do feel entitled to their own facts these days. And straight up fact checking seems like the wrong approach. In psychology, there's a concept known as cognitive dissonance, describing how people basically trick themselves into dealing with contradictory beliefs (the term technically refers to the uncomfortable state people are put in by holding contradictory ideas, but it is commonly used to describe how people effectively trick themselves to get out of that state). It seems to describe how many people end up dealing with inconvenient facts. They don't change their mind -- they just come up with an excuse as to why the facts presented are wrong or biased. And when those facts are presented in the form of "fact checking" from a big site or news publication, it's easier than ever to dismiss them, because we're told over and over again that "you can't trust the media." That gives people an out -- when they come across inconvenient facts, they insist that there's bias or a problem with the source
while not dealing with the actual underlying facts. And studies have shown that fact checking can not only fail to convince people in political debates, but can actually make them cling more strongly to their false beliefs.
I'm not quite sure how to deal with this, but I wonder if the overall approach needs to change. It's pretty uncommon to see people change their minds when just handed a big stack of facts. Some have suggested that convincing people they're wrong on something is so complex that it has to involve them literally transforming how they think of themselves, which is not going to happen when you just throw a pile of facts at them. In my experience, the times I've been convinced to change my mind, or seen others change theirs, it tends to come from long, drawn-out conversations exploring the issues in more depth -- with lots of back and forth. It also tends to happen in environments where the stakes are lower (e.g., often in private, rather than public, discussions, where no one "loses face" for realizing they were wrong).
Given that, I still don't know what the solution is, but merely pumping up the fact checking isn't going to do much to change anyone's mind. It just angers some, and reinforces the feelings of superiority of others.