Well, Duh: Facebook's System To Stop 'Fake News' Isn't Working — Because Facebook Isn't The Problem

from the get-a-little-perspective dept

It’s not like we didn’t say right away that those rushing to blame Facebook for “fake news” were missing the point, and that the problem was always the nature of confirmation bias rather than the systems people use to support their own views. But, alas, the roar of “but Facebook must be the problem, because we saw ‘fake news’ on Facebook,” along with the related “but, come on, it must ‘take responsibility’” arguments, kept getting louder and louder, to the point that Facebook agreed to start trying to warn people about fake news.

And, guess what? Just like basically every attempt to stifle speech without looking at the underlying causes of that speech… it’s backfiring. The new warning labels are not stopping the spread of “fake news” and may, in fact, be helping it.

When Facebook’s new fact-checking system labeled a Newport Buzz article as possible “fake news”, warning users against sharing it, something unexpected happened. Traffic to the story skyrocketed, according to Christian Winthrop, editor of the local Rhode Island website.

“A bunch of conservative groups grabbed this and said, ‘Hey, they are trying to silence this blog – share, share, share,’” said Winthrop, who published the story that falsely claimed hundreds of thousands of Irish people were brought to the US as slaves. “With Facebook trying to throttle it and say, ‘Don’t share it,’ it actually had the opposite effect.”

Again, this isn’t a surprise. Fake news was never the issue. People weren’t changing their minds based on fake news. They were using it for confirmation of their views. And when you get contradictory information, cognitive dissonance kicks in, and you rationalize why your beliefs were right. In fact, studies have shown that when questionable beliefs are attacked with facts, it often makes the believers dig in even harder. And that seems to be what’s happening here. With efforts made to call out “fake news,” the people who believe it just see this as “fake news” itself — and an attack on what they believe is true. It’s easy to chalk up any fake news labels as just part of the grand conspiracy to suppress info “they” don’t want you to see.

The article goes on to talk to a bunch of different people who operate sites that had articles dinged with the “fake news” scarlet letter from Facebook, and most of them (though not all) say they saw no real impact on traffic.

Of course, because we’ve seen this kind of thing play out before, it’s likely that rather than recognizing Facebook isn’t the issue, people who are angry about what they believe to be the scourge of “fake news” will also double down — just like those who fall for “fake news.” They’ll insist that it’s Facebook’s fault that the fake news issue didn’t just go away when Facebook put warning labels on stories. They’ll ignore the fact that they were the ones demanding such things in the first place, and that they insisted such labels would work. Instead, they’ll argue that Facebook should be doing even more to suppress “fake news” and never consider that maybe they’re targeting the symptoms and not the actual disease.

Facebook has always been an easy target, but Facebook isn’t the problem. People want to share bogus, fake, or misleading news, because it confirms their biases and beliefs and makes them feel good. That’s not Facebook’s fault. It’s a problem in how we educate people and how we teach basic media literacy. That’s not going to be fixed with warning labels.

Companies: facebook


Comments on “Well, Duh: Facebook's System To Stop 'Fake News' Isn't Working — Because Facebook Isn't The Problem”

52 Comments
Ninja (profile) says:

It will also not be fixed within a generation’s life span. I’d argue we need changes in how we approach education to prepare people to be critical thinkers instead of bots that can do math and spew memorized biology and chemistry. I’d argue that even if we start now, it’ll still take more than 30 years to solve the problem.

Christenson says:

I'm right. You're Wrong

I’m right. You’re Wrong.
Confirmation Bias.
Us. Them.
Science. Religion.
Four Estates.
I saw it on the internet, it must be true!

So, if facts don’t matter one iota once the mind is made up, what to do? Which are the “more words” we need here?

The internet has reduced the barriers to reaching large audiences, whatever the message.

William Braunfeld (profile) says:

Re: I'm right. You're Wrong

A wise debater, when faced with a “true believer” in something, will focus his efforts not on changing said believer’s mind, but on changing the minds of the receptive audience. When you are a third party whose beliefs are not being “attacked” directly, you have more room to think; when you are on the fence or even just not fully committed to an idea, the debater can reach you. Think Bill Nye vs. Ken Ham; Nye wasn’t there to change Ham’s mind, he was there to show the audience how ridiculous Ham’s claims are when examined closely.

Jono (profile) says:

This is a long-term problem with a long-term solution

Anybody who thinks the development of rules – private or public – to curb the proliferation of “fake news” is the answer or will even be remotely successful is deeply misguided. This whole trend of soft censorship has been increasingly worrying.

This is an issue a long time in the making, and I don’t see any quick solution. The only solution I see is a vastly improved education system that caters even to those at the bottom.

Fake news is not a failure of our media companies or news aggregators. It is a signal of a failure in our society to produce critically-thinking adults who can take a logical view of the world.

Anonymous Coward says:

Re: This is a long-term problem with a long-term solution

Make all information freely available to all people instead of hoarding it and you will make everyone better rather than making most people worse for the pleasure of a few.

Allow things to stand on their merit vs be promoted and controlled with violence.

Chuck says:

Re: Re:

On the one hand, this is a horrible way to look at the problem and very backwards-looking.

On the other hand, I totally agree with you. Crazy has always found a comfortable place on the internet, but it used to stay in its own little corner and leave me the hell alone. With the rise of Facebook, all the crazy and sane are mixing together, and it’s not having the desired effect (i.e. bringing sanity to the crazy). Instead, it’s just causing the crazy to spread faster.

The only facebook filter that’d have an immediate effect here would just be to shut down facebook entirely for a month. They’d all go find somewhere else to congregate (likely 4chan) and then facebook could do a grand relaunch a month later and the site would likely be better for it.

Long-term, this is an education problem and won’t be solved in my lifetime. Oh well.

JoeCool (profile) says:

Re: Re: Re:

The problem is simple – a person is smart, people are stupid. The larger a group, the lower the intelligence of the group as a whole. Stick to sites that have a SMALL number of members. I’ve never seen an exception – the more members a forum has, the greater the idiocy you’ll find. If you want to know if a forum is worth joining, check the total number of members.

Anonymous Coward says:

confirmation bias, or just plain confirmation

When the people who opposed Facebook’s “fake news” labeling plan said it amounted to restricting speech, and can then prove that the algorithm is restricting their speech, I don’t think that’s confirmation bias; it’s just confirmation that restricting speech is what Facebook is doing.

Anonymous Coward says:

Your bias is to regard Facebook as fair and impartial trying to stop fake news, rather than itself a globalist propaganda front suppressing the opposition.

Every day you give evidence to confirm my bias that you too are a globalist and a corporatist. Indeed, seeing ANY good in Facebook is that evidence. Facebook is a problem of TOO MANY SORTS to go into. This is a standard tactic of getting otherwise sane people to defend a data-slurping globalist front.

As I said far back, we’re WAY down the rabbit hole (that means into irrational fantasy). For instance, I keep being amazed that the Trump-Russia allegations keep going as if there’s any evidence for them beyond unending allegations by the usual globalists — led by the NY Times. And now they’re screaming for impeachment because Trump told the commies “classified” information, which apparently amounts to “you could put a bomb in a laptop computer”. The only purpose to those that I see is so no one knows or cares about The Truth anymore. — Someone please state the evidence for Trump-Russia — other than “major media sez and I believe it”. I’d LIKE to see it. — Whoever takes the Trump-Russia allegations seriously doesn’t, I believe, care about Truth.

Baron von Robber says:

Re: Re: Re:2 Your bias is to regard Facebook as fair and impartial trying to stop fake news, rather than itself a globalist propaganda front suppressing the opposition.

It’s like the legal proverb: “If the facts are against you, argue the law. If the law is against you, argue the facts. If the law and the facts are against you, pound the table and yell like hell.”

Baron von Robber says:

Re: Your bias is to regard Facebook as fair and impartial trying to stop fake news, rather than itself a globalist propaganda front suppressing the opposition.

Well, looks like Mossad won’t share intel with the US anymore because of Trump-Russia.

http://www.timesofisrael.com/former-israeli-spymasters-rip-into-trump-say-israel-must-reassess-intel-sharing/

John Cressman (profile) says:

Fake News

Fake News is, in fact, fake news. There have always been scams, false information, and skewed studies (e.g., “3 cups of milk makes you lose weight,” sponsored by the American Dairy Association).

It’s EVERYONE’S responsibility to do the research for themselves and figure out what is truth and what isn’t. The MOMENT you let someone else decide that for you (Facebook, the Government, etc) is the moment you give up your free will and mindlessly follow like the sheepeople you are.

Do not go quietly into the night (of ignorance).

Wendy Cockcroft (user link) says:

Re: Fake News

John, you were doing great till you got to “sheepeople.”

I believe that promoting empiricism, the theory that all knowledge is based on experience derived from the senses, is a good place to start. We need to check and cross-reference with trusted sources but generally speaking partisan sources are best avoided.

Question to ask: is the story being told here an appeal to or against my personal biases? Does the language provoke an emotional response in me? Is it intended to? Does it bolster or tear down (or try to) the core beliefs of a partisan group?

I prefer neutral, fact-based reporting (“this is what happened…”) to “these are the bad guys, go and get them.” While TD writers do occasionally rant, the articles are not particularly biased; they just call out bad behaviour on the part of individuals and groups. Heck, they even called out St. Ron of Wyden for supporting the TPP. Why? They considered it bad behaviour on his part. No one is immune to criticism here, whether they are generally portrayed in a favourable light or not. This is what keeps me coming back to TD again and again for tech news.

JoeCool (profile) says:

Re: Re:

Sorry, that won’t work. Let’s say that the absolute smartest a person can get is 1.0. Say a smart person is 0.9, and an idiot is 0.5. How smart can a group of people be? If they’re all smart, it only takes seven people to make the group dumber than an idiot (0.9 × 0.9 × 0.9 × 0.9 × 0.9 × 0.9 × 0.9 ≈ 0.48). And so you see why a person is smart, people are dumb. It doesn’t matter how smart each individual person is; eventually the group becomes stupid as it gets large enough.
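
The arithmetic in the comment above can be sketched as a few lines of code; a purely illustrative toy, assuming (as the commenter does, without evidence) that a group’s effective score is just the product of its members’ individual scores in [0, 1]:

```python
from functools import reduce

def group_score(scores):
    """Toy model from the comment above: multiply individual scores
    (each in [0, 1]) together, so every added member can only drag
    the group's score down. Illustrative only, not a real metric."""
    return reduce(lambda acc, s: acc * s, scores, 1.0)

# Seven "smart" 0.9 members fall below the 0.5 "idiot" line:
print(round(group_score([0.9] * 7), 2))  # 0.48
```

Note the model bakes in its own conclusion: any score below 1.0 shrinks the product, so every group is doomed to get “dumber” as it grows, by construction.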

Telecart (profile) says:

Usually I'm with you, disagree on this

What we who cried "Fake News!" wanted was for Facebook to take editorial responsibility for the content they publish and distribute on their site. We didn’t ask for a ‘label’; that was Facebook’s weak response to our actual ask.

Facebook has been very resistant to admitting that it is, in fact, a media company. So long as they did not have an algorithm and you couldn’t pay for added signal, there might have been some merit to the claim they were merely an agnostic publishing platform. But that hasn’t been true in a long time.
If the NYTimes had published blatantly false articles, we would hold the editorial team and their EiC to account. The same ought to hold true for Facebook. Facebook has an editor, one that makes millions of editorial decisions every second. It’s their algorithm.
They have decided, knowingly, to develop an editorial algorithm designed to optimize for user engagement. We now know that this means an editorial bias towards sensationalism and clickbait. There’s no way to know this and to still maintain an agnostic or neutral POV on the matter; where is their integrity? They are arguably the largest media company in the world, and it is their choice to avoid altering the algorithm (=editorial direction) and instead add an external labelling system. I think this is not enough and expect them to own up to their editorial responsibilities as a powerful publisher.

Anonymous Coward says:

Re: Usually I'm with you, disagree on this

If the NYTimes had published blatantly false articles, we would hold the editorial team and their EiC to account.

How exactly would they be held to account?

Despite current events, there are no laws stating that media companies cannot publish fake news. The closest you get is civil libel suits (fake news that is harmful to an individual’s reputation), and that requires evidence of malicious intent toward the target (at least for general damages; you can be compensated for “actual harm” without that. Of course, it’s often very difficult to show “actual harm”, so take that as you will). And besides, clearly labeling news as containing false information is sufficient to escape libel anyway, so Facebook has actually responded acceptably to the issue from a legal standpoint, even assuming they’re a media company.

If by "held to account" you mean "held to account in the court of public opinion" then that’s already happened.

Telecart (profile) says:

addendum on censorship

This is plainly not a freedom of speech or expression issue. No one is jailing anybody for publishing false information. Anyone is free to say or write whatever they please without fear of government persecution. This is not in question.

The matter at hand has to do with Facebook’s juvenile abdication of its responsibilities as a media company. Facebook’s Newsfeed is arguably the largest publisher of news on the planet; all we are asking for is the application of Ethics in Journalism 101 professional integrity by this publisher in its choices of what information to disseminate. Again, I must stress, they do make these choices all the time. They are not an agnostic platform; they are very deliberate and selective about what they show you in your feed and when. This power entails responsibility.

Christenson says:

Re: addendum on censorship

Oh, but you said the magic word: Responsibility!

Techdirt takes it as axiomatic that consequences should generally flow AFTER speech, and not come from the government. So if Farcebook is just clickbait (and I agree it has a bad case of it), what’s the plan after they are gone? Do you actually think Facebook is somehow special and there won’t be another easy-to-use soapbox when the internet has made those things super easy?

What’s the plan? Measles? Miami underwater? Divided we fall.

Telecart (profile) says:

Re: Re: addendum on censorship

Not sure what you’re getting at; nobody is preventing any speech, certainly not the government. I’m talking about the editorial decision to continue to disseminate plainly false information that has already been published, simply because it feeds into the reader’s dopaminergic reward wiring, to Facebook’s monetary gain. I think that’s immoral.

Facebook’s reach is unparalleled by almost any other organization on the planet. I do not doubt that other soapboxes exist; indeed, The National Enquirer has been in print for 90+ years. I’ve given up expecting integrity from the bottom feeders, and perhaps there ought to be some room for that. But Facebook, with a market cap of ~$418bn, is clearly not that, and we’re allowed to expect more from the grown-ups.

Anonymous Coward says:

I think Techdirt is suffering from its own confirmation bias here. Traffic went up to a page Facebook labeled as ‘fake news’? There’s an obvious answer to why this would happen – people want to assess the reliability of Facebook’s ‘fake news’ assertion.

I regularly view some of the spam in my Spam folder, not because I believe the subject line, but because I want to make sure that the rules that decided that e-mail was spam are actually still reliable. If I had put in place new spam filter software, I would check even more of my spam, to be sure the software is operating as it should.

If Facebook’s ‘fake news’ labels are shown to be reliable, we should see traffic drop dramatically. If it’s reliable.
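
The spot-checking habit this commenter describes can be sketched as a small routine; a minimal sketch under assumed inputs (the function name, the flagged-item list, and the ground-truth check are all invented for illustration, not anything a mail client or Facebook actually exposes):

```python
import random

def estimated_false_positive_rate(flagged_items, is_really_bad, sample_size=20):
    """Spot-check a random sample of items a filter flagged (spam,
    'fake news', etc.) and estimate how often it flags legitimate
    content, mirroring the commenter's skim of the Spam folder."""
    sample = random.sample(flagged_items, min(sample_size, len(flagged_items)))
    false_positives = sum(1 for item in sample if not is_really_bad(item))
    return false_positives / len(sample)
```

The design choice matches the comment’s logic: you never trust the label outright; you audit a sample, and only if the estimated false-positive rate stays low do you let the filter keep deciding for you.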

Wendy Cockcroft (user link) says:

Re: Re:

Per the article, people were being urged to share the article on the grounds that it was being censored.

Some may have tried to find out whether it was true or not but I daresay the ones crying “Share! Share!” were doing so out of principle.

The point is that letting any large organisation decide what “fake news” is or isn’t ain’t the answer. People need to learn how to work things out for themselves instead of passively accepting what they’re told. Thinking for yourself takes effort and being willing to leave the safety of the herd to walk alone. It’s a bit scary sometimes but it’s worth it in the long run.

Anonymous Coward says:

I like how you deliberately leave out the fact that the so-called non-partisan third-party fact checkers are all far left organizations such as Snopes, Politifact, Factcheck, and ABC.

The stupidity runs deep with all of these groups, most especially ABC:

https://twitter.com/NBCNews/status/785299709342654465/photo/1?ref_src=twsrc%5Etfw&ref_url=http%3A%2F%2Fwww.breitbart.com%2Ftech%2F2016%2F12%2F15%2Ffacebook-introduce-warning-labels-stories-deemed-fake-news%2F

The entire anti-fake-news system is in and of itself a massive attempt at confirmation bias by elitist far-left ideologues.

Wendy Cockcroft (user link) says:

Re: Re: Re:

The partisan bias of reality skews left or right depending on the ridiculousness of the stance being taken, to be fair.

I’ve had a damn good laugh at, “De Russkies done stole de election!”, for example. It kills me. Look, if it’s true, the alphabet spaghetti of American security agencies were asleep on the job since they did nothing to protect their country from foreign interference. These agencies, notably the CIA, have been doing the exact same thing to other countries for decades so I’m not buying it; they have the tools, the methods and the money to push back against that kind of thing.

Does this mean Trump has not been up to shenanigans with the Russians? Not necessarily. I’m just not buying the idea of Putin running America by proxy.

Now then, what about the fact checkers? The things they claim are true or false are backed up with evidence; the claims are verifiable, is what I’m saying. People need to learn what words mean: “Left wing” does not mean “people who disagree with alt-right nutters.”

Anonymous Coward says:

If you think the point of fake news is to change minds, you are mistaken.

Fake news is like the Berlin Wall. It’s meant to separate and divide people into more easily addressed sides. It makes those in the middle shift slightly to one side or the other. Once the wall is there, they drown them in kool aid.

Yeah, metaphor hell.
