Facebook Ranking News Sources By Trust Is A Bad Idea… But No One At Facebook Will Read Our Untrustworthy Analysis

from the you-guys-are-doing-it-wrong-again dept

At some point I need to write a bigger piece on these kinds of things, though I’ve mentioned it here and there over the past couple of years. For all the complaints about how “bad stuff” is appearing on the big platforms (mainly: Facebook, YouTube, and Twitter), it’s depressing how many people think the answer is “well, those platforms should stop the bad stuff.” As we’ve discussed, this is problematic on multiple levels. First, handing over the “content policing” function to these platforms is, well, probably not such a good idea. Historically they’ve been really bad at it, and there’s little reason to think they’re going to get any better no matter how much money they throw at artificial intelligence or how many people they hire to moderate content. Second, it requires some sort of objective reality for what’s “bad stuff.” And that’s impossible. One person’s bad stuff is another person’s good stuff. And almost any decision is going to get criticized by someone or another. It’s why suddenly a bunch of foolish people are falsely claiming that these platforms are required by law to be “neutral.” (They’re not).

But, as more and more pressure is put on these platforms, eventually they feel they have little choice but to do something… and inevitably, they try to step up their content policing. The latest, as you may have heard, is that Facebook has started to rank news organizations by trust.

Facebook CEO Mark Zuckerberg said Tuesday that the company has already begun to implement a system that ranks news organizations based on trustworthiness, and promotes or suppresses its content based on that metric.

Zuckerberg said the company has gathered data on how consumers perceive news brands by asking them to identify whether they have heard of various publications and if they trust them.

“We put [that data] into the system, and it is acting as a boost or a suppression, and we’re going to dial up the intensity of that over time,” he said. “We feel like we have a responsibility to further [break] down polarization and find common ground.”

But, as with the lack of an objective definition of “bad,” you’ve got the same problem with “trust.” For example, I sure don’t trust “the system” that Zuckerberg mentions above to do a particularly good job of determining which news sources are trustworthy. And, again, trust is such a subjective concept, that lots of people inherently trust certain sources over others — even when those sources have long histories of being full of crap. And given how much “trust” is actually driven by “confirmation bias,” it’s difficult to see how this solution from Facebook will do any good. Suppose, for example (totally hypothetically), that Facebook determines that Infowars is untrustworthy. Many people may agree that a site famous for spreading conspiracy theories and pushing sketchy “supplements” that you need because of conspiracy theory x, y or z is not particularly trustworthy. But, for those who do like Infowars, how are they likely to react to this kind of thing? They’re not suddenly going to decide the NY Times and the Wall Street Journal are more trustworthy. They’re going to see it as part of a conspiracy by Facebook to suppress the truth.
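Mechanically, the “boost or suppression” Zuckerberg describes could be as simple as a multiplier on an item’s ranking score. The sketch below is purely illustrative; the function name, the formula, and the trust-score scale are all assumptions, not Facebook’s actual system.

```python
# Illustrative sketch of trust-based boost/suppression (all names and
# the formula are assumptions; this is not Facebook's actual system).

def adjusted_score(base_score: float, trust: float, intensity: float) -> float:
    """Scale an item's base ranking score by its publisher's trust.

    trust is in [0, 1], where 0.5 is neutral; intensity is the "dial"
    that gets turned up over time.
    """
    return base_score * (1.0 + intensity * (trust - 0.5))

# A distrusted source gets suppressed, a trusted one boosted:
suppressed = adjusted_score(100.0, trust=0.2, intensity=1.0)  # ~70
boosted = adjusted_score(100.0, trust=0.9, intensity=1.0)     # ~140
```

Dialing up `intensity` widens the gap between trusted and distrusted sources, which is exactly why the subjectivity of the underlying trust scores matters so much.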

Confirmation bias is a hell of a drug, and Facebook trying to push people in one direction is not going to go over well.

To reveal all of this, Zuckerberg apparently invited a bunch of news organizations to talk about it:

Zuckerberg met with a group of news media executives at the Rosewood Sand Hill hotel in Menlo Park after delivering his keynote speech at Facebook’s annual F8 developer conference Tuesday.

The meeting included representatives from BuzzFeed News, the Information, Quartz, the New York Times, CNN, the Wall Street Journal, NBC, Recode, Univision, Barron’s, the Daily Beast, the Economist, HuffPost, Insider, the Atlantic, the New York Post, and others.

We weren’t invited. Does that mean Facebook doesn’t view us as trustworthy? I guess so. So it seems unlikely that he’ll much care about what we have to say, but we’ll say it anyway (though you probably won’t be able to read this on Facebook):

Facebook: You’re Doing It Wrong.

Facebook should never be the arbiter of truth, no matter how much people push it to be. Instead, it can and should be providing tools for its users to have more control. Let them create better filters. Let them apply their own “trust” metrics, or share trust metrics that others create. Or, as we’ve suggested on the privacy front, open up the system to let third parties come in and offer up their own trust rankings. Will that reinforce some echo chambers and filter bubbles? Perhaps. But that’s not Facebook’s fault — it’s part of the nature of human beings and confirmation bias.
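One way such third-party trust overlays could work, sketched here with entirely hypothetical names and data: the platform only exposes the raw feed, and the user (or a third party) supplies whatever trust metric they want to re-rank it.

```python
# Hypothetical sketch of user-chosen trust overlays: the platform hands
# over the feed; the trust metric comes from the user or a third party.
# All names and scores here are invented.

from typing import Callable

def apply_overlay(feed: list[dict], trust_fn: Callable[[str], float]) -> list[dict]:
    """Re-rank a feed by a caller-supplied trust metric, highest first."""
    return sorted(feed, key=lambda item: trust_fn(item["source"]), reverse=True)

feed = [
    {"source": "infowars.com", "title": "..."},
    {"source": "nytimes.com", "title": "..."},
]

# Two competing overlays, same feed, opposite orderings:
overlay_a = {"nytimes.com": 0.9, "infowars.com": 0.1}
overlay_b = {"nytimes.com": 0.2, "infowars.com": 0.8}

ranked_a = apply_overlay(feed, lambda s: overlay_a.get(s, 0.5))
ranked_b = apply_overlay(feed, lambda s: overlay_b.get(s, 0.5))
# ranked_a puts nytimes.com first; ranked_b puts infowars.com first.
```

The echo-chamber risk the article concedes is visible right in the sketch: nothing stops overlay_b from simply inverting overlay_a. But the decision sits with the user, not with Facebook.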

Or, hey, Facebook could take a real leap forward and move away from being a centralized silo of information and truly disrupt its own setup — pushing the information and data out to the edges, where the users could have more control over it themselves. And not in the simplistic manner of Facebook’s other “big” announcement of the week about how it’ll now let users opt-out of Facebook tracking them around the web (leaving out that they kinda needed to do this to deal with the GDPR in the EU). Opting out is one thing — pushing the actual data control back to the end users and distributing it is something entirely different.

In the early days of the web, people set up their own websites, and had pretty much full control over the data and what was done there. It was much more distributed. Over time we’ve moved more and more to this silo model in which Facebook is the giant silo where everyone puts their content… and has to play by Facebook’s rules. But with that came responsibility on Facebook’s part for everything bad that anyone did on its platform. And, hey, let’s face it, some people do bad stuff. The answer isn’t to force Facebook to police all the bad stuff; it’s to move back towards a system where information is more distributed, and we’re not pressured into certain content just because Facebook thinks it will lead to the most “engagement.”

Push the content and the data out and focus on the thing that Facebook has always been best at, at its core: the connection function. Connect people, but don’t control all of the content. Don’t feel the need to police the content. Don’t feel the need to decide who’s trustworthy and who isn’t. Be the protocol, not the platform, and open up the system so that anyone else can provide a trust overlay, and let those overlays compete. It would take Facebook out of the business of having to decide what’s good and what’s bad, and would give end users much more control.

Facebook, of course, seems unlikely to do this. The value of the control is that it allows it to capture more of the money from the attention generated on its platform. But, really, if Facebook doesn’t want to keep dealing with these headaches, it seems like the only reasonable way forward.

Companies: facebook


Comments on “Facebook Ranking News Sources By Trust Is A Bad Idea… But No One At Facebook Will Read Our Untrustworthy Analysis”

hij (profile) says:

Everything old is new again

Looks like Facebook learned nothing from AOL and its attempts to keep other businesses out. At least I do not have to figure out clever things to do with Facebook CDs. This does have the advantage, though, that it will split European regulators between those who favour personal liberty and those who favour traditional media outlets.

Ninja (profile) says:

How about this: don’t remove anything unless it’s clearly a crime, and even then, if it’s a crime in one country and not in another, just filter it out in the former. And tell sensitive people they can go fuck themselves and just hide the things they don’t like.

It’s pretty simple: if it’s a crime, then hide it (or remove it if it’s one of those universal things like child porn) and go after the goddamn source. Otherwise just let people hide it and make their own ‘feeds’ rosy, unicorn-filled things.

If countries like Germany want to push ethereal ‘fake news’ or ‘hate speech’ laws, then filter the heck out of their content, but let other, more mature countries have the thing censorship-free.

Is it that hard?

Richard (profile) says:

MSM Trust

“trust is such a subjective concept, that lots of people inherently trust certain sources over others — even when those sources have long histories of being full of crap”

Most MSM outlets have a reputation for being trustworthy – one that survives right up to the point where they report on something you have actual first-hand knowledge of – then you realise how bad they actually are, and you never trust them again.

It is not a matter of trusting some outlets – it is a matter of using the output of multiple sources, where you know at least roughly what their bias and expertise is – and then forming your own opinion.

Christenson says:

Re: MSM Trust

This “getting stuff all screwed up” isn’t exactly unique to traditional “main stream media”. It’s a function of the people in the system.

Not that there hasn’t been a disconnect between what shows up in MSM and most people’s direct experience. Tell me about the people in your life who have died, and compare that to what you see in the media. I know some who have wrecked cars and died, but this is not news. We’ll lose a few hundred to fentanyl, but this is not on the news. 20,000 or so will shoot themselves and die this year, but this is not news.

Richard (profile) says:

Re: Re: They Openly Admit It.

In addition, you would expect them to do exactly that – because that is what everyone (including the conservative media) does.

More worrying is how Facebook is kowtowing to the most illiberal regimes on the planet:


Anonymous Coward says:

Re: Do you EVER, for even ten seconds, worry about GOOGLE?

I have news for you: YouTube is the main self-publishing platform provided by Google; they also own Blogger. However, the main search engine is not a platform that allows people to self-publish. It is just an index of the Internet, and it indexes all the other self-publishing platforms, like Facebook, Twitter, etc., as well as YouTube.

Mike Masnick (profile) says:

Re: Do you EVER, for even ten seconds, worry about GOOGLE?

“Astute enough to spot which GIANT "platform" is left out? As always here.”

Er, YouTube is Google. The reason I included YouTube rather than Google in general is that YouTube is Google’s primary user generated content platform, and thus the main platform that people are demanding have their content moderated on…

Anonymous Coward says:

Re: Do you EVER, for even ten seconds, worry about GOOGLE?

Masnick has criticized Google in other articles, but your track record shows you don’t like those either. Because reasons.

The easier thing to do is to not give you what you want because giving you what you want is an endless chore akin to filling a bottomless pit…

Anonymous Coward says:

"trustworthy" MSM can easily become war propaganda cheerleaders

The “trustworthiness” of the press was a big argument that I was having with many people in 2003 (and consistently losing) as my country was preparing to invade Iraq for possession of “weapons of mass destruction” and “ties to Bin Ladin” and various other accusations. That’s because all the so-called “respectable” and “trustworthy” news organizations such as the NY Times were pushing a unified narrative that basically ignored all the contradictory evidence, and in many cases, promoted outright lies as incontrovertible fact.

There were many small American news outlets – including Infowars – that were reporting a much more accurate analysis of the Iraq invasion than any of the “trusted” American mainstream media. The well-worn line “if we only knew then what we know now” got thrown around an awful lot in the aftermath of not finding any of the claimed weapons that were supposed to justify the invasion, but the sad fact is that everything was already known “then” (before the invasion) and was all over the internet for anyone who cared to do their own research and draw their own conclusions. It was primarily only the US mainstream media that got it wrong, while the rest of the world and the “conspiracy”-minded websites were almost universally correct in their skepticism and their reporting of critical facts that the MSM refused to touch.

Many foreign newspapers, including the UK’s Independent and Guardian, dared to question the official war narrative, but not a single U.S.-based news outlet dared stray from the official Bush-Cheney war propaganda. The American mainstream press had completely stopped doing journalism and was instead acting as a cheerleader for a bogus war that ultimately claimed millions of lives and trillions of dollars.

I can’t believe 15 years have already passed, and I’m still literally seething with rage about the mainstream media leading this country over a cliff — and then blaming it on everyone else but themselves.

The 2003 Iraq invasion is the single biggest reason why I don’t trust any of the American mainstream media news outlets — and I’m sure I never will for as long as I live.

Anonymous Coward says:

Re: Re: "trustworthy" MSM can easily become war propaganda cheerleaders

As Hearst and Pulitzer learned –and demonstrated– way back in the 19th century: war sells newspapers. So it makes perfect sense that the press will always be on the side of starting wars, because it’s in their own commercial interest. (and it’s only much later when the public loses interest that they change tack.)

It’s interesting (but not unexpected) that even Trump’s harshest critics in the press loudly applauded him when he bombed the Syrian government, both last year and this year. And what could be better for newspaper sales and cable news ratings than war against Russia?

SirWired (profile) says:

I have no problem with this

Given a finite amount of data that is going to pop up in a user’s feed, there are going to be SOME criteria for what makes the grade. Some internal measure of how “trustworthy” the source is seems as good a criterion as any.

It’s no different from a search engine ranking search results based on its best guess as to how “useful” each result is, instead of just counting the number of times the keywords show up on the page. Sometimes it’ll guess well, sometimes it won’t, and useful results will be ranked lower than some pathetic spam.

Anonymous Coward says:

Two people read the same newspaper article. A cat person and a dog person.

The dog person reads the article and rates it as accurate.

The cat person reads the article and rates it as questionable.

Six months go by and both revisit the article in light of new information.

The dog person rates it as accurate and credible.

The cat person rates it as debunked and not-credible.

Therein lies the challenge with building algorithms to automate this process. Let’s say you assign a credibility score based on the historical human scoring of credibility by source: how frequently that source, or author, accurately reports the news. As polarized as sources are today, the dog person and the cat person training an algorithm will come away with completely different automated processes reflecting their subjective world views.
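The commenter’s scenario can be made concrete with a toy sketch (all names and ratings invented): compute each source’s credibility as the average of its historical human ratings, and the resulting score simply inherits whichever rater population supplied it.

```python
# Toy version of the credibility scoring the comment describes: average
# historical human ratings (0 = debunked, 1 = accurate) per source.
# All names and numbers are invented for illustration.

from collections import defaultdict

def credibility(ratings: list[tuple[str, float]]) -> dict[str, float]:
    """Mean historical rating per source."""
    by_source = defaultdict(list)
    for source, score in ratings:
        by_source[source].append(score)
    return {src: sum(vals) / len(vals) for src, vals in by_source.items()}

# The same hypothetical outlet, rated by two different populations:
dog_ratings = [("daily-herald", 1.0), ("daily-herald", 0.9)]
cat_ratings = [("daily-herald", 0.3), ("daily-herald", 0.1)]

credibility(dog_ratings)  # ~0.95 for daily-herald
credibility(cat_ratings)  # ~0.2 for daily-herald
```

Whichever population’s ratings go into training wins: the “algorithm” is just their subjectivity with extra steps.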

That is where we are today.

We have automated algorithms scoring high credibility to sources reporting that the CrowdStrike analysis of the DNC servers suggests it was done by Russians.

We have Bill Binney and associated security researchers claiming that what is known about the event means it had to be an insider from the DNC.

Huge political stakes are at play so expect both sides of that coin to fight tooth and nail to force their will over control of the algorithms that do get deployed.

Richard (profile) says:

Re: Re:

“Therein lies the challenge with building algorithms to automate this process. Let’s say you assign a credibility score based on the historical human scoring of credibility by source: how frequently that source, or author, accurately reports the news. As polarized as sources are today, the dog person and the cat person training an algorithm will come away with completely different automated processes reflecting their subjective world views.”

Actually it is far more complicated than that.

There is no straight left-right axis anymore.

Foreign policy is even more complicated.

Compare for example George Galloway’s take on the world with that of Peter Hitchens.

On the Israel vs Arab (Muslim) axis they would violently disagree

On the Russia vs the West axis they agree (with each other) and both disagree with the western establishment view.

The fact is that most people (and news outlets) are a mixture of different biases on different topics – and they may change with time.

Generally I tend to trust the BBC and the Guardian – but on some issues I know that they have a bias and so I will look at other sources too.
