When Facebook Turned Off Its News Feed Algorithm, It Made Everyone's Experience Worse… But Made Facebook More Money
from the oh,-look-at-that dept
For reasons I don’t fully understand, over the last few months, many critics of “big tech” and Facebook, in particular, have latched onto the idea that “the algorithm” is the problem. It’s been almost weird how frequently people insist to me that if only social media got rid of algorithmic recommendations, and went back to the old-fashioned chronological news feed order, all would be good in the world again. Some of this seems based on the idea that algorithms are primed to lead people down a garden path from one type of video to ever more extreme videos (which has certainly happened, though how often is never made clear). Some of it seems to be a bit of a kneejerk reaction to simply disliking the fact that these companies (which many people don’t really trust) are making decisions about what you may and may not like — and that feels kinda creepy.
In the past few weeks, there’s been a bit of a fever pitch on this topic, partly in response to whistleblower Frances Haugen’s leak of documents, in which she argues that Facebook’s algorithm is a big part of the problem. And then there’s the recent attempt by some Democrats in Congress to take away Section 230 from algorithmically recommended information. As I noted, the bill is so problematic that it’s not clear what it’s actually solving.
But underlying all of this is a general opinion that “algorithms” and “algorithmic recommendations” are inherently bad and problematic. And, frankly, I’m confused by this. At a personal level, the tools I’ve used that do algorithmic recommendations (mainly: Google News, Twitter, and YouTube) have been… really, really useful? And also pretty accurate over time in learning what I want, and thus providing me more useful content in a more efficient manner, which has been pretty good for me, personally. I recognize that not everyone has that experience, but at the very least, before we unilaterally declare algorithms and recommendation engines bad, it might help to understand how often they’re recommending stuff that’s useful and helpful, as compared to how often they’re causing problems.
And, for all the talk about how Haugen’s leaks have shone a light on the “dangers” of algorithms, the actual documents she leaked may suggest something else entirely. Reporter Alex Kantrowitz has reported on one of them: a study Facebook ran on what happens when it turns off the algorithmic rankings, and… it was not pretty. But, contrary to common belief, Facebook actually made more money without the News Feed algorithm.
In February 2018, a Facebook researcher all but shut off the News Feed ranking algorithm for 0.05% of Facebook users. “What happens if we delete ranked News Feed?” they asked in an internal report summing up the experiment. Their findings: Without a News Feed algorithm, engagement on Facebook drops significantly, people hide 50% more posts, content from Facebook Groups rises to the top, and, surprisingly, Facebook makes even more money from users scrolling through the News Feed.
Considering how often we’ve heard, including from Haugen herself, that Facebook’s decision-making is almost always driven by whatever most benefits the bottom line, this deserves some consideration. Because the document… suggests something quite different. In fact, what the researchers seemed to find was that people hated the change, but it made them spend more time on the site and see more ads, because they had to poke around to find the interesting stuff they wanted to see, and that drove up ad revenue. If Facebook were truly focused on just the bottom line, then, it should consider turning off the News Feed algorithm — or just supporting the awful JAMA bill in Congress, which would create incentives for the same result:
Turning off the News Feed ranking algorithm, the researcher found, led to a worse experience almost across the board. People spent more time scrolling through the News Feed searching for interesting stuff, and saw more advertisements as they went (hence the revenue spike). They hid 50% more posts, indicating they weren’t thrilled with what they were seeing. They saw more Groups content, because Groups is one of the few places on Facebook that remains vibrant. And they saw double the amount of posts from public pages they don’t follow, often because friends commented on those pages. “We reduce the distribution of these posts massively as they seem to be a constant quality complaint,” the researcher said of the public pages.
As always, there are lots of factors that go into this, and one experiment may not be enough to tell us much. Also, it’s entirely possible that, over time, the long-term result would be less revenue, because the increasing annoyance of not finding the more interesting stuff causes people to leave the platform entirely. But, at the very least, this leaked research pokes a pretty big hole in the idea that getting rid of algorithmic recommendations does anything particularly useful.