Yet Another Study Debunks The ‘YouTube’s Algorithm Drives People To Extremism’ Argument

from the maybe-the-problem-is-us,-not-the-machines dept

A few weeks ago, we had director Alex Winter on the podcast to talk about his latest documentary, The YouTube Effect. In that film he spoke with a young man who described getting “radicalized” on YouTube and going down the “alt-right rabbit hole.” One thing Alex mentioned on the podcast that wasn’t in the documentary: at one point he asked the young man to go back to YouTube and see if it would take him down that path again, and the guy couldn’t get it to recommend sketchy videos no matter how hard he tried.

The story that’s made the rounds over the years was that YouTube’s algorithm was a “radicalization machine.” Indeed, that story has been at the heart of many recent attacks on recommendation algorithms from many different sites.

And yet, it’s not clear that the story holds up. It is possible that it was true at one point, but even that I’d call into question. Two years ago we wrote about a detailed study looking at YouTube’s recommendation algorithm from January 2016 through December of 2019, and try as they might, the researchers could find no evidence that the algorithm pushed people to more extreme content. As that study noted:

We find no evidence that engagement with far-right content is caused by YouTube recommendations systematically, nor do we find clear evidence that anti-woke channels serve as a gateway to the far right.

Anyway, the journal Science now has another study on this same topic that… more or less finds the same thing. This study was done in 2020 (so after the last study) and also finds little evidence of the algorithm driving people down rabbit holes of extremism.

Our findings suggest that YouTube’s algorithms were not sending people down “rabbit holes” during our observation window in 2020…

Indeed, this new research report cites the one we wrote about two years ago, saying that it replicated those findings, but also highlights that it did so during the election year of 2020.

We report two key findings. First, we replicate findings from Hosseinmardi et al. (20) concerning the overall size of the audience for alternative and extreme content and enhance their validity by examining participants’ attitudinal variables. Although almost all participants use YouTube, videos from alternative and extremist channels are overwhelmingly watched by a small minority of participants with high levels of gender and racial resentment. Within this group, total viewership is heavily concentrated among a few individuals, a common finding among studies examining potentially harmful online content (27). Similar to prior work (20), we observe that viewers often reach these videos via external links (e.g., from other social media platforms). In addition, we find that viewers are often subscribers to the channels in question. These findings demonstrate the scientific contribution made by our study. They also highlight that YouTube remains a key hosting provider for alternative and extremist channels, helping them continue to profit from their audience (28, 29) and reinforcing concerns about lax content moderation on the platform (30).

Second, we investigate the prevalence of rabbit holes in YouTube’s recommendations during the fall of 2020. We rarely observe recommendations to alternative or extremist channel videos being shown to, or followed by, nonsubscribers. During our study period, only 3% of participants who were not already subscribed to alternative or extremist channels viewed a video from one of these channels based on a recommendation. On one hand, this finding suggests that unsolicited exposure to potentially harmful content on YouTube in the post-2019 era is rare, in line with findings from prior work (24, 25).

What’s a little odd, though, is that this new study keeps attributing the result to changes YouTube made to its algorithm in 2019, and even suggests the finding might be explained by YouTube having already radicalized everyone who was open to being radicalized.

But… that ignores that the other study they directly cite found the same thing starting in 2016.

It’s also a little strange that this new study seems determined to find something to be mad at YouTube about. Its focus shifts to the fact that, even though the algorithm isn’t driving new users to extremist content, that content is still hosted on YouTube, and external links (often from nonsense peddlers) keep driving traffic to it:

Our data indicate that many alternative and extremist channels remain on the platform and attract a small but active audience of individuals who expressed high levels of hostile sexism and racial resentment in survey data collected in 2018. These participants frequently subscribe to the channels in question, generating more frequent recommendations. By continuing to host these channels, YouTube facilitates the growth of problematic communities (many channel views originate in referrals from alternative social media platforms where users with high levels of gender and racial resentment may congregate)

Except, if that’s the case, then it doesn’t really matter what YouTube does here. Because even if YouTube took down that content, those content providers would post it elsewhere (hello Rumble!) and the same nonsense peddlers would just point there. So, it’s unclear what the YouTube problem is in this study.

Hell, the study even finds that in the rare cases where the recommendation algorithm does suggest some nonsense peddler to a non-jackass, most people know better than to click:

We also observe that people rarely follow recommendations to videos from alternative and extreme channels when they are watching videos from mainstream news and non-news channels.

Either way, I do think it’s fairly clear that the story you’ve heard about YouTube radicalizing people not only isn’t true today, but if it was ever true, it hasn’t been so in a long, long time.

The issue is not recommendations. It’s not social media. It’s that there is a subset of the population who seem primed and ready to embrace ridiculous, cult-like support for a bunch of grifting nonsense peddlers. I’m not sure how society fixes that, but YouTube isn’t magically going to fix it either.



Comments on “Yet Another Study Debunks The ‘YouTube’s Algorithm Drives People To Extremism’ Argument”

14 Comments
Anonymous Coward says:

*checks my YT recommendations*

Hm, there’s very little, if any, content remotely related to the culture-warring violent insurrectionists, even in the YouTube media I watch.

Vtubers, cooking, animals, random science stuff, enough Minecraft content to make people think I’m crazy, and even gaming, which does tend to attract the aforementioned assholes.

I guess actually teaching my pet YT algorithm what I WANT to watch is paying off.

Anonymous Coward says:

I can only talk about myself, but my experience was different. In general I only used YouTube to watch music videos, and I’ve locked down most of my privacy settings/don’t search Google while signed into Gmail. My recommended feed was nothing but music. That is, until I did a Google search for QAnon, because I wanted primary sources on what it was, while signed into Gmail in another window. All of a sudden QAnon videos started showing up in my recommended feed. It doesn’t force people down the rabbit hole from scratch, but once you open the door to look…

Anonymous Coward says:

Re:

see, if I search for “qanon” all I get are videos about its believers being prosecuted for a range of crimes, about how it’s a dangerous cult with an oversized influence on US politics, and how elected officials who follow it believe in jewish space lasers. Which I guess just goes to show, this kind of tailored content only leads to rabbit holes when there’s a ton of pre-existing radicalisation already there via other channels like TV, radio, or tabloids. Normal folks with normal search habits get it framed in terms of normal folks’ horror or amusement at it.

Arijirija says:

Well, from my experience, if I go looking on Youtube for oud music, I find oud music. If I go looking for viola da gamba music, I find viola da gamba music. If I go looking for heavy metal, I find heavy metal. And if I go looking for 60s folk, I find 60s folk.

Youtube hasn’t made me a fan of any of those if I didn’t want to be, or hadn’t already wanted to know about them. So it would appear that GIGO – garbage in garbage out – still applies. Or to rephrase it, you can’t make a silk purse out of a sow’s ear.

Dealing with radicalization requires skills I suspect the people commissioning and doing these studies don’t have. And having some slight knowledge of the medical field gives me the certainty that dealing with symptoms is not the best way to go – e.g., amputation is one way to deal with a headache, and it’s a very quick cure and the problem never occurs again, but it also guarantees you won’t get repeat custom or grateful referrals.

That Anonymous Coward (profile) says:

From the mental giants who believe…

Being GLBTQI+ is a choice – because we enjoy you assholes harassing us for existing so we picked this

Reading a GLBTQI+ book will make your kids teh gay – I dunno you assholes claimed you read the bible and Jesus would smite a bunch of y’all

Addiction is just a lack of willpower & not loving god – and nothing to do with judgemental assholes who make seeking help dehumanizing, demeaning, and ineffective

It is not my baby’s fault, you made them do it – bitch if we had the power to make people do things we’d have used it to make you less of a bitch

The site gives you what you asked it to; it then offers up things similar to what you sought out… so it’s not giving you videos to make you an extremist, you are selecting those videos. Its job isn’t to stop you from doing stupid shit, despite everyone wanting it to (because it’s impossible to do).
