YouTube Radicals and Recommendation Bias

The Assistant Village Idiot passed along an interesting article about concerns being raised over YouTube’s tendency to “radicalize” suggestions in order to keep people on the site. I’ve talked before about the hidden dangers and biases algorithms can introduce into our lives, and this was an interesting example.

Essentially, it appears that YouTube tends to suggest more inflammatory or radical content in response both to regular searches and to watching more “mainstream” content. So for example, if you search for the phrase “the Pope,” as I just did in incognito mode on Chrome, these are the top two hits it gives me:

Neither of those videos is even among the most-watched Pope videos. Scrolling down a bit turns up a compilation of funny moments with the Pope (a little boy steals the show) with 2.1 million views and a Jimmy Kimmel bit on him with 4 million views.

According to the article, watching more mainstream news stories will quickly get you to more biased or inflammatory content. It appears that in its quest to build an algorithm that keeps users on the site, YouTube has created the digital equivalent of junk food: content that is tempting but without a lot of substance.

It makes a certain amount of sense if you think about it. Users won’t keep playing around on YouTube unless the next thing they see is slightly more tempting than what they were originally looking for. Very few people would watch three videos in a row of Obama State of the Union address coverage, but you might watch Obama’s State of the Union address, followed by Obama’s last White House Correspondents’ Dinner talk, followed by “Obama’s best comebacks” (the videos that were suggested to me when I searched for “Obama state of the Union”).
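To make that intuition concrete, here’s a minimal sketch in Python of how this kind of drift could emerge. This is my own toy model, not YouTube’s actual system: the titles, the scoring weights, and the “viewers favor a small step up in intensity” assumption are all invented for illustration.

```python
# A toy model of engagement-maximizing recommendation -- NOT YouTube's
# actual algorithm. The scoring weights and the "viewers like a small
# step up in intensity" assumption are invented for illustration.
from dataclasses import dataclass


@dataclass
class Video:
    title: str
    relevance: float  # how closely it matches the original query, 0-1
    intensity: float  # how provocative/sensational it is, 0-1


def predicted_watch_time(video: Video, current_intensity: float) -> float:
    """Hypothetical engagement score: viewers favor relevant videos that
    sit a small step *above* the intensity of what they just watched."""
    sweet_spot = current_intensity + 0.3  # "slightly more tempting"
    closeness = 1.0 - abs(video.intensity - sweet_spot)
    return 0.4 * video.relevance + 0.6 * closeness


def recommend_chain(catalog: list[Video], steps: int) -> list[Video]:
    """Greedily follow the top-scoring recommendation `steps` times."""
    current_intensity = 0.0
    chain: list[Video] = []
    for _ in range(steps):
        best = max(catalog, key=lambda v: predicted_watch_time(v, current_intensity))
        chain.append(best)
        current_intensity = best.intensity
        catalog = [v for v in catalog if v is not best]  # don't repeat a video
    return chain


catalog = [
    Video("State of the Union (full speech)", relevance=1.0, intensity=0.1),
    Video("Correspondents' Dinner highlights", relevance=0.8, intensity=0.4),
    Video("Best comebacks compilation", relevance=0.6, intensity=0.7),
    Video("Shocking 'what they won't tell you' expose", relevance=0.3, intensity=1.0),
]

for video in recommend_chain(catalog, steps=4):
    print(video.title)
# Four clicks take you from the full speech to the conspiracy-bait video,
# even though no single recommendation was a big jump.
```

Nothing in that loop optimizes for extremism directly; it only optimizes for the next click, and the drift toward the most intense content falls out on its own.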

Even with benign things I’ve noticed this tendency. For example, my favorite go-to YouTube channel after a long day is the Epic Rap Battles of History channel. After I’d watched two or three videos, I noticed it would start pointing me towards videos from the creators’ lesser-watched personal channels. I had actually thought this was some sort of setting the creators controlled, but now I’m wondering if it’s the same algorithm. Maybe people clicking at random gravitate towards lesser-watched content as they keep watching. Who knows.

What makes this trend a little concerning is that so many young people use YouTube to learn about different things. My science-teacher brother had mentioned seeing an uptick in kids spouting conspiracy theories in his classes, and I’m wondering if this is part of the reason. Back in my day, kids had to actually go looking for their offbeat conspiracy theories; now YouTube brings them right to them. In fact, a science teacher who asks their kids to look up information on a benign topic may find they’ve inadvertently put them in the path of conspiracy theories, served up as video recommendations right after the real science. It seems this algorithm may have inadvertently stumbled on how to prime people for conversion to radical thought, just through collecting data.

According to the Wall Street Journal, YouTube is looking to tackle this problem, but it’s not clear how they’re going to do that without running into the same problems Facebook did when it started to crack down on fake news. It will be interesting to watch this develop, and it’s a good bias to keep in mind.

In the meantime, here’s my current favorite Epic Rap Battle:

