YouTube’s algorithm doesn’t care if you “dislike” videos


YouTube already stopped displaying how many dislikes a video has received, but apparently giving a video a thumbs down doesn’t change how many similar videos the platform recommends to you, either. Photo: Wachiwit (Shutterstock)

My YouTube recommendations are full of old reruns of Gordon Ramsay’s Kitchen Nightmares. It may be partly my fault for getting drunk one night and watching a full episode. Let me tell you, if there’s one thing I don’t want in my feed anymore, it’s the famously blowhard Brit tearing down another chef while the world’s most obnoxious sound effects (braaa-reeeee) blare in the background. I disliked many of these videos, but now Hell’s Kitchen is popping up on my page too, and I feel more and more like a “raw” steak that Ramsay pokes and scolds.

But apparently I’m not the only one with YouTube recommendation problems. A Mozilla Foundation report released Monday claims, based on a survey and crowdsourced data, that YouTube’s “dislike” and “don’t recommend channel” feedback tools don’t actually change video recommendations much.

The report makes two points. One: users consistently feel that the controls Google-owned YouTube provides don’t actually make a difference. Two: based on user data, the controls have a “negligible” impact on recommendations, meaning “most unwanted videos still slip through.”

The foundation relied on data from its own RegretsReporter browser extension, which lets users block certain YouTube videos from appearing in their feed. The report says its analysis is based on 2,757 survey respondents and 22,722 users who gave Mozilla access to more than 567 million video recommendations collected from late 2021 through June 2022.

While the researchers admit that survey respondents aren’t a representative sample of YouTube’s huge and diverse audience, a third of those surveyed said using YouTube’s controls didn’t seem to change their video recommendations at all. One user told Mozilla that they would report videos as misleading or spam, only to see them come back to their feed later. Respondents often said that blocking one channel would only lead to recommendations from similar channels.


YouTube’s algorithm recommends videos users don’t want to watch, and the problem is often worse than old Ramsay cable reruns. A 2021 Mozilla report, also based on crowdsourced user data, claimed that people browsing the video platform are regularly recommended violent content, hate speech, and political misinformation.

In this latest report, Mozilla researchers found that rejecting a video, such as a Tucker Carlson screed, would only lead to another video from the Fox News YouTube channel being recommended. Based on a review of 40,000 video pairs, they found that when one channel is blocked, the algorithm often simply recommends very similar videos from similar channels. Using the “Dislike” or “Not interested” buttons prevented only 12% and 11% of unwanted recommendations, respectively, compared to a control group. The “Don’t recommend channel” and “Remove from watch history” buttons were more effective at correcting users’ feeds, but still only prevented 43% and 29% of unwanted recommendations, respectively.

“In our analysis of the data, we found that YouTube’s user controls are inadequate as tools to prevent unwanted recommendations,” Mozilla researchers wrote in their study.

YouTube spokesperson Elena Hernandez told Gizmodo in an emailed statement that “our controls do not filter out entire topics or viewpoints, as this could have negative effects for viewers, like creating echo chambers.” The company has said it does not prevent all content on related topics from being recommended, but it also claims to promote “authoritative” content and suppress “borderline” videos that come close to violating its content moderation policies.

In a 2021 blog post, Cristos Goodrow, YouTube’s VP of Engineering, wrote that the company’s system is “constantly evolving” but that providing transparency about its algorithm “isn’t as simple as listing a formula for recommendations,” since its systems take into account clicks, watch time, survey responses, shares, likes, and dislikes.

Of course, like any other social media platform, YouTube struggles to build systems that can combat the full breadth of bad or even predatory content uploaded to the site. An upcoming book shared exclusively with Gizmodo said YouTube sacrificed billions of dollars in ad revenue to tackle the strange and disturbing videos being recommended to children.

While Hernandez claimed the company has expanded its data API, the spokesperson added, “Mozilla’s report doesn’t take into account how our systems actually work, so it’s difficult for us to glean many insights.”

But that is a criticism Mozilla lays at Google’s feet as well, saying the company doesn’t provide enough access for researchers to assess what shapes YouTube’s secret sauce, a.k.a. its algorithms.
