
Investigating YouTube’s algorithmic black box

July 13, 2021 | By Rande Price, Research VP – DCN

We’ve all taken a trip down the YouTube rabbit hole. It starts innocently enough: we watch a video recommended by a friend. Then, 30 to 60 minutes and countless videos later, an alarm goes off and we realize we’re watching a video that clearly violates YouTube’s standards and practices. But wait just a minute: this video was recommended by YouTube itself.

That’s right, the platform recommends videos that disregard their own standards. It is difficult to explain how this happens because YouTube provides little transparency into their recommendation algorithms (well, any of their algos, really). The rabbit hole turns out to be an algorithmic black box.

While there are penalties for video violations, how YouTube decides to act on prohibited content is often unclear. Penalties range from demonetization to the removal of individual videos or suspension of an entire account, or nothing at all. Little is known about the outcomes. Meanwhile, there is little users can do on the platform to prevent the targeting and amplification of regrettable (if not deplorable) content.

RegretsReporter

Consequently, Mozilla, a non-profit creator of browsers, apps, and tools, stepped in to shed light on YouTube’s black box of algorithms. It created the RegretsReporter browser extension, a crowdsourcing tool that helps users document their path to a “regrettable” YouTube video. The extension captures the user’s YouTube browsing behavior, but only for the five hours preceding a report, and the data is shared with Mozilla only if the user actively agrees.
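
To make that data-handling description concrete, here is a minimal TypeScript sketch of the rules described above: visits are logged locally, only the five hours preceding a report are included, and nothing is shared without explicit consent. This is not Mozilla’s actual extension code, and the names (RegretReportSession, VideoVisit) are hypothetical.

```typescript
// Hypothetical sketch of the data-handling rules described above.
// Not Mozilla's implementation.

interface VideoVisit {
  videoUrl: string;              // e.g. "https://www.youtube.com/watch?v=..."
  visitedAt: number;             // Unix timestamp in milliseconds
  recommendedByYouTube: boolean; // reached via a YouTube recommendation?
}

const FIVE_HOURS_MS = 5 * 60 * 60 * 1000;

class RegretReportSession {
  private visits: VideoVisit[] = [];

  // Called whenever the user lands on a watch page; data stays local.
  recordVisit(visit: VideoVisit): void {
    this.visits.push(visit);
  }

  // Build a report containing only the five hours of history preceding
  // the report, and only if the user has actively agreed to share it.
  buildReport(regrettedUrl: string, userConsented: boolean): VideoVisit[] | null {
    if (!userConsented) {
      return null; // nothing leaves the browser without explicit consent
    }
    const cutoff = Date.now() - FIVE_HOURS_MS;
    return this.visits.filter(
      (v) => v.visitedAt >= cutoff || v.videoUrl === regrettedUrl
    );
  }
}

// Example: the user reports a video and agrees to share the pathway data.
const session = new RegretReportSession();
session.recordVisit({
  videoUrl: "https://www.youtube.com/watch?v=example",
  visitedAt: Date.now(),
  recommendedByYouTube: true,
});
console.log(session.buildReport("https://www.youtube.com/watch?v=example", true));
```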

Mozilla’s newly released report, YouTube Regrets, is an analysis of shared browser data from over 37,000 YouTube users in 91 countries. In all, the pathways and content of 3,362 regrettable videos are explored in this study. It’s the largest-ever crowdsourced investigation into YouTube’s algorithms.

Regrettable videos, often recommended

This analysis finds that 12.2% of the reported videos either “should not be on YouTube” or “should not be proactively recommended,” based on YouTube’s Community Guidelines. The most frequently reported categories are misinformation and violent or graphic content. Because of the sheer volume of videos, Covid-19 misinformation is categorized separately from “general” misinformation; it accounts for a third of all categorized video regrets.

Unfortunately, YouTube’s own recommendations account for 71% of the regretted videos reported to Mozilla. And 43% of all regrettable videos were both recommended by YouTube and completely unrelated to the videos the user had been watching.

Mozilla and other researchers cannot confirm YouTube’s claims of progress in correcting its algorithms, because YouTube provides no insight into the design and operational practices of its recommendation systems. Therefore, Mozilla offers these recommendations as necessary next steps toward YouTube’s accountability:

  1. Allow independent audits of recommendation systems.
  2. Provide information about how recommendation systems work.
  3. Give users more control over which of their data is used to generate recommendations.
  4. Implement risk assessment programs to identify and evaluate the possibility and magnitude of harm caused by the recommendation system.
  5. Provide users with an option to opt-out of personalized recommendations in favor of receiving chronological, contextual, or search term-based recommendations.

Mozilla and other researchers believe there are significant consequences connected to YouTube’s algorithms. Further, many believe the algorithms are optimized in favor of YouTube’s business model: increasing users’ time spent on the platform in order to serve more advertising. The amplification of regrettable content, including pseudo-science, 9/11 conspiracy theories, animal mistreatment, and white supremacist material, cannot be accepted as a mere byproduct of that business model. Clearly, transparency is needed to identify and resolve the problems embedded in YouTube’s recommendation ecosystem.
