YouTube’s content recommendation algorithm also recommends videos that violate the platform’s policies and are subsequently removed from the service, according to research by the Mozilla Foundation.
The non-profit organization behind the Firefox browser built a browser add-on called RegretsReporter, which asks users whether they regret a YouTube video they have just watched and lets them report it.
Data collection ran for 10 months, during which users reported 3,362 regrettable videos. Overall the rate was well under one percent; Brazil had the highest rate, at 22 reports per ten thousand video views.
Misinformation and incitement to hatred
Users were more likely to regret videos that YouTube recommended than ones they chose themselves: 71 percent of the reports concerned videos recommended by the service, and recommended videos were reported 40 percent more often than those users found through their own searches. Non-English videos were reported 60 percent more often than English-language ones, which may reflect the limits of the recommendation algorithm’s language capabilities.
When the researchers examined the reported videos, they found that 12.2 percent violated YouTube’s terms of service, a fifth contained misinformation, and 12 percent spread misinformation about COVID-19; they also encountered violent content, harassment and hate speech.
Two hundred of the reported videos were later removed, but by the time that happened they had accumulated 160 million views.
In half of the cases, the recommended video had no connection at all to what the user had previously watched: a far-right channel might be recommended after a video about survival skills, for example.
Social pressure is increasing
Scaled up to YouTube’s actual audience, the results raise serious and troubling questions. What we found is just the tip of the iceberg
– said Brandi Geurkink, a Germany-based Mozilla employee, who called it unacceptable that the recommendation algorithm’s workings have remained opaque for years, even after concerns were raised about its social impact.
It is not only the amount of malicious content on the platform that matters; YouTube must also take responsibility when its tools amplify that content’s reach.
A YouTube spokesperson said:
Our recommendation system is designed to connect viewers with the content they love, and it recommends more than 200 million videos a day.
He added that the recommendation system was updated last year, reducing consumption of borderline content to less than 1 percent.
Brandi Geurkink is not only asking the video-sharing site’s parent company, Google, to verify the figures it cites through an independent audit; policy initiatives have also been launched in the United States, the United Kingdom and the European Union to require greater transparency from the operators of social media algorithms.
(Cover Image: Shutterstock)