
We need to be very careful about the YouTube videos we watch

According to research by the Mozilla Foundation, YouTube’s recommendation algorithm also recommends videos that violate the platform’s own policies and are subsequently removed from the service.

The non-profit organization that develops the Firefox browser built a browser extension called RegretsReporter, which detected when someone was watching a YouTube video and then asked the user whether they regretted it.
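As an illustration of how such an extension might work, the sketch below hooks the page’s video player and prompts the user when playback ends. This is a hypothetical TypeScript reconstruction: the element selectors, the regret-report message shape and the confirm prompt are assumptions for illustration, not RegretsReporter’s actual source code.

```typescript
// content-script.ts -- hypothetical WebExtension content script for youtube.com.
// A sketch only; RegretsReporter's real implementation is not shown in the article.

// Track players we have already hooked, since YouTube is a single-page app
// and the same <video> element can survive in-page navigation.
const hooked = new WeakSet<HTMLVideoElement>();

function hookPlayer(): void {
  const video = document.querySelector<HTMLVideoElement>("video");
  if (!video || hooked.has(video)) return;
  hooked.add(video);

  video.addEventListener("ended", () => {
    // Ask the user whether they regret having watched this video.
    const regretted = window.confirm("Do you regret watching this video?");
    if (regretted) {
      // Forward a report to the extension's background script for aggregation.
      chrome.runtime.sendMessage({
        type: "regret-report", // hypothetical message type
        url: location.href,
        reportedAt: Date.now(),
      });
    }
  });
}

// Re-check for a player whenever the page mutates (SPA navigation).
new MutationObserver(() => hookPlayer()).observe(document.body, {
  childList: true,
  subtree: true,
});
hookPlayer();
```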

The data collection process took ten months, during which 3,362 videos were reported as regrettable. Overall, the rate was well under one percent; it was highest in Brazil, where roughly 22 out of every ten thousand video views (0.22 percent) led to a report.

Scaremongering and incitement to hatred

Users were more likely to regret videos that YouTube recommended than ones they chose themselves: 71 percent of the reported videos had been recommended by the service. Recommended videos were reported 40 percent more often than videos users found through their own searches, and non-English videos were reported 60 percent more often than English-language ones, which may be down to the recommendation algorithm’s weaker handling of other languages.
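To make the comparison concrete, “reported 40 percent more often” describes the ratio between two regret rates, measured here per ten thousand watched videos to match the article’s framing. The counts below are invented placeholders, not Mozilla’s raw data.

```typescript
// Hypothetical illustration of relative regret rates; placeholder numbers only.

interface Cohort {
  regrets: number; // regret reports in this cohort
  views: number;   // watched videos in this cohort
}

// Regret rate per 10,000 watched videos ("22 out of ten thousand" framing).
const ratePerTenThousand = (c: Cohort): number =>
  (c.regrets / c.views) * 10_000;

// Example cohorts: recommended vs. self-searched videos (made-up counts).
const recommended: Cohort = { regrets: 28, views: 10_000 };
const searched: Cohort = { regrets: 20, views: 10_000 };

const rRec = ratePerTenThousand(recommended); // 28 per 10,000
const rSea = ratePerTenThousand(searched);    // 20 per 10,000

// A ratio of about 1.4 is what "40 percent more often" means.
console.log(`relative rate: ${(rRec / rSea).toFixed(2)}x`); // "relative rate: 1.40x"
```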

Examining the flagged videos, the researchers found that 12.2 percent violated YouTube’s terms of service, a fifth spread fake news and 12 percent contained misinformation about COVID-19; they also encountered violent content, harassment and hateful material.

Two hundred of the reported videos were later removed, but by then they had accumulated 160 million views.

In half of the cases, the recommended video had no connection at all to what the user had previously watched: a video about survival trips, for example, could be followed by a recommendation for a far-right channel.


Social pressure is increasing

Scaled up to YouTube’s actual audience, the results raise really serious and troubling questions. What we found is just the tip of the iceberg

– said Brandi Geurkink, a Mozilla employee based in Germany, who called it unacceptable that the recommendation algorithm’s workings have remained opaque for years, even after concerns were raised about its social impact.

In her view, it is not only about the amount of harmful content present on the platform: YouTube must also take responsibility if its own tools amplify that content’s reach.

A YouTube spokesperson said:

Our recommendation system is designed to connect viewers with content they love, and it recommends more than 200 million videos every day.

He added that the recommendation system was changed last year, reducing the consumption of borderline content to less than 1 percent.

Brandi Geurkink is not only asking Google, the video-sharing site’s parent company, to have the figures it cites verified by an independent audit; policy initiatives have also been launched in America, the UK and the European Union to require far greater transparency from the operators of social media algorithms.

(New World, ZDNet)

(Cover Image: Shutterstock)
