The Relationship Tips

YouTube algorithm pushing harmful videos, claims Mozilla
New Delhi: Firefox maker Mozilla has claimed that Google-owned YouTube keeps pushing harmful videos, with its recommendation algorithm surfacing misinformation, violent content, hate speech and scams to the platform’s more than two billion users.

The in-depth study also found that people in non-English speaking countries are far more likely to encounter videos they considered disturbing.

“YouTube’s controversial algorithm is recommending videos considered disturbing and hateful that often violate the platform’s very own content policies,” according to a 10-month long, crowdsourced investigation released by Mozilla late on Wednesday.

YouTube told NBC News that videos promoted by the recommendation system drive more than 200 million views a day from its homepage, and that the system pulls in more than 80 billion pieces of information.

“We constantly work to improve the experience on YouTube and over the past year alone, we’ve launched over 30 different changes to reduce recommendations of harmful content,” the company said in a statement.

Mozilla conducted the research using RegretsReporter, an open-source browser extension that converted thousands of YouTube users into YouTube watchdogs.

People voluntarily donated their data, giving researchers access to a pool of YouTube’s tightly held recommendation data.

Research volunteers encountered a range of regrettable videos, reporting everything from Covid fear-mongering to political misinformation to wildly inappropriate “children’s” cartoons.

“The non-English speaking world is most affected, with the rate of regrettable videos being 60 per cent higher in countries that do not have English as a primary language,” the findings showed.

Over 71 per cent of all videos that volunteers reported as regrettable were actively recommended by YouTube’s own algorithm.

Almost 200 videos that YouTube’s algorithm recommended to volunteers have since been removed from the platform, including several that YouTube itself deemed to violate its own policies.

These videos had a collective 160 million views before they were removed, said the Mozilla report.

“YouTube needs to admit their algorithm is designed in a way that harms and misinforms people,” said Brandi Geurkink, Mozilla’s Senior Manager of Advocacy.

“Our research confirms that YouTube not only hosts, but actively recommends videos that violate its very own policies. We also now know that people in non-English speaking countries are the most likely to bear the brunt of YouTube’s out-of-control recommendation algorithm,” Geurkink emphasised.

Recommended videos were 40 per cent more likely to be regretted than videos users searched for themselves. Several “Regrets” recommended by YouTube’s algorithm were later taken down for violating the platform’s own community guidelines, the report noted.

Last month, Mozilla said that Google’s new proposal for targeted ad tracking, Federated Learning of Cohorts (FLoC), has several properties that could pose “significant” privacy risks to users.

Mozilla published the results of an analysis of the FLoC proposal, and Firefox CTO Eric Rescorla said there are major privacy problems with the system.
