Mozilla wants to expose YouTube's algorithms. The company calls on people to help

What is Mozilla's plan?

The company has announced a new browser extension that lets users better understand how YouTube's recommendation algorithm works and provides detailed information about the patterns it detects.

YouTube recommendations can be amazing, but they can also be dangerous. The platform has long recommended harmful content to its users, from pandemic conspiracies to political disinformation, even to people who had previously watched only innocuous videos.

In October 2019, Mozilla published its own research on the topic, which found that YouTube recommended harmful videos containing misinformation. With the global pandemic and the upcoming US elections, the consequences of this problem are real and growing every day, the company said.

Despite these serious consequences, the algorithm behind YouTube's recommendations is completely opaque to users. What will YouTube recommend that US users watch in the final days before the election? Or in the days after, when the results may still be unclear?

With these questions in mind, Mozilla is giving YouTube users a way to act when they are recommended harmful videos: it has launched RegretsReporter, a browser extension created to crowdsource research into YouTube's recommendation problem.

Why Mozilla?

YouTube's recommendation AI is one of the most influential "curators of information" on the Internet. YouTube is the second most visited site in the world, and views driven by its algorithm account for 70% of total watch time on the site. It is no exaggeration to say that YouTube has a significant impact on public awareness and understanding of key issues around the world.

For years, people have been sounding the alarm about YouTube recommending videos that promote conspiracy theories, misinformation, and other harmful content. YouTube has previously said it made progress on the issue, claiming to have reduced harmful recommendations by 70%. But there is no way to verify these claims or to see where YouTube still has work to do.

Every day, while these claims go unverified, videos like "Civil War Is Coming" or "Plandemic" may be recommended to more and more people. Mozilla says it is "recruiting YouTube users to become the watchdogs" of the platform: people can share recommendation data to help Mozilla understand what YouTube is recommending and how recommendation systems in general can be made more trustworthy.

What specific information does Mozilla want to disclose?

  • Which recommended videos contain racist, violent, or conspiratorial content?
  • Are there patterns in video recommendations in terms of frequency or severity of malicious content?
  • Are there specific YouTube usage patterns for which harmful content is recommended?

With this information, Mozilla, along with fellow researchers, journalists, policymakers, and even YouTube engineers, can work to build more trustworthy systems for recommending content.

In keeping with its open-source approach, Mozilla will publicly share the research results and hopes that YouTube and others will use this information to improve their products. Updates will be posted on the project site.

How will the YouTube watchdog work?

As you browse YouTube, Mozilla's new extension automatically sends data about how much time you spend on the platform, without collecting any information about what you watch or search for. This can provide insight into how often users regret the recommendations YouTube serves them.
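The data-minimization idea Mozilla describes, recording how long a user spends on the platform but never what they watch, can be sketched roughly like this. This is an illustrative sketch, not RegretsReporter's actual implementation; all names here are invented:

```javascript
// Illustrative sketch of privacy-preserving time tracking, as Mozilla
// describes it: record *how long* a user is on YouTube, never *what*
// they watch. Not the actual RegretsReporter code.

// A randomly generated ID, not tied to any YouTube account.
// (A real extension would use crypto.getRandomValues for this.)
function makeUserId() {
  return Array.from({ length: 32 }, () =>
    Math.floor(Math.random() * 16).toString(16)
  ).join("");
}

// Accumulates time spent on the platform; no URLs or titles are stored.
class TimeTracker {
  constructor() {
    this.totalMs = 0;
    this.sessionStart = null;
  }
  enterYouTube(now) {
    // A tab showing youtube.com became active.
    if (this.sessionStart === null) this.sessionStart = now;
  }
  leaveYouTube(now) {
    // The tab was closed or navigated away.
    if (this.sessionStart !== null) {
      this.totalMs += now - this.sessionStart;
      this.sessionStart = null;
    }
  }
  payload(userId) {
    // The only data that would ever be reported: an opaque ID and a duration.
    return { userId, totalMs: this.totalMs };
  }
}
```

The point of the design is visible in `payload`: by construction, nothing about video content can leak, because it was never collected in the first place.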

The Mozilla browser extension prompts the user to enter data about the YouTube algorithm.
Screenshot: Mozilla

Users can also send a report directly to Mozilla. The report form asks them to describe the recommendation they regret and gathers information about the reported video and how they arrived at it. This is important for understanding how YouTube's recommendation system leads people to certain videos and what impact those videos can have on them.

Is it safe?

Throughout this entire process, user privacy is of the utmost importance to the company, Mozilla said.

The data Mozilla collects is associated with a randomly generated user ID, not with a specific YouTube account. Only Mozilla has access to the raw data, to minimize the risk of user identification. The extension does not collect data while videos are watched in a private window. More information is available in the RegretsReporter Privacy Notice.

Screenshot: Mozilla

An important aspect of research

"We ask users not to change their behavior on YouTube when using this extension. Don't go looking for provocative content. Instead, use YouTube as you normally would," Mozilla explains. According to the company, this is the only way to "collectively" understand whether YouTube is solving the problem of harmful recommendations and where the video service still needs to improve.

YouTube reaction

Mozilla has no formal agreement with Google or YouTube to study the recommendation algorithm, but Boyd says the organization has been in contact with the company and intends to share its findings.

YouTube, however, said that Mozilla's methodology appears to be "questionable."

A YouTube spokesman said in a statement to The Verge that the company is always interested in research into its recommendation system. "However, it is difficult to draw general conclusions from anecdotal examples, and we are constantly updating our recommendation systems to improve the user experience," the spokesman said, adding that over the past year YouTube has launched "more than 30 different changes to reduce recommendations of borderline content."

What's wrong with YouTube's algorithms?

At the end of 2019, YouTube finally admitted that its recommendation system surfaces harmful content. This was a small step in the right direction: the platform's recommendation algorithms had long drawn complaints.

YouTube ignores warnings

As Bloomberg reported, YouTube executives ignored warnings and let toxic videos run wild. According to company employees, proposals to change recommendations and curb conspiracy content were sacrificed for engagement.

Back in 2018, YouTube CEO Susan Wojcicki took the stage at the South by Southwest conference to defend the platform. Her company's reputation, battered by months of criticism for amplifying lies online, had been shaken after the site promoted a conspiracy-theory video about the high school shooting in Parkland, Florida. The video's authors claimed the victims were "crisis actors."

Wojcicki is generally reluctant to speak in public, but at the conference in Austin she presented a solution she hoped would help suppress conspiracy theories: a small text box beneath videos that question conventional wisdom, such as the moon landing, drawing on sites like Wikipedia to help people find the truth.

YouTube, which is striving to overtake television, is estimated to generate over $16 billion in advertising sales annually. But that day Wojcicki compared her video site to a different kind of institution. "We're more like a library," she said, taking the familiar position of a free-speech advocate. "If you look at libraries, there has always been controversy."

How does YouTube hide the truth?

YouTube brushes off independent researchers who try to understand its recommendation problem. The Mozilla Foundation has created a timeline to prove it.

  • First came the news that YouTube's recommendation system harms users: the algorithm has radicalized young people in the US, sown division in Brazil, spread state-sponsored propaganda in Hong Kong, and more.
  • Then YouTube responded, but not by admitting error or offering a detailed solution. Instead, the company issued statements criticizing the research methodologies used to examine its recommendations, along with vague promises that it is "working on it."
  • Later, in a blog post, YouTube admitted that its recommendation engine had offered users borderline content, and posted a graph showing the significant resources it has devoted to the issue over the years.

    "What they don't acknowledge is how they have brushed off the journalists and academics who have highlighted this issue for years. And there is still a clear lack of publicly verifiable data to support YouTube's claims that it is solving the problem," Mozilla said.

As a result, Mozilla published a list of YouTube's responses to third-party research on its recommendation system. The timeline, posted on the Mozilla blog, records 14 responses, evasive or dismissive, issued over 22 months, in reverse chronological order.

Top trends in YouTube excuses

  • YouTube often claims to fix the problem by changing its algorithm, but provides little detail on what exactly those changes do.
  • YouTube claims to have data that contradicts independent research, but refuses to share this data.
  • YouTube dismisses independent research on this topic as erroneous or anecdotal, but refuses to provide third-party access to its data to prove it.

In a blog post, YouTube stated that borderline content accounts for 1% of content viewed by users in the United States. But with 500 hours of video uploaded to YouTube every minute, how many hours of such content are still being served to users? After hundreds of stories in recent months about the impact of these videos on people's lives, it is clear the problem is too important to be played down with statistics, especially when outside experts are prevented from seeing and studying the problem as a whole.

What's the bottom line?

Since Wojcicki took the stage to defend YouTube last year, new high-profile conspiracy theories have appeared on the platform, including claims about childhood vaccination and about Hillary Clinton's ties to a satanic cult, angering lawmakers seeking to regulate technology companies. A year later, YouTube is associated even more strongly with the web's dark corners. Perhaps Mozilla's new plan for dealing with inappropriate content will help clarify how recommendations are generated. At the very least, it re-raises the important issue of manipulating public opinion through social networks and media.
