Flat Earth, anti-vaccination, and a Moon without Americans: how YouTube became the main channel for spreading conspiracy theories and hoaxes

Recommendation systems are one of the most common forms of artificial intelligence that users encounter, whether they are aware of it or not. Such systems are built into Facebook and Twitter, and, of course, into YouTube's "suggested videos".

Such systems are extremely complex, especially at the scale of heavily visited services. Like other similar neural networks, YouTube's AI tries to show you the most relevant videos, based both on the videos you have watched before and on interests gathered through Google.

In the case of YouTube, the AI consists of two neural networks: the first generates candidate recommendations, and the second ranks them for a specific user.

The initial selection of candidates for the "Recommended" feed is based on an analysis of user activity: the videos a person has watched before and the sites they have visited. At this stage the candidate-generation network produces several hundred options, which are then personalized for the particular user. The network is willing to set aside many parameters (age, gender, location) in order to show the user more of what they watched recently.
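
A minimal sketch of how such a candidate-generation stage can work is shown below. Everything here (the embeddings, the video IDs, the averaging) is an illustrative assumption, not YouTube's actual code; the idea is simply to represent the user's watch history as a vector and retrieve the nearest videos in the same space.

```python
import numpy as np

# Hypothetical precomputed video embeddings ({video_id: vector});
# in a real system these would come from a trained neural network.
VIDEO_EMBEDDINGS = {
    "music_clip": np.array([0.9, 0.1, 0.0]),
    "flat_earth": np.array([0.2, 0.8, 0.1]),
    "cooking":    np.array([0.1, 0.2, 0.9]),
}

def user_embedding(watch_history):
    """Represent the user as the average of the embeddings of
    videos they watched (a crude stand-in for the first network)."""
    return np.mean([VIDEO_EMBEDDINGS[v] for v in watch_history], axis=0)

def generate_candidates(watch_history, k=2):
    """Return the k unwatched videos closest to the user's taste
    vector; the production system retrieves several hundred."""
    u = user_embedding(watch_history)
    scores = {
        vid: float(np.dot(u, emb) / (np.linalg.norm(u) * np.linalg.norm(emb)))
        for vid, emb in VIDEO_EMBEDDINGS.items()
        if vid not in watch_history
    }
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(generate_candidates(["music_clip"]))  # ['flat_earth', 'cooking']
```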

Then comes the second neural network, which is responsible for ranking. It evaluates the candidate videos against a wide range of criteria, with the main goal of matching each video as closely as possible to the user's interests.
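
Continuing the sketch, a toy ranking stage could score each candidate against several user-specific signals. The features and weights below are invented for illustration; the real ranking network learns its criteria from data rather than using a fixed formula.

```python
# Hypothetical features for each candidate video, invented for
# illustration; the real ranking network uses far more signals.
candidates = [
    {"id": "flat_earth", "similarity": 0.85, "freshness": 0.9, "past_ctr": 0.12},
    {"id": "cooking",    "similarity": 0.40, "freshness": 0.5, "past_ctr": 0.08},
]

# Stand-in for the learned model: a fixed linear combination.
WEIGHTS = {"similarity": 0.6, "freshness": 0.2, "past_ctr": 0.2}

def rank(cands):
    """Order candidates so the videos that best match the user's
    interests (by this toy score) come first."""
    return sorted(
        cands,
        key=lambda c: sum(WEIGHTS[f] * c[f] for f in WEIGHTS),
        reverse=True,
    )

for c in rank(candidates):
    print(c["id"])  # flat_earth, then cooking
```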

It does not matter much to the recommendation system that the user will skip some of the proposed videos; in aggregate, it tracks the signals that matter most to it and, based on them, surfaces videos that are rapidly gaining views.

The system is trained continuously: the networks are fed a user's activity history up to a certain time t and are trained to predict what the user would want to watch at time t + 1. As a result, the models accumulate more data and get to know the user better and better. YouTube's AI developers consider this one of the best ways to recommend videos.
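
This t / t + 1 setup amounts to building next-item training examples from the watch log. A minimal illustration, using a hypothetical log:

```python
# A single user's chronological watch log (hypothetical data).
watch_log = ["music_clip", "live_concert", "flat_earth", "moon_hoax"]

def training_pairs(log):
    """For each step t, the input is the history up to time t and
    the label is the video actually watched at time t + 1."""
    return [(log[: t + 1], log[t + 1]) for t in range(len(log) - 1)]

for history, label in training_pairs(watch_log):
    print(history, "->", label)
# Every new watch becomes a fresh (history, next-video) example,
# which is why the system keeps learning about the user over time.
```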

Why YouTube shows us conspiracy videos

YouTube's recommendation system, for all its technical advantages, has a clear ethical flaw: its main purpose is to make the user watch videos for as long as possible and, as a result, see more ads.
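
The difference this objective makes can be seen in a tiny hypothetical comparison: a video that is clicked less often but holds viewers far longer wins under watch-time optimization. All numbers below are invented for illustration.

```python
# Hypothetical per-video stats: click probability and expected
# minutes watched after a click.
videos = {
    "short_news_clip": {"p_click": 0.30, "exp_minutes": 2.0},
    "long_conspiracy": {"p_click": 0.20, "exp_minutes": 25.0},
}

pick_by_clicks = max(videos, key=lambda v: videos[v]["p_click"])
pick_by_watch_time = max(
    videos, key=lambda v: videos[v]["p_click"] * videos[v]["exp_minutes"]
)

print(pick_by_clicks)      # short_news_clip wins on clicks alone
print(pick_by_watch_time)  # long_conspiracy wins on expected watch time
```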


For example, the algorithm sees that a famous musician's music video is quickly gaining views and going viral. The neural network begins offering it to users who might like it, and automatically plays the next video when it ends. Before long the user is watching not a music video but a clip claiming that Americans never landed on the Moon, or one about a Masonic conspiracy.

How this scheme plays out is well illustrated by a survey conducted at a convention of flat Earth believers: 29 of the 30 participants said they learned about the theory from a YouTube video.

The flat Earth theory is based on an idea proposed in the 19th century by the English inventor Samuel Rowbotham. Its supporters hold that the Earth is a flat disk about 40,000 km in diameter, centered on the North Pole. Along the entire rim of the disk runs a wall of ice that keeps the world's oceans from spilling over. The theory states that the Sun, the Moon, and the stars circle above the Earth's surface, and that all photographs taken from space are fake.

The only respondent who became a flat Earth supporter without first watching a video on YouTube came to believe in the idea after talking with relatives, who in turn had learned about it from YouTube videos.

YouTube is the main distributor of ideas about flat Earth.

This scheme benefits all participants: users get an endless feed of content, YouTube gets millions of hours of watch time, and advertisers get views on their ads.

What is YouTube doing about this?

YouTube's community guidelines do not prohibit authors from uploading videos with conspiracy theories or misleading information. Nevertheless, the video service makes occasional targeted attempts to stop conspiracy theories from spreading.

At the end of February, the service blocked several thousand accounts that published conspiracy-theory videos and spread various hoaxes. Videos whose creators discussed assembling weapons at home or using them were blocked as well. Two months earlier, YouTube had announced it was increasing its number of moderators to 10,000.

In addition to increasing the number of moderators, in December of last year YouTube began showing links to Wikipedia and other reliable sources under videos with conspiracy theories, in order to limit the spread of fake news and false theories on the service.

However, the system does not work perfectly: YouTube's algorithm mistook live broadcasts of the April 15 Notre-Dame de Paris fire for footage of the Twin Towers burning during the September 11 terrorist attacks in New York. The neural network showed users the Wikipedia article about the attacks, which outlines several accounts of the event, until moderators manually disabled the panel.

“The panels with articles are triggered by an algorithm, and sometimes it makes mistakes. We manually turned off the panels on live broadcasts of the Notre-Dame fire,” a YouTube representative said at the time.

YouTube's algorithm misfiring during the Notre-Dame Cathedral fire on April 15

Another example is a video entitled “Any cancer can be cured in a few weeks,” published by iHealthTube.com in late March. It has no panel linking to a reliable source on the video's topic, even though it has been viewed about 7 million times.

In response to a series of inquiries from Bloomberg, YouTube representatives declined to explain how the recommendation system works or how effective the company's measures against conspiracy videos have been. In general, the agency writes, YouTube's management avoids direct answers to sensitive questions and prefers to blame problems on technical shortcomings of the algorithms or on the negligence of individual employees.

YouTube is likely to stop distributing conspiracy theories in only one case: if regulators in several countries or major advertisers demand it. The latter have already succeeded in getting the video service to prohibit monetization of videos depicting violence.


If videos about the flat Earth, the US Moon landings, or the supposed ineffectiveness of vaccination can no longer be monetized, their number will drop dramatically. Other ways of fighting conspiracy theories are unlikely to prove effective: no AI developer fully understands how their own system works, and Google is no exception. And rebuilding from scratch a recommendation algorithm that has already proved its effectiveness makes no sense for the company.

In response to criticism that it prioritizes growth over user safety, Facebook has announced its intention to radically change its data practices and the social network's algorithms. YouTube's leadership, by contrast, still explains its mistakes away by invoking the interests of the public or of investors, and sometimes the actions of individual employees. And for Google, the video service's inability to solve its own problems remains one of its biggest headaches.