
How The New YouTube Algorithm Works

YouTube sheds light on the general principles that guide the algorithm that suggests videos to watch. I have previously reviewed how the algorithms of Facebook, Instagram, Twitter, LinkedIn, TikTok and Spotify work, but I had never focused on Google's own video platform because it has always been fairly intuitive.

In recent times, however, it too has been significantly modified to meet users' needs and to combat disinformation and violence. Previously, the suggested content was chosen from the videos most viewed across an entire country, whereas today it follows more sophisticated rules, enhanced by Artificial Intelligence.

The Signals Considered By The YouTube Algorithm

The recommendation system acts on two occasions: in composing the homepage and in suggesting the "next video" to watch. The YouTube homepage is composed dynamically each time we access it, mixing videos from channels we are subscribed to, videos from accredited news sources (over the last year, "news on Covid-19" was given prominence) and videos chosen for us by machine learning algorithms. This last selection is highly personalized and based on over 80 billion pieces of information called "signals". The most important are:
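To make the idea concrete, here is a deliberately simplified sketch of how a homepage feed might blend those three sources. It is my own illustration, not YouTube's code: every quota and proportion below is invented.

```python
# Toy model of a homepage feed mixing three candidate pools.
# All quotas are assumptions made up for illustration.
import random


def compose_homepage(subscriptions, news, personalized, size=12):
    """Blend subscription videos, accredited news and personalized picks."""
    quotas = {
        "personalized": size // 2,               # hypothetical: half the feed
        "subscription": size // 4,
        "news": size - size // 2 - size // 4,
    }
    pools = {
        "personalized": list(personalized),
        "subscription": list(subscriptions),
        "news": list(news),
    }
    feed = []
    for source, quota in quotas.items():
        random.shuffle(pools[source])
        feed.extend(pools[source][:quota])
    random.shuffle(feed)  # in reality the final ordering is personalized too
    return feed
```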

The Clicks On The Videos 

A click makes the system understand that we feel a certain curiosity towards a video. It is a general indication that is weighed together with all the other factors, because I may have only a momentary interest in the content I have selected.

The Watch Time 

This is the viewing time of each video we click. Naturally, the system assigns greater weight to the content we have watched the longest, and less to videos abandoned after a few seconds because they did not match our expectations. For example, if the time we have accumulated watching movie reviews is greater than the time spent on trailers and extra content, the system will likely offer us more reviews in the future.
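A toy illustration of that example, with invented numbers: the category that has accumulated the most watch time is the one the system is most likely to push next.

```python
# Accumulated watch time per category (seconds); the figures are made up.
watch_time_by_category = {
    "movie reviews": 5400,
    "trailers": 900,
    "extra content": 300,
}

# The category we have spent the most time on is the most likely candidate
# for future recommendations.
favourite = max(watch_time_by_category, key=watch_time_by_category.get)
print(favourite)  # -> movie reviews
```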

Survey Responses

As users sometimes leave YouTube playing in the background without really paying attention to what is on, the need has arisen to measure "valuable watch time". In practice, some users are shown a survey asking them to rate the video they have just watched from one to five stars and to justify the answer. Only videos rated four or five stars are considered valuable to the user. To extend the survey results to the many users who never receive one, YouTube has trained a machine learning model.
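The rough idea, sketched below with invented features and data (my illustration, not YouTube's actual model), is to learn from the views that did receive a survey and then estimate how "valuable" an unsurveyed view probably was.

```python
# Minimal sketch: a classifier trained on surveyed views predicts whether an
# unsurveyed view would have been rated 4-5 stars. Features and numbers are
# invented for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row: [watch time in seconds, fraction watched, liked, shared]
surveyed_views = np.array([
    [600, 0.95, 1, 1],
    [30,  0.05, 0, 0],
    [240, 0.60, 1, 0],
    [15,  0.02, 0, 1],
])
# Label: 1 if the viewer rated the video four or five stars, else 0.
valuable = np.array([1, 0, 1, 0])

model = LogisticRegression().fit(surveyed_views, valuable)

# Estimate how likely an unsurveyed view was "valuable watch time".
unsurveyed_view = np.array([[420, 0.80, 0, 0]])
print(model.predict_proba(unsurveyed_view)[0, 1])
```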

Interactions

The actions we perform after watching a video, i.e. shares, likes and dislikes, are important signals for the algorithm. Even in this case, however, they are evaluated against the habits of the individual. For example, if a user tends to share everything, even the videos to which he has given two stars, the system will understand that, in this specific case, the "share" signal is not so relevant. These signals count as long as they do not concern content that violates the platform's rules.
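One way to picture that per-user weighting (the formula and weights below are invented for illustration and are not YouTube's actual scoring):

```python
# Hedged sketch: a share from a user who shares everything is worth less
# than a share from a selective user. All weights are assumptions.
def share_signal_weight(user_share_rate: float) -> float:
    """The more indiscriminately a user shares, the less one share means."""
    return 1.0 - user_share_rate  # user_share_rate is in [0, 1]


def video_score(clicked: bool, watch_fraction: float,
                liked: bool, shared: bool, user_share_rate: float) -> float:
    score = 0.0
    score += 0.5 if clicked else 0.0       # a click is only mild curiosity
    score += 2.0 * watch_fraction          # watch time weighs more heavily
    score += 1.0 if liked else 0.0
    if shared:
        score += share_signal_weight(user_share_rate)
    return score


# A compulsive sharer's share barely moves the score...
print(video_score(True, 0.9, True, True, user_share_rate=0.95))
# ...while a selective sharer's share counts almost in full.
print(video_score(True, 0.9, True, True, user_share_rate=0.05))
```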

Videos that are violent, sexually explicit, discriminatory, sensationalist or that spread fake news tend to be blocked or not recommended by the platform. In this attempt to keep the ecosystem clean, the technology is supported by a team of reviewers who manually evaluate content. It is a never-ending struggle, and one that is far from perfect. Users can also do something to improve the recommendations and the cleanliness of the platform: for each video, the menu behind the three vertical dots lets us choose "Not interested", "Don't recommend channel" or, for harmful content, "Report".

Confirmation And Discovery Algorithms

The algorithms of social networks, whether symmetrical or asymmetrical, are heavily influenced by the tastes of the network of contacts that the individual has consciously or unconsciously built. The algorithms of platforms such as YouTube, by contrast, cannot rely on the habits of contacts but only on those of the user and of similar users, which makes them more precise, i.e. more in line with the individual's tastes. And so the YouTube algorithm works very well at deepening one's interests and surfacing videos not yet seen but consistent with those already appreciated. The user is happy because he is given what he loves, and the platform thrives because time spent on it increases.

This logic, however, has two problematic implications.
  1. The first is that it reinforces beliefs and prejudices, crowding out critical thinking (confirmation bias).
  2. The second is that it leaves no room for the surprise of discovering unexpected content and novelties that we do not yet know we could appreciate.

If, for example, I have watched several Tom Waits videos and other fans like me have also watched Leonard Cohen videos, YouTube will recommend Cohen to me. But it probably will not surface videos from new artists that might catch my eye. In short, to play it safe, engineers avoid training the algorithm to risk new proposals. From this point of view, TikTok's recommendation system is interesting, as I explained here, because it exposes the viewer to random videos. I hope that YouTube engineers will test new ways to stimulate viewers' open-mindedness, perhaps with a selector that lets us choose how much unexpected content we want to receive.
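Here is a sketch of that trade-off: recommend mostly what similar users watched, but reserve a user-chosen share of slots for videos outside the bubble. The "discovery_level" selector is purely hypothetical; YouTube offers no such control today.

```python
# Illustrative only: blend "users like me also watched" with a slice of
# out-of-bubble discovery, controlled by a hypothetical selector.
import random


def recommend(my_history, similar_users_histories, catalog,
              n=10, discovery_level=0.2):
    # Exploitation: videos watched by similar users that I have not seen.
    similar_pool = list({v for history in similar_users_histories
                         for v in history if v not in my_history})
    # Exploration: everything else in the catalog, outside my bubble.
    discovery_pool = [v for v in catalog
                      if v not in my_history and v not in similar_pool]

    n_discovery = int(n * discovery_level)
    picks = random.sample(similar_pool,
                          min(n - n_discovery, len(similar_pool)))
    picks += random.sample(discovery_pool,
                           min(n_discovery, len(discovery_pool)))
    random.shuffle(picks)
    return picks


# Tom Waits fans who also watch Leonard Cohen push Cohen into my feed;
# the discovery share is the only door left open to artists I do not know yet.
```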
