YouTube will begin hiding the “dislike” count on videos.

In the coming weeks, YouTube will begin testing the removal of the visible “dislike” count on videos with small groups of users. The decision aims to protect creators’ well-being and to curb coordinated campaigns directed against certain videos on the platform.

For some channels, the video interface will change to show only the “like” count and not the “dislike” count, the company announced on Twitter this Tuesday. In a post on its support page, YouTube says it is testing a few different designs, which will appear on videos for small groups of users over the next several weeks, but that the “dislike” button itself will not go away.

YouTube is taking this step in response to criticism from content creators regarding their well-being and targeted “dislike” campaigns, which may be encouraged by publicly visible dislike counts.

The platform has been trying for years to address the problem of “dislike mobs,” as the organized groups that boycott videos are called, mass-disliking them without even having watched them. In 2019, it proposed a series of measures, such as showing a questionnaire when the button is pressed.

Removing the visible dislike count does not mean the button disappears: users who visit a video will still be able to express a negative opinion about it. In addition, creators will still be able to track those negative ratings on their content in YouTube Studio.

The company said it made this decision to “try to balance enhancing the creator experience” while ensuring that “viewers’ opinions are taken into account and shared with the creator.”

This new feature will begin to be tested with a small group of users (REUTERS / Lucy Nicholson / File Photo)

What the platform takes into account when removing videos

Through its transparency reports, the company discloses the number of videos removed from the platform, with the aim of limiting harmful content while preserving its essence as an open platform.

YouTube bases its responsibility policy on four pillars: a) removing all content that violates community guidelines; b) reducing the spread of borderline content, which does not violate policies but is not of good quality either; c) raising up quality content and authoritative voices; and d) rewarding content that meets or exceeds YouTube’s standards through monetization.

It is also worth noting that content moderation, which involves, among other things, deciding which videos may or may not remain on the platform, is a joint effort between humans and machines.

Machines find content quickly and at scale, which is important for acting effectively, that is, for removing or limiting content without delay, before users are exposed to it.

The catch is that machines are not always as accurate as humans, because they cannot grasp every subtlety of language or irony. For that reason, they generally act as flaggers: they identify potentially harmful content and then pass it to a human, who makes the final decision.
