YouTube is currently working on recommendation algorithms that it hopes will keep users on the video platform even longer.
For many months now, social networks and platforms have been expressing their willingness to offer digital wellness tools to their users. Facebook, Instagram and Apple, for example, provide data showing how much time is spent on a phone or on a given platform.
For its part, YouTube – owned by Google – did the same by deploying a similar feature that tells you how many minutes or hours you have spent watching videos.
But beneath these seemingly benevolent tools, one wonders whether these platforms, YouTube included, are not also deploying features that work against digital well-being. It would, after all, be in YouTube’s interest to keep users on its service longer, since they would watch more videos. More videos watched means more ads seen, and more money for YouTube, to sum up.
The Google platform has already deployed several features designed to retain users’ attention, such as recommendations and autoplay, which chains videos together without requiring a click on the next one (like episodes on Netflix).
Is YouTube evolving its recommendation algorithms?
As the MIT Technology Review notes in an article, it is precisely the recommendation algorithms that are at the heart of this drive to keep users on the platform. A new Google document discusses how the service plans to make its recommendations even more specific, so that people see only things they like or that might interest them. Obviously, it is harder to leave the platform when every proposed video is likely to please you.
In the document, the researchers describe what they call an “implicit bias” that pushes users toward certain YouTube videos rather than the ones they would really like to watch. This stems from the fact that the service has trouble determining whether you clicked on a video because it interested you or because it was prominently recommended and highlighted.
To counter this, the researchers are considering taking into account the position of a video in the recommendations sidebar. A click on a video placed lower down, one that requires scrolling to reach, would carry greater weight than a click on the recommendation that appears first. A former engineer estimates that this would increase time spent on the platform by 0.24% per user, which obviously represents a lot of money at scale. It is unclear whether and when this change might be deployed.
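The idea of weighting clicks by sidebar position can be sketched with a simple inverse-propensity scheme. This is only an illustration of the general technique, not YouTube’s actual system: the function names, the geometric decay model, and the decay value are all assumptions made here for the example.

```python
# Hypothetical sketch of position-weighted click feedback.
# Assumption: the chance a user even examines a recommendation decays
# geometrically with its position in the sidebar (position 0 = top slot).

def examination_probability(position: int, decay: float = 0.7) -> float:
    """Assumed probability that a user looks at a given sidebar slot."""
    return decay ** position

def click_weight(position: int, decay: float = 0.7) -> float:
    """Inverse-propensity weight: a click at a rarely-examined position
    is stronger evidence of genuine interest than a click on the top slot."""
    return 1.0 / examination_probability(position, decay)

# A click on the fifth slot (which requires scrolling) outweighs
# a click on the first recommendation.
w_top = click_weight(0)    # weight 1.0 for the top slot
w_fifth = click_weight(4)  # larger weight for a lower slot
```

Under this model, the recommender would treat the scrolled-to click as a stronger interest signal when training, which matches the intuition described above: clicks driven mainly by prominent placement get discounted.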
Except that it could also have adverse effects. Jonas Kaiser, an affiliate of the Berkman Klein Center for Internet & Society, says, for example: “In our research, we discovered that YouTube’s algorithms created an isolated far-right community, pushed users toward videos of children, and encouraged misinformation.” For his part, Jonathan Albright, director of the digital forensics initiative at the Tow Center for Digital Journalism, adds: “At the margins, this change could promote the formation of even more isolated communities than we have already seen.”
Nevertheless, YouTube seems confident and believes the recommendations will diversify rather than become more extreme.