Streaming cat videos on YouTube. What could go wrong? Well, many things.
YouTube, which is owned by Google, is the most popular video-streaming platform and one of the most visited websites in the world. It has a section it calls 'Up Next', powered by AI and recommendation algorithms.
This section relies on Google's "magic", which is seemingly capable of knowing users better than they know themselves.
As a result, users often spend more time on YouTube than they want or need to, clicking through to the next suggested video while promising themselves each time that this one will be the last.
The AI-powered recommendation section drives about 70% of users' total viewing time on YouTube.
This means the section is here to stay. It won't go anywhere, and YouTube loves it.
But there are problems associated with this section.
Often, that same algorithm leads viewers down a "rabbit hole".
YouTube is full of absurd things. While there is certainly a lot of educational and family-safe content on the platform, there are also tons of misleading videos, hate speech, content that caters to pedophiles, conspiracy theories and more.
Many videos of this kind are prohibited, but somehow slip past YouTube's filtering algorithms unseen.
YouTube users have complained. But policing a platform as huge as YouTube is hard: no matter how many AI and human resources the company deploys, they are still not enough to catch everything.
To get an idea of how often people stumble down these rabbit holes and end up watching things they never meant to see, the non-profit Mozilla Foundation has launched a browser extension that lets users take action when YouTube recommends videos they later wish they hadn't watched.
In a blog post, the Firefox maker called YouTube's recommendation AI "one of the most powerful curators on the internet".
Mozilla has been collecting examples of users' 'YouTube Regrets' since October 2019, conducting research into "why YouTube must change".
The browser extension, which it calls 'RegretsReporter', is essentially a tool for users to report their "YouTube Regrets".
It's a way for users to report to Mozilla, so Mozilla can understand what led users down that bizarre path.
"We’re recruiting YouTube users to become YouTube watchdogs. People can donate their own recommendation data to help us understand what YouTube is recommending, and help us gain insight into how to make recommendation engines at large more trustworthy," wrote Ashley Boyd, VP of Advocacy at Mozilla.
The idea is to help uncover information about the types of recommended videos that lead to racist, violent or conspiratorial content, and to seek patterns in YouTube usage that might lead to harmful content being recommended.
Users can report a YouTube Regret by clicking the RegretsReporter icon when they see a video they wish they hadn't been shown. This opens a form where users can explain how they arrived at that video. The extension also sends data about YouTube browsing time, to estimate how frequently viewers are directed to inappropriate content.
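The reporting flow described above can be sketched in a few lines. This is a hypothetical illustration, not RegretsReporter's actual code: the function and field names (`makeReport`, `regretFrequency`, `pathLength`, and so on) are assumptions for the sake of the example.

```javascript
// Hypothetical sketch of the reporting flow; all names here are
// assumptions, not RegretsReporter's real internals.

// A regret report, roughly as the extension's form might capture it.
function makeReport(videoId, userExplanation, pathLength) {
  return {
    videoId,           // the video the user regrets watching
    userExplanation,   // free-text answer: how they arrived there
    pathLength,        // how many recommended clicks led to it
    reportedAt: Date.now(),
  };
}

// Estimate how frequently a viewer is directed to regrettable content:
// number of reports per hour of YouTube browsing time.
function regretFrequency(reports, browsingTimeHours) {
  if (browsingTimeHours <= 0) return 0;
  return reports.length / browsingTimeHours;
}

const reports = [
  makeReport("abc123", "clicked through three 'Up Next' suggestions", 3),
  makeReport("def456", "autoplayed after an unrelated video", 1),
];
console.log(regretFrequency(reports, 10)); // 0.2 reports per hour
```

Combined with the free-text explanations, a per-hour rate like this is one simple way the browsing-time data could be turned into the frequency estimate the article describes.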
YouTube has already acknowledged these issues in the past, and made improvements.
But still, the problem continues, especially with so-called "borderline" content: videos that brush up against YouTube's policies but don't quite cross the line.
YouTube said it has improved its algorithms, and has seen a 70% average drop in watch time for videos deemed borderline.
But Guillaume Chaslot, a former YouTube engineer, is skeptical.
"The algorithm is still the same," he said. "It's just the type of content that is considered harmful that changed. We still have no transparency on what the algorithm is actually doing. So this is still a problem – we have no idea what gets recommended."
Mozilla has in the past asked YouTube to open up its recommendation algorithm to the public, so people can scrutinize the inner workings of the system.
The organization has also called for YouTube to provide independent researchers with access to meaningful data, such as the number of times a video is recommended, the number of views that result from recommendations, or the number of shares. Mozilla also asked the platform to build simulation tools for researchers, so they could mimic users' paths through the recommendation algorithm.
All that without success.
With RegretsReporter, Mozilla took the matter into its own hands.
The organization has decided that if YouTube won't share the data, then that data will be taken directly from YouTube's users.