Again, YouTube Couldn't Stop Child Predators From Posting Inappropriate Comments

YouTube is considered the largest video-streaming site on the web, largely because its enormous library of videos appeals to a wide range of viewers.

Given its massive size, it's difficult for YouTube to police everything on its platform. This failure has led to another wave of criticism over the alarming number of predatory comments and videos targeting young children.

One of those concerns was raised by a YouTube creator named Matt Watson.

He flagged the issue in a critical Reddit post, saying that he had found numerous videos of children beneath which YouTube users were trading inappropriate comments and timestamps.

He denounced the company for failing to prevent what he described as a "soft-core pedophilia ring" from operating in plain sight.

The videos of those minors aren't pornographic in nature; the problem lies in the comment sections.

To highlight the issue, Watson also posted a YouTube video demonstrating how the platform's recommendation algorithm steers users into what he calls a pedophilia "wormhole".

As Watson described, in a private browser session with cleared history, clicking on a few videos of adult women in bikinis prompts YouTube to recommend videos whose thumbnails show young girls demonstrating gymnastics poses, showing off their "morning routines", or licking popsicles and ice lollies.

Clicking on one of the suggested videos opens a YouTube page whose sidebar is populated with even more videos of prepubescent girls in its 'up next' section.

And because he saw ads being displayed on some videos of kids that carried inappropriate comments, Watson also accused the company of facilitating and monetizing the sexual exploitation of children.

"Youtube’s recommended algorithm is facilitating pedophiles’ ability to connect with each-other, trade contact info, and link to actual CP in the comments. I can consistently get access to it from vanilla, never-before-used Youtube accounts via innocuous videos in less than ten minutes, in sometimes less than five clicks.. Additionally, I have video evidence that these videos are being monetized by Youtube, brands like McDonald’s, Lysol, Disney, Reese’s, and more."

According to Watson, those videos contain inappropriate, predatory comments, including sexually suggestive emoji and timestamps that appear intended to highlight, shortcut to, and share the most compromising positions and moments in the videos of those children.

Some videos have more comments than others, but in many of them, YouTube users made sexually suggestive remarks about the minors shown.

Making things worse, Watson claimed to have found links to actual child pornography being shared in YouTube comments too.

Back in 2017, YouTube experienced a similar problem when it was discovered that pedophiles were posting obscene comments on videos of children.

Known as the "ElsaGate", this led to several major advertisers in stopping their ad spending on the platform.

At the time, YouTube responded by announcing a number of policy changes for children-related videos, saying that it would aggressively police comments on videos of kids, use AI to detect violations, and turn off comments altogether on videos that attract inappropriate remarks about the children in them.

Not long after Watson's report, some of the videos in YouTube's recommendations had their comments disabled.

This suggests that YouTube's AI went to work and identified a large number of inappropriate comments being shared. Even so, the videos themselves were still being suggested for viewing.

After years of public outcry, YouTube still seems to have difficulty dealing with child predators on its platform.

"Any content — including comments — that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube," a YouTube spokesperson said.

"We enforce these policies aggressively, reporting it to the relevant authorities, removing it from our platform and terminating accounts. We continue to invest heavily in technology, teams and partnerships with charities to tackle this issue."

The online video giant announced that it had banned more than 400 channels and disabled comments on tens of millions of videos that attracted child exploitation comments. Still, many major brands, including Disney, AT&T, Nestlé, and Fortnite maker Epic Games, have decided to halt their YouTube advertising.

Published: 
20/02/2019