YouTube Addresses Child Exploitation Issues Using Machine Learning, People And Experts

The video streaming giant YouTube is making major changes after a wave of press coverage exposed disturbing content targeted at kids on the video platform.

In an unsettling trend, many accounts are publishing content featuring children that seems appropriate for young viewers at first glance, but is in fact disturbing and exploitative. These videos have racked up millions of views and generated considerable revenue.

A number of such videos have been discovered, many of which appear to originate from eastern Europe.

These videos are made in a way that appeals to younger viewers, using well-known family-friendly characters from Disney, for example, but placing them in themes that include receiving injections in inappropriate places, wearing revealing clothing, firing guns at each other, being put in vulnerable scenarios or kidnapped, being restrained with ropes or tape while sometimes crying or in visible distress, being made to 'play doctor' with an adult, pretending to eat feces out of a diaper, and so forth.

These videos were not only uploaded by regular YouTube users; even 'verified' channels have racked up tens of millions of views on such disturbing content.

While many of these videos have been removed, either by algorithms or by manual intervention from YouTube, many others have slipped through.

YouTube has also repeatedly failed to act on many user-submitted reports about these same issues.

After the issue came to light and received press attention, the Google-owned company published a blog post detailing five ways it is addressing the problem.

These updates come a month after a reporter revealed that such content had even made its way into YouTube Kids, the company's family-friendly app.

YouTube is evaluating its verification policy, adding more human oversight to its content, and expanding the use of its machine learning technology and automated tools to detect inappropriate content. As previously announced, these videos are not allowed to carry advertising.

As Koerber reported, videos that appeared in search results on YouTube Kids were selected by an algorithm and were not subject to human review, according to Google's support page for YouTube at the time.

YouTube will also disable all comments on videos of minors if sexual or predatory comments are made.

But the problem is massive, given the sheer volume of videos uploaded to YouTube at any given time. YouTube knows that it can't, and shouldn't, address the problem alone. For that reason, the company is also working with child safety experts to continue identifying troubling trends, which it allowed to thrive on its platform for years.

"To help us better understand how to treat this content, we will be growing the number of experts we work with, and doubling the number of Trusted Flaggers we partner with in this area," the blog post reads.

For creators, the company is releasing a "comprehensive guide" on how to make "enriching family content" that abides by YouTube's community standards.

Published: 23/11/2017