YouTube Starts Banning Intentionally Disturbing Videos Aimed At Children

From adults and teenagers to toddlers and even younger children, YouTube is many people's favorite way to spend their time.

With so much content being created at any given time, policing the platform is difficult. Even with human moderators and algorithms, some inappropriate videos can still slip through.

To decrease, if not eliminate, videos targeting children with disturbing content, YouTube said it will no longer allow such videos on its platform.

This move follows years of criticism that the video hosting website allowed bad actors to put violent and troubling content in front of its young audience.

On a Google support page for YouTube explaining the matter, the company said:

"[Content] that contains mature or violent themes that explicitly targets younger minors and families in the title, description and/or tags will no longer be allowed on the platform."

Related: YouTube Disables Ads On 2 Million Inappropriate Children's Videos

In the post, YouTube gave some examples of what it is prohibiting:

  • A video with tags like "for children" featuring family-friendly cartoons engaging in inappropriate acts, like injecting needles.
  • Videos with prominent children’s nursery rhymes targeting younger minors and families in the video’s title, description or tags, that contain adult themes such as violence, sex, death, etc.
  • Videos that explicitly target younger minors and families with phrasing such as “for kids” or “family fun” in the video’s title, description and/or tags that contain vulgar language.

YouTube is banning such content because it had become a genuine problem.

Many content creators upload videos featuring children's favorite characters, but make them perform adult acts or other inappropriate behaviors that are certainly not suitable for young audiences.

Disturbing children's YouTube video

Alongside the restriction, YouTube is also introducing a new class of content it calls "age-restricted".

Explaining the matter, YouTube said:

"Content that is meant for adults and not targeting younger minors and families won’t be removed, but it may be age-restricted. If you create adult content that could be confused as family entertainment, make sure your titles, descriptions, and tags match the audience you are targeting. Remember you can age restrict your content upon upload if it’s intended for mature audiences."

This refers to content for which viewers must first confirm they are of a certain age before watching.

YouTube is giving content creators 30 days to remove content that depicts the above. "We want to give creators time to get familiar with this update," said YouTube, adding that after this period, "we will begin removing these videos without giving a strike to the channel."

"Additionally, we won’t strike videos uploaded prior to this update, but we may still remove the content."

This move is necessary, as disturbing content on YouTube has long frustrated parents and educators. With this policy in place, YouTube can remove such content whenever it finds it.

Related: YouTube Backlash After Ads Appear Near Images Of Children And Pedophile Comments

Published: 
26/08/2019