When it comes to keeping kids entertained, sometimes the only thing a parent can do to get a moment of peace is put on a video.
While no parent would advocate letting television or online entertainment serve as a “babysitter,” the truth is that many parents rely on the stolen moments they gain when their child is watching Peppa Pig, even if they limit screen time. That’s why apps like YouTube Kids are so popular. In theory, a child can sit down with the app and watch content that’s appropriate for them, featuring the characters they know and love.
Unfortunately, the reality is much more complex. Parents have started to notice that their children have been exposed to inappropriate content on YouTube Kids, and what’s even scarier is that it’s disguised to look like the videos your kids watch on a regular basis.
Just so you know, this article will be covering sensitive topics, and some may find the images below disturbing.
A few weeks ago, the writer James Bridle published a post on Medium detailing a disturbing trend he’d noticed in children’s entertainment on YouTube Kids. The first videos he explores aren’t necessarily upsetting, but they do reveal that channels are using largely automated processes to create videos designed to game the YouTube algorithm, racking up huge numbers of plays and thus ad revenue. That on its own isn’t particularly shocking, since YouTube channels exist to make money. But it does get into some weird territory.
Bridle explains that these automated, algorithmic approaches to creating content for kids feel off, resulting in bizarre mashups of nursery rhymes, songs, and characters pulled from across shows, movies, and genres. There’s nothing inherently bad about the video below, but it’s kind of weird and unsettling. And this odd use of copyrighted characters is what starts us down the path toward the truly evil.
Some of the most upsetting content on YouTube Kids is disguised as a regular episode of, say, Peppa Pig. Whether it’s right away or several minutes in, these videos then turn violent, sexual, or sometimes both, exposing children to things they shouldn’t have to see. As Bridle points out, it’s easy to understand how content like this slips through the YouTube filters, but we have no idea who is creating this disturbing imagery and why.
In addition to animated content, there are also live-action channels, some of which have been removed since Bridle’s article was published. YouTube has responded, saying it has demonetized more than 2 million videos and removed more than 150,000 of them for inappropriate content. Still, it’s shocking that so many videos were allowed to remain up for so long, and some parents are saying this move is too little, too late.