Kids love YouTube, but unfortunately, surprise eggs are not the only surprise on the platform. The same things that make YouTube great for well-meaning content creators also attract people who are using it to introduce children to concepts like suicide and self-harm.

As the Washington Post reports, Florida mom and pediatrician Free Hess, founder of the child safety website PediMom, raised the alarm about such videos after a friend noticed one playing on the YouTube Kids app while using it to distract her child during a nosebleed.

The video, which began as an innocent cartoon before being interrupted by a man instructing children on how to commit suicide, was removed from YouTube thanks to Hess' campaigning. But it was just one of an unknown number of videos promoting suicide to children on the YouTube Kids app.

"My research has led me into a horrifying world where people create cartoons glorifying dangerous topics and scenarios such self-harm, suicide, sexual exploitation, trafficking, domestic violence, sexual abuse, and gun violence which includes a simulated school shooting. All of these videos were found on YouTube Kids, a platform that advertises itself to be a safe place for children 8 years old and under," Hess writes on PediMom.

A spokesperson for YouTube told the Washington Post the video platform is always working to make sure it is "not used to encourage dangerous behavior and we have strict policies that prohibit videos which promote self-harm."

According to YouTube's spokesperson, the Google-owned platform relies on both user flagging and smart detection technology to identify content that violates its policies. "Every quarter we remove millions of videos and channels that violate our policies and we remove the majority of these videos before they have any views," the spokesperson told the Post in a written statement.

YouTube is aware of the problem and trying to fix it, but it's a hard one to solve because policing a platform of YouTube's scale is genuinely difficult.

This is not the first time inappropriate content aimed at YouTube's youngest viewers has made the news.

Back in 2017, the inappropriate use of beloved kids' characters like Peppa Pig, Spider-Man and Elsa became international news after a viral Medium post by writer James Bridle and a report by The New York Times illuminated the weird, creepy and downright disturbing videos YouTube users were creating featuring those characters.

In 2018, YouTube announced plans to roll out a non-algorithmic option within the YouTube Kids app's parental controls that serves up content curated by humans who know the difference between the real Peppa Pig and a crazed counterfeit version.

The thing about YouTube is that while it's been around long enough that some of its stars don't remember a world without it, it's still a kind of Wild West, anything-goes content portal. While it certainly attracts some creators of quality kids' programming, it's also open to people who don't care about kids at all and, even worse, are actively seeking to scare or harm them.

Parents can take action by saying no to unsupervised YouTube. If you know you'll want a portable video handy in case of emergency (like a long wait at the DMV or the doctor's office), download some legit videos (like an episode of Sesame Street) from Google Play or the Apple App Store. Save YouTube sessions for when you can devote as much attention to the screen as your child does, and monitor their content consumption in real time.
