Google’s YouTube Kids App Isn’t That Kid-Friendly According to Consumer Groups

Written by Josh Wolford

In April, a coalition of consumer advocacy groups including The Center for Digital Democracy (CDD), Campaign for a Commercial Free Childhood (CCFC), American Academy of Child and Adolescent Psychiatry, Center for Science in the Public Interest, Children Now, Consumer Federation of America, Consumer Watchdog, and Public Citizen called on the Federal Trade Commission to open an investigation into Google’s YouTube Kids app. The main point of that complaint involved the intermixing of “commercial and other content in ways that are deceptive and unfair to children and would not be permitted to be shown on broadcast or cable television.”

Basically, these groups alleged that the YouTube Kids app was showing ads to kids.

Now, the CCFC and CDD are reporting that an additional review has surfaced something even more disturbing about the YouTube Kids app: pervasive adult content.

According to the groups, they were able to find:

- Explicit sexual language presented amidst cartoon animation
- Videos that model unsafe behaviors such as playing with lit matches, shooting a nail gun, juggling knives, tasting battery acid, and making a noose
- A profanity-laced parody of the film Casino featuring Bert and Ernie from Sesame Street
- Graphic adult discussions about family violence, pornography, and child suicide
- Jokes about pedophilia and drug use
- Advertising for alcohol products

To drive the point home, the CCFC and CDD made a video:

Is YouTube Kids A Safe Place for Young Children to Explore? from CCFC on Vimeo.

They’ve sent a letter to the FTC to update their complaint. From that letter:

Google claims that YouTube Kids was “built from the ground up with little ones in mind” and is “packed full of age-appropriate videos.” The app includes a voice-enabled search function for easy use by preschool-aged children. Google says it uses “a mix of automated analysis, manual sampling, and input from our users to categorize and screen out videos and topics that may make parents nervous.” Google also assures parents that they “can rest a little easier knowing that videos in the YouTube Kids app are narrowed down to content appropriate for kids.”

“Google does not, in fact, ‘screen out the videos that make parents nervous’ and its representations of YouTube Kids as a safe, child-friendly version of YouTube are deceptive. Parents who download the app are likely to expose their children to the very content they believed they would avoid by using the preschool version of YouTube. In addition to the unfair and deceptive marketing practices we identified in our initial request for an investigation, it is clear that Google is deceiving parents about the effectiveness of their screening processes and the content on YouTube Kids.”

A YouTube spokesperson has issued a statement, reiterating that parents can turn off search inside the app.

“We work to make the videos in YouTube Kids as family-friendly as possible and take feedback very seriously. Anyone can flag a video and these videos are manually reviewed 24/7 and any videos that don’t belong in the app are removed. For parents who want a more restricted experience, we recommend that they turn off search,” says YouTube.

While it’s true that parents can disable the app’s search function, it is enabled by default.

And as YouTube’s statement reiterates, much of the content moderation relies on fielding manual reports, after which Google pulls offending videos from the app.

From the get-go, Google admitted that some stuff could slip through the cracks.

“When your child browses the app’s home screen, they’ll find a vast selection of kid-appropriate channels and playlists. When families search in the app, we use a mix of input from our users and automated analysis to categorize and screen out the videos that make parents nervous. And for added peace of mind, parents can quickly notify YouTube if they see anything questionable directly from the app,” said Google back in February, upon launch of the app.

Google said this new YouTube Kids app is just a first step – the “first building block in tech for tykes.” We’ve heard for a while that Google is getting more serious about building products and services for kids. If this is the goal, content filtering is going to have to get better.

Sure, Google presents YouTube Kids as a way for parents to feel safer about their kids watching YouTube. And it’s clear that Google has failed to prevent some adult-themed content from appearing inside the app. But parents can turn off the search function (maybe Google should have it switched off by default?), and in the end, parents should know that no content moderation system is 100% foolproof.
