Citation record for Incident 1

Suggested citation format

Yampolskiy, Roman. (2015-05-19) Incident Number 1. In McGregor, S. (ed.) Artificial Intelligence Incident Database. Partnership on AI. Retrieved on July 30, 2021 from incidentdatabase.ai/cite/1.

Incident Stats

Incident ID: 1
Report Count: 14
Incident Date: 2015-05-19

CSET Taxonomy Classifications

Taxonomy Details

Full Description

The content filtering system for YouTube's children's entertainment app, which incorporated algorithmic filters and human reviewers, failed to screen out inappropriate material, exposing an unknown number of children to videos that included sex, drugs, violence, profanity, and conspiracy theories. Many of the videos, which apparently numbered in the thousands, closely resembled popular children's cartoons such as Peppa Pig, but included disturbing or age-inappropriate content. Additional filters provided by YouTube, such as a "restricted mode" filter, failed to block all of these videos, and YouTube's recommendation algorithm recommended them to child viewers, increasing the harm. The problem was reported as early as 2015 and was ongoing through 2018.

Short Description

YouTube’s content filtering and recommendation algorithms exposed children to disturbing and inappropriate videos.

Severity

Moderate

Harm Distribution Basis

Age

Harm Type

Psychological harm

AI System Description

A content filtering system incorporating machine learning algorithms and human reviewers. The system was meant to screen out videos that were unsuitable for children to view or that violated YouTube's terms of service; candidate videos were initially identified either algorithmically or on the basis of user reports. The incident also involved a recommendation system that suggested videos to viewers based on their viewing history on the platform.
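
The description above implies a two-stage architecture: an algorithmic classifier scores candidate videos, and uncertain or user-reported cases are escalated to human reviewers. The sketch below is a minimal illustration of such a pipeline; all names, thresholds, and the scoring stub are hypothetical and do not describe YouTube's actual system.

```python
# Minimal sketch of the two-stage pipeline described above: an ML
# classifier scores candidate videos, and uncertain or user-reported
# cases are escalated to human review. All names, thresholds, and the
# scoring stub are hypothetical; this is not YouTube's implementation.

from dataclasses import dataclass


@dataclass
class Video:
    video_id: str
    title: str
    user_reports: int = 0


def classifier_score(video: Video) -> float:
    """Stand-in for an ML model scoring child-suitability in [0, 1]."""
    # A real system would score video frames, audio, and metadata.
    return 0.5


def route_video(video: Video, approve_above: float = 0.9,
                reject_below: float = 0.2) -> str:
    """Route a candidate video: approve, reject, or queue for human review."""
    if video.user_reports > 0:
        return "human_review"  # user reports always trigger review
    score = classifier_score(video)
    if score >= approve_above:
        return "approve_for_kids"
    if score <= reject_below:
        return "reject"
    return "human_review"  # uncertain scores go to moderators


print(route_video(Video("abc123", "Finger Family Song", user_reports=2)))
# -> human_review
```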

System Developer

YouTube

Sector of Deployment

Arts, entertainment and recreation

Relevant AI functions

Perception, Cognition, Action

AI Techniques

machine learning

AI Applications

content filtering, decision support, curation, recommendation engine

Location

Global

Named Entities

Google, YouTube, YouTube Kids

Technology Purveyor

YouTube, Google

Beginning Date

2015-01-01

Ending Date

2018-01-01

Near Miss

Unclear/unknown

Intent

Accident

Lives Lost

No

Data Inputs

Videos; video metadata; complaints submitted by users

Incident Reports

Google’s YouTube Kids App Criticized for ‘Inappropriate Content’

blogs.wsj.com · 2015

Child and consumer advocacy groups complained to the Federal Trade Commission Tuesday that Google’s new YouTube Kids app contains “inappropriate content,” including explicit sexual language and jokes about pedophilia.

Google launched the app for young children in February, saying the available videos were “narrowed down to content appropriate for kids.”...

Google’s YouTube Kids App Criticized for ‘Inappropriate Content’
Remove YouTube Kids app until it eliminates its inappropriate content

change.org · 2015

Videos filled with profanity, sexually explicit material, alcohol, smoking, and drug references - this is what parents are finding on Google’s YouTube Kids app. That’s right - its kids app. Now, parents across the country are calling on Google to remove the app until it can guarantee the total elimination of this inappropriate content.

When my neighbors told me about the horrible adult content popping up on the Youtube Kids app, I thought there must be a mistake. Why would Google market an app as “a family-friendly place to explore” and not have proper safeguards in place? Unfortunately, it turned out to be true. And I’ve since learned of the numerous complaints filed to the Federal Trade Commission about this very problem.

Even worse, Google’s response has been laughable. They tell parents to simply flag inappropriate material or set new filters. As a father of two, it makes me angry when a large company like Google doesn’t take responsibility for its kids’ products. Parents are being sold on an app built for kids 5 and under that is supposed to keep them safe from adult content. Parents like myself are joining forces to hold Google accountable.

Tell Google to remove the YouTube Kids app until it can live up to its marketing.

The solution is simple: only allow content pre-approved for ages 5 and under to appear on the app, and don’t allow ads clearly meant for adults. Unless it can live up to expectations, the app should be removed.

Parents are not the only ones outraged. The media has blasted Google’s app, calling it “the most anti-family idea ever to come out of Silicon Valley,” and reporting that it “ignores basic protections for children.”

With your support, we can get Google to remove YouTube Kids until the proper protections are in place.

These are examples of videos encountered on YouTube Kids:

A graphic lecture discussing hardcore pornography by Cindy Gallop:

https://www.youtube.com/watch?v=EgtcEq7jpAk

How to make chlorine gas with household products (chemical weapon used in Syria):

https://www.youtube.com/watch?v=DF2CXHvh8uI

How to tie a noose:

https://www.youtube.com/watch?v=TpAA2itjI34

How to throw knives:

https://www.youtube.com/watch?v=NGgzn1haQ-E

A guy tasting battery acid:

https://www.youtube.com/watch?v=gif-OWNjJSw

How to use a chainsaw:

https://www.youtube.com/watch?v=Kk28thdgCEU

A “Sesame Street” episode dubbed with long strings of expletives:

https://www.youtube.com/watch?v=kVkqzE-iiEY

References to pedophilia in a homemade video reviewing a “My Little Pony” episode:

https://www.youtube.com/watch?v=7K9uH4d-HnU

A DIY video on conducting illegal piracy, featuring pictures of marijuana leaves:

https://www.youtube.com/watch?v=dZDF5uqORA0...

Remove YouTube Kids app until it eliminates its inappropriate content
Disturbing YouTube Kids video shows Mickey Mouse with gun

today.com · 2016

Update, Nov. 7, 2017: TODAY Parents is resharing this story from 2016 because a new series of inappropriate videos has cropped up on YouTube Kids and is making headlines. While the channel names and characters used may be different, the problem remains the same — disturbing videos hiding behind familiar cartoon characters are making their way on to YouTube and are finding young audiences.

Here's TODAY's video about the latest developments. Helpful tips to avoid such content include following only trusted YouTube channels, carefully setting and updating parental controls for video programs and apps, listening to and watching content with your children, and keeping electronic devices in an open area while they're being used.

Parents of tablet-using kids are no stranger to YouTube videos with the catchy tune of "Finger Family" songs — videos in which cartoon characters dance on the ends of illustrated fingers while singing lyrics like, "Daddy finger, daddy finger, where are you? Here I am. Here I am. How do you do?"

Several videos on the Superkidz Finger Family YouTube channel begin with familiar cartoon characters singing, then turn to images of graphic violence. YouTube/Superkidz Finger Family

While moms and dads have joked about these videos being annoying, they were presumably safe on the YouTube Kids app, a version of the video site that YouTube describes as "a delightfully simple and free app, where kids can discover videos, channels and playlists they love."

However, one channel available on YouTube Kids, Superkidz Finger Family, has offered a dark take on the "Finger Family" song. Recently, some moms have shared on social media their shock at finding graphic images of Mickey Mouse and his family, shooting one another — and themselves — in the head with guns.

As this video progresses, Mickey Mouse is shown holding a gun, shooting others and himself. YouTube/Superkidz Finger Family

YouTube removed the channel after speaking to TODAY Parents about it — days after moms started to spread the alarm about the totally inappropriate content. Beth Brister-Kaster, an Ohio mom, posted a video to Facebook, sharing one of the videos — which begins with innocent cartoons before switching to the violent scenes.

"I just deleted YouTube Kids' app forever. Never again," Brister-Kaster says as she shows the questionable video. "This is absolutely insane. This is what our children are watching...it's just Peppa Pig and then, all of a sudden, it goes to Mickey Mouse shooting people."

Brister-Kaster, whose daughter is 4, says she posted the video because she was horrified to see such content on a site she trusted.

RELATED: Child advocacy groups say YouTube Kids rife with 'inappropriate' videos

"I need to protect (my daughter) and I needed other parents to be aware of this kind of garbage that is out on the Internet," Brister-Kaster told TODAY Parents. "I definitely don't want (my daughter) to see that kind of stuff. Who would have thought to do something like that three minutes into a little kids video?"

"This is absolutely insane," Brister-Kaster says in her video. "This is what our children are watching...it's just Peppa Pig and then, all of a sudden, it goes to Mickey Mouse shooting people." YouTube/Superkidz Finger Family

Maryland mom Chaye Benjamin also posted a response to the content, filming a Facebook video while she hid in her bathroom, so that her 3-year-old daughter would not see the video a second time.

"This is a kids' YouTube channel...this isn't even a regular YouTube Channel, it's a kids' channel," Benjamin says through tears. "Please be careful. Please watch the videos with your children, don't just let them watch the videos by themselves."

"Please be careful," Benjamin said in her video. "Please watch the videos with your children, don't just let them watch the videos by themselves." YouTube/Superkidz Finger Family

Benjamin's plea to other parents inspired a Change.org petition, asking YouTube to ban the Superkidz Finger Family channel completely. Other parents left supportive comments saying they had reported the inappropriate content to YouTube.

A spokesperson for YouTube told TODAY Parents that the company works hard to ensure content found on YouTube Kids is family-friendly, adding that they take viewer feedback very seriously. The company also has plans for future updates to the YouTube Kids app, which will allow parents to further customize the types of content they want their kids to watch through the parental control area of the app.

"We appreciate people drawing problematic content to our attention, and make it possible for anyone to flag a video," said the spokesperson. "Flagged videos are manually reviewed 24/7 and any videos that don't belong in the app are removed."

When asked about the Superkidz Finger Family channel specifically, Yo...

Disturbing YouTube Kids video shows Mickey Mouse with gun
The disturbing YouTube videos that are tricking children

bbc.com · 2017

Thousands of videos on YouTube look like versions of popular cartoons but contain disturbing and inappropriate content not suitable for children.

If you're not paying much attention, it might look like an ordinary video featuring Peppa Pig, the cheeky porcine star of her own animated series. But soon after pressing play on this particular YouTube clip, the plot turns dark. A dentist with a huge syringe appears. Peppa's teeth get pulled out. Distressed crying can be heard on the soundtrack.

Parent and journalist Laura June almost immediately noticed something was not quite right as her three-year-old daughter was watching it.

"Peppa does a lot of screaming and crying and the dentist is just a bit sadistic and it's just way, way off what a three-year-old should watch," June says. She wrote about her experiences on the website The Outline.

"But the animation is like close enough to looking like Peppa - it's crude but it's close enough that my daughter was like 'This is Peppa Pig.'"

It's far from an isolated case - BBC Trending has found hundreds of similar videos of children's cartoon characters with inappropriate themes. In addition to Peppa Pig, there are similar videos featuring characters from the Disney movie Frozen, the Minions franchise, Doc McStuffins, Thomas the Tank Engine, and many more.

Some of the videos are parodies or have such over-the-top content that they're clearly meant for mature audiences. Others are unauthorised copies of authentic cartoons or use the characters in innocent ways - troubling to copyright lawyers perhaps, but not necessarily harmful to children.

However many, like the video Laura June's daughter saw, both contain disturbing content and can pass for the real cartoons, particularly when viewed by children.

Some of the cartoons feature violence or frightening situations. SmileKidsTV/YouTube

Hundreds of these videos exist on YouTube, and some generate millions of views. One channel, "Toys and Funny Kids Surprise Eggs", is one of the top 100 most watched YouTube accounts in the world - its videos have more than 5 billion views.

Its landing page features a photo of a cute toddler alongside official-looking pictures of Peppa Pig, Thomas the Tank Engine, the Cookie Monster, Mickey and Minnie Mouse and Elsa from Frozen.

But the videos on the channel have titles like "FROZEN ELSA HUGE SNOT", "NAKED HULK LOSES HIS PANTS" and "BLOODY ELSA: Frozen Elsa's Arm is Broken by Spiderman". They feature animated violence and graphic toilet humour.

The people behind the account didn't respond to Trending's request for an interview. We attempted to contact several other producers of similar videos - and got the same result.

How to avoid inappropriate videos on YouTube

• The YouTube Kids app filters out most - but not all - of the disturbing videos.

• YouTube suggests turning on "restricted mode", which can be found at the bottom of YouTube pages.

• The NSPCC also has a series of guidelines about staying safe online, and there are more resources on the BBC Stay Safe site.

Trending also contacted two companies behind the cartoon series being ripped off, Disney and Entertainment One - the distributor of Peppa Pig. Neither wanted to comment.

So should parents take more care when it comes to allowing their children to watch cartoons on YouTube?

Sonia Livingstone is an expert on child online safety and professor of social psychology at the London School of Economics.

"It's perfectly legitimate for a parent to believe that something called Peppa Pig is going to be Peppa Pig," she says. "And I think many of them have come to trust YouTube... as a way of entertaining your child for ten minutes while the parent makes a phone call. I think if it wants to be a trusted brand then parents should know that protection is in place."

"I don't think we want to police it for the whole world," Livingstone says. "A lot of this material is satirical, creative - or actually offensive but within freedom of expression. What we need is child protection."

YouTube did not offer a spokesperson for interview, but in a statement said: "We take feedback very seriously. We appreciate people drawing problematic content to our attention, and make it easy for anyone to flag a video.

"Flagged videos are manually reviewed 24/7 and any videos that don't belong in the app are removed within hours. For parents who want a more restricted experience, we recommend that they turn off the Search feature in the app."

The company also suggested that parents use the YouTube Kids app, which is available for mobile phones and tablets, and turn on "restricted mode", which limits flagged content. The mode can be found at the bottom of any page on the YouTube site, though the company cautions that "no filter is 100% accurate".

And since Trending began investigati...

The disturbing YouTube videos that are tricking children
YouTube has thousands of disturbing videos targeted at kids: Report

businessinsider.com · 2017

An off-brand Paw Patrol video called "Babies Pretend to Die Suicide" features several disturbing scenarios. YouTube

Parents who let their kids watch YouTube unattended might want to pay closer attention to what they're viewing.

YouTube is serving up to kids thousands of disturbing videos, many from obscure producers, that are sprinkled in with kid-friendly ones from well-known studios, according to analysis in a recent Medium post and past reporting from the BBC. At first glance, the inappropriate videos can be hard to distinguish from the benign ones, often involving well-known characters and themes. But they can often be violent or involve abuse to kids, and many appear to be generated automatically and cheaply in an apparent attempt to profit off the sometimes indiscriminate tastes of younger kids.

"The architecture [that Google and YouTube] have built to extract the maximum revenue from online video is being hacked by persons unknown to abuse children, perhaps not even deliberately, but at a massive scale," wrote writer and artist James Bridle, who took to Medium on Monday to help sound the alarm about the continuing phenomenon.

YouTube's recommended videos are a mix of branded and off-brand content featuring the same cast of characters. YouTube

In a statement to Business Insider, YouTube said that while it is looking out for younger viewers, it's trying to keep its site as open as possible.

"We're always looking to improve the YouTube experience for all our users and that includes ensuring that our platform remains an open place for free communication while balancing the removal of controversial content," the company said in the statement. It continued: "In the last year, we have updated our advertising policy to clearly indicate that videos depicting family entertainment characters engaged in inappropriate behavior are not eligible for advertising on YouTube."

Peppa Pig's disturbing trip to the dentist

One video Bridle highlighted as "controversial" that hasn't been removed is a knockoff video of Peppa Pig, a popular animated character in the UK. The BBC first reported on the disturbing video in March. Posted to YouTube last year, the clip looks and sounds similar to a real Peppa Pig video and even involves a similar situation — going to the dentist.

But the knockoff video is punctuated with the disturbing sound of a child wailing in the background. When Peppa gets to the dentist's office, he injects her with a green serum and proceeds to pull out all her teeth. She then has to fight a sinister-looking masked monster. Needless to say, the real Peppa Pig's dental visit isn't nearly as horrifying.

That's just one of many disturbing pseudo-Peppa Pig videos on the site, Bridle said.

"Many are so close to the original, and so unsignposted — like the dentist example — that many, many kids are watching them," he wrote on Medium. "I understand that most of them are not trying to mess kids up, not really, even though they are."

But bogus Peppa Pig clips represent a small portion of the disturbing kids videos that can be found on the site, Bridle noted. There are thousands of other videos that may not show disturbing content per se, but are upsetting nonetheless, because they appear to be automatically generated based on popular keywords and tuned specifically to attract kids' clicks, according to Bridle.

"There are vast, vast numbers of these videos," Bridle said. "Channel after channel after channel of similar content, churned out at the rate of hundreds of new videos every week. Industrialized nightmare production."

It's unclear what motivates users to post the disturbing content to the site. Last year, in an attempt to discourage these types of videos, YouTube began pulling ads from videos that depict kids characters engaged in inappropriate behavior, removing the ability for the creators to make money off of them.

YouTube Kids may be designed for kids, but it's not a guaranteed refuge from inappropriate videos

YouTube says its main YouTube.com site and app are not intended to be used by kids younger than 13. However, the main YouTube service contains plenty of kids' videos. Meanwhile, the company does little to inform parents or kids that the service is not intended for younger children or to prevent kids from accessing it.

The company's notification that the main YouTube service is for people 13 and older is stated within point 12 of its Terms of Service, a document most users likely never see. The company does take steps to block kids younger than 13 from creating Google accounts — which are needed to subscribe to YouTube channels or upload videos to the service — and to prevent them from seeing some videos depicting sex or other adult themes. But most videos on the main YouTube service can be accessed without a Google account, and many kids view the site while logged into their parents' accounts.

Instead of having kids use its main site and app, YouTube recommends that parents direct their children to its...

YouTube has thousands of disturbing videos targeted at kids: Report
YouTube to crack down on inappropriate content masked as kids’ cartoons

arstechnica.com · 2017

Recent news stories and blog posts highlighted the underbelly of YouTube Kids, Google's child-friendly version of the wide world of YouTube. While all content on YouTube Kids is meant to be suitable for children under the age of 13, some inappropriate videos using animations, cartoons, and child-focused keywords manage to get past YouTube's algorithms and in front of kids' eyes. Now, YouTube will implement a new policy in an attempt to make the whole of YouTube safer: it will age-restrict inappropriate videos masquerading as children's content in the main YouTube app.

The reasoning behind this decision has to do with the relationship between the main YouTube app and YouTube Kids (which has its own dedicated app). Before any video appears in the YouTube Kids app, it's filtered by algorithms that are supposed to identify appropriate children's content and content that could be inappropriate or in violation of any YouTube policies. YouTube also has a team of human moderators that review any videos flagged in the main YouTube app by volunteer Contributors (users who flag inappropriate content) or by systems that identify recognizable children's characters in the questionable video.

If the human moderator finds that the video isn't suitable for the YouTube Kids app, it will be age-restricted in the main YouTube app. No age-restricted content is allowed in the YouTube Kids app at all. As for those using the main YouTube app, age-restricted content cannot be viewed by anyone not logged into a YouTube account, anyone under the age of 18, or anyone with Restricted Mode turned on. According to a report from The Verge, YouTube claims this policy has been in the works for some time now and is not in response to the recent online concern.
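
The gating rules described here compose cleanly: age-restricted videos are excluded from YouTube Kids outright, and on the main app they are visible only to logged-in viewers known to be 18 or older with Restricted Mode off. A toy sketch of that logic follows, with hypothetical field names; it illustrates the policy as reported, not YouTube's code.

```python
# Toy sketch of the viewing rules described above. Field names are
# hypothetical; this illustrates the reported policy, not YouTube's code.

from dataclasses import dataclass
from typing import Optional


@dataclass
class Viewer:
    logged_in: bool
    age: Optional[int] = None
    restricted_mode: bool = False


def visible_in_kids_app(age_restricted: bool) -> bool:
    # Per the policy: no age-restricted content in YouTube Kids at all.
    return not age_restricted


def visible_in_main_app(age_restricted: bool, viewer: Viewer) -> bool:
    if not age_restricted:
        return True
    # Age-restricted content requires a logged-in viewer known to be
    # 18 or older, and is hidden when Restricted Mode is on.
    return (viewer.logged_in
            and viewer.age is not None and viewer.age >= 18
            and not viewer.restricted_mode)


assert not visible_in_kids_app(age_restricted=True)
assert not visible_in_main_app(True, Viewer(logged_in=False))
assert visible_in_main_app(True, Viewer(logged_in=True, age=25))
```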

Also, age-restricted content is not eligible for advertising, which will undoubtedly hit the wallets of the creators making these videos. While it's hard to understand why anyone would make a video about Peppa Pig drinking bleach or a bunch of superheroes and villains participating in a cartoonish yet violent "nursery rhyme," it's been a decent way to make money on YouTube. Some of these videos have amassed hundreds of thousands (and sometimes millions) of views, gleaning ad dollars and channel popularity.

The unnerving reality is that it's possible that many of those views came from YouTube's "up next" and "recommended" video section that appears while watching any video. YouTube's algorithms attempt to find videos that you may want to watch based on the video you chose to watch first. If you don't pick another video to watch after the current video ends, the "up next" video will automatically play. Since some of these inappropriate videos showed up on YouTube Kids (and on the main YouTube app as well), it's possible that any one of them was an "up next" video that automatically played after hours of kids watching other appropriate yet categorically similar content.

This new age-restriction policy should prevent that from happening by stopping inappropriate content from ever making it to YouTube Kids. It takes a few days for content to transition from the main YouTube app to YouTube Kids, and the company is hoping the work of human moderators, Contributors, and the new policy will prevent any more of this content from getting into its safe place for children.

Even though the new policy is geared toward making YouTube Kids a safer place, it does have implications for audiences of the main YouTube site as well. But these videos aren't going away, and some would argue that many of them are satires or parodies, both of which are permissible under YouTube guidelines. Families that use the regular YouTube app instead of YouTube Kids will want to check their account restrictions if they don't want these videos popping up unexpectedly....

YouTube to crack down on inappropriate content masked as kids’ cartoons
YouTube says it will crack down on bizarre videos targeting children

theverge.com · 2017

Earlier this week, a report in The New York Times and a blog post on Medium drew a lot of attention to a world of strange and sometimes disturbing videos on YouTube aimed at young children. The genre, which we reported on in February of this year, makes use of popular characters from family-friendly entertainment, but it’s often created with little care, and can quickly stray from innocent themes to scenes of violence or sexuality.

In August of this year, YouTube announced that it would no longer allow creators to monetize videos which “made inappropriate use of family friendly characters.” Today it’s taking another step to try and police this genre.

“We’re in the process of implementing a new policy that age restricts this content in the YouTube main app when flagged,” said Juniper Downs, YouTube’s director of policy. “Age-restricted content is automatically not allowed in YouTube Kids.” YouTube says that it’s been formulating this new policy for a while, and that it’s not rolling it out in direct response to the recent coverage.

The first line of defense for YouTube Kids is a set of algorithmic filters. After that, there is a team of humans that reviews videos which have been flagged. If a video with recognizable children's characters gets flagged in YouTube's main app, which is much larger than the Kids app, it will be sent to the policy review team. YouTube says it has thousands of people working around the clock in different time zones to review flagged content. If the review finds the video is in violation of the new policy, it will be age-restricted, automatically blocking it from showing up in the Kids app.

YouTube says it typically takes at least a few days for content to make its way from YouTube proper to YouTube Kids, and the hope is that within that window, users will flag anything potentially disturbing to children. YouTube also has a team of volunteer moderators, which it calls Contributors, looking for inappropriate content. YouTube says it will start training its review team on the new policy and it should be live within a few weeks.

Along with filtering content out of the Kids app, the new policy will also tweak who can see these videos on YouTube's main service. Flagged content will be age restricted, and users won't be able to see those videos if they're not logged in on accounts registered to users 18 years or older. All age-gated content is also automatically exempt from advertising. That means this new policy could put a squeeze on the booming business of crafting strange kids' content.

YouTube is trying to walk a fine line between owning up to this problem and arguing that the issue is relatively minor. It says that the fraction of videos on YouTube Kids that were missed by its algorithmic filters and then flagged by users during the last 30 days amounted to just 0.005 percent of videos on the service. The company also says the reports that inappropriate videos racked up millions of views on YouTube Kids without being vetted are false, because those views came from activity on YouTube proper, which makes clear in its terms of service that it's aimed at users 13 years and older.

In today’s policy announcement, YouTube is acknowledging the problem, and promising to police it better. It doesn’t want to outright ban the use of family-friendly characters by creators who aren’t the original copyright holders across all of YouTube. There is a place, the company is arguing, for satire about Peppa Pig drinking bleach, however distasteful you might find it. But YouTube is acknowledging that YouTube Kids requires even more moderation. And, the company is willing to forgo additional ad revenue — and there is a lot of money flowing through this segment of the industry — if that’s what it takes to ensure YouTube Kids feels like a safe experience for families....

YouTube says it will crack down on bizarre videos targeting children
YouTube Kids has been a problem since 2015 - why did it take this long to address?

polygon.com · 2017

In the last few weeks, the world has learned through a number of reports that YouTube is plagued by problems with children’s content. The company has ramped up moderation in recent weeks to fight the wave of inappropriate content, but this isn’t the first time YouTube has been in this position.

WHEN DID IT START?

On Feb. 23, 2015, YouTube announced YouTube Kids, a stand-alone app built for children and child-appropriate entertainment. The idea was to make YouTube a safer platform for parents, who didn’t want their children using the main site unsupervised. The initial blog post about the Kids app mentions that “parents can rest a little easier knowing that videos in the YouTube Kids app are narrowed down to content appropriate for kids.”

Parental controls were also included, among them the ability for parents to remove the search option from the app, limiting their children to “just the pre-selected videos available on the home screen.” The Kids app, according to Shimrit Ben-Yair, the YouTube Kids product manager, marked the “first step toward reimagining YouTube for families.”

Less than two months later, in May 2015, the Campaign for a Commercial-Free Childhood, a coalition of children’s and consumer advocacy groups, complained to the Federal Trade Commission (FTC) about content they called “not only ... disturbing for young children to view, but potentially harmful.”

Using popular characters like Frozen’s Elsa and Spider-Man, YouTubers are able to lure children into offensive videos featuring their favorite characters. While at first these videos seem normal, they soon lead to those same Disney princesses and superheroes participating in lewd or violent acts. YouTube’s search algorithm makes it easy for children to fall into gruesome playlist traps full of this kind of content, as users name their videos and use thumbnails that can get around YouTube’s algorithm, ensuring the content seems safe for kids.

The report lists a number of issues that Josh Golin, director of the Campaign for a Commercial-Free Childhood, and other advocates discovered in the YouTube Kids app early on. These include:

Explicit sexual language presented amidst cartoon animation;

A profanity-laced parody of the film Casino featuring Bert and Ernie from Sesame Street;

Graphic adult discussions about family violence, pornography and child suicide;

Jokes about pedophilia and drug use;

Modeling of unsafe behaviors such as playing with lit matches.

A YouTube representative told the San Jose Mercury News after the complaint was filed that, when the company was working on the YouTube Kids app, it “consulted with numerous partners and child advocacy and privacy groups,” adding that YouTube is “always open to feedback on ways to improve the app.”

But not much changed. A report from The Guardian in June 2016 pointed out that the third most popular channel on YouTube at the time was “Webs & Tiaras,” which curated content that was targeted to children. The channel, according to reports, starred an assortment of adults in superhero costumes or princess attire performing more mature acts.

The channel’s content was questionable, but mostly understood to be acceptable. Other bad actors who wanted to piggyback off the success of the channel began posting similar content but with sexual imagery and disturbing content. In 2016, Phil Ranta, a spokesperson for the channel, told The Verge it wasn’t surprising this was happening.

“I think it’s natural that when something is as big as this [new genre] is, and they see people making millions of dollars a year, they will try almost everything: go cleaner, adding dialog, go sexier, or crazier,” Ranta said. “They kind of just need to exhaust those measures before they realize if they can stay in this game.”

One channel in particular, Webs & Tiaras — Toy Monster, was using the Webs & Tiaras name to spread disturbing content under a trending association. Many of those videos have been deleted, but some still remain on YouTube. The original Webs & Tiaras channel was removed before this article was written.

YouTube is finally addressing the issue on its main site, making changes to the way it moderates content. YouTube CEO Susan Wojcicki said the company was expanding its moderation corps to more than 10,000 contractors in 2018, focusing them on “content that might violate our policies.”

“Human reviewers remain essential to both removing content and training machine learning systems because human judgment is critical to making contextualized decisions on content,” she wrote in a recent blog post.

Almost three years later, YouTube has responded to a number of concerns raised by parents and media critics about content — both on the main site and in YouTube’s stand-alone Kids app — they find disturbing, glorifying violence or obscene. The company issued the following statement to Polygon:

Content that misleads or endangers children is unacceptable to us. We have clear policies against these videos and we enforce them aggressively. We use a combination of machine learning, algorithms and community flagging to determine content in the YouTube Kids app. The YouTube team is made up of parents who care deeply about this, and are committed to making the app better every day.

But questions regarding content on its independent Kids app — including whether future content will be curated, or if the app will feature fewer videos to allow for more human-led moderation — remain mostly unanswered.

A YouTube representative told Polygon that only five-thousandths (0.005) of a percent of content on YouTube Kids is considered disturbing and against the company’s policies, adding that once content is reported, the company takes strict action to remove videos and, in serious cases, entire channels from the app.

“YouTube is marketing this as a safe place for children to explore but it’s not a safe place for children to explore,” Campaign for a Commercial-Free Childhood director Josh Golin told Polygon. “We were really the first ones to raise this issue, and this is going back two-and-a-half years. What we’ve found in the interim [is] — it’s like Whac-A-Mole — so we pointed out videos as we saw them. Every video we named in our complaint came off [the app], but of course there were more.

“It’s a terrible way to build an app for children,” he said.

In the complaint filed with the FTC on May 19, 2015, Golin pointed out how YouTube’s search algorithm could be exploited even within a supposedly safe environment for children.

As users of YouTube Kids search for material, the app begins to recommend similar videos, as the “Recommended” function on YouTube Kids is apparently based on “Search” history. When we were conducting our review, YouTube Kids actually began recommending videos about wine tasting on its app for preschoolers, as the screen shot below indicates. Thus, the more inappropriate videos children search for, the more inappropriate videos they will be shown via the app’s “Recommended” function.
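
The complaint is describing a feedback loop: when recommendations are conditioned on search history, each search (or watched recommendation) makes similar content more likely to surface. Below is a deliberately simplified illustration of that dynamic, with made-up data and scoring; it is not YouTube's actual recommender.

```python
# Deliberately simplified illustration of the feedback loop described in
# the complaint: recommendations conditioned on search history surface
# more of whatever was searched for. Data and scoring are made up; this
# is not YouTube's recommender.

from collections import Counter


def recommend(search_history: list[str], catalog: dict[str, str],
              k: int = 3) -> list[str]:
    """Rank catalog videos by how often their topic appears in past searches."""
    topic_counts = Counter(search_history)
    ranked = sorted(catalog, key=lambda vid: -topic_counts[catalog[vid]])
    return ranked[:k]


catalog = {"v1": "toys", "v2": "wine tasting",
           "v3": "toys", "v4": "wine tasting"}

# One stray search is enough to start surfacing matching videos, and each
# recommended-then-watched video would reinforce the topic further.
print(recommend(["toys", "toys", "wine tasting"], catalog))
# -> ['v1', 'v3', 'v2']
```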

Golin and his colleagues weren’t the only people who noticed that content targeted toward children on the YouTube Kids app and main site was problematic. In the past couple of years, multiple parent groups have sprung up on Facebook with the intent of learning how to navigate, flag and curate a safe experience for their children.

The mission statement of one Facebook group, “Parents Navigating YouTube,” states the group was created to:

Help white list YouTube content that is safe for our kids to watch without parental guidance. This will cover adult content but also objectionable content like white supremacy, sexism and things of that nature. We will also discuss problematic YouTube content so that we know what's out there and can be prepared to discuss it with our kids.

With little support from YouTube, these groups often act as volunteer watchdogs, going through the worst of YouTube and flagging it. A YouTube representative told Polygon that despite its own machine-learning algorithm being improved daily — learning what content is unacceptable for children — the team does rely on flags from parents to help address problematic videos and channels.

At a time when parents were working together on creating a new, improved system to keep an eye on what children see, YouTube was focused on other aspects of the app. The company chose to highlight other areas of the YouTube Kids app in 2016, including the fact that it could be viewed on the Apple TV and was compatible with YouTube Red.

It wasn’t until this year that YouTube began to invest heavily in preventative measures.

First and foremost, moderation.

CHANGES BEING MADE

Although YouTube has maintained that its main site is being exploited by bad actors, not the YouTube Kids app specifically, it’s still impossible for YouTube to guarantee that the Kids app is 100 percent safe. The company told USA Today that parents who want to be sure their children aren’t stumbling upon disturbing content should switch off “recommended videos.”

“Sometimes your child may find content in the app that you may not want them to watch,” a YouTube representative told USA Today.

Malik Ducard, YouTube’s global head of family and children’s content, told the New York Times these types of videos were “the extreme needle in the haystack,” and pointed to the algorithm’s machine learning and a lack of oversight as reasons these videos may have slipped through. Ducard also said YouTube Kids did not serve a curated experience, meaning parents were responsible for controlling what their children watch.

Restricted mode, which gives parents the ability to disable the search function and prevent additional videos from the main site from flooding the app, makes it difficult for children to switch back to a more open, free-roaming setting once enabled. YouTube also makes it clear once parents sign up for the app that, despite the company’s best efforts, disturbing content created by bad actors may appear.

YouTube Kids app welcome screen. YouTube

From YouTube’s perspective, perfect moderation is impossible: 400 hours of video are uploaded every minute.

But for Golin, current efforts aren’t enough. Golin told Polygon that it’s irresponsible for YouTube to treat its algorithm as a “big fishing net,” assuming that the algorithm will catch every bad video meant to exploit it.

“The entire premise of YouTube Kids’ app is wrong if you’re worried about child safety,” Golin said. “You can’t have an app that has millions and millions of videos on it, but that’s okay. Children don’t need an app with millions and millions of videos on it. They don’t need 20,000 videos of eggs on one app. What they need is a place where content has been vetted and safe.

“From a child standpoint, the problem is not fixable,” Golin said. “The YouTube model has created something which is so vast: 400 hours of content are uploaded every minute. It’s simply too big. People have been raising these issues for years, just visit any parenting forum and they’ve been talking about the fake Peppa Pig videos. It was only after the Medium piece went viral that YouTube started to take any proactive steps. To be clear, they took steps because advertisers were concerned, not parents.”

YouTube already went through an “adpocalypse,” in which big advertisers pulled out of the platform after finding their ads attached to videos filled with hateful content. So the company wants to avoid anything that would cause others to leave.

Part of YouTube’s plan is to increase human moderation and tweak its algorithm, “training machine-learning technology across other challenging content areas, including child safety and hate speech.” YouTube will also cut down on channels that receive monetization and advertisements attached to these videos. Since YouTube Kids also includes ads — many of which, Golin says, aren’t child appropriate — this will affect channels and videos on the platform.

WHAT’S NEXT FOR YOUTUBE KIDS?

YouTube Kids is a moneymaker; YouTube wouldn’t tell Polygon how much, exactly, but it does sell ads against the videos. Whatever it is, Golin thinks it’s enough that YouTube has no incentive to change its app. YouTube declined to comment when asked whether the company was going to curate its content and restrict the number of videos going forward.

“Their goal is to make money and unless there is enough of an outcry and there’s continued pressure on them, we’re going to see the same problems,” Golin said. “I don’t know that the problems are fixable. It would be great if YouTube came to the realization that these problems were fixable and made it clear [that if the company is not curating content] this is for adults who want to watch videos. I don’t have a lot of faith that they will get there on their own.”

A YouTube representative told Polygon that despite reports, the majority of the problem lies on the main site, which it will spend a large portion of its time addressing in the coming year. The representative said that expanded changes to its current policy were put in place to discourage inappropriate content targeting families on the main app and site; by doing this, the representative confirmed, it’s supposed to ensure age-gated content (flagged for an audience of 18+) doesn't appear on YouTube Kids.

A YouTube representative also confirmed that content which is flagged on the main YouTube site is not supposed to appear on the Kids app. If a video does make its way to the app, the representative confirmed a secondary screening takes place, adding that a team is in place to moderate new videos that are flagged on the app at all times.

Questions about what’s next are still being raised by parent groups and watchdog organizations, like ElsaGate on Reddit and Discord, which keep an eye on nefarious channels or videos that get through YouTube’s system. A YouTube representative could not provide any more details at the time of writing.

With critics calling out YouTube for its slow response — among them News Corp CEO Robert Thomson, who called YouTube a “toxic waste dump” — the question now is how quickly the problem will be managed after the new year. Ducard and Wojcicki said the company is “working on ways to more effectively and proactively prevent this type of situation from occurring.”...

YouTube Kids has been a problem since 2015 - why did it take this long to address?
YouTube Kids app is STILL showing disturbing videos

dailymail.co.uk · 2018

Google-owned YouTube has apologised again after more disturbing videos surfaced on its YouTube Kids app.

Investigators found several unsuitable videos including one of a burning aeroplane from the cartoon Paw Patrol and footage explaining how to sharpen a knife.

YouTube has been criticised for using algorithms to sieve through material rather than using human moderators to judge what might be appropriate.

There have been hundreds of disturbing videos found on YouTube Kids in recent months that are easily accessed by children.

These videos have featured horrible things happening to various characters, including ones from the Disney movie Frozen, the Minions franchise, Doc McStuffins and Thomas the Tank Engine.

Parents, regulators, advertisers and law enforcement have become increasingly concerned about the open nature of the service.

A YouTube spokesperson has admitted the company needs to 'do more' to tackle inappropriate videos on their kids platform.

This investigation is the latest to expose inappropriate content on the video-sharing site which has been subject to a slew of controversies since its creation in 2005.

As part of an in-depth investigation by BBC Newsround, Google's Public Policy Manager Katie O'Donovan met five children who told her about the distressing videos they had seen on the site.

They included videos showing clowns covered in blood and messages warning them there was someone at the door.

Ms O'Donovan said she was 'very, very sorry for any hurt or discomfort'.

'We've actually built a whole new platform for kids, called YouTube Kids, where we take the best content, stuff that children are most interested in and put it on there in a packaged up place just for kids,' she said.

It normally takes five days for supposedly child-friendly content like cartoons to get from YouTube to YouTube Kids.

Within that window it is hoped users and a specially-trained team will flag disturbing content.

Once it has been flagged and reviewed, it won't appear on the YouTube Kids app and only people who are signed in and older than 18 years old will be able to view it.

The company say thousands of people will be working around the clock to flag content.

However, as part of the investigation Newsround revealed there are still lots of inappropriate videos on the Kids section.

'We have seen significant investment in building the right tools so people can flag that [content], and those flags are reviewed very, very quickly', Ms O'Donovan said.

'We're also beginning to use machine learning to identify the most harmful content, which is then automatically reviewed.'

The problem was managing an open platform where content is uploaded straight onto the site, she added.

'It is a difficult environment because things are moving so, so quickly', said Ms O'Donovan.

'We have a responsibility to make sure the platform can survive and can thrive so that we have a collection that comes from around the world on there'.

By the end of last year YouTube said it had removed more than 50 user channels and had stopped running ads on more than 3.5 million videos since June.

'Content that endangers children is unacceptable to us and we have clear policies against such videos on YouTube and YouTube Kids', a YouTube spokesperson told MailOnline.

'When we discover any inappropriate content, we quickly take action to remove it from our platform.

'Over the past few months, we've taken a series of steps to tackle many of the emerging challenges around family content on YouTube, including: tightening enforcement of our Community Guidelines, age-gating content that inappropriately targets families, and removing it from the YouTube Kids app.'

In March, a disturbing Peppa Pig fake, found by journalist Laura June, shows a dentist with a huge syringe pulling out the character's teeth as she screams in distress.

Mrs June only realised the violent nature of the video as her three-year-old daughter watched it beside her.

'Peppa does a lot of screaming and crying and the dentist is just a bit sadistic and it's just way, way off what a three-year-old should watch,' she said.

'But the animation is close enough to looking like Peppa - it's crude but it's close enough that my daughter was like 'This is Peppa Pig.''

Another video depicted Peppa Pig and a friend deliberately burning down a house with someone in it.

All of these videos are easily accessed by children through YouTube's search results or recommended videos.

YouTube Kids app is STILL showing disturbing videos
YouTube suggested conspiracy videos to children using its Kids app

businessinsider.com · 2018

Children were able to watch David Icke's conspiracy videos through YouTube Kids. Flickr/Tyler Merbler

YouTube's app specifically for children is meant to filter out adult content and provide a "world of learning and fun," but Business Insider found that YouTube Kids featured many conspiracy theory videos which make claims that the world is flat, that the moon landing was faked, and that the planet is ruled by reptile-human hybrids.

YouTube Kids is a separate app from the main YouTube app, and it's meant to allow parents to let their children browse YouTube without being worried about any unsuitable content appearing. Children are encouraged to learn languages, read books, and watch educational videos.

Search for "UFO" on YouTube Kids and you'll mostly find videos of toys that are clearly fine for children to watch. But one of the top videos claimed to show a UFO shooting at a chemtrail, and we found several videos by prominent conspiracy theorist David Icke in the suggested videos. YouTube removed the videos from YouTube Kids after we contacted it about the issue.

One suggested video was an hours-long lecture by Icke in which he claims that aliens built the pyramids, that the planet is run by reptile-human hybrids, that Freemasons engage in human sacrifice, that the assassination of President Kennedy was planned by the US government, and that humans would evolve in 2012.

Two other conspiracy theory videos by Icke appeared in the related videos, meaning it was easy for children to quickly go from watching relatively innocent videos about toys to conspiracy content.

One of the videos which was suggested on YouTube Kids. YouTube/UFOTV

YouTube said in a statement to Business Insider that "sometimes we miss the mark" on content appearing on YouTube Kids and said it would "continue to work to improve the YouTube Kids app experience."

Here's the full statement from YouTube:

"The YouTube Kids app is home to a wide variety of content that includes enriching and entertaining videos for families. This content is screened using human trained systems. That being said, no system is perfect and sometimes we miss the mark. When we do, we take immediate action to block the videos or, as necessary, channels from appearing in the app. We will continue to work to improve the YouTube Kids app experience."

YouTube Kids is meant to block unsuitable content

The YouTube Kids app blocks searches for most unsuitable videos. Search "9/11" or "porn" and you find no results. But we found that buried in the app's suggested videos were conspiracy videos that children could stumble on.

Conspiracy theory videos appear in search results

If you searched for "moon landing" on YouTube Kids, three videos appeared that claim that the moon landing was hoaxed. All three videos have since been hidden by YouTube after we informed it of the issue.

Following related videos that appear in YouTube Kids, we ended up watching a video that claims that a gateway to a new world had opened, and that a female employee working on the Large Hadron Collider mysteriously vanished in a magic portal.

Through YouTube Kids' suggested videos feature, we also found videos from conspiracy theorists Ben Davidson, Gerald Pollack, and Wallace Thornhill. YouTube removed the specific videos that we sent it, but many other videos by the conspiracy theorists remain in the app.

Conspiracy videos also appear when children search for popular conspiracy theories. Searches for "chemtrails," "flat earth," and "nibiru" are all allowed in the app. However, it's (hopefully) unlikely that children are regularly watching these videos unless they appear as suggestions on more popular content in the app.

The conspiracy videos didn't just appear in searches or suggested videos, either. After watching several conspiracy videos, the top recommended video on the home page of YouTube Kids was a conspiracy theory about aliens on the moon.

This issue with the YouTube Kids app shows the problem with YouTube's suggested videos algorithm. The suggested videos try to convince you to watch related content after your current video ends.

That's fine when it's adults watching the main YouTube site, but children on YouTube Kids can easily go from innocent content about the moon landing to Icke claiming lizard people rule the world.

YouTube Kids criticised for featuring inappropriate videos

This isn't the first time that YouTube Kids was found to feature videos that weren't suitable for children. In 2017, the app was criticised in a lengthy Medium post by author James Bridle after he found disturbing videos targeted at children.

"Someone or something or some combination of people and things is using YouTube to systematically frighten, traumatise, and abuse children, automatically and at scale," Bridle wrote.

In November, YouTube published a blog post in which it promised to remove "unacceptable" videos from YouTube Kids.

YouTube is fighting against fake new...

YouTube suggested conspiracy videos to children using its Kids app
Children's YouTube is still churning out blood, suicide and cannibalism

wired.co.uk · 2018

Video still of a reproduced version of Minnie Mouse, which appeared on the now-suspended Simple Fun channel Simple Fun / WIRED

YouTube videos using child-oriented search terms are evading the company's attempts to control them. In one cartoon, a woman with a Minnie Mouse head tumbles down an escalator before becoming trapped in its machinery, spurting blood, while her children (baby Mickey and Minnie characters) cry.

The cartoon, Minnie Mouse Mommy Has Pregnancy Problem & Doctor Treats Episodes! Mickey Mouse, Donald Duck Cartoon, racked up over three million views in a single day. It could be viewed even with YouTube's family-friendly restricted mode enabled and existed, along with plenty of similarly distressing content, on Simple Fun, a channel that had been in operation since July 2017.

The channel has now been removed by YouTube "due to multiple or severe violations of YouTube's policy against spam, deceptive practices and misleading content or other Terms of Service violations."

WIRED found videos containing violence against child characters, age-inappropriate sexualisation, Paw Patrol characters attempting suicide and Peppa Pig being tricked into eating bacon. These were discovered by following recommendations in YouTube's sidebar or simply allowing children's videos to autoplay, starting with legitimate content.

"Recommendations are designed to optimize watch time, there is no reason that it shows content that is actually good for kids. It might sometimes, but if it does it is coincidence," says former YouTube engineer Guillaume Chaslot, who founded AlgoTransparency, a project that aims to highlight and explain the impact of algorithms in determining what we see online. "Working at YouTube on recommendations, I felt I was the bad guy in Pinocchio: showing kids a colourful and fun world, but actually turning them into donkeys to maximise revenue."

The videos WIRED found were reported to YouTube and were removed or restricted by the Google-owned company before the publication of this article. The company explained that it is increasing its efforts to control content that violates its terms and conditions.

Weird children's YouTube

If you can imagine it, there's a deeply weird Finger Family version of it Munnik TV / WIRED

YouTube is home to millions of hours of children's entertainment – part of the 400 hours of video uploaded to the service every minute – ranging from CBeebies and Disney to the incomprehensibly successful Little Baby Bum, a UK-based YouTube-native children's channel devoted to 3D animated songs and nursery rhymes for pre-schoolers in numerous languages.

Content for pre-school children, in particular, can be lucrative for ad-funded channels, as small children will readily watch and poke at whatever videos YouTube suggests, while harried parents are often unable to fully supervise every minute of their child's media consumption.

AlgoTransparency regularly indexes the kids' videos most likely to be recommended by YouTube. Its lists show that YouTube's most-suggested children's videos lean disproportionately towards a combination of YouTube-native songs and nursery rhymes designed for a US audience; long, edited-together compilations of TV series such as Peppa Pig; and strange, low-budget 2D and 3D animated mash-ups of animals, characters and voice samples.
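
A minimal sketch of that kind of indexing, under the assumption that AlgoTransparency works roughly by crawling recommendation links outward from seed videos and tallying how often each video is suggested (the project has not published this exact code, and the toy graph below is invented):

```python
from collections import Counter

# Hypothetical recommendation graph: video ID -> IDs suggested alongside it.
# A real census would populate this by scraping the sidebar or using an API.
REC_GRAPH = {
    "seed_rhymes": ["compilation_1", "mashup_1"],
    "compilation_1": ["mashup_1", "mashup_2"],
    "mashup_1": ["mashup_2"],
    "mashup_2": ["mashup_1"],
}

def census(seed_ids, depth=2):
    """Tally how often each video appears as a recommendation within
    `depth` hops of the seeds; frequent entries are the most-pushed videos."""
    counts = Counter()
    frontier = list(seed_ids)
    for _ in range(depth):
        next_frontier = []
        for vid in frontier:
            recs = REC_GRAPH.get(vid, [])
            counts.update(recs)      # every suggestion seen adds one tally
            next_frontier.extend(recs)
        frontier = next_frontier
    return counts

print(census(["seed_rhymes"]).most_common(3))
```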

Previously described in James Bridle's "Something is wrong on the internet" Medium post as "decidedly off", the latter type of content can be loosely and collectively categorised as 'weird children's YouTube'.

Titles are typically a word salad designed to attract children's and parents' searches, while the videos' content leans heavily on generic 2D or 3D animated models, usually incongruously combined with familiar figures from hit Disney or superhero franchises. The sheer number of them on the platform is staggering, and many have millions or even hundreds of millions of views.

The video Disney Frozen Finger Family Collection Disney Frozen Finger Family Songs Nursery Rhymes has 43m views, while LEARN COLOR BMX & MotorCycles JUMP! for kids w/ Superheroes Cartoon for children Nursery rhymes has 176m.

Neither video contains anything more distressing than bad animation and the intensely annoying Finger Family song, but both are good examples of videos that use popular franchises and the promise of education to target searches that parents and children are likely to carry out.

Based on what YouTube insiders have said about how...

Children's YouTube is still churning out blood, suicide and cannibalism
YouTube Kids, Criticized for Content, Introduces New Parental Controls

nytimes.com · 2018

YouTube Kids, which has been criticized for inadvertently recommending disturbing videos to children, said Wednesday that it would introduce several ways for parents to limit what can be watched on the popular app.

Beginning this week, parents will be able to select “trusted channels” and topics that their children can access on the app, like “Sesame Workshop” or “learning,” that have been curated by people at YouTube Kids and its partners. The Google-owned app said in a blog post on Wednesday that parents would also have the option to restrict video recommendations to channels that have been “verified” by YouTube Kids, avoiding the broader sea of content that the app pulls from the main YouTube site through algorithms and other automated processes.
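
In effect, the new controls replace an open, algorithmically assembled pool with an allow-list. A minimal sketch of that idea follows; the channel names and the Video type are invented for illustration and assume nothing about YouTube's internals.

```python
# Allow-list sketch of the "trusted channels" control described above.
# Channel names and the Video type are hypothetical.
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    channel: str

TRUSTED_CHANNELS = {"Sesame Workshop", "CBeebies"}  # parent-selected

def kid_safe_feed(candidates):
    # Only videos from approved channels survive; anything pulled in
    # algorithmically from the wider platform is dropped.
    return [v for v in candidates if v.channel in TRUSTED_CHANNELS]

feed = kid_safe_feed([
    Video("Counting with Elmo", "Sesame Workshop"),
    Video("Minnie Mouse mash-up", "Simple Fun"),
])
print([v.title for v in feed])  # only the Sesame Workshop video remains
```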

YouTube Kids was introduced in 2015 for children of preschool age and older, and it says it has more than 11 million weekly viewers. But parents have discovered a range of inappropriate videos on the app, highlighting the platform’s dependence on automation and a lack of human oversight. The New York Times reported in the fall that children using the app had been shown videos with popular characters from Nick Jr. and Disney Junior in violent or lewd situations, and other disturbing imagery, sometimes set to nursery rhymes....

YouTube Kids, Criticized for Content, Introduces New Parental Controls
What parents should know about inappropriate content on YouTube

goodmorningamerica.com · 2018

After multiple reports of inappropriate content on YouTube over the past year, “Good Morning America” wanted to take a closer look at the site. How often do kids end up seeing inappropriate content on the video platform? We talked with a group of Philadelphia-area tweens and their parents, who say their children often watch YouTube.

Here's what parents said when asked if they've ever tried to take YouTube away from their kids.

“It’s like the end of the world," said Eve Ehrich, a mother of three kids.

“You’re ending their life for a day," another parent, Jaime Meltzer, said.

Almost all the parents said they use some form of parental controls on their computers and mobile devices to try to limit their children's exposure to inappropriate content.

The kids we talked to were all ages 10 to 13 and said they know who's “kid-friendly” on YouTube.

Sam, 11, said SuperMarioLogan is “one of my favorite channels. It was a suggested video. And I watched it and it kept reeling me in to watch more videos.”

The other two boys in the group said they know “Jeffy,” a puppet on the popular SuperMarioLogan YouTube channel.

“It attracts kids because you wouldn’t think of him as inappropriate because of the way he looks," said 13-year-old David.

Family watchdog group Common Sense Media called SuperMarioLogan “Your basic online nightmare for parents of young kids.” The group, which started rating YouTubers this year due to overwhelming requests from parents, noted SuperMarioLogan is intended for ages 17 and older.

“YouTube is the biggest pain point for parents,” Jill Murphy, editor-in-chief of Common Sense Media, told "Good Morning America." “Part of it is parents feeling like they are in the dark and have no idea of what their kids are up to online.”

Even the kids "GMA" spoke with agreed that YouTube doesn’t do enough to block inappropriate content and that it’s not a matter of trust.

“I think that sometimes kids get drawn in. It’s not their fault,” said 13-year-old Aubrey. “It looks kid-friendly. But then you watch it, and you don’t really know that it’s not.”

"GMA" showed some of the kids’ interviews to Murphy.

“Developmentally kids aren’t even primed at that age to have the wherewithal to shut off YouTube, the autoplay. They don’t even have the self-control to manage that,” Murphy said.

The creator of SuperMarioLogan told "GMA" he has lost revenue since YouTube started age-restricting and demonetizing his videos.

"Common Sense Media only viewed our old content, and their review was accurate solely regarding those old videos," he said in a statement. "We invite Common Sense Media to conduct a review of our newer videos, which are much cleaner in content. It’s important to note when we began creating these videos back in 2008, we were kids ourselves. We were just a few teenagers goofing around. Given we were just kids, we did not understand many things about YouTube or the audience we would subsequently attract. Today is much different. We have adjusted our content to appeal to a wider audience."

"While it's in everyone’s interest to ensure children are not exposed to inappropriate content online, it's ultimately the responsibility of the parents, guardians, and/or supervising adults. These are the only people that have control over what their children have access to.”

YouTube, in a statement to "GMA," noted that the site offers YouTube Kids, which it describes as a safer alternative for kids and families.

“Our main YouTube site is for those age 13+... Protecting families is our priority and we created the YouTube Kids app to offer parents a safer alternative for their children," according to the statement. "Beyond that, we’ve ramped up our efforts to age-gate flagged videos on the main app that are better suited for older audiences and increased resources to remove content that doesn’t adhere to our policies. We know there’s more work to be done so we’ve enlisted third-party experts to help us assess this evolving landscape, and we’re launching new tools in YouTube Kids for parents to choose a personalized experience for their child.”

YouTube does state in its terms of service that it is not intended for children under age 13.

However, the parents who spoke with "GMA" were not aware of that aspect of the terms of service.

Additionally, YouTube has tools it says can help parents filter out inappropriate content.

Video (4:14): "How well do YouTube's age restrictions work?" "GMA" explores YouTube's age restrictions in the wake of headlines about inappropriate content that's popular for children.

"GMA" examined two tools that parents can use to filter content for kids on YouTube’s main site.

We created an account for a 14-year-old so age-restricted content would be screened out, and we turned on "restricted mode," which is supposed to filter out potentially mature content.

Even with these restrictions, we found sexual content interlaced with videos of people playing customized versions of popular children’s games Minecraft and Roblox.

"The algorithm is zeroing in on you and saying, 'You want to see more and more and more of these things,' and just putting this in front of kids, again, who are in restricted mode," said the Campaign For a Commercial Free Childhood's Josh Golin, who observed the YouTube content with "GMA."

Most of the videos were taken down or placed behind filters after "GMA" told YouTube about the findings, though several of the Minecraft and Roblox videos involving sexual activity were left up without age restrictions.

"Roblox is committed to providing a safe community, and we have zero tolerance for content that violates our Rules of Conduct," Roblox said in a statement to "GMA." "The videos highlighted date as far back as 2011 and show features that are not possible on today's platform. We use a combination of technology and a robust team of moderators to identify and remove any questionable content or behavior that violates our rules of conduct. In addition we are also proactive in identifying and requesting that sites across the web, such as YouTube, remove any content that does not depict the true nature or functionality of the Roblox platform.”

YouTube told "GMA" in a statement that "protecting families is our priority."

"Protecting families is our priority and we created the YouTube Kids app to offer parents a safer alternative for their children," the statement read. "Beyond that, we’ve ramped up our efforts to age-gate flagged videos on the main app that are better suited for older audiences and increased resources to remove content that doesn’t adhere to our policies.

"We know there’s more work to be done so we’ve enlisted third party experts to help us assess this evolving landscape and we’re launching new tools in YouTube Kids for parents to choose a personalized experience for their child," the company said.

YouTube also told "GMA" it has since created more parental controls on YouTube Kids so that only videos screened by human moderators are shown.

Parents have to turn the controls on themselves, and we found they do appear to work. By turning search off, parents can limit kids to videos that have been verified by the YouTube Kids Team.

Parents can also choose collections of channels recommended by YouTube Kids and its partners. A feature that lets parents handpick videos is supposed to become available later this year.

Child advocates say there are also steps parents can take on their own, from spot checking their child's browser history to co-viewing YouTube with their child and talking to their child about what they're viewing online....

What parents should know about inappropriate content on YouTube
YouTube Kids Is Nowhere Near as Innocent As It Seems

studybreaks.com · 2018

YouTube is both a massive industry and a browsing staple that people use to meet their educational and entertainment needs. According to Business Insider, the site draws around 1.8 billion logged-in users per month. With traffic that high, it makes sense that the company wants to pay its creators to encourage them to upload more videos. Lately, though, you may have seen YouTubers complaining about the company's advertising revenue algorithm, as YouTube tries to figure out the best way to pay its creators.

Simply put, the more views a video gets, the more money it makes. While some YouTubers operate with integrity and continue to create the content they wish to, others have decided to alter their content in hopes of appealing to a large, easy-to-please audience: children.

In 2015, YouTube launched YouTube Kids, which was specifically created for child viewers. This branch of YouTube consists of family-friendly videos that are readily available for young kids. Personally, I believe it was created with good intentions: to assure parents that their children are watching age-appropriate content.

Somewhere along the way, however, something went horribly, horribly wrong.

On its current website, YouTube Kids describes itself as “a safer online experience for kids,” acknowledging that some inappropriate videos may find their way into the service.

“We use a mix of filters, user feedback and human reviewers to keep the videos in YouTube Kids family friendly,” the website explains. “But no system is perfect and inappropriate videos can slip through, so we’re constantly working to improve our safeguards and offer more features to help parents create the right experience for their families.”
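
That quoted description maps onto a familiar three-stage moderation architecture: an automated filter, user flags, and human review as a backstop. The sketch below is a generic illustration of that pipeline, not YouTube's implementation; the keyword heuristic, threshold and function names are all invented. Note the last branch, which is where "inappropriate videos can slip through."

```python
# Generic filter -> user feedback -> human review pipeline, matching the
# quoted description. All names and thresholds are invented; the keyword
# heuristic is a stand-in for a real ML classifier.
def classifier_score(title: str) -> float:
    flagged_terms = ("blood", "suicide", "cannibalism")
    return 1.0 if any(t in title.lower() for t in flagged_terms) else 0.1

def moderate(title: str, user_flags: int) -> str:
    if classifier_score(title) > 0.9:
        return "blocked by filter"
    if user_flags > 0:
        return "queued for human review"  # user feedback escalates
    return "allowed"  # neither stage fired: this is how videos slip through

print(moderate("Peppa Pig compilation", user_flags=0))        # allowed
print(moderate("Minnie Mouse blood cartoon", user_flags=0))   # blocked
print(moderate("Weird finger family mash-up", user_flags=3))  # human review
```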

The question is, what terrible thing happened on YouTube Kids that prompted them to put this disclaimer on their homepage?

I was born in 1997 and have had access to a computer for as long as I can remember. As a kid, I played Freddie Fish and Putt-Putt computer games, eventually graduating to Neopets and Webkinz when we upgraded from dial-up internet. To my knowledge, social media didn’t even exist until Myspace picked up around 2008.

Technology evolved with me. I didn’t have a lot of restrictions placed on my internet consumption, but I really don’t think I needed them. Everything was so new back then that people hadn’t figured out how to exploit these platforms in a truly damaging way. The worst that could happen was accidentally stumbling across shock sites like Meatspin or Lemonparty.

Back then, YouTube was a harmless place for people to waste time. In 2008, FilmCow posted “Charlie the Unicorn” on its YouTube channel, with “Llamas with Hats” to follow a year later. Both videos became cult classics among middle schoolers of the time, and, admittedly, the videos are a little crude. Charlie gets his kidney stolen by the other unicorns, and Carl the llama has a problem with stabbing people.

But these videos are no worse than what people saw on Adult Swim, and they don’t come even remotely close to the damage done by the Logan Paul scandal earlier this year.

On YouTube today, children are being exploited for money. YouTubers with channels specifically marketed toward children are cranking out videos to give kids loads of content to consume, with each video running around 16 minutes long, which is the sweet spot for maximum ad revenue. Frankly, YouTubers are practically begging their viewers to “smash” that like button and comment on their videos.

I spent a weekend babysitting my brother’s children, and they spent most of that time watching channels like Chad Wild Clay. He would ask a question like, “Who is going to win this game?”, invite kids to post their predictions in the comments, and then proceed to play the game, giving them the answer in the same video. He’d do the same thing several times throughout the video.

What’s the point of the interactive bits if they can just skip ahead and get their answers without commenting at all? It’s simple: the more engagement the video gets, the more likely it is to be picked up by YouTube’s recommendation algorithm, thus bringing in more traffic and more money.
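
A toy scoring function makes the incentive concrete. The weights below are invented purely for illustration; the point is that if interactions are weighted at all, every extra comment nudges a video up the queue, which is why creators beg for them.

```python
# Hypothetical engagement score: weights are invented for illustration.
def engagement_score(views: int, likes: int, comments: int) -> float:
    return views + 5 * likes + 20 * comments

quiet = engagement_score(views=100_000, likes=2_000, comments=100)
baited = engagement_score(views=100_000, likes=2_000, comments=5_000)
print(quiet, baited)  # the comment-baited video wins, all else equal
```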

I’m not trying to be a curmudgeon about all this. It’s not like children’s television was any less brainless or exploitative when I was growing up. “Ed, Edd n Eddy” drove my mom nuts with how stupid it was, and shows like “Blue’s Clues” very often asked questions that were immediately answered. And besides gems like “Mister Rogers’ Neighborhood,” I’d also bet that cartoon companies really cared more about making money than they ever did about me. It’s all the same principles, just in a more modern format.

What’s most concerning is that now, through YouTube, these videos are not passed through a strict cens...

YouTube Kids Is Nowhere Near as Innocent As It Seems