Incident 281: YouTube's Algorithms Failed to Remove Violating Content Related to Suicide and Self-Harm

Suggested citation format

Lam, Khoa. (2019-02-04) Incident Number 281. in Lam, K. (ed.) Artificial Intelligence Incident Database. Responsible AI Collaborative.

Incident Stats

Incident ID: 281
Report Count: 3
Incident Date: 2019-02-04
Editors: Khoa Lam

Incident Reports

Speaking to The Telegraph, a former Tumblr blogger, who asked for anonymity, said she had to stop her own depression and anxiety help blog after she found herself “falling down the rabbit hole of content that triggered negative emotions”.

“I found it really easy to continuously fall back into bad habits and bad coping skills that only worsened my mental health,” she said.

“A few of my friends… were really frequent viewers of YouTube videos of people’s stories with eating disorders, depression and anxiety.

“I think it is really difficult to find that line between what is helpful and inspirational and what is triggering content.”

Writing in The Daily Telegraph last week, Digital Secretary Jeremy Wright said: “Social media companies clearly need to do more to ensure they are not promoting harmful content to vulnerable people.”

The Health Secretary Matt Hancock has written to executives at Facebook, Instagram, Twitter, Snapchat, Pinterest, Google and Apple ordering them to “step up and purge this content once and for all”.

A YouTube spokesman said: “We know many people use YouTube to find information, advice or support sometimes in the hardest of circumstances. We work hard to ensure our platforms are not used to encourage dangerous behaviour.

YouTube recommended self-harm videos to children as young as 13

Potentially harmful content has reportedly slipped through YouTube's moderation algorithms once again.

According to a report by The Telegraph on Monday, YouTube has been recommending videos that contain graphic images of self-harm to users as young as 13 years old.

The report contends that at least a dozen videos featuring the graphic images still exist on YouTube today, even though the video platform's policy is to ban videos that promote self-harm or suicide.

YouTube has reportedly taken down at least two videos flagged by The Telegraph, but others - including one entitled "My huge extreme self-harm scars" - remain.

In response to The Telegraph's report, YouTube said it removes videos that promote self-harm (which violate its terms of service), but may allow others that offer support.

The Telegraph also found search term recommendations such as "how to self-harm tutorial," "self-harming girls," and "self-harming guide." These recommendations were removed by YouTube once the company was notified, according to the report.

"We know many people use YouTube to find information, advice or support sometimes in the hardest of circumstances. We work hard to ensure our platforms are not used to encourage dangerous behavior," a YouTube spokesperson told Business Insider in a statement. "Because of this, we have strict policies that prohibit videos which promote self-harm and we will remove flagged videos that violate this policy."

YouTube does return numbers for users to call or text for help when search terms like "suicide" or "self-harm" are entered.

YouTube has long struggled with moderating disturbing and dangerous content on its platform.

Last December, The Times of London found the company failed to remove videos of child exploitation in a timely manner. In January, YouTube faced a wave of "Bird Box challenge" videos inspired by the Netflix drama, in which viewers - including one of the platform's biggest celebrities, Jake Paul - put themselves in dangerous situations while blindfolded. YouTube forced Paul to take down his video and gave the rest of the community two months to do so themselves.

And then there are the conspiracy theory videos, like claims that the Earth is flat, and potentially more harmful ones like phony cures for serious illnesses. In late January, YouTube said it would recommend less "borderline" content, and that it believed it had found a better way to stop such videos from spreading.

YouTube criticized for recommending 'self-harm' videos with graphic images

YouTube has been caught recommending "dozens" of graphic videos relating to self-harm to children.

The hugely popular video sharing app has been blasted for promoting dangerous clips to users as young as 13.

YouTube is under fire for failing to protect users from harmful content (Credit: Reuters)

Google-owned YouTube has come under fire for hosting inappropriate suicide-themed content before.

But a new report by The Telegraph found that YouTube has failed to clean up its act.

It found videos depicting "graphic images of self-harm", which were easily accessible to youngsters on the site.

One flagged clip titled "My huge extreme self-harm scars" is still live on the site, having racked up nearly 400,000 views over the last two years.

YouTube was also found to be offering worrying search term recommendations, including "how to self-harm tutorial", "self-harming girls", and "self-harming guide".

UK ministers are currently drawing up plans for tech giants to have a legal Duty of Care for young people using social media platforms.

There's growing pressure on mega-corporations like Google and Facebook to take more responsibility for the well-being of children on their sites.

In a statement, UK health secretary Matt Hancock said: "We are masters of our own fate as a nation, and we must act to ensure that this amazing technology is used for good, not leading to young girls taking their own lives."

A YouTube spokesperson issued a statement on the exposé, saying it "works hard" to prevent harmful videos and recommendations from turning up on the site.

"We know many people use YouTube to find information, advice or support sometimes in the hardest of circumstances," a YouTube spokesperson said.

"We work hard to ensure our platforms are not used to encourage dangerous behaviour.

"Because of this, we have strict policies that prohibit videos which promote self-harm and we will remove flagged videos that violate this policy.

"Our policies also prohibit autocomplete predictions for these topics, and we will remove any suggestions which don't comply with our policies."

YouTube also shows a phone number to contact suicide support charity Samaritans when users search for terms like "suicide" on the site.

But experts think YouTube simply isn't going far enough to tackle these issues.

Speaking to The Sun, Andy Burrows, Associate Head of Child Safety Online at the NSPCC, said: "It’s concerning that YouTube is actively recommending inappropriate videos to young people.

"YouTube, as with so many social media platforms, appears to be failing to abide by its own rules to keep children safe.

"The NSPCC’s Wild West Web campaign has for months been calling on Government to impose a statutory duty of care that finally forces social networks to truly protect children and be faced with tough punishments if they don’t. This is not an opportunity the Government can miss."

Last September, an investigation by The Sun revealed how YouTube was profiting from sick pranksters who post shocking fake suicide videos online.

We uncovered hundreds of shocking videos: some have millions of views, and have been live on YouTube for years.

Mental health charities warned The Sun that these videos could even inspire real suicides, due to the detailed methods shown in some clips.

One clip saw a woman fabricate her own death in a bathtub filled with fake blood, filming her husband's reaction when he returned home.

The woman's distressed partner weeps, cries out her name, and even steps into the bathtub to resuscitate her.

Another five-year-old video sees a Brit prankster fake an angry phone call before jumping into the Thames – an act that claims 25 lives a year – stunning onlookers. It has more than 3 million views.

Speaking to The Sun at the time, Brian Dow, managing director at Mental Health UK and co-chair of the National Suicide Prevention Alliance, said: "It really should not need stating that suicide is not a joke or a prank.

"Every day people lose parents, children, siblings and friends, and to see it trivialised in this way is both cruel and incredibly irresponsible.

"To present this very serious issue in this way can have immediate and lasting effects not only on the viewer, who might be triggered by what they see on screen, but also the victims of the ‘pranks’ that we see being performed."

At the time, a spokesperson for the Department for Digital, Culture, Media and Sport told The Sun: "Suicide is a very serious issue that affects millions of people every year.

"We would urge YouTube to consider whether videos that trivialise someone taking their own life should be on its platform."

That wasn't the first time YouTube had been exposed for hosting shocking content.

In December 2017, popular YouTube vlogger Logan Paul sparked controversy after filming the body of a suicide victim.

The clip, which was posted to YouTube, showed the body of a person who had hanged themselves in a forest in Japan.

Paul earned millions of views within hours, but was widely condemned. He eventually removed the video, issued an apology, and took a month-long break from YouTube.

The Sun has also uncovered a rogue steroids advert, a secret cache of porn, smut playlists designed to "lure kids", and webcam sex ads on YouTube.

Social media sites may soon be prosecuted for failing to protect kids from disturbing content online.

Instagram has admitted its failure to block self-harm and suicide pics.

And Facebook was recently caught paying children up to £15 a month to install a "spying app" that monitors everything they did online.

YouTube caught promoting deadly 'how to self harm' tutorials for youngsters aged 13