AI Incident Database

Report 2381

Related Incidents

Incident 160 · 3 Reports
Alexa Recommended Dangerous TikTok Challenge to Ten-Year-Old Girl

Incident 286 · 3 Reports
TikTok’s "For You" Allegedly Pushed Fatal “Blackout” Challenge Videos to Two Young Girls

Incident 279 · 3 Reports
TikTok’s “For You” Algorithm Exposed Young Users to Pro-Eating Disorder Content

TikTok self-harm study results ‘every parent’s nightmare’
theguardian.com · 2022

TikTok’s recommendation algorithm pushes self-harm and eating disorder content to teenagers within minutes of them expressing interest in the topics, research suggests.

The Center for Countering Digital Hate (CCDH) found that the video-sharing site will promote content including dangerously restrictive diets, pro-self-harm content and content romanticising suicide to users who show a preference for the material, even if they are registered as under-18s.

For its study, the campaign group set up accounts in the US, UK, Canada and Australia, each registered as 13 years old, the minimum age for joining the service. It created “standard” and “vulnerable” accounts, the latter containing the term “loseweight” in their usernames, which CCDH said reflected research showing that social media users who seek out eating disorder content often choose usernames containing related language.

The accounts “paused briefly” on videos about body image, eating disorders and mental health, and also liked them. This took place over a 30-minute initial period when the accounts launched, in an attempt to capture the effectiveness of TikTok’s algorithm that recommends content to users.

On “standard” accounts, content about suicide followed within nearly three minutes and eating disorder material was shown within eight minutes.

“The results are every parent’s nightmare,” said Imran Ahmed, CCDH’s chief executive. “Young people’s feeds are bombarded with harmful, harrowing content that can have a significant cumulative impact on their understanding of the world around them, and their physical and mental health.”

The group said the majority of mental health videos presented to its standard accounts via the For You feed – the main way TikTok users experience the app – consisted of users sharing their anxieties and insecurities.

Body image content was more harmful, the report said, with accounts registered for 13-year-olds being shown videos advertising weight loss drinks and “tummy-tuck” surgery. One animation that appeared in front of the standard accounts carried a piece of audio stating “I’ve been starving myself for you” and had more than 100,000 likes. The report said the accounts were shown self-harm or eating disorder videos every 206 seconds.

The researchers found that videos relating to body image, mental health and eating disorders were shown to “vulnerable” accounts three times more than to standard accounts. The vulnerable accounts received 12 times as many recommendations for self-harm and suicide-related videos as the standard accounts, the report said.

The recommended content was more extreme for the vulnerable accounts, including methods of self-harm and young people discussing plans to kill themselves. CCDH said a mental health or body image-related video was shown every 27 seconds, although the content was dominated by mental health videos, which CCDH defined as videos about anxieties, insecurities and mental health conditions, excluding eating disorders, self-harm and suicide.

The group said its research did not differentiate between content with a positive intent – such as content discussing recovery – and negative content.

A spokesperson for TikTok, which is owned by the Chinese firm ByteDance and has more than 1 billion users worldwide, said the CCDH study did not reflect the experience or viewing habits of real-life users of the app.

“We regularly consult with health experts, remove violations of our policies and provide access to supportive resources for anyone in need,” they said. “We’re mindful that triggering content is unique to each individual and remain focused on fostering a safe and comfortable space for everyone, including people who choose to share their recovery journeys or educate others on these important topics.”

TikTok’s guidelines ban content that promotes behaviour that could lead to suicide and self-harm, as well as material that promotes unhealthy eating behaviours or habits.

The UK’s online safety bill proposes requiring social networks to take action against so-called “legal but harmful” content being shown to children.

A DCMS spokesperson said: “We are putting a stop to unregulated social media causing harm to our children. Under the Online Safety Bill, tech platforms will need to prevent under-18s from being exposed to illegal content assisting suicide and protect them from other harmful or age-inappropriate material, including the promotion of self-harm and eating disorders, or face huge fines.”
