Citation record for Incident 286

Description: TikTok’s recommendation algorithm was alleged in a lawsuit to have intentionally and repeatedly pushed videos of the “blackout” challenge onto children’s feeds, incentivizing their participation, which ultimately resulted in the deaths of two young girls.

Incident Stats

Incident ID
286
Report Count
3
Incident Date
2021-02-26
Editors
Khoa Lam
Families sue TikTok after girls died while trying ‘blackout challenge’
theguardian.com · 2022

The families of two young girls who allegedly died as a result of a viral TikTok challenge have sued the social media platform, claiming its “dangerous” algorithms are to blame for their children’s deaths.

Parents of two girls who died in a…

Parents Sue TikTok, Saying Children Died After Viewing ‘Blackout Challenge’
nytimes.com · 2022

The parents of two girls who said their children died as a result of a “blackout challenge” on TikTok are suing the company, claiming its algorithm intentionally served the children dangerous content that led to their deaths.

The girls were…

TikTok self-harm study results ‘every parent’s nightmare’
theguardian.com · 2022

TikTok’s recommendation algorithm pushes self-harm and eating disorder content to teenagers within minutes of them expressing interest in the topics, research suggests.

The Center for Countering Digital Hate (CCDH) found that the video-shar…

Variants

A “variant” is an incident that shares the same causative factors as an existing AI incident, produces similar harms, and involves the same intelligent systems. Rather than indexing variants as entirely separate incidents, we list them as variations under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting as evidence external to the Incident Database. See this research paper for more details.