Citation record for Incident 417

Description: Internal research at Facebook found that its feed algorithms harmed users with low digital literacy by exposing them to disturbing content they did not know how to avoid or control.
Alleged: AI system developed and deployed by Facebook, which harmed low digitally skilled Facebook users.

Incident Stats

Incident ID
417
Report Count
4
Incident Date
2019-11-15
Editors
Khoa Lam

Incident Reports

Why Some People See More Disturbing Content on Facebook Than Others, According to Leaked Documents
time.com · 2021

Some users are significantly more likely to see disturbing content on Facebook than others, according to internal company documents leaked by whistleblower Frances Haugen.

A 2019 report from Facebook’s Civic Integrity team details the resul…

Facebook fed posts with violence and nudity to people with low digital literacy
usatoday.com · 2021
  • Facebook studies said algorithms harmed users with low tech skills with repeated disturbing content.
  • Some users did not understand how content came to appear in their feeds or how to control it.
  • These users were often older, people of colo…

Facebook Exposed Its Less Digital Conversant Audience To Graphic Content
screenrant.com · 2021

Facebook's track record with content available on its platform is nothing to envy, but for users who are not well-versed in social media tools, the platform dished out more disturbing content that could be anything from graphically …

Facebook’s Latest Scandal: Exposing Low Digitally Skilled Users to More Violent and Adult Content
visiontimes.com · 2021

Facebook has been dealing with scandal after scandal for some time and has come under intense scrutiny from global lawmakers and regulators. According to a recent report, users with low digital literacy skills have become the latest victims…

Variants

A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than being indexed as fully independent incidents, variants are listed as variations under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting as evidence from outside the Incident Database. Learn more from this research paper.