AI Incident Database

Report 1545

Related Incidents

Incident 169 · 5 Reports
Facebook Allegedly Failed to Police Anti-Rohingya Hate Speech Content That Contributed to Violence in Myanmar

Rohingya refugees sue Facebook for $150 billion over Myanmar violence
reuters.com · 2021

Dec 6 (Reuters) - Rohingya refugees from Myanmar are suing Meta Platforms Inc (FB.O), formerly known as Facebook, for $150 billion over allegations that the social media company did not take action against anti-Rohingya hate speech that contributed to violence.

A U.S. class-action complaint, filed in California on Monday by law firms Edelson PC and Fields PLLC, argues that the company's failures to police content and its platform's design contributed to real-world violence faced by the Rohingya community.

In a coordinated action, British lawyers also submitted a letter of notice to Facebook's London office.

A Meta spokesperson said in a statement: "We're appalled by the crimes committed against the Rohingya people in Myanmar. We've built a dedicated team of Burmese speakers, banned the Tatmadaw (Myanmar military), disrupted networks manipulating public debate and taken action on harmful misinformation to help keep people safe. We've also invested in Burmese-language technology to reduce the prevalence of violating content."

The company has previously said it was "too slow to prevent misinformation and hate" in Myanmar.

A Myanmar junta spokesman did not answer phone calls from Reuters seeking comment on the legal action against Facebook.

In 2018, U.N. human rights investigators said the use of Facebook had played a key role in spreading hate speech that fueled the violence.

A Reuters investigation that year, cited in the U.S. complaint, found more than 1,000 examples of posts, comments and images attacking the Rohingya and other Muslims on Facebook. Almost all were in the main local language, Burmese.

The invective included posts calling the Rohingya or other Muslims dogs, maggots and rapists, suggesting they be fed to pigs, and urging that they be shot or exterminated.

The posts were tolerated in spite of Facebook rules that specifically prohibit attacking ethnic groups with "violent or dehumanizing speech" or comparing them to animals.

Facebook has said it is protected from liability over content posted by users by a U.S. internet law known as Section 230, which holds that online platforms are not liable for content posted by third parties. The complaint says it seeks to apply Myanmar law to the claims if Section 230 is raised as a defense.

Although U.S. courts can apply foreign law to cases where the alleged harms and activity by companies took place in other countries, two legal experts interviewed by Reuters said they did not know of a successful precedent for foreign law being invoked in lawsuits against social media companies where Section 230 protections could apply.

Anupam Chander, a professor at Georgetown University Law Center, said that invoking Myanmar law wasn't "inappropriate." But he predicted that "it's unlikely to be successful," saying that "it would be odd for Congress to have foreclosed actions under U.S. law but permitted them to proceed under foreign law."

More than 730,000 Rohingya Muslims fled Myanmar's Rakhine state in August 2017 after a military crackdown that refugees said included mass killings and rape. Rights groups documented killings of civilians and burning of villages.

Myanmar authorities say they were battling an insurgency and deny carrying out systematic atrocities.

The International Criminal Court has opened a case into the accusations of crimes in the region. In September, a U.S. federal judge ordered Facebook to release records of accounts connected to anti-Rohingya violence in Myanmar that the social media giant had shut down.

The new class-action lawsuit references claims by Facebook whistleblower Frances Haugen, who leaked a cache of internal documents this year, that the company does not police abusive content in countries where such speech is likely to cause the most harm.

The complaint also cites recent media reports, including a Reuters report last month, that Myanmar's military was using fake social media accounts to engage in what is widely referred to in the military as "information combat."

Mohammed Taher, a refugee living in the camps in Bangladesh that are home to more than a million Rohingya, said Facebook had been widely used to spread anti-Rohingya propaganda.

"We welcome the move," he said by phone.

Read the Source


2024 - AI Incident Database
