AI Incident Database

Report 2642

Related Incidents

Incident 471 · 8 Reports
Facebook Allegedly Failed to Police Hate Speech Content That Contributed to Ethnic Violence in Ethiopia

Open letter to Facebook on violence-inciting speech: act now to protect Ethiopians
accessnow.org · 2020

Civil society organizations, human rights defenders, activists, and individuals from around the world are calling on Facebook to immediately enact the recommendations listed in this letter and take action to stop its services from being used to incite violence, propagate hatred, and advance discrimination in Ethiopia.

Update (December 18, 2020): Facebook sent Access Now a response to our letter. Read it here.

Update (August 6 and July 27, 2020): The language in this letter has been edited slightly for clarity.

July 27, 2020

To Facebook:

The undersigned – activists, journalists, human rights organizations – call on you to stop the spread of violence and hate-inciting speech on your services in Ethiopia. On June 29th, 2020,  Haacaaluu Hundeessa, an Oromo musician and social activist, was shot and killed in Addis Ababa. Following his death, parts of Ethiopia were engulfed in protests, unrest, and violence. Since then, more than 160 people have been killed. The state-ordered internet shutdown has obscured any human rights violations perpetrated by government authorities and others, and prevented reporting and documentation.

The offline troubles that rocked the country are fully visible in the online space. The actors instigating violence offline also incite violence and propagate hate online. Content shared in text, livestream, and other formats has called for violence, discrimination, and the destruction of property belonging to different ethnic groups. This is not the first time this has happened; previous incidents have led to similar situations.

We understand that incitement to violence is a complex issue in which government action – or lack thereof – plays a key role. Companies, including those that provide and curate a platform for communication, have a responsibility under human rights law “to prevent or mitigate adverse human rights impacts that are directly linked to their […] services” and have the obligation to “[remedy] any adverse human rights impacts they cause or to which they contribute.” So far, Facebook has failed to prevent the escalation of incitement to violence on its services, particularly in Ethiopia.

This is not the first time Facebook has neglected its responsibility to respect human rights, or to offer remedy for abuses to the extent it has contributed to them, in Ethiopia or other parts of the world. For instance, according to the Independent International Fact-Finding Mission on Myanmar, Facebook was used to campaign against and spread anti-Rohingya Muslim sentiment, and the company has admitted that “[it wasn’t] doing enough to help prevent [the] platform from being used to foment division and incite offline violence” specifically against the Rohingya Muslims. This lack of action and its effects are still evident in Myanmar and in the places where Rohingya refugees reside.

As human rights organizations, journalists, and activists, we are seeing the negative impact that violence-inciting content on Facebook has on the communities we serve. This content can lead to physical violence and other acts of hostility and discrimination against minority groups. Under international human rights law, content that meets the threshold of incitement to violence, hostility, and discrimination does not fall within the protective scope of the right to freedom of expression. Despite the real risk it carries for minority groups and others in Ethiopia, such content remains online and visible on the platform.

Concerned individuals, organizations, and communities have repeatedly warned Facebook and other social media platforms, in private and in public, about the imminence of the escalation we are seeing now. David Kaye, the former U.N. Special Rapporteur on Freedom of Expression, during his last visit to Ethiopia, called on social media platforms, including Facebook, to regularly engage with Ethiopian authorities and civil society. Specifically, he requested that “at a minimum, [social media platforms] establish regular and rapid-reaction mechanisms to enable civil society to report the most concerning kinds of content on their platforms.”

We call on Facebook to address incitement to violence in Ethiopia as a priority and take these immediate and long-term mitigation measures:

Immediate actions:

  • Make content reporting on Facebook services fully available in the Afaan Oromo and Tigrinya languages (it is already available in Amharic and English).
  • Do not allow amplification of content that incites violence and discrimination, whether by content recommendation systems that are in use or as a result of targeted advertising practices.
  • Consider transparent and temporary changes to limit massive sharing functionalities in specific cases where there is an imminent risk of human rights abuse. Additionally, add specific, transparent, and temporary modifications to user interfaces to add tags to content that help users contextualize information.
  • Preserve restricted content that Facebook makes unavailable that incites violence and discrimination, as it could serve as evidence for victims and organizations seeking to hold perpetrators accountable. Ensure said content is available to victims, organizations, and international and national judicial authorities without undue delay. 
  • Inform Ethiopian users about reporting mechanisms and relevant platform rules or guidelines. Reporting incitement to violence should be easy and intuitive. Facebook can achieve this, among other ways, by making adjustments to its user interfaces, pinned news feed announcements, etc. Facebook should invest in advertising, even in traditional media, to make sure that users know about these mechanisms and are able to use them.
  • Establish early warning systems for emergency escalation that will help detect imminent harm to the physical security of individuals. Facebook should develop these systems in close cooperation with all relevant stakeholders operating at the grassroots level in Ethiopia, including civil society organizations and human rights experts, and should ensure they enable trusted national partners to evaluate their performance regularly.

Essential long-term mitigation efforts:

  • Add significant resources to rights-respecting content moderation efforts. Facebook should ensure that it recruits a sufficient number of content reviewers with the required skills in all languages spoken in Ethiopia. The local-language content reviewers should also demonstrate sufficient understanding of the national political, social, historical, and cultural context. The ratio of Facebook moderators has to correspond adequately to the number of Facebook users in Ethiopia; sources estimate that Facebook ads can reach up to 6 to 7 million users in the country.
  • Content-moderation activities concerning Ethiopia should be coordinated by individuals that understand the dynamic local context, and they should be able to coordinate locally. These individuals should have the capacity to appropriately identify and assess calls for violence and must be able to take the necessary measures to mitigate harm.
  • Enforce meaningful and robust transparency initiatives about policies, standards, and practices for the identification, removal, or other restriction of online content that incites violence, propagates hate, or advances discrimination, in accordance with human rights law. Such data should be publicly released on a regular basis and, at minimum, should contain the following information: the number and type of content violations, the number of complaints received and average response time, and the number of content removals as well as pages and accounts disabled.
  • Facilitate access to the internet for trusted partners during any internet shutdowns. Trusted partners can’t collaborate with Facebook to report content if they are deliberately cut off from the platform or the internet. Even though the majority of Ethiopians do not have access to the internet, the diaspora can still propagate violence-inciting content.
  • Actively support and help to develop initiatives that promote human rights, tolerance, diversity, and equality for all people in Ethiopia. Engage in efforts promoting media literacy and help users distinguish credible news sources from propaganda and disinformation. In doing so, develop strong and continuous cooperation with trusted partners, independent media organizations, individuals, and flaggers on the ground, especially when matters of the highest public interest are at stake or if activities are likely to escalate violence.
  • Coordinate the efforts of global, regional, and local offices and staff to provide timely and coherent decision-making, led by human rights officers that are well versed in the dynamic context in Ethiopia. There are clear gaps between the Africa Policy Program managers who engage with trusted partners, civil society in Ethiopia, and the Facebook team that handles public policy issues. But the involvement of experts specializing in human rights issues should be a priority, with cross-functional reach to product, engineering, marketing, emergencies, elections, and other teams as necessary.
  • Finally, we call on Facebook and other dominant social media platforms, including private messaging services, to conduct in-depth human rights impact assessments for their products, policies, and operations, based on the national context, before they enter any new market. These assessments should be publicly available and translated to relevant local languages. This is particularly important for regions of the world that suffer from volatile ethnic, religious, political, or other social tensions. Private actors should employ measures to reduce risks as much as possible.

These requests do not preclude other efforts Facebook should make to help end incitement to violence in Ethiopia. We propose an initial set of long-overdue measures that must be implemented urgently.

2024 - AI Incident Database

  • 利用規約
  • プライバシーポリシー
  • Open twitterOpen githubOpen rssOpen facebookOpen linkedin
  • e1b50cd