Incident 82: #LekkiMassacre: Why Facebook labelled content from October 20 incident ‘false’

Description: Facebook incorrectly labels content relating to an incident between #EndSARS protestors and the Nigerian army as misinformation.

Alleged: Facebook developed and deployed an AI system, which harmed Facebook users and Facebook users interested in the Lekki Massacre incident.

Incident Stats

Incident ID
82
Report Count
1
Incident Date
2020-10-21
Editors
Sean McGregor, Khoa Lam

CSETv0 Taxonomy Classifications

Taxonomy Details

Full Description

In October 2020, Facebook incorrectly labelled content from the #LekkiMassacre2020 incident as false. The incident consisted of an interaction between #EndSARS protestors and the Nigerian army at the Lekki toll gate in Lagos, Nigeria. Images and videos posted on social media related to the event were marked as false by Facebook's fact-checking system, which is a hybrid of human moderators, third-party fact-checking organizations, and AI.

Short Description

Facebook incorrectly labels content relating to an incident between #EndSARS protestors and the Nigerian army as misinformation.

Severity

Moderate

Harm Type

Harm to social or political systems, Harm to civil liberties

AI System Description

Facebook's content moderation system is a hybrid of AI and human moderators. The AI assists in detecting hate speech, prioritizing the review queue so moderators can deal with sensitive content more quickly, and detecting content similar to items already marked as containing 'false information'.
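The similarity-matching step described above can be sketched in miniature. This is a hypothetical illustration, not Facebook's actual implementation: once a fact-checked item is labelled 'false', an automated matcher propagates that label to near-duplicate content by comparing feature vectors (standing in here for learned image or text embeddings). The function names, threshold, and toy vectors are all assumptions for illustration.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def propagate_labels(fact_checked, incoming, threshold=0.9):
    """Flag incoming items whose vectors closely match a fact-checked item."""
    flagged = []
    for item_id, vec in incoming:
        for _, checked_vec in fact_checked:
            if cosine_similarity(vec, checked_vec) >= threshold:
                flagged.append(item_id)
                break
    return flagged

# Toy vectors standing in for embeddings of posted images/videos.
checked = [("post_a", [1.0, 0.0, 0.2])]
new_posts = [("post_b", [0.98, 0.05, 0.21]),  # near-duplicate of post_a
             ("post_c", [0.0, 1.0, 0.0])]     # unrelated content
print(propagate_labels(checked, new_posts))   # the near-duplicate is flagged
```

A system like this explains how a single incorrect fact-check can cascade: every near-duplicate of the mislabelled item inherits the 'false' label automatically.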

System Developer

Facebook

Sector of Deployment

Information and communication

Relevant AI functions

Perception, Cognition, Action

AI Techniques

machine learning, supervised learning, open-source

AI Applications

content moderation, recommendation engine

Location

Lagos, Nigeria

Named Entities

Facebook, #EndSARS, Nigerian Army, Nigeria

Technology Purveyor

Facebook

Beginning Date

2020-10-21T07:00:00.000Z

Ending Date

2020-10-21T07:00:00.000Z

Near Miss

Unclear/unknown

Intent

Accident

Lives Lost

No

Data Inputs

user content (text posts, images, videos)

#LekkiMassacre: Why Facebook labelled content from October 20 incident ‘false’
techpoint.africa · 2020

On Wednesday, October 21, 2020, several pieces of content containing images related to the unfortunate incident that occurred at the Lekki toll gate in Lagos, Nigeria on Tuesday were flagged as misinformation on Facebook and Instagram.

The shocking i…

Variants

A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.