Incident 158: Facial Recognition in Remote Learning Software Reportedly Failed to Recognize a Black Student’s Face

Description: A Black student's face was not recognized by remote-proctoring software during check-in for a lab quiz, forcing her to repeatedly change her environment before the software would work as intended.
Alleged: An unknown party developed and deployed an AI system, which harmed Amaya Ross, Black students, and Black test-takers.

Suggested citation format

Anonymous. (2021-02-01) Incident Number 158. In McGregor, S. (ed.) Artificial Intelligence Incident Database. Responsible AI Collaborative.

Incident Stats

Incident ID
158
Report Count
1
Incident Date
2021-02-01
Editors
Sean McGregor, Khoa Lam

Reports Timeline

Incident Occurrence: Amaya's Flashlight

Incident Reports

We believe in people and causes that make our world better. Students of colour are getting flagged to their teachers because testing software can't see them. Mozilla approached us to help tell Amaya's story of encountering software that failed to recognize her because of her skin tone. Testing and recognition software is essential for students across the world to take tests and attend classes. Did you know that some facial detection software used by schools fails to recognize non-white faces over half the time? Music and sound design by Jeff Moberg.

Amaya's Flashlight
