Incident 124: Algorithmic Health Risk Scores Underestimated Black Patients’ Needs

Description: Researchers revealed that an Optum algorithm deployed by a large academic hospital under-predicted the health needs of Black patients, effectively de-prioritizing them for extra care programs relative to white patients with the same health burden.

Alleged: Optum developed an AI system, deployed by an unnamed large academic hospital, which harmed Black patients.

Incident Stats

Incident ID: 124
Editors: Sean McGregor, Khoa Lam
A Health Care Algorithm Offered Less Care to Black Patients · 2019

Care for some of the sickest Americans is decided in part by algorithm. New research shows that software guiding care for tens of millions of people systematically privileges white patients over black patients. Analysis of records from a ma…

Racial bias in a medical algorithm favors white patients over sicker black patients · 2019

A widely used algorithm that predicts which patients will benefit from extra medical care dramatically underestimates the health needs of the sickest black patients, amplifying long-standing racial disparities in medicine, researchers have …

Millions of black people affected by racial bias in health-care algorithms · 2019

An algorithm widely used in US hospitals to allocate health care to patients has been systematically discriminating against black people, a sweeping analysis has found.

The study, published in Science on 24 October, concluded that the algor…

New York Insurance Regulator to Probe Optum Algorithm for Racial Bias · 2019

New York's Financial Services and Health departments sent a letter to UnitedHealth Group’s CEO David Wichmann Friday regarding an algorithm developed by Optum, The Wall Street Journal reported. The investigation is in response to a study pu…

These Algorithms Look at X-Rays—and Somehow Detect Your Race · 2021

Millions of dollars are being spent to develop artificial intelligence software that reads x-rays and other medical scans in hopes it can spot things doctors look for but sometimes miss, such as lung cancers. A new study reports that these …

'Racism is America’s oldest algorithm': How bias creeps into health care AI · 2022

Artificial intelligence and medical algorithms are deeply intertwined with our modern health care system. These technologies mimic the thought processes of doctors to make medical decisions and are designed to help providers determine who n…

Algorithms Are Making Decisions About Health Care, Which May Only Worsen Medical Racism · 2022

Artificial intelligence (AI) and algorithmic decision-making systems — algorithms that analyze massive amounts of data and make predictions about the future — are increasingly affecting Americans’ daily lives. People are compelled to includ…


A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.
