Incident 124: Algorithmic Health Risk Scores Underestimated Black Patients’ Needs

Description: Optum's algorithm, deployed by a large academic hospital, was found by researchers to under-predict the health needs of Black patients, effectively de-prioritizing them for extra care programs relative to white patients with the same health burden.

Alleged: Optum developed an AI system deployed by an unnamed large academic hospital, which harmed Black patients.

Incident Stats

Incident ID: 124
Report Count:
Incident Date:
Editors: Sean McGregor, Khoa Lam


A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.

Similar Incidents

By textual similarity
