Description: A study published in JAMA Network Open reveals that racial bias built into a commonly used medical diagnostic algorithm for lung function may be leading to underdiagnosis of breathing problems in Black men. The study suggests that as many as 40% more Black male patients might have been accurately diagnosed if the software were not racially biased. The algorithm adjusts diagnostic thresholds based on race, affecting medical treatments and interventions.
Entities
Alleged: unknown developed an AI system deployed by University of Pennsylvania Health System, which harmed Black men who underwent lung function tests between 2010 and 2020 and who potentially received inaccurate or delayed diagnoses and medical interventions due to the biased algorithm.
Incident Stats
Incident ID
582
Report Count
1
Incident Date
2023-06-01
Editors
Daniel Atherton
Incident Reports
Reports Timeline
apnews.com · 2023
NEW YORK (AP) — Racial bias built into a common medical test for lung function is likely leading to fewer Black patients getting care for breathing problems, a study published Thursday suggests.
As many as 40% more Black male patients in th…
Variants
A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.