Citation record for Incident 124

Description: Optum's algorithm, deployed by a large academic hospital, was revealed by researchers to have under-predicted the health needs of Black patients, effectively de-prioritizing them in extra care programs relative to white patients with the same health burden.
Alleged: An AI system developed by Optum and deployed by an unnamed large academic hospital, which affected Black patients.

Incident Status

Incident ID
124
Report Count
7
Incident Date
2019-10-24
Editors
Sean McGregor, Khoa Lam
A Health Care Algorithm Offered Less Care to Black Patients
wired.com · 2019

Care for some of the sickest Americans is decided in part by algorithm. New research shows that software guiding care for tens of millions of people systematically privileges white patients over black patients. Analysis of records from a ma…

Racial bias in a medical algorithm favors white patients over sicker black patients
washingtonpost.com · 2019

A widely used algorithm that predicts which patients will benefit from extra medical care dramatically underestimates the health needs of the sickest black patients, amplifying long-standing racial disparities in medicine, researchers have …

Millions of black people affected by racial bias in health-care algorithms
nature.com · 2019

An algorithm widely used in US hospitals to allocate health care to patients has been systematically discriminating against black people, a sweeping analysis has found.

The study, published in Science on 24 October, concluded that the algor…

New York Insurance Regulator to Probe Optum Algorithm for Racial Bias
fiercehealthcare.com · 2019

New York's Financial Services and Health departments sent a letter to UnitedHealth Group’s CEO David Wichmann Friday regarding an algorithm developed by Optum, The Wall Street Journal reported. The investigation is in response to a study pu…

These Algorithms Look at X-Rays-and Somehow Detect Your Race
wired.com · 2021

Millions of dollars are being spent to develop artificial intelligence software that reads x-rays and other medical scans in hopes it can spot things doctors look for but sometimes miss, such as lung cancers. A new study reports that these …

'Racism is America’s oldest algorithm': How bias creeps into health care AI
statnews.com · 2022

Artificial intelligence and medical algorithms are deeply intertwined with our modern health care system. These technologies mimic the thought processes of doctors to make medical decisions and are designed to help providers determine who n…

Algorithms Are Making Decisions About Health Care, Which May Only Worsen Medical Racism
aclu.org · 2022

Artificial intelligence (AI) and algorithmic decision-making systems — algorithms that analyze massive amounts of data and make predictions about the future — are increasingly affecting Americans’ daily lives. People are compelled to includ…

Variants

A "variant" is an incident that shares the same causative factors as an existing AI incident, produces similar harms, and involves the same intelligent system. Rather than being indexed as fully independent incidents, variants are listed as variations of the similar incident first submitted to the database. Unlike other submission types in the incident database, variants are not required to have reporting as evidence from outside the incident database. See this research paper for more details.

Similar Incidents

By textual similarity
