Citation Information for Incident 81

Description: A study by the University of Toronto, the Vector Institute, and MIT showed that the databases used to train AI systems for classifying chest X-rays led those systems to exhibit gender, socioeconomic, and racial biases.

Incident Stats

Incident ID
81
Report Count
1
Incident Date
2020-10-21
Editors
Sean McGregor, Khoa Lam

CSETv0 Taxonomy Classifications

Taxonomy Details

Full Description

A study by the University of Toronto, the Vector Institute, and MIT showed that the databases used to train AI systems for classifying chest X-rays led those systems to exhibit gender, socioeconomic, and racial biases. Google and startups like Qure.ai, Aidoc, and DarwinAI offer systems that scan chest X-rays to estimate the likelihood of conditions like fractures and collapsed lungs. The databases used to train the AI were found to consist primarily of examples from white patients (67.64%), making the diagnostic systems more accurate at diagnosing white patients than other patients. Black patients were half as likely to be recommended for further care when it was needed.
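To make the kind of disparity described above concrete, below is a minimal sketch of a subgroup audit: it compares how often a classifier recommends follow-up care for patients who actually need it, broken down by demographic group. The function, field names, and records are hypothetical illustrations, not the study's actual code or data.

# A minimal subgroup-audit sketch in Python. All names and data below are
# hypothetical; the study's own code and datasets are not reproduced here.
from collections import defaultdict

def followup_rate_by_group(records):
    # Per-group true-positive rate: among patients who needed care,
    # the fraction the model recommended for follow-up.
    needed = defaultdict(int)
    recommended = defaultdict(int)
    for r in records:
        if r["needs_care"]:
            needed[r["group"]] += 1
            if r["model_recommends_followup"]:
                recommended[r["group"]] += 1
    return {g: recommended[g] / needed[g] for g in needed}

# Hypothetical records illustrating the reported pattern, where one group
# is recommended for needed care half as often as another:
records = [
    {"group": "A", "needs_care": True, "model_recommends_followup": True},
    {"group": "A", "needs_care": True, "model_recommends_followup": True},
    {"group": "B", "needs_care": True, "model_recommends_followup": True},
    {"group": "B", "needs_care": True, "model_recommends_followup": False},
]
print(followup_rate_by_group(records))  # {'A': 1.0, 'B': 0.5}

A gap between groups on this metric is one way to quantify the underdiagnosis disparity the study reports.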

Short Description

A study by the University of Toronto, the Vector Institute, and MIT showed that the databases used to train AI systems for classifying chest X-rays led those systems to exhibit gender, socioeconomic, and racial biases.

Severity

Unclear/unknown

Harm Distribution Basis

Race, Sex, Financial means

Harm Type

Harm to physical health/safety

AI System Description

Google and startup companies Qure.ai, Aidoc, and DarwinAI, which use AI systems to analyze medical imagery

System Developer

Google

Sector of Deployment

Human health and social work activities

Relevant AI functions

Perception, Cognition

AI Techniques

medical image processor

AI Applications

image classification

Named Entities

MIT, Mount Sinai Hospital, University of Toronto, Vector Institute, Google, Qure.ai, Aidoc, DarwinAI

Technology Purveyor

Google

Beginning Date

2020-10-21

Ending Date

2020-10-21

Near Miss

Unclear/unknown

Intent

Unclear

Lives Lost

No

Infrastructure Sectors

Healthcare and public health

Data Inputs

medical imagery databases

CSETv1 Taxonomy Classifications

Taxonomy Details

Harm Distribution Basis

race, sex, financial means, age

Sector of Deployment

human health and social work activities

Researchers find evidence of racial, gender, and socioeconomic bias in chest X-ray classifiers
venturebeat.com · 2020

Google and startups like Qure.ai, Aidoc, and DarwinAI are developing AI and machine learning systems that classify chest X-rays to help identify conditions like fractures and collapsed lungs. Several hospitals, including Mount Sinai, have p…

Variants

A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than being indexed as entirely separate incidents, variants are listed under the first similar incident submitted to the database as variations of that incident. Unlike other submission types in the incident database, variants are not required to have reporting as evidence external to the incident database. See this research paper for more details.
