Citation record for Incident 81

Description: A study by the University of Toronto, the Vector Institute, and MIT showed that the datasets used to train AI systems to classify chest X-rays led those systems to exhibit gender, socioeconomic, and racial biases.


Incident Stats

ID
81
Report Count
1
Incident Date
2020-10-21
Editors
Sean McGregor, Khoa Lam

CSETv0 Taxonomy Classifications

Taxonomy Details

Full Description

A study by the University of Toronto, the Vector Institute, and MIT showed that the datasets used to train AI systems to classify chest X-rays led those systems to exhibit gender, socioeconomic, and racial biases. Google and startups such as Qure.ai, Aidoc, and DarwinAI offer systems that scan chest X-rays to estimate the likelihood of conditions such as fractures and collapsed lungs. The databases used to train the AI were found to consist primarily of examples from white patients (67.64%), leading the diagnostic systems to be more accurate when diagnosing white patients than other patients. Black patients were half as likely to be recommended for further care when it was needed.

Short Description

A study by the University of Toronto, the Vector Institute, and MIT showed that the datasets used to train AI systems to classify chest X-rays led those systems to exhibit gender, socioeconomic, and racial biases.

Severity

Unclear/unknown

Harm Distribution Basis

Race, Sex, Financial means

Harm Type

Harm to physical health/safety

AI System Description

Google and the startup companies Qure.ai, Aidoc, and DarwinAI, which use AI systems to analyze medical imagery

System Developer

Google

Sector of Deployment

Human health and social work activities

Relevant AI functions

Perception, Cognition

AI Techniques

medical image processor

AI Applications

image classification

Named Entities

MIT, Mount Sinai Hospital, University of Toronto, Vector Institute, Google, Qure.ai, Aidoc, DarwinAI

Technology Purveyor

Google

Beginning Date

2020-10-21T07:00:00.000Z

Ending Date

2020-10-21T07:00:00.000Z

Near Miss

Unclear/unknown

Intent

Unclear

Lives Lost

No

Infrastructure Sectors

Healthcare and public health

Data Inputs

medical imagery databases

CSETv1 Taxonomy Classifications

Taxonomy Details

Harm Distribution Basis

race, sex, financial means, age

Sector of Deployment

human health and social work activities

Researchers find evidence of racial, gender, and socioeconomic bias in chest X-ray classifiers
venturebeat.com · 2020

Google and startups such as Qure.ai, Aidoc, and DarwinAI are developing AI and machine learning systems that classify chest X-rays to help identify conditions such as fractures and collapsed lungs…

Variants

A "Variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than indexing variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting as evidence external to the incident database. Learn more from the research paper.

Similar Incidents

By text similarity
