Description: A study by the University of Toronto, the Vector Institute, and MIT showed that the datasets used to train AI systems to classify chest X-rays led those systems to exhibit gender, socioeconomic, and racial biases.
Entities
Alleged: an AI system developed by Google, Qure.ai, Aidoc, and DarwinAI and deployed by Mount Sinai Hospitals harmed patients of minority groups, low-income patients, female patients, Hispanic patients, and patients with Medicaid insurance.
Incident Stats
ID: 81
Report Count: 1
Incident Date: 2020-10-21
Editors: Sean McGregor, Khoa Lam
Applied Taxonomies
CSETv0 Taxonomy Classifications
Taxonomy Details
Problem Nature
Indicates which, if any, of the following types of AI failure describe the incident: "Specification," i.e. the system's behavior did not align with the true intentions of its designer, operator, etc; "Robustness," i.e. the system operated unsafely because of features or changes in its environment, or in the inputs the system received; "Assurance," i.e. the system could not be adequately monitored or controlled during operation.