AI Incident Database
Entity: female patients

Incidents Harmed By

Incident 81 · 1 Report
Researchers find evidence of racial, gender, and socioeconomic bias in chest X-ray classifiers

2020-10-21

A study by researchers at the University of Toronto, the Vector Institute, and MIT found that the datasets used to train AI chest X-ray classifiers caused the systems to exhibit gender, socioeconomic, and racial biases.

Related Entities
Other entities involved in the same incident. For example, if this entity is the developer in an incident and another entity is the deployer, the two are marked as related entities.

Mount Sinai Hospitals

Incidents involved as Deployer
  • Incident 81 · 1 Report: Researchers find evidence of racial, gender, and socioeconomic bias in chest X-ray classifiers

Google

Incidents involved as Developer
  • Incident 81 · 1 Report: Researchers find evidence of racial, gender, and socioeconomic bias in chest X-ray classifiers

Qure.ai

Incidents involved as Developer
  • Incident 81 · 1 Report: Researchers find evidence of racial, gender, and socioeconomic bias in chest X-ray classifiers

Aidoc

Incidents involved as Developer
  • Incident 81 · 1 Report: Researchers find evidence of racial, gender, and socioeconomic bias in chest X-ray classifiers

DarwinAI

Incidents involved as Developer
  • Incident 81 · 1 Report: Researchers find evidence of racial, gender, and socioeconomic bias in chest X-ray classifiers

patients of minority groups

Incidents Harmed By
  • Incident 81 · 1 Report: Researchers find evidence of racial, gender, and socioeconomic bias in chest X-ray classifiers

low-income patients

Incidents Harmed By
  • Incident 81 · 1 Report: Researchers find evidence of racial, gender, and socioeconomic bias in chest X-ray classifiers

Hispanic patients

Incidents Harmed By
  • Incident 81 · 1 Report: Researchers find evidence of racial, gender, and socioeconomic bias in chest X-ray classifiers

patients with Medicaid insurance

Incidents Harmed By
  • Incident 81 · 1 Report: Researchers find evidence of racial, gender, and socioeconomic bias in chest X-ray classifiers
