AI Incident Database
Entities

patients

Incidents Harmed By

Incident 5 (11 Reports)
Collection of Robotic Surgery Malfunctions

2015-07-13

Study of database reports of robotic surgery malfunctions (8,061) between 2000 and 2013, including those ending in injury (1,391) and death (144).

Incident 1324 (4 Reports)
Pieces Technologies' Clinical AI Systems Allegedly Marketed With Misleading Performance Claims

2024-09-18

The Texas Attorney General announced a settlement with Pieces Technologies following allegations that the company misrepresented the accuracy of its healthcare AI systems used for clinical documentation. The state concluded that marketing claims about low error and hallucination rates may have misled hospitals and clinicians relying on the tools in patient care settings. The settlement imposed restrictions on future claims and required greater transparency around performance and risk.

Incident 591 (2 Reports)
Cigna Algorithm PXDX Allegedly Rejected Thousands of Patient Claims En Masse in Breach of California Law

2023-07-24

Health insurer Cigna faces a class-action lawsuit for allegedly using its PXDX ("procedure-to-diagnosis") algorithm to automatically reject over 300,000 patient claims in violation of California law; two members filed the suit seeking damages and a jury trial. Cigna disputes the allegations, claiming the process expedites physician reimbursement and does not result in care denials.

Incident 406 (1 Report)
Facebook's Friend Suggestion Feature Recommends Patients of Psychiatrist to Each Other

2015-07-15

Facebook's "People You May Know" (PYMK) feature was reported by a psychiatrist for recommending her patients to one another as friends, violating the patients' privacy and confidentiality.

Related Entities
Other entities involved in the same incidents. For example, if this entity is the developer of an incident and another entity is the deployer, the two are marked as related entities.

Hospitals

Incidents involved as Deployer
  • Incident 5 (11 Reports): Collection of Robotic Surgery Malfunctions

Doctors

Incidents Harmed By
  • Incident 1374 (1 Report): Purportedly AI-Generated Sepsis Alert Reportedly Prompted Potentially Inappropriate IV Fluid Administration for a Dialysis Patient, Averted by Clinician Intervention

Incidents involved as Deployer
  • Incident 5 (11 Reports): Collection of Robotic Surgery Malfunctions

Intuitive Surgical

Incidents involved as Developer
  • Incident 5 (11 Reports): Collection of Robotic Surgery Malfunctions

Facebook

Incidents involved as both Developer and Deployer
  • Incident 406 (1 Report): Facebook's Friend Suggestion Feature Recommends Patients of Psychiatrist to Each Other

pseudonymized psychiatrist's patients

Incidents Harmed By
  • Incident 406 (1 Report): Facebook's Friend Suggestion Feature Recommends Patients of Psychiatrist to Each Other

pseudonymized psychiatrist

Incidents Harmed By
  • Incident 406 (1 Report): Facebook's Friend Suggestion Feature Recommends Patients of Psychiatrist to Each Other

healthcare providers

Incidents Harmed By
  • Incident 406 (1 Report): Facebook's Friend Suggestion Feature Recommends Patients of Psychiatrist to Each Other

Cigna

Incidents involved as both Developer and Deployer
  • Incident 591 (2 Reports): Cigna Algorithm PXDX Allegedly Rejected Thousands of Patient Claims En Masse in Breach of California Law

OpenAI

Incidents involved as both Developer and Deployer
  • Incident 827 (1 Report): AI Transcription Tool Whisper Reportedly Inserting Fabricated Content in Medical Transcripts

Patients reliant on Whisper

Incidents Harmed By
  • Incident 827 (1 Report): AI Transcription Tool Whisper Reportedly Inserting Fabricated Content in Medical Transcripts

Medical practitioners reliant on Whisper

Incidents Harmed By
  • Incident 827 (1 Report): AI Transcription Tool Whisper Reportedly Inserting Fabricated Content in Medical Transcripts

Pieces Technologies, Inc.

Incidents involved as both Developer and Deployer
  • Incident 1324 (4 Reports): Pieces Technologies' Clinical AI Systems Allegedly Marketed With Misleading Performance Claims

Public health

Incidents Harmed By
  • Incident 1324 (4 Reports): Pieces Technologies' Clinical AI Systems Allegedly Marketed With Misleading Performance Claims

Healthcare institutions

Incidents Harmed By
  • Incident 1324 (4 Reports): Pieces Technologies' Clinical AI Systems Allegedly Marketed With Misleading Performance Claims

Epistemic integrity

Incidents Harmed By
  • Incident 1324 (4 Reports): Pieces Technologies' Clinical AI Systems Allegedly Marketed With Misleading Performance Claims
  • Incident 1374 (1 Report): Purportedly AI-Generated Sepsis Alert Reportedly Prompted Potentially Inappropriate IV Fluid Administration for a Dialysis Patient, Averted by Clinician Intervention

Clinicians

Incidents Harmed By
  • Incident 1324 (4 Reports): Pieces Technologies' Clinical AI Systems Allegedly Marketed With Misleading Performance Claims

Unknown generative AI systems

Incidents involved as Implicated System
  • Incident 1324 (4 Reports): Pieces Technologies' Clinical AI Systems Allegedly Marketed With Misleading Performance Claims

Pieces Technologies generative AI systems

Incidents involved as Implicated System
  • Incident 1324 (4 Reports): Pieces Technologies' Clinical AI Systems Allegedly Marketed With Misleading Performance Claims

Healthcare decision-support systems

Incidents involved as Implicated System
  • Incident 1324 (4 Reports): Pieces Technologies' Clinical AI Systems Allegedly Marketed With Misleading Performance Claims

Clinical documentation AI

Incidents involved as Implicated System
  • Incident 1324 (4 Reports): Pieces Technologies' Clinical AI Systems Allegedly Marketed With Misleading Performance Claims

St. Rose Dominican Hospital (Henderson, Nevada)

Incidents Harmed By
  • Incident 1374 (1 Report): Purportedly AI-Generated Sepsis Alert Reportedly Prompted Potentially Inappropriate IV Fluid Administration for a Dialysis Patient, Averted by Clinician Intervention

Incidents involved as Deployer
  • Incident 1374 (1 Report): Purportedly AI-Generated Sepsis Alert Reportedly Prompted Potentially Inappropriate IV Fluid Administration for a Dialysis Patient, Averted by Clinician Intervention

Unknown sepsis alert model developer

Incidents involved as Developer
  • Incident 1374 (1 Report): Purportedly AI-Generated Sepsis Alert Reportedly Prompted Potentially Inappropriate IV Fluid Administration for a Dialysis Patient, Averted by Clinician Intervention

Unknown healthcare technology

Incidents involved as Developer
  • Incident 1374 (1 Report): Purportedly AI-Generated Sepsis Alert Reportedly Prompted Potentially Inappropriate IV Fluid Administration for a Dialysis Patient, Averted by Clinician Intervention

Incidents involved as Implicated System
  • Incident 1374 (1 Report): Purportedly AI-Generated Sepsis Alert Reportedly Prompted Potentially Inappropriate IV Fluid Administration for a Dialysis Patient, Averted by Clinician Intervention

Nurses

Incidents Harmed By
  • Incident 1374 (1 Report): Purportedly AI-Generated Sepsis Alert Reportedly Prompted Potentially Inappropriate IV Fluid Administration for a Dialysis Patient, Averted by Clinician Intervention

Unknown sepsis alert technology

Incidents involved as Implicated System
  • Incident 1374 (1 Report): Purportedly AI-Generated Sepsis Alert Reportedly Prompted Potentially Inappropriate IV Fluid Administration for a Dialysis Patient, Averted by Clinician Intervention

2024 - AI Incident Database
