AI Incident Database

Incident 1281: Alleged Harmful Health Outcomes Following Reported Use of Purported ChatGPT-Generated Medical Advice in Hyderabad

Description: Reports from Hyderabad describe two alleged patient harms after individuals acted on purportedly ChatGPT-generated medical advice instead of clinician guidance. A kidney-transplant recipient reportedly discontinued prescribed post-transplant medications based on a chatbot suggestion and experienced graft failure. In a separate case, a man with diabetes allegedly developed severe hyponatremia after following a chatbot-advised zero-salt diet.
Editor Notes: The reporting on this incident appears to draw on three discrete harm events to point to a wider concern in Hyderabad. The cases involving an unnamed kidney transplant patient and an elderly man with diabetes are Hyderabad-specific, whereas the third case referenced in the reporting is a New York case, Incident 1166: ChatGPT Reportedly Suggests Sodium Bromide as Chloride Substitute, Leading to Bromism and Hospitalization.


Entities

View all entities
Alleged: OpenAI and ChatGPT developed and deployed an AI system, which harmed Unnamed patient with diabetes in Hyderabad, Unnamed kidney transplant patient in Hyderabad, OpenAI users, General public of India, General public of Hyderabad, General public, and ChatGPT users.
Alleged implicated AI system: ChatGPT

Incident Stats

Incident ID
1281
Report Count
2
Incident Date
2025-11-10
Editors
Daniel Atherton

Incident Reports

Reports Timeline

ChatGPT Isn't Your Doctor: Hyderabad Doctors Warn After Patients Harmed by AI Advice
completeaitraining.com · 2025

Doctors urge patients to stop using AI chatbots as a substitute for medical care

Clinicians in Hyderabad are seeing a sharp rise in patients acting on generic chatbot advice and paying a heavy price. Two recent cases underscore the risk: di…

Doctors warn against relying on AI tools for medical advice
timesofindia.indiatimes.com · 2025

Hyderabad: Doctors in Hyderabad have cautioned people against relying solely on artificial intelligence (AI) tools such as ChatGPT for medical advice. They emphasised that patients, especially those with chronic or serious health conditions…

Variants

A "variant" is an AI incident similar to a known case—it has the same causes, harms, and AI system. Instead of listing it separately, we group it under the first reported incident. Unlike other incidents, variants do not need to have been reported outside the AIID. Learn more from the research paper.
Seen something similar?

Similar Incidents

Selected by our editors

ChatGPT Reportedly Suggests Sodium Bromide as Chloride Substitute, Leading to Bromism and Hospitalization

Aug 2025 · 5 reports
By textual similarity

Collection of Robotic Surgery Malfunctions
Jul 2015 · 12 reports

Kidney Testing Method Allegedly Underestimated Risk of Black Patients
Mar 1999 · 3 reports

Racist AI behaviour is not a new problem
Mar 1998 · 4 reports
