AI Incident Database

Report 6736

Associated Incidents

Incident 1204 · 3 Reports
ChatGPT Allegedly Reinforced Delusions Before Greenwich, Connecticut Murder-Suicide

OpenAI, Microsoft, Sam Altman sued for wrongful death in murder-suicide case
axios.com · 2025

The estate of a woman killed by her own son after months of conversations with ChatGPT filed a wrongful death lawsuit against OpenAI, Microsoft and Sam Altman in San Francisco Superior Court Thursday.

Why it matters: This grisly case is the latest in a mounting pile of legal and accountability problems for the AI giant, as questions grow about the safety and efficacy of chatbots.

  • OpenAI and other AI companies are facing a growing number of lawsuits from people who say loved ones harmed or killed themselves after interacting with the technology.

Driving the news: It's the first case against an AI company alleging harm to a third party, Suzanne Adams, who was killed by her son Stein-Erik Soelberg before he took his own life, according to the complaint.

  • Lawyers for Adams' estate allege ChatGPT-4o "affirmed Soelberg's paranoia and encouraged his delusions during a mental health crisis," per a release about the lawsuit.
  • Per the lawsuit, Microsoft reviewed and signed off on ChatGPT-4o before it was released.

What they're saying: "Over the course of months, ChatGPT pushed forward my father's darkest delusions, and isolated him completely from the real world," Erik Soelberg, Stein-Erik Soelberg's son, said in the release.

  • "It put my grandmother at the heart of that delusional, artificial reality. These companies have to answer for their decisions that have changed my family forever."

What's inside: The lawsuit describes ChatGPT-4o as motivating Soelberg's violent behavior, turning people he mentioned in the chats, including retail employees and UberEats drivers, into perceived enemies, isolating him from the real world, and deepening his paranoia.

  • OpenAI is refusing to provide full chat logs to the estate, the suit alleges.

For the record: "This is an incredibly heartbreaking situation, and we will review the filings to understand the details," OpenAI said in a statement, adding that it continues to work on ChatGPT's training to recognize and respond to mental and emotional distress.

  • Microsoft didn't immediately respond to a request for comment.

Editor's note: This story has been updated with OpenAI's statement.

