
Incident 1399: South Korean Woman Allegedly Used ChatGPT to Assess Lethality of Drug-and-Alcohol Mixtures Before Two Fatal Motel Poisonings

Description: In Seoul, a woman allegedly used ChatGPT to ask whether mixing sleeping pills or benzodiazepines with alcohol could be fatal before poisoning drinks given to three men. Two men later died in separate motel incidents, and a third survived after losing consciousness. Police reportedly cited her chatbot queries and search history as evidence of intent.


Entities

Alleged: OpenAI developed an AI system deployed by Kim (suspect in the Seoul poisoning case), which harmed three unnamed men in their 20s in Seoul.
Alleged implicated AI system: ChatGPT

Incident Stats

Incident ID
1399
Report Count
1
Incident Date
2026-01-28
Editors
Daniel Atherton

Incident Reports

Reports Timeline

‘Could it kill someone?’ A Seoul woman allegedly used ChatGPT to help carry out two murders in South Korean motels
fortune.com · 2026

Careful how you interact with chatbots, as you might just be giving them reasons to help carry out premeditated murder.

A 21-year-old woman in South Korea allegedly used ChatGPT to help answer questions as she planned a series of murders th…

Variants

A "variant" is an AI incident that shares the same causes, harms, and AI system as a known case. Rather than listing it separately, we group it under the first reported incident. Unlike other incidents, variants need not have been reported outside the AIID. Learn more from the research paper.
