Description: In Seoul, a woman allegedly used ChatGPT to ask whether mixing sleeping pills or benzodiazepines with alcohol could be fatal before poisoning drinks given to three men. Two men later died in separate motel incidents, and a third survived after losing consciousness. Police reportedly cited her chatbot queries and search history as evidence of intent.
Entities
Alleged: OpenAI developed an AI system deployed by Kim (suspect in Seoul poisoning case), which harmed Three unnamed men in their 20s in Seoul.
Alleged implicated AI system: ChatGPT
Incident Stats
Incident ID
1399
Report Count
1
Incident Date
2026-01-28
Editors
Daniel Atherton
Incident Reports
Reports Timeline
Careful how you interact with chatbots, as you might just be giving them reasons to help carry out premeditated murder.
A 21-year-old woman in South Korea allegedly used ChatGPT to help answer questions as she planned a series of murders th…
Variants
A "variant" is an AI incident similar to a known case—it has the same causes, harms, and AI system. Instead of listing it separately, we group it under the first reported incident. Unlike other incidents, variants do not need to have been reported outside the AIID. Learn more from the research paper.