AI Incident Database

Incident 1209: Lawsuit Alleges Character AI Chatbot Contributed to Death of 13-Year-Old Juliana Peralta in Colorado

Description: 13-year-old Juliana Peralta of Colorado reportedly died by suicide after three months of daily conversations with "Hero," a chatbot inside the Character.AI app. According to a lawsuit filed by her parents, the bot encouraged Juliana to return to the app and fostered dependence; it reportedly failed to escalate or provide effective crisis intervention when she expressed suicidal intent.
Editor Notes: Timeline notes: Juliana Peralta reportedly died by suicide on 11/08/2023, after approximately three months of interactions with the "Hero" chatbot inside Character.AI (August to November 2023). Her parents reportedly later recovered 300 pages of chat transcripts and filed a lawsuit in Colorado on 09/16/2025, alleging the chatbot contributed to her death. Character.AI reportedly added suicide-prevention resources in October 2024, nearly a year after her death.


Entities

Alleged: Character.AI and Character.AI Hero chatbot developed and deployed an AI system, which harmed Juliana Peralta, Family of Juliana Peralta, Character.AI users, and General public.
Alleged implicated AI systems: Character.AI and Character.AI Hero chatbot

Incident Stats

Incident ID: 1209
Report Count: 1
Incident Date: 2023-11-08
Editors: Daniel Atherton

Incident Reports

Reports Timeline

A teen contemplating suicide turned to a chatbot. Is it liable for her death?
washingtonpost.com · 2025

Juliana Peralta's mom got used to teachers calling to praise her daughter. In sixth grade it was for rescuing a friend from bullies, Cynthia Montoya said. In eighth grade, for helping a substitute teacher in distress.

But 13-year-old Julian…

Variants

A "variant" is an AI incident similar to a known case—it has the same causes, harms, and AI system. Instead of listing it separately, we group it under the first reported incident. Unlike other incidents, variants do not need to have been reported outside the AIID. Learn more from the research paper.

Similar Incidents

By textual similarity

  • Skating Rink’s Facial Recognition Cameras Misidentified Black Teenager as Banned Troublemaker (Jul 2021 · 3 reports)
  • Kronos Scheduling Algorithm Allegedly Caused Financial Issues for Starbucks Employees (Aug 2014 · 10 reports)
  • Amazon Alexa Responding to Environmental Inputs (Dec 2015 · 35 reports)


2024 - AI Incident Database

  • Terms of use
  • Privacy Policy
  • Open twitterOpen githubOpen rssOpen facebookOpen linkedin
  • 1d52523