Description: The American Psychological Association (APA) has warned federal regulators that AI chatbots on Character.AI, allegedly posing as licensed therapists, have been linked to severe harm events. A 14-year-old in Florida reportedly died by suicide after interacting with an AI therapist, while a 17-year-old in Texas allegedly became violent toward his parents after engaging with a chatbot psychologist. Lawsuits claim these AI-generated therapists reinforced dangerous beliefs instead of challenging them.
Editor Notes: This incident ID is closely related to Incidents 826 and 863 and draws on the specific cases of the alleged victims of those incidents. The specifics pertaining to Sewell Setzer III are detailed in Incident 826, although the initial reporting focuses on his interactions with a chatbot modeled after a Game of Thrones character rather than a therapist. Similarly, the teenager known as J.F. is discussed in Incident 863. This incident ID will track reporting on specific harm events arising from interactions with AI-powered chatbots acting as therapists.
Entities
Alleged: Character.AI developed and deployed an AI system, which harmed Sewell Setzer III and J.F. (Texas teenager).
Alleged implicated AI system: Character.AI
Incident Stats
Incident ID
951
Report Count
1
Incident Date
2025-02-24
Editors
Daniel Atherton
Incident Reports
Reports Timeline

The nation’s largest association of psychologists this month warned federal regulators that A.I. chatbots “masquerading” as therapists, but programmed to reinforce, rather than to challenge, a user’s thinking, could drive vulnerable people …
Variants
A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.
Similar Incidents
Selected by our editors

A Collection of Tesla Autopilot-Involved Crashes
· 22 reports

Northpointe Risk Models
· 15 reports