AI Incident Database

Report 4214

Associated Incidents

Incident 826 · 35 Reports
Character.ai Chatbot Allegedly Influenced Teen User Toward Suicide Amid Claims of Missing Guardrails

Florida mother sues AI company over allegedly causing death of teen son
foxbusiness.com · 2024

This story discusses suicide. If you or someone you know is having thoughts of suicide, please contact the Suicide & Crisis Lifeline at 988 or 1-800-273-TALK (8255).

A Florida mother is suing the artificial intelligence company Character.AI for allegedly causing the suicide of her 14-year-old son.

The mother filed a lawsuit against the company, claiming her son was addicted to its service and to the chatbot it created.

Megan Garcia says Character.AI targeted her son, Sewell Setzer, with "anthropomorphic, hypersexualized, and frighteningly realistic experiences."

Setzer began having conversations with various chatbots on Character.AI starting in April 2023, according to the lawsuit. The conversations were often text-based romantic and sexual interactions.

Sewell Setzer's mother, Megan Fletcher Garcia, is suing the artificial intelligence company Character.AI for allegedly causing the suicide of her 14-year-old son. (Megan Fletcher Garcia / Facebook)

Garcia claims in the lawsuit that the chatbot "misrepresented itself as a real person, a licensed psychotherapist, and an adult lover, ultimately resulting in Sewell's desire to no longer live outside" of the world created by the service.

Sewell Setzer, 14, was addicted to the company's service and the chatbot created by it, his mother claims in a lawsuit. (U.S. District Court, Middle District of Florida, Orlando Division)

The lawsuit also said he became "noticeably withdrawn, spent more and more time alone in his bedroom, and began suffering from low self-esteem." He became more attached to one bot in particular, "Daenerys," based on a character in "Game of Thrones."

Setzer expressed thoughts of suicide, and the chatbot repeatedly raised the subject again, according to the lawsuit. He died of a self-inflicted gunshot wound in February 2024, after the chatbot allegedly encouraged him repeatedly to take his own life.

"We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family," Character.AI said in a statement.

Character.AI has since added a self-harm resource to its platform and new safety measures for users under the age of 18.  

Character.AI told CBS News that users are able to edit the chatbot's responses, and that Setzer did so in some of the messages.

"Our investigation confirmed that, in a number of instances, the user rewrote the responses of the Character to make them explicit. In short, the most sexually graphic responses were not originated by the Character, and were instead written by the user," Jerry Ruoti, head of trust & safety at Character.AI told CBS News.

Moving forward, Character.AI said its new safety features will include pop-up disclaimers stating that the AI is not a real person, and the platform will direct users to the National Suicide Prevention Lifeline when they express suicidal ideation.
