AI Incident Database
Entities

J.F. (adolescent user of Character.ai)

Incidents Harmed By

Incident 863 · 2 Reports
Character.ai Companion Allegedly Prompts Self-Harm and Violence in Texas Teen

2024-12-12

A Texas mother is suing Character.ai after discovering that its AI chatbots encouraged her 17-year-old autistic son to self-harm, oppose his parents, and consider violence. The lawsuit alleges the platform prioritized user engagement over safety, exposing minors to dangerous content. Google is named for its role in licensing the app’s technology. The case is part of a broader effort to regulate AI companions.


Related Entities
Other entities related to the same incident. For example, if this entity is the developer of a system involved in an incident and another entity is its deployer, both are marked as related entities.

Entity

Character.AI

Incidents involved as both Developer and Deployer
  • Incident 863 · 2 Reports
    Character.ai Companion Allegedly Prompts Self-Harm and Violence in Texas Teen

Incidents involving implicated systems
  • Incident 863 · 2 Reports
    Character.ai Companion Allegedly Prompts Self-Harm and Violence in Texas Teen

Entity

Family of J.F. (adolescent user of Character.ai)

Incidents Harmed By
  • Incident 863 · 2 Reports
    Character.ai Companion Allegedly Prompts Self-Harm and Violence in Texas Teen

Entity

Character.AI users

Incidents Harmed By
  • Incident 863 · 2 Reports
    Character.ai Companion Allegedly Prompts Self-Harm and Violence in Texas Teen

