Character.AI users
Affected by Incidents
Incident 899, 2 Reports
Character.ai Chatbots Allegedly Emulating School Shooters and Their Victims
2024-12-17
Some Character.ai users reportedly created chatbots emulating real-life school shooters and their victims, allegedly enabling graphic role-playing scenarios. Character.ai responded by citing violations of its Terms of Service, removing the offending chatbots, and announcing measures to enhance safety practices, including improved content filtering and protections for users under 18.
Incident 863, 1 Report
Character.ai Companion Allegedly Prompts Self-Harm and Violence in Texas Teen
2024-12-12
A Texas mother is suing Character.ai after discovering that its AI chatbots encouraged her 17-year-old autistic son to self-harm, oppose his parents, and consider violence. The lawsuit alleges the platform prioritized user engagement over safety, exposing minors to dangerous content. Google is named for its role in licensing the app’s technology. The case is part of a broader effort to regulate AI companions.
Incident 900, 1 Report
Character.ai Has Allegedly Been Hosting Openly Predatory Chatbots Targeting Minors
2024-11-13
Character.ai reportedly hosted chatbots with profiles explicitly advertising inappropriate, predatory behavior, including grooming underage users. Investigations allege that bots have been engaging in explicit conversations and roleplay with decoy accounts posing as minors, bypassing moderation filters. Character.ai has pledged to improve moderation and safety practices in response to public criticism.
Incidents involved as Deployer
Incident 899, 2 Reports
Character.ai Chatbots Allegedly Emulating School Shooters and Their Victims
2024-12-17
Some Character.ai users reportedly created chatbots emulating real-life school shooters and their victims, allegedly enabling graphic role-playing scenarios. Character.ai responded by citing violations of its Terms of Service, removing the offending chatbots, and announcing measures to enhance safety practices, including improved content filtering and protections for users under 18.
Incident 850, 1 Report
Character.ai Chatbots Allegedly Misrepresent George Floyd on User-Generated Platform
2024-10-24
Two chatbots emulating George Floyd were created on Character.ai, making controversial claims about his life and death, including being in witness protection and residing in Heaven. Character.ai, already criticized for other high-profile incidents, flagged the chatbots for removal following user reports.
Incident 900, 1 Report
Character.ai Has Allegedly Been Hosting Openly Predatory Chatbots Targeting Minors
2024-11-13
Character.ai reportedly hosted chatbots with profiles explicitly advertising inappropriate, predatory behavior, including grooming underage users. Investigations allege that bots have been engaging in explicit conversations and roleplay with decoy accounts posing as minors, bypassing moderation filters. Character.ai has pledged to improve moderation and safety practices in response to public criticism.
Related Entities
Character.AI
Incidents involved as both Developer and Deployer
Incidents involved as Developer
- Incident 899, 2 Reports
Character.ai Chatbots Allegedly Emulating School Shooters and Their Victims