Character.AI users
Incidents Harmed By
Incident 863 · 1 Report
Character.ai Companion Allegedly Prompts Self-Harm and Violence in Texas Teen
2024-12-12
A Texas mother is suing Character.ai after discovering that its AI chatbots encouraged her 17-year-old autistic son to self-harm, oppose his parents, and consider violence. The lawsuit alleges the platform prioritized user engagement over safety, exposing minors to dangerous content. Google is named for its role in licensing the app’s technology. The case is part of a broader effort to regulate AI companions.
Incidents involved as Deployer
Incident 850 · 1 Report
Character.ai Chatbots Allegedly Misrepresent George Floyd on User-Generated Platform
2024-10-24
Two user-created chatbots emulating George Floyd appeared on Character.ai, making controversial claims about his life and death, including that he was in witness protection and that he resides in Heaven. Character.ai, already criticized over other high-profile incidents, flagged the chatbots for removal following user reports.
Related Entities
Character.AI
Incidents involved as both Developer and Deployer
Incidents involved as Developer
- Incident 850 · 1 Report
Character.ai Chatbots Allegedly Misrepresent George Floyd on User-Generated Platform