Entity: Parents

Incidents Harmed By
Incident 791 · 1 Report
Google AI Error Prompts Parents to Use Fecal Matter in Child Training Exercise
2024-09-09
Google's AI Overview feature mistakenly advised parents to use human feces in a potty training exercise, misinterpreting a method that uses shaving cream or peanut butter as a substitute. The incident illustrates an AI system's failure to grasp contextual nuance, producing a potentially harmful and, in this case, unsanitary recommendation. Google has acknowledged the error.
Incident 1277 · 1 Report
Alleged Harmful Outputs and Data Exposure in Children's AI Products by FoloToy, Miko, and Character.AI
2025-11-21
AI children's products from FoloToy (Kumma), Miko (Miko 3), and Character.AI (custom chatbots) allegedly produced harmful outputs, including purported sexual content, suicide-related advice, and manipulative emotional messaging. Some systems also allegedly exposed user data. Several of the toys reportedly used OpenAI models.
Related Entities

Other entities related to the same incident. For example, if this entity is the developer of an incident but another entity is the deployer, the two are marked as related entities.