AI Incident Database
Entity: Parents

Incidents Harmed By

Incident 791 (1 Report)
Google AI Error Prompts Parents to Use Fecal Matter in Child Training Exercise

2024-09-09

Google's AI Overview feature mistakenly advised parents to use human feces in a potty-training exercise, misinterpreting a method that actually uses shaving cream or peanut butter as a substitute. The incident is another example of an AI system failing to grasp contextual nuance, which can lead to potentially harmful, and in this case unsanitary, recommendations. Google has acknowledged the error.


Incident 1277 (1 Report)
Alleged Harmful Outputs and Data Exposure in Children's AI Products by FoloToy, Miko, and Character.AI

2025-11-21

AI children's products by FoloToy (Kumma), Miko (Miko 3), and Character.AI (custom chatbots) allegedly produced harmful outputs, including purported sexual content, suicide-related advice, and manipulative emotional messaging. Some systems also allegedly exposed user data. Several of the toys reportedly used OpenAI models.


Related Entities
Other entities involved in the same incidents as this entity. For example, if this entity is the developer in an incident and another entity is the deployer, the two are marked as related entities.
 

  • Google: involved as both Developer and Deployer in Incident 791 (1 report); also harmed in Incident 791
  • AI Overview: involved as Deployer in Incident 791 (1 report)
  • Google users: harmed in Incident 791 (1 report)
  • FoloToy: involved as both Developer and Deployer in Incident 1277 (1 report)
  • Miko: involved as both Developer and Deployer in Incident 1277 (1 report)
  • Character.AI: involved as both Developer and Deployer in Incident 1277 (1 report); also listed as an implicated system in Incident 1277
  • Meta: involved as both Developer and Deployer in Incident 1277 (1 report)
  • OpenAI: involved as both Developer and Deployer in Incident 1277 (1 report)
  • Children interacting with Kumma: harmed in Incident 1277 (1 report)
  • Children interacting with Miko 3: harmed in Incident 1277 (1 report)
  • Character.AI users: harmed in Incident 1277 (1 report)
  • Children: harmed in Incident 1277 (1 report)
  • General public: harmed in Incident 1277 (1 report)
  • Kumma: implicated system in Incident 1277 (1 report)
  • Miko 3: implicated system in Incident 1277 (1 report)
  • Character.ai chatbots: implicated systems in Incident 1277 (1 report)
  • Large language models: implicated systems in Incident 1277 (1 report)
  • OpenAI GPT-family models integrated into third-party toys: implicated systems in Incident 1277 (1 report)
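
For readers who want to reason about the relationships above programmatically, the sketch below models them as a minimal data structure. It is purely illustrative: the class, field, and role names are assumptions made for this example, not the AI Incident Database's actual schema or API, and the data is limited to entities shown on this page.

```python
from dataclasses import dataclass, field


@dataclass
class Incident:
    """Minimal, illustrative model of an incident and the roles entities play in it."""
    incident_id: int
    title: str
    # Role label -> names of entities filling that role for this incident.
    roles: dict[str, list[str]] = field(default_factory=dict)

    def related_entities(self, entity: str) -> set[str]:
        """All other entities appearing in any role on the same incident."""
        related: set[str] = set()
        for names in self.roles.values():
            related.update(names)
        related.discard(entity)
        return related


# Incident 791 as listed on this page (roles abbreviated for the example).
incident_791 = Incident(
    incident_id=791,
    title=("Google AI Error Prompts Parents to Use Fecal Matter "
           "in Child Training Exercise"),
    roles={
        "developer_and_deployer": ["Google"],
        "deployer": ["AI Overview"],
        "harmed": ["Parents", "Google", "Google users"],
    },
)

# Entities sharing Incident 791 with "Parents" count as related entities here.
print(sorted(incident_791.related_entities("Parents")))
# ['AI Overview', 'Google', 'Google users']
```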
