CSETv1 Taxonomy Classifications

Incident Number: 1
Special Interest Intangible Harm: yes
Date of Incident Year: 2016
Estimated Date: Yes
Multiple AI Interaction: no
Embedded: no
CSETv0 Taxonomy Classifications

Problem Nature: Unknown/unclear
Physical System: Software only
Nature of End User: Amateur
Public Sector Deployment: No
Data Inputs: Videos
Lives Lost: No
GMF Taxonomy Classifications

Known AI Goal Snippets:
(Snippet Text: An off-brand Paw Patrol video called "Babies Pretend to Die Suicide" features several disturbing scenarios. The YouTube Kids app filters out most - but not all - of the disturbing videos. Before any video appears in the YouTube Kids app, it's filtered by algorithms that are supposed to identify appropriate children's content. YouTube also has a team of human moderators that review any videos flagged in the main YouTube app by volunteer Contributors (users who flag inappropriate content) or by systems that identify recognizable children's characters in the questionable video. Many of those views came from YouTube's "up next" and "recommended" video section that appears while watching any video. YouTube's algorithms attempt to find videos that you may want to watch based on the video you chose to watch first. If you don't pick another video to watch after the current video ends, the "up next" video will automatically play. Related Classifications: Content Recommendation, Content Search)
(A minimal code sketch of the filter, review, and autoplay flow described in this snippet follows the classification fields below.)
Risk Subdomain: 1.2. Exposure to toxic content
Risk Domain: Discrimination and Toxicity
Entity: AI
Timing: Post-deployment
Intent: Unintentional
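
The goal snippet above describes a three-stage flow: an automated filter screens videos before they reach the kids app, flagged videos go to a human review queue, and an "up next" recommendation autoplays when the viewer picks nothing. The sketch below is purely illustrative; every name (Video, route, up_next), the toxicity threshold, and the title-word similarity heuristic are assumptions made for this example, not YouTube's implementation.

# Hypothetical sketch of the filter -> human review -> "up next" autoplay flow
# described in the snippet. All names, thresholds, and heuristics are illustrative.

from dataclasses import dataclass


@dataclass
class Video:
    video_id: str
    title: str
    flagged_by_users: bool = False   # e.g. flagged by volunteer contributors
    toxicity_score: float = 0.0      # assumed output of an automated classifier


TOXICITY_THRESHOLD = 0.5             # assumed cutoff, not a real platform value


def route(video: Video, review_queue: list) -> str:
    """Publish a video to the kids app, or send it to human review."""
    if video.flagged_by_users or video.toxicity_score >= TOXICITY_THRESHOLD:
        review_queue.append(video)
        return "human_review"
    return "published"


def up_next(current: Video, published: list):
    """Pick the published video most 'similar' to the one being watched.
    Similarity here is a toy heuristic: count of shared title words."""
    if not published:
        return None
    words = set(current.title.lower().split())
    return max(published, key=lambda v: len(words & set(v.title.lower().split())))


if __name__ == "__main__":
    queue = []
    catalog = [
        Video("a1", "Puppy rescue adventure"),
        Video("a2", "Puppy rescue gone wrong", toxicity_score=0.9),  # held for review
    ]
    published = [v for v in catalog if route(v, queue) == "published"]
    watching = Video("a0", "Puppy rescue episode 1")
    print(up_next(watching, published))  # the candidate that would autoplay next

The design point the incident turns on is the last step: whatever survives the filter becomes eligible for automatic playback, so any video the filter misclassifies is not merely available but actively surfaced to the viewer.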