AI Incident Database

Report 1026

Associated Incidents

Incident 55 · 16 Reports
Alexa Plays Pornography Instead of Kids Song

Amazon Echo’s Alexa is turning out to be a bad influence on our kids
sheknows.com · 2017

We can’t keep our kids away from our gadgets, and the new Amazon Echo, the online retail giant’s bestselling product over Christmas, is no exception.

If you don’t already know, Amazon Echo — now in millions of homes worldwide — is a voice-controlled speaker powered by the intelligent assistant Alexa. Alexa is the perfect virtual companion for those who are just too overloaded (or lazy) to shop, play music and adjust the thermostat themselves.


Sounds great. But when kids get in on the Echo game, the potential for turmoil is huge. In Dallas, Texas, a 6-year-old girl asked Alexa the innocent question, “Can you play dollhouse with me and get me a dollhouse?” As Alexa likes to make wishes come true, she immediately ordered the little girl a KidKraft dollhouse and — presumably in case she was hungry during playtime — 4 pounds of sugar cookies. It only took seconds — and almost $200.

The story was reported on a local morning show on San Diego’s CW6 News, which led to several other dollhouses arriving at the doors of Echo owners who were watching the broadcast. Apparently, anchor Jim Patton’s remark, “I love the little girl, saying ‘Alexa ordered me a dollhouse,’” triggered orders on viewers’ devices. It was a great day for KidKraft.


It’s not just accidental ordering that takes place when Alexa goes off script. Last week, a young boy asked his parents’ Amazon Echo to “play ‘Digger, Digger.’” But instead of playing a song about a large earth-digging machine, Alexa announced, “You want to hear a station for porn detected,” and proceeded to list a number of choices that really aren’t suitable for young ears (or eyes).

The funniest part is the parents’ reaction when they realize what’s happened. It’s the digital equivalent of a kid discovering their parents’ porn stash.

Luckily, there’s a simple solution to avoid unwanted dollhouse deliveries and traumatized kids: PIN-protect your devices, parents.


