AI Incident Database

Report 513

Associated Incidents

Incident 34 · 5 Reports
Amazon Alexa Responding to Environmental Inputs

How Amazon’s Alexa accidentally ordered a bunch of dollhouses across San Diego
electronicproducts.com · 2017

It doesn’t take much to order a dollhouse mansion and four pounds of sugar cookies with an Amazon Echo

In an ironic turn of events, Amazon’s voice assistant, Alexa, is turning out to be quite a terrible listener (or perhaps it has some things to learn). While ordering your favorite pizza pie and streaming catchy tunes are no-brainers for the voice-activated speaker, Alexa has suddenly been engaging in some unintentional shopping sprees.

An Amazon Echo waiting for a voice command. Image source: Amazon.

Although children ordering items from gadgets is nothing new, voice-activated devices are creating a new class of accidental-purchase problems that parents will have to watch out for.

One recent incident occurred in Dallas, TX earlier this month, when a six-year-old asked her family’s new Amazon Echo, “Can you play dollhouse with me and get me a dollhouse?” The device complied, ordering a $150 KidKraft Sparkle mansion dollhouse, in addition to “four pounds of sugar cookies.” The girl’s parents figured out what happened and have since added a confirmation code required for any purchases.

This story could have stopped right there, but after the story was covered on a local morning show, San Diego’s CW6 News, Echo owners watching the broadcast found that the on-air remark triggered orders on their own devices.

This dollhouse incident is proof that Alexa is always listening. The device begins recording whenever it hears the word “Alexa,” capturing sound for up to 60 seconds each time. While helpful, this feature borders on invading privacy and has fanned the broader security concerns surrounding the rise of IoT devices.

Encrypted logs of the recordings are kept on Amazon’s servers, but the device’s microphone can be turned off, and recordings can be deleted manually from the user’s account.

For those of you with little ones and an Amazon Echo, know that Alexa’s settings can be adjusted through the device’s app. Users can either turn off voice ordering altogether or add a passcode to prevent accidental purchases.

Source: The Verge
