AI Incident Database

Report 527

Associated Incidents

Incident 34 · 35 Reports
Amazon Alexa Responding to Environmental Inputs

Amazon Echos accidentally order dollhouses after hearing US news programme
standard.co.uk · 2017


A newsreader sparked mayhem by accidentally telling Amazon Echo devices to buy dollhouses during a TV bulletin.

The devices, which respond to the name “Alexa”, wake when they hear it and can carry out tasks such as ordering groceries or checking the weather.

But US presenter Jim Patton caused havoc while discussing an incident where a little girl inadvertently ordered a £140 dollhouse through the device.

Brooke Neitzel, six, had asked her electronic assistant: “Can you play dollhouse with me and get me a dollhouse?”

The device then ordered a KidKraft Sparkle mansion dollhouse as well as four pounds of cookies - to the surprise of Brooke’s mother.

When Mr Patton recounted the story on air, he said: “I love the little girl saying ‘Alexa ordered me a dollhouse’.”

Stunned viewers then realised their devices had picked up on his voice and also ordered the toy, local media reported.

Although the device recognises its name, it does not differentiate between voices, so any command beginning with “Alexa” will be picked up.

The episode sparked security concerns around the gadgets.

Stephen Cobb, a senior security researcher, told TV station CW6: “These devices don't recognize your specific voice and so then we have the situations where you have a guest staying or you have a child who is talking and accidentally order something because the device isn't aware that it's a child versus a parent.

“Down the road the technology will be more sophisticated where it will be able to identify certain individuals and register people can access it.”

He said the Federal Trade Commission was ensuring the voice-command devices were safe and secure.

The Standard has contacted Amazon for comment.

