AI Incident Database

Report 524

Associated Incidents

Incident 343 · 5 Reports
Amazon Alexa Responding to Environmental Inputs

Amazon Echo device ordered dollhouses being discussed on TV
dailymail.co.uk · 2017

It is supposed to make life easier – but owners of the new Amazon Echo have fallen foul of the high-tech gadget's automatic features.

Owners of the device have been warned after a number of them accidentally ordered dollhouses that were being discussed on a TV show.

The Amazon Echo, which includes a virtual assistant called Alexa, was a popular Christmas gift.

The Amazon Echo, which includes a virtual assistant called Alexa, was one of the most popular Christmas gifts, but the case of Brooke Neitzel shows how it can sometimes mistake a conversation for an order

Alexa performs tasks at the vocal request of her owner – including internet shopping.

But a recent incident in the US has revealed the technology could prove costly to unsuspecting users.

Last week it was reported that an Amazon Echo in Dallas, Texas, had ordered a $170 (£140) dollhouse after six-year-old Brooke Neitzel asked: 'Can you play dollhouse with me and get me a dollhouse?'

Although the little girl apparently meant it as a rhetorical question the device saw it as a command and ordered a KidKraft Sparkle mansion dollhouse, as well as four pounds of sugar cookies.

But to make matters worse many Amazon Echoes apparently picked up on TV anchor Jim Patton's words in the report: 'I love the little girl saying "Alexa ordered me a dollhouse".'

Viewers at home were then left stunned when their own Amazon Echoes picked up the voice request in the report and ordered dollhouses too.

Because voice-command purchasing is enabled by default on Alexa devices, the gadgets mistook the broadcast for a command from their owners and bought the toy.


The device starts recording whenever it hears the word Alexa, recording sound for up to 60 seconds each time.

Stephen Cobb, a senior security researcher with ESET North America, told CW6 TV station in San Diego: 'These devices don't recognize your specific voice and so then we have the situations where you have a guest staying or you have a child who is talking and accidentally order something because the device isn't aware that it's a child versus a parent.'

He said the Federal Trade Commission was looking into ensuring voice-command devices were safe and secure.

But he said: 'Down the road the technology will be more sophisticated, where it will be able to identify certain individuals and register people who can access it.'

Brooke Neitzel, with her mom Megan, denied ordering the dollhouse and said she was just playing

Brooke's parents have since added a security code for purchases and have donated the dollhouse to a local children's hospital.

Experts said the incident highlighted the need for people to protect their Amazon Echo devices using a four-digit password to avoid rogue payments being made.

There are fears that a growing wave of cyber criminals are targeting smart household devices in a bid to hack into people's accounts and steal money.

Amazon was approached for comment by the Daily Mail.
