AI Incident Database

Report 527

Related Incidents

Incident 34 · 35 Reports
Amazon Alexa Responding to Environmental Inputs

Amazon Echos accidentally order dollhouses after hearing US news programme
standard.co.uk · 2017


A newsreader sparked mayhem by accidentally telling Amazon Echo devices to buy dollhouses during a TV bulletin.

The devices, responding to the name Alexa, automatically turn on when they are spoken to and can carry out tasks such as ordering grocery shopping or checking the weather.

But US presenter Jim Patton caused havoc while discussing an incident where a little girl inadvertently ordered a £140 dollhouse through the device.

Brooke Neitzel, six, had asked her electronic assistant: “Can you play dollhouse with me and get me a dollhouse?”

The device then ordered a KidKraft Sparkle mansion dollhouse as well as four pounds of cookies - to the surprise of Brooke’s mother.

When Mr Patton recounted the story on air, he said: “I love the little girl saying ‘Alexa ordered me a dollhouse’.”

Stunned viewers then realised their devices had picked up on his voice and also ordered the toy, local media reported.

Although the device recognises its name, it does not differentiate between voices, so any command beginning with “Alexa” will be picked up.

This recent revelation sparked security concerns around the gadgets.

Stephen Cobb, a senior security researcher, told TV station CW6: “These devices don't recognize your specific voice and so then we have the situations where you have a guest staying or you have a child who is talking and accidentally order something because the device isn't aware that it's a child versus a parent.

“Down the road the technology will be more sophisticated where it will be able to identify certain individuals and register [which] people can access it.”

He said the Federal Trade Commission was ensuring the voice-command devices were safe and secure.

The Standard has contacted Amazon for comment.

Read Source
