AI Incident Database

Report 525

Related Incidents

Incident 34 · 35 Reports
Amazon Alexa Responding to Environmental Inputs

Amazon Echo Dot ad cleared over cat food order
bbc.com · 2018

[Image: The Amazon Echo Dot TV commercial was cleared by the UK's advertising regulator (Getty Images)]

A television ad for Amazon's Echo Dot smart speaker that caused a viewer's device to try to order cat food has been cleared by a UK regulator.

The advert, which aired in October, featured a man asking Amazon's voice assistant Alexa to order Purina cat food.

A viewer said their Echo Dot device responded after hearing the ad on television.

The viewer complained that the ad was "socially irresponsible".

The Advertising Standards Authority (ASA) announced that it would not uphold the consumer's complaint because it did not find the advert to be in breach of the UK Code of Broadcast Advertising.

The regulator acknowledged that Amazon had taken measures to prevent its ads from triggering a response in devices that might "overhear" a command from a voice on the television.

In this case, the ad did cause the device to initiate an order for cat food, which the viewer then cancelled themselves.

However, ASA said that Amazon had programmed Alexa to automatically cancel any orders that had not been actively confirmed by the customer.

"We understood that it would not be possible for a purchase to be made without the account owner's knowledge, even in instances where technology, intended to stop ads interacting with devices, had not been effective," the regulator said in its decision.

"We concluded that the ad was not socially irresponsible and did not breach the Code."

Ordering mishaps

In January 2017, there was a spate of such incidents in the US involving Amazon Echo devices.

The devices overheard a television news anchor on CW6 in San Diego talking about a child who managed to order a doll's house and a tin of cookies from Alexa because the family had not activated parental controls on their Echo device.

The anchor in question, Jim Patton, said: "I love the little girl saying, 'Alexa order me a dollhouse.'"

CW6 said that after the news segment aired, the TV station received numerous calls from viewers complaining that their smart speakers had all tried to order doll's houses after the words were uttered on the screen.

At the time, Amazon advised users to open the Alexa app and turn off the "voice purchasing" setting.

Customers were also advised to set up a confirmation code that would need to be typed in before each order was authenticated.

Read the Source


2024 - AI Incident Database
