Citation Information for Incident 33

Description: An Amazon Alexa, without instruction to do so, began playing loud music in the early morning while the homeowner was away, leading to police breaking into the house to turn off the device.
Alleged: An AI system developed and deployed by Amazon harmed Oliver Haberstroh and his neighbors.

Incident Status

Incident ID
33
Report Count
4
Incident Date
2017-11-09
Editors
Sean McGregor

CSETv0 Taxonomy Classifications

Taxonomy Details

Full Description

An Amazon Alexa, without instruction to do so, began playing loud music in the early morning while the homeowner was away, leading to police breaking into the house to turn off the device. Oliver Haberstroh says he was out of his house in Hamburg, Germany from 1:50 am to 3:00 am when his Amazon Alexa began to blast music. The police were called and broke down the door to turn off the device. Haberstroh had to pay $582 to repair the door (a cost later reimbursed by Amazon).

Short Description

An Amazon Alexa, without instruction to do so, began playing loud music in the early morning while the homeowner was away, leading to police breaking into the house to turn off the device.

Severity

Negligible

Harm Type

Financial harm, Harm to physical property

AI System Description

Amazon Alexa, a smart speaker that can recognize speech, play music, etc.

System Developer

Amazon

Sector of Deployment

Information and communication

Relevant AI functions

Perception, Cognition, Action

AI Techniques

Amazon Alexa, AI-enabled personal assistants

AI Applications

voice recognition, personal assistant

Location

Hamburg, Germany

Named Entities

Amazon, Oliver Haberstroh

Technology Purveyor

Amazon

Beginning Date

2017-11-08T08:00:00.000Z

Ending Date

2017-11-08T08:00:00.000Z

Near Miss

Harm caused

Intent

Accident

Lives Lost

No

Financial Cost

$528.00

Data Inputs

environment audio, Alexa software, user requests

CSETv1 Taxonomy Classifications

Taxonomy Details

Harm Distribution Basis

none

Sector of Deployment

information and communication

Alexa switches on and decides to have a party so loud the police came
mashable.com · 2017

The future belongs to AI-powered devices that will play music and party on their own when we're not there.

At least that's the takeaway from a curious/disturbing incident involving a German guy in Hamburg.

While home assistant devices like …

Alexa, please cause the cops to raid my home
theregister.co.uk · 2017

We all assume that intelligent devices will either serve our every need, or try to kill us, but what if they just want to party?

Well, it could work out pretty expensive as Oliver Haberstroh found out when his Amazon Alexa started its own e…

Audio spy Alexa now has a little pal called Dox
theregister.co.uk · 2017

Updated Amazon's audio surveillance personal assistant device, Alexa, has acquired an external battery pack called Dox.

The appropriately named portable energy store, made by lifestyle gadgetry firm Ninety7, does not (thankfully) do what it…

Top 5 AI Failures From 2017 Which Prove That ‘Perfect AI’ Is Still A Dream
analyticsindiamag.com · 2018

We have in the past seen instances such as the failure of Microsoft bot Tay, when it developed a tendency to come up with racist remarks. Within 24 hours of its existence and interaction with people, it started sending offensive comments, …

Variants

A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than indexing variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting as evidence external to the incident database. Learn more from the research paper.
