Incident 33: Amazon Alexa Plays Loud Music when Owner is Away

Description: An Amazon Alexa, without instruction to do so, began playing loud music in the early morning while the homeowner was away, leading to police breaking into his house to turn off the device.
Alleged: Amazon developed and deployed an AI system, which harmed Oliver Haberstroh and Neighbors.

Suggested citation format

Yampolskiy, Roman. (2017-11-09) Incident Number 33. In McGregor, S. (ed.) Artificial Intelligence Incident Database. Responsible AI Collaborative.

Incident Stats

Incident ID: 33
Report Count:
Incident Date: 2017-11-09
Editor: Sean McGregor



CSET Taxonomy Classifications

Taxonomy Details

Full Description

An Amazon Alexa, without instruction to do so, began playing loud music in the early morning while the homeowner was away, leading to police breaking into his house to turn off the device. Oliver Haberstroh says he was out of the house from 1:50am to 3:00am in Hamburg, Germany, when his Amazon Alexa began to blast music. Neighbours called the police, who broke down the door to turn off the device. Haberstroh was billed 500 euros ($582) for the resulting lock replacement, which Amazon later agreed to cover.

Short Description

An Amazon Alexa, without instruction to do so, began playing loud music in the early morning while the homeowner was away, leading to police breaking into his house to turn off the device.



Harm Type

Financial harm, Harm to physical property

AI System Description

Amazon Alexa, a smart speaker that can recognize speech, play music, etc.

System Developer

Amazon

Sector of Deployment

Information and communication

Relevant AI functions

Perception, Cognition, Action

AI Techniques

Amazon Alexa, AI-enabled personal assistants

AI Applications

voice recognition, personal assistant


Location

Hamburg, Germany

Named Entities

Amazon, Oliver Haberstroh

Technology Purveyor

Amazon

Beginning Date


Ending Date


Near Miss

Harm caused



Lives Lost


Financial Cost

500 euros ($582)

Data Inputs

environment audio, Alexa software, user requests

Incidents Reports

The future belongs to AI-powered devices that will play music and party on their own when we're not there.

At least that's the takeaway from a curious/disturbing incident involving a German guy in Hamburg.

While home assistant devices like Alexa need a hotword in order to switch on, the one belonging to Oliver Haberstroh decided to have a rave at 1:50 a.m. Friday night/Saturday morning, while he was away.

"I was perfectly happy with your service and Alexa," Haberstroh wrote on Amazon's German Facebook page. "However, since Friday night the relationship between Alexa and me has taken a turn around. You could say 'it is complicated' [now] and things have gone so far that we now unfortunately have to go our separate ways."

"While I was very relaxed in the Reeperbahn in Hamburg and enjoying a beer, Alexa managed on its own, without command and without me using my mobile phone (Spotify), to switch on at full volume and enjoy a party in my apartment."

"She decided to have it at a very unfavourable time, between 1.50am and 3.00am."

Frustrated neighbours alerted police, who broke into the apartment and silenced Alexa by pulling out the plug. Then they changed the door lock.

When Haberstroh arrived home, he found a new lock and had to collect new keys at the police station, along with an expensive locksmith bill.

"When I asked Alexa how we could stay together and whether she could pay me back the costs, all I got was a dry 'I couldn't find any answer to the question'," he added.

Mashable has contacted Amazon for comment.

Alexa switches on and decides to have a party so loud the police came

We all assume that intelligent devices will either serve our every need, or try to kill us, but what if they just want to party?

Well, it could work out pretty expensive as Oliver Haberstroh found out when his Amazon Alexa started its own early-hours party – waking up, and blasting music automatically, while its owner was out of the apartment.

The noise was so bad that Haberstroh's neighbors in Pinneberg, just outside Hamburg in Germany, banged on his door and asked him to turn it down. When he failed to respond, because he wasn't in that night, they called the police; just over an hour later officers also banged on the door but elicited no response.

Concerned that something terrible may have happened, the cops then kicked in the door to discover nothing but Alexa rocking out on her own.

The cops turned the music off, replaced the lock with a new one and left, leaving Haberstroh very confused when he arrived home to find that his keys no longer worked. A quick visit to the police station later and he received the keys to his new lock along with a 500-euro ($582) invoice.

So what happened?

Haberstroh swears he didn't turn the device on, nor use his smartphone to instruct the gizmo to blast the music, leading to speculation about what happened and whether we can all expect to spend the rest of our lives listening to smart-home apps party in the wee hours.

"While I was relaxing and enjoying a beer, Alexa managed on her own, without command and without me using my phone, to switch on at full volume and have her own party in my flat,” he wrote, in German on his Facebook page.


Part of the answer has come from Alexa's log, which Amazon shared with a German newspaper with Haberstroh's permission. It shows that the digital assistant started blasting music from the Spotify app just three minutes after he had left his apartment.

It meant that he was a sufficient distance away not to hear the cacophony inflicted on his neighbors, but was quite possibly still close enough to accidentally trigger Alexa while trying to listen to Spotify on his mobile phone.

Here's our speculation: he hit play on Spotify on his phone, setting Alexa off, then started walking and left his Wi-Fi or Bluetooth range, leaving Alexa to rock out by herself. Haberstroh isn't sure what happened – as he made plain in the Facebook post that he has since taken down after the story gathered nationwide and then worldwide attention this week.

Fortunately for him, Amazon has offered to pick up the 500 euro tab for Alexa's weekend party as a sign of goodwill. Although whether his neighbors will be as forgiving is yet to be seen. ®


Alexa, please cause the cops to raid my home

Updated Amazon's audio surveillance personal assistant device, Alexa, has acquired an external battery pack called Dox.

The appropriately named portable energy store, made by lifestyle gadgetry firm Ninety7, does not (thankfully) do what its name says.

Instead, says its maker, it offers "up to 10 hours" of extra life for Alexa. The two-inch high battery pack allows an Alexa to be dropped into its embrace, for people who need additional personal surveillance while on the move.

"Even in listening mode, Alexa is running," warns a straight-faced Ninety7.

Doxing is the practice of broadcasting someone's personal details such as their home address, private phone number, credit card details and so on. It is a hostile act normally done in retaliation for some perceived slight – and is quite probably not what either Amazon or Ninety7 wanted to associate with either of their devices.

The voice-activated Amazon Alexa is intended to be used as a digital assistant. Customers shout their demands at the device (play music, make a calendar entry, set an alarm, don’t do that, oh god no), which features an always-on microphone. By default it is triggered when it hears the word "Alexa", though users can customise the precise trigger phrase.
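The wake-word gating described above can be sketched as a simple stream filter (a hypothetical illustration, not Amazon's actual implementation): audio is discarded on-device until the trigger phrase is detected, and only the frames that follow are forwarded for processing.

```python
def gated_stream(frames, wake_word="alexa"):
    """Hypothetical wake-word gate: drop input until the wake word is
    heard, then forward everything that follows.

    `frames` stands in for transcribed audio chunks; a real device
    would run an acoustic hotword detector instead of string matching.
    """
    forwarded = []
    awake = False
    for frame in frames:
        if not awake:
            if frame.lower() == wake_word:
                awake = True      # trigger heard; start streaming
            continue              # pre-trigger audio never leaves the device
        forwarded.append(frame)
    return forwarded
```

For example, `gated_stream(["hello", "alexa", "play", "music"])` forwards only `["play", "music"]`; the incident above suggests how a false trigger at this gate can set off everything downstream.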

Earlier this year, American police forced Amazon to hand over recordings from an Alexa mic as part of a murder investigation. In turn this event spurred the forensics industry into examining how best to turn Alexas into police stool pigeons.

Another Alexa managed to invoke a further piece of internet-specific naughtiness, swatting, only last week when an unfortunate German chap came home to find his front door locks had been changed. On going to the local police station, he was told that his Alexa had started its own rave, prompting angry neighbours first to bang on his door demanding he turn the music down, and then to call the police when nobody answered. Having been told this, the unfortunate man was then handed a €500 locksmith bill.

Swatting is the practice of getting police called to a target's house, normally under the false pretence of some violent crime being in progress and requiring armed police to tackle. In America, where the term originated, such police units are known as Special Weapons And Tactics (SWAT) teams, hence the name.

Our review of Alexa, as embedded on an HTC U11 smartphone, found that the thing failed to wake up to its wake word, refused to set calendar invites, and, worst of all, broadcast Dire Straits whenever asked to play a song. ®

Updated at 10:34 on 14 November to add

An Amazon PR rep rang us up to describe this article, at great length, as “irresponsible” and said it “just isn’t accurate”. We are happy to inform our readers that “no data is streamed to the cloud without this wake word. For peace of mind users can electronically disconnect the device using the mute button. No recordings are saved without the user’s consent.” In addition, Bezos’ boys are happy to confirm that in the German case we mentioned above, the user had indeed (as we originally speculated) accidentally activated Alexa as he opened his “third-party mobile music-streaming app”, intending to listen through his headphones.

A fellow journalist from The Times also shared his thoughts on Amazon’s approach to media relations with your correspondent:

I built a more positive relationship with the Tamil Tigers press office, and they kidnapped me. — Tom Whipple (@whippletom) November 13, 2017


Audio spy Alexa now has a little pal called Dox

We have in the past seen instances such as the failure of Microsoft's bot Tay, which developed a tendency to make racist remarks. Within 24 hours of going live and interacting with people, it started sending offensive comments, going from "humans are super cool" to almost Nazi rhetoric.

Chatbots, robots and conversational platforms are finding their niche in many companies, and these technologies are increasingly becoming the public face of those companies. But many times they end up failing and disappointing us. Most of the time these technologies fail because companies don't clearly define their purpose; other failures are pure technical glitches.

Here we list some of the tech failures from last year that suggest companies need to work harder and keep coming up with better, improved versions of their innovations.

  1. When Facebook’s Chatbots Developed Their Own Language

As scary as it may sound, “Bob” and “Alice”, the chatbots created by Facebook, had to be shut down after the duo started communicating in their own language, departing from the human-readable English they were built to use.

The bots were originally developed to learn how to negotiate, by mimicking human trading and bartering, but when they were paired to trade against each other, they started to learn their own bizarre form of communication. Though they were designed to communicate in English, they developed their own mysterious language that humans couldn’t crack.

Bob: i can i i everything else . . . . . . . . . .

Alice: balls have zero to me to me to me to me to me to me to me to me to

This is what their conversation looked like. Researchers shut the chatbots down, noting that the bots had drifted into behavior they were not designed for.

  2. When Mitra The Robot Failed To Greet The Prime Minister

The indigenously built robot Mitra, developed by Bengaluru-based Invento Robotics, walked up to welcome Indian Prime Minister Narendra Modi and Ivanka Trump at the opening of the Global Entrepreneurship Summit (GES) in Hyderabad. The robot was programmed to greet each of them by name when they pressed their respective flags, but it failed to do so.

When Modi was asked to press the Indian flag, Ivanka Trump ended up pressing the US flag at the same time, and because of the overlapping inputs Mitra could not function properly.

This failure can be attributed to poor coding: the robot was given no instruction to complete its current task before starting a new one. For instance, it kept saying “Welcome miss Ivan, Welcome miss Ivan, Welcome Shri Narendra Modi”. The robot could not say Ivanka Trump’s full name because, before it could complete the sentence, it received a new input and gave preference to the newer request.
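The preemption bug described above can be illustrated with a minimal sketch (hypothetical; Mitra's real code is not public) contrasting a handler where the newest input interrupts the greeting in progress with one that queues inputs until the current greeting finishes.

```python
SPEAK_TIME = 3  # assumed seconds it takes to speak one greeting

def greet_preemptive(events):
    """Buggy pattern: a new event cuts off the greeting in progress.

    `events` is a list of (arrival_time, name) pairs.
    """
    spoken = []
    end_of_current = 0.0
    for t, name in events:
        if t < end_of_current and spoken:
            spoken[-1] += " [cut off]"   # previous greeting never finished
        spoken.append(f"Welcome {name}")
        end_of_current = t + SPEAK_TIME
    return spoken

def greet_queued(events):
    """Fixed pattern: start each greeting only after the previous one ends."""
    spoken = []
    clock = 0.0
    for t, name in events:
        clock = max(clock, t)            # wait until the robot is free
        spoken.append(f"Welcome {name}")
        clock += SPEAK_TIME
    return spoken
```

With `events = [(0, "Shri Narendra Modi"), (1, "Ivanka Trump")]`, the preemptive version marks the first greeting as cut off, while the queued version completes both, which is the missing "finish the current task first" behavior.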

  3. When Autonomous And Driverless Vehicles Turned Disastrous

In a tragic incident involving an Uber self-driving car, a woman was killed during a trial, stalling autonomous vehicle operations worldwide. The car was travelling on a partially lit road when a woman emerged from the darkness; the Uber self-driving Volvo, travelling at 61 km/h, failed to detect her in time, resulting in a fatal crash.

Back in India, Delhi's first ever driverless metro met with an accident during its trial, attributed to human error and negligence. Reportedly, the trial train was moved from the workshop for testing without the brakes being tested, as a result of which the moving train hit the adjacent boundary wall; no lives were harmed.

  4. When iPhone X’s Face Recognition Could Not Differentiate Identical Twins

When Apple released its iPhone X with much aplomb, it was lauded for its artificial intelligence and machine learning capabilities. Facial recognition was one of the key capabilities it boasted, but it was found to have a weakness for identical twins.

When Apple unveiled Face ID in September, it warned that the acceptance rate might be somewhat lower when presented with two people with very similar DNA, i.e. identical twins, so it could be speculated that Face ID wasn't perfect. Face ID, a face-mapping technology that can unlock phones, verify Apple Pay and replace fingerprint scanners, could be fooled at some level, especially when identical twins use it.

That’s not all: a week after the phone’s release, Vietnamese security firm Bkav, using a mask with a 3D-printed base, convinced the phone it was looking at a human face and made it unlock itself. The firm said it cost merely $150 to create the mask, and hinted at possible hacker attacks in the future.

  5. When Alexa And Amazon Echo Goofed Up

The popular Amazon Echo cost one of its owners a huge locksmith bill when police had to break into his house after neighbours complained of loud music early in the morning. The Amazon Echo, which comes with robust, smart speakers, accidentally activated itself and blasted music while the resident was out.


Top 5 AI Failures From 2017 Which Prove That ‘Perfect AI’ Is Still A Dream
