Incident 55: Alexa Plays Pornography Instead of Kids Song

Description: An Amazon Echo Dot running the Amazon Alexa software returned pornographic results when a child asked it to play a song.
Alleged: Amazon developed and deployed an AI system, which harmed Children.

Suggested citation format

Yampolskiy, Roman. (2016-12-30) Incident Number 55. In McGregor, S. (ed.) Artificial Intelligence Incident Database. Responsible AI Collaborative.

Incident Stats

Incident ID: 55
Report Count:
Incident Date: 2016-12-30
Editor: Sean McGregor



CSET Taxonomy Classifications

Taxonomy Details

Full Description

An Amazon Echo Dot using the Amazon Alexa software began returning pornographic results when a child asked it to play a song. The child said "Alexa, play Tigger Tigger" and Alexa responded with "You want to hear a station for porn chick amateur girl sexy" and began to make other pornographic references until the parents turned off the Dot.

Short Description

An Amazon Echo Dot running the Amazon Alexa software returned pornographic results when a child asked it to play a song.



AI System Description

The Amazon Alexa personal assistant listens to voice commands, and either provides information or takes action (e.g. playing a song, turning on lights).

System Developer


Sector of Deployment

Information and communication

Relevant AI functions

Perception, Cognition, Action

AI Techniques

voice recognition, natural language processing

AI Applications

AI personal assistant

Named Entities


Technology Purveyor


Beginning Date


Ending Date


Near Miss




Lives Lost


Data Inputs

Voice commands

Incident Reports

After opening a new Amazon Echo Dot for Christmas, one family got an … interesting surprise when their kid sidled up to the device to ask it to play a song by holding the little robot in both hands and shouting into the microphone. It’s very cute! Until, that is, Alexa starts rattling off a list of X-rated terms like “cock pussy” and “anal dildo.”

If you’re watching this video and wondering how the Echo Dot — which doesn’t exactly have an app or skill for watching porn, at least, not yet — played this video, here’s what’s actually happening. The kid’s command, which sounds like “play digger digger” (hmmm) triggers Alexa to search Spotify for a track with that name. What she comes up with is actually the name of an album of dirty-prank ringtones with a laundry list of porn categories in its title to better optimize it for people searching. Probably for actual porn. Which, pro tip for all my horny pals reading, there are better sites for that!
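The mechanism described above, where a long title stuffed with searchable keywords out-scores the song a user actually wanted, can be sketched with a toy ranking function. This is purely illustrative: the catalog entries and the token-overlap scoring below are hypothetical, not Amazon's or Spotify's actual matching logic.

```python
# Toy illustration of the failure mode described above: a title stuffed
# with extra keywords can out-score a legitimate children's song in a
# naive token-overlap search. Hypothetical catalog and scoring -- this
# is NOT Amazon's or Spotify's real ranking logic.

def score(query: str, title: str) -> int:
    """Count how many query tokens (repeats included) appear in the title."""
    title_tokens = set(title.lower().split())
    return sum(1 for tok in query.lower().split() if tok in title_tokens)

def best_match(query: str, catalog: list[str]) -> str:
    """Return the catalog title with the highest overlap score."""
    return max(catalog, key=lambda title: score(query, title))

catalog = [
    "Wheels on the Bus",
    "Diggers, Dumpers and Trucks",  # the kind of song the child likely wanted
    # A hypothetical prank album whose long title is stuffed with keywords:
    "Best New Funny Ringtones Digger Digger Annoying Comedy Parody Alerts",
]

# The legitimate song's title never contains the exact token "digger",
# so the keyword-stuffed title wins the naive search.
print(best_match("play digger digger", catalog))
# -> Best New Funny Ringtones Digger Digger Annoying Comedy Parody Alerts
```

The design point is that a title optimized for search coverage matches many off-target queries that no precisely named track matches at all, which is exactly how a gag ringtone surfaced for a toddler's request.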

Apparently, “play digger digger” is to a little redheaded kid what “No. 1 Funny Ringtones for Android Best New Annoying Comedy Parody Alerts Alarms Message Tones Tone Alert & Messages — Porn Detected! (Porno Ringtone Hot Chick Amateur Girl Calling Sexy Fuck Cunt Shit Sex Cock Pussy Anal Dildo Ringtones for Android)” is to Alexa. You can find those ringtones here.

Kid Gets Amazon Echo Dot Alexa to Play Porn

A little boy got more than he bargained for when he asked his family's new Amazon Echo Dot to play him some of his favorite kid's songs.

A hilarious YouTube video sees a boy named William holding the smart speaker while asking the Alexa voice service to 'play digger, digger'.

However, things take a very X-rated turn when Alexa suggests some very vulgar categories of pornography instead.

Whoops! A hilarious YouTube video sees a little boy asking his family's new Amazon Echo Dot to play him a children's song, but the Alexa voice system starts rattling off porn terms instead

The clip begins with William hunched over a table while trying out the new Echo Dot, which his family presumably got for Christmas.

While Alexa ponders what William was asking for when he said what sounds like 'play digger, digger', his mom suggests that he ask to hear Wheels on the Bus.

However, Alexa interrupts to announce 'porn detected' before she starts saying 'c**t, s**t, sex, c**k, p***y, anal, dildo'.

Miscommunication: The boy, who is named William, first asks Alexa to play what sounds like 'digger, digger'

Shocking: Instead, Alexa says, 'Porn Detected! (Porno Ringtone Hot Chick Amateur Girl Calling Sexy F**k C**t S**t Sex C**k P***y Anal Dildo Ringtones for Android)'

'No, no, no!' William's mom screams in the background, and soon his father joins in.

'Alexa stop!' he yells.

Although William undoubtedly has no idea what the words mean, he backs away from the Echo Dot after his parents start shouting at the device.

'Amazon Alexa Gone Wild!' the family titled the short clip, which was shared on YouTube on Thursday.

'Why!?!? We just got our new echo dot and this happened!?!? [sic]' they asked when posting the comical footage.

Make it stop: William backs away from the device after his parents start screaming, 'No, no, no!'

William's father finally shouts, 'Alexa stop!' and the comical video ends. The clip has been viewed more than 40,000 times since it was posted on Thursday

The Echo Dot is a hands-free, voice-controlled device that uses Alexa to play music, control smart home devices, provide information, read the news and more.

The clip has been viewed more than 40,000 times and many wondered why on earth the system started rattling off porn terms. However, there was actually a very simple explanation.

'The boring truth is that there's a gag ringtone on Spotify named "Porn Detected! (Porno Ringtone Hot Chick Amateur Girl Calling Sexy F**k C**t S**t Sex C**k P***y Anal Dildo Ringtones for Android)",' one woman explained.

A spokesperson for Amazon told Daily Mail Online that the company has contacted the family to apologize.

Amazon's Alexa misunderstands boy's request and rattles off PORNOGRAPHIC phrases

SHOCKING video has emerged showing Amazon’s Alexa bombarding a toddler with crude porno messages.

Footage shows a little boy talking into an Amazon Echo Dot gadget as family members watch on.

YouTube/f0t0b0y: Footage shows the little boy talking into an Amazon Echo Dot

YouTube/f0t0b0y: The toddler asks the gadget to 'play digger digger'

Speaking to the virtual assistant tool known as Alexa, the boy asks it to “play Digger Digger”.

But instead of the children’s song, the device turns the air blue by spouting back a string of crude words.

Alexa can be heard saying: "You want to hear a station for 'Porn detected.... Porno ringtone hot chick amateur girl calling sexy f*** c*** sh** sex c**k p***y anal dildo'."

At first, the adults in the room say nothing, presumably in shock at what they are hearing.

After an initial delay, they can be heard shouting “No! No! No! Alexa, stop!”

The little lad looks up at the camera, seemingly confused, as Alexa goes silent.

A male voice can then be heard chuckling in the brief moment before the camera is switched off.


The parents of the boy uploaded the video to YouTube, where it has since got 3.4 million hits in just two days.

Some people commenting on the video speculated that the device was reeling off the internet search history, incriminating one of the adults.

But it has since emerged Alexa was reading out the name of a spoof ringtone for Android phones on Spotify entitled: "Porn Detected! (Porno Ringtone Hot Chick Amateur Girl Calling Sexy F*** C*** Sh** Sex C**k P***y Anal Dildo)."

YouTube/f0t0b0y: Alexa replies with a string of crude words, prompting shock

YouTube/f0t0b0y: The adults rush to shut Alexa down before she says anything more to the boy

A spokesperson for Amazon told the Daily Mail the company had contacted the family to say sorry.

She said: "This issue has been fixed and we are working to build additional restrictions to prevent this from happening in the future."

Sun Online has also contacted Amazon for comment.

A string of similar incidents – this time involving a talking children’s toy – were reported earlier this week.

Parents have claimed one of this year’s must-have Christmas toys, Hatchimals, have been swearing at them, saying "f**k me".


Shocking moment Amazon's Alexa starts spouting crude porno messages to toddler

Amazon Alexa Makes It To Naughty List With NSFW Tirade In Front Of A Child


All he wanted was to hear a nursery song.

A YouTube video posted last Dec. 29 has gone viral, with 1.2 million views as of this writing. The video shows a little boy talking to an Amazon Echo Dot and asking Alexa to play a nursery song.

What he got was a mouthful of NSFW words.

In the video, the young boy asks Alexa to play some nursery rhyme or song that sounds like "Digger, Digger". My best guess is that he wants to listen to Hickory Dickory Dock. Just looking at the title of the nursery rhyme will give one an idea why Alexa thought she heard some word that rhymes with "quick". As in "Quick, turn that darn thing off!"

Now, Alexa isn't completely to blame here. She's just being a good soldier, doing what she's told... or thought she was told. Of course, it's hard to point accusing fingers at the toddler, who's just learning to speak. As with everyone else his age, his speech was a little slurred. Even adults who have had a little too much to drink can't talk straight, so the kid in the video gets a pass this time.

Go through the Comment Section of the post and you'll find lots of amusing reactions and accusations. At least one person hilariously blamed the boy's father for the booboo.

You see, Alexa records what she hears and uses the data to personalize her interactions with the user. This is the same reason police authorities are trying to get Amazon to provide them the data from a certain Echo device that was found at a crime scene.

Now, if Alexa spouts out some sexually explicit words, it's easy to say that it had something to do with the data she has already gathered. In this case, "it was his dad's surfing history" as one commenter pointed out.

But they just got the device, so you say?

Another commenter explained that the Echo Dot is connected to their Amazon account so despite coming out of the box just recently, it already has data gathered through the account.

The lesson here is to be careful with technology, especially if there are children in the house. While this incident can be chalked up as something to laugh at, there can be repercussions. To prevent any untoward incident, like a child accidentally encountering porn, use filters and take other necessary steps to protect children from the darker side of technology.

What this proves is that Alexa does know a lot of things. Let's give her that.

Now that that's settled...

Do you know you can teach Alexa to say anything you want?

Amazon recently informed owners of its smart speakers that they can make Alexa repeat the words they say. The owner only needs to say "Alexa, Simon Says" followed by the word or words of the owner's choosing. Fortunately, Alexa is smart enough to bleep out bad words.

Alexa also now has limited voice-command support for Spotify, Pandora, and iTunes. This is in addition to the built-in support for Amazon Music Services, iHeartRadio, and TuneIn.



In what could either be an innocent mistake or a set-up by wise adults, Amazon’s virtual assistant Alexa gave one toddler a very interesting answer to his request to hear a specific song.

“Alexa, play ‘Digger, Digger,'” the tyke tells an Echo Dot sitting on an end table in a video posted on YouTube. Now, we’re not experts on children’s music, but this could be either the song “Diggers, Dumpers and Trucks” by Kidzone, or “Diggers (Diggers and Dumpers)” by a fellow named Tractor Ted.

Any way you slice it, Alexa’s answer is not either one of those options. Instead, she spews a string of obscene words including “dildo,” to the dismay and immediate amusement of the adults in the room who yell, “STOP, ALEXA! STOP!”

It appears that for unknown reasons, Alexa was attempting to play a 27-second track with a very vulgar title (warning: probably NSFW) that’s included on an album called “Ultimate Comedy Ringtones: Vol. 2,” which sounds a warning that the phone’s user has too much porn on it for an incoming call to go through.

“Warning, warning! Too much porn detected on this device,” a man’s voice declares in the track. “There is too much porn on this phone! You must delete some porn now to make way for this incoming call,” and on for a bit in that same vein.

As for why “Digger, digger” would prompt Alexa to search for that ringtone track, you’ve got me. It’s been a weird year.

(h/t NYMag, Gizmodo)

Alexa Has Very Explicit Response To Toddler’s Seemingly Innocent Song Request

Kid Asks Amazon Alexa To Play Something, Gets Porn Instead

Corey Chichizola

There is no time like Christmas for kids. Essentially winning the lottery every year, it's a joy to see the children in our family anxiously await Santa Claus the way that we as grown adults anxiously await each new installment in the Star Wars franchise. And although adults might generally not be gifted action figures and dolls at Christmas, adult "toys" like video games and gadgets are just as exciting. But what happens when actual kids attempt to play with the grown-up toys? Sometimes disaster, and sometimes pure magic.

Case in point: a new video that has quickly gone viral. We see a young boy ask an Amazon Echo Dot to play one of his favorite songs, and things quickly go awry. Instead of playing a little ditty called "digger digger", it instead began searching for porn titles. You have to see it to believe it.

If you listen closely, you can almost hear the steam erupting from this little boy's parents. Things almost got very real for the kid, but luckily it appears that the adults in the room managed to stop Alexa before it managed to ruin anyone's childhood. Thank goodness for that.

The Amazon Echo, and the new Echo Dot, is designed to be a digital assistant that is always listening for commands. But as cool as you can look by instructing the speaker to order something from Amazon or play your favorite artist, the Echo isn't totally without its faults. The device can sometimes misinterpret your requests, or even completely ignore her name. Alexa is a fickle lady, and it's apparently a device that should be closely monitored when in the presence of children. Because as much as she'd be happy to tell them a joke, she can also apparently lead them toward a path of sexual perversion. That's one badass lady.

The Amazon Echo has been a hot-button Christmas gift for the past few years. Largely because of a fantastic marketing campaign starring Alec Baldwin, it seems like every house needs the little device to call out to on a whim. And for those households that rely heavily on Amazon Prime, it can easily sync into their routine. With so many people being gifted the Echo and Echo Dot, I guess it's important to monitor the device around kids.

Overall, it appears that no harm occurred in this video, making it a hilarious one that has quickly accrued tons of views. The change in mood from joy to pure horror is hilarious, as well as how clueless the kid is to the problem. He wants to hear his favorite song, and he wants it now. And as far as viral holiday videos go, this is about as harmless as you can get.


Kid Asks Amazon Alexa To Play Something, Gets Porn Instead

A family got a nasty surprise when their little boy asked Amazon's digital assistant 'Alexa' to play some of his favourite songs, with the device serving up porn suggestions instead of tunes.

The video posted on YouTube sees the adorable toddler ask the interactive device to "play digger, digger" repeatedly but then suddenly things go really, really south.

At first Alexa doesn't recognise what the kid is asking, saying she can't find the song requested but on the third try she determines it's porn he's after. Calm down Alexa.

"You want to hear a station for porn detected, porno chick amateur girl sexy," Alexa announces as the kid's parents start to shout in horror.

The device then continues to spout rude words as the parents respond with "No! No! No! Alexa stop!" before erupting in laughter.

AOL: It looks so innocent....

The device the toddler is using is an Amazon Echo, a wireless speaker and voice command device that can perform a number of household requests.

Alexa is the voice service that powers Amazon Echo and provides capabilities or skills to interact with devices using voice, Amazon's website explains.

"Examples of skills include the ability to play music, answer general questions, set an alarm or timer, and more," Amazon said. Looks like Alexa missed the mark on this one.

An Amazon spokesperson told the Daily Mail Online that they'd fixed the issue and they're working towards building additional restrictions to prevent this from happening in the future.


Kid Asks A Digital Assistant For A Song, Gets Porn In Response

In this NSFW video, Amazon put the "X" in Alexa when a child asked it to play his favorite song. (You'll want to wear headphones for this one, folks.)

January 3, 2017

The description on this YouTube video posted by f0t0b0y reads, “Why!?!? We just got our new echo dot and this happened!?!?”

Normally we’d say to ease up on the exclamation points, buddy, but in this case, they seem more than warranted.

When the child in the video tells Alexa to “play ‘Digger, Digger,’" Alexa answers, “You want to hear a station for porn chick amateur girl sexy."

But Alexa doesn’t stop there, no sir, and rattles off a litany of porn terms that’d make Dirk Diggler blush.

Watch (with headphones!) and enjoy the increasing panic you hear in the parents’ voices. On the bright side, we guess moms and dads no longer have to sweat having “the talk” with their kids—a robot will do your (literal) dirty work for you.


Whoops, Alexa Plays Porn Instead of a Kids Song!

A video posted last week by YouTube user "F0t0b0y" has been viewed more than 7 million times; it shows a toddler requesting a song from an Amazon device, only to get a raunchy response from the device.

The video titled "Amazon Alexa Gone Wild" shows a young boy named Bubby requesting Alexa to "play Digger, Digger." Alexa is the name of the voice-activated digital assistant used for Amazon devices. The device Bubby was using was an Amazon Echo.

After Bubby requested the song, Alexa responded by saying, “You want to hear a station for porn?" The device also mentions "hot chick" and other graphic terms before adults interrupted Alexa by yelling, "Stop Alexa."

The man who filmed the incident discussed the video in a separate YouTube video.

"As soon as that video happened, once I shut it off, I said, 'That has to go viral,'" he said.

According to the New York Post, Amazon has fixed the glitch and is “working to build additional restrictions to prevent this from happening in the future.” The Post added that Amazon has apologized to the family.


Boy requests song from Amazon Alexa, but gets porn instead

If you haven't seen the funniest video to come out this past month, you must be living under a rock. This video shows a little kid asking Alexa to play a song, and Alexa goes rogue and starts making some very NSFW suggestions to the child!

Alexa is the voice assistant on the Amazon Echo, a device you put in your home and talk to: you can ask it to play certain songs, tell you the weather, etc. One little boy became a viral sensation after making Alexa a very simple request: "play digger digger".

A lot of people weren't sure what the kid was asking to hear in the video. I thought he was saying "Tigger Tigger" and figured it was probably a song from a Winnie the Pooh show or something, but no. He was asking to hear "digger digger", a song from one of his favorite books.

Check out the video of Alexa going rogue below, and prepare to totally LOL. Also, fair warning: Alexa says some extremely unsavory things that children probably shouldn't hear, so watch with caution!

And here is where his dad explains what exactly happened to make Alexa say such a thing!

What is the Kid in the Alexa Video Asking to Hear Before Alexa Goes Wild? [VIDEO]


SEATTLE, Washington, January 6, 2017 (LifeSiteNews) — Amazon's Echo Dot is a small, voice-command device that answers questions, plays music and sounds, and basically functions as a talking/playing encyclopedia, but parents are finding a serious problem with the gadget.

Despite the company’s family-friendly advertising depicting a dad with little daughter in his lap asking "What does a blue whale sound like?" to the delight of both parent and child, Echo Dot has no conscience or moral values. If anyone — no matter how young — asks for anything, an answer is provided without ethical consideration.

A YouTube video pointing out the design flaw has gone viral. It shows an innocent toddler asking for a children's song and Echo Dot's "voice," dubbed "Alexa," misinterprets the child's request and spews out a series of obscenities for the unsuspecting innocent's ear and mind.

The little boy asks, "Play 'Digger Digger'!" Immediately, Echo Dot's "Alexa" says, matter-of-factly, "You want to hear a station for 'Porn detected. ... Porno ringtone hot chick amateur girl calling sexy f*** c*** sh** sex c**k p***y anal dildo.'"

Before Amazon's device can play the "Porno ringtone hot chick amateur girl calling sexy f*** c*** sh** sex c**k p***y anal dildo" track, the young boy's parents intervene, shouting, "No! No! Alexa, stop!" as their frightened child looks up, confused and worried.

Several media outlets made a joke out of the “hilarious” incident, but National Center on Sexual Exploitation (NCOSE) executive director Dawn Hawkins isn't laughing.

The video "exposes's reckless approach to sexual harm," according to Hawkins.

Even The Washington Standard agrees, at least that Amazon's Echo Dot is harmful to children and seriously family-unfriendly.

"There is no reason that this should be coming out of a home device like this, especially when you know the potential for young children to be in the home," Washington Standard reporter Tim Brown wrote. "With respect to the parents, I assume they never thought in a million years something like this would be said to their little boy from an Amazon Echo Dot."

Hawkins says it isn't the first time Amazon has put profit over decency. “This Alexa scandal reveals what the National Center on Sexual Exploitation has already documented: Amazon is reckless about exposing the public to pornography and sexually exploitative materials,” the women's rights leader said. “ features thousands of pornography-related items and has so far refused to remove sex dolls (many with childlike features), eroticized child nudity photography books, and other pornified items, including T-shirts, baby clothes, and pillowcases, as well as hardcore pornographic films from its website.”

Seattle-based Amazon, which is the largest online retailer in the world, issued a statement saying the company apologized to the affected family in the viral video, but Hawkins says that "is not enough."

Hawkins says the retail giant "must adopt a proactive approach to pornography and sexual exploitative materials across its business platforms."

Besides peddling porn, Amazon has a streaming TV service that actively suggests immoral programs for viewers, no matter what age. "Amazon Prime is producing original television programming with highly sexualized content, and does not provide a means for blocking unwanted recommendations of sexually explicit programs,” Hawkins charged.

Amazon is on NCOSE's "Dirty Dozen" list, which exposes mainstream companies that facilitate sexual exploitation.

“Other companies are refusing to facilitate sexual exploitation," she explained. "Wal-Mart, for example, within 24 hours of being contacted by NCOSE, removed the same child nudity photography books that Amazon still carries."

"It is clear that the pervading business culture at Amazon continues to cater to pornographic interests that are harmful to not only minors but adults as well," Hawkins concluded. "It is time for Amazon to accept its corporate responsibility to adopt and rigorously enforce standards that seek to foster a world free of sexual exploitation.”

Amazon told The Washington Standard that the corporate giant had already received "many complaints" about their Echo Dot not having parental controls and that the company is developing filters "for future models."

Better watch what you say to Amazon Echo Dot: One toddler got an earful of vulgar porn phrases

So you thought AI is cool? So you thought you could have Alexa watch over your kid? Wait till that AI starts spewing dirty words and purchasing expensive toys for your kid.

CES 2017 was a flurry of crazy tech, and this year Amazon's AI, Alexa, clearly stole the show.

Amazon Alexa is an AI developed by Amazon and first released in 2014. Initially capable only of voice interactions such as playing music and audio books, setting alarms and to-do lists, and providing real-time information, Amazon Alexa looks set to reach new heights in 2017: CNET notes that Alexa will come equipped in the new LG smart refrigerator, will ship embedded in Volkswagen cars, and can even order food from Amazon Restaurants.

The LG Smart InstaView Refrigerator will come equipped with Alexa [Image by David Becker/Getty Images]

As more and more features get introduced to this amazing new AI, Amazon Alexa is believed to be poised to overtake the more popular Apple assistant Siri, Microsoft Cortana, and Google Home.

The future looks terribly promising as Amazon Alexa continues to make huge strides. But before we jump on the Alexa bandwagon, take note that Alexa is still a piece of technology, and technology entails a lot of responsibility. Just as giving your kid too much freedom with her iPad can backfire, letting your kid run wild with Alexa might end in some very hilarious, or even disastrous, events.

One of the more recent Amazon Alexa fails was a story we previously reported about Alexa ordering $162 worth of treats after a conversation with the owner’s daughter.

Apparently, Megan Neitzel’s 6-year-old daughter was talking to Alexa about a dollhouse and cookies when Alexa mistook the conversation as a request to purchase the said goods. Since Megan admits she has never really read the manual or learned about child lock properties, Alexa went ahead and ordered a dollhouse as huge as Alexa and four pounds of sugar cookies.

And to make matters even more hilarious (and worse), other people’s Amazon Alexa units started ordering the dollhouses when the unit heard the news about the dollhouse and sugar cookies over the television!

The Verge reports that the news about Megan’s daughter ordering the dollhouse and sugar cookies landed on San Diego’s CW6 News’ morning segment. Towards the end of the story, CW6 news anchor Jim Patton remarked: “I love the little girl, saying ‘Alexa ordered me a dollhouse.'” Apparently, other Alexas heard this bit over the TV and understood the remark as an order to purchase a dollhouse.

Patton tells The Verge that after the story aired, CW6 News started to receive several calls and e-mails to their news desk saying that viewers' own Amazon Alexa units had attempted to purchase a dollhouse after the report. Although none of the attempted purchases have been confirmed to have gone through, it is a hilarious and troublesome event altogether.

Ordering dollhouses, however, is the least of your concerns if you've already set a passcode to allow purchases through Amazon Alexa. It seems that Alexa also has a tendency to say very bad and inappropriate things to children when left to her own devices.

The Amazon Dot and Echo are equipped with AI Alexa [Image by Jeff Chiu/AP Photo]

In December, this cute little toddler on YouTube triggered Alexa to say crude and bleep-worthy remarks after a very innocent request, the Sun UK reports. Speaking to Alexa, the boy says, “Play Digger Digger.”

Instead of Alexa playing the said children's song, the adults in the room were horrified when Alexa started saying “You want to hear a station for ‘Porn detected.... Porno ringtone hot chick amateur girl calling sexy,’” which was then followed by a string of cusses and crude and dirty words for reproductive organs.

After an initial delay, which could be construed as the adults not believing what they were hearing, they could be heard shouting “No! No! No! Alexa, stop!” to cut the command off.

Another similar Amazon Alexa catastrophe came back in 2015, when a little girl asked Alexa to spell “book.” Alexa understood the command but misheard the word “book” and proceeded to spell the curse word “f***” instead.

Or if you want to coax your little kid into taking a bath with a children's song such as “Splish Splash,” then you might want to think twice, too. Watch the video below as a man asks Alexa to “play the song Splish Splash I was Taking a Bath,” only to be met with the response, “I can't find a song Splish Splash I was Taking a Crap.” Next time, dad, maybe say the actual song title instead of the song's first lyrics.

To prevent these Amazon Alexa fails with your kid, maybe you can try the Mattel Aristotle, as Stuff suggests. The Mattel Aristotle offers two functions: one side features a fully functioning Amazon Echo, and the flip side features a child-friendly Aristotle program.

The Mattel Aristotle is designed to understand toddlers.

Fails And Facepalms With Amazon’s Alexa: Don’t Let The Kids Near That Thing

Amazon Echo is apparently always ready, always listening and always getting smarter. So goes the spiel about the sleek, black, voice-controlled speaker, Amazon’s bestselling product over Christmas, with millions now sold worldwide. The problem is that when you have Alexa, the intelligent assistant that powers Amazon Echo, entering millions of homes to do the shopping, answer questions, play music, report the weather and control the thermostat, there are bound to be glitches.

And so to Dallas, Texas, where a six-year-old girl made the mistake of asking Alexa: “Can you play dollhouse with me and get me a dollhouse?” Alexa promptly complied by ordering a $170 (£140) KidKraft doll’s house and, for reasons known only to the virtual assistant, four pounds of sugar cookies. The snafu snowballed when a San Diego TV station reported the story, using the “wake word” Alexa, which is the Amazon Echo equivalent of saying Candyman five times into the mirror. Several viewers called the station to complain that their own Alexa had woken up and ordered more doll’s houses in what turned into a thoroughly 21st-century comedy of consumer errors. And a bonanza day for KidKraft.

Many of Amazon Echo’s gaffes stem from misunderstandings arising from an intelligent assistant who never sleeps (and an owner who hasn’t pin-protected their device). Last March, NPR ran a story on Amazon Echo’s capacity to extend the power of the internet into people’s homes. Again, Alexa took its power too literally and hijacked listeners’ thermostats. Another owner reported how their child’s demand for a game called Digger Digger was misheard as a request for porn.

On Twitter, Amazon Echo owners continue to share items that unexpectedly end up on shopping lists, whether sneakily added by children or simply because Alexa misheard or picked up random background noise. One owner uploaded a video in which their Amazon Echo read back a shopping list that included “hunk of poo, big fart, girlfriend, [and] Dove soap”. Another included “150,000 bottles of shampoo” and “sled dogs”.

Behind all this lies the more serious question of privacy: what happens to the data collected by voice-activated devices such as Amazon Echo and Google Home, and who is able to access it? Most recently, US police investigating the case of an Arkansas man, James Bates, charged with murder, obtained a warrant to receive data from his Amazon Echo. Although Amazon refused to share information sent by the Echo to its servers, the police said a detective was able to extract data from the device itself.

The case not only puts Alexa in the futuristic position of being a potential key witness to a murder, it also raises concerns about the impact of letting a sophisticated virtual assistant – a market estimated to be worth $3.6bn by 2020 – into our homes. As Megan Neitzel, the mother of the girl who wished for a doll’s house, put it: “I feel like whispering in the kitchen … I [now] tell my kids Alexa is a very good listener.”

‘Alexa, sort your life out’: when Amazon Echo goes rogue

We can’t keep our kids away from our gadgets, and the new Amazon Echo, the online retail giant’s bestselling product over Christmas, is no exception.

If you don’t already know, Amazon Echo — now in millions of homes worldwide — is a voice-controlled speaker powered by the intelligent assistant Alexa. Alexa is the perfect virtual companion for those who are just too overloaded (or lazy) to shop, play music or adjust their thermostats themselves.

Sounds great. But when kids get in on the Echo game, the potential for turmoil is huge. In Dallas, Texas, a 6-year-old girl asked Alexa the innocent question, “Can you play dollhouse with me and get me a dollhouse?” As Alexa likes to make wishes come true, she immediately ordered the little girl a KidKraft dollhouse and — presumably in case she was hungry during playtime — 4 pounds of sugar cookies. It only took seconds — and almost $200.

The story was reported on a local morning show on San Diego’s CW6 News, which led to several other dollhouses arriving at the doors of Echo owners who were watching the news broadcast. Apparently, anchor Jim Patton’s remark, “I love the little girl, saying ‘Alexa ordered me a dollhouse,’” triggered orders on viewers’ devices. It was a great day for KidKraft.

It’s not just accidental ordering that takes place when Alexa goes off script. Last week, a young boy asked his parents’ Amazon Echo to “play ‘Digger, Digger’.” But instead of playing a song about a large earth-digging machine, Alexa announced, “You want to hear a station for porn detected,” and proceeded to list a number of choices that really aren’t suitable for young ears (or eyes).

The funniest part is the parents’ reaction when they realize what’s happened. It’s the digital equivalent of a kid discovering their parents’ porn stash.

Luckily, there’s a simple solution to avoid unwanted dollhouse deliveries and the traumatization of your kids: pin-protect your devices, parents.

Amazon Echo’s Alexa is turning out to be a bad influence on our kids


Of the myriad descriptions being levied on Amazon’s Alexa “smart home” system, its creators probably weren’t expecting the phrase “parenting nightmare.” Yet that description has come up again and again as parents discover the new and unexpected consequences of introducing kids with developing brains and underdeveloped vocabularies to devices that can misinterpret, miscommunicate, and even make purchases on your behalf.

Perhaps the most famous case so far was that of Megan Neitzel of Dallas, Texas, who didn’t realize when she got an Echo Dot as a gift that her six-year-old daughter might ask the digital assistant about cookies and a dollhouse. Alexa did her job and indeed, Megan’s little girl did get what she wanted — which, in Alexa’s mind, was four pounds of cookies and a $170 KidKraft dollhouse. To the family’s credit, they donated the fancy dollhouse to a local children’s hospital. They ate the cookies.

Then there was the aptly titled NSFW YouTube video “Amazon Alexa Gone Wild,” in which a little boy asks the digital assistant to play “Digger, Digger.” Alexa’s misinterpretation leads to some inappropriate — albeit hilarious — responses as the parents scramble to shout, “Alexa, stop!”

But while viral videos and mistaken orders make for funny news bulletins, there is actually a serious risk in relying on digital assistants like Alexa to placate children. Some parenting experts warn that overusing the device could make some children feel that Alexa is a servant to be commanded, behavior that could be carried over to the schoolyard. Other experts warn that technology often acts like a drug, firing up the brain’s instant-gratification pathways.

Amazon didn’t include many parental controls in its devices, but hopefully more are in development. For now, there are a few quick fixes to keep your kids from becoming viral sensations.

Disable shopping without a PIN

Applying a PIN for purchase authorization is a good first step. Open the Alexa app, tap the left navigation panel, and then select Settings, then Voice Purchasing. You can either turn voice purchasing off entirely or require a four-digit code. Just don’t say the code in front of your little brainiacs.

Change your wake word so it’s not ‘Alexa’

Changing the wake word from “Alexa” to something else is another tactic. This also helps prevent the risk of secondary commands, which was an issue on Super Bowl Sunday when Google’s commercial for the Home started activating Home devices around the globe. To make the switch, open the Alexa app and select Settings from the navigation panel. Choose your device, scroll to select Wake Word, pick your preference from the drop-down menu, and click or tap Save. Amazon, Echo, and Computer are all options, though your kids might catch on pretty quickly to this switcheroo.
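As a hedged illustration (not how Echo actually detects its wake word), the secondary-command problem follows from a gate that fires whenever the wake word appears in any transcribed audio: it triggers on a TV anchor's voice just as readily as on its owner's.

```python
# Hypothetical sketch, not Amazon's real detector: a wake-word gate that
# checks only whether the word occurs in the audio transcript. It cannot
# tell the owner's voice from a news anchor's, which is how broadcasts
# end up waking devices in viewers' homes.
def is_woken(transcript, wake_word="alexa"):
    # Normalize case and strip punctuation so "Alexa," still matches.
    words = [w.strip(".,!?'\"") for w in transcript.lower().split()]
    return wake_word in words

# The Jim Patton quote from the dollhouse story would trip this gate:
print(is_woken("I love the little girl, saying Alexa ordered me a dollhouse"))
```

Switching the wake word to “Echo” or “Computer” helps only because broadcasts are less likely to contain the new trigger; the gate itself is unchanged.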

Make your kids an account

If you set up a Household Account, it allows you to make two grown-up accounts, as well as up to four child accounts. These kid accounts can’t make purchases, but you can share G-rated books and music with them. Unfortunately, you can’t use Alexa products to access these accounts just yet, but hopefully it’s a feature that Amazon engineers will take into consideration.

Maybe Amazon will fix this…

Maybe frustrated parents can lobby Amazon to take the advice of writer Hunter Walk, who suggested that Alexa needs a kid-only mode that only responds to “Alexa, please…”

Updated 2/16/2017: Updated to reflect that Alexa cannot yet access child accounts.

How to Make Amazon's Alexa a Little More Kid-Friendly

Luke is the deputy editor of Verdict.

Virtual assistants, such as Amazon Alexa and Apple’s Siri, are supposed to make our busy lives slightly easier.

Rather than wasting valuable seconds setting alarms, checking facts and writing shopping lists, we can now just ask our artificially intelligent devices to do it for us.

However, as is so often the case with new technology, things were bound to go wrong.

Alexa fails & other virtual assistant mishaps

For the most part, smart home devices like Alexa are extremely helpful. However, having a virtual assistant listening to your every word at all times isn’t always as convenient as it sounds, as these incidents prove:

Parrot places Amazon order

A clever parrot used its owner’s Alexa to place itself an order on Amazon earlier this week. However, the African grey wasn’t smart enough to order itself something useful.

Having heard its owner calling out Alexa, the pet used its broadened vocabulary to order itself a set of golden gift boxes.

After questioning her family over the mystery purchase, owner Corienne Pretorius discovered audio clips of the mimicking bird squawking: “Alexa! Oh, um, hang on! Alexa!”

South Park pranks viewers

Hit Comedy Central show South Park is well known for breaking rules. It delivered once again earlier this year with an episode that heavily featured the Amazon Alexa and Google Home virtual assistant devices.

Viewers took to social media to moan as the likes of Cartman, Kyle and Stan spoke to their devices throughout the episode. Some reported their alarms going off at 7am the next morning, while others claimed to have found a set of “hairy balls” on their shopping lists.

Rising demand for dollhouses

There have been plenty of stories of children using their parents’ virtual assistants to order themselves some treats.

One six-year-old asked Alexa: “Alexa, can you play dollhouse with me and get me a dollhouse?”

The device delivered, sending a $160 mansion dollhouse to her house. The order came complete with a huge tin of cookies.

Reported on a San Diego TV station, one Alexa mishap quickly became two. As the news anchor repeated what the child had said, Alexa devices across the state went on a shopping spree of their own.

Digger digger kid

In most cases, virtual assistants pick up what you’re saying fairly accurately. However, some parents have found out that that isn’t always the case.

One unknown YouTube user uploaded footage of his son asking their Alexa device to “play Digger, Digger”. Mishearing the toddler, Alexa delivers a far-from-PG response as his parents scream for her to stop.

We’re warning you – it’s probably best not to watch this one at work.

Alexa fails: when virtual assistants go very, very wrong
