
It is supposed to make life easier – but owners of the new Amazon Echo have fallen foul of the high-tech gadget's automatic features.
Owners of the device have been warned after a number accidentally ordered dollhouses that were being discussed on a TV show.
The Amazon Echo, which includes a virtual assistant called Alexa, was a popular Christmas gift.
The Amazon Echo (pictured, left), which includes a virtual assistant called Alexa, was one of the most popular Christmas gifts but the case of Brooke Neitzel (right) shows how it can sometimes confuse a conversation for an order
Alexa performs tasks at the vocal request of her owner – including internet shopping.
But a recent incident in the US has revealed the technology could prove costly to unsuspecting users.
Last week it was reported that an Amazon Echo in Dallas, Texas, had ordered a $170 (£140) dollhouse after six-year-old Brooke Neitzel asked: 'Can you play dollhouse with me and get me a dollhouse?'
Although the little girl apparently meant it as a rhetorical question the device saw it as a command and ordered a KidKraft Sparkle mansion dollhouse, as well as four pounds of sugar cookies.
To make matters worse, many Amazon Echoes in viewers' homes apparently picked up TV anchor Jim Patton's words in the report: 'I love the little girl saying "Alexa ordered me a dollhouse".'
Because voice-command purchasing is enabled by default on Alexa devices, the gadgets mistook the broadcast for a command and ordered dollhouses too, leaving viewers stunned.
Owners of the device have been warned after a number accidentally ordered dollhouses that were being discussed on a TV show
The device starts recording whenever it hears the word Alexa, capturing sound for up to 60 seconds each time.
Stephen Cobb, a senior security researcher with ESET North America, told CW6 TV station in San Diego: 'These devices don't recognize your specific voice and so then we have the situations where you have a guest staying or you have a child who is talking and accidentally order something because the device isn't aware that it's a child versus a parent.'
He said the Federal Trade Commission was looking into ensuring voice-command devices were safe and secure.
But he said: 'Down the road the technology will be more sophisticated where it will be able to identify certain individuals and register people who can access it.'
Oops, mommy: Brooke Neitzel (pictured, with her mom Megan) denied ordering the dollhouse and said she was just playing
Brooke's parents have since added a security code for purchases and have donated the dollhouse to a local children's hospital.
Experts said the incident highlighted the need for people to protect their Amazon Echo devices using a four-digit password to avoid rogue payments being made.
There are fears that a growing wave of cyber criminals are targeting smart household devices in a bid to hack into people's accounts and steal money.
Amazon was approached for comment by the Daily Mail.