AI Incident Database

Report 1028

Related Incidents

Incident 55 · 16 Reports
Alexa Plays Pornography Instead of Kids Song

Kid Asks Amazon Alexa To Play Something, Gets Porn Instead
cinemablend.com · 2017

By Corey Chichizola

There is no time like Christmas for kids. Essentially winning the lottery every year, it's a joy to see the children in our family anxiously await Santa Claus the way that we as grown adults anxiously await each new installment in the Star Wars franchise. And although adults might generally not be gifted action figures and dolls at Christmas, adult "toys" like video games and gadgets are just as exciting. But what happens when actual kids attempt to play with the grown-up toys? Sometimes disaster, and sometimes pure magic.

Case in point: a new video that has quickly gone viral. In it, a young boy asks an Amazon Echo Dot to play one of his favorite songs, and things quickly go awry. Instead of playing a little ditty called "Digger Digger," the device begins reciting porn titles. You have to see it to believe it. If you listen closely, you can almost hear the steam erupting from this little boy's parents. Things almost got very real for the kid, but luckily it appears the adults in the room managed to stop Alexa before it ruined anyone's childhood. Thank goodness for that.

The Amazon Echo, and the new Echo Dot, is designed as a digital assistant that is always listening for commands. But as cool as you can look instructing the speaker to order something from Amazon or play your favorite artist, the Echo isn't totally without its faults. The device can sometimes misinterpret your requests, or even completely ignore its own name. Alexa is a fickle lady, and it's apparently a device that should be closely monitored in the presence of children. Because as much as she'd be happy to tell them a joke, she can also apparently lead them down a path of sexual perversion. That's one badass lady.

The Amazon Echo has been a hot-ticket Christmas gift for the past few years, largely because of a fantastic marketing campaign starring Alec Baldwin; it seems like every house needs the little device to call out to on a whim. And for households that rely heavily on Amazon Prime, it can easily sync into their routine. With so many people being gifted the Echo and Echo Dot, I guess it's important to monitor the device around kids.

Overall, it appears that no harm occurred in this video, making it a hilarious one that has quickly accrued tons of views. The change in mood from joy to pure horror is hilarious, as is how clueless the kid is to the problem. He wants to hear his favorite song, and he wants it now. And as far as viral holiday videos go, this is about as harmless as you can get.
