Citation record for Incident 636

Description: AI-powered romantic chatbots, marketed for enhancing mental health, are found to exploit user privacy by harvesting sensitive personal information for data sharing and targeted ads, with inadequate security measures and consent protocols, according to research by the Mozilla Foundation.
Alleged: AI systems developed and deployed by Replika, Chai, Romantic AI, EVA AI Chat Bot & Soulmate, CrushOn.AI, and Genesia AI Friend & Partner harmed the general public and chatbot users.

Incident Stats

Incident ID
636
Report Count
5
Incident Date
2024-02-14
Editors
Daniel Atherton

Your AI Girlfriend Is a Data-Harvesting Horror Show
gizmodo.com · 2024

Lonely on Valentine’s Day? AI can help. At least, that’s what a number of companies hawking “romantic” chatbots will tell you. But as your robot love story unfolds, there’s a tradeoff you may not realize you’re making. According to a new st…

Happy Valentine’s Day! Romantic AI Chatbots Don’t Have Your Privacy at Heart
foundation.mozilla.org · 2024

Howdy and welcome to the wild west of romantic AI chatbots, where new apps are published so quickly they don't even have time to put up a proper website! (Looking at you, Mimico - Your AI Friends.) It's a strange and sometimes scary place y…

AI girlfriend chatbots are probably spilling everyone's secrets
qz.com · 2024

In an ongoing loneliness epidemic, the rise of AI chatbot companions and romantic partners might be meeting some people’s needs, but researchers found these bots aren’t the best of friends when it comes to protecting secrets.

*Privacy Not I…

AI girlfriends will only break your heart, privacy experts warn
businessinsider.com · 2024

There's a potentially dangerous reality looming beneath the veneer of AI romance, according to a new Valentine's Day-themed study, which concluded that the chatbots can be a privacy nightmare.

Internet nonprofit The Mozilla Foundation took …

Don’t date robots — their privacy policies are terrible
theverge.com · 2024

Talkie Soulful Character AI, Chai, iGirl: AI Girlfriend, Romantic AI, Genesia - AI Friend & Partner, Anima: My Virtual AI Boyfriend, Replika, Anima: AI Friend, Mimico - Your AI Friends, EVA AI Chat Bot & Soulmate, and CrushOn.AI are not jus…

Variants

A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than indexing variants as entirely separate incidents, they are listed as variations under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. See this research paper for more details.