AI Incident Database

Report 3717

Related Incidents

Incident 636 · 5 Reports
AI Romance Apps Reportedly Compromise User Privacy for Data Harvesting

Don’t date robots — their privacy policies are terrible
theverge.com · 2024

Talkie Soulful Character AI, Chai, iGirl: AI Girlfriend, Romantic AI, Genesia - AI Friend & Partner, Anima: My Virtual AI Boyfriend, Replika, Anima: AI Friend, Mimico - Your AI Friends, EVA AI Chat Bot & Soulmate, and CrushOn.AI are not just the names of 11 chatbots ready to play fantasy girlfriend — they’re also potential privacy and security risks.

A report from Mozilla examined those AI companion apps and found that many are intentionally vague about the AI training behind the bot, where their data comes from, how they protect information, and their responsibilities in case of a data breach. Only one (Genesia) met Mozilla's minimum standards for privacy.

Wired says the AI companion apps reviewed by Mozilla “have been downloaded more than 100 million times on Android devices.” 

“To be perfectly blunt, AI girlfriends are not your friends. Although they are marketed as something that will enhance your mental health and well-being, they specialize in delivering dependency, loneliness, and toxicity, all while prying as much data as possible from you,” writes Misha Rykov in the report. For example, the CrushOn.AI app says in its privacy policy that it may collect sexual health information, prescribed medication, and gender-affirming care data.

Several of the apps also mention mental health benefits. Take Romantic AI, which says it’s “here to maintain your mental health.” But inside its terms and conditions, it says, “Romantiс AI MAKES NO CLAIMS, REPRESENTATIONS, WARRANTIES, OR GUARANTEES THAT THE SERVICE PROVIDE A THERAPEUTIC, MEDICAL, OR OTHER PROFESSIONAL HELP.”

Another chatbot maker, Replika, has expanded beyond just AI companionship to make Tomo, a wellness and talk therapy app with an AI guide that brings the user to a virtual zen island. Since I tried the app, Tomo has published a privacy policy, echoing what I was told by Replika CEO Eugenia Kuyda last month: “We don’t share any information with any third parties and rely on a subscription business model. What users tell Tomo stays private between them and their coach.”

Still, Italy banned the company last year, prohibiting it from using personal data in the country since the bot “may increase the risks for individuals still in a developmental stage or in a state of emotional fragility,” according to Reuters.

The internet is rife with people seeking connections with a digital avatar, even before the rise of generative AI. Even ChatGPT, which expressly forbids users from creating AI assistants to “foster romantic relationships,” couldn’t stop people from creating AI girlfriend chatbots on the GPT Store. 

People continue to crave connection and intimacy, even if the other person happens to be powered by an AI model. But as Mozilla put it, don’t share anything with the bots that you don’t want other people to know.

Read the Source

2024 - AI Incident Database

  • 利用規約
  • プライバシーポリシー
  • Open twitterOpen githubOpen rssOpen facebookOpen linkedin
  • e1b50cd