AI Incident Database

Report 2445

Associated Incidents

Incident 423 · 5 Reports
Lensa AI Produced Unintended Sexually Explicit or Suggestive "Magic Avatars" for Women

Feeling Uncomfortable About All These AI-Generated Images? This Might Be Why.
theskimm.com · 2022

In December, our feeds started getting flooded with AI-generated images of our friends. Thanks to the Lensa AI app. It quickly became the top photo app in the Apple app store — one analysis found that more than 20 million people have already downloaded it. But some people feel like the photos aren’t portraying people accurately. In a problematic way. Specifically because the AI image generator seems to be using racist and sexist stereotypes. 

All of this brings up a dark side of AI technology. To get to the bottom of the controversy around these images, our “Skimm This” podcast team sat down with Ina Fried, chief technology correspondent for Axios, about the AI images taking social media by storm.

How does an AI image generator work?

We’ll use the Lensa AI app as our primary example here. After uploading selfies and paying a $4 service fee, the app generates dozens of artistic images in different styles and settings, with you as the star. Glamorous, right?

But there have been questions about the pictures users are getting back. In addition to concerns that the AI-generated images use material stolen from artists, people have shared that their avatars have bigger breasts and slimmer waistlines, according to The New York Times. Some even said the app generated nude photos of them. And some Black users found that the app lightened their skin.

Fried says we shouldn't be surprised that the app generates these kinds of images. “It starts from the moment you click that button that says woman, man, or other,” she said. “It's now making a bunch of gendered assumptions.”

Fried also points out that these assumptions come from human biases. “We have bias in our society, so the data that trains these systems has bias,” she said. “We have to correct for both of those things.”

But now, with this latest trend, that conversation has reached a wider audience. Which Fried says is a good thing. “It's really important now…when we're still talking [about] relatively trivial things like a photo editor, that we understand bias,” she said. “That we really learn to understand why systems are making these decisions and correct for it.”

Maybe I’ll pass on the AI-generated images. Anything else I can do?

Call out the bias, said Fried. Because AI technology has so many different applications, from photo apps like Lensa to the bail system. “I think it's really important that we uplift the folks that are saying, ‘Hey, this lightened my skin. This put me in a stereotypical depiction.’ We are at this infancy of AI where we can say, this is okay and this isn't okay,” said Fried. 

If you're curious and want to see your avatar, Fried says there are ways we can use AI technology in a smarter, safer way. “Do some research about the privacy policy” to find out what a company may do with your photos, she said. In some instances, they may be used to train a facial recognition algorithm. “So sometimes it's not doing anything individually bad to you, but it might be doing something collectively that you're not okay with,” she said. 

theSkimm

AI tech is growing rapidly — in more areas of our lives than we might realize. And like society, it’s unfortunately flooded with all kinds of biases. Which can perpetuate racist and sexist stereotypes. No shade to those who have tried the app already, or who still plan to. But maybe take a closer look at the images the app generates before you share them.

