AI Incident Database

Report 2224

Related Incidents

Incident 385 · 6 Reports
Canadian Police's Release of Suspect's AI-Generated Facial Photo Reportedly Reinforced Racial Profiling

Cops forced to apologize over AI that created 'racist' suspect image from DNA
the-sun.com · 2022

Alberta, Canada's Edmonton Police Service (EPS) shared the AI-generated image on Twitter and revealed how sci-fi-like "DNA phenotyping" can be used to predict what a suspect looks like.

Police had to remove the controversial AI-generated image. Credit: Getty

However, the police department had to quickly remove the image when it sparked uproar online.

It later released a statement and apologized for the offense caused.

The AI-generated image was slammed for being inaccurate and was even labeled as racist by some angry Twitter users.

Genetics lecturer at University College London, Dr. Adam Rutherford, responded by tweeting: "Geneticist here. You can’t make facial profiles or accurate pigmentation predictions from DNA, and this is dangerous snake oil."

Another person tweeted: "This is why we want the police defunded. You're wasting money on racist astrology for cops."

Lots of other tweets suggested that the AI was racist because it created an image of a Black male.

The AI was also slammed for not being able to accurately determine basic things like age, facial hair, and skin tone based on the DNA sample alone.

The original tweet also sparked debate over whether archeological interpretations of DNA samples are correct.

The police press release stated: "My name is Enyinnah Okere and as Chief Operating Officer for the Community Safety and Well-being Bureau of EPS, I am responsible for overseeing our sexual assault section - it was my team that put out a release two days ago about the unsolved sexual assault of a young woman in 2019.

"This was a horrific sexual assault, one that very nearly caused the death of the young woman who was left unconscious and almost fully unclothed on a minus 27-degree morning in March."

No one was ever prosecuted in this case, so the police department turned to the Snapshot AI software for help.

The suspect was described as Black and about 5'4", but little else was known about them other than a DNA sample.

The statement continued: "To move this stalled case forward, our team members sought the advice of colleagues in other jurisdictions who had previously used DNA phenotyping and saw the potential for it here. They commissioned a profile which we released on Tuesday."

The statement added: "But we were not and are not oblivious to the legitimate questions raised about the suitability of this type of technology.

"The potential that a visual profile can provide far too broad a characterization from within a racialized community and in this case, Edmonton's Black community, was not something I adequately considered.

"There is an important need to balance the potential investigative value of a practice with the all too real risks and unintended consequences to marginalized communities."

The statement concludes that all of the AI-generated images have been removed and that other means will be used to try to find justice for the victim.

