AI Incident Database

Report 1411

Related Incidents

Incident 108 · 3 Reports
Skating Rink’s Facial Recognition Cameras Misidentified Black Teenager as Banned Troublemaker

Teen turned away from roller rink after AI wrongly identifies her as banned troublemaker
theregister.com · 2021

A Black teenager in the US was barred from entering a roller rink after a facial-recognition system wrongly identified her as a person who had been previously banned for starting a fight there.

Lamya Robinson, 14, had been dropped off by her parents at Riverside Arena, an indoor rollerskating space in Livonia, Michigan, at the weekend to spend time with her pals. Facial-recognition cameras installed inside the premises matched her face to a photo of somebody else apparently barred following a skirmish with other skaters.

Robinson was thus told to leave the premises by staff. She said the person in the image couldn’t possibly be her because she had never been to the skating rink before. Her parents, Juliea and Derrick, are now mulling whether it’s worth suing Riverside Arena.

“To me, it’s basically racial profiling,” Lamya’s mother told Fox 2 Detroit. “You’re just saying every young Black, brown girl with glasses fits the profile, and that’s not right.”

One of the arena’s managers later called Lamya’s mother to discuss the issue. And in a statement, the biz said: “The software had her daughter at a 97 percent match. This is what we looked at ... if there was a mistake, we apologize for that.”

Lots of mistakes to be found

Facial-recognition technology is controversial. Experts in the AI research community, lawyers, and even law enforcement have called on Congress to place a moratorium on using the software in the real world. Several projects have shown that the algorithms involved generally struggle to accurately identify women and people of color, such as Lamya.

The House Judiciary Committee held a hearing just this week on the use of facial recognition in law enforcement. Robert Williams, a Detroit man who was wrongly arrested and detained for 30 hours, testified.

“I grew up in Detroit, and I know from that experience that the fact of the matter is that people that look like me have long been more subject to surveillance, heavy policing, and mass incarceration than some other populations,” he said. “I worry that facial recognition technology, even if it works better than it did in my case, will make these problems worse.”

There is no federal-level regulation of the technology in America, however, and Congress seems unlikely to act on the issue. Instead, individual states and cities have their own rules, which vary in how and where facial-recognition cameras can be used.

In Maine, for example, state officials cannot use the technology, nor contract third parties to do so, except in cases involving serious crimes or to search for registered vehicles. Elsewhere, in Portland, Oregon, facial-recognition cameras are not allowed inside any public or private places, from grocery stores to train stations.

Many states, however, are pretty lax about it. Banks in Florida and North Carolina use systems to monitor customers and, in some cases, shoo away homeless people loitering outside.

