AI Incident Database

Report 573

Related Incidents

Incident 362 · 4 Reports
Picture of Woman on Side of Bus Shamed for Jaywalking

Facial Recognition Flags Woman On Bus Ad For 'Jaywalking' In China
gizmodo.com.au · 2018

Photo: Getty

China’s surveillance system is becoming increasingly omnipresent, with an estimated 200 million cameras and counting. While this state of existence alone is unsettling, it’s even more troubling that the machines are fucking up even the simplest task.

Last week, the face of Dong Mingzhu — the chairwoman of a leading air conditioner manufacturer in China — was displayed on a giant billboard-sized screen in Ningbo, a major port city in east China's Zhejiang province, to publicly shame her for breaking a traffic law. Zhejiang is one of the provinces that last year deployed facial recognition technology that humiliates citizens who jaywalk by putting their photos on massive LED screens. But the cameras didn't catch Mingzhu jaywalking; they identified a photo of her in a bus ad, the South China Morning Post reported.

The traffic police in the city reportedly announced in a blog post on Sina Weibo on Wednesday that it deleted the photo and that its surveillance system would be fixed to prevent future misidentifications. And Gree Electric Appliances, the company Mingzhu works for, also reportedly published a blog post on Sina Weibo that same day expressing gratitude for the city’s traffic police and urging people to follow the traffic rules.

While the traffic police were apparently quick to acknowledge and remedy their system's screwup, and Gree's response was sympathetic, this incident still signals one glaring issue with the mass adoption of AI-based recognition systems: The technology is still laughably flawed. This is far from the first incident in which an algorithm failed to detect the nuance of the human world around it, and there has yet to be a massively deployed AI system that's proven to be perfect.

Mistakenly flagging someone for jaywalking because a machine mistook a moving bus ad for an actual three-dimensional human isn't itself that dangerous, but shame billboards are hardly the only facial recognition systems proliferating in China. In fact, research firm IHS Markit forecasts that China will buy more than three-quarters of servers made specifically for combing through surveillance footage for faces, the New York Times reports.

“In the past, it was all about instinct,” Shan Jun, the deputy chief of the police at the railway station in Zhengzhou, where a police officer identified a heroin smuggler using facial recognition glasses, told the Times. “If you missed something, you missed it.”

Machines aren’t capable of instinct, and aside from not being able to differentiate subtleties in the physical world (i.e. a photo in an advertisement from a flesh-and-blood person), they’re also not free from bias. It’s easy to imagine how these flaws can go awry not only when used to humiliate jaywalkers, but also to socially rank citizens and identify criminal suspects.

[South China Morning Post]

Read Source
