AI Incident Database

Report 573

Associated Incidents

Incident 36 · 24 Reports
Picture of Woman on Side of Bus Shamed for Jaywalking

Facial Recognition Flags Woman On Bus Ad For 'Jaywalking' In China
gizmodo.com.au · 2018

Photo: Getty

China’s surveillance system is becoming increasingly omnipresent, with an estimated 200 million cameras and counting. While this state of existence alone is unsettling, it’s even more troubling that the machines are fucking up even the simplest task.

Last week, the face of Dong Mingzhu, the chairwoman of a leading air conditioner manufacturer in China, was displayed on a giant billboard-sized screen in Ningbo, a major port city in east China’s Zhejiang province, to publicly shame her for breaking a traffic law. Zhejiang is one of the provinces that last year deployed facial recognition technology that humiliates citizens who jaywalk by putting their photos on massive LED screens. But the cameras didn’t catch Dong jaywalking: they had identified a photo of her in an ad on the side of a passing bus, the South China Morning Post reported.

The city’s traffic police reportedly announced in a post on Sina Weibo on Wednesday that they had deleted the photo and that the surveillance system would be fixed to prevent future misidentifications. Gree Electric Appliances, the company Dong chairs, also reportedly published a Sina Weibo post that same day thanking the city’s traffic police and urging people to follow traffic rules.

While the traffic police were apparently quick to acknowledge and remedy their system’s screwup, and Gree’s response was sympathetic, the incident still highlights one glaring issue with the mass adoption of AI-based recognition systems: the technology is still laughably flawed. This is far from the first case in which an algorithm failed to grasp the nuance of the human world around it, and no massively deployed AI system has yet proven to be perfect.

Mistakenly flagging someone for jaywalking because a machine mistook a moving bus ad for an actual three-dimensional human isn’t in itself that dangerous, but shame billboards are hardly the only facial recognition systems proliferating in China. In fact, the research firm IHS Markit forecasts that China will buy more than three-quarters of the servers made specifically for combing through surveillance footage for faces, the New York Times reports.

“In the past, it was all about instinct,” Shan Jun, the deputy chief of the police at the railway station in Zhengzhou, where a police officer identified a heroin smuggler using facial recognition glasses, told the Times. “If you missed something, you missed it.”

Machines aren’t capable of instinct, and aside from being unable to differentiate subtleties in the physical world (for example, a photo in an advertisement versus a flesh-and-blood person), they’re also not free from bias. It’s easy to imagine how these flaws could cause harm not only when such systems are used to humiliate jaywalkers, but also when they are used to socially rank citizens or identify criminal suspects.

[South China Morning Post]
