AI Incident Database

Report 847

Associated Incidents

Incident 48 · 20 Reports
Passport Checker Detects Asian Man's Eyes as Closed

Facial recognition software rejects Asian man's passport photo because it thinks his eyes are closed
techspot.com · 2016

Can software be racist? No, though humans can inadvertently design programs that appear racially insensitive, or, as was the case with Microsoft’s Tay chatbot, intentionally turn an AI into a raging hatemonger. Facial recognition systems occasionally fall into the former category; the latest incident saw a New Zealand man of Asian descent have his passport photo rejected because the software thought his eyes were closed.

Twenty-two-year-old Richard Lee was attempting to renew his passport online but was surprised when the photo checker blocked his picture. “Subject eyes are closed,” read the notification, despite the fact that his eyes were clearly open. It seems the software was having trouble with Taiwan-born Lee’s epicanthal folds.

Lee, who is studying aerospace engineering and business management in Melbourne, contacted the Department of Internal Affairs to find out why the system had a problem with his picture. A spokesperson told him “uneven lighting on the face” caused the rejection, and explained that up to 20 percent of passport photos submitted online are rejected for various reasons, the most common being that the subject’s eyes are closed.

Lee wasn’t offended by the automated system’s response. "No hard feelings on my part, I've always had very small eyes and facial recognition technology is relatively new and unsophisticated," he told Reuters. "It was a robot, no hard feelings. I got my passport renewed in the end."

The incident shows that while facial recognition has come a long way, there are still bumps in the road. Last year, Google found itself in trouble after its software labeled two black people in a photograph as “gorillas.”

Lee said he saw the humor in his situation. “Some people get offended way too easily because they're not as confident with their origins... At the end of the day we're all different and of course there are certain situations where you have to stick up and some situations it's just a good laugh.” He even uploaded the image to Facebook.
