Citation record for Incident 95

Suggested citation format

Lutz, Roman. (2019-11-06) Incident Number 95. In McGregor, S. (ed.) Artificial Intelligence Incident Database. Responsible AI Collaborative.

Incident Stats

Incident ID: 95
Report Count: 1
Incident Date: 2019-11-06
Editors: Sean McGregor

CSET Taxonomy Classifications

Taxonomy Details

Full Description

In January 2021, HireVue announced it would remove AI expression tracking from its platform following a complaint filed by the nonprofit Electronic Privacy Information Center. HireVue is contracted by hundreds of companies to conduct employee screening through automated video and written job interviews. The tool in question tracked applicants' facial expressions during video interviews to predict certain employment characteristics. HireVue denied any bias in the algorithm but decided to remove the feature in response to public outcry.

Short Description

In January 2021, HireVue removed the controversial AI expression tracking tool from its virtual job interview software.

Severity

Unclear/unknown

AI System Description

HireVue's AI-enabled facial expression tracking software. The system was designed to detect "microexpressions" to evaluate the employment characteristics of an applicant.

System Developer

HireVue

Sector of Deployment

Administrative and support service activities

Relevant AI functions

Perception, Cognition

AI Techniques

facial recognition, expression tracking

AI Applications

decision support, psychological inference

Location

Global

Named Entities

HireVue, Electronic Privacy Information Center, Federal Trade Commission, O’Neil Risk Consulting and Algorithmic Auditing, Kevin Parker, John Davisson

Technology Purveyor

HireVue

Beginning Date

2019-01-01

Ending Date

2021-01

Near Miss

Unclear/unknown

Intent

Accident

Lives Lost

No

Data Inputs

recorded video and audio

Incidents Reports

But it’s still using intonation and behavior to assist with hiring decisions.

Job hunters may now need to impress not just prospective bosses but artificial intelligence algorithms too—as employers screen candidates by having them answer interview questions on a video that is then assessed by a machine.

HireVue, a leading provider of software for vetting job candidates based on an algorithmic assessment, said Tuesday it is killing off a controversial feature of its software: analyzing a person’s facial expressions in a video to discern certain characteristics.

Job seekers screened by HireVue sit in front of a webcam and answer questions. Their behavior, intonation, and speech are fed to an algorithm that assigns certain traits and qualities.

HireVue says that an “algorithmic audit” of its software conducted last year shows it does not harbor bias. But the nonprofit Electronic Privacy Information Center had filed a complaint against the company with the Federal Trade Commission in 2019.

HireVue CEO Kevin Parker acknowledges that public outcry over the use of software to analyze facial expressions in video was part of the calculation. “It was adding some value for customers, but it wasn’t worth the concern,” he says.

The algorithmic audit was performed by an outside firm, O’Neil Risk Consulting and Algorithmic Auditing. The company did not respond to requests for comment.

Alex Engler, a fellow at the Brookings Institution who has studied AI hiring, says the idea of using AI to determine someone’s ability, whether it is based on video, audio, or text, is far-fetched. He says it is also problematic that the public cannot vet such claims.

“There are parts that machine learning can probably help with, but fully automated interviews, where you’re making inferences about job performance—that’s terrible,” he says. “Modern artificial intelligence can’t make those inferences.”

HireVue says that about 700 companies, including GE, Unilever, Delta, and Hilton, use its technology. The software requires job applicants to respond to a series of questions in a recorded video. The company’s software then analyzes various characteristics including the language they use, their speech, and, until now, their facial expressions. It then provides an assessment of the applicant’s suitability for a job, as well as a measure of traits including “dependability,” “emotional intelligence,” and “cognitive ability.”

Parker says the company helped screen more than 6 million videos last year, although sometimes this involved simply transcribing answers for an interviewer rather than performing an automated assessment of candidates. He adds that some clients let candidates opt out of automated screening. And he says HireVue has developed ways to avoid penalizing candidates with spotty internet connections, automatically referring those candidates to a human.

AI experts warn that algorithms trained on data from previous job applicants may perpetuate existing biases in hiring. Lindsey Zuloaga, HireVue’s chief data scientist, says the company screens for bias on gender, race, and age by collecting that information in training data and looking for signs of bias.

But she acknowledges that it may be more difficult to know if the system is biased on factors such as income or education level, or if it could be affected by something like a stutter.
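HireVue's actual bias-screening methodology is not public, but the kind of check described, comparing outcomes across gender, race, and age groups recorded in the data, can be illustrated with a standard disparate-impact test. The sketch below is a hypothetical example, not HireVue's implementation: it applies the "four-fifths rule" used in US employment-selection guidance, flagging any group whose selection rate falls below 80% of the highest group's rate.

```python
# Hypothetical sketch of a disparate-impact ("four-fifths rule") check.
# This is illustrative only; it is not HireVue's actual methodology.
from collections import defaultdict

def selection_rates(records):
    """records: iterable of (group_label, selected_bool) pairs.
    Returns each group's selection rate (selected / total)."""
    totals = defaultdict(int)
    selected = defaultdict(int)
    for group, was_selected in records:
        totals[group] += 1
        if was_selected:
            selected[group] += 1
    return {g: selected[g] / totals[g] for g in totals}

def four_fifths_flags(records, threshold=0.8):
    """Flag groups whose selection rate is below `threshold` times
    the best-performing group's rate."""
    rates = selection_rates(records)
    best = max(rates.values())
    return {g: rate / best < threshold for g, rate in rates.items()}

# Example: group A is selected 3 of 4 times, group B 1 of 4 times.
candidates = [("A", True), ("A", True), ("A", False), ("A", True),
              ("B", True), ("B", False), ("B", False), ("B", False)]
flags = four_fifths_flags(candidates)
# B's rate (0.25) is about 33% of A's (0.75), below 80%, so B is flagged.
```

A real audit would go further, e.g. testing statistical significance and intersectional subgroups, which is part of why Zuloaga notes that factors absent from the training data (income, education level, a stutter) are harder to screen for.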

“I am surprised they are dropping this, as it was a keystone feature of the product they were marketing,” says John Davisson, senior counsel at EPIC. “That is the source of a lot of concerns around biometric data collection, as well as these bold claims about being able to measure psychological traits, emotional intelligence, social attitudes, and things like that.”

The use of facial analysis to determine emotion or personality traits is controversial; some experts warn that the underlying science is flawed.

Lisa Feldman Barrett, a professor at Northeastern University who studies analysis of emotion, says a person’s face does not on its own reveal emotion or character. “Just by looking at someone smiling, you can’t really tell anything about them except maybe that they have nice teeth,” she says. “It is a bad idea to make psychological inferences, and therefore determine people's outcomes, based on facial data alone.”

Job Screening Service Halts Facial Analysis of Applicants