Incident 140: ProctorU’s Identity Verification and Exam Monitoring Systems Provided Allegedly Discriminatory Experiences for BIPOC Students

Description: An exam monitoring service used by the University of Toronto was alleged by its students to have provided discriminatory check-in experiences: its facial recognition repeatedly failed to verify passport photos, disproportionately adding stress for BIPOC students.
Alleged: ProctorU developed an AI system deployed by University of Toronto, which harmed University of Toronto BIPOC students.

Suggested citation format

Hall, Patrick. (2020-06-01) Incident Number 140. In McGregor, S. (ed.) Artificial Intelligence Incident Database. Responsible AI Collaborative.

Incident Stats

Incident ID: 140
Editors: Sean McGregor, Khoa Lam



Incidents Reports

In response to the George Floyd protests, Meric Gertler, the President of the University of Toronto, condemned “systemic injustices” of anti-Black racism “in the strongest terms possible.”

“Racism is not an issue for racialized communities to fight; it impacts everyone, and it is our collective responsibility to purposefully work to create inclusive spaces that actively support our colleagues,” said Dr. Kelly Hannah-Moffat, UofT’s Vice-President, Human Resources & Equity.

But since these official statements in June, UofT has continued its partnership with exam monitoring services that policymakers, professors, and students say disadvantage BIPOC (Black, Indigenous, and People of Colour) students by causing friction during assessments not faced by students who are white.

In December, The Globe and Mail reported that Chelsea Okankwu, a Concordia University student, faced unexpected difficulty verifying her identity at the start of an exam: the monitoring software struggled to identify her, citing insufficient lighting.

“I just felt like the only one being disadvantaged, being put in that mindset before an exam,” said Okankwu to The Globe. The Globe noted that the scramble added to her stress at the start of the exam.

The added stress of student monitoring software during exams was echoed by Maame Adjoa, a second-year Global Health major at UofT and the Marketing Director of The Victoria Black Student Network.

“I have observed that during check-in processes, the AI system was often unable to identify my passport and I would be redirected to my human proctor for manual check-ins,” she wrote to The Strand. Her attempts to get the webcam to recognize her passport would take around five of her 15 minutes of check-in time.

She continued: “This in addition to the 360 [degree] room scan, proof of an inactive phone, and a full computer sweep, significantly enhanced the atmosphere of stress already associated with test-taking.”

Okankwu and Adjoa’s experiences with test proctoring software are corroborated by a score of BIPOC students interviewed by The New York Times in September; the San Francisco Chronicle in October; and Reuters in November.

“The likelihood is the people who will be flagged for potentially cheating and have to go through follow-ups are disproportionately likely to be African American and Asian—that goes beyond concerning,” said civil rights attorney Christine Webber to the Chronicle.

These reports of BIPOC subjects facing discriminatory experiences due to facial-recognition software align with artificial intelligence research involving UofT, previously reported by the University’s press office in February 2019.

“AI facial-recognition technology has built-in bias,” noted the report’s headline on the UofT Alumni webpage. It continued by lauding Deb Raji for “holding companies accountable” by uncovering these findings of discrimination in software.

Raji, then a fourth-year engineering student at UofT, co-won a prestigious award for her research on systemic bias, conducted with the Massachusetts Institute of Technology’s Media Lab. Raji has previously recalled experiences of discrimination due to technology.

“I’d build something at a hackathon and wonder why it couldn’t detect my face, or why an automated faucet can’t detect my hand,” she said to UofT Engineering News. Raji reflected that these biases in technology often occur when data sets used to train artificial intelligence models underrepresent BIPOC individuals.

The same month as UofT’s coverage of Raji, the University’s Online Learning Strategies office published a blog post announcing the University’s partnership with ProctorU and Examity—both exam monitoring services aiming to reduce cheating.

In a separate notice about ProctorU, the office noted that “instructors may choose to use in courses” various “levels of service, from automated authentication to live proctoring.”

When asked how the firm is ensuring a fair experience for BIPOC students who use its software, ProctorU underscored its employment of human proctors to issue final decisions of student identity and verify exam integrity.

“It’s important to know that ProctorU is not an ‘automated proctoring service,’” a spokesperson wrote in an email to The Strand. “In fact, in almost all cases and unlike most other remote proctoring providers, ProctorU uses a diverse group of trained human proctors, assisted by technology, to safeguard exam integrity and assist students.”

ProctorU’s statement contradicts the language of its webpage on technical requirements for Chromebook users. The company notes on the webpage that ProctorU supports Chromebook users “for Automated Proctoring,” but not for “Live Proctoring.”

“The University of Toronto has vetted and approved ProctorU’s systems and protocols for remote test proctoring,” the email continued. “Technology never makes the final decision on a person’s identity, nor on potential integrity violations.”

Examity did not respond to The Strand’s request for comment. Examity notes on its website that it offers an “Automated Proctoring” service “with comprehensive auto authentication.”

Problems with these services have also drawn the attention of US Senators. Six Democratic senators, including former 2020 presidential candidate Elizabeth Warren, wrote an open letter in December highlighting issues of “privacy, accessibility, and equity” with exam monitoring; the letter cited a New York Times report on an accessibility issue faced by a student using ProctorU.

“As we have seen far too often, students have run head-on into the shortcomings of these technologies—shortcomings that fall heavily on vulnerable communities and perpetuate discriminatory biases,” wrote the senators. “Students of color, and students wearing religious dress, like headscarves, have reported issues with the software’s inability to recognize their facial features, temporarily barring them from accessing the software.”

The policymakers also cited an article in the MIT Technology Review by Shea Swauger—a university librarian and academic researcher with the University of Colorado Denver—who has explored how remote proctoring services disadvantage exam-takers who are racialized or have disabilities in the peer-reviewed Hybrid Pedagogy journal.

In an interview with The Strand, Swauger opined that automated testing is not necessary for learning. He noted that university educators can continue to opt for open-book assessments, which he says offer better evaluation for learning instead of memorization-based tests.

He also encouraged assessment through the completion of long-term projects by students, where they receive incremental feedback from educators over time. He notes that project-based learning is far less vulnerable to cheating than exams, and also sets the stage for students to gain “meaningful experiences” in their education.

Swauger also spoke in favour of support for undergraduate research experiences—which UofT currently offers—as “meaningful, or tangible [work] connected to what [students] want to do in their profession or discipline” as another way to assess and evaluate students.

Not all exams at UofT are proctored. UofT’s Centre for Teaching Support & Innovation has also acknowledged drawbacks to online proctored exams. A webpage by the Centre notes: “Consider online proctoring only for unique situations or needs given the additional logistical challenges and potential equity issues related to accessing the needed technology.”

However, this advisory is framed as a recommendation instead of a regulation. UofT continues to lack a university-wide policy on online proctoring, in contrast with policies set by McGill University.

“McGill doesn’t use proctoring software,” said Dr. Christopher Buddle, Associate Provost of Teaching and Academic Programs, in an interview with the McGill Reporter. “Instead, the University encourages assessments which are less subject to cheating as they require students to engage with the materials and demonstrate their abilities to use and apply the knowledge appropriately. Not only are those more difficult to cheat on, but the assessment is better aligned with desired learning outcomes.”

Asked why UofT officials have continued to use digital proctoring services when McGill University has decided against using this software, a UofT spokesperson wrote to The Strand: “The University continuously monitors issues that arise around e-proctoring. We have read [The Globe’s report] that you shared carefully. Equity, diversity, and inclusion are fundamental to our learning and teaching environments.”

“While the University of Toronto has existing contracts with both Examity and ProctorU, we do not currently have any degree-program departments or Faculties using Examity. Only a small number of units—largely in the professional programs—are using ProctorU,” continued the spokesperson. “It is the decision of individual Faculties or divisions whether to use these services. The University administration’s clear guidance is that e-proctoring should not be the first or only way that divisions seek to ensure academic integrity on assessments.”

“Any issues that are flagged by the e-proctoring system are reviewed in the first instance by instructors,” the spokesperson added, and highlighted that UofT “offers anti-racism and unconscious bias training to all members of our community.”

Dr. George Dei, a professor at UofT’s Ontario Institute for Studies in Education who studies anti-racism in education, emphasized the importance of critically questioning the use of proctoring software, in an email to The Strand.

“We know that technology has a pattern of perpetuating different forms of oppression, including racism,” he wrote. “COVID has unveiled many inequities in education and testing is yet another one.”

BIPOC students face disadvantages with exam monitoring software at the University of Toronto