AI Incident Database

Report 2387

Related Incidents

Incident 424 · 4 Reports
Universities' AI Proctoring Tools Allegedly Failed Canada's Legal Threshold for Consent

Online exam proctoring software during the pandemic: The quest to minimize student privacy risks
priv.gc.ca · 2022

Organization

University of Ottawa

Published

2022

Project leader(s)

Céline Castets-Renard, Professor, Faculty of Law – Civil Law Section, University of Ottawa

Summary

This project examines how, during the COVID-19 pandemic, many universities used exam proctoring tools to compensate for the inability to conduct in-person exams.

The researchers found that although many surveillance processes and companies operate in this industry, most tools rely on artificial intelligence techniques such as data mining and facial recognition to detect suspicious behaviour that could constitute cheating.

These companies use the personal information collected as part of their mission to monitor university exams, but also for the secondary purpose of improving their artificial intelligence tools. To do so, they obtain consent from students, but the conditions under which that consent is collected are not conducive to the expression of free, clear, and individual consent. In addition, the separation between public-sector and private-sector privacy laws makes it difficult to characterize these outsourcing companies: enforcing their potential liability as a principal under the Personal Information Protection and Electronic Documents Act (PIPEDA) is difficult in a context governed by provincial public-sector laws.

Furthermore, this project demonstrates that control over the conditions of data collection and retention is made more difficult by the fact that many technology companies are U.S.-based and subject to U.S. law, and even require the transfer of data to the United States.

To address these issues, the researchers make five recommendations, set out in the conclusion of the research report.

Project deliverables are available in the following language(s)

French

  • Research report (HTML document)
  • Round table "Quels enjeux juridiques des logiciels de surveillance d'examen ?" (HTML document)

English

  • Article: "Online test proctoring software and social control: Is the legal framework for personal information and AI protective enough in Canada?" (HTML document)

OPC-funded project

This project received funding support through the Office of the Privacy Commissioner of Canada's Contributions Program. The opinions expressed in the summary and report(s) are those of the authors and do not necessarily reflect those of the Office of the Privacy Commissioner of Canada. Summaries have been provided by the project authors. Please note that the projects appear in their language of origin.

Read the Source
