AI Incident Database

Report 2120

Related Incidents

Incident 336 · 4 Reports
UK Home Office's Sham Marriage Detection Algorithm Reportedly Flagged Certain Nationalities Disproportionately

'Sham marriages' and algorithmic decision-making in the Home Office
publiclawproject.org.uk · 2021

The Home Office is using an algorithm to determine whether a marriage should be investigated as a ‘sham’, according to documents obtained by the Public Law Project under the Freedom of Information Act.

PLP is concerned that this algorithm may be flawed and discriminatory because some nationalities seem more likely to be targeted for investigation than others.

Get in touch

Our Research Team is keen to hear from anyone who feels they have been unfairly targeted by a sham marriage investigation, and from any organisations working in this area. You can contact Jack Maxwell, one of our Research Fellows, at j.maxwell@publiclawproject.org.uk.

Background

The UK has a detailed legal framework for targeting ‘sham marriages’: where couples get married to avoid immigration law, rather than because they have a genuine relationship. As part of this framework, registrars are required to refer proposed marriages to the Home Office if either or both of the parties are subject to immigration control.

Since at least April 2019, the Home Office has used an algorithm to triage these referrals. The algorithm gives each couple a ‘Red’ or ‘Green’ rating. ‘Red’ means that the Home Office should investigate the couple ‘to rule out or identify sham activity’, while ‘Green’ means that an investigation is not warranted. Media reports indicate that sham marriage investigations can be highly invasive.
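The triage step described above can be sketched in code. This is a minimal illustration only: the Home Office has not disclosed its risk factors or how they are combined, so every factor name, weight, and threshold below is a hypothetical placeholder, not the actual system.

```python
# Hypothetical sketch of a 'Red'/'Green' triage step: a referral is scored
# against weighted risk factors, and the total decides the rating.
# All factor names, weights, and the threshold are invented for illustration;
# the real factors used by the Home Office are not public.
from dataclasses import dataclass, field

@dataclass
class Referral:
    couple_id: str
    factors: dict = field(default_factory=dict)  # flag -> bool (hypothetical)

WEIGHTS = {"factor_a": 3, "factor_b": 2, "factor_c": 1}  # illustrative only
RED_THRESHOLD = 3  # illustrative only

def triage(referral: Referral) -> str:
    """Return 'Red' (investigate) or 'Green' (no investigation)."""
    score = sum(WEIGHTS[name] for name, present in referral.factors.items()
                if present and name in WEIGHTS)
    return "Red" if score >= RED_THRESHOLD else "Green"

r = Referral("couple-001", {"factor_a": True, "factor_c": False})
print(triage(r))  # prints "Red": one heavily weighted factor crosses the threshold
```

The opacity PLP criticises is visible even in this toy version: a couple rated ‘Red’ has no way to know which factors, or which weights, produced the rating.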

The core elements of the Home Office’s algorithm are set out below. [Diagram not reproduced in this report.]

PLP is concerned that this algorithm may be flawed and discriminatory. In particular, the Home Office’s documents show that some nationalities, including Bulgarian, Greek, Romanian and Albanian people, have their marriages rated ‘Red’ at a much higher rate than others. The Home Office has – so far – refused to disclose all of the ‘risk factors’ used by the algorithm to rate a case. The risks of discrimination in algorithmic decision-making are well-known. In August 2020, the Home Office scrapped an algorithm it used to help decide visa applications, in the face of allegations that it was racially discriminatory.
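The disparity PLP points to can be made concrete: compare the rate of ‘Red’ ratings across nationalities. The sketch below uses invented data purely to show the calculation; it does not reflect the actual figures in the Home Office’s documents.

```python
# Sketch of the disparity check implied by PLP's concern: the share of
# referrals rated 'Red', broken down by nationality. The rows below are
# invented example data, not figures from the Home Office.
from collections import defaultdict

ratings = [
    ("Bulgarian", "Red"), ("Bulgarian", "Red"), ("Bulgarian", "Green"),
    ("French", "Green"), ("French", "Green"), ("French", "Red"),
]

def red_rate_by_nationality(rows):
    totals, reds = defaultdict(int), defaultdict(int)
    for nationality, rating in rows:
        totals[nationality] += 1
        if rating == "Red":
            reds[nationality] += 1
    return {n: reds[n] / totals[n] for n in totals}

rates = red_rate_by_nationality(ratings)
# With the invented data: Bulgarian couples are flagged at twice the French rate.
```

A gap like this between groups is what would prompt scrutiny of the undisclosed risk factors, since a facially neutral factor can still correlate strongly with nationality.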

PLP’s Research Team continues to investigate this algorithm, as part of our ongoing work on tracking automated government.

