Citation record for Incident 335

Description: The UK Home Office's algorithm for assessing visa application risk explicitly considered nationality, allegedly causing applicants from certain countries to face heightened scrutiny and discrimination.
Alleged: An AI system developed by the UK Home Office and deployed by UK Visas and Immigration, which affected UK visa applicants from some countries.

Incident Stats

Incident ID
335
Report Count
8
Incident Date
2015-03-01
Editors
Khoa Lam
Legal action to challenge Home Office use of secret algorithm to assess visa applications
foxglove.org.uk · 2017

It has come to light that the Home Office is using a secretive algorithm, which it describes as a digital "streaming tool," to sift visa applications. So far they have refused to disclose much information about how the algorithm works, hiding…

AI system for granting UK visas is biased, rights groups claim
theguardian.com · 2019

Immigrant rights campaigners have begun a ground-breaking legal case to establish how a Home Office algorithm that filters UK visa applications actually works.

The challenge is the first court bid to expose how an artificial intelligence pr…

The use of Artificial Intelligence by the Home Office to stream visa applications
kingsleynapley.co.uk · 2019

The growth of technology has brought a great deal of efficiency and security to almost all organisations and businesses. But such progress may have taken a slightly wrong turn as the reliance on artificial intelligence by the Home Office as…

Update: papers filed for judicial review of the Home Office’s visa algorithm
foxglove.org.uk · 2020

Foxglove is supporting the Joint Council for the Welfare of Immigrants (JCWI) to challenge the Home Office’s use of a secret algorithm to sift visa applications, which it describes as a digital “streaming tool”.

We share JCWI’s concerns tha…

UK commits to redesign visa streaming algorithm after challenge to 'racist' tool
techcrunch.com · 2020

The U.K. government is suspending the use of an algorithm used to stream visa applications after concerns were raised the technology bakes in unconscious bias and racism.

The tool had been the target of a legal challenge. The Joint Council …

Home Office says it will abandon its racist visa algorithm - after we sued them
foxglove.org.uk · 2020

Home Office lawyers wrote to us yesterday, to respond to the legal challenge which we’ve been working on with the Joint Council for the Welfare of Immigrants (JCWI)

We were asking the Court to declare the streaming algorithm unlawful, and…

Home Office drops 'racist' algorithm from visa decisions
bbc.com · 2020

The Home Office has agreed to stop using a computer algorithm to help decide visa applications after allegations that it contained "entrenched racism".

The Joint Council for the Welfare of Immigrants (JCWI) and digital rights group Foxglove…

We won! Home Office to stop using racist visa algorithm
jcwi.org.uk · 2020

We are delighted to announce that the Home Office has agreed to scrap its 'visa streaming' algorithm, in response to legal action we launched with tech-justice group Foxglove.

From Friday, 7 August, Home Secretary Priti Patel will suspend t…

Variants

A "variant" is an incident that shares the same causative factors as an existing AI incident, produces similar harms, and involves the same intelligent system. Rather than indexing variants as entirely separate incidents, we list them as variations of the first similar incident submitted to the database. Unlike other submission types in the incident database, variants do not require reporting evidence from outside the incident database. Learn more from this research paper.

Similar Incidents

By textual similarity
