Incident 335: UK Visa Streamline Algorithm Allegedly Discriminated Based on Nationality

Description: The UK Home Office's algorithm for assessing visa application risk explicitly considered nationality, allegedly causing applicants of certain nationalities to face heightened scrutiny and discrimination.


Incident Stats

Incident ID
335
Report Count
8
Incident Date
2015-03-01
Editors
Khoa Lam
Legal action to challenge Home Office use of secret algorithm to assess visa applications
foxglove.org.uk · 2017

It has come to light that the Home Office is using a secretive algorithm, which it describes as a digital “streaming tool,” to sift visa applications. So far they have refused to disclose much information about how the algorithm works, hiding…

AI system for granting UK visas is biased, rights groups claim
theguardian.com · 2019

Immigrant rights campaigners have begun a ground-breaking legal case to establish how a Home Office algorithm that filters UK visa applications actually works.

The challenge is the first court bid to expose how an artificial intelligence pr…

The use of Artificial Intelligence by the Home Office to stream visa applications
kingsleynapley.co.uk · 2019

The growth of technology has brought a great deal of efficiency and security to almost all organisations and businesses. But such progress may have taken a slightly wrong turn as the reliance on artificial intelligence by the Home Office as…

Update: papers filed for judicial review of the Home Office’s visa algorithm
foxglove.org.uk · 2020

Foxglove is supporting the Joint Council for the Welfare of Immigrants (JCWI) to challenge the Home Office’s use of a secret algorithm to sift visa applications, which it describes as a digital “streaming tool”.

We share JCWI’s concerns tha…

UK commits to redesign visa streaming algorithm after challenge to 'racist' tool
techcrunch.com · 2020

The U.K. government is suspending the use of an algorithm used to stream visa applications after concerns were raised the technology bakes in unconscious bias and racism.

The tool had been the target of a legal challenge. The Joint Council …

Home Office says it will abandon its racist visa algorithm - after we sued them
foxglove.org.uk · 2020

Home Office lawyers wrote to us yesterday to respond to the legal challenge which we’ve been working on with the Joint Council for the Welfare of Immigrants (JCWI).

We were asking the Court to declare the streaming algorithm unlawful, and…

Home Office drops 'racist' algorithm from visa decisions
bbc.com · 2020

The Home Office has agreed to stop using a computer algorithm to help decide visa applications after allegations that it contained "entrenched racism".

The Joint Council for the Welfare of Immigrants (JCWI) and digital rights group Foxglove…

We won! Home Office to stop using racist visa algorithm
jcwi.org.uk · 2020

We are delighted to announce that the Home Office has agreed to scrap its 'visa streaming' algorithm, in response to legal action we launched with tech-justice group Foxglove.

From Friday, 7 August, Home Secretary Priti Patel will suspend t…

Variants

A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.