Associated Incidents

The use of artificial intelligence (AI) in public services has caused uproar before. In the Netherlands, for instance, tax authorities used an algorithm to detect fraud, but the system made many mistakes, wrongly accusing families and pushing thousands into poverty. The scandal resulted in a hefty €3.7m fine for the tax administration.
A recent investigation by The Guardian revealed that government officials and civil servants in at least eight Whitehall departments, as well as some police forces in the United Kingdom, have used AI to make significant decisions about benefit allocations.
Officials used AI and complex algorithms to make decisions on welfare, immigration, and criminal justice matters. The tools are used to determine benefits, approve marriage licenses, and flag potential fraud and sham marriages, among other tasks.
Racial discrimination by AI tools
The Guardian found that particular tools had gone wrong: an algorithm used by the Department for Work and Pensions inaccurately cut benefits for many claimants, while the Metropolitan Police's facial recognition software exhibited racial bias, performing better on white faces than black ones in certain conditions.
The Home Office's algorithm designed to detect fake marriages has disproportionately targeted individuals from specific nationalities.
AI systems typically learn from large sets of existing data, yet even their creators may not fully understand how they process that information. If the data an AI learns from is biased, its decisions may be biased too, experts cautioned.
Shameem Ahmad, the chief executive of the Public Law Project, acknowledged that AI has tremendous potential for social good.
"For instance, we can make things more efficient. But we cannot ignore the serious risks," Ahmad added. "Without urgent action, we could sleep-walk into a situation where opaque automated systems are regularly, possibly unlawfully, used in life-altering ways, and where people will not be able to seek redress when those processes go wrong."
Tech identifying fake marriages
The Home Office said that it employed AI in e-gates at airports for passport scanning, in processing passport applications, and in its "sham marriage triage tool" to identify potential fake marriages for further scrutiny.
However, The Guardian's investigation found that the tool disproportionately flags people from Albania, Greece, Romania, and Bulgaria.
Also, the Department for Work and Pensions (DWP) operates an "integrated risk and intelligence service" that uses an algorithm to identify fraud and errors in benefits claims.
According to Labour MP Kate Osamor, this algorithm may have led to many Bulgarians having their benefits wrongly suspended and being falsely accused of potential fraud in recent years.
The DWP emphasized that the algorithm doesn't consider nationality in its calculations. A spokesperson further stated:
"We are cracking down on those who try to exploit the system and shamelessly steal from those most in need as we continue our drive to save the taxpayer £1.3bn next year."