AI Incident Database

Report 3970

Related Incidents

Incident 740 · 1 Report
Department for Work and Pensions (DWP) AI Systems Allegedly Discriminate Against Single Mothers

DWP accused of discriminating against single mums with Universal Credit changes
mirror.co.uk · 2024

The Universal Credit system is disproportionately impacting working single mothers, it has been claimed.

Universal Credit is managed by the Department for Work and Pensions (DWP) and is the most common UK benefit, with 6.7 million people claiming it as of April 2024. Since its launch in 2013, Universal Credit has been a "digital by default" system, says the Big Issue, because it uses a range of automated processes to determine people's eligibility and entitlement and to detect fraud.

A recent Freedom of Information (FOI) request from the University of Edinburgh revealed that nearly half of in-work Universal Credit claimants are single parents, and the vast majority of them - just under 90% - are women. University researchers note that automation within the Universal Credit system could lead to a system based on discrimination and surveillance.

Morgan Currie, a senior lecturer in data and society at the University of Edinburgh, told the Big Issue: "A certain subsection of the population, those on low incomes and with disabilities, are being put under surveillance by systems that simply don't affect the rest of the population."

Currie explained that the three most common problems with the benefit's automated processes are mistakes caused by invalid information about a person's earnings, hardship due to delayed childcare reimbursement, and misalignment between Universal Credit calculation dates and paydays, all of which "overwhelmingly affect" working single mothers.

Researchers told the Big Issue that the FOI points to "clear risks" of discrimination due to the way Universal Credit algorithms work. This is because they are based on historical information to make predictions about future events.

Anna Dent, head of research at Careful Trouble, told the publication: "Existing biases can be encoded in automated decision making through the data that systems are trained on. If, say, a system to spot fraud is trained on data about historical fraud cases there is a risk that institutional biases will be baked in -- if disabled people, people from certain ethnic backgrounds or single parents, for example, have been disproportionately targeted in the past, there will be a higher percentage of fraud cases involving these groups, which will then encode the same bias into any system trained on that data."
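Dent's point about encoded bias can be illustrated with a toy calculation. All numbers below are hypothetical assumptions for illustration, not drawn from DWP data: if two groups commit fraud at the same true rate but one has historically been investigated three times as often, the labelled "fraud" cases, and any model trained on them, will make that group look three times riskier.

```python
# Hypothetical figures only: both groups commit fraud at the same true
# rate, but group B was historically investigated three times as often.
population = {"A": 10_000, "B": 10_000}
true_fraud_rate = 0.02                      # identical real behaviour
investigation_rate = {"A": 0.10, "B": 0.30} # historical targeting skew

# Labelled fraud cases = fraud that was actually caught by an investigation,
# so the training labels already carry the targeting skew.
labelled_fraud = {g: population[g] * true_fraud_rate * investigation_rate[g]
                  for g in population}

# A naive risk score learned from these labels reproduces the skew:
# group B appears three times "riskier" despite identical behaviour.
risk_score = {g: labelled_fraud[g] / sum(labelled_fraud.values())
              for g in population}
print(risk_score)
```

The bias enters before any model is trained: the label-generation process, not the algorithm itself, bakes in the historical targeting pattern.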

One example of these algorithmic issues is that Universal Credit's algorithm fails to factor in how often people are paid. This can lead to an overestimation - or underestimation - of someone's earnings, causing financial turbulence. For people on lower incomes, the resulting payment inconsistency can force them into debt in one month even though they scraped by the month before.
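The pay-frequency mismatch can be sketched with hypothetical figures: a claimant paid every 28 days but assessed over calendar months will, in some months, have two paydays fall inside one assessment window, so their assessed earnings double even though their wage has not changed. The dates, wage, and window logic below are illustrative assumptions, not the DWP's actual calculation.

```python
from datetime import date, timedelta

def paydays_in_window(first_payday, pay_interval_days, window_start, window_end):
    """Count paydays falling inside one assessment window."""
    count = 0
    payday = first_payday
    while payday <= window_end:
        if payday >= window_start:
            count += 1
        payday += timedelta(days=pay_interval_days)
    return count

wage_per_packet = 1000  # hypothetical 4-weekly wage

# A claimant first paid on 5 January 2024, then every 28 days,
# assessed over calendar months: March captures two paydays
# (1 and 29 March), so assessed earnings double for that month.
for month_start, month_end in [(date(2024, 1, 1), date(2024, 1, 31)),
                               (date(2024, 2, 1), date(2024, 2, 29)),
                               (date(2024, 3, 1), date(2024, 3, 31))]:
    n = paydays_in_window(date(2024, 1, 5), 28, month_start, month_end)
    print(month_start.strftime("%B"), "assessed earnings:", n * wage_per_packet)
```

A means test that reads "earnings in the window" without normalising for pay frequency will treat that doubled month as a genuine income spike and cut the award accordingly.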

A study by the University of Bath published in April found that over half of Universal Credit households saw their payments vary by £400 or more from one month to the next. For around a quarter, the variation was £600. Anna added: "[Because] single mums are disproportionately represented among working claimants, around 40%, the flaws with the payment algorithm are more likely to affect this group. What this means is that single working mums are especially likely to encounter income volatility and financial instability due to the means-testing calculation."

Automation is becoming more common in government decision-making. However, researchers say this algorithmic discrimination is compounded by the government's lack of transparency, which makes it "near impossible" to challenge its decisions - even when they are wrong.

Currently, eight algorithms are recorded and explained under the government's Algorithmic Transparency Recording Standard, while according to the Public Law Project as many as 40 are in use across local and central government departments. Researchers explained that even though legislation contains safeguards against algorithmic discrimination, it is hard to challenge automated decisions without knowing how the algorithm is being used.

A DWP spokesperson said: "We will review the system so that everyone, including single mums, can benefit from a welfare system that makes work pay and tackles inequality."
