Citation Information for Incident 11

Description: An algorithm developed by Northpointe and used in the penal system is two times more likely to incorrectly label a black person as a high-risk re-offender and is two times more likely to incorrectly label a white person as low-risk for reoffense according to a ProPublica review.
Alleged: An AI system developed and provided by Northpointe harmed Accused People.

Incident Stats

Incident ID
11
Report Count
15
Incident Date
2016-05-23
Editors
Sean McGregor

CSETv0 Taxonomy Classifications

Taxonomy Details

Full Description

An algorithm developed by Northpointe and used in the penal system is shown to be inaccurate and to produce racially skewed results, according to a review by ProPublica. The review shows how the 137-question survey administered following an arrest is inaccurate and skewed against people of color. While the survey contains no question regarding race, the algorithm is two times more likely to incorrectly label a black person as a high-risk re-offender (false positive) and two times more likely to incorrectly label a white person as low-risk for reoffense (false negative) than actual outcomes support. Overall, the algorithm is 61% accurate at predicting reoffense. The system is used in Broward County, Florida to help judges make decisions about pre-trial release and post-trial sentencing.
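The "two times more likely" figures refer to group-wise false positive and false negative rates. As an illustration only, the sketch below shows how such a disparity is computed; it is not ProPublica's analysis code, and the counts are hypothetical placeholders modeled loosely on the rates reported in the review rather than the actual Broward County data.

```python
# Illustrative sketch: group-wise false positive / false negative rates.
# Counts below are hypothetical, not the Broward County dataset.

def error_rates(records):
    """Return (false_positive_rate, false_negative_rate) for a list of
    (predicted_high_risk, actually_reoffended) boolean pairs."""
    fp = sum(1 for pred, actual in records if pred and not actual)
    fn = sum(1 for pred, actual in records if not pred and actual)
    negatives = sum(1 for _, actual in records if not actual)  # did not reoffend
    positives = sum(1 for _, actual in records if actual)      # did reoffend
    return fp / negatives, fn / positives

# Hypothetical groups: roughly double the false positive rate for one group
# and roughly double the false negative rate for the other.
black = [(True, False)] * 45 + [(False, False)] * 55 + [(True, True)] * 72 + [(False, True)] * 28
white = [(True, False)] * 23 + [(False, False)] * 77 + [(True, True)] * 52 + [(False, True)] * 48

for group, data in [("black defendants", black), ("white defendants", white)]:
    fpr, fnr = error_rates(data)
    print(f"{group}: false positive rate {fpr:.0%}, false negative rate {fnr:.0%}")
```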

Short Description

An algorithm developed by Northpointe and used in the penal system is two times more likely to incorrectly label a black person as a high-risk re-offender and is two times more likely to incorrectly label a white person as low-risk for reoffense according to a ProPublica review.

Severity

Unclear/unknown

Harm Distribution Basis

Race

Harm Type

Harm to civil liberties, Other: Reputational harm; False incarceration

AI System Description

An algorithm developed by Northpointe, designed to assign a risk score reflecting a person's likelihood of reoffending after their original arrest.

System Developer

Northpointe

Sector of Deployment

Public administration and defence

Relevant AI functions

Perception, Cognition

AI Techniques

law enforcement algorithm, crime prediction algorithm

AI Applications

risk assessment, crime projection

Location

Broward County, Florida

Named Entities

ProPublica, Northpointe, COMPAS, Broward County, FL

Technology Purveyor

Northpointe

Beginning Date

2016-01-01

Ending Date

2019-01-01

Near Miss

Harm caused

Intent

Accident

Lives Lost

No

Infrastructure Sectors

Government facilities

Data Inputs

137-question survey

CSETv1 Taxonomy Classifications

Taxonomy Details

Harm Distribution Basis

race

Sector of Deployment

law enforcement, public administration

privacyinternational.org · 2016

Computer programs that perform risk assessments of crime suspects are increasingly common in American courtrooms, and are used at every stage of the criminal justice systems to determine who may be set free or granted parole, and the size o…

How We Analyzed the COMPAS Recidivism Algorithm
propublica.org · 2016

Across the nation, judges, probation and parole officers are increasingly using algorithms to assess a criminal defendant’s likelihood of becoming a recidivist – a term used to describe criminals who re-offend. There are dozens of these ris…

U.S. Courts Are Using Algorithms Riddled With Racism to Hand Out Sentences
mic.com · 2016

For years, the criminal justice community has been worried. Courts across the country are assigning bond amounts and sentencing the accused based on algorithms, and both lawyers and data scientists warn that these algorithms could be poisoned b…

Machine Bias - ProPublica
propublica.org · 2016

On a spring afternoon in 2014, Brisha Borden was running late to pick up her god-sister from school when she spotted an unlocked kid’s blue Huffy bicycle and a silver Razor scooter. Borden and a friend grabbed the bike and scooter and tried…

The Hidden Discrimination In Criminal Risk-Assessment Scores
npr.org · 2016

Courtrooms across the country are increasingly using a defendant's "risk assessment score" to help make decisions about bond, parole and sentencing. The companies behind these sco…

Even algorithms are biased against black men
theguardian.com · 2016

One of my most treasured possessions is The Art of Computer Programming by Donald Knuth, a computer scientist for whom the word “legendary” might have been coined. In a way, one could think of his magnum opus as an attempt to do for compute…

Are criminal risk assessment scores racist?
brookings.edu · 2016

Imagine you were found guilty of a crime and were waiting to learn your sentence. Would you rather have your sentence determined by a computer algorithm, which dispassionately weights factors that predict your future risk of crime (such as …

A New Program Judges If You’re a Criminal From Your Facial Features
vice.com · 2016

Like a more crooked version of the Voight-Kampff test from Blade Runner, a new machine learning paper from a pair of Chinese researchers has delved into the controversial task of letting a computer decide on your innocence. Can a computer k…

ProPublica Is Wrong In Charging Racial Bias In An Algorithm
acsh.org · 2018

Predicting the future is not only the provenance of fortune tellers or media pundits. Predictive algorithms, based on extensive datasets and statistics have overtaken wholesale and retail operations as any online shopper knows. And in the l…

A Popular Algorithm Is No Better at Predicting Crimes Than Random People
theatlantic.com · 2018

Caution is indeed warranted, according to Julia Dressel and Hany Farid from Dartmouth College. In a new study, they have shown that COMPAS is no better at predicting an individual’s risk of recidivism than random volunteers recruited from t…

Algorithmic Injustice
thenewatlantis.com · 2018

Don’t blame the algorithm — as long as there are racial disparities in the justice system, sentencing software can never be entirely fair.

For generations, the Maasai people of eastern Africa have passed down the story of a tireless old man…

digitalethics.org · 2018

Although crime rates have fallen steadily since the 1990s, rates of recidivism remain a factor in the areas of both public safety and prisoner management. The National Institute of Justice defines recidivism as “criminal acts that resulted …

New York City Takes on Algorithmic Discrimination
aclu.org · 2018

Invisible algorithms increasingly shape the world we live in, and not always for the better. Unfortunately, few mechanisms are in place to ensure they’re not causing more harm than good.

That might finally be changing: A first-in-the-nation…

Yes, artificial intelligence can be racist
vox.com · 2019

Open up the photo app on your phone and search “dog,” and all the pictures you have of dogs will come up. This was no easy feat. Your phone knows what a dog “looks” like.

This modern-day marvel is the result of machine learning, a form of a…

Can you make AI fairer than a judge? Play our courtroom algorithm game
technologyreview.com · 2019

As a child, you develop a sense of what “fairness” means. It’s a concept that you learn early on as you come to terms with the world around you. Something either feels fair or it doesn’t.

But increasingly, algorithms have begun to arbitrate…

Variants

A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as an existing AI incident. Rather than being indexed as entirely separate incidents, variants are listed as variations under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. See this research paper for details.

Similar Incidents

By textual similarity

Did our AI mess up? Flag the unrelated incidents