Incident 40: COMPAS Algorithm Performs Poorly in Crime Recidivism Prediction

Description: Correctional Offender Management Profiling for Alternative Sanctions (COMPAS), a recidivism risk-assessment algorithmic tool used in the judicial system to assess the likelihood of defendants' recidivism, was found to be no more accurate than random untrained human evaluators.

Alleged: Equivant developed and deployed an AI system, which harmed Accused People.

Incident Stats

Incident ID
40
Report Count
22
Incident Date
2016-05-23
Editors
Sean McGregor

CSETv0 Taxonomy Classifications

Taxonomy Details

Full Description

In 2018, researchers at Dartmouth College conducted a study comparing the Correctional Offender Management Profiling for Alternative Sanctions (COMPAS), a recidivism risk-assessment algorithmic tool, against 462 random untrained human subjects in predicting criminals' risk of recidivism. Researchers gave the subjects descriptions of defendants highlighting seven pieces of information and asked them to rate each defendant's risk of recidivism on a scale of 1-10. The pooled judgment of these untrained subjects was accurate 67% of the time, compared with COMPAS's accuracy rate of 65%.
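The pooling step described above can be illustrated with a minimal sketch. This is not the study's actual method or data: the ratings, outcomes, and threshold below are invented for illustration, and the study pooled responses from 462 participants rather than five.

```python
# Sketch: pooling untrained raters' 1-10 risk scores into one
# per-defendant prediction, then scoring accuracy against outcomes.
# All data here is hypothetical, not drawn from the Dartmouth study.

from statistics import median

def pooled_predictions(ratings, threshold=5):
    """For each defendant, take the median rating across raters and
    predict recidivism when it exceeds the threshold."""
    return [median(r) > threshold for r in ratings]

def accuracy(predictions, outcomes):
    """Fraction of defendants whose prediction matched the outcome."""
    correct = sum(p == o for p, o in zip(predictions, outcomes))
    return correct / len(outcomes)

# Three defendants, five raters each (scores on a 1-10 scale).
ratings = [
    [8, 7, 9, 6, 8],   # rated high risk
    [2, 3, 1, 4, 2],   # rated low risk
    [5, 6, 4, 5, 5],   # borderline: median 5, predicted low risk
]
reoffended = [True, False, True]  # observed outcomes (hypothetical)

preds = pooled_predictions(ratings)
print(accuracy(preds, reoffended))  # 2 of 3 correct -> 0.666...
```

The same accuracy function applied to an algorithm's binary predictions gives a like-for-like comparison, which is essentially how the 67% vs. 65% figures are contrasted.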

Short Description

Correctional Offender Management Profiling for Alternative Sanctions (COMPAS), a recidivism risk-assessment algorithmic tool used in the judicial system to assess the likelihood of defendants' recidivism, was found to be no more accurate than random untrained human evaluators.

Severity

Minor

Harm Type

Harm to social or political systems

AI System Description

Predictive risk-assessment algorithm that produces scores correlating with a subject's recidivism risk

System Developer

Equivant

Sector of Deployment

Public administration and defence

Relevant AI functions

Perception, Cognition, Action

AI Techniques

law enforcement algorithm

AI Applications

risk assessment

Location

USA

Named Entities

Dartmouth College, Equivant

Technology Purveyor

Equivant

Beginning Date

2018-01-17

Ending Date

2018-01-17

Near Miss

Near miss

Intent

Accident

Lives Lost

No

Infrastructure Sectors

Government facilities

Data Inputs

Questionnaire consisting of 137 factors, such as age, prior convictions, and criminal record

CSETv1 Taxonomy Classifications

Taxonomy Details

Harm Distribution Basis

race, nation of origin, citizenship, immigrant status

Sector of Deployment

law enforcement, public administration

Inspecting Algorithms for Bias

Inspecting Algorithms for Bias

technologyreview.com

How We Analyzed the COMPAS Recidivism Algorithm
propublica.org · 2016


Across the nation, judges, probation and parole officers are increasingly using algorithms to assess a criminal defendant’s likelihood of becoming a recidivist – a term used to describe criminals who re-offend. There are do…

Inspecting Algorithms for Bias
technologyreview.com · 2017

It was a striking story. “Machine Bias,” the headline read, and the teaser proclaimed: “There’s software used across the country to predict future criminals. And it’s biased against blacks.”

ProPublica, a Pulitzer Prize–winning nonprofit ne…

When a Computer Program Keeps You in Jail
nytimes.com · 2017

The criminal justice system is becoming automated. At every stage — from policing and investigations to bail, evidence, sentencing and parole — computer systems play a role. Artificial intelligence deploys cops on the beat. Audio sensors ge…

ProPublica Is Wrong In Charging Racial Bias In An Algorithm
acsh.org · 2018

Predicting the future is not only the provenance of fortune tellers or media pundits. Predictive algorithms, based on extensive datasets and statistics have overtaken wholesale and retail operations as any online shopper knows. And in the l…

Mechanical Turkers out-predicted COMPAS, a major judicial algorithm
theverge.com · 2018

Our most sophisticated crime-predicting algorithms may not be as good as we thought. A study published today in Science Advances takes a look at the popular COMPAS algorithm — used to assess the likelihood that a given defendant will reoffe…

A Popular Algorithm Is No Better at Predicting Crimes Than Random People
theatlantic.com · 2018

Caution is indeed warranted, according to Julia Dressel and Hany Farid from Dartmouth College. In a new study, they have shown that COMPAS is no better at predicting an individual’s risk of recidivism than random volunteers recruited from t…

Are programs better than people at predicting reoffending?
economist.com · 2018

IN AMERICA, computers have been used to assist bail and sentencing decisions for many years. Their proponents argue that the rigorous logic of an algorithm, trained with a vast amount of data, can make judgments about whether a convict will…

The accuracy, fairness, and limits of predicting recidivism
science.org · 2018

Algorithms for predicting recidivism are commonly used to assess a criminal defendant’s likelihood of committing a crime. These predictions are used in pretrial, parole, and sentencing decisions. Proponents of these systems argue that big d…

Software 'no more accurate than untrained humans' at judging reoffending risk
theguardian.com · 2018

Program used to assess more than a million US defendants may not be accurate enough for potentially life-changing decisions, say experts

The credibility of a computer program used for bail and sentencing decisions has been called into quest…

Bail Algorithms Are As Accurate As Random People Doing an Online Survey
motherboard.vice.com · 2018

Algorithms that assess people’s likelihood to reoffend as part of the bail-setting process in criminal cases are, to be frank, really scary.

We don’t know very much about how they work—the companies that make them are intensely secretive ab…

sciencedaily.com · 2018

A widely-used computer software tool may be no more accurate or fair at predicting repeat criminal behavior than people with no criminal justice experience, according to a Dartmouth College study.

The Dartmouth analysis showed that non-expe…

Crime-Predicting Algorithms May Not Fare Much Better Than Untrained Humans
wired.com · 2018

The American criminal justice system couldn’t get much less fair. Across the country, some 1.5 million people are locked up in state and federal prisons. More than 600,000 people, the vast majority of whom have yet to be convicted of a crim…

Common Computer Program Predicts Recidivism as Poorly as Humans
inverse.com · 2018

Just like a professional chef or a heart surgeon, a machine learning algorithm is only as good as the training it receives. And as algorithms increasingly take the reins and make decisions for humans, we’re finding out that a lot of them d…

Algorithms Are No Better at Predicting Repeat Offenders Than Inexperienced Humans
futurism.com · 2018

Predicting Recidivism

Recidivism is the likelihood of a person convicted of a crime to offend again. Currently, this rate is determined by predictive algorithms. The outcome can affect everything from sentencing decisions to whether or not …

Study Finds Crime-Predicting Algorithm Is No Smarter Than Online Poll Takers
gizmodo.com.au · 2018

In a study published Wednesday, a pair of Dartmouth researchers found that a popular risk assessment algorithm was no better at predicting a criminal offender's likelihood of reoffending than an internet survey of humans with little or no r…

Criminal Sentencing Algorithm No More Accurate Than Random People on the Internet
pbs.org · 2018


An “unbiased” computer algorithm used for informing judicial decisi…

digitalethics.org · 2018

Although crime rates have fallen steadily since the 1990s, rates of recidivism remain a factor in the areas of both public safety and prisoner management. The National Institute of Justice defines recidivism as “criminal acts that resulted …

Can Racial Bias Ever Be Removed From Criminal Justice Algorithms?
psmag.com · 2018


Dozens of people packed into a Philadelphia courtroom on June 6th to voice their objections to a proposed criminal justice algorithm. The algorithm, developed by the Pennsylvania Commission on Sentencing, wa…

AI is convicting criminals and determining jail time, but is it fair?
weforum.org · 2018

When Netflix gets a movie recommendation wrong, you’d probably think that it’s not a big deal. Likewise, when your favourite sneakers don’t make it into Amazon’s list of recommended products, it’s probably not the end of the world. But when…

privacyinternational.org · 2019

In a study of COMPAS, an algorithmic tool used in the US criminal justice system , Dartmouth College researchers Julia Dressel and Hany Farid found that the algorithm did no better than volunteers recruited via a crowdsourcing site. COMPAS,…

upturn.org · 2020

PRELIMINARY STATEMENT AND STATEMENT OF INTEREST

Independent and adversarial review of software used in the

criminal legal system is necessary to protect the courts from

unreliable evidence and to ensure that the introduction of new

technolo…

slaw.ca · 2020

Recidivism risk assessment is the process of determining the likelihood that an accused, convicted, or incarcerated persons will reoffend. The process is aimed at assisting in the determination of the appropriate limitation on the freedom o…

Variants

A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.
