AI Incident Database

Report 5195

Associated Incidents

Incident 1075 · 15 Reports
New Orleans Police Reportedly Used Real-Time Facial Recognition Alerts Supplied by Project NOLA Despite Local Ordinance

Rights Groups Raise Alarm Over First US City's Broad Use of Facial Recognition Tracking
commondreams.org · 2025

Amid a Washington Post investigation and pushback from civil liberties defenders, New Orleans police recently paused their sweeping, apparently unlawful, and publicly unsupervised use of a private network of more than 200 surveillance cameras and facial recognition technology to track and arrest criminal suspects.

On Monday, the Post published an exposé detailing how the New Orleans Police Department (NOPD) relied on real-time facial recognition technology provided by Project NOLA, a nonprofit organization operating out of the University of New Orleans, to locate and apprehend suspects.


Project NOLA's website says the group "operates the largest, most cost-efficient, and successful networked [high definition] crime camera program in America, which was created in 2009 by criminologist Bryan Lagarde to help reduce crime by dramatically increasing police efficiency and citizen awareness."

The Post's Douglas MacMillan and Aaron Schaffer described Project NOLA as "a surveillance method without a known precedent in any major American city that may violate municipal guardrails around use of the technology."

As MacMillan and Schaffer reported:

Police increasingly use facial recognition software to identify unknown culprits from still images, usually taken by surveillance cameras at or near the scene of a crime. New Orleans police took this technology a step further, utilizing a private network of more than 200 facial recognition cameras to watch over the streets, constantly monitoring for wanted suspects and automatically pinging officers' mobile phones through an app to convey the names and current locations of possible matches.

This, despite a 2022 municipal law limiting police use of facial recognition. That ordinance reversed the city's earlier outright ban on the technology and was criticized by civil liberties advocates for dropping a provision that required permission from a judge or magistrate commissioner prior to use.

"This is the facial recognition technology nightmare scenario that we have been worried about," Nathan Freed Wessler, deputy director with the ACLU's Speech, Privacy, and Technology Project, told the Post. "This is the government giving itself the power to track anyone---for that matter, everyone---as we go about our lives walking around in public."

Since 2023, Project NOLA---whose automated alerts were paused last month amid the Post's investigation---has contributed to dozens of arrests. Proponents, including the NOPD and city officials, credit the collaboration with Project NOLA for a decrease in crime in a city that had the nation's highest homicide rate as recently as 2022. Project NOLA has even been featured in the true crime series "Real Time Crime."

New Orleans Police Commissioner Anne Kirkpatrick told Project NOLA last month that its automated alerts must be shut off until she is "sure that the use of the app meets all the requirements of the law and policies."

Critics point to racial bias in facial recognition algorithms, which disproportionately misidentify racial minorities, as a particular cause for concern. According to one landmark federal study published in 2019, Black, Asian, and Native American people were up to 100 times likelier to be misidentified by facial recognition algorithms than white people.

The ACLU said in a statement that Project NOLA "supercharges the risks":

Consider Randal Reid, for example. He was wrongfully arrested based on faulty Louisiana facial recognition technology, despite never having set foot in the state. The false match cost him his freedom, his dignity, and thousands of dollars in legal fees. That misidentification happened based on a still image run through a facial recognition search in an investigation.

"We cannot ignore the real possibility of this tool being weaponized against marginalized communities, especially immigrants, activists, and others whose only crime is speaking out or challenging government policies," ACLU of Louisiana executive director Alanah Odoms said. "These individuals could be added to Project NOLA's watchlist without the public's knowledge, and with no accountability or transparency on the part of the police departments."

"Facial recognition technology poses a direct threat to the fundamental rights of every individual and has no place in our cities," Odoms asserted. "We call on the New Orleans Police Department and the city of New Orleans to halt this program indefinitely and terminate all use of live-feed facial recognition technology."

Correction: This article has been updated to accurately reflect the context of New Orleans Police Commissioner Anne Kirkpatrick's quote.
