Incident 74: Detroit Police Wrongfully Arrested Black Man Due To Faulty FRT

Suggested citation format

McGregor, Sean. (2020-01-30) Incident Number 74. In McGregor, S. (ed.) Artificial Intelligence Incident Database. Responsible AI Collaborative.

Incident Stats

Incident ID: 74
Report Count: 9
Incident Date: 2020-01-30
Editors: Sean McGregor, Khoa Lam

CSET Taxonomy Classifications

Taxonomy Details

Full Description

In January 2020, the Detroit Police Department wrongfully arrested Robert Julian-Borchak Williams after facial recognition technology provided by DataWorks Plus mistakenly matched him to another man recorded by a CCTV camera while shoplifting. The incident is cited as an instance of facial recognition's persistent racial bias, which produces markedly higher error rates for Black and Asian faces.

Short Description

The Detroit Police Department wrongfully arrested a Black man based on a faulty match produced by facial recognition software provided by DataWorks Plus.

Severity

Moderate

Harm Distribution Basis

Race

Harm Type

Harm to civil liberties

AI System Description

DataWorks Plus facial recognition software was provided to the Detroit Police Department and focuses on biometrics storage and matching, including fingerprints, palm prints, irises, tattoos, and mugshots.

System Developer

DataWorks Plus

Sector of Deployment

Public administration and defence

Relevant AI functions

Perception, Cognition, Action

AI Techniques

facial recognition, machine learning, environmental sensing

AI Applications

Facial recognition, environmental sensing, biometrics, image recognition, speech recognition

Location

United States (Detroit, Michigan)

Named Entities

Detroit Police Department, DataWorks Plus

Technology Purveyor

DataWorks Plus

Beginning Date

06/2020

Ending Date

06/2020

Near Miss

Harm caused

Intent

Accident

Lives Lost

No

Data Inputs

biometrics, images, camera footage

Incident Reports

Detroit police wrongfully arrested Robert Julian-Borchak Williams in January 2020 for a shoplifting incident that had taken place two years earlier. Even though Williams had nothing to do with the incident, facial recognition technology used by Michigan State Police “matched” his face with a grainy image obtained from an in-store surveillance video showing another African American man taking US$3,800 worth of watches.

Two weeks later, the case was dismissed at the prosecution’s request. However, relying on the faulty match, police had already handcuffed and arrested Williams in front of his family, forced him to provide a mug shot, fingerprints and a sample of his DNA, interrogated him and imprisoned him overnight.

Experts suggest that Williams is not alone, and that others have been subjected to similar injustices. The ongoing controversy about police use of Clearview AI certainly underscores the privacy risks posed by facial recognition technology. But it’s important to realize that not all of us bear those risks equally.

Training racist algorithms

Facial recognition technology that is trained on and tuned to Caucasian faces systematically misidentifies and mislabels racialized individuals: numerous studies report that facial recognition technology is “flawed and biased, with significantly higher error rates when used against people of colour.”

This undermines the individuality and humanity of racialized persons who are more likely to be misidentified as criminal. The technology — and the identification errors it makes — reflects and further entrenches long-standing social divisions that are deeply entangled with racism, sexism, homophobia, settler-colonialism and other intersecting oppressions.

A France24 investigation into racial bias in facial recognition technology.

How technology categorizes users

In his game-changing 1993 book, The Panoptic Sort, scholar Oscar Gandy warned that “complex technology [that] involves the collection, processing and sharing of information about individuals and groups that is generated through their daily lives … is used to coordinate and control their access to the goods and services that define life in the modern capitalist economy.” Law enforcement uses it to pluck suspects from the general public, and private organizations use it to determine whether we have access to things like banking and employment.

Gandy prophetically warned that, if left unchecked, this form of “cybernetic triage” would exponentially disadvantage members of equality-seeking communities — for example, groups that are racialized or socio-economically disadvantaged — both in terms of what would be allocated to them and how they might come to understand themselves.

Some 25 years later, we’re now living with the panoptic sort on steroids. And examples of its negative effects on equality-seeking communities abound, such as the false identification of Williams.

Pre-existing bias

This algorithmic sorting infiltrates the most fundamental aspects of everyday life, occasioning both direct and structural violence in its wake.

The direct violence experienced by Williams is immediately evident in the events surrounding his arrest and detention, and the individual harms he experienced are obvious and can be traced to the actions of police who chose to rely on the technology’s “match” to make an arrest. More insidious is the structural violence perpetrated through facial recognition technology and other digital technologies that rate, match, categorize and sort individuals in ways that magnify pre-existing discriminatory patterns.

Structural violence harms are less obvious and less direct, causing injury to equality-seeking groups through the systematic denial of access to power, resources and opportunity. Simultaneously, they increase direct risk and harm to individual members of those groups.

Predictive policing uses algorithmic processing of historical data to predict when and where new crimes are likely to occur, assigns police resources accordingly and embeds enhanced police surveillance into communities, usually in lower-income and racialized neighbourhoods. This increases the chances that any criminal activity — including less serious criminal activity that might otherwise prompt no police response — will be detected and punished, ultimately limiting the life chances of the people who live within that environment.

And the evidence of inequities in other sectors continues to mount. Hundreds of students in the United Kingdom protested on Aug. 16 against the disastrous results of the flawed grading algorithm that Ofqual, the U.K. exam regulator, used to determine which students would qualify for university. In 2019, Facebook’s microtargeting ad service helped dozens of public and private sector employers exclude people from receiving job ads on the basis of age and gender. Research conducted by ProPublica has documented race-based price discrimination for online products. And search engines regularly produce racist and sexist results.

Perpetuating oppression

These outcomes matter because they perpetuate and deepen pre-existing inequalities based on characteristics like race, gender and age. They also matter because they deeply affect how we come to know ourselves and the world around us, sometimes by pre-selecting the information we receive in ways that reinforce stereotypical perceptions. Even technology companies themselves acknowledge the urgency of stopping algorithms from perpetuating discrimination.

To date the success of ad hoc investigations, conducted by the tech companies themselves, has been inconsistent. Occasionally, corporations involved in producing discriminatory systems withdraw them from the market, such as when Clearview AI announced it would no longer offer facial recognition technology in Canada. But often such decisions result from regulatory scrutiny or public outcry only after members of equality-seeking communities have already been harmed.

It’s time to give our regulatory institutions the tools they need to address the problem. Simple privacy protections that hinge on obtaining individual consent to enable data to be captured and repurposed by companies cannot be separated from the discriminatory outcomes of that use. This is especially true in an era when most of us (including technology companies themselves) cannot fully understand what algorithms do or why they produce specific results.

Privacy is a human right

Part of the solution entails breaking down the current regulatory silos that treat privacy and human rights as separate issues. Relying on a consent-based data protection model flies in the face of the basic principle that privacy and equality are both human rights that cannot be contracted away.

Even Canada’s Digital Charter — the federal government’s latest attempt to respond to the shortcomings of the current state of the digital environment — maintains these conceptual distinctions. It treats hate and extremism, control and consent, and strong democracy as separate categories.

To address algorithmic discrimination, we must recognize and frame both privacy and equality as human rights. And we must create an infrastructure that is equally attentive to and expert in both. Without such efforts, the glossy sheen of math and science will continue to camouflage AI’s discriminatory biases, and travesties such as that inflicted on Williams can be expected to multiply.

AI technologies — like police facial recognition — discriminate against people of colour

"Note: In response to this article, the Wayne County prosecutor’s office said that Robert Julian-Borchak Williams could have the case and his fingerprint data expunged. “We apologize,” the prosecutor, Kym L. Worthy, said in a statement, adding, “This does not in any way make up for the hours that Mr. Williams spent in jail.”

On a Thursday afternoon in January, Robert Julian-Borchak Williams was in his office at an automotive supply company when he got a call from the Detroit Police Department telling him to come to the station to be arrested. He thought at first that it was a prank.

An hour later, when he pulled into his driveway in a quiet subdivision in Farmington Hills, Mich., a police car pulled up behind, blocking him in. Two officers got out and handcuffed Mr. Williams on his front lawn, in front of his wife and two young daughters, who were distraught. The police wouldn’t say why he was being arrested, only showing him a piece of paper with his photo and the words “felony warrant” and “larceny.”

His wife, Melissa, asked where he was being taken. “Google it,” she recalls an officer replying.

The police drove Mr. Williams to a detention center. He had his mug shot, fingerprints and DNA taken, and was held overnight. Around noon on Friday, two detectives took him to an interrogation room and placed three pieces of paper on the table, face down.

“When’s the last time you went to a Shinola store?” one of the detectives asked, in Mr. Williams’s recollection. Shinola is an upscale boutique that sells watches, bicycles and leather goods in the trendy Midtown neighborhood of Detroit. Mr. Williams said he and his wife had checked it out when the store first opened in 2014.

The detective turned over the first piece of paper. It was a still image from a surveillance video, showing a heavyset man, dressed in black and wearing a red St. Louis Cardinals cap, standing in front of a watch display. Five timepieces, worth $3,800, were shoplifted.

“Is this you?” asked the detective.

The second piece of paper was a close-up. The photo was blurry, but it was clearly not Mr. Williams. He picked up the image and held it next to his face.

“No, this is not me,” Mr. Williams said. “You think all black men look alike?”

Mr. Williams knew that he had not committed the crime in question. What he could not have known, as he sat in the interrogation room, is that his case may be the first known account of an American being wrongfully arrested based on a flawed match from a facial recognition algorithm, according to experts on technology and the law.

A faulty system

A nationwide debate is raging about racism in law enforcement. Across the country, millions are protesting not just the actions of individual officers, but bias in the systems used to surveil communities and identify people for prosecution.

Facial recognition systems have been used by police forces for more than two decades. Recent studies by M.I.T. and the National Institute of Standards and Technology, or NIST, have found that while the technology works relatively well on white men, the results are less accurate for other demographics, in part because of a lack of diversity in the images used to develop the underlying databases.

Last year, during a public hearing about the use of facial recognition in Detroit, an assistant police chief was among those who raised concerns. “On the question of false positives — that is absolutely factual, and it’s well-documented,” James White said. “So that concerns me as an African-American male.”

This month, Amazon, Microsoft and IBM announced they would stop or pause their facial recognition offerings for law enforcement. The gestures were largely symbolic, given that the companies are not big players in the industry. The technology police departments use is supplied by companies that aren’t household names, such as Vigilant Solutions, Cognitec, NEC, Rank One Computing and Clearview AI.

Clare Garvie, a lawyer at Georgetown University’s Center on Privacy and Technology, has written about problems with the government’s use of facial recognition. She argues that low-quality search images — such as a still image from a grainy surveillance video — should be banned, and that the systems currently in use should be tested rigorously for accuracy and bias.

“There are mediocre algorithms and there are good ones, and law enforcement should only buy the good ones,” Ms. Garvie said.

About Mr. Williams’s experience in Michigan, she added: “I strongly suspect this is not the first case to misidentify someone to arrest them for a crime they didn’t commit. This is just the first time we know about it.”

In a perpetual lineup

Mr. Williams’s case combines flawed technology with poor police work, illustrating how facial recognition can go awry.

The Shinola shoplifting occurred in October 2018. Katherine Johnston, an investigator at Mackinac Partners, a loss prevention firm, reviewed the store’s surveillance video and sent a copy to the Detroit police, according to their report.

Five months later, in March 2019, Jennifer Coulson, a digital image examiner for the Michigan State Police, uploaded a “probe image” — a still from the video, showing the man in the Cardinals cap — to the state’s facial recognition database. The system would have mapped the man’s face and searched for similar ones in a collection of 49 million photos.

The state’s technology is supplied for $5.5 million by a company called DataWorks Plus. Founded in South Carolina in 2000, the company first offered mug shot management software, said Todd Pastorini, a general manager. In 2005, the firm began to expand the product, adding face recognition tools developed by outside vendors.

When one of these subcontractors develops an algorithm for recognizing faces, DataWorks attempts to judge its effectiveness by running searches using low-quality images of individuals it knows are present in a system. “We’ve tested a lot of garbage out there,” Mr. Pastorini said. These checks, he added, are not “scientific” — DataWorks does not formally measure the systems’ accuracy or bias.

“We’ve become a pseudo-expert in the technology,” Mr. Pastorini said.

In Michigan, the DataWorks software used by the state police incorporates components developed by the Japanese tech giant NEC and by Rank One Computing, based in Colorado, according to Mr. Pastorini and a state police spokeswoman. In 2019, algorithms from both companies were included in a federal study of over 100 facial recognition systems that found they were biased, falsely identifying African-American and Asian faces 10 times to 100 times more than Caucasian faces.

Rank One’s chief executive, Brendan Klare, said the company had developed a new algorithm for NIST to review that “tightens the differences in accuracy between different demographic cohorts.”

After Ms. Coulson, of the state police, ran her search of the probe image, the system would have provided a row of results generated by NEC and a row from Rank One, along with confidence scores. Mr. Williams’s driver’s license photo was among the matches. Ms. Coulson sent it to the Detroit police as an “Investigative Lead Report.”
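
To make the mechanics above concrete: a probe image is reduced to a numeric embedding, compared against every gallery embedding, and the system returns ranked candidates with similarity scores rather than a single identification. The sketch below is a minimal illustration of that pattern; the embedding function, gallery size, and scores are invented stand-ins, not the actual DataWorks Plus, NEC, or Rank One implementations.

```python
# Minimal sketch of an embedding-based face search (illustrative only).
import numpy as np

EMBED_DIM = 512  # a common face-embedding size; an assumption here
rng = np.random.default_rng(0)

# Stand-in gallery: one embedding per enrolled photo (mugshots, licenses, ...).
# The real Michigan gallery held roughly 49 million photos; this toy uses 490.
gallery = rng.normal(size=(490, EMBED_DIM))
gallery /= np.linalg.norm(gallery, axis=1, keepdims=True)

def embed(image_pixels: np.ndarray) -> np.ndarray:
    """Stand-in for a trained face-embedding model (normally a deep CNN)."""
    v = image_pixels.flatten()[:EMBED_DIM].astype(float)
    return v / np.linalg.norm(v)

def search(probe: np.ndarray, top_k: int = 10):
    """Return the top-k gallery indices with cosine-similarity scores."""
    scores = gallery @ probe  # cosine similarity, since all vectors are unit length
    order = np.argsort(scores)[::-1][:top_k]
    return [(int(i), float(scores[i])) for i in order]

probe = embed(rng.normal(size=(32, 32)))  # stand-in for the grainy probe still
for idx, score in search(probe):
    print(f"candidate {idx}: score {score:.3f} (investigative lead only)")
```

Note that the top of this list is simply the gallery's nearest neighbor to the probe: if the true subject is not enrolled, or the probe is poor, the best-scoring candidate is still someone, which is how a grainy still can surface an innocent person's license photo.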

“This document is not a positive identification,” the file says in bold capital letters at the top. “It is an investigative lead only and is not probable cause for arrest.”

This is what technology providers and law enforcement always emphasize when defending facial recognition: It is only supposed to be a clue in the case, not a smoking gun. Before arresting Mr. Williams, investigators might have sought other evidence that he committed the theft, such as eyewitness testimony, location data from his phone or proof that he owned the clothing that the suspect was wearing.

In this case, however, according to the Detroit police report, investigators simply included Mr. Williams’s picture in a “6-pack photo lineup” they created and showed to Ms. Johnston, Shinola’s loss-prevention contractor, and she identified him. (Ms. Johnston declined to comment.)

‘I guess the computer got it wrong’

Mr. Pastorini was taken aback when the process was described to him. “It sounds thin all the way around,” he said.

Mr. Klare, of Rank One, found fault with Ms. Johnston’s role in the process. “I am not sure if this qualifies them as an eyewitness, or gives their experience any more weight than other persons who may have viewed that same video after the fact,” he said. John Wise, a spokesman for NEC, said: “A match using facial recognition alone is not a means for positive identification.”

The Friday that Mr. Williams sat in a Detroit police interrogation room was the day before his 42nd birthday. That morning, his wife emailed his boss to say he would miss work because of a family emergency; it broke his four-year record of perfect attendance.

In Mr. Williams’s recollection, after he held the surveillance video still next to his face, the two detectives leaned back in their chairs and looked at one another. One detective, seeming chagrined, said to his partner: “I guess the computer got it wrong.”

They turned over a third piece of paper, which was another photo of the man from the Shinola store next to Mr. Williams’s driver’s license. Mr. Williams again pointed out that they were not the same person.

Mr. Williams asked if he was free to go. “Unfortunately not,” one detective said.

Mr. Williams was kept in custody until that evening, 30 hours after being arrested, and released on a $1,000 personal bond. He waited outside in the rain for 30 minutes until his wife could pick him up. When he got home at 10 p.m., his five-year-old daughter was still awake. She said she was waiting for him because he had said, while being arrested, that he’d be right back.

She has since taken to playing “cops and robbers” and accuses her father of stealing things, insisting on “locking him up” in the living room.

Getting help

The Williams family contacted defense attorneys, most of whom, they said, assumed Mr. Williams was guilty of the crime and quoted prices of around $7,000 to represent him. Ms. Williams, a real estate marketing director and food blogger, also tweeted at the American Civil Liberties Union of Michigan, which took an immediate interest.

“We’ve been active in trying to sound the alarm bells around facial recognition, both as a threat to privacy when it works and a racist threat to everyone when it doesn’t,” said Phil Mayor, an attorney at the organization. “We know these stories are out there, but they’re hard to hear about because people don’t usually realize they’ve been the victim of a bad facial recognition search.”

Two weeks after his arrest, Mr. Williams took a vacation day to appear in a Wayne County court for an arraignment. When the case was called, the prosecutor moved to dismiss, but “without prejudice,” meaning Mr. Williams could later be charged again.

Maria Miller, a spokeswoman for the prosecutor, said a second witness had been at the store in 2018 when the shoplifting occurred, but had not been asked to look at a photo lineup. If the individual makes an identification in the future, she said, the office will decide whether to issue charges.

A Detroit police spokeswoman, Nicole Kirkwood, said that for now, the department “accepted the prosecutor’s decision to dismiss the case.” She also said that the department updated its facial recognition policy in July 2019 so that it is only used to investigate violent crimes.

The department, she said in another statement, “does not make arrests based solely on facial recognition. The investigator reviewed video, interviewed witnesses, conducted a photo lineup.”

On Wednesday, the A.C.L.U. of Michigan filed a complaint with the city, asking for an absolute dismissal of the case, an apology and the removal of Mr. Williams’s information from Detroit’s criminal databases.

The Detroit Police Department “should stop using facial recognition technology as an investigatory tool,” Mr. Mayor wrote in the complaint, adding, “as the facts of Mr. Williams’s case prove both that the technology is flawed and that DPD investigators are not competent in making use of such technology.”

Mr. Williams’s lawyer, Victoria Burton-Harris, said that her client is “lucky,” despite what he went through.

“He is alive,” Ms. Burton-Harris said. “He is a very large man. My experience has been, as a defense attorney, when officers interact with very large men, very large black men, they immediately act out of fear. They don’t know how to de-escalate a situation.”

‘It was humiliating’

Mr. Williams and his wife have not talked to their neighbors about what happened. They wonder whether they need to put their daughters into therapy. Mr. Williams’s boss advised him not to tell anyone at work.

“My mother doesn’t know about it. It’s not something I’m proud of,” Mr. Williams said. “It’s humiliating.”

He has since figured out what he was doing the evening the shoplifting occurred. He was driving home from work, and had posted a video to his private Instagram because a song he loved came on — 1983’s “We Are One,” by Maze and Frankie Beverly. The lyrics go:

I can’t understand

Why we treat each other in this way

Taking up time

With the silly silly games we play

He had an alibi, had the Detroit police checked for one.

Wrongfully Accused by an Algorithm


Police in Detroit were trying to figure out who stole five watches from a Shinola retail store. Authorities say the thief took off with an estimated $3,800 worth of merchandise.

Investigators pulled a security video that had recorded the incident. Detectives zoomed in on the grainy footage and ran the person who appeared to be the suspect through facial recognition software.

A hit came back: Robert Julian-Borchak Williams, 42, of Farmington Hills, Mich., about 25 miles northwest of Detroit.

In January, police pulled up to Williams' home and arrested him while he stood on his front lawn in front of his wife and two daughters, ages 2 and 5, who cried as they watched their father being placed in the patrol car.

His wife, Melissa Williams, wanted to know where police were taking her husband.

" 'Google it,' " she recalls an officer telling her.

Robert Williams was led to an interrogation room, and police put three photos in front of him: Two photos taken from the surveillance camera in the store and a photo of Williams' state-issued driver's license.

"When I look at the picture of the guy, I just see a big Black guy. I don't see a resemblance. I don't think he looks like me at all," Williams said in an interview with NPR.

"[The detective] flips the third page over and says, 'So I guess the computer got it wrong, too.' And I said, 'Well, that's me,' pointing at a picture of my previous driver's license," Williams said of the interrogation with detectives. " 'But that guy's not me,' " he said, referring to the other photographs.

"I picked it up and held it to my face and told him, 'I hope you don't think all Black people look alike,' " Williams said.

Williams was detained for 30 hours and then released on bail until a court hearing on the case, his lawyers say.

At the hearing, a Wayne County prosecutor announced that the charges against Williams were being dropped due to insufficient evidence.

Civil rights experts say Williams is the first documented example in the U.S. of someone being wrongfully arrested based on a false hit produced by facial recognition technology.

Lawyer: Artificial intelligence 'framed and informed everything'

What makes Williams' case extraordinary is that police admitted that facial recognition technology, conducted by Michigan State Police in a crime lab at the request of the Detroit Police Department, prompted the arrest, according to charging documents reviewed by NPR.

The pursuit of Williams as a possible suspect came despite repeated claims by him and his lawyers that the match generated by artificial intelligence was faulty.

The alleged suspect in the security camera image was wearing a red St. Louis Cardinals hat. Williams, a Detroit native, said he would under no circumstances be wearing that hat.

"They never even asked him any questions before arresting him. They never asked him if he had an alibi. They never asked if he had a red Cardinals hat. They never asked him where he was that day," said lawyer Phil Mayor with the ACLU of Michigan.

On Wednesday, the ACLU of Michigan filed a complaint against the Detroit Police Department asking that police stop using the software in investigations.

In a statement to NPR, the Detroit Police Department said after the Williams case, the department enacted new rules. Now, only still photos, not security footage, can be used for facial recognition. And it is now used only in the case of violent crimes.

"Facial recognition software is an investigative tool that is used to generate leads only. Additional investigative work, corroborating evidence and probable cause are required before an arrest can be made," Detroit Police Department Sgt. Nicole Kirkwood said in a statement.

In Williams' case, police had asked the store security guard, who had not witnessed the robbery, to pick the suspect out of a photo lineup based on the footage, and the security guard selected Williams.

Victoria Burton-Harris, Williams' lawyer, said in an interview that she is skeptical that investigators used the facial recognition software as only one of several possible leads.

"When that technology picked my client's face out, from there, it framed and informed everything that officers did subsequently," Burton-Harris said.

Academic and government studies have demonstrated that facial recognition systems misidentify people of color more often than white people.

One of the leading studies on bias in face recognition was conducted by Joy Buolamwini, an MIT researcher and founder of the Algorithmic Justice League.

"This egregious mismatch shows just one of the dangers of facial recognition technology which has already been shown in study after study to fail people of color, people with dark skin more than white counterparts generally speaking," Buolamwini said.

"The threats to civil liberties posed by mass surveillance are too high a price," she said. "You cannot erase the experience of 30 hours detained, the memories of children seeing their father arrested, or the stigma of being labeled criminal."

Maria Miller, a spokeswoman for the prosecutor's office, said the case was dismissed over insufficient evidence, including that the charges were filed without the support of any live witnesses.

Wayne County Prosecutor Kym Worthy said any case sent to her office that uses facial recognition technology cannot move forward without other supporting evidence.

"This case should not have been issued based on the DPD investigation, and for that we apologize," Worthy said in a statement to NPR. "Thankfully, it was dismissed on our office's own motion. This does not in any way make up for the hours that Mr. Williams spent in jail."

Worthy said Williams is able to have the case expunged from his record.

Williams: "Let's say that this case wasn't retail fraud. What if it's rape or murder?"

According to Georgetown Law's Center on Privacy and Technology, at least a quarter of the nation's law enforcement agencies have access to face recognition tools.

"Most of the time, people who are arrested using face recognition are not told face recognition was used to arrest them," said Jameson Spivack, a researcher at the center.

While Amazon, Microsoft and IBM have announced a halt to sales of face recognition technology to law enforcement, Spivack said that will have little effect, since most major facial recognition software contracts with police are with smaller, more specialized companies, like South Carolina-based DataWorks Plus, which is the company that supplied the Detroit Police Department with its face-scanning software.

The company did not respond to an interview request.

DataWorks Plus has supplied the technology to government agencies in Santa Barbara, Calif., Chicago and Philadelphia.

Facial recognition technology is used by consumers every day to unlock their smartphones or to tag friends on social media. Some airports use the technology to scan passengers before they board flights.

Its deployment by governments, though, has drawn concern from privacy advocates and experts who study the machine learning tool and have highlighted its flaws.

"Some departments of motor vehicles will use facial recognition to detect license fraud, identity theft, but the most common use is law enforcement, whether it's state, local or federal law enforcement," Spivack said.

The government use of facial recognition technology has been banned in half a dozen cities.

In Michigan, Williams said he hopes his case is a wake-up call to lawmakers.

"Let's say that this case wasn't retail fraud. What if it's rape or murder? Would I have gotten out of jail on a personal bond, or would I have ever come home?" Williams said.

Williams and his wife, Melissa, worry about the long-term effects the arrest will have on their two young daughters.

"Seeing their dad get arrested, that was their first interaction with the police. So it's definitely going to shape how they perceive law enforcement," Melissa Williams said.

In his complaint, Williams and his lawyers say if the police department won't ban the technology outright, then at least his photo should be removed from the database, so this doesn't happen again.

"If someone wants to pull my name and look me up," Williams said, "who wants to be seen as a thief?"

'The Computer Got It Wrong': How Facial Recognition Led To False Arrest Of Black Man

On Wednesday morning, the ACLU announced that it was filing a complaint against the Detroit Police Department on behalf of Robert Williams, a Black Michigan resident whom the group said is one of the first people falsely arrested due to facial recognition software.

Williams and lawyers from the ACLU said Detroit police were looking for someone who had broken into a Shinola watch store. They took security camera footage from the store’s owner and put it into the city’s facial recognition software, getting the 42-year-old Williams as a match.

Police arrested him in January in front of his wife, children, and neighbors, Williams said in a first-person recounting of the incident published by The Washington Post. After holding him for 16 hours in a crowded Detroit Detention Center cell, an officer brought him into an interrogation room and showed him the security photos.

According to Williams, he held the photo next to his face to prove it wasn’t him, and one of the officers turned to another and said, “the computer must have gotten it wrong.” His attorney later discovered that the security camera footage was sent to the Michigan State Police, and its facial recognition software pulled up Williams’ driver’s license photo.

The Detroit Police Department did not respond to requests for comment but in a statement to NPR said, “After the Williams case, the department enacted new rules. Now, only still photos, not security footage, can be used for facial recognition and only in the case of violent crimes.”

For years, researchers with the ACLU, MIT and other institutions have consistently proven that facial recognition software is still very inaccurate, particularly when it comes to people with darker skin. Despite the inaccuracy and concerns of many, hundreds of police departments and the FBI use the software on a daily basis to identify people during investigations.

But in a statement, the ACLU said law enforcement never tells people if they have been identified using facial recognition software, and this is the first documented instance where police admitted that their use of the software is what caused the mistake. According to the ACLU, the mistake was only revealed because Williams heard what the officers said during his interrogation and his lawyers were able to push for more information about how he was identified.

“We have long warned that one false match can lead to an interrogation, arrest, and, especially for Black men like Robert, even a deadly police encounter. Given the technology’s flaws, and how widely it is being used by law enforcement today, Robert likely isn’t the first person to be wrongfully arrested because of this technology,” the ACLU said in a statement.

“He’s just the first person we’re learning about.”

Calls for bans increase

Several US cities like San Francisco have outright banned police from using facial recognition software, but Williams, the ACLU, and other experts are now calling for a nationwide ban on use of the software, at least until it can be perfected.

Josh Bohls, CEO and founder of Inkscreen, a content capture company, said facial recognition technology cannot be solely relied on to make arrest determinations and called it “too new and unproven to be determinative of a suspect’s identification.”

He added that police still have not fully fleshed out the legal and privacy implications of using it and said at most, police in Detroit should have simply used it to interview Williams, not arrest him.

According to James McQuiggan, security awareness advocate at KnowBe4, facial recognition uses artificial intelligence (AI) to outline a face as vectors and matrices, but it can be limited by the quality of light on the subject.

“People with darker skin tones and an image with low light quality presents a complication that does not appear adequately addressed by facial recognition software. More concerning is there is no proper auditing system in place for these systems when it comes to false positives or misuse,” McQuiggan said, adding that whatever these systems produce should be used as reference points and not sole reasons to arrest people.

Facial recognition software should only be used with high-definition cameras and depth-matching sensors at close range, according to Chris Clements, vice president of solutions architecture at Cerberus Sentinel.

He said use of the technology was still in its infancy and should be treated with low confidence until it can be corroborated by other evidence. Attempting to match a still frame from a low-quality video camera several feet away is likely to produce very low confidence matches, he added.
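
McQuiggan's and Clements's recommendations, treating these outputs as reference points that must be corroborated and distrusting low-quality probes outright, amount to a simple gating rule. Here is a hedged sketch of that rule; the thresholds and the probe-quality metric are invented for illustration and are not any vendor's or department's actual parameters.

```python
# Illustrative gating rule: treat face-recognition output as a lead, never an ID.
# MIN_PROBE_QUALITY and MIN_MATCH_SCORE are hypothetical values.
from dataclasses import dataclass

MIN_PROBE_QUALITY = 0.6  # assumed floor on image quality (lighting, resolution)
MIN_MATCH_SCORE = 0.85   # assumed similarity below which results are noise

@dataclass
class Candidate:
    person_id: str
    score: float

def triage(probe_quality: float, candidates: list[Candidate]) -> list[Candidate]:
    """Return candidates worth human follow-up, or nothing at all."""
    if probe_quality < MIN_PROBE_QUALITY:
        # Grainy, low-light probes (like the still in the Williams case)
        # are rejected before any result reaches a detective.
        return []
    leads = [c for c in candidates if c.score >= MIN_MATCH_SCORE]
    # Surviving leads still require independent corroboration (alibi checks,
    # witnesses, location data) before any arrest decision.
    return leads

print(triage(0.3, [Candidate("dl-1234", 0.91)]))  # low-quality probe -> []
```

Detroit's post-Williams policy, described elsewhere in these reports (still images only, violent crimes only), is essentially this kind of gate imposed procedurally rather than in software.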

Just three weeks ago, Comparitech.com’s Paul Bischoff released a study of Amazon’s facial recognition software that found it struggled to even identify politician headshots, which are far clearer than any kind of security footage police are using. Even with crystal clear photos, the software incorrectly matched an average of 32 US Congresspersons to mugshots in an arrest database.

Bischoff said instances like this are prime examples of why a moratorium on police use of face recognition software is needed until regulations are put in place to restrict how it can be used.

“The Detroit Police fed grainy video footage to a face recognition tool–tools that can misidentify people even when clear headshots are used. The suspect was Black, which reinforces the fact that face recognition misidentifies people of color at a higher rate than white people, and thus disproportionately impacts people of color,” Bischoff said.

“Worst of all, the mismatch led police to jump to conclusions and make an arrest without proper due diligence. This is just one case that went public, but police use face recognition behind closed doors all the time, and we’ll keep seeing the same mistakes and abuse of face recognition until proper regulation is in place.”

Multiple advocacy groups are now pushing lawmakers to at least put some laws in place to regulate how the software is used. Like the changes the Detroit Police Department described in its statement, civil rights organizations want limits to be set on what crimes facial recognition can be used for and what accuracy thresholds should be in place.

“What happened to Robert Williams and his family should be a wake up call for lawmakers. Facial recognition is doing harm right now. This is only the first case that has come to light. There are almost unquestionably people sitting in jail right now who were put there because they were falsely accused by a racist computer algorithm. Enough is enough. It’s time for Congress to do their job and ban facial recognition surveillance in the United States,” said Evan Greer, deputy director of rights group Fight for the Future.

Even the companies selling facial recognition software are asking for legislation to govern the technology. In light of recent protests around the globe, Amazon, IBM and other major tech companies agreed to at least a one-year moratorium on allowing their facial recognition programs to be used by police. An Amazon statement said it has “advocated that governments should put in place stronger regulations to govern the ethical use of facial recognition technology,” and said the US Congress “appears ready to take on this challenge.”

“We hope this one-year moratorium might give Congress enough time to implement appropriate rules, and we stand ready to help if requested,” the statement said.

But for Williams, the damage has already been done. In his retelling of what happened, he spoke about the horror of his children watching him being arrested and his fear that he was one of the lucky ones.

“I never thought I’d have to explain to my daughters why Daddy got arrested. How does one explain to two little girls that a computer got it wrong, but the police listened to it anyway? Why is law enforcement even allowed to use such technology when it obviously doesn’t work? I get angry when I hear companies, politicians and police talk about how this technology isn’t dangerous or flawed,” Williams wrote.

“I wouldn’t be surprised if others like me became suspects but didn’t know that a flawed technology made them guilty in the eyes of the law. I wouldn’t have known that facial recognition was used to arrest me had it not been for the cops who let it slip while interrogating me. I keep thinking about how lucky I was to have spent only one night in jail—as traumatizing as it was. Many Black people won’t be so lucky. My family and I don’t want to live with that fear. I don’t want anyone to live with that fear.”

Detroit police admit to first facial recognition mistake after false arrest

Detroit police have used highly unreliable facial recognition technology almost exclusively against Black people so far in 2020, according to the Detroit Police Department’s own statistics. The department’s use of the technology gained national attention last week after the American Civil Liberties Union and New York Times brought to light the case of Robert Julian-Borchak Williams, a man who was wrongfully arrested because of the technology.

In a public meeting Monday, Detroit Police Chief James Craig admitted that the technology, developed by a company called DataWorks Plus, almost never brings back a direct match and almost always misidentifies people.

“If we would use the software only [to identify subjects], we would not solve the case 95-97 percent of the time,” Craig said. “That’s if we relied totally on the software, which would be against our current policy … If we were just to use the technology by itself, to identify someone, I would say 96 percent of the time it would misidentify."

Todd Pastorini, a general manager at DataWorks Plus, told Motherboard that it does not keep statistics on the software's accuracy in real-world use, and it does not specifically instruct law enforcement how to use the software.

"There's no statistics for that," Pastorini said. "The matter is the quality of the probes used. I’m very reluctant based on the last New York Times article I was misquoted or slightly misrepresented based on the context that was used. You might know how a shovel works—you stick it in the ground to pick up dirt and you might use it as a weapon. Facial recognition has been weaponized by the media to some degree. I understand the chief’s comment, but unfortunately many people don’t."

Pastorini likened DataWorks Plus' software to automated fingerprint identification systems, where dozens or hundreds of potential matches are returned. It "does not bring back a single candidate," he said. "It's hundreds. They are weighted just like a fingerprint system based on the probe [and what's in the database]."

The result of this, according to Detroit's own police officers, is that they are ultimately making the decision to question and investigate people based on what the software returns and a detective's judgment. This means that people who may have had nothing to do with a crime are ultimately questioned and investigated by police. In Detroit, this means, almost exclusively, Black people.

So far this year (through June 22), the technology had been used 70 times, according to data publicly released by the Detroit Police Department. In 68 of those cases, the photo fed into the software was of a Black person; in the other two, the race was listed as 'U,' which in other police reports stands for unidentified. The Detroit Police Department did not respond to a request to clarify. These photos were largely pulled from social media (31 of 70 cases) or security cameras (18 of 70 cases).

Several cities have banned police from using facial recognition software, which has well-known racial bias issues (and many false-positive issues as well). Detroit, however, had a very public debate in 2019 about the use of facial recognition, and instead decided to regulate its use rather than ban it altogether. Late last year, the city adopted a policy that bans the use of facial recognition to “surveil the public through any camera or video device,” bans its use on livestream and recorded videos, and restricts (but does not ban) its use at protests. According to the policy, the software must be used only “on a still image of an individual,” and can only be used as part of an ongoing criminal investigation. The software checks images against a state database of photos, which includes mugshot images. As part of these regulations, the police department is required to release weekly reports about the use of the technology, which show that it has been almost exclusively used on Black people.

Williams was arrested before the policy went into practice. Craig said during the meeting that the media it ran through DataWorks’ facial recognition system was “a horrible video. It was grainy … it would have never made it under the new policy … if we can’t obtain a good picture, we’re not going to push it through to the detective.”

Craig and his colleague, Captain Aric Tosqui, said that they want to continue using facial recognition because they say it can be a tool to assist investigators even if it doesn’t often lead to arrest. But even when someone isn’t falsely arrested, their misidentification through facial recognition can often lead to an investigator questioning them, which is an inconvenience at best and a potentially deadly situation at worst. According to Tosqui, the technology has been used on a total of 185 cases throughout the years. “The majority of the cases the detective reported back that [the match] was not useful.”

Despite these problems, DataWorks Plus said that it does not guide law enforcement on how to best use the software. "We don't tell our customers how to use the system," Pastorini said. "There’s already law enforcement policies. It is my experience the clearer the image, clearly is going to affect the likelihood of a more solid result."

The Detroit Police Department did not respond to a request for further comment. In recent months, there has been a new movement by city council members to ban the use of the technology.

Detroit Police Chief: Facial Recognition Software Misidentifies 96% of the Time

Racial bias and facial recognition. Black man in New Jersey arrested by police and spends ten days in jail after false face recognition match

Accuracy and racial bias concerns about facial recognition technology continue with the news of a lawsuit filed by a New Jersey man, Nijeer Parks, against local police, the prosecutor and the City of Woodbridge in New Jersey.

According to the New York Times, Nijeer Parks is the third person known to be falsely arrested for a crime he did not commit based on a bad face recognition match. The other two were Robert Williams and Michael Oliver. All three falsely arrested men are reported to be Black.

This particular case began on a Saturday in January 2019, when two police officers showed up at the Hampton Inn in Woodbridge (New Jersey) after receiving a report about a man stealing snacks from the gift shop.

The crime

The alleged shoplifter was a black man, nearly 6 feet tall, wearing a black jacket, who was reportedly visiting a Hertz office in the hotel lobby, trying to get the rental agreement for a gray Dodge Challenger extended.

The officers confronted him, and he apologised, according to the police report. According to the New York Times, the suspect said he would pay for the snacks and gave the officers a Tennessee driver’s license.

When the officers checked the licence, they discovered it was fake. This, coupled with a bag of suspected marijuana found in the man’s jacket, resulted in the officers trying to arrest the suspect. But the suspect ran away and drove off in his rental car, hitting a parked police car in the process, as well as a column in front of the hotel.

One of the police officers had to reportedly jump out of the way of the vehicle to avoid being hit.

The rental car was later found abandoned in a parking lot a mile away.

What happened next is that a detective in the Woodbridge Police Department sent the photo from the fake driver’s license to state agencies that had access to face recognition technology.

The next day, state investigators said they had a facial recognition match: which happened to be Nijeer Parks, who lived in Paterson, N.J., 30 miles away, and worked at a grocery store.

The detective compared Parks’s New Jersey state ID with the fake Tennessee driver’s license and agreed it was the same person. After a Hertz employee confirmed that the license photo was of the shoplifter, the police issued a warrant for Parks’s arrest.

“I don’t think he looks like me,” Parks was quoted as saying. “The only thing we have in common is the beard.”

Parks, it should be noted, has previous criminal convictions for selling drugs.

The arrest

The only problem for the police was that Parks was 30 miles away at the time of the incident. That did not stop him from being arrested by local police, spending ten days in jail and paying around $5,000 to defend himself.

Parks was able to get proof from Western Union that he had been sending money at a pharmacy in Haledon (New Jersey), when the incident happened.

Parks told the judge he was willing to go to trial to defend himself. But a few months later in November 2019, his case was dismissed for lack of evidence.

Parks is now reportedly suing the police, the prosecutor and the City of Woodbridge for false arrest, false imprisonment and violation of his civil rights.

“I was locked up for no reason,” Parks reportedly said. “I’ve seen it happen to other people. I’ve seen it on the news. I just never thought it would happen to me. It was a very scary ordeal.”

The case drew the attention of the American Civil Liberties Union (ACLU).

“Multiple people have now come forward about being wrongfully arrested because of this flawed and privacy-invading surveillance technology,” Nathan Freed Wessler, senior staff attorney for the ACLU’s Speech, Privacy, and Technology Project told Silicon UK.

“There are likely many more wrongful interrogations, arrests, and possibly even convictions because of this technology that we still do not know about,” said Freed Wessler. “Unsurprisingly, all three false arrests that we know about have been of Black men, further demonstrating how this technology disproportionately harms the Black community. Law enforcement use of face recognition technology must be stopped immediately.”

Controversial tech

On this side of the pond, the use of facial recognition, especially by authorities, has also proven to be controversial.

In August the Court of Appeal ruled that the use of automated facial recognition (AFR) by South Wales Police had breached privacy rights, data protection laws and equality legislation.

And in 2019 the Information Commissioner’s Office (ICO) warned that any organisation using facial recognition technology, and which then scans large databases of people to check for a match, is processing personal data, and that “the potential threat to privacy should concern us all.”

Indeed, in 2019 an academic study found that 81 percent of ‘suspects’ flagged by the Met police’s facial recognition technology were innocent, and that the overwhelming majority of people identified were not on police wanted lists.

And in August 2019, the ACLU civil rights campaign group in the United States ran a demonstration to show how inaccurate facial recognition systems can be.

It ran a picture of every California state legislator through a facial-recognition program that matches facial pictures to a database of 25,000 criminal mugshots.

That test saw the facial recognition program falsely flag 26 legislators as criminals.

Despite that, in July 2019 then Home Secretary Sajid Javid gave his backing to police forces using facial recognition systems, despite growing concern about the technology.

Tech boycott

Facial recognition systems have also been previously criticised in the US after research by the Government Accountability Office found that FBI algorithms were inaccurate 14 percent of the time, as well as being more likely to misidentify black people.

And tech firms have begun refusing to supply the technology to police forces.

Microsoft first refused to install facial recognition technology for a US police force a year or so ago, due to concerns about artificial intelligence (AI) bias.

This boycott was subsequently joined by Amazon and IBM, among others.

Microsoft has also deleted a large facial recognition database said to have contained 10 million images used to train facial recognition systems.

San Francisco banned the use of facial recognition technology, meaning that local agencies, such as the police force and city transportation services, would not be able to utilise the technology in any of their systems.

But the police remain in favour of its use.

In February this year, the UK’s most senior police officer, Metropolitan Police Commissioner Cressida Dick, said criticism of the tech was “highly inaccurate or highly ill informed.”

She also said facial recognition was less concerning to many than a knife in the chest.

Facial Recognition Blamed For False Arrest And Jail Time

Teaneck just banned facial recognition technology for police. Here's why

A Veuer/Buzz60 video: a Japanese company says it has developed a facial recognition system that works even when the subject is wearing a mask.

TEANECK — The Township Council has banned the use of facial recognition software by police in a unanimous vote, joining a nascent movement to banish a technology that has been criticized as potentially biased.

Even as private companies continue to create huge databases of images from social media, and facial recognition is employed for surveillance and airport passenger screening, a growing body of evidence shows that the algorithms do a poorer job of identifying women and Black and Asian faces.

In January, New Jersey Attorney General Gurbir Grewal barred police statewide from using a facial recognition app from a company called Clearview AI, which The New York Times reported had amassed a database of billions of photos from sites like Facebook, YouTube and Twitter. The company licenses its product to police departments.

However, Teaneck is the first town in New Jersey to ban the controversial technology outright, said Councilman Keith Kaplan.

"Now is not the time to let this technology in our municipality," he said. "Some of the people being targeted have absolutely no connection to a crime and are incarcerated based on an algorithm. We need to protect the civil rights of our residents."

In 2019, Paterson resident Nijeer Parks was arrested on charges of shoplifting in Woodbridge after police identified him using facial recognition software, but Parks was nowhere near Woodbridge at the time of the incident. He spent 10 days in jail and is now suing the city and its Police Department.

He was one of several Black men who have been wrongfully arrested based on faulty facial recognition software.

“I don’t think he looks like me,” Parks said when comparing the image to himself at the time of his arrest. “The only thing we have in common is the beard.”

We just banned Facial Recognition use by the @TeaneckNJGov Police Department and Township Officials. https://t.co/DK4e8FLBnq pic.twitter.com/ZkOST8MJs4 — Councilman Keith Kaplan (@Cm_KeithKaplan) February 24, 2021

Facial recognition software is only as good as the underlying algorithm. A study by the National Institute of Standards and Technology, a government agency, involving a variety of commercial software found that Black and Asian faces were 10 to 100 times more likely to be misidentified than Caucasian faces.
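
The disparity NIST measured is a ratio of false match rates (FMR) between demographic cohorts, where FMR is the fraction of impostor comparisons wrongly declared a match at a fixed threshold. Below is a toy computation of that ratio; the counts are invented purely to mirror the reported 10-to-100-times range, not NIST's data.

```python
# Toy false-match-rate (FMR) comparison across demographic cohorts.
# Counts are invented to mirror the 10x-100x disparity NIST reported.
trials = {
    # cohort: (impostor comparisons run, false matches observed)
    "cohort_A": (1_000_000, 10),   # the best-served group
    "cohort_B": (1_000_000, 450),  # a poorly served group
}

def false_match_rate(impostor_pairs: int, false_matches: int) -> float:
    """Fraction of impostor comparisons wrongly declared a match."""
    return false_matches / impostor_pairs

fmr = {k: false_match_rate(*v) for k, v in trials.items()}
ratio = fmr["cohort_B"] / fmr["cohort_A"]
print(f"FMR A={fmr['cohort_A']:.1e}, B={fmr['cohort_B']:.1e}, ratio={ratio:.0f}x")
# -> FMR A=1.0e-05, B=4.5e-04, ratio=45x: the same threshold produces many
#    times more false matches for one group than another.
```

At gallery scales of tens of millions of photos, even a tiny per-comparison FMR compounds, so a cohort with a 45-times-higher rate sees its members surface as false candidates far more often.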

In 2019, San Francisco, a hub for the technology revolution, became the first major city to ban facial recognition software.

Other areas, however, aren't convinced that facial recognition technology is a detriment. Late last year, the Boston Globe reported that Massachusetts Gov. Charlie Baker had refused to sign a law banning most government use of facial recognition.

In Teaneck, the ban on the technology is for now largely symbolic. The town is just getting around to equipping all of its police officers with body cameras to increase transparency.

"Right now, the benefits of having body cams outweigh the detriments, but having facial technology, finding people based on faulty algorithms, is a bit of a different story," Kaplan said.

Police did not immediately respond to a request for comment.

Teaneck NJ bans facial recognition usage for police, citing bias

A Michigan man has sued Detroit police after he was wrongfully arrested and falsely identified as a shoplifting suspect by the department’s facial recognition software in one of the first lawsuits of its kind to call into question the controversial technology’s risk of throwing innocent people in jail.

Robert Williams, a 43-year-old father in the Detroit suburb of Farmington Hills, was arrested last year on charges he’d taken watches from a Shinola store after police investigators used a facial recognition search of the store’s surveillance-camera footage that identified him as the thief.

Prosecutors dropped the case less than two weeks later, arguing that officers had relied on insufficient evidence. Police Chief James Craig later apologized for what he called “shoddy” investigative work. Williams, who said he had been driving home from work when the 2018 theft had occurred, was interrogated by detectives and held in custody for 30 hours before his release.

Williams’s case sparked a public outcry about the fast-growing police use of a technology that research has shown often misidentifies people of color. His lawsuit is at least the third in the United States brought by Black men to raise doubts about the software’s accuracy.

The case could heighten the legal challenges for a technology that is largely unregulated in the country, even as it has become a prolific investigative tool used by police forces and federal investigators. While the software has been banned by more than a dozen cities nationwide, it has been cited in a growing number of criminal cases, including in the landmark investigation of rioters at the U.S. Capitol on Jan. 6.

Williams’s attorneys did not make him available for comment Tuesday. But Williams wrote in The Washington Post last year that the episode had left him deeply shaken, in part because his young daughters had watched him get handcuffed in his driveway and put into a police car after returning home from work.

“How does one explain to two little girls that a computer got it wrong, but the police listened to it anyway?” he wrote. “As any other black man would be, I had to consider what could happen if I asked too many questions or displayed my anger openly — even though I knew I had done nothing wrong.”

Sgt. Nicole Kirkwood, a Detroit police spokeswoman, said the department does not comment on pending litigation. But she pointed to comments from Craig last year in which the police chief said the case’s failures “had nothing to do with technology, but certainly had everything to do with poor investigative work.”

“This was clearly sloppy, sloppy investigative work. There’s no other way for me to say it but that way,” Craig told a Detroit Board of Police Commissioners meeting last June that is cited in the lawsuit. “If you just rely solely on facial recognition technology, there’s a high probability that it’s going to misidentify.”

Kirkwood told The Post in a statement last year that the department does not make arrests based solely on a facial recognition search and that, in Williams’s case, investigators had reviewed video, interviewed witnesses, conducted a photo lineup and submitted evidence to prosecutors, who recommended charges against Williams for first-degree retail fraud.

Wayne County prosecutors later said the facial recognition result was not enough evidence to bring charges and that the store security official shown the photo lineup hadn’t been in the store during the crime.

“This case should not have been issued based on the DPD investigation, and for that we apologize,” prosecutor Kym L. Worthy said in a statement. “Thankfully, it was dismissed on our office’s own motion. This does not in any way make up for the hours that Mr. Williams spent in jail.”

Williams’s identification as the thief happened after Detroit detectives sent a blurry, dimly lit image from a surveillance camera to the Michigan State Police, which ran a facial recognition search that pointed to Williams’s old driver’s license photo as a possible match.

But the state police’s “investigative lead report” also said, in all capital letters, that the document was not a positive identification or sufficient probable cause for an arrest. The detective nevertheless submitted the photo to prosecutors as evidence to support an arrest warrant.

The civil suit argues that Williams’s rights were violated under the Fourth Amendment, which bans “unreasonable” police searches, as well as a state civil rights law prohibiting racial discrimination. The lawsuit seeks an unspecified amount for damages as well as policy changes for the Detroit Police Department, which continues to use the software.

Williams is being represented by student attorneys at the University of Michigan Law School’s Civil Rights Litigation Initiative as well as lawyers from the American Civil Liberties Union and the advocacy group’s Michigan affiliate.

One of the student attorneys, Jeremy Shur, said Tuesday that Williams’s daughters, ages 3 and 7, have been “traumatized” by the incident. “When they see police, they wonder if they’re taking Daddy away,” Shur said.

The software’s accuracy depends heavily on image quality: blurry, grainy or dark photos often lead to poor results. But even with clear images, algorithms vary widely in accuracy: several of those tested in a 2019 federal study were up to 100 times more likely to misidentify the face of a Black or Asian person than that of a White person.

Williams’s lawsuit is the second accusing Detroit police of making a false facial recognition match: In September, a 26-year-old man named Michael Oliver sued the department, saying his false arrest on a 2019 larceny charge led him to lose his job and spend three days in jail.

The same detective, Donald Bussa, investigated both Oliver and Williams and is named in both lawsuits. Craig has criticized Bussa’s use of a “blurry” photo and said the department has worked to change the facial recognition policies that led to the arrest.

In a third lawsuit, filed in January, a man named Nijeer Parks sued New Jersey police and prosecutors, saying he was held in jail for 10 days after he was falsely accused of stealing from a hotel gift shop in 2019. All three cases are ongoing.

Defenders of the technology said it should be used solely to generate leads for police, not as the lone piece of evidence, and that officers should not rely too heavily on its results or apply it to every low-level crime. The Detroit department’s policy has since been changed to allow the use of facial recognition software only in cases of violent crime.

But critics argue that officers who put too much trust in the systems’ findings — or who alter the search images in hopes of achieving better results, as researchers have found evidence of in some police departments — could end up placing the burden of proof on innocent people who may not be told what investigative techniques were used as the basis for their arrest.

Both the Detroit and Michigan state police have a contract with a South Carolina-based company, DataWorks Plus, that makes facial recognition software. The company did not immediately respond to requests for comment.

The Detroit department is also among hundreds of police agencies that have used Clearview AI, a facial recognition tool that searches through a large database of photos taken from across the Internet, according to a BuzzFeed News report earlier this month based on data from a confidential source. Neither the Detroit police nor Clearview have confirmed the report, and it does not appear Clearview was used in Williams’s case.

Clare Garvie, a senior associate at Georgetown Law’s Center on Privacy and Technology who has studied facial recognition software and is not involved in the lawsuit, said the lawsuits have helped shed light on investigative and technological breakdowns that would otherwise remain unseen.

But she expressed concern that slow court proceedings and a patchwork approach to regulation could lead to more cases of facial recognition misidentification before the existing damage could be addressed.

Garvie also worried that the cases shifted the costs of police failures to the people who had been falsely identified — and to the general public, who both live in fear of false arrests and end up paying to defend or settle the cases in court.

“There’s the burden of somebody after the fact, who’s already been injured by a misidentification, to inform the public of what happened to them,” she said. Then “the taxpayer bears the burden of the mistake.”

Wrongfully arrested man sues Detroit police over false facial recognition match

ROBERT WILLIAMS WAS doing yard work with his family one afternoon last August when his daughter Julia said they needed a family meeting immediately. Once everyone was inside the house, the 7-year-old girl closed all the blinds and curtains and then told her sister and parents that she'd figured it out: Wooly Willy, a character from her toy, had stolen the watches that got her dad arrested.

“She was like ‘We need to get to the bottom of this,’” her mother Melissa says. More recently, Melissa says, Julia has said she believes people who wear shirts that say “Detroit” represent the people who arrested her father.

Williams was arrested in January 2020 for allegedly stealing five watches from a Shinola store in Detroit, after he was wrongfully identified by facial recognition software. He was among the first people known to be wrongfully accused because of the software, which is an increasingly common tool for police. Michael Oliver and Nijeer Parks were wrongly arrested in 2019 after also being misidentified by facial recognition technology.

All three cases were eventually dropped, but in Parks’ case, that took almost a year, including 10 days in jail. The cases shared some commonalities. Oliver and Parks both had prior criminal records. Oliver and Williams were investigated by the same Detroit detective. All three men are fathers, and all three are Black. “It’s not a coincidence,” Parks says.

Law enforcement in nearly every US state now has access to facial recognition software. The Georgetown Law Center on Privacy and Technology says images of one in two US adults are in facial recognition databases used to identify criminal suspects. Critics say police rely too heavily on the technology, particularly since research has shown it misidentifies women and people of color more often than white men. Yet in most of the US, neither police nor prosecutors are required to tell people accused of crimes if facial recognition has played a role in an investigation.

WIRED spoke with Williams, Oliver, and Parks to better understand how the arrests changed their lives and the lives of people around them. Each says the fallout extended beyond the time they spent in jail to affect relationships with family, friends, coworkers, and neighbors.

The three men knew relatively little about facial recognition before their arrests, but today they want to ban—or at least suspend—its use in criminal investigations.

When Williams was arrested, he told Melissa, Julia, and his 4-year-old daughter Rosie that he’d be right back, but he was held by police for 30 hours. Julia still cries when she sees video of her dad being arrested on their front lawn. Her parents wonder how much the experience affected her.

In testimony before Congress last summer, Williams said he and his wife considered getting Julia a therapist, and that thought bubbled up again late last year amid worry that the arrest was still top of mind for her. Her parents said Julia still seems worried that police will take her father away again.

“I don’t think she knows that nothing else is going to happen to me about this case,” Williams says. “I can see how that stuck with her, but I don’t know how to look at it from a 7-year-old’s perspective.”

Accused of Stealing from a Hotel Gift Shop

Parks, who is now 34, was accused of shoplifting snacks and candy from a Hampton Inn gift shop in Woodbridge, New Jersey in January 2019. According to police reports, the shoplifter left a fake Tennessee driver’s license at the scene and nearly hit a police officer with a car while evading arrest. The photo from the fake ID was sent to a real-time crime center, which used a facial recognition system to identify Parks as a “high-profile” match. Days later, after police went to his grandmother’s home looking for him, Parks walked into a police station and attempted to clear his name. He was arrested instead.

Three days after his arrest, Parks appeared in court for the first time. When he wasn’t released, he began to wonder if he would spend years behind bars. He didn’t know how much time he faced, but Parks figured the charges of assault, theft, and eluding arrest could mean a long sentence; according to the complaint, the maximum sentence could have been 25 years.

As Parks considered his options, his previous conviction for drug-related charges weighed heavily. “That’s when it started hitting me, like a plea deal might not be bad even if I didn’t do it,” he says, “because with a trial there’s more [time], and me being a convicted felon, my time is doubled.”

Defense attorneys and legal experts say some people wrongly accused by facial recognition agree to plea deals, suggesting wrongful arrests are more common than generally realized.

Parks spent 10 days in the Middlesex County Corrections Center after his arrest. About six months later, he got a new phone, and while going through old photos, he found a screenshot of a receipt for a Western Union money transfer to his fiancée at roughly the same time as the hotel shoplifting. The Western Union office was in Paterson, New Jersey, 30 miles from the hotel, proving he couldn’t have committed the crime.

Still, it took several more months before the charges were dropped. Parks calls himself lucky. “I could be talking to you from prison right now trying to explain my innocence,” he says. “I just don’t want that to happen to anybody else.”

In March 2021, Parks filed suit in federal court in New Jersey against the director of the Woodbridge Police Department, other local officials, and Idemia, maker of the facial recognition system that identified him, alleging false arrest, false imprisonment, violation of his rights against improper search and seizure, and cruel and unusual punishment. The lawsuit alleges that police didn’t use traditional investigative techniques, such as subjecting a photo of Parks to an in-person or photo lineup for witnesses. The lawsuit also alleges that police failed to obtain DNA or fingerprint evidence left at the scene by the suspect that could have eliminated Parks as a suspect. The lawsuit seeks lost wages and emotional damages. A trial date has not been set.

After his arrest, Parks didn’t tell many people, in part because of his prior record. “When you’re trying to do the right thing and change and do things differently and then something like that happens, people look at you like ‘Did you really do it?’” he says.

A small group of close family and friends knew about his ordeal, and for Parks, the false accusation divided those closest to him into two groups: people who stood by him after the arrest, and family and friends who didn’t want to be around him. In part because of the arrest, Parks says he’s no longer with his fiancée.

“Some people came back and apologized and said ‘It looked like you and so you know I just took it for what it was,’” he says. “Sometimes things happen.”

Parks says he didn’t discuss the arrest with his 10-year-old son while he fought the case, but they discussed it after watching a 60 Minutes segment that aired in May 2021 on facial recognition use in criminal investigations. His son questioned why his father was arrested, and Parks said they discussed how Black men have to act differently around police, a rite of passage for Black families sometimes referred to as The Talk. It was the first time they had that type of conversation.

“I said as Black men there’s certain things we can’t do in police presence,” Parks said.

In response to the lawsuit, the mayor of Woodbridge, the director of the Woodbridge Police Department, and officers involved in the case denied the allegations. A lawyer representing the Middlesex County Corrections Center also denied allegations that Parks was subjected to excessive force.

Idemia, the maker of the facial recognition system, did not respond to a request for comment on the lawsuit’s allegations of malice or shocking disregard that would merit awarding punitive damages to Parks.

The Broken Smartphone

Oliver, 28, says his biggest fear after his arrest was going to trial and losing. He was arrested in Ferndale, Michigan, during a traffic stop in July 2019, two months after Detroit police issued a warrant for his arrest for allegedly grabbing a smartphone from a teacher recording a fight outside a school and throwing it on the ground. Oliver was at work when the crime occurred. As a result of the arrest, Oliver says, he lost his job painting car parts and it took about a year for his life to return to normal.

“I’ve got a son, I’ve got my family, I’ve got my own little house, paying all my bills, so once I got arrested and I lost my job, it was like everything fell, like everything went down the drain,” Oliver says.

Oliver was identified by facial recognition software based on a screenshot from the teacher’s video that was shared with police. The teacher initially identified a former student as the suspect but later picked Oliver from a photo lineup. Oliver’s public defender, Patrick Nyenhuis, told Detroit’s WXYZ TV that he realized as soon as he met Oliver at a pretrial hearing that Oliver did not resemble the man in the video: Oliver has several tattoos, while the person in the video has no visible tattoos. Wayne County prosecutors ultimately agreed and dropped the charges.

Nyenhuis said the detective investigating the case appeared to take shortcuts, including failing to question Oliver or review a video of the incident before his arrest.

In October 2020, Oliver sued the city of Detroit and Detective Donald Bussa in federal court in Michigan seeking damages for emotional distress and economic loss. The suit alleges Bussa did not accurately represent facts in the warrant, including the teacher’s initial identification of a former student, and that the detective didn’t contact multiple witnesses or the school where the fight took place.

In the suit, Oliver asks for an order barring Detroit police from using facial recognition technology until disparities are resolved in how the technology performs on people of different races, ethnicities, and skin tones. If a facial recognition program is used in an investigation, the lawsuit asks that investigating officers be required to inform judges reviewing arrest warrants that the quality of an image can impact the accuracy of its results.

In court filings, Patrick Cunningham, a lawyer representing Bussa and the city of Detroit, denied all allegations in Oliver’s lawsuit, including that investigators relied on facial recognition and that Oliver was falsely arrested.

Bussa, the detective in the Oliver case, also investigated the crime that led to Williams’ arrest. Lawyers representing Oliver and Williams say what happened to their clients reflects both an overreliance on facial recognition and poor investigative work.

Court documents state that discovery in the case will continue until June. Oliver’s attorney, David Robinson, wants police to reveal how many images the facial recognition program returned besides Oliver’s. He’s also seeking records about the technology’s accuracy in identifying people of color in a city where the majority of people are Black or brown.

Stolen Watches and a Videotape

Williams, 43, was accused of stealing $3,800 in watches from a Shinola store. He spent 30 hours in the Detroit Detention Center in a cell with a dozen other people. A live Instagram video of him singing slow jams, recorded 50 miles away around the time of the theft, proved he didn’t commit the crime, and charges against him were dropped two weeks after his arrest.

But that wasn’t the end of the story for the Williams family. At home, the arrest inspired months of games of cops and robbers, with Robert cast as the robber, and it strained his relationships with neighbors. Williams had been arrested in the afternoon on his front lawn in full view of his neighbors. Some didn’t realize what had occurred until stories about his arrest began to emerge months later; conversations with others came more than a year later, after a 60 Minutes interview.

Williams has had multiple strokes since his arrest and wonders whether they are related to stress associated with the case or the deaths of two siblings since then. Still, he thinks of himself as lucky when he considers what might have happened if police had charged him with a more serious crime than theft and he had felt pressured to take a plea deal.

“Nijeer Parks said he was ready to take a plea, and I’m thinking what would I have done if I was in a pinch and it would’ve been just cheaper for me to take a probation charge than to try and fight a case that I know I wasn't guilty of,” he says.

In April, Williams filed a suit in federal court in Michigan against former police chief James Craig, the city of Detroit, and Bussa. The suit claims Bussa didn’t investigate alibis for Oliver or Williams and relied entirely on facial recognition software.

In court filings, Williams alleges that Bussa was told by a Shinola representative that the company doesn’t like its employees to appear in court and that a store manager refused to participate. So Bussa showed six photos to a security guard who wasn’t at the store on the day of the theft, the filings say, and the guard identified Williams. It’s not known whether the other photos were the product of facial recognition technology.

“The technology got relied on so heavily that they didn't even do any investigative work to find the person,” Williams says. “Nobody ever asked me from any police department ‘Where were you on the day of the crime?’”

Shannon Washburn, CEO of Shinola, says the company is “dismayed” at Williams’ treatment, adding “none of our employees chose to participate in the case.”

Craig later told the Detroit Board of Police Commissioners that police became aware of the mistake four days after Williams’ arrest, when Bussa reviewed security camera footage and realized that the watch thief was a different person. Craig said Bussa then notified prosecutors.

Detroit police did not respond to requests for comment. In a court filing, a lawyer representing Craig, Bussa, and the city of Detroit denied all the allegations in Williams’ suit.

In July 2019, Craig told the police commissioners that police would never use facial recognition identification as the sole reason for an arrest. Less than a week later, Oliver was arrested. In September, the commissioners adopted a policy instructing officers to use the technology only for investigations involving home invasions or violent crimes, like homicides. The board also made violations of the policy a fireable offense. Williams was arrested four months later for shoplifting.

The policy includes requirements for random audits and annual reports reviewing use of the technology. A coalition of civil rights and community groups, including the Arab-American Civil Rights League and the ACLU of Michigan, opposed the new policy out of concern that the technology would disproportionately impact immigrants and communities of color.

Craig, who is now a leading Republican candidate for Michigan governor, has acknowledged that the system identifies the wrong person more than 90 percent of the time. After the Williams arrest came to light, Craig apologized and told the board of police commissioners that the false arrest was the result of “sloppy, sloppy investigative work” and poor management by a supervisor, not flaws in facial recognition or policy. He also said the facial recognition searches that led to accusations against Oliver and Williams took place before police adopted the policy governing use of the technology.

After news of Williams’ arrest became public, Wayne County prosecutor Kym Worthy apologized in a statement and said her office declined to adopt Craig’s proposed facial recognition policy, citing studies that the technology can be especially inaccurate for people of color. As a result of what happened to Oliver and Williams, Worthy says both she and a prosecuting attorney must now approve any charges based on facial recognition before they are filed.

Williams says he probably would have supported using facial recognition in criminal investigations before his arrest, but now he supports a moratorium. Tests by the National Institute of Standards and Technology and researchers like Joy Buolamwini show that facial recognition software from many vendors has a history of misidentifying young girls with dark skin, like his daughters. Studies show the systems are also less accurate at recognizing other groups, including people of Asian descent and those who do not conform to gender norms.

Williams has asked lawmakers in Detroit and Washington, DC, to ban or delay use of the technology. Melissa Williams says she and her husband regularly attend meetings and talk to journalists. Though Williams testified before a congressional committee, he was denied opportunities to speak at length at meetings of the Detroit City Council and Detroit Board of Police Commissioners.

“It’s definitely changed our life,” Melissa Williams says. “It's something that’s still our everyday now, and it's just crazy how it came out of nowhere and it's a part of us now.”

Williams’ lawyer, ACLU senior staff attorney Phil Mayor, says Williams waited nearly a year to file a lawsuit because his primary goal was to get the city to stop using the technology.

However, in September 2020 the Detroit City Council renewed a two-year facial recognition contract with DataWorks Plus, a South Carolina-based company that also provides facial recognition services to the New York Police Department, which has likewise faced multiple accusations that its use of facial recognition led to false arrests.

Other efforts to change police practices in Detroit have hit obstacles. In August, voters rejected a ballot measure that would have revised the city’s charter to, among other things, increase public oversight of surveillance technology contracts.

How Wrongful Arrests Based on AI Derailed 3 Men's Lives