Incident 248: Automated License Plate Camera Notified Police about a Previously Stolen Rental Car that was Returned, Causing an Innocent Person to be Detained at Gunpoint in California

Description: In Oakland, a previously stolen rental car that had been returned but allegedly was never updated in the police database was pinged by an automated license plate reader (ALPR) camera, leading police to wrongfully detain an innocent person, reportedly using excessive force and engaging in improper conduct.
Alleged: Vigilant Solutions developed an AI system deployed by Contra Costa County Sheriff, which harmed Brian Hofer.

Suggested citation format

Lam, Khoa. (2018-11-23) Incident Number 248. In Lam, K. (ed.) Artificial Intelligence Incident Database. Responsible AI Collaborative.

Incident Stats

Incident ID: 248
Incident Date: 2018-11-23
Editor: Khoa Lam



Incident Reports

SAN PABLO, Calif. (KTVU) - Brian Hofer has worked for the last half decade on defending citizens’ rights to privacy and creating tighter oversight surrounding the use of mass surveillance techniques and technology.

As chair of Oakland’s Privacy Advisory Commission, Hofer, 41, has railed against what he describes as the seemingly arbitrary use of Automated License Plate Readers -- cameras that ping police and private agencies by matching plate numbers with "vehicles of interest."

So the irony is not lost on him when he said he and his brother, a 23-year-old political science student at UC Berkeley, were detained, sometimes at gunpoint, on Nov. 25 when a license plate reader near the San Pablo Lytton Casino off Interstate Highway 80 alerted police that they were riding in a stolen car.

“They picked the wrong guy,” Hofer said. “I’m seeing guns pointed at us. It was like a movie. It was executioner style. I was just totally confused about what was going on. They were obviously getting an alert that we were bad guys.”

It turns out, though, that while the rental car he was driving had indeed been stolen from San Jose in October, either the police or the rental car agency hadn't updated the proper authorities that the white Getaround Kia had been recovered and should therefore be removed from the “hot list” database. Eventually, the three Contra Costa County sheriff’s deputies straightened the situation out, allowing Hofer and his brother to go home. No one was taken into custody.

But in December, Hofer filed a federal suit against the deputies, alleging his civil rights had been violated, a warrantless search of his car had been conducted and excessive force had been used. “They didn’t check my ID,” he said. “They didn’t say, ‘Hey, what are you guys doing, or what’s your name?’”

It’s not clear exactly how the error occurred, or why the license plate reader cameras, operated by Vigilant Solutions in Livermore, tagged Hofer’s rental car as “hot” as he returned from a family Thanksgiving trip.

But Hofer points out that even though police tout the cameras as essential crime-fighting tools that can capture criminals quickly, the technology is not worth the misreads and mistakes, which have led to wrongful detentions, invasions of privacy and potentially costly lawsuits.

The exact number of errors like the one that happened in Hofer’s case isn’t known. That’s because a 2 Investigates public records request to police agencies in several cities, including San Francisco and Fremont, revealed that these departments do not conduct these types of audits.

In addition, 2 Investigates has learned that while the San Francisco-based Northern California Regional Intelligence Center does random samplings of the 28 agencies in its jurisdiction to ensure that police officers are not abusing the system, the agency has never conducted a formal audit that the public can see.

The Contra Costa County Sheriff declined an on-camera interview regarding what happened to Hofer, but did issue this statement:

“The Deputy Sheriffs involved in this case followed procedure and acted appropriately. The vehicle was reported stolen. As the car was occupied, a high risk enforcement stop was conducted. Once it was confirmed the driver was not a suspect, he was removed from the patrol car and the handcuffs removed. He was released at the scene.” 

Stolen cars are considered felony stops, and it is standard procedure for police to pull out their weapons in these types of situations, law enforcement officers told 2 Investigates. 

In an email, sheriff’s spokesman Jimmy Lee added that his agency does not equip its deputies with body cameras or dash cams, so the encounter was not captured on video.

Hofer took some photos after the fact, showing his brother’s ripped jeans and that his finger had been slightly injured.

And even though he and his brother were let go, Hofer described a 40-minute ordeal where they thought they might not make it out alive. He noted that others in his situation might not have been so lucky.

“I definitely have the privilege of skin color,” he said. “I don’t have a criminal background. I’ve never had a problem with local police.” And still, Hofer said, the fact remains that “somebody could pull a gun on you because of an alert that a computer system gave them. They’re just pulling guns and going cowboy on us. It’s a pretty terrifying position to be in.”

Police agencies should remember that license plate reader technology, like any technology, must always be followed up with good old-fashioned detective work, said Mike Sena, executive director of NCRIC, a central repository that collects and shares data among 28 police agencies.

He said that even though the accuracy rate of the license plate readers is about 90 percent, “law enforcement should not take action just because they receive an alert.” In 10 percent of the cases, he pointed out, the cameras have led to mistakes.

Sena added: “That alert is just the pointer to say, ‘look at the license plate in a little more detail.’ Call it into a dispatcher and make sure it is actually wanted or connected with a subject of an investigation.”

Sena said he has “absolutely” seen cases where police have seen the alert, didn’t confirm the plate and took action “and it was the wrong car.”

Still, Sena stressed that in his opinion, the cameras are extremely valuable.

“The great benefits outweigh the risks right now,” Sena said. “The way we deploy the system really allows them to reduce crime in their community. It allows them to stop the person who has stolen the car. It allows them to catch the rapist, the robber, the murderer much more easily than they had been able to before.”

For example, this fall, Fremont police touted the license plate readers at a community meeting, citing 100 successes in 2017, including the arrest of an arsonist at a Costco and of a homicide suspect at a hotel. The police department did a six-month sampling ahead of the meeting, but acknowledged it does not regularly audit the usefulness of the technology.

“Without the use of the cameras, the suspects would have never been identified and arrested so quickly, and the evidence may have been lost forever,” Fremont Det. Jason Valdes said at the meeting.

Sena added that in some cases, the technology has actually exonerated people, or given potential suspects alibis.

But there is no way for the public to know just how effective the license plate reader technology is in capturing criminals.

The Piedmont Police Department is the only agency within NCRIC to report its “efficacy metrics” to the public. For example, in 2017, Piedmont said that of 7,500 “hits,” 39 cars were recovered and 28 arrests were made. The small city provided the second-largest number of license plate reader scans to NCRIC -- 21.3 million from December 2016 to October 2017, according to Oakland Privacy, a nonprofit.

But no other police department has compiled data to see how efficacious the technology is, Hofer said.

And according to anecdotal stories reported by 2 Investigates, Hofer isn’t the only one to get detained by mistake because his car matched a description on the police hot list.

On Jan. 10, a thief swapped plates on an innocent driver’s car, causing Piedmont police to detain that man at gunpoint. The man was eventually let go, and police issued a warning urging the public to check their license plates for that type of tampering.

In 2009, Denise Green, a San Francisco Muni driver, was detained at gunpoint when a license plate reader mistakenly identified her car as stolen. She filed suit against police and settled the case for nearly $500,000.

It’s not as though Hofer is completely against the technology.

“If your kid gets kidnapped, or car gets stolen, the cameras can work,” he said. “But there is no reason to hold onto this for more than a week. And there’s no reason to hold onto data for people not suspected of any crime.”

For a while, ICE was theoretically able to track the patterns of undocumented immigrants driving around California in cities that use the license plate reader cameras, raising alarms for groups like the American Civil Liberties Union and the Electronic Frontier Foundation. But as of January 2018, that relationship formally ended when California became a sanctuary state.

“It’s listed on all our sites,” Sena said. “This shall not be used for immigration purposes. It’s got to be used for some crime or criminal activity and we enumerate the types of crimes it has to involve for a person to get access to it.”

The cost of these programs varies by department. The Alameda City Council decided to put on hold spending more than $500,000 on the license plate readers, while the San Pablo Police Department spent $1.3 million, as two examples. Sena said on average, it costs $20,000 per patrol car for a “simple setup” of the readers.

Hofer sees it as his civic duty to question authority and to protect the U.S. Constitution. His biggest passion is defending the Fourth Amendment’s prohibition against unreasonable search and seizure. He has drafted legislation on this topic across the country and has been invited to speak and consult alongside the ACLU.

And his personal experience has only made him more convinced that his battle against this type of technology, without building in more transparency and restrictions, is more important than ever.

“This is happening more frequently than it should be,” he said. “They’re not ensuring the accuracy of their data and people’s lives are literally at risk.”

Privacy advocate sues CoCo sheriff's deputies after license plate readers flag his car as stolen

Brian Hofer and his brother were on their way home from a Thanksgiving visit, headed toward Oakland, Calif., on Interstate 80 when he saw the flashing lights. Police officers directed him off the highway and into a shopping center. That’s around the time the guns came out.

According to Mr. Hofer, he was escorted out of his car and cuffed. From the back of a squad car, he recalls watching officers, guns drawn, push his handcuffed brother to his knees. Hofer says the officers pointed a gun at the back of his brother’s head. “I was terrified,” Mr. Hofer told me. “I’m sitting ice-cold and saying nothing because I do not want any itchy trigger fingers.”

After a few minutes, officers told Mr. Hofer the car he’d been driving — a rental using the app Getaround — had been reported stolen earlier that year. From the back of the squad car, Hofer attempted to explain the situation, allowing police to use the Getaround app to find his paperwork and contact the company. After roughly 40 minutes, police verified Hofer’s identity and he and his brother were released.

So what happened? According to police, Mr. Hofer’s car was flagged by a fixed camera near the tiny city of Hercules, Calif. The camera, operated by a company called Vigilant Solutions, scanned the license plate of Mr. Hofer’s car and matched the number to a “hot list” registry of stolen vehicles. Within minutes Vigilant’s cameras pinged law enforcement — quickly enough that officers were able to pull Mr. Hofer over just miles up the road.

Mr. Hofer’s harrowing journey highlights the pitfalls of automated policing, where one piece of bad information can lead to a guns-drawn confrontation. In one respect, the system worked as intended: Mr. Hofer’s car had indeed previously been stolen. But because the “hot list” database of stolen vehicles hadn’t been properly updated to show the car was no longer stolen, the license-plate scan triggered law enforcement.

Detainments like Mr. Hofer’s are a growing reality for millions of Americans, whose movements are being constantly tracked by an array of surveillance cameras, some of which actively contact law enforcement. In California, the cameras have resulted in a number of notable traffic stops of criminals, in some cases leading police to murder and arson suspects. But according to an estimate from the Northern California Regional Intelligence Center, the machines have a troubling 10 percent error rate.

Mr. Hofer happens to be the chairman of Oakland’s Privacy Advisory Commission. In addition to filing a federal lawsuit against the Contra Costa County Sheriff’s Department for the detainment, he’s speaking out against the surveillance technology. “The error rate of this technology is incredibly alarming,” he told me. “If one in 10 innocent people end up stopped with a gun pulled on them, that is a lot of potential for abuse.”

Despite his role as a privacy advocate, Mr. Hofer isn’t afraid of technological innovation. His main concern is preserving the right to be anonymous in public. “If we allow law enforcement to rewind life and search through our every interaction, our relationship to public life is forever altered,” he said. “And I simply don’t understand the idea that if we use enough technology, we can achieve a zero percent crime rate. I reject that because that’s going to lead to extreme overpolicing.”

Mr. Hofer hopes his lawsuit and continuing work will help slow the use of surveillance technology by showing how digital automated systems can have outsize impact in the physical realm.

“They built a system to mitigate harm, and yet I ended up with guns pulled on me due to faulty data,” he said. “And it’s more proof that we’ve built this invisible layer behind the scenes that leads to real-world consequences.”

When License-Plate Surveillance Goes Horribly Wrong