AI Incident Database

Report 1789

Related Incidents

Incident 248 · 2 Reports
Reported Automated License Plate Reader Alert on Previously Stolen Rental Car Leads to Alleged Wrongful Detainment in California

When License-Plate Surveillance Goes Horribly Wrong
nytimes.com · 2019

Brian Hofer and his brother were on their way home from a Thanksgiving visit, headed toward Oakland, Calif., on Interstate 80 when he saw the flashing lights. Police officers directed him off the highway and into a shopping center. That’s around the time the guns came out.

According to Mr. Hofer, he was escorted out of his car and cuffed. From the back of a squad car, he recalls watching officers, guns drawn, push his handcuffed brother to his knees. Mr. Hofer says the officers pointed a gun at the back of his brother’s head. “I was terrified,” Mr. Hofer told me. “I’m sitting ice-cold and saying nothing because I do not want any itchy trigger fingers.”

After a few minutes, officers told Mr. Hofer the car he’d been driving — a rental booked through the app Getaround — had been reported stolen earlier that year. From the back of the squad car, Mr. Hofer attempted to explain the situation, allowing police to use the Getaround app to find his paperwork and contact the company. After roughly 40 minutes, police verified Mr. Hofer’s identity and he and his brother were released.

So what happened? According to police, Mr. Hofer’s car was flagged by a stationary camera near the tiny city of Hercules, Calif. The camera, operated by a company called Vigilant Solutions, scanned the license plate of Mr. Hofer’s car and matched the number against a “hot list” registry of stolen vehicles. Within minutes Vigilant’s cameras pinged law enforcement — quickly enough that they were able to pull Mr. Hofer over just miles up the road.

Mr. Hofer’s harrowing journey highlights the pitfalls of automated policing, where one piece of bad information can lead to a guns-drawn confrontation. In one respect, the system worked as intended: Mr. Hofer’s car had indeed previously been stolen. But because the “hot list” database of stolen vehicles hadn’t been properly updated to show the car was no longer stolen, the license-plate scan triggered law enforcement.
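The failure mode described above can be sketched in a few lines of Python. This is a hypothetical illustration, not Vigilant Solutions' actual system: an alert fires whenever a scanned plate appears in the hot list, so if a recovered vehicle is never removed from that list, every subsequent scan produces a false alarm.

```python
# Hypothetical sketch of hot-list matching in an ALPR pipeline.
# All names here are invented for illustration.

hot_list = {"7ABC123"}  # plates currently reported stolen


def on_plate_scan(plate: str) -> bool:
    """Return True if a scan of this plate should alert law enforcement.

    The alert depends only on list membership, not on whether the
    underlying report is still accurate.
    """
    return plate in hot_list


def mark_recovered(plate: str) -> None:
    """The step that was missing in Mr. Hofer's case: when a stolen
    vehicle is recovered, its plate must be removed from the hot list."""
    hot_list.discard(plate)
```

If `mark_recovered` is never called after a vehicle is returned, `on_plate_scan` keeps reporting the plate as stolen indefinitely — stale data in an otherwise correctly functioning matcher.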

Detainments like Mr. Hofer’s are a growing reality for millions of Americans, whose movements are being constantly tracked by an array of surveillance cameras, some of which actively contact law enforcement. In California, the cameras have resulted in a number of notable traffic stops of criminals, in some cases leading police to murder and arson suspects. But according to an estimate from the Northern California Regional Intelligence Center, the machines have a troubling 10 percent error rate.

Mr. Hofer happens to be the chairman of Oakland’s Privacy Advisory Commission. In addition to filing a federal lawsuit against the Contra Costa County Sheriff’s Department for the detainment, he’s speaking out against the surveillance technology. “The error rate of this technology is incredibly alarming,” he told me. “If one in 10 innocent people end up stopped with a gun pulled on them, that is a lot of potential for abuse.”

Despite his role as a privacy advocate, Mr. Hofer isn’t afraid of technological innovation. His main concern is preserving the right to be anonymous in public. “If we allow law enforcement to rewind life and search through our every interaction, our relationship to public life is forever altered,” he said. “And I simply don’t understand the idea that if we use enough technology, we can achieve a zero percent crime rate. I reject that because that’s going to lead to extreme overpolicing.”

Mr. Hofer hopes his lawsuit and continuing work will help slow the use of surveillance technology by showing how digital automated systems can have outsize impact in the physical realm.

“They built a system to mitigate harm, and yet I ended up with guns pulled on me due to faulty data,” he said. “And it’s more proof that we’ve built this invisible layer behind the scenes that leads to real-world consequences.”

Read the Source


2024 - AI Incident Database
