AI Incident Database

Report 2362

Related Incidents

Incident 430 · 21 Reports
Lawyers Denied Entry to Performance Venue by Facial Recognition

Madison Square Garden using facial recognition technology to remove lawyers from shows
nme.com · 2022

The Madison Square Garden group of venues has been using facial recognition technology to remove lawyers who are in the process of suing them from events.

As reported in Rolling Stone, the group has been using the technology to identify lawyers present at any of its New York venues, removing them if they work for a firm that is in the process of suing MSG.

Confirming the tactic, a representative for Madison Square Garden Entertainment said: “MSG instituted a straightforward policy that precludes attorneys from firms pursuing active litigation against the Company from attending events at our venues until that litigation has been resolved. While we understand this policy is disappointing to some, we cannot ignore the fact that litigation creates an inherently adversarial environment.”

The report details that the lawyers removed from MSG shows include Grant & Eisenhofer employee Barbara Hart, who was asked to leave a Brandi Carlile gig she was attending with her husband, and Davis, Saperstein and Solomon employee Kelly Conlon, who was removed from a show at affiliated venue Radio City Music Hall.

Both Hart’s and Conlon’s firms are pursuing legal cases against MSG, but neither lawyer is directly involved in the litigation herself.

Evan Greer of organisation Fight For The Future opposes the tactic, telling Rolling Stone: “This is the perfect example to show that these tools can be used in ways that are really alarming.

“In some ways, this is kind of an innocuous case — it’s not like [Conlon] was arrested. But the reality is that this was a corporation with what amounts to a petty grievance, using a deeply invasive surveillance apparatus in a way that left a mom sitting outside while her kid went into a concert.”

Last year, Kathleen Hanna and Tom Morello were among a number of musicians protesting new Amazon palm scanners at music venues.

The technology, recently implemented “as a form of ‘convenient’ ticketless entry” at a number of US venues including the famous Red Rocks Amphitheater in Colorado, is “an example of biometric data collection, and could turn Red Rocks into the site of ICE raids, police violence, and false arrests,” the letter states.

“Palm scans and other forms of biometric data collection, like facial recognition, are tools of state violence,” Siena Mann of the Colorado Immigrant Rights Coalition added in a press release. “Once the databases are created, police and DHS will find ways to access them.”
