Description: Two plaintiffs alleged that DHS personnel used purportedly AI-enabled surveillance to identify people recording immigration enforcement and threatened to add them to a "domestic terrorist" database or watch list. The complaint reportedly includes video in which an agent appears to reference a "database," and another alleged warning that agents would "come to your house later tonight." DHS has reportedly denied that such a database exists while acknowledging threat monitoring.
Editor Notes: A copy of the lawsuit associated with the reporting in this incident is accessible at: https://protectdemocracy.org/wp-content/uploads/2026/02/Hilton-v-Noem-ECF-1-Complaint-.pdf.
Entities
Alleged: Unknown surveillance technology developers developed an AI system deployed by the United States Department of Homeland Security, which harmed Elinor Hilton, Colleen Fagan, the general public of the United States, legal observers, and democracy.
Alleged implicated AI systems: Unknown surveillance technology, Facial recognition technology, and Automated license plate reader (ALPR)
Incident Stats
Incident ID
1390
Report Count
1
Incident Date
2026-01-21
Editors
Daniel Atherton
Incident Reports
Reports Timeline
A new lawsuit alleges that the Department of Homeland Security (DHS) is using artificial intelligence to identify bystanders who are recording federal immigration enforcement operations and then adding those people to a secret database.
Two…
Variants
A "variant" is an AI incident similar to a known case: it shares the same causes, harms, and AI system. Instead of listing it separately, we group it under the first reported incident. Unlike other incidents, variants do not need to have been reported outside the AIID. Learn more from the research paper.
Similar Incidents