Incident 628: Fake Biden Voice in Robocall Misleads New Hampshire Democratic Voters
Responded
Description: A robocall imitating President Joe Biden's voice urged New Hampshire Democrats not to vote in the primary, falsely claiming that their votes would matter more in the November general election. The incident undermines the democratic process.
Entities
Alleged: unknown developed and deployed an AI system, which harmed President Joe Biden, New Hampshire voters, Kathy Sullivan, and Democracy.
Incident Stats
Incident ID
628
Report Count
2
Incident Date
2024-01-22
Editors
Daniel Atherton
Incident Reports
Reports Timeline
theguardian.com · 2024
A prominent New Hampshire Democrat wants the makers of a robocall mimicking the voice of Joe Biden and encouraging Democrats not to vote in the primary on Tuesday "prosecuted to the fullest extent" for attempting "an attack on democracy" it…
reuters.com · 2024
Reuters post-incident response
Jan 23 (Reuters) - A fake robocall urging Democrats not to vote in New Hampshire's presidential primary on Tuesday is unlikely to affect the results, the secretary of state said as voters cast ballots in the northern New England state.
"I d…
Variants
A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than indexing variants as entirely separate incidents, we list them under the first similar incident submitted to the database. Unlike other submission types, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.