Incident 628: Fake Biden Voice in Robocall Misleads New Hampshire Democratic Voters

Responded
Description: A robocall imitating President Joe Biden's voice urged New Hampshire Democrats not to vote in the primary, falsely claiming that their votes would matter more in the November election. This incident undermines the democratic process.

Alleged: unknown developed and deployed an AI system, which harmed President Joe Biden, New Hampshire voters, Kathy Sullivan, and Democracy.

Incident Stats

Incident ID
628
Report Count
2
Incident Date
2024-01-22
Editors
Daniel Atherton
Democrats sound alarm over fake Biden ‘bunch of malarkey’ robocall
theguardian.com · 2024

A prominent New Hampshire Democrat wants the makers of a robocall mimicking the voice of Joe Biden and encouraging Democrats not to vote in Tuesday's primary "prosecuted to the fullest extent" for attempting "an attack on democracy" it…

Fake Biden robocall not expected to affect NH primary, official says
reuters.com · 2024
Reuters post-incident response

Jan 23 (Reuters) - A fake robocall urging Democrats not to vote in New Hampshire's presidential primary on Tuesday is unlikely to affect the results, the secretary of state said as voters cast ballots in the northern New England state.

"I d…

Variants

A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.