Incident 723: Instagram Algorithms Reportedly Directed Children's Merchandise Ad Campaign to Adult Men and Sex Offenders

Description: An Instagram ad campaign for children's merchandise was intended to reach adult women but was instead shown predominantly to adult men, including convicted sex offenders, due to Instagram's algorithmic targeting. The misdirected targeting reportedly led to direct solicitations for sex with the 5-year-old model featured in the ads.

Alleged: Meta developed an AI system deployed by Meta and Instagram, which harmed Instagram users, Instagram sellers, and children.

Incident Stats

Incident ID
723
Report Count
2
Incident Date
2024-05-13
Editors
Daniel Atherton
On Instagram, a Jewelry Ad Draws Solicitations for Sex With a 5-Year-Old
nytimes.com · 2024

When a children's jewelry maker began advertising on Instagram, she promoted photos of a 5-year-old girl wearing a sparkly charm to users interested in parenting, children, ballet and other topics identified by Meta as appealing mostly to w…

The Influencer Is a Young Teenage Girl. The Audience Is 92% Adult Men.
wsj.com · 2024

The mom started the Instagram account three years ago as a pandemic-era diversion---a way for her and her daughter, a preteen dancer, to share photos with family, friends and other young dancers and moms. The two bonded, she said, as they p…

Variants

A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.
