Incident 166: Networking Platform Giggle Employs AI to Determine Users’ Gender, Allegedly Excluding Transgender Women

Description: Giggle, a social networking platform, allegedly collected, shared with third parties, and used sensitive biometric data to verify via facial recognition whether a person is a woman. Critics claimed the practice was discriminatory against women of color and harmful to trans women.


Alleged: Kairos developed an AI system deployed by Giggle, which harmed trans women and women of color.

Incident Stats

Incident ID:
Report Count:
Incident Date:
Editors: Sean McGregor, Khoa Lam
This girls-only app uses AI to screen a user’s gender — what could go wrong? · 2020

A new social app called Giggle is pitching itself as a girls-only networking platform. To sign up, users have to take a selfie. And while that might not sound too invasive, the app then uses “bio-metric gender verification software” to dete…

A social media app just for 'females' intentionally excludes trans women — and some say its face-recognition AI discriminates against women of color, too · 2022

An app marketed toward "females" has faced a barrage of online criticism for excluding transgender women through its use of artificial intelligence.

Giggle, which first launched in early 2020, according to The Verge, uses facial recognition t…


A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.

Similar Incidents

By textual similarity
