Incident 315: Facial Recognition Service Abused to Target Russian Porn Actresses

Description: The facial recognition service FindFace, which allows users to match photos to people's social media pages on VKontakte, was reportedly abused to de-anonymize and harass Russian women who appeared in pornography or were alleged to be sex workers.
Alleged: NtechLab developed and deployed an AI system, which harmed Russian pornographic actresses and Russian sex workers.

Suggested citation format

Lam, Khoa. (2016-04-09) Incident Number 315. In Lam, K. (ed.) Artificial Intelligence Incident Database. Responsible AI Collaborative.

Incident Stats

Incident ID: 315
Report Count: —
Incident Date: 2016-04-09
Editors: Khoa Lam


Incident Reports

Facial Recognition Service Becomes a Weapon Against Russian Porn Actresses

The developers behind “FindFace,” which uses facial recognition software to match random photographs to people’s social media pages on Vkontakte, say the service is designed to facilitate making new friends. Released in February this year, …


A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than indexing variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.