AI Incident Database

Report 4365

Associated Incidents

Incident 8655 Report
Fake AI 'Nudify' Sites Reportedly Linked to Malware Distribution by Russian Hacker Collective FIN7

Breach Roundup: AI 'Nudify' Sites Serve Malware
bankinfosecurity.com · 2024

"Nudify" websites promising fake pornographic content based on a real-life photo may serve up malware alongside the sexual abuse.

Researchers from Silent Push, in research published Wednesday, observed the Russia-based, financially motivated threat group commonly tracked as Fin7 running a network of websites that promise to digitally undress women. The sites, many under the brand name aiNude.ai, embed a Trojan or infostealer in a web extension or other file that users are directed to download. 404 Media reported that some of the malicious sites allowed users to upload images. "The site did not nudify the image, but did display it on screen. After uploading a photo to nudify, one of the sites then said a 'trial is ready for download.'"

Fin7 is serving up Lumma Stealer, the NetSupport remote access Trojan and RedLine credential-stealing malware.

The threat actor - also tracked as Carbon Spider, Elbrus and Sangria Tempest - has been active since 2013. Security researchers have found indications of its involvement in deploying REvil and DarkSide ransomware. Microsoft last year said the group has ties to the Clop ransomware gang.

The group runs two versions of nudify sites: one offering a free download of a "Deepnude Generator" tool, and another providing a putative free trial, using search engine optimization tactics to boost rankings of its sites.

Sites that create nude deepfakes have proliferated online along with the public availability of generative AI image models. San Francisco city attorney David Chiu in August sued 16 of the most popular "nudify" websites and apps, accusing them of violating state and federal laws against sexual abuse and harassment. The FBI in June warned that malicious actors used nude deepfakes as blackmail material.

2024 - AI Incident Database
