AI Incident Database

Report 4361

Associated Incidents

Incident 865 · 5 Reports
Fake AI 'Nudify' Sites Reportedly Linked to Malware Distribution by Russian Hacker Collective FIN7

Russian Hackers Are Using Fake AI "Nudify" Sites to Steal Data
futurism.com · 2024

Multiple sites masquerading as "nudify" services, which use AI to turn clothed photographs into deepfaked and often nonconsensual nude images, have been linked to a notorious Russian hacker collective that was believed to be dead.

As 404 Media reports, Zach Edwards of the cybersecurity firm Silent Push said that the Russian group Fin7 seems to be behind several websites that use variations of the name "AINude.ai" to trick their mostly male victims into giving them their info without their knowledge.

"The deepfake AI software may have an audience of mostly men with a decent amount who use other AI software or have crypto accounts," Edwards told 404. "There's a specific type of audience who wants to be on the bleeding edge of creepy (while ignoring new laws around deepfakes), and who are proactively searching out deepfake AI nude software."

Edwards and his colleagues found that these Fin7-linked AI sites contained "infostealer" malware that the sites claimed was necessary to "nudify" images.

As its name suggests, infostealer malware steals data from infected machines and sends it off to the hackers' servers. Using that data, bad actors like Fin7 can threaten to release personal information unless, of course, their victims pay up.

Mighty Fall

While this scheme is relatively run-of-the-mill for shady porn sites — which the AI nude sites link to as well — perhaps what's most shocking about Silent Push's finding is that the Russian hackers in question are supposed to be defunct.

Last year, the US Department of Justice went as far as to declare that Fin7, an unusually professional outfit that ran fake security fronts and had operatives in both Russia and Ukraine, was "no more" after three of its hackers were charged and sentenced to prison.

As this news makes clear, that declaration was premature. Still, this operation's reliance on obvious Dropbox links to deliver the malware files seems far less sophisticated than Fin7's previous work, which involved setting up entire shell companies to get away with its scams.

"They are looking for people who are doing borderline shady things to start with," Edwards told 404, "and then having malware ready to serve to those people who are proactively hunting for something shady."

At the end of the day, it's hard to say who is worse: those almost certainly trying to nudify other people's images nonconsensually, or those trying to rip the creeps off.
