The darker side of artificial intelligence: how the multimillion-dollar global fake-nude industry operates

An epidemic of commercial applications that use artificial intelligence (AI) to generate non-consensual nude images is spreading across the internet, in plain sight. Although some governments are pushing legal measures to shut these services down, millions of people continue to use them, and their promoters are making millions in profits.
The explicit content generated by these deepfake platforms, legally referred to as non-consensual intimate image abuse (NIIA), has become a growing and increasingly sophisticated form of digital violence.
This phenomenon, a form of collateral damage from advances in generative AI, has given rise to a dark ecosystem of "nudification" websites, algorithms, and bots designed to produce artificial images without the victims' consent.
Producing such explicit images once required advanced technical knowledge; now, platforms that automate the process put it within anyone's reach. In just a few clicks, users can alter clothing, body shape, or pose, generating sexual content with disturbing ease.
Apps are capable of erasing clothes from any photo.
Since the beginning of this year, advertising for platforms offering AI-generated nudes has increased by 2,400%, according to a study by the University of Florida. The campaigns typically run on major social media platforms, where these sites promote their services to attract new customers.
"These images represent a radically different paradigm than those created with Photoshop. Generative AI makes their creation easier, faster, and more realistic," explains Kevin Butler, PhD from the Department of Computer Science at the University of Florida.
And he adds: "Synthetic images have the potential to cause harm. This constitutes a form of sexual abuse against the subject who, by definition, is depicted without consent."
These attacks began with fake photos of pop stars such as Taylor Swift and Billie Eilish, spread to public figures such as Italy's Prime Minister, Giorgia Meloni, and eventually came to target any individual, regardless of sex or age.
A survey by Save the Children, released a week ago, found that one in five young people said someone had shared AI-generated nude images of them while they were underage and without their consent.
"These figures represent only the tip of the iceberg, as most cases remain unreported , partly due to the lack of reporting and the difficulties in detecting them, which are exacerbated when these incidents occur online," warns Catalina Perazzo, Director of Social and Political Advocacy at Save the Children.
The anonymity, virality, and technical ease with which these images are generated, as Wired points out, contribute to their rapid spread, leaving deep emotional scars and exposing unresolved legal loopholes.
These pornography marketplaces, which operate with a veneer of legitimacy, host their data on mainstream cloud services such as Amazon Web Services and Cloudflare, without those providers being aware of how it is used. That such sites sit on "trusted platforms" reveals a disturbing normalization of the technology.
At ClothOff, the motto is "undressing photos has never been easier."
An analysis by the research site Indicator revealed the alarming scope of the nudification business: in just six months, these portals recorded a monthly average of 18.5 million visitors.
The economic estimates for this leviathan are equally striking: the combined profits of sites such as Deepnude, UndressAI.tools, and PornWorks AI could exceed $36 million annually.
One of the most popular is ClothOff, which draws more than 4 million monthly visitors and proclaims, "Get your nude photo processed in just 15 seconds." The app can be accessed from a mobile phone after clicking a button to confirm the user is over 18, and it charges approximately $6 per 144 credits.
The model behind this lucrative system is based on the sale of "credits" or subscriptions that let users generate fake images and videos with non-consensual nudity. Payments, channeled through cryptocurrency, give operators a double benefit: anonymity and financial stability.
AI "nudification" services have become a lucrative business, sustained by the lack of real control over generative AI, denounces Alexios Mantzarlis, co-founder of Indicator and online security researcher. "Silicon Valley's laissez-faire approach has allowed this market to persist, even when it became clear that its sole use was sexual harassment," he warns.
These materials are often hyperrealistic, maximizing their ability to cause harm and to deceive both victims and viewers.
While early nudification tools focused almost exclusively on still images, technological advances have enabled the manipulation and fabrication of altered videos, taking the phenomenon to a new level of danger and scope.
The business isn't without risks for those trying to profit from it. Its popularity has spawned multiple fraudulent sites, some operated by hackers who offer app versions laced with malware designed to capture personal and financial data.
No technical knowledge is needed to manipulate a photo at will.
Permanently removing this material is very difficult; abusive content keeps resurfacing, prolonging victims' emotional distress and sense of vulnerability. To combat this digital scourge, the United States and Great Britain have launched legal and regulatory actions.
President Donald Trump signed the bipartisan "Take It Down" Act, which criminalizes the non-consensual posting of intimate images and requires their removal from platforms that have featured them.
Meta also recently announced it would file a lawsuit against a Hong Kong company behind a nudify app called Crush AI, which it said repeatedly circumvented the tech giant's rules for running ads on its platforms.
A recent survey by Thorn, a nonprofit organization that combats online child exploitation, found that 6% of American teens have been victims of fake nudes. The phenomenon, already a cause for concern due to its psychological impact, becomes even more serious when it intersects with crimes like sextortion.
The FBI warned of a "horrific increase" in cases targeting minors, primarily boys between the ages of 14 and 17. According to the agency, these threats have led to an "alarming number of suicides" due to the shame of being exposed.
One such tragedy occurred this year in Kentucky, where a teenager took his own life after receiving extortion messages demanding $3,000 to keep secret a fake nude image of him created with AI. His parents discovered the messages after his death.
Clarín