The Dark Side of AI: Websites Offering Deepfake Nudes of Women and Girls
As AI-powered image generators become more accessible, a disturbing trend has emerged: websites that digitally remove the clothes of people in photos. One such site has an unsettling feature that provides a glimpse into how these apps are being misused: two feeds of user-uploaded photos intended to “nudify” the subjects.
Shocking Display of Intended Victims
The image feeds on the site are a shocking display of intended victims, including photos of girls who are clearly minors. Other photos show adults, with captions indicating they are female friends or strangers. The site’s homepage does not display any fake nude images to visitors who aren’t logged in.
Cryptocurrency and Pricing
To create and save deepfake nude images, users are asked to log in using a cryptocurrency wallet. While pricing isn’t currently listed, a 2022 video posted by an affiliated YouTube page showed that users could buy credits to create deepfake nudes, starting at 5 credits for $5. WIRED learned about the site from a post on a subreddit about the NFT marketplace OpenSea, which linked to the YouTube page. After being contacted by WIRED, YouTube terminated the channel, and Reddit banned the user.
Cloudflare’s Involvement
The site, which went live in February 2022, uses an IP address belonging to internet security and infrastructure provider Cloudflare. Company spokesperson Jackie Dutton noted the difference between providing a site’s IP address and hosting its content, which Cloudflare does not do. WIRED notified the National Center for Missing & Exploited Children, which helps report cases of child exploitation to law enforcement.
The Reality of AI-Generated Nonconsensual Imagery
Mary Anne Franks, a professor at the George Washington University School of Law who has studied the problem of nonconsensual explicit imagery, highlights a grim reality:
There’s gonna be all kinds of sites like this that are impossible to chase down, and most victims have no idea that this has happened to them until someone happens to flag it for them.
Disturbing User-Submitted Photos
The website reviewed by WIRED has feeds with apparently user-submitted photos on two separate pages: “Home” and “Explore.” Several of the photos clearly showed girls under the age of 18, including a young girl with a flower in her hair standing against a tree and a girl in what appears to be a middle or high school classroom. Captions on the images indicated they included photos of friends, classmates, romantic partners, and even strangers.
Celebrities and Influencers Targeted
Many of the photos showed influencers who are popular on TikTok, Instagram, and other social media platforms. The most-viewed people on the site include actor Jenna Ortega, singer-songwriter Taylor Swift, and an influencer and DJ from Malaysia. Swift and Ortega have been targeted with deepfake nudes before, triggering discussions about the impacts of deepfakes and the need for greater legal protections for victims.
Legal Implications and CSAM
In the US, no federal law specifically targets the distribution of fake, nonconsensual nude images, although a handful of states have enacted their own laws. However, AI-generated nude images of minors fall under the same category as other child sexual abuse material (CSAM), according to Jennifer Newman, executive director of the NCMEC’s Exploited Children’s Division. In 2023, NCMEC received about 4,700 reports that “somehow connect to generative AI technology.”
Cryptocurrency Wallet Integration and NFTs
The deepnude site asks users to log in using a Coinbase, MetaMask, or WalletConnect cryptocurrency wallet to create and save deepfake nude images. Coinbase has launched an internal investigation into the site’s integration with its wallet, while WalletConnect and ConsenSys-owned MetaMask did not respond to requests for comment.
In 2022, the site listed 30 NFTs on OpenSea featuring unedited pictures of female Instagram and TikTok influencers. Buying an NFT with ether cryptocurrency granted access to the website. OpenSea deleted the listings and the account within 90 minutes of being contacted by WIRED.
The Creators Behind the Site
The identity and number of people behind the deepnude website remain unclear. The now-deleted OpenSea account had a profile image identical to a Google Image result for “nerd,” and the account bio claimed the creator’s mantra was to “reveal the shitty thing in this world” and share it with “all douche and pathetic bros.” An archive of the website from March 2022 claimed that the site “was created by 9 horny skill-full people,” with facetious job titles like Horny Director, Scary Stalker, and Booty Director.
The deepnude website and the disturbing trend it represents highlight the urgent need for stronger legal protections and increased awareness about the dangers of AI-generated nonconsensual imagery, particularly when it involves minors.