The Perils of Open Source AI: Nonconsensual Pornography and Deepfakes
Microsoft’s Efforts to Curb Misuse
In an attempt to address growing concern over AI-generated nonconsensual pornography, Microsoft has implemented additional safeguards in its image generation tool. The company aims to offer a customizable product while minimizing the potential for abuse.
The Challenges of Open Source Models
Open source AI models, however, remain largely unregulated and accessible to virtually anyone. Despite the well-intentioned efforts of some community members to discourage exploitative uses, the decentralized nature of open source makes it nearly impossible to enforce strict controls, according to experts in the field.
“Open source has powered fake image abuse and nonconsensual pornography. That’s impossible to sugarcoat or qualify,” says Henry Ajder, an AI expert. Readily available online tutorials even provide instructions on circumventing built-in restrictions in open source models like Stable Diffusion, which gained notoriety in 2022.
The Ease of Customization with LoRAs
Small add-on models known as LoRAs (low-rank adaptations) further complicate the issue by letting users fine-tune Stable Diffusion to generate images in specific styles, of specific concepts, or in specific poses. These models, which often feature celebrity likenesses or explicit sexual content, are widely shared on platforms like Civitai, a community-driven marketplace for AI models. And while some creators, such as the developer of a Taylor Swift plug-in, urge users to refrain from creating NSFW images, once a model is downloaded its use is beyond the creator’s control.
The Prevalence of Deepfakes and Nonconsensual Pornography
Websites like 4chan, known for their controversial content, have become hotbeds for AI-generated nonconsensual pornography. The vast majority of deepfakes, a staggering 96 percent according to a 2019 study by Sensity AI, are nonconsensual pornography, disproportionately targeting women. Tech companies are exploring AI “watermarking” techniques to combat this issue, but the emotional and psychological toll on victims remains a pressing concern.
The Urgent Need for Safeguards
Experts like Cohen, a consultant based in Toronto, express grave concerns about the impact of AI-generated nonconsensual pornography on young women. The ease with which individuals can hyper-target and exploit others is a terrifying prospect, one that underscores the need for a safer online environment for everyone, especially the most vulnerable.
Updated 3-6-2024, 8:30 pm EST: This article has been updated to note that Reddit’s policies forbid AI-generated nonconsensual intimate media.