The Perils of Open Source AI: Nonconsensual Pornography and Deepfakes
The Double-Edged Sword of Accessible AI Models
As AI-generated images become increasingly realistic, the potential for abuse grows. Open source models, while democratizing access to powerful tools, also enable the creation of nonconsensual pornography and deepfakes. Without built-in safeguards, these models can be easily manipulated by anyone, making it nearly impossible to control their misuse.
“Open source has powered fake image abuse and nonconsensual pornography. That’s impossible to sugarcoat or qualify,” says Henry Ajder, an AI expert and consultant.
The Dangers of LoRAs and AI Model Marketplaces
LoRAs, short for Low-Rank Adaptation, are small adapter files that fine-tune AI models like Stable Diffusion to generate images with specific styles, concepts, or poses. These adapters are readily available on platforms like Civitai, where creators share and download them. Despite some creators’ efforts to discourage NSFW use, once downloaded, the models are beyond their control.
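Part of why LoRAs spread so easily is their size: rather than a full copy of a model's weights, a LoRA stores only a pair of small low-rank matrices that are added to a layer at inference time. A minimal sketch of the idea follows; the dimensions, scaling factor, and variable names are illustrative, not taken from any particular model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Full weight matrix of one layer in a base model (thousands of
# dimensions in practice; tiny here for illustration).
d = 64
W = rng.standard_normal((d, d))

# A LoRA stores only two low-rank factors: A (r x d) and B (d x r),
# with rank r much smaller than d.
r = 4
A = rng.standard_normal((r, d)) * 0.01
B = rng.standard_normal((d, r)) * 0.01

# At inference, the adapted weight is W plus the low-rank update B @ A,
# commonly scaled by alpha / r.
alpha = 8
W_adapted = W + (alpha / r) * (B @ A)

# The adapter is far smaller than the layer it modifies:
full_params = W.size            # d * d  -> 4096
lora_params = A.size + B.size   # 2 * r * d -> 512
print(full_params, lora_params)
```

Because only the small factors are shipped, an adapter that meaningfully changes a model's output can be a file of megabytes rather than gigabytes, which is what makes sharing them on marketplaces so frictionless.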
The Prevalence of Nonconsensual Pornography and Its Impact on Women
A 2019 study by Sensity AI found that 96 percent of deepfake videos online were nonconsensual pornography, disproportionately targeting women. This creates a hostile and unsafe environment for young women, as individuals can be hyper-targeted by malicious actors. The consequences extend beyond high-profile figures like Taylor Swift, affecting everyday people in profound ways.
Efforts to Combat AI-Generated Abuse
As the threat of AI-generated nonconsensual content grows, various stakeholders are exploring solutions. Some companies, like Microsoft, have introduced new controls to their image generators. Others are investigating AI watermarking techniques to identify manipulated content. However, the open source nature of many AI models makes it challenging to implement comprehensive safeguards.
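Watermarking approaches vary widely, and the sketch below is only a toy illustration of the general idea: hiding an identifying bit pattern in the least significant bits of pixel values. Real AI watermarking schemes are considerably more robust (a simple LSB mark does not survive compression or resizing), and the function names here are hypothetical, not from any real library.

```python
import numpy as np

def embed_bits(image, bits):
    """Hide a bit string in the least significant bit of each pixel."""
    flat = image.flatten()  # flatten() returns a copy
    assert len(bits) <= flat.size
    # Clear each target pixel's lowest bit, then set it to the payload bit.
    flat[:len(bits)] = (flat[:len(bits)] & 0xFE) | np.array(bits, dtype=np.uint8)
    return flat.reshape(image.shape)

def extract_bits(image, n):
    """Read back the first n hidden bits."""
    return (image.flatten()[:n] & 1).tolist()

# Toy 8x8 grayscale "image".
rng = np.random.default_rng(1)
img = rng.integers(0, 256, size=(8, 8), dtype=np.uint8)

mark = [1, 0, 1, 1, 0, 0, 1, 0]   # hypothetical provenance tag
marked = embed_bits(img, mark)
print(extract_bits(marked, 8))     # [1, 0, 1, 1, 0, 0, 1, 0]
```

The fragility of such naive marks is precisely why researchers are investigating schemes embedded deeper in the generation process itself, and why open source models, whose code can simply omit the watermarking step, remain hard to cover.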
“I’m generally an optimist, not a doomer, but I also have a daughter, and I want her to grow up in a world that’s safe for everyone else,” says Barak Cohen, a Toronto-based consultant.
As AI technology advances, it is crucial to address the potential for abuse and develop robust measures to protect individuals, particularly women, from the harmful effects of nonconsensual pornography and deepfakes.
4 Comments
The ethical dilemmas these AI tools present are as fascinating as they are worrisome, wouldn’t you agree?
Peregrine: Open source AI image generators? Talk about skating on the thin ice of copyright laws!
Open Source AI image generators, a Pandora’s box or the future of creativity? Let’s dive in!
Wrenching open the box of AI image generators, are we ready for what spills out?