Google’s Struggle with Deepfake Nudes and Nonconsensual Explicit Imagery
Introduction
A recent Google search for “deepfake nudes Jennifer Aniston” underscored a long-running problem: intimate images spreading online without the consent of the people depicted. The issue has persisted despite various proposals from Google staff and outside experts to address it.
Google’s Response to Nonconsensual Explicit Imagery
Google has made it easier to request the removal of unwanted explicit content. However, victims and their advocates have called for more proactive measures. The company has been cautious about over-regulating the internet or restricting access to legitimate adult content. A Google spokesperson mentioned that multiple teams are working to enhance safeguards against nonconsensual explicit imagery (NCEI).
The Role of AI in NCEI
The rise of AI image generators, some with minimal restrictions, has exacerbated the issue of NCEI. These tools allow almost anyone to create explicit images of individuals, from classmates to celebrities.
Google’s Takedown Efforts
In March, Google received over 13,000 requests to remove links to popular websites hosting explicit deepfakes. The company complied in about 82% of the cases.
New Measures to Combat NCEI
Google is implementing three new measures to reduce the visibility of unwanted explicit images, both real and synthetic:
- Duplicate Prevention: After honoring a takedown request, Google will try to keep copies of the same image out of search results (see the illustrative sketch below).
- Filtering Similar Queries: Explicit images will also be filtered from results for queries similar to the one cited in the takedown request.
- Demotion of Noncompliant Websites: Websites that generate a high volume of successful takedown requests will be demoted in search results.
“These efforts are designed to give people added peace of mind, especially if they’re concerned about similar content about them popping up in the future,” Google product manager Emma Higham wrote in a blog post announcing the changes.
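Google has not said how its duplicate detection works. One common technique for catching re-uploads of a removed image is perceptual hashing, which produces similar fingerprints for visually similar images even after resizing or recompression. The Python sketch below is a minimal illustration of that idea, assuming the open-source imagehash library; the takedown registry and distance threshold are hypothetical stand-ins, not Google’s actual system.

```python
# Illustrative sketch only: one plausible way to flag near-duplicates
# of images already removed via takedown requests. Not Google's system.
from PIL import Image
import imagehash

# Hypothetical registry of perceptual hashes for taken-down images.
takedown_hashes: set[imagehash.ImageHash] = set()

def register_takedown(image_path: str) -> None:
    """Record the perceptual hash of an image honored in a takedown."""
    takedown_hashes.add(imagehash.phash(Image.open(image_path)))

def is_known_duplicate(image_path: str, max_distance: int = 5) -> bool:
    """Return True if the image is a near-duplicate of a removed one.

    Subtracting two ImageHash values yields their Hamming distance;
    small distances indicate the same underlying picture.
    """
    candidate = imagehash.phash(Image.open(image_path))
    return any(candidate - known <= max_distance for known in takedown_hashes)
```

Unlike a cryptographic hash, a perceptual hash changes only slightly when an image is cropped, scaled, or recompressed, which is why a small distance threshold (rather than exact matching) can catch re-uploads.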
Limitations and Future Steps
Google acknowledges that these measures are not foolproof, and former employees and victims’ advocates argue more can be done. In the US, the search engine currently warns users searching for naked images of children that such content is illegal. While the warning’s effectiveness is uncertain, advocates see it as a potential deterrent. No similar warning appears for searches seeking sexual deepfakes of adults, and a Google spokesperson said the company has no plans to add one.