A sharp rise in AI deepfake nude services, as reported by Graphika, highlights a disturbing trend: the use of artificial intelligence to generate non-consensual intimate images, raising serious ethical and legal concerns.
Growing popularity of AI undressing tools
Graphika, a social media analytics company, has reported a worrying increase in the use of “AI undressing” tools: generative AI services tuned to remove clothing from images submitted by users. The company recorded a 2,408% increase in online references to these services.
Synthetic non-consensual intimate imagery (NCII) is explicit content generated without the consent of the individuals depicted. These tools enable targeted harassment, sextortion, and the production of child sexual abuse material (CSAM).
The Internet Watch Foundation (IWF), a UK-based internet watchdog, has warned that AI-generated pornography is becoming difficult to distinguish from authentic imagery, complicating efforts to combat online abuse.
International response
The United Nations has described AI-generated media as a “serious and urgent” threat to information integrity. Meanwhile, the European Parliament and Council have recently agreed on rules governing the use of AI in the EU.