"Increasing prominence and accessibility of these services will very likely lead to further instances of online harm, such as the creation and dissemination of non-consensual nude images, targeted harassment campaigns, sextortion, and the generation of child sexual abuse material."
New research has found Deepnude apps are exploding in popularity, with undressing websites attracting 24 million unique visitors in September 2023.
Deepnude apps use AI to manipulate an image or video of a person so that they appear naked, allowing users to virtually "undress" them. Many of these apps work only on images of women.
A Revealing Picture, a report by intelligence company Graphika, found that while these services previously existed on niche internet forums, the creation and dissemination of "synthetic non-consensual intimate imagery" (NCII) had become an automated, scaled and monetised industry, with many providers using popular social media platforms to market their services.
Creating non-consensual intimate imagery is quicker and easier than ever before. According to Graphika, the increasing capability and accessibility of open-source AI image diffusion models allow a larger number of providers to create realistic content without having to host, maintain and run their own custom image models, which can be expensive and time-consuming.
In many cases, users can begin generating and accessing synthetic NCII within minutes of first visiting a provider’s website or Telegram group, often for no upfront cost.
Graphika warns:

"Increasing prominence and accessibility of these services will very likely lead to further instances of online harm, such as the creation and dissemination of non-consensual nude images, targeted harassment campaigns, sextortion, and the generation of child sexual abuse material."
The report included advertising material that an NCII provider shared on X (formerly Twitter), promoting the service as a tool for sexual harassment.
A 2020 report by UK domestic violence service Refuge exposed how threats to share intimate or sexual images and videos are increasingly used as a tool of coercive control and domestic abuse, with a marked increase in survivors reporting such threats from current and former partners.
If women can be targeted and threatened without even having taken or shared an intimate photo, the possibilities for men's technologically mediated abuse of them grow exponentially.
We wrote about the growing phenomenon of Deepnude apps back in 2020, describing them as a new way for men to prey on women. We noted that the appeal of this technology lies in the criminal violation of women: these images are designed to humiliate women who, it is understood, did not consent to their creation and distribution.
We are working on a campaign against these apps and will update our supporters soon!
See also
Deepnude app taken offline after backlash
New Deepnude app allows men to produce nude images of women
Survivors call for legislation against image-based sexual abuse