Criminal Code Amendment (Deepfake Sexual Material) Bill 2024 Inquiry
Legal and Constitutional Affairs Committee
The case for criminalising ‘creation’ alone, not just ‘distribution’
Rape, Child Exploitation + Image-Based Abuse: The Truth About OnlyFans
OnlyFans, a subscriber-only social media platform that allows people to sell pornographic content of themselves, is portrayed as a great way for young women to make money, and a better, safer option than traditional prostitution. Some of the dominant media narratives in support of OnlyFans frame it as an ethical alternative to mainstream porn sites, as the means for sellers to have more control over the content they produce, and even as a confidence-building endeavour for women (see our report Side Hustles and Sexual Exploitation).
We’ve previously exposed how OnlyFans puts women at risk and leaves them vulnerable to exploitation. An increasing number of young female content creators report degrading and violent requests, abusive and predatory treatment, as well as doxxing, image-based abuse and stalking. We also published an account from former OnlyFans recruiter Victoria revealing how women were exploited and degraded on the platform.
But it’s even worse. A growing number of reports expose OnlyFans hosting – and profiting from – illegal and abusive content, including videos of children, rape and non-consensually produced or shared material.
Made by men: How the term "AI generated" invisiblises sex offenders
Identifying perpetrators and victims of AI abuse
From the increasing popularity of ‘deepnude’ apps that virtually undress women, to the creation of degrading and pornographic content of women (including Taylor Swift), we’re starting to get a glimpse of how AI can be used to sexually exploit and abuse women and children online.
Putting women ‘in their place’: AI abuse of Taylor Swift means all women and girls are at risk
We’ve previously spoken out about the risks of AI to women and girls, including ‘deepfake’ and ‘deepnude’ apps that allow users to digitally manipulate images and videos to create pornographic and degrading content of women and girls.
As these technologies explode in popularity and users can create highly realistic content, women and girls are overwhelmingly at risk of being victimised – as we’ve seen with the pornographic images of Taylor Swift created and circulated last week.
Read more"Undress any girl you want": The explosion of Deepnude apps puts all women in danger
"Increasing prominence and accessibility of these services will very likely lead to further instances of online harm, such as the creation and dissemination of non-consensual nude images, targeted harassment campaigns, sextortion, and the generation of child sexual abuse material."
New research has found Deepnude apps are exploding in popularity, with undressing websites attracting 24 million unique visitors in September 2023.
Deepnude apps use AI to manipulate an image or video of a woman so that she appears undressed. Many of these apps only work on images of women.
Reddit facilitates image-based sexual abuse
Turning a blind eye to survivors
Sign the Global Joint Letter here
In partnership with National Center on Sexual Exploitation (NCOSE)