Rape, Child Exploitation + Image-Based Abuse: The Truth About OnlyFans
OnlyFans, a subscriber-only social media platform that allows people to sell pornographic content of themselves, is portrayed as a great way for young women to make money, and a better, safer option than traditional prostitution. Some of the dominant media narratives in support of OnlyFans frame it as an ethical alternative to mainstream porn sites, as the means for sellers to have more control over the content they produce, and even as a confidence-building endeavour for women (see our report Side Hustles and Sexual Exploitation).
We’ve previously exposed how OnlyFans puts women at risk and leaves them vulnerable to exploitation. An increasing number of young female content creators report degrading and violent requests, abusive and predatory treatment, as well as doxxing, image-based abuse and stalking. We also published an account from former OnlyFans recruiter Victoria revealing how women were exploited and degraded on the platform.
But it’s even worse. A growing number of reports expose OnlyFans hosting – and profiting from – illegal and abusive content, including videos of children, rape and non-consensually produced or shared material.
WIN! Gov to legislate against non-consensual deepfake p*rn
Since 2019 we've been highlighting the harms of deepfake and nudifying apps and calling for their removal.
Last week the Government finally announced a crackdown on the creation of deepfake p*rnography.
News.com reported: "That included legislation to outlaw the creation and non-consensual distribution of deepfake – a portmanteau of ‘deep learning’ and ‘fake’ – porn, and the sharing of sexually-explicit material using technology such as artificial intelligence."
From the PM's announcement:
The Albanese Government will introduce legislation to ban the creation and non-consensual distribution of deepfake p*rnography. Digitally created and altered sexually explicit material is a damaging form of abuse against women and girls that can inflict deep harm on victims. The reforms will make clear that creating and sharing sexually explicit material without consent, using technology like artificial intelligence will be subject to serious criminal penalties.
Putting women ‘in their place’: AI abuse of Taylor Swift means all women and girls are at risk
We’ve previously spoken out about the risks of AI to women and girls, including ‘deepfake’ and ‘deepnude’ apps that allow users to digitally manipulate images and videos of women and girls to create pornographic and degrading content.
As these technologies explode in popularity and users can create highly realistic content, women and girls are overwhelmingly at risk of being victimised – as we saw with the pornographic images of Taylor Swift that circulated last week.
Read more"Undress any girl you want": The explosion of Deepnude apps puts all women in danger
"Increasing prominence and accessibility of these services will very likely lead to further instances of online harm, such as the creation and dissemination of non-consensual nude images, targeted harassment campaigns, sextortion, and the generation of child sexual abuse material."
New research has found Deepnude apps are exploding in popularity, with undressing websites attracting 24 million unique visitors in September 2023.
Deepnude apps allow users to virtually undress a woman, using AI to manipulate an image or video of her to remove her clothing. Many of these apps only work on images of women.
Submission to Social Media and Online Safety inquiry
Our call to rein in big tech 'rogue states'
Submission to Canadian parliamentary Ethics Committee: Protection of Privacy and Reputation on Platforms such as Pornhub
Collective Shout welcomed the opportunity to provide a brief to assist Canada's House of Commons Standing Committee on Access to Information, Privacy and Ethics in its examination of the conduct of MindGeek - Pornhub's parent company - including allegations of facilitating and distributing Child Sexual Abuse Material (CSAM), non-consensual sexual activity, non-consensually shared images (image-based abuse or IBA) and content created using victims of sex trafficking.