We're off to Washington! MTR and Caitlin to address CESE Summit
We are excited to announce that members of the Collective Shout team will be heading to Washington, DC next month for the 2024 Coalition to End Sexual Exploitation Summit, hosted by the National Center on Sexual Exploitation (NCOSE) and Phase Alliance. The theme is “The Great Collision: Emerging Tech, Sexual Exploitation, and the Ongoing Pursuit of Dignity.”
Movement Director Melinda Tankard Reist and Campaigns Manager Caitlin Roper will be presenting.
“A tool to sexually exploit women and children”: Why AI will not prevent abuse
Could Artificial Intelligence be the means of preventing sexual exploitation?
A BBC article reporting on new AI sex dolls in a German cyber brothel quoted the co-founder of an agency behind one of the first AI influencers, who argued that AI could mitigate ethical concerns and prevent the sexual exploitation of individuals.
In a thread on X (formerly Twitter), our Campaigns Manager Caitlin Roper responded to this claim.
WIN! Gov to legislate against non-consensual deepfake p*rn
Since 2019 we've been highlighting the harms of deepfake and nudifying apps and calling for their removal.
Last week the Government finally announced a crackdown on the creation of deepfake p*rnography.
News.com reported: "That included legislation to outlaw the creation and non-consensual distribution of deepfake – a portmanteau of ‘deep learning’ and ‘fake’ – porn, and the sharing of sexually-explicit material using technology such as artificial intelligence."
From the PM's announcement:
The Albanese Government will introduce legislation to ban the creation and non-consensual distribution of deepfake p*rnography. Digitally created and altered sexually explicit material is a damaging form of abuse against women and girls that can inflict deep harm on victims. The reforms will make clear that creating and sharing sexually explicit material without consent, using technology like artificial intelligence, will be subject to serious criminal penalties.
Read more"No way of knowing if the child is fictional": How ‘virtual’ child sexual abuse material harms children
Child sexual abuse material (CSAM) refers to images and videos in which children are sexually abused and exploited. It is the photographic or video record of the rape, sexual abuse and torture of children and infants.
AI images of little boys in fetish gear on Instagram “not actionable”, says eSafety
We’ve discovered AI-generated images of young children on Instagram that we believe constitute illegal child sexual abuse material.
The images depict little boys in minimal clothing, some adorned with fetish gear, their bodies oiled. When we reported the account to eSafety, we were told the content was not actionable given the lack of nudity and “non-sexualised posing”.
Made by men: How the term "AI generated" invisibilises sex offenders
Identifying perpetrators and victims of AI abuse
From the increasing popularity of ‘deepnude’ apps that virtually undress women, to the creation of degrading and pornographic content of women (including Taylor Swift), we’re starting to get a glimpse of how AI can be used to sexually exploit and abuse women and children online.
Putting women ‘in their place’: AI abuse of Taylor Swift means all women and girls are at risk
We’ve previously spoken out about the risks of AI to women and girls, including ‘deepfake’ and ‘deepnude’ apps that allow users to digitally manipulate images and videos and create pornographic and degrading images of them.
As these technologies explode in popularity and users can create highly realistic content, women and girls are overwhelmingly at risk of being victimised, as we’ve seen with the pornographic images of Taylor Swift that circulated last week.
Read more"Undress any girl you want": The explosion of Deepnude apps puts all women in danger
"Increasing prominence and accessibility of these services will very likely lead to further instances of online harm, such as the creation and dissemination of non-consensual nude images, targeted harassment campaigns, sextortion, and the generation of child sexual abuse material."
New research has found that Deepnude apps are exploding in popularity, with undressing websites attracting 24 million unique visitors in September 2023.
Deepnude apps use AI to manipulate an image or video of a woman so that her clothing is removed, allowing users to virtually undress her. Many of these apps only work on images of women.