Westfield sex store Honey Birdette now backed by Euro mega pimp
*Content warning
Docler Holding buys major stake in Playboy
This is Gyorgy 'Gyuri' Gattyan.
Input for UN study on technology-facilitated gender-based violence
Submission to the Office of the High Commissioner, Human Rights Council
Submission to the Human Rights Council on forms of sex-based violence against women and girls: new frontiers and emerging issues
AI, "deepfakes" and sex dolls highlighted in our evidence to UN Special Rapporteur on Violence Against Women and Girls
Online Safety Act Review Report echoes our recommendations
We were pleased to contribute to the Statutory Review of the Online Safety Act 2021 in June 2024. It was one of 10 submissions we made in a record-breaking year in which we also achieved 34 victories!
We're off to Washington! MTR and Caitlin to address CESE Summit
We are excited to announce that members of the Collective Shout team will be heading to Washington, DC next month for the 2024 Coalition to End Sexual Exploitation Summit, hosted by the National Center on Sexual Exploitation (NCOSE) and Phase Alliance. The theme is “The Great Collision: Emerging Tech, Sexual Exploitation, and the Ongoing Pursuit of Dignity.”
Movement Director Melinda Tankard Reist and Campaigns Manager Caitlin Roper will be presenting.
“A tool to sexually exploit women and children”: Why AI will not prevent abuse
Could Artificial Intelligence be the means of preventing sexual exploitation?
A BBC article reporting on new AI sex dolls in a German cyber brothel quoted the co-founder of an agency behind one of the first AI influencers, who argued that AI could mitigate ethical concerns and prevent the sexual exploitation of individuals.
On a thread on X (formerly Twitter), our Campaigns Manager Caitlin Roper responded to this claim.
WIN! Gov to legislate against non-consensual deepfake p*rn
Since 2019 we've been highlighting the harms of deepfake and nudifying apps and calling for their removal.
Last week the Government finally announced a crackdown on the creation of deepfake p*rnography.
News.com reported: "That included legislation to outlaw the creation and non-consensual distribution of deepfake – a portmanteau of ‘deep learning’ and ‘fake’ – porn, and the sharing of sexually-explicit material using technology such as artificial intelligence."
From the PM's announcement:
The Albanese Government will introduce legislation to ban the creation and non-consensual distribution of deepfake p*rnography. Digitally created and altered sexually explicit material is a damaging form of abuse against women and girls that can inflict deep harm on victims. The reforms will make clear that creating and sharing sexually explicit material without consent, using technology like artificial intelligence, will be subject to serious criminal penalties.
Read more"No way of knowing if the child is fictional": How ‘virtual’ child sexual abuse material harms children
Child sexual abuse material (CSAM) refers to images and videos where children are sexually abused and exploited. It is the photographic or video record of the rape, sexual abuse and torture of children and infants.
AI images of little boys in fetish gear on Instagram “not actionable”, says eSafety
We’ve discovered AI-generated images of young children on Instagram that we believe constitute illegal child sexual abuse material.
The images depict little boys in minimal clothing, some adorned with fetish gear, their bodies oiled. When we reported the account to eSafety, we were told the content was not actionable given the lack of nudity and “non-sexualised posing”.
Made by men: How the term "AI-generated" invisibilises sex offenders
Identifying perpetrators and victims of AI abuse
From the increasing popularity of ‘deepnude’ apps that virtually undress women, to the creation of degrading and pornographic content of women (including Taylor Swift), we’re starting to get a glimpse of how AI can be used to sexually exploit and abuse women and children online.