Made by men: How the term "AI generated" invisibilises sex offenders

Identifying perpetrators and victims of AI abuse

From the rising popularity of 'deepnude' apps that virtually undress women, to the creation of degrading pornographic content depicting women (including Taylor Swift), we're starting to get a glimpse of how AI can be used to sexually exploit and abuse women and children online.

Generally, the term artificial intelligence (AI) refers to the simulation of human intelligence in machines. 'Generative AI' is more specific: it describes the use of machine learning to create new digital content, including text, images, audio and video. (Read more about Generative AI on eSafety's website.)

As we engage in a much-needed discussion about AI and how to protect women and children from this form of image-based abuse, we need to consider who is behind these violations.

As we argued in our recent submission to eSafety Industry Standards, AI itself is not creating child sexual abuse material (CSAM) or image-based abuse material. AI content is generated by real people who prompt machine learning software. This software is trained on a vast body of digitised images and videos including real child sexual abuse material, images of real children and other real pornography created by real people.

[Image: #AIgirl on X and Instagram]

For this reason we question the validity of the term "AI generated" in connection with CSAM and image-based abuse material. We know that these types of abuse are highly gendered: males are most often the perpetrators and consumers of CSAM, and males are also more likely to perpetrate image-based sexual abuse, while females are more likely to be victimised by a partner or ex-partner.

We think the term "AI generated" serves to dehumanise the act of creating abuse content and to shield sex offenders, the men creating it, from critique and accountability. In reality, AI is merely another tool men are using in their abuse and exploitation of women and children.

We have urged eSafety to ensure its discussions of AI always directly address how this material is created and who its perpetrators and victims are, and we will continue to push back against language that shields sex offenders.

Take action

Sign the petition: Stop AI Image Theft!

We are supporting organisers Chelsea Bonner, Robyn Lawley and Tracey Spicer with their petition calling for regulation of the use of AI images.

Add your name>> https://chng.it/2tmfHYNsFj

See also

"Undress any girl you want": The explosion of Deepnude apps puts all women in danger
