“A tool to sexually exploit women and children”: Why AI will not prevent abuse

Could Artificial Intelligence be the means of preventing sexual exploitation?

A BBC article reporting on new AI sex dolls in a German cyber brothel quoted the co-founder of an agency behind one of the first AI influencers, who argued that AI could mitigate ethical concerns and prevent the sexual exploitation of individuals.

On a thread on X (formerly Twitter), our Campaigns Manager Caitlin Roper responded to this claim.

[Image: Caitlin Roper's tweet]

AI won’t prevent anyone from being sexually exploited. Men are already using it as a tool to sexually exploit women and children - and to create porn of them where none exists. If women can be targeted and threatened without ever having taken or shared an intimate photo, the possibilities for men's technologically-mediated abuse of them grow exponentially.

AI allows predators to virtually undress women with nudifying apps to degrade, humiliate and violate them.

Research has found that Deepnude apps are exploding in popularity, with undressing websites attracting 24 million unique visitors in a single month.

AI-generated content* can be used to facilitate targeted harassment – see example below of an ad for a nudifying app promoted on Twitter/X. (Source: Graphika report)

[Image: ad for a nudifying app promoted on Twitter/X]

AI can be trained on child sexual abuse material, producing new virtual abuse content made in the likeness of victims it was trained on. This means new fictional CSAM is being created with the likeness of children who have been sexually abused, re-victimising them and compounding their trauma.

There is also no way of knowing whether the woman or child in AI-generated content* is fictional, and not modelled on an actual woman or child.

Using AI, perpetrators can morph regular images into abuse images. They can modify a person’s age, turning an adult into a child, or turning a non-sexualised image into one depicting a person’s sexual abuse.

[Image: example of image morphing]

AI can also be used in cases of sextortion, where criminals threaten to distribute a young person’s intimate photos for blackmail purposes. Some teens have tragically taken their lives as a result. 

This is how AI is being used. It doesn’t prevent the sexual exploitation of actual women and children; it’s yet another tool for men to abuse them.


*The term “AI generated” serves to dehumanise the act of creating abuse content and shield sexual offenders - the men creating it - from critique and accountability.

In reality, AI is merely another tool men are utilising in their abuse and exploitation of women and children. 

[Image: Michael Salter's tweet]

See also:

"No way of knowing if the child is fictional": How ‘virtual’ child sexual abuse material harms children

"Undress any girl you want": The explosion of Deepnude apps puts all women in danger

Made by men: How the term "AI generated" invisiblises sex offenders

Michael Salter thread on X



Caitlin Roper published this page in News, 2024-06-13 17:07:08 +1000
