Putting women ‘in their place’: AI abuse of Taylor Swift means all women and girls are at risk

We’ve previously spoken out about the risks of AI to women and girls, including ‘deepfake’ and ‘deepnude’ apps that allow users to digitally manipulate images and videos of women and girls to create pornographic and degrading content.

As these technologies explode in popularity and enable users to create highly realistic content, women and girls are overwhelmingly at risk of being victimised – as we saw last week with the pornographic images of Taylor Swift created and circulated online.

[Image: Taylor Swift]

A tactic to put women ‘in their place’

It’s no coincidence that as Swift’s popularity has soared - she was recently named Time Magazine’s 2023 Person of the Year and officially became a billionaire - she has been targeted. She is a powerful woman, and to some men, this makes her a threat.

Creating and distributing pornographic images of Swift being sexually used by men is a tactic designed to put her ‘in her place’. In a porn culture, perhaps one of the most effective ways to silence a woman, to trivialise her accomplishments and undermine her credibility is to turn her into pornography.

Reducing a woman to an object of men’s sexual gratification allows men to humiliate and degrade her, and to assert their own dominance. It is a punishment for failing to be a compliant woman, and it sends a message to other women - this is what could happen to them if they step ‘out of line’.

All women and girls are at risk

But it’s not only celebrities and public figures being targeted – women and girls everywhere are at risk. 

Already, some high school students across the world, from New Jersey to Spain, have reported that AI-manipulated images of them were created and shared online by classmates. Meanwhile, a well-known young female Twitch streamer discovered her likeness had been used in a fake, explicit pornographic video that spread quickly throughout the gaming community.

“It’s not just celebrities [targeted],” said Danielle Citron, a professor at the University of Virginia School of Law. “It’s everyday people. It’s nurses, art and law students, teachers and journalists. We’ve seen stories about how this impacts high school students and people in the military. It affects everybody.” Read more at CNN.

Harms to children

It's not just women being targeted, but children too. According to the children’s charity NSPCC, young people are reaching out to Childline about the harms of AI-generated content. In our own research, we have discovered sexualised and fetish-themed AI-generated content of children posted on Instagram, and are awaiting a response from eSafety.

In our submission on eSafety Industry Standards, we documented our recent discoveries of how AI is being used to facilitate child sexual exploitation:

● On the popular AI frontend platform Chub, we found child sexual abuse narrative and chat generators. For example, one character was a 14-year-old girl confined to a hospital bed in a coma, with a character description implying a male doctor's desire to abuse the defenceless child. Another character was designed to generate chats for men to fantasise about raping teen girls with disabilities. Many ‘NSFW’ characters were tagged ‘little sister’ and were designed to generate incest-themed child exploitation material.

● Highly realistic sexualised imagery of prepubescent girls distributed on X (formerly Twitter). Content was often tagged #stablediffusion (denoting generation by Stable Diffusion, the text-to-image model created by Stability AI). The content often had extensive reach and engagement (millions of views; thousands of likes and shares) and revealed paedophile networking and other child exploitation activity.

● Instagram has been hosting AI-generated content fetishising young boys.

● Pornified, objectifying and sexualised AI content produced using the likeness of real women and girls. For example, the Neural.Love AI ‘art generator’ hosted images of a 16-year-old ‘model’ and ‘influencer’. The creator titled the images ‘[girl's name] with little clothing’. She is a known victim of child exploitation. We also documented AI child exploitation images created in the likeness of a child version of actor Emma Watson.


The phenomenon of AI-generated pornography of women, and image-based abuse more broadly, communicates something about the nature of pornography. The men who create this imagery are not doing it with the intention of “empowering” women, or in the interest of women’s sexual liberation. That turning women into porn is understood to be such an effective means of shaming, silencing and humiliating them makes clear exactly what porn is and what it does to women.

See also

"Undress any girl you want": The explosion of Deepnude apps puts all women in danger

Submission to eSafety Industry Standards



Caitlin Roper published this page in News, 30 January 2024.
