Shein: Stop selling child sex abuse dolls!
Just two weeks after we announced a victory against Temu, which pulled child sex abuse dolls from sale, we’ve discovered child sex abuse dolls and disembodied replica child heads and legs sold by online retailer Shein.
We've previously called out Shein for objectifying women and sexualising girls in its advertising, for flogging sexualised toys and see-through g-strings for toddler girls, and for fetishising schoolgirls with 'sexy schoolgirl' costumes. Given this track record, perhaps we shouldn't be surprised that the company is now selling replica little girls and body parts for men's sexual use.
We approached the company’s Global Head of Strategic Communications, Peter Pernot-Day, with our concerns, but were ignored.
Child sex abuse dolls normalise and encourage men’s sexual use and abuse of children. These products are illegal and constitute child sexual abuse material.
Take Action
Tell Shein to remove these listings from sale and commit to never selling them again.
Facebook: Shein Official | Shein Australia
Twitter/X
Instagram: Shein Official | Shein Australia
Shein support
“A tool to sexually exploit women and children”: Why AI will not prevent abuse
Could Artificial Intelligence be the means of preventing sexual exploitation?
A BBC article reporting on new AI sex dolls in a German cyber brothel quoted the co-founder of an agency behind one of the first AI influencers, who argued that AI could mitigate ethical concerns and prevent the sexual exploitation of individuals.
On a thread on X (formerly Twitter), our Campaigns Manager Caitlin Roper responded to this claim.
Read more"No way of knowing if the child is fictional": How ‘virtual’ child sexual abuse material harms children
Child sexual abuse material (CSAM) refers to images and videos where children are sexually abused and exploited. It is the photographic or video record of the rape, sexual abuse and torture of children and infants.
Read more
AI images of little boys in fetish gear on Instagram “not actionable”, says eSafety
We’ve discovered AI-generated images of young children on Instagram we believe constitute illegal child sexual abuse material.
The images depict little boys in minimal clothing, some adorned with fetish gear, and their bodies oiled. When we reported the account to eSafety, we were told the content was not actionable given the lack of nudity and “non-sexualised posing”.
Read more
No evidence child sex abuse dolls prevent abuse: Our response to viral video
A video posted to Twitter featuring comments from a US state senator defending child sex abuse dolls has gone viral.
In the video, Kentucky state senator Karen Berg claimed that there was “conclusive” research finding access to child sex abuse dolls could prevent paedophiles sexually abusing children.
This is false. I’ve been researching sex dolls and child sex abuse dolls for years, and I’ve written a book on the topic. “Conclusive” evidence that child sex abuse dolls reduce or prevent sexual abuse does not exist. (And I reject that claim completely.) 🧵
— Caitlin Roper (@caitlin_roper) March 2, 2024
vid - @way2muchJRMC pic.twitter.com/WjIWvoJGis
The video attracted significant negative attention, with the senator issuing a statement in response and ultimately voting in support of a motion to outlaw child sex abuse dolls.
So what does the research say about child sex abuse dolls? (Content warning - may be distressing for some readers.)
Read more
Made by men: How the term "AI generated" invisibilises sex offenders
Identifying perpetrators and victims of AI abuse
From the increasing popularity of ‘deepnude’ apps that virtually undress women, to the creation of degrading and pornographic content of women (including Taylor Swift), we’re starting to get a glimpse of how AI can be used to sexually exploit and abuse women and children online.
Read more
Putting women ‘in their place’: AI abuse of Taylor Swift means all women and girls are at risk
We’ve previously spoken out about the risks of AI to women and girls, including ‘deepfake’ and ‘deepnude’ apps that allow users to digitally manipulate images and videos of women and girls to create pornographic and degrading content.
As these technologies explode in popularity and users can create highly realistic content, women and girls are overwhelmingly at risk of being victimised, as we saw with the pornographic images of Taylor Swift circulated last week.
Read more