Child sexual abuse material (CSAM) refers to images and videos in which children are sexually abused and exploited. It is the photographic or video record of the rape, sexual abuse and torture of children and infants.
‘Virtual’ child sexual abuse material is produced without the use of living children and purports to depict fictional children. It can include computer-generated/AI-generated* depictions of children, cartoons or drawings, and child sex abuse dolls. Under Australian law, this material constitutes illegal child sexual abuse material. The Commonwealth Criminal Code prohibits the sale, production, possession and distribution of offensive and abusive material that depicts a person, or is a representation of a person, who is or appears to be under 18.
While some paedophile rights groups and defenders of virtual child sexual abuse material - or what they call “fantasy sexual outlets” - claim its use is victimless, this material harms children in a number of ways.
How children are harmed by men’s production and use of virtual CSAM
Virtual CSAM can be created depicting actual children – including abuse victims
AI software can be trained on existing child sexual abuse material and, as a result, produce new virtual content that resembles the actual victims it was trained on. This means new fictional CSAM is being created with the likeness of children who have been sexually abused, revictimising them and compounding their trauma.
Through AI, perpetrators can morph regular images into CSAM. An individual’s age can be changed, with a photo of a young adult being morphed to look like a child. A non-sexualised image of a child could be morphed into an image depicting them being sexually abused.
[Image from Generative ML and CSAM: Implications and Mitigations (Thiel, Stroebel & Portnoff, 2023)]
As some scholars point out, there is no way of knowing whether the child depicted in virtual CSAM is fictional. The child depicted could be modelled on an actual child (just as we have documented in the production of child sex abuse dolls).
Perpetrators can also use AI to modify real child sexual abuse images so they appear to be fake, editing them to look like cartoons or sketches and in the process, concealing the abuse of an actual child.
Increasingly realistic virtual CSAM impedes law enforcement investigations
Technological advances in artificial intelligence software have led to the development of increasingly realistic content. Experts believe that AI-generated CSAM will soon be indistinguishable from actual depictions of children being sexually abused.
The implications of this are huge. If images and videos of children being sexually abused cannot be distinguished from virtual CSAM, it will significantly hinder law enforcement’s ability to investigate them and make it harder to protect actual child abuse victims.
The onslaught of virtual CSAM will likely overwhelm reporting systems. Given the highly realistic nature of virtual CSAM, investigators will require new tools and specialised expertise for the difficult and time-consuming process of distinguishing between images depicting real children and those depicting fictional ones. This could lead to prosecutors pursuing charges against offenders less frequently.
While virtual child sexual abuse material is illegal in Australia, in the US content that does not depict an actual child is deemed “protected speech” and is legal. The burden of proof is on the Government to prove the child depicted is real, meaning CSAM producers could claim their abuse images are computer-generated and escape conviction.
Virtual CSAM can be used to groom children for sexual abuse
Just as CSAM has been used by predators to groom children for sexual abuse, virtual CSAM can be used in the same way. Virtual CSAM can include popular children’s cartoon characters depicted engaging in sex acts and appearing happy.
Virtual CSAM could increase the demand for CSAM and lead to escalation in offenders
Patterns of escalation in child sexual offenders have been documented in the research literature. Some offenders start by consuming CSAM – images and videos of children being sexually abused – and progress to contact offending, where they sexually abuse children themselves. Experts argue virtual CSAM consumption could function in the same way, with virtual content operating as the starting point (especially as it may be seen as a less risky option) and offending escalating as the individual becomes desensitised over time and virtual CSAM is no longer exciting.
According to research, perpetrators’ consumption of CSAM “can reinforce existing fantasies and can be used to give oneself permission to act on them”.
In a qualitative study with convicted CSAM offenders, one offender suggested the images gave them permission: “[i]t made me want to do the things I wanted to do. It gave me more courage to do them ... knowing that I’ve seen it on there ... they were doing it ... I can do it”.
Virtual CSAM normalises men’s sexual abuse of children
Ultimately, depictions of child sexual abuse, including virtual content, serve to normalise the sexual use and abuse of children and encourage offenders. As the UN Special Rapporteur on the sale and sexual exploitation of children, including child prostitution, child pornography and other child sexual abuse material wrote in her 2020 report:
The increased accessibility and availability of child sexual abuse material online appears to normalise this crime and may encourage potential offenders and increase the severity of abuse. This includes new phenomena, such as drawings and virtual representations of non-existing children in a sexualised manner, widely available on the Internet.
The increasing social acceptance of early sexualisation is exacerbated by the widespread dissemination of child sexual abuse material on the Internet and the production of highly realistic representations of non-existing children. This objectification of children comforts offenders in their actions.
We need your help to fight porn's new super weapon: AI. With your help we can:
- Call out Big Tech companies profiting from AI images which exploit women and children
- Call for a global ban on nudifying/undressing apps
- Lobby for stronger penalties in our Online Safety regulations
*We think the term “AI generated” serves to dehumanise the act of creating abuse content and shield sexual offenders - the men creating it - from critique and accountability. In reality, AI is merely another tool men are utilising in their abuse and exploitation of women and children. Read more here.
See also:
“It comforts offenders in their actions”: The problem with ‘virtual’ child sexual abuse material
Putting women ‘in their place’: AI abuse of Taylor Swift means all women and girls are at risk
Generative ML and CSAM: Implications and Mitigations (Thiel, Stroebel & Portnoff, 2023)