We’ve discovered AI-generated images of young children on Instagram we believe constitute illegal child sexual abuse material.
The images depict little boys in minimal clothing, some adorned with fetish gear, their bodies oiled. When we reported the account to eSafety, we were told the content was not actionable given the lack of nudity and "non-sexualised posing".
When we asked eSafety what material would qualify as category 1 child sexual abuse material (and warrant removal), they told us: "Material that describes or depicts child sexual abuse, or any other exploitative or offensive description or depiction involving a person who appears to be a child under 18 years." We believe that under the definition provided, these images should qualify.
Speaking to the Daily Telegraph, Campaigns Manager Caitlin Roper said that even though the images may not show a child naked or being abused, they were still exploitative and should be considered child sexual abuse material.
“We come across images that are somewhat sexualised or inappropriate, like a child in a bathing suit, or an up-close image of a child’s crotch area, that attract a lot of comments from predatory adult men,” she said.
Ms Roper pointed out that accounts posting exploitative images of children often had thousands of adult male followers, who described in the comments how they wanted to sexually abuse the child.
“We know that these images are being posted and used for men’s sexual gratification, and that many end up on paedophile forums.”
These images – along with eSafety’s unwillingness to act – highlight the growing threat of AI and deepfake technology to women and children, as predatory men continue to use AI as a tool to sexually exploit and abuse them.
See also:
Made by men: How the term "AI generated" invisiblises sex offenders
"Undress any girl you want": The explosion of Deepnude apps puts all women in danger
Putting women ‘in their place’: AI abuse of Taylor Swift means all women and girls are at risk
Amendment to the Online Safety (Basic Online Safety Expectations) Determination 2023: Our submission