AI images of little boys in fetish gear on Instagram “not actionable”, says eSafety

We’ve discovered AI-generated images of young children on Instagram we believe constitute illegal child sexual abuse material.


The images depict little boys in minimal clothing, some adorned with fetish gear, their bodies oiled. When we reported the account to eSafety, we were told the content was not actionable given the lack of nudity and “non-sexualised posing”.


When we asked eSafety what material would qualify as category 1 child sexual abuse material (and warrant removal), they told us: “Material that describes or depicts child sexual abuse, or any other exploitative or offensive description or depiction involving a person who appears to be a child under 18 years.” We believe that under the definition provided, these images should qualify.

Speaking to the Daily Telegraph, Campaigns Manager Caitlin Roper said that even though the images may not show a child naked or being abused, they were still exploitative and should be considered child sexual abuse material.

“We come across images that are somewhat sexualised or inappropriate, like a child in a bathing suit, or up-close image of a child’s crotch area, that attract a lot of comments from predatory adult men,” she said.

Ms Roper pointed out that accounts posting exploitative images of children often had thousands of adult male followers, who would describe how they wanted to sexually abuse the child in the comments.

“We know that these images are being posted and used for men’s sexual gratification, and that many end up on paedophile forums.”

These images – along with eSafety’s unwillingness to act – highlight the growing threat of AI and deepfake technology to women and children, as predatory men continue to use AI as a tool to sexually exploit and abuse them.

See also:

Made by men: How the term "AI generated" invisiblises sex offenders

"Undress any girl you want": The explosion of Deepnude apps puts all women in danger

Putting women ‘in their place’: AI abuse of Taylor Swift means all women and girls are at risk

Amendment to the Online Safety (Basic Online Safety Expectations) Determination 2023: Our submission



Caitlin Roper published this page in News 2024-04-15 11:45:39 +1000
