WIN: Child sex abuse doll seller pulled from Insta after we pressured execs
*CONTENT WARNING*
In a reversal of its earlier decision, Instagram has pulled a child sex abuse doll seller’s account. At first, Instagram told us it would NOT remove the account because it did not breach its Community Guidelines. So we took to X to show exactly how it DID breach parent company Meta’s terms of service, which prohibit the publication of child exploitation material and content that sexualises children.
Why are you against child sex abuse dolls and virtual/AI porn depicting children? Isn’t it better that predators use these than sexually abuse real children?
In Australia, this material is illegal. The Commonwealth Criminal Code prohibits the sale, production, possession and distribution of offensive and abusive material that depicts a person, or is a representation of a person, who is or appears to be under 18.
While some people defend the use of virtual child sexual abuse material or child sex abuse dolls as “victimless”, these products serve to normalise and legitimise men’s sexual use and abuse of children. As the United Nations Special Rapporteur on the sale and sexual exploitation of children notes, this material “may encourage potential offenders and increase the severity of the abuse…the objectification of children comforts offenders in their actions.”
A 2019 report by the Australian Institute of Criminology concluded not only that there was no evidence child sex abuse dolls could prevent abuse, but that they could increase the risk of child sexual abuse by desensitising users and bridging the gap between fantasy and reality, and that the dolls could be used to groom children.
In her book *Sex Dolls, Robots and Woman Hating*, Campaigns Manager Caitlin Roper documents a growing number of cases in which men found in possession of child sex abuse dolls have sexually offended against children in other ways. Some incorporate children into their doll use, and some commission dolls made in the likeness of children known to them.
There is no evidence that access to ‘virtual’ or AI-generated CSAM, or to replica children on which to practise sexual abuse, prevents child sexual abuse. Rather, it encourages it.