*Content warning: child exploitation references*
UK children’s charity says platforms have duty of care to keep kids safe
[UPDATED] Finally - after action from eSafety - Instagram has deleted three accounts we reported for child exploitation. The accounts posted explicit imagery of young girls in lingerie and swimwear, with captions like 'hot young teen booty'. One account promoted a 'lingerie contest' using an image of a 14-year-old Australian model in a g-string bikini, advertised a link to the deepnude website (read more about this exploitative app here) and sold spots in a private chat group for access to 'uncensored content', referring to images of teens as 'jerk-off material'.
These accounts are exploitative and degrading. This isn't Pornhub - it's mainstream social media.
Girls deserve better than the so-called 'dedication' to child safety Facebook is offering. We've heard enough talk. It's time for Facebook to stop facilitating child exploitation on its platforms.
Just overnight, Facebook announced a 'renewed commitment' to child safety. We hope this isn’t more PR spin.
Last year, Minister for Home Affairs Peter Dutton labelled Facebook co-founder and CEO Mark Zuckerberg ‘morally bankrupt’ for plans to introduce end-to-end encryption across all Facebook-platform private communication tools - a move which would aid sex predators and the exchange of child exploitation material, and endanger children further.
We’ll be watching to see how Facebook’s commitment plays out.
Between April 2017 and October 2019 police recorded over 10,000 incidents of online grooming in England and Wales. Where the communication method was known, fifty-five per cent occurred on Facebook-owned platforms, with more recorded on Instagram than any other individual platform. The UK’s National Society for the Prevention of Cruelty to Children has used this data to bolster a call for regulatory measures to hold social media platforms to account for facilitating these crimes.
Source: NSPCC via BBC
There is no place for grooming or child exploitation on our platforms and we use technology to proactively find and quickly remove it.
Facebook served up similar public relations spin last year in response to our collaborative, international #WakeUpInstagram campaign. But for almost a year, we’ve documented the failures of Facebook’s proactive tech and content-moderating system to keep kids safe from predators, harassment, exploitation and abuse. Earlier this year we pointed out that by giving predators unfettered access to underage girls, Instagram is complicit in normalising the idea that they are available for sexual gratification - an idea that has real-world consequences for girls.
Last month Facebook released its latest Community Standards Enforcement Report, including data on the takedown of content that violated Community Guidelines. It reported a proactive takedown rate of 97.7% for content that exploits children, and said that the remaining 2.3% was removed by human moderators. Facebook said that views of child exploitation content were ‘infrequent’, estimating that there were no more than five views of violating content for every 10,000 views.
These figures relate only to the violating content that Facebook actioned. What about all the violating content they didn't action, or the content that wasn't reported, or even found, because it was hidden behind private accounts, in unsaved stories and live posts, or in DMs and Messenger Rooms? Is Facebook keeping tabs on those views of child exploitation?
On May 20 we reported an account which posts explicit imagery of children - imagery which Facebook's 'proactive tech' failed to detect and remove. At the time of writing - two weeks later - the account (with nearly 2000 followers and 65 posts) is still active and posting child exploitation material. The longer this page is up, the more followers it gains and the more images it posts, the higher the number of views of child exploitation. Will these be counted?
How does Facebook defend its claim that, regarding content that exploits children, they 'do not find enough violating samples to precisely estimate prevalence'? Is turning a blind eye to it really a defence? Whose interests are served by pretending it doesn't exist?
While Facebook has a clear and comprehensive child nudity/exploitation policy, the sexualisation, harassment and exploitation of children on Instagram is rampant. Users don't heed the policy, and moderators don't consistently enforce it.
COVID-19 is having an exacerbating effect. As experts have warned, children are now at more risk of online sexual exploitation than ever. In keeping with this tragic trend, we recently discovered and reported hundreds of Instagram accounts for child exploitation activity - many of which were created after the start of COVID-19 lockdowns.
Just last week we reported a new Instagram account which was posting images of a pre-teen girl dressed in pink lingerie and advertising images for sale on Patreon and Boosty (read more about this - including our action and win - here). Instagram's response to reports is generally slow, though, and most often consists of a standardised reply that says due to COVID-19 they 'can't prioritise all reports right now'.
Undoubtedly, Facebook's 'transparency' data for the COVID-19 period will present outliers. But as our investigations and NSPCC's data show, Instagram's serious child predator and exploitation problems and failure to keep kids safe long predate the current global pandemic.
Facebook recently appointed an Oversight Board to arbitrate content takedown decisions. It's unclear whether this will improve child safety on its platforms. What is clear is that big tech and social media companies like Facebook are part of the child sexual exploitation supply chain. And through its public relations spin, lack of transparency and weak policy enforcement, Facebook is aiding predators and hurting children.
Read more about NSPCC's call for regulatory measures to hold tech and social media heads accountable for child exploitation here.
Speaking out against the exploitation of women and children: Collective Shout cited in NSW Modern Slavery Act report
Report cites Collective Shout and Movement Director Melinda Tankard Reist
Following its 2019 Inquiry into the Modern Slavery Act 2018 and Associated Matters, the New South Wales Legislative Council tabled its final report last month. The Act makes new provisions regarding child sexual exploitation material through the addition of the following offences to the NSW Crimes Act 1900:
a new aggravated offence in relation to using a child to produce child abuse material, and a new offence of providing information to assist a person to avoid detection for a child abuse material offence.
We welcomed these new provisions which will strengthen capacity for holding offenders to account for sex abuse crimes committed against children.
The report contained several references to Collective Shout’s submission and evidence presented by Movement Director Melinda Tankard Reist who appeared for us at the Inquiry hearing last November, including the following quote from our submission:
New South Wales law must ensure that any person who is facilitating such horrific acts as the abuse on demand of babies and other young children, wherever the abuse might be occurring, through a digital platform operating in or accessed from New South Wales, is subject to an offence, whether the person administering or assisting to administer the digital platform does so intending it be used for that purpose or after becoming aware that it is being used for that purpose fails to take all reasonable steps to prevent that use.
Contrary to our recommendations against watering down the Act in relation to digital platforms, the amended Bill now excludes the previously proposed new criminal offence of administering a digital platform used to deal with child abuse material. The committee cited conflicts between Federal and State law. Several contributors to the Inquiry objected to this deletion, noting that the proposed offence was stronger than the provisions of Commonwealth legislation to hold digital platforms and Internet Service Providers (ISPs) accountable for hosting child sex abuse material. We shared concerns that its exclusion will provide loopholes for such offenders. As Melinda Tankard Reist pointed out at the hearing:
The child victim is a part of the slavery supply chain. Combating modern slavery has to include combating the global epidemic of this pay-per-view torture of children in the growing trade of predators commissioning the live sexual abuse of a child viewed via their computer screens and facilitated by their ISPs. ISPs and telcos including Telstra, Optus, iiNet and TPG are providing the infrastructure for the live streaming of the abuse of children to be possible. ISPs are part of a chain which contributes to the distribution of child sexual exploitation material but they have not been brought to account.
Melinda Tankard Reist at the Inquiry hearing, 4 November 2019
In her verbal evidence, MTR commended other contributors to the Inquiry, including International Justice Mission Australia (IJM) and the Coalition Against Trafficking in Women Australia (CATWA), as well as the expert advice of UK Anti-Slavery Commissioner Kevin Hyland. We were pleased to see Hyland’s six-point strategy for ending modern slavery featured in the report. The strategy highlights the need for all nations to mandate - through legislation - an internet free of child exploitation and human trafficking, and sets out the outcomes of the proposed legislation, which, according to Hyland, include “leading businesses into generating ethical profits free from exploitation and modern slavery”. We anticipate this outcome will have positive consequences for the broader community - particularly women and girls, who stand to benefit from corporates unwilling to profit from exploitative activities.
While not mentioned in the Final Report, we noted CATWA’s significant observation that the Act’s proposed minimum $50 million annual turnover threshold for mandatory supply chain reporting will fail to capture agents of the sex industry - an industry in which women are at greatest risk of trafficking and slavery. In its submission CATWA concluded:
A requirement for sex industry businesses to report would be a first step in the fight against modern slavery.
We strongly support this recommendation and would advocate for its inclusion in any future amendments.
Read our full submission to the Inquiry here.
Read the full Parliamentary Hearing transcript here.
Read the NSW Legislative Council's final report here.
*Content warning- this content may be distressing*
Child sexual exploitation material, or child sexual abuse material, refers to sexually abusive images of children. It may include photographic or video evidence of the rape, sexual abuse and torture of children and infants.
Virtual or computer-generated child sexual exploitation material is produced without the use of living children, depicting fictional children. Under Australian law, this content constitutes illegal child sexual exploitation material. The Commonwealth Criminal Code prohibits the sale, production, possession and distribution of offensive and abusive material that depicts a person, or is a representation of a person, who is or appears to be under 18. This includes virtual or animated representations of children, as well as child sex dolls.
*Content warning- this article mentions child sexual abuse which may be distressing for some readers*
Earlier this week, online gaming and entertainment news outlet Kotaku named Collective Shout’s submission to the review of Australian Classification laws one of the review’s “most important submissions”. Collective Shout was listed among other organisations and corporates that Kotaku called “main players” in the review, including the Australian Council on Children and the Media, Google, Disney and Netflix.
Kotaku cited our recommendations, including our push for an urgent investigation ‘into the Classification Board assigning M or MA15+ ratings to anime and manga genres featuring Child Sexual Abuse Material contrary to Australian law’.
In our submission we highlighted:
- the need for an evidence-based approach informed by research that demonstrates the harms of sexual objectification;
- that pornography should no longer be treated by default as ‘adult content’, but as commercialised sexual exploitation;
- that reliance on parents to control what their children access is unrealistic;
- that child and youth development experts who can advise on the ‘possible impact of content with sexualised content or messaging’ should be included in the new regulatory process.
We warned against a self-regulated model, pointing to the failures of the self-regulated advertising industry and its oversight body, Ad Standards, to uphold community standards and the industry’s Code of Ethics, and emphasising the need for an overseer with powers to enforce rulings.
Read our full submission here.
Media Release: Classification Board approves movies depicting child rape - Collective Shout calls on Communications Minister Fletcher to urgently intervene
Collective Shout has called for an overhaul of Australia’s classification system and a review of recent Classification Board determinations, following the discovery of illegal animated child sexual abuse material depicting child rape, abuse and exploitation, which the Board classified as suitable for audiences as young as 15 - in some cases even younger.
South Australian Centre Alliance Senator Stirling Griff exposed the Board’s deeply disturbing failure to exercise its responsibilities under Australian law in a speech to the Senate on Tuesday, followed by a Senate motion yesterday.
Senator Griff described anime movies depicting “wide-eyed children, usually in school uniforms, engaged in explicit sexual activities and poses, and often being sexually abused”. He called for an immediate review of all Japanese anime movies accessible in Australia.
The Commonwealth Criminal Code prohibits the sale, production, possession and distribution of offensive and abusive material that depicts a person, or is a representation of a person, who is or appears to be under 18.
Senator Griff cited a number of anime series featuring the sexual abuse of children. One of these, Sword Art Online, depicts the rape and sexual assault of children. It was given an unrestricted M rating by the Classification Board, despite the fact that it constitutes illegal child exploitation material. According to Senator Griff, the character Asuna is raped by her captor Sugou, who threatens to also rape her in the real world, where she is lying in a hospital room in a catatonic state. Sugou says he will make a recording of the virtual rape to shame her.
Senator Griff said that the Classification Board justified the M rating in its report, stating that the nudity throughout the film is 'moderate in impact' and 'justified by context'.
We would like to know how Board members could consider the sexual violation of children for entertainment justifiable in any way.
Other anime series depicting the sexual abuse of children, as well as strong incest themes, were given an MA 15+ rating by the Board, despite also featuring illegal content. In Goblin Slayer, children are portrayed as frightened or resisting while at the same time enjoying the sexual abuse inflicted on them.
“The Board has made child sexual exploitation material available for purchase in Australian retail outlets - including mainstream stores like Sanity”, Movement Director Melinda Tankard Reist said. “This has allowed a paedophilic culture to flourish. How can we claim to care about the epidemic of child sexual abuse when child sexual exploitation material is given the tick by our so-called regulatory body?”
Our experience working with child sexual abuse survivors and clinicians supports Senator Griff’s statement that this material is “a gateway to the abuse of actual children” and can be used as a grooming tool to normalise abuse.
“This matter must be immediately referred to the Australian Federal Police,” Tankard Reist said. “And Communications Minister Paul Fletcher needs to take charge of this failed government agency and investigate how it could allow this content to be permitted contrary to Australian law”.
In its submission to the current review of Australian Classification Regulation, Collective Shout provided detailed evidence of systemic failures of the Board over a decade and called for its complete overhaul. Its approval of child sexual exploitation material is just the latest example of a broken system.
Melinda Tankard Reist
[mtr at collectiveshout.org]
[caitlin at collectiveshout.org]
27 February 2020
Instagram is a haven for child predators and a host to the broadcasting of child sex abuse fantasies. Contrary to corporate claims that there's 'no place' for content that exploits or endangers children, exploitative, graphic and degrading comments directed at underage girls are rampant on the Facebook-owned platform.
Our campaign partners at the National Center on Sexual Exploitation met with Instagram heads last December to discuss the ways Instagram puts children at risk and how child safety on the platform could be improved. Instagram said that the matter of predatory comments would be investigated.
Meanwhile, men continue to use Instagram to harass and fetishise little girls. See below for our latest roundup of child sex abuse fantasies and other predatory comments freely broadcast on Instagram. We will continue to expose and call out this predatory behaviour!
Take action today!
Full article published on Mercury
A retiree has been refused bail after police allegedly found a raft of child abuse material on his laptop and a number of fake identification documents.
Over the past 20 years, police allege, Mr Seddon regularly travelled to Thailand and the Philippines. Police described him as “well-connected” and a “frequent traveller to high risk ports for child exploitation”.
Police also alleged Mr Seddon recently contacted a computer company and asked them how to use the dark web.
Mr Seddon had also allegedly been using YouTube Kids and had searched for “young boy gay porn” before his arrest.
Police allege they also uncovered an “enormous amount of gay pornography” ranging from children to adults.
“Based on the pornography viewed, the accused has a preference for jail scene encounters…sexual torture,” court documents read.
Police allege Mr Seddon had befriended a woman in the Philippines with two boys aged six and eight, who he often gave gifts to and provided for. Police have documentation which shows payments to the Philippines for unknown reasons.
Australian Institute of Criminology releases report on child sex dolls
The Australian Institute of Criminology has released the report ‘Exploring the implications of child sex dolls’ by Rick Brown and Jane Shelling. The report discusses child sex dolls in relation to the sexualisation of children, as an “escalated form of engaging with child pornography”, the normalisation of child sexual abuse and the risk of grooming.
The authors acknowledge that there is very little empirical evidence on the implications of sex dolls and child sex dolls, and therefore also draw on research on child exploitation material and sex offences in considering the implications of sex doll use and ownership.
Potential Harms: Escalation, Desensitisation, Objectification, Commodification and Grooming
The report documents a range of potential harms associated with the production, distribution and use of child sex dolls.
It is possible that use of child sex dolls may lead to escalation in child sex offences, from viewing online child exploitation material to contact sexual offending.
It may also desensitise the user to the potential harm that child sexual assault causes, given that such dolls give no emotional feedback.
The sale of child sex dolls potentially results in the risk of children being objectified as sexual beings and of child sex becoming a commodity.
Finally, there is a risk that child-like dolls could be used to groom children for sex, in the same way that adult sex dolls have already been used.
There is no evidence that child sex dolls have a therapeutic benefit in preventing child sexual abuse.
The authors conclude:
It is ‘reasonable to assume that interaction with child sex dolls could increase the likelihood of child sexual abuse by desensitising the doll user to the physical, emotional and psychological harm caused by child sexual abuse and normalising the behaviour in the mind of the abuser’.
We have previously exposed Wish app and Amazon for selling child sex dolls, along with a range of other replica child body parts marketed for sexual use. In response to our campaign, Wish withdrew these items from sale.
OPEN LETTER ON THE DANGERS OF NORMALISING SEX DOLLS & SEX ROBOTS
A Queensland man found with 14 hours of video and over 500 images of child exploitation material has received an 18-month suspended sentence. Christopher Edward Hunt had become desensitised to violent porn and had been collecting images and videos of children being tortured and engaging in sex acts with animals.
The 31-year-old lived with his parents, and after police raided his house he admitted to possessing child exploitation material and sexually abusing the family's Staffordshire terrier.