Collective Shout supports Voluntary Principles to Counter Online Child Sexual Exploitation and Abuse
Tech giants must act to stop sexual exploitation
According to data recently shared by the Australian Centre to Counter Child Exploitation (ACCCE), we are in the grip of a global child sexual exploitation epidemic. Much of the abuse is happening online and in plain sight. And Australian men are the third-largest consumers of live, online child sexual abuse, according to the Australian Federal Police.
Read more
Facebook-facilitated sexploitation
*Content warning: child exploitation references*
UK children’s charity says platforms have duty of care to keep kids safe
[UPDATED] Finally - after action from eSafety - Instagram has deleted three accounts we reported for child exploitation. The accounts posted explicit imagery of young girls in lingerie and swimwear, with captions like 'hot young teen booty'. One account promoted a 'lingerie contest' using an image of a 14-year-old Australian model in a g-string bikini, advertised a link to the deepnude website (read more about this exploitative app here) and sold spots in a private chat group for access to 'uncensored content', referring to images of teens as 'jerk-off material'.
These accounts are exploitative and degrading. This isn't Pornhub - it's mainstream social media.
Girls deserve better than the so-called 'dedication' to child safety Facebook is offering. We've heard enough talk. It's time for Facebook to stop facilitating child exploitation on its platforms.
Just overnight, Facebook announced a 'renewed commitment' to child safety. We hope this isn’t more PR spin.
Last year, Minister for Home Affairs Peter Dutton labelled Facebook co-founder and CEO Mark Zuckerberg ‘morally bankrupt’ over plans to introduce end-to-end encryption across all of Facebook’s private communication tools - a move that would aid sex predators, facilitate the exchange of child exploitation material and endanger children further.
We’ll be watching to see how Facebook’s commitment plays out.
Between April 2017 and October 2019 police recorded over 10,000 incidents of online grooming in England and Wales. Where the communication method was known, fifty-five per cent occurred on Facebook-owned platforms, with more recorded on Instagram than any other individual platform. The UK’s National Society for the Prevention of Cruelty to Children has used this data to bolster a call for regulatory measures to hold social media platforms to account for facilitating these crimes.
Source: NSPCC via BBC
Facebook responded:
There is no place for grooming or child exploitation on our platforms and we use technology to proactively find and quickly remove it.
Facebook served up similar public relations spin last year in response to our collaborative, international #WakeUpInstagram campaign. But for almost a year, we’ve documented the failures of Facebook’s proactive technology and content-moderation systems to keep kids safe from predators, harassment, exploitation and abuse. Earlier this year we pointed out that by giving predators unfettered access to underage girls, Instagram is complicit in normalising the idea that they are available for sexual gratification - an idea that has real-world consequences for girls.
Last month Facebook released its latest Community Standards Enforcement Report, including data on takedowns of content that violated its standards. It reported a proactive takedown rate of 97.7% for content that exploits children, and said the remaining 2.3% was removed by human moderators. Facebook said that views of child exploitation content were ‘infrequent’, estimating that there were no more than five views of violating content for every 10,000 views (0.05%).
These figures relate only to the violating content that Facebook actioned. What about all the violating content they didn't action, or the content that wasn't reported - or even found - because it was hidden behind private accounts, in unsaved stories and live posts, or in DMs and Messenger Rooms? Is Facebook keeping tabs on those views of child exploitation?
On May 20 we reported an account which posts explicit imagery of children - imagery which Facebook's 'proactive tech' failed to detect and remove. At the time of writing - two weeks later - the account (with nearly 2000 followers and 65 posts) is still active and posting child exploitation material. The longer this page is up, the more followers it gains and the more images it posts, the higher the number of views of child exploitation. Will these be counted?
How does Facebook defend its claim that, regarding content that exploits children, they 'do not find enough violating samples to precisely estimate prevalence'? Is turning a blind eye to it really a defence? Whose interests are served by pretending it doesn't exist?
While Facebook has a clear and comprehensive child nudity/exploitation policy, the sexualisation, harassment and exploitation of children on Instagram is rampant. Users don't heed the policy, and moderators don't consistently enforce it.
COVID-19 is having an exacerbating effect. As experts have warned, children are now at more risk of online sexual exploitation than ever. In keeping with this tragic trend, we recently discovered and reported hundreds of Instagram accounts for child exploitation activity - many of which were created after the start of COVID-19 lockdowns.
Just last week we reported a new Instagram account which was posting images of a pre-teen girl dressed in pink lingerie and advertising images for sale on Patreon and Boosty (read more about this - including our action and win - here). Instagram's response to reports is generally slow, though, and most often consists of a standardised reply that says due to COVID-19 they 'can't prioritise all reports right now'.
Undoubtedly, Facebook's 'transparency' data for the COVID-19 period will present outliers. But as our investigations and NSPCC's data show, Instagram's serious child predator and exploitation problems and failure to keep kids safe long predate the current global pandemic.
Facebook recently appointed an Oversight Board to arbitrate content takedown decisions. It's unclear whether this will improve child safety on its platforms. What is clear is that big tech and social media companies like Facebook are part of the child sexual exploitation supply chain. And through its public relations spin, lack of transparency and weak policy enforcement, Facebook is aiding predators and hurting children.
Read more about NSPCC's call for regulatory measures to hold tech and social media heads accountable for child exploitation here.
See also:
No Facebook, you're not doing enough to protect children.
Instagram CEO says 'can't post nudity'. Why did we have to repeatedly report content to police?
Instagram hosts sharing of child sex abuse fantasies
Graphic rape comments fuel #WakeUpInstagram campaign
Put respect on the menu: Our call to KFC
Collective Shout's letter to KFC heads
Read more
Submission on Online Safety Legislation Reform
Collective Shout welcomes the opportunity to contribute to Online Safety legislative reform. We support the intention to consolidate and harmonise current laws and to bring streamlining and consistency to a range of digital offences. We are especially pleased to see plans for an expansion of protection against cyberbullying, cyber abuse, image-based abuse and seriously harmful content. As the digital landscape is in a constant state of flux, new opportunities arise – and with them new dangers. This necessitates updated legislation to ensure a safer online environment prioritising human rights and community welfare.
Read more
“It comforts offenders in their actions”: The problem with ‘virtual’ child sexual abuse material
*Content warning: this content may be distressing*
Child sexual exploitation material, or child sexual abuse material, refers to sexually abusive images of children. It may include photographic or video evidence of the rape, sexual abuse and torture of children and infants.
Virtual or computer-generated child sexual exploitation material is produced without the use of living children, depicting fictional children. Under Australian law, this content constitutes illegal child sexual exploitation material. The Commonwealth Criminal Code prohibits the sale, production, possession and distribution of offensive and abusive material that depicts a person, or is a representation of a person, who is or appears to be under 18. This includes virtual or animated representations of children, as well as child sex dolls.
Read more
Recipe for respect: It's no secret KFC
Women working in food services face high rates of sexual harassment. The 2018 National Survey on Workplace Sexual Harassment found that people employed in accommodation and food services - 60 per cent of whom were women - were "overrepresented as victims of workplace sexual harassment”. A 2019 survey of Shop, Distributive and Allied Employees’ Association members - a group made up primarily of employees from the retail, fast-food and warehouse sectors - showed that nearly half of the women surveyed had experienced workplace sexual harassment.
Last week KFC gushed about its partnership with icare on a staff-education program aimed at equipping staff with skills to de-escalate customer abuse and at reducing its prevalence. Background data confirms that for workers in the fast-food sector, customer abuse is the norm, and that it is experienced more widely by female workers than male workers.
We know that abuse is born of disrespect, so it’s reasonable to view customer abuse - abuse that affects women more than men - as another symptom of societal-level disrespect for women. When other research confirms that gender stereotypes and sexually objectifying representations of women in media and advertising diminish how women are viewed and valued, we’re hard-pressed to understand why - at the same time it invests in employee empowerment - KFC would use casual sexism to flog chicken.
icare’s pilot program involving KFC reportedly resulted in a 48% reduction in cases of customer abuse. But in the wake of KFC’s cataclysmic advertising fail, do young female employees in KFC outlets have reason to feel empowered at work? KFC has sent the message to men and boys everywhere that ogling a woman’s breasts - an act of sexual harassment - is just a natural, normal thing to do. The message to women and girls? To borrow a pun from another KFC ad campaign, ‘Bucket. Why not?’ - just go with it. This is the antithesis of the message of respect-based, anti-harassment training programs, which instruct victims and onlookers to speak out against harassment.
It is always good to provide workers with skills to manage the spectrum of customer misconduct, but young women should not be expected to absorb the consequences of a nationwide ad campaign where sexual objectification and sexual harassment of young women is the punchline.
How can young women feel respected by their employer when KFC is contributing to the very problems it is trying to solve with a "respect and resilience" program? Will they be safe at work when men like this walk through the door?
If KFC has - as it claims - genuine interest in the well-being of young people and empowering its staff, it will retract the ad and commit to marketing its products without endorsing sexual harassment and perpetuating antiquated sexist narratives that contribute to a culture of disrespect for women.
See also:
Attitudes shape behaviour: men defend sexist KFC ad with onslaught of misogynistic abuse
KFC issues non-apology over sexist ad
KFC serves up buckets of sexism
Sexist grooming of boys - brought to you by KFC
Submission on National Inquiry into Workplace Sexual Harassment
Read more
Insta must act on predators: Collective Shout letter to platform heads
(Addressed to Instagram Global Head of Policy)
December 4, 2019
Widespread predation of underage girls on Instagram
We have identified and collected hundreds of comments posted by men on the Instagram pages of underage girls between August and the time of writing. These include grooming-style comments, sexually harassing comments, requests for sexualised content (including nudes), requests for direct messages and for images to be sent via DM. Large numbers of comments are porn-themed. Men describe in explicit detail sexual abuse fantasies involving the underage girls whose identities and locations are often exposed, placing them at risk of further sexual exploitation. (We have compiled some of the most graphic comments here.)
Inadequate tools for keeping children safe
These predatory comments exist in plain view, unmoderated and unremoved. In some cases they have remained for more than a year - including on posts of girls as young as four years old. Our reports to Instagram are often dismissed as not violating its ‘community guidelines’. Recent examples of dismissed comments include: (on a 13-year-old girl’s post) ‘Omg i sooooo wanna (emoji denoting) lick your (emoji denoting) pussy n (emoji denoting) ass’; (on a child advertising underwear) ‘Ok im going out of line you put it on i take it off’; (on a 9-year-old girl’s post) ‘please open your legs’ + emojis including kiss-lips; (various comments on underage girls’ posts) eggplant/peach/cucumber/squirt emojis denoting sexual acts. If these comments are not deemed in violation of whatever standards you use to determine whether content you host is in breach, perhaps those standards need to be revised?
There is also a serious shortcoming in Instagram’s reporting system. While comments can be reported as ‘harassment/bullying’ or as ‘violence/threat of violence’, there is no option which properly conveys the serious predatory and grooming nature of these comments. Instagram’s reporting system should include “sexualised/predatory/grooming comment directed at a minor” as its own category.
Instagram also needs to take action to prohibit adults from using ‘live’ posts to contact minors. We have viewed a 13-year-old girl’s live posts during which she was bombarded with sexual comments including a request for sex, questions about her underwear, a man telling her he wanted her to give him an erection, another saying he would pay to meet her and others pressuring her to remove her shoes and show her feet.
We note your statement about having “proactive technology” to keep the platform free from content that is harmful to children. Whatever methods Instagram has employed to date are insufficient for dealing with the rampant predation of children that we have documented. (Our response to Facebook’s official statement here.)
Instagram connects predators to children
Instagram is catering to the fantasies and fetishes of child predators. Through its ‘explore’ feature it directly connects predators with more underage girls. This appears to go directly against your own policies prohibiting content which endangers or exploits children. Regardless of the feature’s intended purpose, the reality is that it points predators directly to the girls they are preying on (virtually or in real life). Instagram should stop its ‘explore’ feature from promoting minors’ pages and connecting predators with children.
Parasite accounts
We have documented several Instagram pages devoted to republishing images of girls and young women in bikinis or in various modelling, gymnastics or dance poses. Posts on these pages frequently tag the child’s account, with captions instructing others to follow the child. Some pages mix images of very young girls with porn-themed and pin-up style images of older teens and young women. These accounts serve as magnets for predators, pointing them directly to more children. Instagram has responded to our reports by saying these accounts do not go against ‘community standards’. Instagram should properly investigate these parasite pages. Your company should also prohibit the republishing of images of children on pages that also feature pornified images of adults.
Instagram serves as supplier for pedophile forums
We have documented the names of hundreds of underage models, gymnasts and dancers on pedophile forums. We’ve reviewed a selection of the profile pages and found that every one contains direct links to the child’s Instagram account/s. We’ve also reviewed several forum pages and have evidence that Instagram is the primary source of images shared to the forums where men write explicitly about sex-abuse acts they’d like to carry out on individual girls. We have evidence of an individual commenting on an underage girl’s Instagram post to direct others to the forums.
Instagram accounts used to promote sale of images/videos of children
Some underage girls’ accounts are used to advertise monetised website subscriptions where men purchase exclusive photos and videos of children. Other accounts promote links to personal web pages where fans can purchase posters, calendars and photo/video packages. Underage girls also use Instagram to promote accounts on other social media platforms where sexualised content is under even less scrutiny, placing minors in further danger of being targeted by predators. We believe that Instagram must cease acting as an advertising service for individuals selling images/videos of minors.
Conclusion
The evidence we have gathered demonstrates the sexual exploitation of underage girls on Instagram. Your company is allowing predators almost unfettered access to them. Content hosted on your platform violates their right to grow up free from activity that harms them (United Nations Convention on the Rights of the Child, Article 36). Because predatory behaviour has become so common on Instagram, girls learn to think of it as normal, which sets them up for even more harm.
We urge Instagram to prioritise this issue and act in a way that places the wellbeing of vulnerable young people above the interests of predators and reflects accepted standards of corporate social responsibility and ethical behaviour.
Recommendations
- Revise ‘Community Standards’ so that all sexualised, predatory and grooming-style comments (text, slang, short-hand, hashtags, emojis and graphics) qualify as violations.
- Add ‘sexualised/predatory/grooming comment directed at a minor’ as a reporting category.
- Prohibit adults from using ‘live’ posts to contact minors.
- Update Instagram’s system used to detect and remove sexualised, predatory comments.
- Recognising that Instagram serves as a supply source of images of children for web-based pedophile forums, update all relevant policies, guidelines and help documents (including ‘A Parent’s Guide to Instagram’) so that users are properly informed of the risks of sharing images of children to the platform.
- Stop the ‘explore’ feature from promoting minors’ pages and connecting predators with children.
- Investigate parasite pages that are exclusively devoted to republishing photos of minors, and delete pages where children are sexualised, harassed or groomed, or where any type of predatory comment or behaviour is displayed.
- Prohibit the republishing of images of minors on Instagram pages that also feature porn-style images of adults.
See also:
No Facebook, you're not doing enough to protect children.
Why is Instagram letting men share sexual fantasies about 7 year old girls?
Predators Paradise: girls are sexually exploited on Instagram
I was so concerned about porn-themed portrayals of young girls on Instagram I reported to police
eSafety commissioner backs Collective Shout's call for Instagram overhaul
A day of the girl on Instagram: posed, exposed and at risk
‘Sexy girl’: How Instagram allows the offering of young girls as fetishised flesh
Read more
Cinemas respond to Collective Shout calls to withdraw Show Dogs
Collective Shout has called on Australian cinemas not to screen the children’s movie Show Dogs over concerns the film grooms children for sexual abuse.
In response to the global outcry from parents and child advocates, the production company withdrew the film, promising to re-cut it to remove the offending scenes. However, the film has been re-released with only a minor portion of the troubling scenes removed. This is not good enough, so now we’re calling on Australian cinemas to take a stand against child sexual abuse and refuse to screen the film.
At this stage, the film is being promoted by Hoyts, Event Cinemas, Palace Cinemas and Wallis Cinemas. The film is currently not listed on Cineplex, Village Cinemas, Dendy Cinemas, Grand Cinemas and ACE Cinemas websites.
So far, we’ve had responses from several cinemas.
Hoyts has claimed the film “has not confirmed its release date for Australia” and that they do not have any information about the film being shown on their screens yet. However, the film is currently being advertised in the ‘Coming Soon’ section of Hoyts’ website with a release date of 5 July.
Dendy responded as follows:
Show Dogs has not been confirmed for any of the Dendy Cinemas sites. We are not planning on showing the film at this stage.
We need your help.
Help us keep the pressure on, and let these cinemas know you don’t want them to screen Show Dogs.
Contact your local cinema here, and let us know if they respond.
Open Letter to Australian Cinemas: Don't screen Show Dogs movie
We are writing to you regarding the children’s film Show Dogs, due for release on 5 July. Upon its release in the US, it attracted substantial criticism from parents and child advocates over concerns that it “grooms” children for sexual abuse.
The film tells the story of a police dog going undercover at a dog show. There are reportedly several scenes in which the dog, Max, has to have his genitals inspected. When he is uncomfortable and wants to stop he is told to go to a ‘zen place’. When he submits and allows his genitals to be touched, he is rewarded by advancing to the next level of the show.
In response to the global backlash, the production company withdrew the film, promising to re-cut it to remove the scenes in question. The film has been re-released; however, the scenes remain, with only the encouragement to ‘go to a zen place’ (essentially, to dissociate) removed. The message remains intact: that unwanted sexual touching is to be endured and may be rewarded.
Read more