Tech giants STILL aiding predators, putting young girls at risk
Last month, after we called tech giants out in the media and on our socials, child exploitation material disappeared from their platforms.
Twitter suspends pro-pedophilia accounts
Twitter suspended multiple accounts publishing child exploitation material, promoting pedophilia, and broadcasting child rape fantasies. Twitter originally responded to a campaigner’s reports by saying its Safety Policies had not been broken - even when men discussed raping and impregnating pre-teen girls, and violently dismembering women. After we tweeted CEO Parag Agrawal, Chair Bret Taylor, major shareholder Vanguard and new owner Elon Musk, asking why Twitter endorsed men’s explicit desires to sexually abuse young girls, a number of accounts were suspended.
Hey @paraga - why are men free to discuss their desires to rape + impregnate girls + dismember women on your platform?
Do shareholders know this doesn’t break safety policy & that they’re profiting from this content?
Do they endorse this? @elonmusk @Vanguard_Group @btaylor pic.twitter.com/AqtxYhoVrS
— LSKennedy (@LSKennedy5) April 20, 2022
Cross-platform exploitation: pics of pre- and young-teen Instagram "influencers" served up as men's masturbation material on Twitter
Twitter originally told us child exploitation content did not break its safety policies. Several accounts have now been suspended.
Pre-teen 'shout-out' page gone from Instagram
After media coverage of our investigations exposing Instagram’s and Twitter's failure to remove reported child exploitation content, Instagram pulled an account dedicated to promoting pre-teen “models”. With over 33,000 followers, it was a hive of child exploitation activity and a magnet for predators.
Earlier, Instagram had responded to a campaigner’s report to say the account didn’t go against its Community Guidelines - even though the account published BDSM-themed pics and vids of a prepubescent girl in sexualised poses, clothed in fetish wear and chains. Other posts featured images of prepubescent girls in adult-style swimwear, posing and writhing sexually.
Sexualised images of pre-teens do not violate Instagram's Community Guidelines
(At the time of writing, a backup account with an almost identical username had amassed hundreds of followers. A hashtag containing the page's name returned 11,000 posts featuring adultified pre-teen and toddler girls. The Manila-based child photography and modelling page which posted the original BDSM-themed content was still active.)
Social media companies STILL ignoring child exploitation
We have countless other similar examples of Instagram failing to remove child exploitation material we reported. Sometimes, Instagram told us it was too busy to review our reports and suggested we hide the content instead.
Instagram gave similar advice when we reported child sex abuse comments made on Reels (short videos) featuring pre-teen girls:
I reported the child sex abuse comment to @instagram on April 18. Three days later it responded:
Too busy, can’t review. If it upsets me to see predatory men exploiting + making violent sex abuse comments to young girls I can hide them. @mosseri #wakeupinstagram @nickclegg pic.twitter.com/C5ys2d9JHj
— LSKennedy (@LSKennedy5) April 26, 2022
This is despite Instagram adding tools specifically for reporting child exploitation activity (which WE asked them for - read about our WIN here) - and despite its claims to prioritise these reports!
The video had 77,000 views at the time we reported the comments. We would have thought this level of engagement on a Reel featuring young girls dancing sexually might have raised red flags for one of the world’s largest, best-resourced and most tech-savvy corporations - especially given its claims that it cares about children, that it doesn't "allow" under-13s to have accounts, and that there’s ‘no place’ on its platforms for activity which exploits children. Perhaps they might have investigated the account, and some of its nearly 15,000 followers?
Sexualised performances: Instagram Reels featuring pre-teens attract tens-of-thousands of views
They would have seen that the account is dedicated to videos of a young girl's sexualised performances for an audience of mostly men. They would have seen the rampant predatory activity - the comments from men about the girl’s body and about wanting to see her naked; one man suggested that one of the girls in a video wanted to be violently and sexually abused (‘you can tell that she wants her a** to be ripped off - zip-mouth emoji/eggplant emoji’).
They would have seen the pedophile networking: requests for ‘cp’ (‘child pornography’, i.e. child sex abuse material) and invitations to off-site chat groups (‘telegram exchange [username]’) - where men can trade child sex abuse material and engage in sexual discussions involving young girls even more freely. They would have seen that other children are at risk: Reels featuring young friends (and relatives?) - a toddler in some cases - are frequently posted to the account.
They would have seen her Reel made using the ‘Play Girl Dark’ effect - a filter which puts Playboy’s bunny head logo over the creator’s eyes. Instagram might have even considered the wider implications of serving up porn-branded pre-teens to audiences of predatory men who want to sexually abuse them.
Pre-teen girls use Playboy-themed Instagram filter
They would have seen, amongst her 15,000 followers, hordes of predatory men - some posting violent, degrading pornography to Instagram; others following multitudes of other pre-teen and young-teen Instagram “models”, “bloggers” and “influencers”. These men also follow ‘child’- and children’s-brand-related hashtags which conjure up galleries filled with half-clad pre-teen girls.
They might have seen the bio of one male-named follower which reads in part: 'Gymnastics Girls..Forbidden and Girls Lover 12 years below'. And the man following the hashtag ‘kidsupskirt’.
They might have removed every last one of these accounts for promoting pedophilia and for activity which exploits children, and followed up with reports to police. Instead, Instagram allowed and even endorsed the activity, ignoring our reports and telling us to hide the child exploitation and child rape comments if we found them 'upsetting'.
And this is just one account. We follow hundreds more. How many other young girls are being exploited on Instagram?
Tech giants failing to fulfil mandatory reporting requirements
Many supporters have shared with us that they received the same response from Instagram after reporting child exploitation content and accounts: ‘We’ve reviewed it and found that it doesn’t go against our Community Guidelines.’
In late March the New York Times published this article exposing a training manual for content moderators used by Instagram's parent company Meta. The document instructed moderators, in cases where the age of the subject of suspected child exploitation material was unknown, to 'err on the side of adults'. You can read more about this here.
Instagram's failure to remove reported child exploitation content means it is not fulfilling mandatory reporting requirements for the accounts associated with the activity. How many child predators are going unreported to the National Center for Missing and Exploited Children (NCMEC, the US-based clearing house for reports of child exploitation and abuse) because Instagram is too busy to review reports, or because moderators decided reported content doesn’t violate its Community Guidelines? How many children are being abused and exploited as a result? Is anyone monitoring Meta's handling of child exploitation reports submitted by its platform users?
The same applies to Twitter. Each time Twitter rejected one of our reports, it effectively endorsed pro-pedophilic activity and the distribution of child exploitation material, and left a web of child exploitation accounts unchecked.
More men, more child sex abuse fantasy discussions involving 12 year old girls. Doesn’t break @twitter’s safety policy.
Who exactly are you trying to keep safe @paraga?
Asking again - do you endorse this @elonmusk @Vanguard_Group @jack? #twitterhatesgirls pic.twitter.com/4AemdjR8XP
— LSKennedy (@LSKennedy5) April 26, 2022
Australian taxpayers funding rich tech companies' content moderation
Last month, we had more child exploitation material removed from Instagram - but only after eSafety intervened. In one case, a campaigner first reported an exploitative image of a 12-year-old girl to Instagram. The platform reported back that it did ‘not go against our Community Guidelines’. We then reported the post to eSafety, which later advised that it had asked Instagram to review the content and that the post had been removed.
Exploitative image of 12-year-old girl did not go against Instagram's Community Guidelines
All of this shows that corporates CAN remove child exploitation material. But they wait until they are forced to.
We're pleased to see some of the child exploitation material we exposed gone. But as tech giants continue to rely on reports of child exploitation from community members, and to reject reports when they are made, they aid child predators and put children at even greater risk of harm.
If these companies really care about children and have ‘zero tolerance’ for child exploitation like they claim, they need to demonstrate it.
Read more about our calls to Government to rein in tech corporates aiding and profiting from child exploitation here.
See also
Media features: National coverage of our Instagram + Etsy investigations
‘I would love to f*** you so hard’: Girls deserve better than Instagram
The mainstreaming of child exploitation material on Instagram