We reported 100 pieces of child exploitation content to Instagram - they removed just three
Mega and PayPal named in Instagram's child sex abuse material promos

Twitter + Instagram remove child exploitation material after we exposed them
Tech giants STILL aiding predators, putting young girls at risk

Media features: National coverage of our Instagram + Etsy investigations
Predators target preteens on Instagram + Etsy child sex abuse material

Collective Shout joins global coalition calling for child protections on Facebook
Fifty-nine child protection organisations and experts sign open letter to Zuckerberg

WIN! Instagram rolls out new safety measures for teens
We've had another #WakeUpInstagram campaign win, with Instagram announcing a set of new measures to help protect young people from predators.

Submission to European Commission Public Consultation on Child sexual abuse online - detection, removal and reporting
We were pleased to make a submission to the European Commission's public consultation on Child sexual abuse online - detection, removal and reporting.

WIN: New measures to help protect minors from predators on Instagram
We've had another major #WakeUpInstagram win, with Instagram rolling out new tools to help protect underage users from predators on the platform.

UN Submission: Children's rights in the digital environment
Update: UN adopts General Comment on children's online rights (February 11, 2021)
The United Nations now recognises that children's rights extend to the online realm. These rights - and the responsibilities they place on governments and corporations regarding issues such as children's online safety - are outlined in the Committee on the Rights of the Child's General Comment 25.

Facebook-facilitated sexploitation
*Content warning: child exploitation references
UK children’s charity says platforms have duty of care to keep kids safe
[UPDATED] Finally - after action from eSafety - Instagram has deleted three accounts we reported for child exploitation. The accounts posted explicit imagery of young girls in lingerie and swimwear, with captions like 'hot young teen booty'. One account promoted a 'lingerie contest' using an image of a 14-year-old Australian model in a g-string bikini, advertised a link to the deepnude website (read more about this exploitative app here) and sold spots in a private chat group for access to 'uncensored content', referring to images of teens as 'jerk-off material'.
These accounts are exploitative and degrading. This isn't Pornhub - it's mainstream social media.
Girls deserve better than the so-called 'dedication' to child safety Facebook is offering. We've heard enough talk. It's time for Facebook to stop facilitating child exploitation on its platforms.
Just overnight, Facebook announced a 'renewed commitment' to child safety. We hope this isn’t more PR spin.
Last year, Minister for Home Affairs Peter Dutton labelled Facebook co-founder and CEO Mark Zuckerberg 'morally bankrupt' over plans to introduce end-to-end encryption across all Facebook-platform private messaging tools - a move that would aid sex predators, facilitate the exchange of child exploitation material and endanger children further.
We’ll be watching to see how Facebook’s commitment plays out.
Between April 2017 and October 2019 police recorded over 10,000 incidents of online grooming in England and Wales. Where the communication method was known, fifty-five per cent occurred on Facebook-owned platforms, with more recorded on Instagram than any other individual platform. The UK’s National Society for the Prevention of Cruelty to Children has used this data to bolster a call for regulatory measures to hold social media platforms to account for facilitating these crimes.
Source: NSPCC via BBC
Facebook responded:
There is no place for grooming or child exploitation on our platforms and we use technology to proactively find and quickly remove it.
Facebook served up similar public relations spin last year in response to our collaborative, international #WakeUpInstagram campaign. But for almost a year we've documented the failure of Facebook's proactive tech and content-moderation systems to keep kids safe from predators, harassment, exploitation and abuse. Earlier this year we pointed out that by giving predators unfettered access to underage girls, Instagram is complicit in normalising the idea that girls are available for sexual gratification - an idea with real-world consequences.
Last month Facebook released its latest Community Standards Enforcement Report, including data on takedowns of content that violated its standards. It reported that 97.7% of the child exploitation content it actioned was detected and removed proactively by its technology, with the remaining 2.3% removed by human moderators. Facebook also said that views of child exploitation content were 'infrequent', estimating no more than five views of violating content for every 10,000 views.
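To make those figures concrete, here is a minimal back-of-envelope sketch in Python. The 97.7% proactive rate and the five-views-per-10,000 ceiling are the report figures quoted above; the total view count is a purely hypothetical placeholder assumed for illustration, since Facebook does not publish that denominator.

```python
# Back-of-envelope reading of the figures quoted above.
# The rates come from Facebook's Community Standards Enforcement Report
# as cited in this article; the total view count is a HYPOTHETICAL
# assumption for illustration only.

proactive_rate = 0.977            # share of actioned content found by Facebook's tech
human_rate = 1 - proactive_rate   # remainder removed by human moderators (2.3%)

prevalence_ceiling = 5 / 10_000   # 'no more than five views per 10,000' = 0.05%

hypothetical_total_views = 1_000_000_000  # assumed platform-wide views (illustrative)

implied_violating_views = hypothetical_total_views * prevalence_ceiling

print(f"Human-moderated share: {human_rate:.1%}")                # 2.3%
print(f"Prevalence ceiling: {prevalence_ceiling:.2%}")           # 0.05%
print(f"Implied ceiling on views: {implied_violating_views:,.0f}")  # 500,000
```

Even at that 'infrequent' ceiling, a hypothetical billion views would imply up to half a million views of child exploitation content - and that is before asking what the denominator leaves out.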
These figures relate only to the violating content that Facebook actioned. What about all the violating content it didn't action - or the content that was never reported, or even found, because it was hidden behind private accounts, in unsaved stories and live posts, or in DMs and Messenger Rooms? Is Facebook keeping tabs on those views of child exploitation?
On May 20 we reported an account which posts explicit imagery of children - imagery which Facebook's 'proactive tech' failed to detect and remove. At the time of writing - two weeks later - the account (with nearly 2000 followers and 65 posts) is still active and posting child exploitation material. The longer this page is up, the more followers it gains and the more images it posts, the higher the number of views of child exploitation. Will these be counted?
How does Facebook defend its claim that, regarding content that exploits children, they 'do not find enough violating samples to precisely estimate prevalence'? Is turning a blind eye really a defence? Whose interests are served by pretending it doesn't exist?
While Facebook has a clear and comprehensive child nudity/exploitation policy, the sexualisation, harassment and exploitation of children on Instagram is rampant. Users don't heed the policy, and moderators don't consistently enforce it.
COVID-19 is having an exacerbating effect. As experts have warned, children are now at more risk of online sexual exploitation than ever. In keeping with this tragic trend, we recently discovered and reported hundreds of Instagram accounts for child exploitation activity - many of which were created after the start of COVID-19 lockdowns.
Just last week we reported a new Instagram account that was posting images of a pre-teen girl dressed in pink lingerie and advertising images for sale on Patreon and Boosty (read more about this - including our action and win - here). Instagram's response to reports, though, is generally slow, most often a standardised reply saying that due to COVID-19 it 'can't prioritise all reports right now'.
Undoubtedly, Facebook's 'transparency' data for the COVID-19 period will present outliers. But as our investigations and NSPCC's data show, Instagram's serious child predator and exploitation problems and failure to keep kids safe long predate the current global pandemic.
Facebook recently appointed an Oversight Board to arbitrate content takedown decisions. It's unclear whether this will improve child safety on its platforms. What is clear is that big tech and social media companies like Facebook are part of the child sexual exploitation supply chain. And through its public relations spin, lack of transparency and weak policy enforcement, Facebook is aiding predators and hurting children.
Read more about NSPCC's call for regulatory measures to hold tech and social media heads accountable for child exploitation here.
See also:
No Facebook, you're not doing enough to protect children.
Instagram CEO says 'can't post nudity'. Why did we have to repeatedly report content to police?
Instagram hosts sharing of child sex abuse fantasies
Graphic rape comments fuel #WakeUpInstagram campaign

Instagram CEO says 'can't post nudity'. Why did we have to repeatedly report content to police?
Instagram's 'nudity' rules don't keep kids safe
Recently, Instagram CEO Adam Mosseri was questioned by online celebrity news outlet The Shade Room over the removal of a live post hosted by Tory Lanez's 'Quarantine Radio' account.