
Pages tagged "sexploitation"


'Not Hugh Hefner’s Playboy’, claims CEO. We agree - it’s actually worse.

Posted on News by Collective Shout · June 06, 2022 1:33 PM · 2 reactions

Playboy exploits murder victim, turns children into porn industry props

Read more

Etsy selling “black sex slave” dolls for men’s sexual use

Posted on News by Caitlin Roper · April 06, 2022 12:15 PM · 1 reaction
Read more

Out of Fashion: #NotBuying Tom Ford’s Beauty Range

Posted on News by Renee Chopping · March 02, 2022 12:49 PM · 1 reaction
Read more

Rebel Sport: Women are more than "boobs" and "bums"

Posted on News by Caitlin Roper · February 21, 2022 5:32 PM · 1 reaction
Read more

Media Release: Collective Shout releases annual XMAS blacklist of corporate offenders

Posted on News by Caitlin Roper · November 14, 2021 11:29 AM · 1 reaction
Read more

Cross ‘em off your Christmas List: Corporate Sexploitation Offenders of 2021

Posted on News by Caitlin Roper · November 14, 2021 11:27 AM · 1 reaction

They don’t deserve your Christmas dollar: Give these companies a miss this year!

Read more

A decade of fighting sexploitation of women in sport

Posted on News by Caitlin Roper · November 04, 2021 10:49 AM · 1 reaction
Read more

Sexploitation of women in sport

Posted on News by Melinda Liszewski · July 21, 2021 2:27 PM

Sexploitation 

As published by www.ausport.gov.au. Reprinted with permission.

Read more

Alibaba Group: Stop selling child sex abuse dolls

Posted on News by Lyn Kennedy · July 10, 2020 5:55 PM
Read more

Facebook-facilitated sexploitation

Posted on News by Lyn Kennedy · June 03, 2020 3:04 PM

*Content warning: child exploitation references

UK children’s charity says platforms have duty of care to keep kids safe

[UPDATED] Finally - after action from eSafety - Instagram has deleted three accounts we reported for child exploitation. The accounts posted explicit imagery of young girls in lingerie and swimwear, with captions like 'hot young teen booty'. One account promoted a 'lingerie contest' using an image of a 14 year old Australian model in a g-string bikini, advertised a link to the deepnude website (read more about this exploitative app here) and sold spots in a private chat group for access to 'uncensored content', referring to images of teens as 'jerk-off material'.

These accounts are exploitative and degrading. This isn't Pornhub - it's mainstream social media.

Girls deserve better than the so-called 'dedication' to child safety Facebook is offering. We've heard enough talk. It's time for Facebook to stop facilitating child exploitation on its platforms.

Just overnight, Facebook announced a 'renewed commitment' to child safety. We hope this isn’t more PR spin.

Last year, Minister for Home Affairs Peter Dutton labelled Facebook cofounder and CEO Mark Zuckerberg ‘morally bankrupt’ for plans to introduce end-to-end encryption across all Facebook-platform private communication tools - a move which would aid sex predators and the exchange of child exploitation material, and endanger children further.

We’ll be watching to see how Facebook’s commitment plays out.


Between April 2017 and October 2019 police recorded over 10,000 incidents of online grooming in England and Wales. Where the communication method was known, fifty-five per cent occurred on Facebook-owned platforms, with more recorded on Instagram than any other individual platform. The UK’s National Society for the Prevention of Cruelty to Children has used this data to bolster a call for regulatory measures to hold social media platforms to account for facilitating these crimes.

Source: NSPCC via BBC

Facebook responded:

There is no place for grooming or child exploitation on our platforms and we use technology to proactively find and quickly remove it.

Facebook served up similar public relations spin last year in response to our collaborative, international #WakeUpInstagram campaign. But for almost a year, we’ve documented the failures of Facebook’s proactive tech and content-moderating system to keep kids safe from predators, harassment, exploitation and abuse. Earlier this year we pointed out that by giving predators unfettered access to underage girls, Instagram is complicit in normalising the idea that they are available for sexual gratification - an idea that has real-world consequences for girls. 

Last month Facebook released its latest Community Standards Enforcement Report, including data on takedown of content that violated Community Guidelines. It reported a proactive takedown rate of 97.7% of content that exploits children, and said that the other 2.3% was removed by human moderators. Facebook said that views of child exploitation content were ‘infrequent’, estimating that there were no more than five views of violating content for every 10,000 views.

These figures related only to the violating content that Facebook actioned. What about all the violating content they didn't action, or the content that wasn't reported, or even found, because it was hidden behind private accounts, in unsaved stories and live posts, or in DMs and messenger rooms? Is Facebook keeping tabs on those views of child exploitation?

On May 20 we reported an account which posts explicit imagery of children - imagery which Facebook's 'proactive tech' failed to detect and remove. At the time of writing - two weeks later - the account (with nearly 2000 followers and 65 posts) is still active and posting child exploitation material. The longer this page is up, the more followers it gains and the more images it posts, the higher the number of views of child exploitation. Will these be counted?

How does Facebook defend its claim that, regarding content that exploits children, they 'do not find enough violating samples to precisely estimate prevalence'? Is turning a blind eye to it really a defence? Whose interests are served by pretending it doesn't exist?

While Facebook has a clear and comprehensive child nudity/exploitation policy, the sexualisation, harassment and exploitation of children on Instagram is rampant. Users don't heed the policy, and moderators don't consistently enforce it.

COVID-19 is having an exacerbating effect. As experts have warned, children are now at more risk of online sexual exploitation than ever. In keeping with this tragic trend, we recently discovered and reported hundreds of Instagram accounts for child exploitation activity - many of which were created after the start of COVID-19 lockdowns.

Just last week we reported a new Instagram account which was posting images of a pre-teen girl dressed in pink lingerie and advertising images for sale on Patreon and Boosty (read more about this - including our action and win - here). Instagram's response to reports is generally slow, though, and most often consists of a standardised reply that says due to COVID-19 they 'can't prioritise all reports right now'.

Undoubtedly, Facebook's 'transparency' data for the COVID-19 period will present outliers. But as our investigations and NSPCC's data show, Instagram's serious child predator and exploitation problems and failure to keep kids safe long predate the current global pandemic.  

Facebook recently appointed an Oversight Board to arbitrate content takedown decisions. It's unclear whether this will improve child safety on its platforms. What is clear is that big tech and social media companies like Facebook are part of the child sexual exploitation supply chain. And through its public relations spin, lack of transparency and weak policy enforcement, Facebook is aiding predators and hurting children.

Read more about NSPCC's call for regulatory measures to hold tech and social media heads accountable for child exploitation here.

See also:

No Facebook, you're not doing enough to protect children.

Instagram CEO says 'can't post nudity'. Why did we have to repeatedly report content to police?

Instagram a “predator’s paradise”: Collective Shout joins anti-sexploitation groups in global campaign

Instagram hosts sharing of child sex abuse fantasies

Graphic rape comments fuel #WakeUpInstagram campaign

