
Pages tagged "Facebook"


Facebook-facilitated sexploitation

Posted on News by Lyn Kennedy · June 03, 2020 3:04 PM

*Content warning: child exploitation references

UK children’s charity says platforms have duty of care to keep kids safe

[UPDATED] Finally - after action from eSafety - Instagram has deleted three accounts we reported for child exploitation. The accounts posted explicit imagery of young girls in lingerie and swimwear, with captions like 'hot young teen booty'. One account promoted a 'lingerie contest' using an image of a 14 year old Australian model in a g-string bikini, advertised a link to the deepnude website (read more about this exploitative app here) and sold spots in a private chat group for access to 'uncensored content', referring to images of teens as 'jerk-off material'.

These accounts are exploitative and degrading. This isn't Pornhub - it's mainstream social media.

Girls deserve better than the so-called 'dedication' to child safety Facebook is offering. We've heard enough talk. It's time for Facebook to stop facilitating child exploitation on its platforms.

Just overnight, Facebook announced a 'renewed commitment' to child safety. We hope this isn’t more PR spin.

Last year, Minister for Home Affairs Peter Dutton labelled Facebook cofounder and CEO Mark Zuckerberg ‘morally bankrupt’ for plans to introduce end-to-end encryption across all Facebook-platform private communication tools - a move which would aid sex predators and the exchange of child exploitation material, and endanger children further.

We’ll be watching to see how Facebook’s commitment plays out.


Between April 2017 and October 2019 police recorded over 10,000 incidents of online grooming in England and Wales. Where the communication method was known, fifty-five per cent occurred on Facebook-owned platforms, with more recorded on Instagram than any other individual platform. The UK’s National Society for the Prevention of Cruelty to Children has used this data to bolster a call for regulatory measures to hold social media platforms to account for facilitating these crimes.

Source: NSPCC via BBC

Facebook responded:

There is no place for grooming or child exploitation on our platforms and we use technology to proactively find and quickly remove it.

Facebook served up similar public relations spin last year in response to our collaborative, international #WakeUpInstagram campaign. But for almost a year, we’ve documented the failures of Facebook’s proactive tech and content-moderating system to keep kids safe from predators, harassment, exploitation and abuse. Earlier this year we pointed out that by giving predators unfettered access to underage girls, Instagram is complicit in normalising the idea that they are available for sexual gratification - an idea that has real-world consequences for girls. 

Last month Facebook released its latest Community Standards Enforcement Report, including data on the takedown of content that violated its Community Guidelines. It reported a proactive takedown rate of 97.7% for content that exploits children, and said the other 2.3% was removed by human moderators. Facebook said that views of child exploitation content were ‘infrequent’, estimating no more than five views of violating content for every 10,000 views.

These figures relate only to the violating content that Facebook actioned. What about all the violating content it didn't action, or the content that wasn't reported, or even found, because it was hidden behind private accounts, in unsaved stories and live posts, or in DMs and Messenger Rooms? Is Facebook keeping tabs on those views of child exploitation?

On May 20 we reported an account which posts explicit imagery of children - imagery which Facebook's 'proactive tech' failed to detect and remove. At the time of writing - two weeks later - the account (with nearly 2000 followers and 65 posts) is still active and posting child exploitation material. The longer this page is up, the more followers it gains and the more images it posts, the higher the number of views of child exploitation. Will these be counted?

How does Facebook defend its claim that, regarding content that exploits children, they 'do not find enough violating samples to precisely estimate prevalence'? Is turning a blind eye to it really a defence? Whose interests are served by pretending it doesn't exist?

While Facebook has a clear and comprehensive child nudity/exploitation policy, the sexualisation, harassment and exploitation of children on Instagram is rampant. Users don't heed the policy, and moderators don't consistently enforce it.

COVID-19 is having an exacerbating effect. As experts have warned, children are now at more risk of online sexual exploitation than ever. In keeping with this tragic trend, we recently discovered and reported hundreds of Instagram accounts for child exploitation activity - many of which were created after the start of COVID-19 lockdowns.

Just last week we reported a new Instagram account which was posting images of a pre-teen girl dressed in pink lingerie and advertising images for sale on Patreon and Boosty (read more about this - including our action and win - here). Instagram's response to reports is generally slow, though, and most often consists of a standardised reply saying that due to COVID-19 they 'can't prioritise all reports right now'.

Undoubtedly, Facebook's 'transparency' data for the COVID-19 period will present outliers. But as our investigations and NSPCC's data show, Instagram's serious child predator and exploitation problems and failure to keep kids safe long predate the current global pandemic.  

Facebook recently appointed an Oversight Board to arbitrate content takedown decisions. It's unclear whether this will improve child safety on its platforms. What is clear is that big tech and social media companies like Facebook are part of the child sexual exploitation supply chain. And through its public relations spin, lack of transparency and weak policy enforcement, Facebook is aiding predators and hurting children.

Read more about NSPCC's call for regulatory measures to hold tech and social media heads accountable for child exploitation here.

See also:

No Facebook, you're not doing enough to protect children.

Instagram CEO says 'can't post nudity'. Why did we have to repeatedly report content to police?

Instagram a “predator’s paradise”: Collective Shout joins anti sexploitation groups in global campaign

Instagram hosts sharing of child sex abuse fantasies

Graphic rape comments fuel #WakeUpInstagram campaign


Racist, violent, misogynist: Does Instagram care about women and girls?

Posted on News by Lyn Kennedy · May 08, 2020 3:20 PM

*Content warning: themes and images are distressing

Read more

Instagram CEO says 'can't post nudity'. Why did we have to repeatedly report content to police?

Posted on News by Lyn Kennedy · April 25, 2020 1:53 AM

Instagram's 'nudity' rules don't keep kids safe

Recently Instagram CEO Adam Mosseri was questioned by online celebrity news group The Shade Room over the removal of a live post, hosted by Tory Lanez' 'Quarantine Radio' account. In his response Mosseri stated: 

Read more

Facebook shuts down anti-women Melb Guys Pal group but others take its place

Posted on News by Caitlin Roper · April 21, 2020 5:48 PM

"Holocaust #2 but instead of jews we target women"

Read more

School girl’s Instagram ‘live’ post becomes sex predator webcam

Posted on News by Lyn Kennedy · March 17, 2020 1:16 PM

How Instagram broadcasts live sex acts to kids

*Content warning: this article describes real events that may be distressing for readers

As a researcher and campaigner advocating for an end to the objectification of women and sexualisation of girls, I follow dozens of underage girls on Instagram. They are aspiring models, gymnasts and dancers. They pose in swimwear and leotards. I watch how they use Instagram to promote a new brand-name bikini, or to exhibit their latest ventures in flexibility: an attempt at oversplits or a contorted backbend. Some of the girls have hundreds of thousands of followers. We - their followers - didn't have to look for them: Instagram’s search, ‘Explore’ and ‘Suggested for you’ features served them to us in an algorithm-procured gallery of pre-pubescents and young teens that caters to the predator’s eye.

An Instagram feed filled with prepubescent girls in bikinis


The setting is a picture of after-school normality: an average kitchen in a home in Australian suburbia. Two girls in school uniform do what kids up and down the eastern seaboard are doing: arriving home after a long day of school, they dump their school bags and head to the kitchen to make a snack. 

One of the girls - 14 according to information on her Instagram account - casually picks up her phone. Still in her uniform (easily providing information about what school she attends and where) she opens Instagram and with a tap of an icon starts a live post: a livestream video that her followers can watch. Instagram even promotes the live broadcast. I, one of her 12,000 algorithm-procured followers - and one of hordes of strangers whose identities, whereabouts and motives for watching a 14 year old girl are unknown - get an Instagram notification that she’s started a ‘live’. I click her avatar to start the livestream, and instantly I’m transported into the family kitchen. The girl and her friend occupy the foreground, creating a soundtrack with teenage chatter. In the background is a fridge plastered with photos, bills and reminders - artefacts of average family life.

One of the 50+ viewers makes a request to ‘be in’ the ‘live’. This request is one of Instagram’s built-in live-post features that allows viewers to interact with the host via a simultaneous, live video broadcast which the other viewers can see. The girl accepts the request, smiling curiously at the screen as she scans viewers’ incoming comments. As she does, my screen splits horizontally, making way for the viewer’s live video broadcast. 

The viewer is a man. He is naked. And he is masturbating.

The girl bursts into nervous laughter and steps out of view, leaving viewers to watch the fridge and the man. He repositions his phone to show his genitals from a different angle before his school girl host returns, hand over mouth, and ends his live video. With the screen to herself again, she continues her live post giggling, while, from off-camera, her friend makes a comment to the effect that they shouldn’t be laughing: it’s not funny.  But they don’t appear all that shocked. It’s almost as if this isn’t the first time a stranger has made a sexual approach this way, as though this after-school event is also normal. Parents aren’t told, no alarm is raised. They continue with their live post - even accepting another viewer’s request to be in the live post.

An underage girl has just - with no moderation or intervention from the global multi-billion dollar Facebook-owned platform - broadcast a live video of a naked man masturbating. She and her friend - and fifty other people - just witnessed a serious criminal act, prohibited by Australia’s Commonwealth, state and territory child exploitation material laws.

Who else witnessed the live sex act? Other school friends? Perhaps younger children – cousins or neighbours who tuned in to catch up on some big-girl news? How widely did Instagram disseminate this piece of child exploitation material that it failed to moderate and helped produce? How many times is this scene being played out in Australia each day? How many kitchens and bathrooms and bedrooms of Australian homes are being infiltrated by predators who want to abuse underage girls in this way? How many men are using Instagram to broadcast live sex acts to children? Has this type of criminal behaviour become ‘normal’ for girls who have been desensitised to predatory advances because sexual objectification, harassment and predation are so entrenched in their everyday, lived experiences? Why - in flagrant disregard of human rights, law, child safety principles and common sense - is Instagram connecting predators to minors?

Four days later, our concerns - that this event was not a one-off, that predators are targeting underage girls for the purpose of broadcasting live sex acts to them, and that this is 'normal' for some girls - were confirmed when we found the public Instagram account of a 9 year old girl based in Europe. She had saved a live post to her profile, allowing anyone to watch it for the 24-hour period that followed. We watched the video and saw that it was interrupted several times as the young girl accepted requests from different viewers to be in the broadcast. We counted three different viewers who filmed themselves masturbating. We then followed the girl. Within an hour we received a notification from Instagram that she had started a live post. We began viewing the video immediately, and within seconds she accepted a viewer's request to be in the broadcast. It was another naked, masturbating man.

In the week since I first saw men masturbating via live video feed at those girls, I have not been able to erase the images from my mind. These are among the most disturbing things I’ve come across since my colleagues and I began investigating hundreds of predatory approaches to underage girls through their Instagram pages. Within a short time of our making a report to Instagram about the 9 year old girl, her account was removed. But how many backup accounts does she have? How long until she creates a new account? How long until Instagram reconnects her old followers to her? How long before they're again using Instagram as a webcam to broadcast live sex acts to her and other children? How many other victims are there? And what does the future hold for these girls who have been groomed by Instagram's predators to believe that men exposing and rubbing their genitals at them is normal? Will they be safe from unwanted sexual advances from their bosses and colleagues? From strangers? Will the #MeToo movement mean anything for them? Will others enable men to harass or commit other heinous sexual crimes against them, the way Instagram did in their childhood?

Our investigation began last July and demonstrated how Instagram serves as a pedophile directory and forum. We reported web-based pedophile forums containing direct links to underage girls’ Instagram accounts, in which pedophiles described violent sex abuse fantasies involving Instagram’s child models, gymnasts and dancers - girls as young as one. We reported multiple examples of child exploitation material.

In November 2019, Collective Shout, in coalition with the National Center on Sexual Exploitation in the US and Defend Dignity in Canada, launched #WakeUpInstagram - an international campaign to hold Instagram and Facebook executives accountable for the exploitation and predation of underage girls on their platforms - leveraging our combined weight to force the platforms to act.

We then wrote to Instagram’s Head of Global Policy, sharing some of our key discoveries - including countless sexualised and predatory comments made by men to underage girls - and asking Instagram to address its widespread child predator problem. In the letter we made several recommendations, including that Instagram stop adults from contacting minors during live posts. Having viewed several live posts hosted by girls as young as 11, we knew that men were using ‘lives’ to harass and solicit sexualised content from minors.

Three months later - while Instagram’s investigations continued - we witnessed yet another example of how Instagram caters to predators and even facilitates criminal behaviour, and how girls’ safety and well-being are sidelined. Instead of safeguarding children, Instagram is bringing child predators - naked and masturbating in real-time - into their homes.

Instagram’s catchphrase rings of utopian ideals of boundless connectivity: “Bringing you closer to the people and things you love”. When juxtaposed against our discoveries which show that the ‘people’ are often predators, and the ‘things’ they love are underage girls, the slogan rings sinister. Nowhere in society are we fostering connections between child predators and children. In fact we are vigilant in our efforts to prevent such connections. Why are the rules different for social media companies? Shouldn’t Instagram not only stop connecting predators to children but work fastidiously to prevent these connections? 

Instagram does prevent certain adult-child connections: those between parents and their children. According to its “Tips for Parents”, Instagram can’t - due to privacy laws - give a parent access to their 13+ year old child’s account. But an Instagram-procured predator who wants to masturbate at a child has the freedom to do so.

In a timely report, the United Nations Special Rapporteur on the sale and sexual exploitation of children highlighted some of the dangers of social media that we - through the #WakeUpInstagram campaign - are calling on Instagram's corporate leaders to address:

'Offenders, traffickers and criminal groups use Internet tools, such as social media, to identify child victims more easily and establish relationships, subsequently intimidating them into exploitative situations.'

The report further pointed out that offenders are empowered by impunity:

'Ultimately, the essential feature of most offenders is their knowledge or belief that their actions will go unpunished'. 

Our investigations have shown that predators are fed a steady stream of victims via Instagram’s algorithms and that predators are free to roam and prey at will, not just with impunity but with the endorsement of moderators who tell us their behaviour ‘doesn’t go against community guidelines’. 

Australia is at the forefront of global efforts to improve online safety. The Office of the eSafety Commissioner’s user-centred initiative, Safety by Design, grew out of consultation with industry, service providers, parents and young people, and produced a set of principles that prioritise user rights and safety. Safety by Design highlights the imperative role of service providers like Instagram in the broader context of shared responsibility for online safety. It spells out eight initiatives designed to 'ensure that known and anticipated harms have been evaluated in the design and provision of an online service'.

In the same week we witnessed how Instagram is used by sex predators to broadcast live sex acts to children, its parent company Facebook committed to a set of new, voluntary standards developed by the Five Country Ministerial (represented by government Ministers of Australia, the United States, the United Kingdom, Canada and New Zealand) to combat online child sexual exploitation and abuse.

Can Instagram abide by these principles and standards without a drastic overhaul to its ethos and operations? While it is underpinned by ideals like boundless connectivity which connects child predators to children? While it stands by community guidelines that accommodate the sexual harassment of little girls? While it indiscriminately gives its users tools - like 'Explore' and live posts - which predators can use to find child victims and commit sexual crimes against them? 

As Campaigns Manager Caitlin Roper warned:

'We cannot overlook the significance of a wider culture that sexualises children and treats them as appropriate objects of men’s sexual desire.'

Instagram - through its predator-friendly policies and practices - has fostered a community that fetishises underage girls and helped fuel a culture that normalises their sexualisation and harassment. Now - as well as upholding the principles it has committed to - Instagram must work to eradicate its child predator community, and to foster a culture in which the sexualisation, harassment, exploitation and abuse of children is unthinkable. We owe it to girls and women - this and future generations - to make sure it does.

Note:

  • If you are concerned about suspected online child exploitation material, make a report to the eSafety Office.
  • If you are concerned about an adult behaving inappropriately online toward a child, make a report to the Australian Federal Police.
  • Make an anonymous report to Crime Stoppers or phone their toll-free number 1800 333 000.

See also:

Insta must act on predators: Collective Shout letter to platform heads

eSafety commissioner backs Collective Shout's call for Instagram overhaul

I was so concerned about porn-themed portrayals of young girls on Instagram I reported to police

Melinda Tankard Reist on ABC Radio National discussing the #WakeUpInstagram campaign

If Instagram can restrict diet products they can stop child sexual exploitation

‘Sexy girl’: How Instagram allows the offering of young girls as fetishised flesh

Tech companies turn a blind eye to child sexual abuse material


A thirty-seven-year-old mother spent seven days online as an eleven-year-old girl. Here’s what she learned.

Posted on News by Melinda Liszewski · February 26, 2020 6:30 PM

WATCH: Social media dangers exposed

Read more

Instagram hosts sharing of child sex abuse fantasies

Posted on News by Lyn Kennedy · January 07, 2020 9:52 AM

[CONTENT WARNING]

Instagram is a haven for child predators and a host to the broadcasting of child sex abuse fantasies. Contrary to corporate claims that there's 'no place' for content that exploits or endangers children, exploitative, graphic and degrading comments directed at underage girls are rampant on the Facebook-owned platform. 

Our campaign partners at National Center on Sexual Exploitation met with Instagram heads last December to discuss the ways Instagram puts children at risk and how child safety on Instagram could be improved. Instagram said that the matter of predatory comments would be investigated.

Meanwhile, men continue to use Instagram to harass and fetishise little girls. See below for our latest roundup of child sex abuse fantasies and other predatory comments freely broadcast on Instagram. We will continue to expose and call out this predatory behaviour!

Take action today!

Help put pressure on Instagram corporate leaders to stop facilitating behaviours that harm children. Join the Twitter conversation using @collectiveshout #WakeUpInstagram and #InstaPimpsGirls.


See also:

Graphic rape comments fuel #WakeUpInstagram campaign

Insta must act on predators: Collective Shout letter to platform heads

eSafety commissioner backs Collective Shout's call for Instagram overhaul

Melinda Tankard Reist on ABC Radio National discussing the #WakeUpInstagram campaign

‘Sexy girl’: How Instagram allows the offering of young girls as fetishised flesh


Read more

No Facebook, you're not doing enough to protect children.

Posted on News by Coralie Alison · November 23, 2019 8:58 AM
According to Fox News, Facebook stated: “Any content that endangers or exploits children is unthinkable and has no place on Instagram. We have policies that prohibit this type of content, and we use proactive technology to find and remove content that poses a risk to children. We’re constantly working on more ways to help prevent this behaviour on our platform.”

But we've found countless comments that have been up on underage girls' photos for up to a year. How is this 'proactive technology'? We've also reported hundreds of comments, many of which took weeks to be actioned - and often Instagram's reply is that they do not violate community guidelines.

The comments shown here are from 27 weeks ago and 14 weeks ago. A comment regarding a topless shoot on a child's picture also apparently did not violate Instagram's community guidelines:

Here's an example of comments from a man describing his desire to sexually assault a girl which apparently do not violate Instagram's community guidelines. And another example of a comment that had been up for 54 weeks:

This comment describing the sexual assault of an underage girl had been up for 38 weeks when we found it:

Here is a sample of 100+ comments we found on Instagram on pictures of underage girls:

  1. Sexy girl
  2. Mmm sexy
  3. Delicious
  4. Snack
  5. Hot little tidbit
  6. I can make you scream daddy
  7. You can sit down on my legs so that I can get hard
  8. Eggplant/squirt emojis signifying penis/sex abuse/masturbation/ejaculation
  9. Wow I’d love to spread those legs
  10. Cute budding breasts
  11. Yummy candy my favourite dessert + emojis signifying penis/sex abuse/masturbation/ejaculation
  12. Wanna jump on top of you laying like this
  13. I wanna slide in that so bad
  14. Beautiful ass out from water
  15. Love the pics of you in your cut lil panties. I hope you post more like this.
  16. Sit on my face like that
  17. Hello I am a young model scout and i am looking for a designer model. If you are interested write me pm
  18. Cooling off that hot ass? Nice
  19. I wish i could see more of you - send me a picture
  20. Sexy pussy
  21. So hot n sexy body
  22. I want to eat your everything if you don’t mind are you ready
  23. Wow body and boobs and facking nice 
  24. Such kissable full lips
  25. Nice pussy
  26. What is your snapchat name
  27. Yummmm
  28. Let me have porn sex for you
  29. Guess you like opening your legs
  30. Ooh la la + emojis signifying penis/sex abuse/masturbation/ejaculation
  31. Cute little tooshy
  32. Why did you have to cut off the best part of the pics. Still so sexy tho. Call me
  33. You are so dam yummy babe. When u going to pose in Victoria Secret outfits next
  34. Emojis signifying 'horny', penis/sex abuse/masturbation/ejaculation
  35. Uff so delicious
  36. I want lick
  37. Ahhhhh + emojis signifying penis/sex abuse/masturbation/ejaculation
  38. Stop cover up your body
  39. Imagine what am doing..10 inches n am cumming imaginit
  40. Hot wet pants
  41. I find it amazingly convenient that a specific area is wet
  42. Having that beautiful pair of legs on my shoulders and putting my penis in that beautiful ass
  43. Sweet camel toe; young camel toe; awesome lips, very edible and pokable
  44. I would love to fuck you SO HARD
  45. Fucking cutie be mine
  46. Super sexy for such a little girl
  47. You have the most ridiculously perfect sweet tender ass cheeks in the world.
  48. I want you
  49. That is so hot. You just made me …
  50. Mmm yummy
  51. Beautiful. Super cute in bare feet 
  52. Yes now stay like that – rape emojis
  53. Double penetration
  54. Looks like a fun group of girls. Wonder if they would do a private dance
  55. Hot AF, bootylicious
  56. Sweet and yummy
  57. Mmm hot sexy body
  58. I like you beautiful
  59. Nice legs
  60. Can you do a stretching vid without a skirt on in undies that are way too tight
  61. Want you caress and kiss you my love
  62. I always love your tight panties. Thanks for the great position
  63. Extreme cameltoe
  64. Yummy body
  65. Can I massage you
  66. The view from behind must be amazing
  67. I’ll put you on the bed n slowly take all your clothes off
  68. Would love to get soapy with you
  69. Dam you’re sexy goddam you’re hot
  70. Camel toe - squirt emoji
  71. Yeah I like to put it – rape emoji
  72. Can you jump up and down and make them wobble more plz
  73. Beautiful. Nice crotch
  74. I wanna slide in that and fill it up
  75. Let me suck your toes
  76. Hot sexy baby
  77. You are so sexy
  78. I love you sexy lady
  79. Sexy sexy sexy sexual lady
  80. You delicious, beautiful to eat
  81. Cute butt
  82. You’re getting way too sexy little girl
  83. Topless shoot would be so erotic
  84. Yes love that shorts. When it’s come off baby
  85. I’ll take you from the front or back baby..both my sweet little hot box
  86. (I like) the front while lying on your back preferably with your legs spread open
  87. Lovely cameltoe
  88. I know she got a fat lil pussy
  89. I bet you look amazing in those (underwear) I hope you show us
  90. Ok I’m going out of line - you put it (underwear) on I take it off
  91. Great pix - I was trying to imagine with less clothes
  92. Panty lines so cute
  93. I’d love to do that with you (pic of 2 girls in gymnastics pose)
  94. Wow mine would go in so smooth
  95. Sit on my face
  96. Yummy baby
  97. Stretch that labia
  98. Are you really 10..Please open your legs
  99. I busted a good night to this one
  100. Looks like she's getting it ball deep, very hot
  101. I masturbated to her
  102. I'm hungry for it
  103. el dedo en la vagina (the finger in the vagina)
  104. I'm in need of a sugar baby that I will spoil with shopping allowance and gift. DM if interested USA only
  105. I eat you
  106. I pay for meeting you
  107. You wear bra?
  108. Care to be my sugar baby?
  109. She's got legs for dayyyys (pic of 6yo girl)
  110. You pose so perfect for tongue workout
  111. That is such a naughty look in your eyes
  112. So hot be perfect for you to take it
  113. One sexy little princess
  114. I thought cum in your face
  115. I wish I was that pumpkin your sitting on
  116. I better start tossing some money at you
  117. fancy a drink?
  118. Your vagina is little
  119. Sexy legs small boobs
  120. I wanna lick your pussy and ass
  121. Love to taste your butt hole an lick your wet pussy
  122. OMG anal licking
  123. Let me put my dick in ur asshole
  124. Her face after getting a massive cock in her fat pussy

We need to send a strong message to Instagram. Add your name to the letter here. 



See also:

Why is Instagram letting men share sexual fantasies about 7 year old girls?

Help Wake Up Instagram to Child Exploitation!

Instagram a “predator’s paradise”: Collective Shout joins anti sexploitation groups in global campaign

‘Sexy girl’: How Instagram allows the offering of young girls as fetishised flesh

Read more

Tech companies turn a blind eye to child sexual abuse material

Posted on News by Melinda Liszewski · November 11, 2019 1:47 PM

Victims live in fear knowing images of their abuse are circulating the web

Read more

Revenge porn is making our social media toxic

Posted on News · September 08, 2017 9:45 AM

Facebook faced over 50,000 potential revenge porn cases in January alone.

A closed men’s Facebook group with a membership of 15,000 recently received a post from a member displaying pictures of himself having sex with a woman, shared without her consent.


Read more
