Ask Spotify to protect kids from p*rnography and s*xual predators
Supporters have contacted us to report that their kids have been exposed to porn on Spotify. We spoke out about this in 2022, when our own digging uncovered an overwhelming amount of explicit audio porn, nudes, hardcore porn images and sexually explicit fanfiction.
Take action: Our friends at National Center on Sexual Exploitation have made it easy with a quick online form.
Join the global campaign to hold Spotify accountable.
Call on the U.S. Attorney General to Investigate OnlyFans
Collective Shout joins with the National Center on Sexual Exploitation (NCOSE) and thousands of grassroots advocates to urge the U.S. Attorney General to investigate OnlyFans so it can be held accountable.
OnlyFans is on the NCOSE 2023 Dirty Dozen List for facilitating sex trafficking, child sexual abuse material, and image-based sexual abuse.
Click here to add your name.
The letter states:
Please investigate OnlyFans for exploitive content and conduct found throughout their site.
I believe that OnlyFans is facilitating prostitution and sexual exploitation with its predatory platform based on revenue earned from vulnerable people’s bodies – especially the bodies of women and minors. Exploiting financial insecurities deepened by the COVID-19 crisis, OnlyFans promises fast cash, empowerment, and even fame, when in reality, it facilitates sexual exploitation, harms minors, and emboldens men to objectify and degrade women.
Law enforcement and service providers are seeing evidence of sex trafficking and child sex abuse material (CSAM) proliferating on the site. BBC recently ran an article about the significant number of minors selling nudes on OnlyFans, images that are CSAM under U.S. federal law.
OnlyFans has insufficient verification measures for age or consent—meaning, it is highly likely that CSAM and sex trafficking are on the platform. This website has all the hallmarks of Backpage.com (shut down by the FBI in 2018): pornographic pictures and videos are used to advertise for both sex trafficking and prostitution victims (including minors). Further, law enforcement is finding that many sex trafficking victims, and victims of CSAM, are coerced into creating live streams or webcam pornography as well. OnlyFans is an obvious marketplace for traffickers to promote and sell sexual exploitation.
Please investigate this company and hold them accountable!
Head to the NCOSE website to learn more.
Reddit facilitates image-based sexual abuse
Turning a blind eye to survivors
Sign the Global Joint Letter here
In partnership with National Center on Sexual Exploitation (NCOSE)
WIN: New measures to help protect minors from predators on Instagram
We've had another major #WakeUpInstagram win, with Instagram rolling out new tools to help protect underage users from predators on the platform.
Win! Mastercard and VISA cut ties with Pornhub + millions of vids removed
Progress in our joint global campaign but still a long way to go to bring Pornhub to justice
School girl’s Instagram ‘live’ post becomes sex predator webcam
How Instagram broadcasts live sex acts to kids
*Content warning: this article describes real events that may be distressing for readers
As a researcher and campaigner advocating for an end to the objectification of women and sexualisation of girls, I follow dozens of underage girls on Instagram. They are aspiring models, gymnasts and dancers. They pose in swimwear and leotards. I watch how they use Instagram to promote a new brand-name bikini, or to exhibit their latest ventures in flexibility: an attempt at oversplits or a contorted backbend. Some of the girls have hundreds of thousands of followers. We - their followers - didn't have to look for them: Instagram’s search, ‘Explore’ and ‘Suggested for you’ features served them to us in an algorithm-procured gallery of pre-pubescents and young teens that caters to the predator’s eye.
An Instagram feed filled with prepubescent girls in bikinis
The setting is a picture of after-school normality: an average kitchen in a home in Australian suburbia. Two girls in school uniform do what kids up and down the eastern seaboard are doing: arriving home after a long day of school, they dump their school bags and head to the kitchen to make a snack.
One of the girls - 14 according to information on her Instagram account - casually picks up her phone. Still in her uniform (easily providing information about what school she attends and where) she opens Instagram and with a tap of an icon starts a live post: a livestream video that her followers can watch. Instagram even promotes the live broadcast. I, one of her 12,000 algorithm-procured followers - and one of hordes of strangers whose identities, whereabouts and motives for watching a 14 year old girl are unknown - get an Instagram notification that she’s started a ‘live’. I click her avatar to start the livestream, and instantly I’m transported into the family kitchen. The girl and her friend occupy the foreground, creating a soundtrack with teenage chatter. In the background is a fridge plastered with photos, bills and reminders – artefacts of average family life.
One of the 50+ viewers makes a request to ‘be in’ the ‘live’. This request is one of Instagram’s built-in live-post features that allows viewers to interact with the host via a simultaneous, live video broadcast which the other viewers can see. The girl accepts the request, smiling curiously at the screen as she scans viewers’ incoming comments. As she does, my screen splits horizontally, making way for the viewer’s live video broadcast.
The viewer is a man. He is naked. And he is masturbating.
The girl bursts into nervous laughter and steps out of view, leaving viewers to watch the fridge and the man. He repositions his phone to show his genitals from a different angle before his school girl host returns, hand over mouth, and ends his live video. With the screen to herself again, she continues her live post giggling, while, from off-camera, her friend makes a comment to the effect that they shouldn’t be laughing: it’s not funny. But they don’t appear all that shocked. It’s almost as if this isn’t the first time a stranger has made a sexual approach this way, as though this after-school event is also normal. Parents aren’t told, no alarm is raised. They continue with their live post - even accepting another viewer’s request to be in the live post.
An underage girl has just – with no moderation or intervention from the global multi-billion dollar Facebook-owned platform - broadcast a live video of a naked man masturbating. She and her friend - and fifty other people - just witnessed a serious criminal act, prohibited by Australia’s Commonwealth, state and territory child exploitation material laws.
Who else witnessed the live sex act? Other school friends? Perhaps younger children – cousins or neighbours who tuned in to catch up on some big-girl news? How widely did Instagram disseminate this piece of child exploitation material that it failed to moderate and helped produce? How many times is this scene being played out in Australia each day? How many kitchens and bathrooms and bedrooms of Australian homes are being infiltrated by predators who want to abuse underage girls in this way? How many men are using Instagram to broadcast live sex acts to children? Has this type of criminal behaviour become ‘normal’ for girls who have been desensitised to predatory advances because sexual objectification, harassment and predation are so entrenched in their everyday, lived experiences? Why - in flagrant disregard of human rights, law, child safety principles and common sense - is Instagram connecting predators to minors?
Four days later, our concerns - that this event was not a one-off, that predators are targeting underage girls for the purpose of broadcasting live sex acts to them, and that this is 'normal' for some girls - were confirmed when we found the public Instagram account of a 9 year old girl based in Europe. She had saved a live post to her profile, allowing anyone to watch it for the 24-hour period that followed. We watched the video and saw that it was interrupted several times as the young girl accepted requests from different viewers to be in the broadcast. We counted three different viewers who filmed themselves masturbating. We then followed the girl. Within an hour we received a notification from Instagram that she had started a live post. We began viewing the video immediately and within seconds she accepted a viewer's request to be in the broadcast. It was another naked, masturbating man.
In the week since I first saw men masturbating via live video feed at those girls, I have not been able to erase the images from my mind. These are among the most disturbing things I’ve come across since my colleagues and I began investigating hundreds of predatory approaches to underage girls through their Instagram pages. Within a short time of our report to Instagram about the 9 year old girl, her account was removed. But how many backup accounts does she have? How long until she creates a new account? How long until Instagram reconnects her old followers to her? How long before they're again using Instagram as a webcam to broadcast live sex acts to her and other children? How many other victims are there? And what does the future hold for these girls, who have been groomed by Instagram's predators to believe that men exposing and rubbing their genitals at them is normal? Will they be safe from unwanted sexual advances from their bosses and colleagues? From strangers? Will the #MeToo movement mean anything for them? Will others enable men to harass or commit other heinous sexual crimes against them, the way Instagram did in their childhood?
Our investigation began last July and demonstrated how Instagram serves as a pedophile directory and forum. We reported web-based pedophile forums containing direct links to underage girls’ Instagram accounts, in which pedophiles described violent sex abuse fantasies involving Instagram’s child models, gymnasts and dancers - girls as young as one. We reported multiple examples of child exploitation material.
In November 2019, Collective Shout - in coalition with the National Center on Sexual Exploitation in the US and Defend Dignity in Canada - launched #WakeUpInstagram, an international campaign to hold Instagram and Facebook executives accountable for the exploitation and predation of underage girls on their platforms, and to leverage our combined weight to force the platforms to act.
We then wrote to Instagram’s Head of Global Policy with some of our key discoveries, including countless sexualised and predatory comments made by men to underage girls, and to ask Instagram to address its widespread child predator problem. In the letter we made several recommendations and asked Instagram to stop adults from contacting minors during live posts. Having viewed several live posts hosted by girls as young as 11, we knew that men were using ‘lives’ to harass and solicit sexualised content from minors.
Three months later - while Instagram’s investigations continued - we witnessed yet another example of how Instagram caters to predators and even facilitates criminal behaviour, and how girls’ safety and well-being are sidelined. Instead of safeguarding children, Instagram is bringing child predators - naked and masturbating in real-time - into their homes.
Instagram’s catchphrase evokes utopian ideals of boundless connectivity: “Bringing you closer to the people and things you love”. Juxtaposed against our discoveries - which show that the ‘people’ are often predators, and the ‘things’ they love are underage girls - the slogan rings sinister. Nowhere else in society do we foster connections between child predators and children. In fact we are vigilant in our efforts to prevent such connections. Why are the rules different for social media companies? Shouldn’t Instagram not only stop connecting predators to children, but work fastidiously to prevent these connections?
Instagram does prevent certain adult-child connections: those between parents and their children. According to its “Tips for Parents”, Instagram can’t - due to privacy laws - give a parent access to their 13+ year old child’s account. But an Instagram-procured predator who wants to masturbate at a child has the freedom to do so.
In a timely report, the United Nations Special Rapporteur on the sale and sexual exploitation of children highlighted some of the dangers of social media that we - through the #WakeUpInstagram campaign - are calling on Instagram's corporate leaders to address:
'Offenders, traffickers and criminal groups use Internet tools, such as social media, to identify child victims more easily and establish relationships, subsequently intimidating them into exploitative situations.'
The report further pointed out that offenders are empowered by impunity:
'Ultimately, the essential feature of most offenders is their knowledge or belief that their actions will go unpunished'.
Our investigations have shown that predators are fed a steady stream of victims via Instagram’s algorithms and that predators are free to roam and prey at will, not just with impunity but with the endorsement of moderators who tell us their behaviour ‘doesn’t go against community guidelines’.
Australia is at the forefront of global efforts to improve online safety. The Office of the eSafety Commissioner’s user-centred initiative, Safety by Design, was the outcome of consultation with industry, service providers, parents and young people and resulted in a set of principles that prioritises user rights and safety. Safety by Design highlights the imperative role of service providers like Instagram in the broader context of shared responsibility for online safety. It spells out eight initiatives designed to 'ensure that known and anticipated harms have been evaluated in the design and provision of an online service'.
In the same week we witnessed how Instagram is used by sex predators to broadcast live sex acts to children, its parent company Facebook committed to a set of new, voluntary standards developed by the Five Country Ministerial (represented by government Ministers of Australia, the United States, the United Kingdom, Canada and New Zealand) to combat online child sexual exploitation and abuse.
Can Instagram abide by these principles and standards without a drastic overhaul to its ethos and operations? While it is underpinned by ideals like boundless connectivity which connects child predators to children? While it stands by community guidelines that accommodate the sexual harassment of little girls? While it indiscriminately gives its users tools - like 'Explore' and live posts - which predators can use to find child victims and commit sexual crimes against them?
As Campaigns Manager Caitlin Roper warned:
'We cannot overlook the significance of a wider culture that sexualises children and treats them as appropriate objects of men’s sexual desire.'
Instagram - through its predator-friendly policies and practices - has fostered a community that fetishises underage girls and helped fuel a culture that normalises their sexualisation and harassment. Now - as well as upholding the principles it has committed to - Instagram must work to eradicate its child predator community, and to foster a culture in which the sexualisation, harassment, exploitation and abuse of children is unthinkable. We owe it to girls and women - this and future generations - to make sure it does.
Note:
- If you are concerned about suspected online child exploitation material, make a report to the eSafety Office.
- If you are concerned about an adult behaving inappropriately online toward a child, make a report to the Australian Federal Police.
- Make an anonymous report to Crime Stoppers or phone their toll free number 1800 333 000.
See also:
Insta must act on predators: Collective Shout letter to platform heads
eSafety commissioner backs Collective Shout's call for Instagram overhaul
I was so concerned about porn-themed portrayals of young girls on Instagram I reported to police
Melinda Tankard Reist on ABC Radio National discussing the #WakeUpInstagram campaign
If Instagram can restrict diet products they can stop child sexual exploitation
‘Sexy girl’: How Instagram allows the offering of young girls as fetishised flesh
Tech companies turn a blind eye to child sexual abuse material
A thirty-seven year old mother spent seven days online as an eleven year old girl. Here’s what she learned.
WATCH: Social media dangers exposed
Kids movie Show Dogs accused of grooming children for sexual abuse
*UPDATE: CNN has reported the film will undergo edits to remove the objectionable content!*
Child advocates have accused new kids film Show Dogs of sending “a troubling message that grooms children for sexual abuse”. The film was released in the US last week, and is not scheduled to be released in Australia until July.
The film follows the story of a police dog going undercover at a dog show. There are reportedly several scenes in which the dog, Max, has to have his genitals inspected. When he is uncomfortable and wants to stop he is told to go to a ‘zen place’. When he does this, he can advance to the final round of the dog show.
National Center on Sexual Exploitation has called on distribution company Global Road Entertainment to halt the distribution of Show Dogs in movie theaters and recut the movie:
“The dog is rewarded with advancing to the final round of the dog show after passing this barrier. Disturbingly, these are similar tactics child abusers use when grooming children — telling them to pretend they are somewhere else, and that they will get a reward for withstanding their discomfort.
“Children’s movies must be held to a higher standard, and must teach children bodily autonomy, the ability to say ‘no’ and safety, not confusing messages endorsing unwanted genital touching.”
Reviewers, too, have expressed their discomfort over the scenes in question.
Slate writer Ruth Graham called it “unsettling on several levels”.
“First, this is a children’s movie in which the protagonist’s success depends on withstanding a stranger touching his genitals even though it makes him uncomfortable,” she wrote.
“The movie’s solution to Max’s discomfort with the inspection is not to empower him to escape it somehow; it’s to have him learn to check out mentally while he endures it, and to make no outward sign of his humiliation. It is not paranoid to say that this is a bad message for kids.”
Writer Jenny Rapson echoed those sentiments in a blog post on For Every Mom: “Max’s success is riding on whether or not he lets both his partner (for practice) and a stranger (the competition judge) touch his private parts. IN A KIDS MOVIE. WHAT??? Newsflash, folks: THIS IS CALLED GROOMING and it’s what sexual predators do to kids!”
Writer Terina Maldonado wrote on family film blog Macaroni Kid: “During the movie, I kept thinking, ‘This is wrong, it doesn’t need to be in a kids movie. Everything else in the movie is good fun except for this.’”
In response to the outcry, Global Road Entertainment, co-producers of the film, released a statement to CNN:
“The dog show judging in this film is depicted completely accurately as done at shows around the world; and was performed by professional and highly-respected dog show judges,” the statement said in part. “Global Road Entertainment and the filmmakers are saddened and apologise to any parent who feels the scene sends a message other than a comedic moment in the film, with no hidden or ulterior meaning, but respect their right to react to any piece of content.”
One of the writers of the film has spoken out against the scenes in question, claiming they were written into the script by one of the “13 other writers” who worked on the movie.
“[I] didn’t get to see the film until it was in its final stage of completion, and had zero say in creative choices the second I signed away the rights to my work.”
“I absolutely condemn any suggestion or act of non-consensual touching in any form, as well as disassociation as a coping mechanism for abuse of any kind. I understand and empathise with the parents’ and groups’ concerns regarding the message the movie may impart,” he said.
Children’s charity Bravehearts is also calling on the Australian Classification Board to ban the film:
Bravehearts is responding to reports that this children’s film contains multiple scenes where a dog character must have its private parts inspected and manhandled. When the dog feels uncomfortable and wants it to stop, he is then told to just go to a ‘zen place’ and is later rewarded for his consent by being advanced to the final round of the dog show. This message is not only wrong, but it promotes acceptance of grooming and goes against the very basic principles of child protection.
The charity asked supporters to urgently contact the board at [email protected] and Senator the Hon Mitch Fifield at [email protected].
Cineplex Theatres have already pulled the film.
Be sure to follow our Facebook page for further updates.