By Sloane Ryan / Roo Powell
Note: This piece contains sexual content and descriptions of child sex abuse that could be disturbing to some readers. The messages, images, and conversations included here are real.
I’m standing in a bathroom with the hem of a pale blue sweatshirt bunched up under my chin as I weave an Ace bandage tightly around my ribcage. Using the mirror as a guide, I wrap the bandage again and again around my sports bra, binding my chest. I step out of the bathroom and find our team waiting.
“This look OK?”
I get nods in response, and as Avery art directs, I pose my arms and tilt my head towards the camera. Normally, I’m not in clothes meant for a tween girl. Normally, I don’t have glitter polish on my nails and neon hair ties on my wrist. Normally, I’m dressed, I suppose, like your average 37-year-old mom. Jeans. Shirts that cover my midriff. Shoes with reasonable arch support.
Reid snaps a couple of photos of me. She scuttles off with Avery to our makeshift command center — a repurposed dining room now covered in cork boards and maps and papers and computer monitors. Will’s brow furrows as he quickly edits.
I move to the kitchen to give him space. We’re gearing up for the heaviest part of the day, which we know from experience will be fast-paced and emotionally exhausting.
“It’s ready,” Will calls from the command center. A few of us gather around Will’s computer screen and examine.
“Yeah, I buy it,” Brian says. Brian is the CEO of Bark, the company spearheading this project. Bark uses AI to alert parents and schools when children are experiencing issues like cyberbullying, depression, threats of violence — or in this case, targeting by sexual predators. Currently, we’re covering more than 4 million kids, and we analyze 20 million activities a day. I look at Brian studying the computer screen and consider his assessment. I nod and sigh. I buy it, too.
With the help of context — clothing, background, hair styling — and the magic of photo manipulation, we’re no longer staring at an image of me, an adult woman with crow’s feet. We’re staring at a photo of fictitious 11-year-old Bailey, and no matter how many times we do this, the results are still unnerving. Not because we’re creating a child out of thin air, but because we are deliberately putting Bailey in harm’s way to show exactly how pervasive the issue of predation is for Generation Z.
Campaign: stop child sexual exploitation on Instagram
Instagram must act to stop the rampant sex trafficking, child sexual abuse, grooming and the fetishisation of underage girls on its platform.
Collective Shout has carried out an investigation into the behaviour of predators on Instagram. We collected hundreds of samples of sexual, predatory comments on the posts of underage girls — some as young as 7. These included comments by adult men about girls’ bodies and body parts, sex-abuse acts they would like to carry out on the girls, and requests for nude images. We also found that sexualised images of children posted on Instagram, shared under the guise of child modelling, were then shared to paedophile forums where men discuss their sexual fantasies about children.
Some Instagram accounts featuring sexualised imagery of children offer paid subscriptions to ‘exclusive content’, with one allowing men to purchase bikini shots of a 13-year-old girl washing a dog.
Help put pressure on Instagram corporate leaders to stop facilitating behaviours that harm children. Join the Twitter conversation using @collectiveshout, #WakeUpInstagram and #InstaPimpsGirls.
Instagram claims to be using proactive technology to ensure the platform is safe for children. However, when Collective Shout campaigner Lyn Swanson Kennedy reported the content she discovered, Instagram countered that ‘no community guidelines’ had been breached. Collective Shout responded:
"What we’ve found shows that sexualisation and harassment of underage girls on Instagram is rampant. By giving adult males unfettered access to children and facilitating the transmission of sexual comments Instagram is complicit in normalising the idea that girls are available for sexual gratification. This puts girls at risk."
"If these mega social media companies are going to allow minors on their platform, they must provide adequate measures to keep children safe from sexual predators."
Collective Shout is partnering with the National Center on Sexual Exploitation (USA) and Defend Dignity (Canada) in a campaign calling on Instagram to change its policies. In December, Collective Shout wrote to Instagram’s Global Head of Policy, Karina Newton, with evidence from our investigation.
Join our international campaign to call on Instagram to make three vital policy improvements:
1) Instagram must change its settings so that strangers cannot direct message minors.
2) Instagram must fix its algorithm to proactively remove sexualising or sexually graphic comments on minors’ photos.
3) Instagram must update its reporting system so that if someone is reporting a sexual comment on a minor’s post it can be reported as such. The “harassment/bullying” selection does not capture the fact that these comments come from adults who are grooming/sexualising/harassing a child.
You can take action in three major ways:
First, you can email Instagram executives here. Fill out the short form and you can send an email directly to Instagram, demanding that they do better to protect minors!
Second, you can learn more and see censored screenshots and videos of proof here.
Third, you can post on social media using #WakeUpInstagram and #InstaPimpsGirls. Remember to tag @Instagram and its head, @Mosseri, in your post as well, so you can get their attention! Here are some sample tweets you could consider posting:
It's time to #WakeUpInstagram! We need to tell Instagram to fix safety features to stop sex trafficking, child abuse, and pedophile-like comments on minors' photos! #InstaPimpsGirls @Instagram @Mosseri
Minors whose Instagram accounts are set to private can still receive unsolicited direct messages from strangers, which has led to several instances of sex trafficking and child sexual abuse – this must be fixed! #WakeUpInstagram #InstaPimpsGirls @Mosseri
There are countless comments by predatory adults on the photos of minors on Instagram, where they leave sexually graphic comments, sexualize children, or solicit sex from children. Will Instagram fix this? #WakeUpInstagram #InstaPimpsGirls @Mosseri
Thank you for taking action and helping defend children from sexual exploitation online!