Each week we find new evidence that underage girls are at risk from predators on Instagram. We've rounded up some of the worst recent examples of predatory comments we've come across and shared them below. Some of the comments describe men's fantasies of carrying out violent sexual abuse acts on little girls.
There should be no room anywhere for these comments, and that's why we are continuing to call on Instagram heads to stop the fetishisation and harassment of little girls that is happening on their platform. Join our campaign and help #WakeUpInstagram to child exploitation. Learn how here.
Join the Twitter conversation by tagging @collectiveshout and using #WakeUpInstagram and #InstaPimpsGirls.
(Addressed to Instagram Global Head of Policy)
December 4, 2019
Widespread predation of underage girls on Instagram
We have identified and collected hundreds of comments posted by men on the Instagram pages of underage girls between August and the time of writing. These include grooming-style comments, sexually harassing comments, requests for sexualised content (including nudes), requests for direct messages and for images to be sent via DM. Large numbers of comments are porn-themed. Men describe in explicit detail sexual abuse fantasies involving the underage girls whose identities and locations are often exposed, placing them at risk of further sexual exploitation. (We have compiled some of the most graphic comments here.)
Inadequate tools for keeping children safe
These predatory comments exist in plain view, unmoderated and unremoved. In some cases they have remained for more than a year, including on posts of girls as young as 4 years old. Our reports to Instagram are often dismissed as not violating its ‘community guidelines’. Recent examples of dismissed comments include: (on a 13-year-old girl’s post) ‘Omg i sooooo wanna (emoji denoting) lick your (emoji denoting) pussy n (emoji denoting) ass’; (on a child advertising underwear) ‘Ok im going out of line you put it on i take it off’; (on a 9-year-old girl’s post) ‘please open your legs’ + emojis including kiss-lips; (various comments on underage girls’ posts) eggplant/peach/cucumber/squirt emojis denoting sexual acts. If these comments are not deemed in violation of whatever standards you use to determine whether content you host is in breach, perhaps those standards need to be revised?
There is also a serious shortcoming in Instagram’s reporting system. While comments can be reported as ‘harassment/bullying’ or as ‘violence/threat of violence’, there is no option which properly conveys the serious predatory and grooming nature of these comments. Instagram’s reporting system should include “sexualised/predatory/grooming comment directed at a minor” as its own category.
Instagram also needs to take action to prohibit adults from using ‘live’ posts to contact minors. We have viewed a 13-year-old girl’s live posts during which she was bombarded with sexual comments including a request for sex, questions about her underwear, a man telling her he wanted her to give him an erection, another saying he would pay to meet her and others pressuring her to remove her shoes and show her feet.
We note your statement about having “proactive technology” to keep the platform free from content that is harmful to children. Whatever methods Instagram has employed to date are insufficient for dealing with the rampant predation of children that we have documented. (Our response to Facebook’s official statement here.)
Instagram connects predators to children
Instagram is catering to the fantasies and fetishes of child predators. Through its ‘explore’ feature it directly connects predators with more underage girls. This appears to go directly against your own policies prohibiting content which endangers or exploits children. Regardless of the intended purposes of this feature, the reality is that it points predators directly to the girls they are preying on (virtually or in real life). Instagram should stop its ‘explore’ feature from promoting minors’ pages and connecting predators with children.
We have documented several Instagram pages devoted to republishing images of girls and young women in bikinis or in various modeling, gymnastics or dance poses. Posts on these pages frequently tag the child’s account with captions instructing others to follow the child. Some pages mix images of very young girls with porn-themed and pin-up style images of older teens and young women. These accounts serve as magnets for predators, pointing them directly to more children. Instagram has responded to our reports of these accounts by saying they do not go against ‘community standards’. Instagram should properly investigate these parasite pages. Your company should also prohibit the republishing of images of children on pages that also feature pornified images of adults.
Instagram serves as supplier for pedophile forums
We have documented the names of hundreds of underage models, gymnasts and dancers on pedophile forums. We’ve reviewed a selection of the profile pages and found that every one contains direct links to the child’s Instagram account/s. We’ve also reviewed several forum pages and have evidence that Instagram is the primary source of images shared to the forums where men write explicitly about sex-abuse acts they’d like to carry out on individual girls. We have evidence of an individual commenting on an underage girl’s Instagram post to direct others to the forums.
Instagram accounts used to promote sale of images/videos of children
Some underage girls’ accounts are used to advertise monetised website subscriptions where men purchase exclusive photos and videos of children. Other accounts promote links to personal web pages where fans can purchase posters, calendars and photo/video packages. Underage girls also use Instagram to promote accounts on other social media platforms where sexualised content is under even less scrutiny, placing minors in further danger of being targeted by predators. We believe that Instagram must cease acting as an advertising service for individuals selling images/videos of minors.
The evidence we have gathered demonstrates the sexual exploitation of underage girls on Instagram. Your company is allowing predators almost unfettered access to them. Content hosted on your platform violates their right to grow up free from activity that harms them (United Nations Convention on the Rights of the Child, Article 36). Because predatory behaviour has become so common on Instagram, girls learn to think of it as normal, which sets them up for even more harm.
We urge Instagram to prioritise this issue and act in a way that places the wellbeing of vulnerable young people above the interests of predators and reflects accepted standards of corporate social responsibility and ethical behaviour.
- Revise ‘Community Standards’ so that all sexualised, predatory and grooming-style comments (text, slang, short-hand, hashtags, emojis and graphics) qualify as violations.
- Add ‘sexualised/predatory/grooming comment directed at a minor’ as a reporting category.
- Prohibit adults from using ‘live’ posts to contact minors.
- Update Instagram’s system used to detect and remove sexualised, predatory comments.
- Recognising that Instagram serves as a supply source of images of children for web-based pedophile forums, update all relevant policies, guidelines and help documents (including ‘A Parent’s Guide to Instagram’) so that users are properly informed of the risks of sharing images of children to the platform.
- Stop the ‘explore’ feature from promoting minors’ pages and connecting predators with children.
- Investigate parasite pages that are exclusively devoted to republishing photos of minors, deleting pages where children are sexualised, harassed, groomed or where any type of predatory comments/behaviour is displayed.
- Prohibit the republishing of images of minors on Instagram pages that also feature porn-style images of adults.
The Child Rescue Coalition has warned parents about the risks of posting intimate photos of their children, as well as using certain hashtags that can be accessed and misused by predators. It is a reminder for parents to take great care when posting images of their children on social media, and to consider their privacy settings and the possible risks to their children.
#ToddlerBikini and the ‘adultification’ of children
Psychologists, health professionals and child advocates have spoken out about the harmful impacts of sexualisation and ‘adultification’ of children. Adultification occurs when children are deliberately posed and styled to appear much older than they are, as high fashion ‘mini-me adults’ in adult attire. There is substantial research to suggest that premature sexualisation or adultification negatively impacts children’s natural development and puts girls at risk of exploitation and abuse.
Given evidence of the harmful impacts of sexualising and adultifying children, the practice by some parents of posting pictures of their female toddlers in skimpy bikinis alongside the hashtag #ToddlerBikini is troubling.
Dr Emma Rush, co-author of the report Corporate Paedophilia, told ABC:
“Children are not a reflection of the adult’s personal style...they’re not a miniature adult, they’re not a fashion accessory, they’re a developing human being and they need the cultural space to be just that.”
Photo sharing and risks to children
Once an image has been shared publicly, the owner no longer has control over it. A study by the Internet Watch Foundation (IWF) revealed that 88% of sexual or suggestive images and videos posted by young people on social media sites are being stolen by ‘parasite’ porn sites. Of the 12,224 images and videos monitored on 68 different websites, 10,776 were later found on parasite websites. Parents need to be aware that once posted online, they may no longer retain control of where their child’s image ends up or how it might be used.
The sharing of sexualised images of girls puts them at risk from paedophiles. In 2015 we published a piece by dance teacher and writer Jemma Nicoll, who exposed the exploitation of girls in the dance industry. Nicoll followed the Instagram account of a prominent US dancewear company, California Kisses, which featured highly sexualised images of young girls, as young as 5-7, sometimes alongside suggestive slogans.
(Note: ‘Pop that’ refers to the ‘popping’ of a girl’s cherry or the taking of her virginity. It is a popular porn genre.)
The company’s recklessness didn’t end there, with California Kisses failing to moderate frequent sexual and predatory comments from adult men directed towards female children, including requests for sexual acts.
The devastating emotional impact on children
Just this week, a woman has won her fight to get nude photos taken of her as a child removed from several prominent New Zealand art galleries. The victim claimed the photos, taken by her artist mother, were “child pornography masquerading as art”, calling on the Government to protect children from exploitation.
In her submission to the Film and Literature Board, the victim indicated the photographs represented her in a way that was harmful, and that as a naked child model she was subject to a power imbalance and unable to render consent.
The woman, who was also a victim of child sexual abuse, told the board that if the images were to outlive her they “will continue to remind her of that time of her life, oppressing and humiliating her.”
The Board banned two photos. The first picture, the board found, suggested “sexual availability and experience”, with the victim “made up to look older than her years”. The second banned photograph was taken when the victim was 10.
Board President Rachel Schmidt-McCleave stated:
“The image seeks to make a statement about sexual availability and power that ought not to be made in the case of a child or young person. There is no doubt that depiction of a child in this way... is injurious.”
“The board is of the view that depicting young persons being older than they are and being sexually available normalises the sexualisation of young women and forms part of a continuous desensitising of the public to the sexualisation of children and young people and is therefore harmful to those children and injurious to the public good.”
We have a saying at Collective Shout:
"Silence is the language of complicity, speaking out is the language of change."
And recently Melbourne mum, Melanie Sheppard, refused to be silent after discovering that her daughter had been sexually exploited on social media.