Mega and PayPal named in Instagram's child sex abuse material promos
For the past three years we have documented Instagram's failure to proactively detect and remove child exploitation material, and its failure to remove that material even after we reported it. Some of our reports from January 2022 (10 months ago) are still 'In review'.
Our latest investigations reveal Instagram parent company Meta's unconscionable failure to protect children as it aids those who exploit them.
— LSKennedy (@LSKennedy5) November 8, 2022
Child rape videos for sale on Instagram
In August 2022, we made 100 reports of child exploitation activity using Instagram's in-app reporting tools. In each case, the content promoted the sale and/or trade of child sexual abuse material - often via links to online files hosted by New Zealand-based cloud storage company Mega. At last count, Instagram had reviewed fewer than half of our reports (41); fifty-four were still 'In review'. Of the 41 that had been reviewed, 38 had been seen by Instagram's 'review team' and three by its 'technology'. Instagram took action to remove just three accounts/pieces of content. It said it did not remove the rest of the reported content because it did not go against its Community Guidelines.
The reported content included:
- Promotions of purchasable links, featuring screenshots of Mega/online folders with red-flag terms including:
- 'CP' ('child pornography' - a reference to child sexual exploitation material),
- 'young girls'/'young boys',
- megalinks/trade/for sale
- DM (an invite to Direct Message)
- Daughter dads; 'Dad + Son'
- 'little girl solo', 'little boy solo'
- A promo of a Mega folder titled 'playground' indicating it contained 42 sub folders. One was titled 'Hidden cam' and indicated it contained 7 subfolders/479 files
- Hacked
- hidden
- Karen slavegirl
- girl raped
- fake casting
- blackmail
- 'rape, incest, footjob, young'
- links to Telegram and other encrypted chat sites listed in account bios. This tactic, known as digital 'breadcrumbing', is used to signpost other child exploitation activity (for example, access to more child sexual abuse material and connection with other paedophiles).
- References to specific age groups (eg 'preteen'; 'girls under 6'; 'little girl age from 4 to 17 videos')
- A story featuring a screen grab of a 3GB Mega folder titled 'cp rape full'
- A story with text: Mega folders 'Best CP Collection 1TB 40$' + 'Kid's video collection 1.94 Tb 100$' + 'Only kids video'
- Child sexual exploitation material promos featuring images of real children
The largest file size reference we documented was 2.2TB: a folder labelled 'New CP Collection'. To put this into perspective, a 2.2TB drive can hold up to 34,000 hours of MP3s, 80 days' worth of video, 620,000 photos, or 1,000 high-definition movies.
Sellers often promoted PayPal, CashApp and cryptocurrency as accepted payment methods.
We included this information in our recent submission on the online industry's draft Codes of Practice, which outline how online entities plan to protect users from exposure to harmful content.
We described how Big Tech has aided predators and failed children by providing the means to advertise, promote and sell children’s content (including child sexual abuse material), and connecting sellers to predatory men who drive demand. We said that industry codes must reflect genuine commitment to children’s safety and explicitly prohibit this activity, and that online entities must also indicate how they will detect, remove and appropriately report accounts engaging (or appearing to engage) in the sale of child exploitation material to relevant authorities and/or regulators.
We also sent this information to eSafety. We would like to know what our Federal Government online safety agency is doing to hold corporates like Meta, Mega and PayPal accountable for facilitating child exploitation.