Draft Codes leave major loopholes for child sexual exploitation
In response to an eSafety directive, tech industry bodies have submitted their first draft Codes of Practice. The Codes are intended to spell out how platforms plan to reduce access and exposure to certain types of harmful online material, known as Class 1A and 1B material. We believe that, if approved in their current form, the draft Codes will fall well short of protecting children from predators and the predatory online porn industry.
We gave our feedback on the Codes in a submission to ONLINESAFETY.ORG.AU, making 17 recommendations which we believe will help improve online safety for children:
- Codes must include mandatory time limits on responding to complaints.
- Providers should make detailed data available on all complaints.
- Providers should give end-users access to a mechanism for escalating a complaint to a third party if they are dissatisfied with the provider’s response.
- Codes should remove clauses specifying that end-user accounts are terminated only if the user intended to cause harm.
- Social media platforms must not tolerate violations of laws prohibiting child sexual abuse material (CSAM). They should remove the requirement that an end-user must have “repeatedly violated terms and conditions, community standards, and/or acceptable use policies.”
- Industry should report CSAM regardless of where the abuse is occurring or who the victim is.
- Industry must invest in tools and resources to enable providers to detect and deal with first-generation, existing, and live-streamed CSAM.
- Industry must address live-streamed CSAM with available technology and continuing investment in innovation and resources.
- Industry codes must explicitly prohibit sexual discussions and other degrading and exploitative treatment of minors.
- Industry codes must explicitly prohibit paedophilic networking, including the use of red-flag terms known to be used to connect sexual predators and facilitate trade in child sexual abuse material.
- The 24-hour timeframe for reporting an identified instance of CSAM that poses an immediate threat to the life or health of an adult or child should be changed to “immediately”, or at a minimum to a two-hour timeframe.
- Industry codes must explicitly prohibit monetisation of children’s content.
- Industry codes must explicitly prohibit the promotion of off-site monetised children’s content.
- Industry codes must prohibit the use of preteen children (or children below the platform’s approved user age) in paid promotions.
- Remove the suggestion that it would be sufficient to require users to declare their date of birth during account registration, as this is an ineffective method of age verification.
- Industry must use existing tools to detect behavioural signals and CSAM in end-to-end encrypted services.
- Industry must invest in tools and resources to enable providers to detect and deal with first-generation, existing, and live-streamed CSAM in end-to-end encrypted services.
We don't want any more empty words from Big Tech about 'zero tolerance' for child exploitation on their platforms. If they want us to believe they care about protecting children, they will need to demonstrate it. They can start by presenting robust Industry Codes which actually prioritise children's safety over the interests of the porn industry and men who want to prey on children.
Click here to read our full submission.
See also
Submission to Social Media and Online Safety inquiry
Twitter + Instagram remove child exploitation material after we exposed them
Children over profit: Big tech needs to protect children from predators, porn - Our Submission on Online Safety (Basic Online Safety Expectations - BOSE)