Submission to eSafety Industry Standards

For many years, Collective Shout has observed and critiqued developments in the production and online distribution of Child Sexual Abuse Material (CSAM). We have a long history of documenting harms facilitated and enabled by online platforms and of calling for their regulation. We thank eSafety for taking this next step to compel the online industry to address these harms. Our submission pertains only to CSAM rather than pro-terror material, as sexual exploitation is our area of expertise.

In our Submission on Draft Consolidated Industry Codes of Practice for the Online Industry (Class 1A and Class 1B Material), we called for major changes to close loopholes that would have allowed CSAM to continue proliferating on digital services.

We support the requirements outlined in the draft standards for both Designated Internet Services (DIS) and Relevant Electronic Services (RES). We strongly support the inclusion of closed communication and encrypted services in minimum compliance measures. We are pleased to see a strong start made on regulating generative AI.

The proposed additional requirements placed on the industry are proportionate to the extreme harms caused by the production, proliferation, and consumption of CSAM. They also create new opportunities for the digital sector to build a safer and more inclusive internet.

We are hopeful that requiring investment in the research and development of new technologies, combined with the requirement to adopt appropriate technologies as they become available, will grow the online industry's capacity to address CSAM on and through its platforms.

We appreciate the opportunity to comment again on this important legislative development and propose further recommendations.

Summary of Recommendations

  1. More specific guidance should be given on the technical feasibility exception.
  2. A service should be required to establish and implement investment and development programs if they meet the monthly user threshold and/or a specified annual turnover threshold.
  3. Standards for RES and DIS must set a mandatory time limit within which class 1A and 1B material is removed and further risks are effectively addressed.
  4. The Standard should include the provision of support to personnel who are required to deal with CSAM.
  5. Providers must be required to share detection models and information about known bad actors.


Click here to read the full submission
