For many years Collective Shout has been observing and critiquing developments in the production and online distribution of Child Sexual Abuse Material (CSAM). We have a long history of documenting harms facilitated and enabled by online platforms and calling for their regulation. We thank eSafety for taking this next step to compel the online industry to address these harms. Our submission pertains only to CSAM rather than pro-terror material, as sexual exploitation is our area of expertise.
In our Submission on Draft Consolidated Industry Codes of Practice for the Online Industry (Class 1A and Class 1B Material), we called for major changes to close loopholes that would have allowed CSAM to continue to proliferate on digital services.
We support the requirements outlined in the draft standards for both Designated Internet Services (DIS) and Relevant Electronic Services (RES). We strongly support the inclusion of closed communication and encrypted services in minimum compliance measures. We are pleased to see a strong start made on regulating generative AI.
The proposed additional requirements placed on the industry are proportionate to the extreme harms caused by the production, proliferation, and consumption of CSAM. These requirements constitute new opportunities for the digital sector, allowing the development of a safer and more inclusive internet.
We are hopeful that the framework of investing in research and development of new technologies, combined with the requirement to adopt appropriate technologies as they become available, will result in a growing capacity for the online industry to address CSAM on and through its platforms.
We appreciate the opportunity to comment again on this important legislative development and propose further recommendations.
Summary of Recommendations
- More specific guidance should be given in the technical feasibility exception.
- A service should be required to establish and implement investment and development programs if it meets the monthly user threshold and/or a specified annual turnover threshold.
- Standards for RES and DIS must require a mandatory time limit within which class 1A and 1B material must be removed and further risks effectively addressed.
- The Standard should include the provision of support to personnel who are required to deal with CSAM.
- Providers must be required to share detection models and information about known bad actors.
Call for input on the use of technology in facilitating and preventing contemporary forms of slavery
In our submission to the United Nations Special Rapporteur on contemporary forms of slavery, including its causes and its consequences (April 2022), we presented evidence of modern technology being used to recruit and subject people to contemporary forms of slavery in Australia.
Porn + prostitution harms to women and children exacerbated by pandemic, war
In our December 2022 submission to the Joint Standing Committee on Foreign Affairs, Defence and Trade inquiry, we highlighted the following issues:
- sexual violence against women and children
- the role of the sex industry in normalising and perpetuating this violence
- the particular impact on migrant women within the sex industry in Australia
- pornography as a driver of violence against women