Don’t leave kids to defend themselves: Social Media Report cites our evidence
Children should not be required to build capacity to keep themselves safe: Committee recommendations draw from our evidence
Collective Shout welcomes the final report of the Joint Select Committee on Social Media and Australian Society, which examines the influence of social media on users’ health and wellbeing. Collective Shout’s submission to the inquiry, and Movement Director Melinda Tankard Reist’s evidence before a public hearing, were cited in the final report, titled Social media: the good, the bad, and the ugly.
Submission to the Review of the Commercial Television Industry Code of Practice
A Position Statement on behalf of Collective Shout, Children and Media Australia, and eight other individuals and organisations.
Criminal Code Amendment (Deepfake Sexual Material) Bill 2024 Inquiry
Criminal Code Amendment (Deepfake Sexual Material) Bill 2024
Legal and Constitutional Affairs Committee
The case for criminalising ‘creation’ alone, not just ‘distribution’
Input for UN Human Rights Council SR VAWG's report on violence against women and prostitution
We are grateful for the opportunity to provide input into the Special Rapporteur’s report on violence against women and prostitution.
Our recommendations re sexualised depictions of children heard: Gov responds to Classifications Review
In 2020, after the discovery of illegal child sexual abuse material being sold in Australian stores with an unrestricted age rating, we called for an overhaul of the Classification system.
Submission on Modernising Australia’s National Classification Scheme – Stage 2 Reforms
15 recommendations to help protect kids and inform parents and carers.
Amendment to the Online Safety (Basic Online Safety Expectations) Determination 2023: Our submission
We welcome this opportunity to comment again on the BOSE Determination, made under section 45 of the Online Safety Act 2021. The proposed amendments are a significant improvement. We especially welcome stronger requirements relating to generative AI, transparency, terms of use, and accountability.
Submission to eSafety Industry Standards
For many years Collective Shout has been observing and critiquing developments in the production and online distribution of Child Sexual Abuse Material (CSAM). We have a long history of documenting harms facilitated and enabled by online platforms and calling for their regulation. We thank eSafety for taking this next step to compel the online industry to address these harms. Our submission pertains only to CSAM rather than pro-terror material, as sexual exploitation is our area of expertise.
In our Submission on Draft Consolidated Industry Codes of Practice for the Online Industry (Class 1A and Class 1B Material), we called for major changes to close loopholes that would have allowed CSAM to continue to proliferate on digital services.
We support the requirements outlined in the draft standards for both Designated Internet Services (DIS) and Relevant Electronic Services (RES). We strongly support the inclusion of closed communication and encrypted services in minimum compliance measures. We are pleased to see a strong start made on regulating generative AI.
The proposed additional requirements placed on the industry are proportionate to the extreme harms caused by the production, proliferation, and consumption of CSAM. These requirements constitute new opportunities for the digital sector, allowing the development of a safer and more inclusive internet.
We are hopeful that the framework of investing in research and development of new technologies, combined with the requirement to adopt whatever appropriate technologies are available, will result in a growing capacity for the online industry to address CSAM on and through its platforms.
We appreciate the opportunity to comment again on this important legislative development and propose further recommendations.
Summary of Recommendations
- More specific guidance should be given in the technical feasibility exception.
- A service should be required to establish and implement investment and development programs if it meets the monthly user threshold and/or a specified annual turnover threshold.
- Standards for RES and DIS must mandate a time limit within which class 1A and 1B material is removed and further risks are effectively addressed.
- The Standard should include the provision of support to personnel who are required to deal with CSAM.
- Providers must be required to share detection models and information about known bad actors.