Collective Shout welcomes the opportunity to contribute to Online Safety legislative reform. We support intentions to consolidate and harmonise current laws and to ensure streamlining and consistency in a range of digital offences. We are especially pleased to see plans for an expansion of protection against cyberbullying, cyber abuse, image-based abuse and seriously harmful content. As the digital landscape is in a constant state of flux, new opportunities arise – and with them new dangers. This necessitates updated legislation to ensure a safer online environment prioritising human rights and community welfare.
Our recommendations relate primarily to our work combatting the objectification of women and the sexualisation of girls, the damaging impacts of pornography on the developing sexual templates of young people, the sexual grooming and exploitation of minors, the sexual abuse of children – especially for Live Distant Child Abuse – and cyber abuse, including of women (with which our team is unfortunately all too familiar; see for example Roper, 2014).
Summary of Recommendations
Collective Shout supports the proposed regulatory policy changes.
However, based on our grassroots activism and collection of evidence, we believe much more needs to be done to protect minors on digital platforms. We additionally recommend that:
● The BOSE should adopt the recommendations from the #wakeupinstagram campaign in which we are a global partner:
○ Strengthened requirements that apps and services protect all minors from being direct messaged by adults.
○ Algorithms, overseen by human moderators, should be included to proactively remove sexualising or sexually graphic comments on minors’ images and posts.
○ Privacy settings should be much more visible in order to increase awareness of safety tools.
○ Digital services should automatically provide children with maximum data protection whenever they download a new app, game or visit a website, as proposed in the UK. Privacy settings should be set to high by default. Nudge techniques should not be used to encourage children to weaken their settings. Location settings should also be switched off by default. Data collection and sharing should be minimised and profiling that can allow children to be served up targeted content should be switched off by default (Information Commissioner’s Office 2020).
○ When an account is made private, remove the ability for strangers to send unsolicited direct messages to that account. Remove the ability for that person’s account to be visible in Likes or Comments on other posts.
○ Include links in safety sections to define sexual harassment, and how to get help.
○ Revise ‘Community Standards’ so that all sexualised, predatory and grooming-style comments (text, slang, short-hand, hashtags, emojis and graphics) qualify as violations.
○ Add ‘sexualised/predatory/grooming comment directed at a minor’ as a category for reporting violations of community guidelines, and address these reports as a priority.
○ Prohibit adults from using ‘live’ posts to contact minors.
○ Update systems used to detect and remove sexualised, predatory comments.
○ Recognising that social media serves as a supply source of images of children for web-based pedophile forums, update all relevant policies, guidelines and help documents (including ‘A Parent’s Guide to Instagram’) so that users are properly informed of the risks of sharing images of children to the platforms.
○ Stop the ‘explore’ feature from promoting minors’ pages and connecting predators with children.
○ Investigate parasite pages that are exclusively devoted to republishing photos of minors, deleting pages where children are sexualised, harassed, groomed or where any type of predatory comments/behaviour is displayed.
○ Prohibit the republishing of images of minors on pages that also feature porn-style images of adults.
● Sexual grooming and exploitation should be a distinct category of online harm.
● The BOSE should use an age-specific approach to internet safety, recognising that minors have unique needs and vulnerabilities.
● Children’s rights should be understood in the context of the Convention on the Rights of the Child, with special reference to Article 34.
● Privacy and problems caused by encryption should be included in the scope of this legislative reform.
● Transparency reporting should include the requirement to report on the qualitative nature of complaints and responses, rather than merely the quantity of complaints and takedowns.
● For companies which fail to meet the BOSE, we recommend additional sanctions. For companies such as PornHub, which repeatedly and knowingly allow seriously harmful content and image-based sexual abuse on their platforms, we recommend that Australian ISPs be required, subject to penalties, to block all access.
● A clear definition of “menacing, harassing or offensive” material should be provided for this legislation to move forward, with an expansion of the definition to include the terms ‘harmful’ and ‘exploitative’.
● Greater human oversight of algorithms, so that automated moderation is monitored by human reviewers and final decisions on takedowns or blocks are made by humans. Human review would help to interpret suggestive comments and emojis that algorithms currently fail to detect.
● The eSafety Commissioner should have an expanded function of conducting or commissioning independent research into the prevalence and causes of technology-related impacts.
● The BOSE should encourage the development and provision of more stimulating, high-quality online content for young people.