Children should not be required to build capacity to keep themselves safe: Committee recommendations draw from our evidence
Collective Shout welcomes the final report of the Joint Select Committee on Social Media and Australian Society, which examines the influence of social media on users’ health and wellbeing. Collective Shout’s submission to the inquiry, and the evidence given by Movement Director Melinda Tankard Reist at a public hearing, were cited in the final report, titled Social media: the good, the bad, and the ugly.
In its report, the Committee agreed with our view that:
Children should not be required to build capacity to keep themselves safe on platforms that are inherently dangerous, and for which children lack the developmental capacity to navigate safely.
The report includes 12 recommendations that address the need for immediate action, and the need for a sustained digital reform agenda. The report puts “Big Tech on notice” with Committee Chair, Ms Sharon Claydon MP, stating “social media companies are not immune from the need to have a social licence to operate.”
Collective Shout supports the Committee’s recommendations.
We had also called on the Committee to recommend raising the age of social media access to 16 and to hold social media companies responsible for age verification.
While we believe the Committee could have made a stronger recommendation around age verification, we are pleased it acknowledged that the Federal Government had committed to an age assurance trial.
Our comments around age verification were cited in the report, with the Committee highlighting our observation that “we cannot allow large-scale reform to be scuttled by disagreements about the technical aspects.” The Committee also cited our view that “while it may not be absolute in its effectiveness, it is better than no action at all to prevent young people's exposure to the worst of the online environment”:
We believe that if we implement age assurance technologies, and if we can delay access to these social media platforms, you'll have fewer children being exposed to porn, to predators, to harmful online content, to bullying…
Even if it's not foolproof, it's still better to go from zero to maybe 70 per cent or 85 per cent because right now there's nothing. There are no protections at all, and we're seeing the results of that in the declining mental health of our young people, especially girls—rising anxiety, self-harm, suicidal ideation et cetera.
In our submission, we recommended the introduction of an overarching statutory Duty of Care for all digital services, including social media. In response, the Committee recommends “that the Australian Government introduce a single and overarching statutory duty of care onto digital platforms for the wellbeing of their Australian users.”
A common theme of the inquiry was that complaints or concerns raised with social media companies often go nowhere, or are dismissed by the platform with no further avenue available to users. We were cited as highlighting our frequent inability to get a response from the platforms when concerns are raised or complaints made:
We are constantly frustrated by the lack of action by the platforms. Instagram rarely responds […] We track the activities of online predators and are constantly reporting to Instagram, with very little action.
In response, the Committee recommended “that the Australian Government introduce legislative provisions requiring social media platforms to have a transparent complaints mechanism.”
Collective Shout was cited as describing “social media business models as 'inherently unsafe', with platforms designed to be highly addictive and to maximise the time spent engaging with them.” In response, the Committee recommended “that industry be required to incorporate safety by design principles in all current and future platform technology.”
In our submission, we supported the recommendation of Professor Selena Bartlett that social media platforms should “be tailored … strictly prohibiting addictive features, advertisements, and multi-layer marketing tactics, with regulated age-appropriate content and strictly preventing access by criminals, predators, and advertisers.”
In response, the Committee recommended “that the Australian Government, as part of its regulatory framework, ensures that social media platforms introduce measures that allow users greater control over what user-generated content and paid content they see by having the ability to alter, reset, or turn off their personal algorithms and recommender systems.”
This is a step in the right direction to fight the advanced algorithms and targeted messages that are being sent to children and young people. However, it does not address one of the issues we highlighted in our submission: “Minors are served up to predators through Instagram’s algorithms which curate sexual content for them.”
It also fails to address the impacts of sextortion that Melinda Tankard Reist highlighted to the Committee in her evidence before the public hearing: “to date, five Australian boys have ended their lives due to being tricked by sextortion scammers… blackmailing them after an exchange of nudes” and “almost all” of the sextortion scams targeting minors were enabled by Instagram. More needs to be done to prevent criminals and predators from accessing vulnerable children and young people on social media.
Finally, Collective Shout welcomes the recommendations that “the Australian Government introduce legislative provisions to enable effective, mandatory data access for independent researchers…” and “the Australian Government support research and data gathering regarding the impact of social media on health and wellbeing to build upon the evidence base for policy development.”
We look forward to conducting further research in this area to provide an evidence base for policy development to confront the range of harms that are enabled by Big Tech companies through their social media platforms.
Collective Shout will continue to demand accountability, transparency, safety and risk mitigation in alignment with duty of care obligations.
See also:
Age Matters: The Case for Raising the Social Media Age Limit
Hold social media platforms to account: MTR addresses Fed inquiry
The Social Media Summit proved that experts need to do better