Collective Shout stands with the Alannah & Madeline Foundation in calling for stronger protections for children from online pornography.
Last week, the Australian Government responded to the eSafety Commissioner’s report on ways to keep online pornography out of the hands of children. It’s clear now that technical ‘age assurance’ solutions will not be mandated in Australia any time soon. Parents and child safety advocates are left wondering: what’s next?
The Alannah & Madeline Foundation holds deep concerns about the risks children face when exposed to pornography and other distressing, age-inappropriate content online. Online pornography is recognised as a threat to children in General Comment 25 of the United Nations Convention on the Rights of the Child ('On children's rights in relation to the digital environment') and in Australia’s National Plan to End Violence Against Women and Children.
We believe the decision-makers who design, run and regulate the digital world should treat the best interests of the child as a primary consideration. Adults should put children’s healthy, positive development first and work to ensure children can enjoy all their rights. Children’s best interests are more important than the commercial priorities of industry.
As such, we welcomed the Australian Government’s interest in creating a Children’s Online Privacy Code, as well as the Government’s investment in respectful relationships and online safety education and their undertaking to review the Online Safety Act with a view to improving protections for children.
We also welcomed the news that the eSafety Commissioner had been invited to create a ‘roadmap’ to age verification for online pornography.
eSafety found three-quarters of Australian teens aged 16-18 years had seen pornography. Of those who had seen it, a third saw it for the first time when they were under 13. Exposure occurred on specific pornographic websites (70%), in social media feeds (35%), in social media ads (28%) and in social media messages (22%).
These findings indicate that making dedicated pornographic sites inaccessible to children is a must in reducing exposure. More work is also needed to make social media platforms safer by design for under-18s.
eSafety recommended the Australian Government fund a pilot to trial the best technical solutions before seeking to mandate age assurance technology. Technical solutions can include steps like verifying formal ID or facial scanning. While the technology is new and emerging, the Foundation believes a funded trial would have been an appropriate next step, and we welcomed several of the priorities identified by eSafety, such as consulting with young people and parents.
The Foundation feels strongly that interventions led by industry (such as the codes) are not enough to ensure children’s rights are upheld.
The first set of industry codes developed under the Online Safety Act, which focused on serious, illegal content such as child sexual abuse material, took well over a year to create and register. They will not come into effect until December this year. eSafety had to demand redrafts and take over development of two codes due to persistent, unresolved concerns about child safety. Even then, these redrafted codes did not go far enough to match international standards of good practice – for example, they did not require high default privacy settings for 16-17 year olds. This is unacceptable: the first set of codes sets the standard that subsequent codes will follow – and we are knowingly accepting lower standards for child safety than comparable parts of the world.
Given this lengthy process, we hold serious concerns that the second round of code development will not deliver better or faster results, especially since pornography – however risky and distressing for children – is legal content in Australia.
The digital world should be designed and run in ways which minimise risks to children. Children and parents should not be held solely responsible for avoiding risks which are ‘baked into’ digital platforms.
In particular, we know that children who are very vulnerable in the offline world are much more likely to take risks and have upsetting experiences online. These children – who include children in out-of-home care and children with mental health problems or disability – are less likely to get the support they need from parents and teachers. Built-in solutions like age assurance are especially important for these children, who may have few reliable protections in place.
We trust the reviews of Australia’s Privacy Act and Online Safety Act will lead to fresh opportunities to strengthen protections for children’s rights online.
We want to see the Online Safety Act amended to ensure that the creation of codes to uphold children’s safety online is led by a trusted, expert public regulator, not by industry itself.
The Alannah & Madeline Foundation will continue to uphold the rights of children and young people to be safe. Only when children’s rights are taken seriously is their best future possible.