Our evidence and recommendations to the Standing Committee on Social Issues, Parliament of New South Wales
In our submission we highlighted:
- Harms of pornography including impacts of early exposure
- The unavoidability of porn exposure for children and young people on social media and gaming platforms
- Links to mental health issues and negative body image
- Shortcomings of consent education which excludes discussions about harms of pornography
- "Deepfake" AI-generated image based sexual abuse (IBSA)
- The mainstreaming of violent pornography and links to attitudes and behaviours which put women and girls at risk of real-world male violence
- Teachers at risk of porn-inspired and porn-fuelled sexual harassment by male students
- Vulnerable groups at risk of porn-related harm
- The urgent need for age verification to block the porn industry's access to children
Recommendations
- Acknowledge that exposure to pornography is harmful to children and young people
- Acknowledge that exposure to pornography fuels attitudes contributing to male violence against women
- Support the Federal Government age assurance trial and not allow it to be derailed by the vested interests of the pornography industry
- Recommend the Federal Government extend any future age assurance system to apply to online gaming services, such as Roblox. Gaming service providers should be responsible for ensuring all individuals using their services are over the age of 16
- Use the terms ‘Image-Based Sexual Abuse’ and/or ‘Deepfake Sexual Abuse’ to centre the harms done to victims
- Urge the State Government to use the full extent of its power to legislate to criminalise the creation, production, soliciting and distribution of AI-enabled Image-Based Sexual Abuse [IBSA]/deepfake sexual abuse. Creation of such material (whether distributed or not) should be a stand-alone offence with its own criminal penalty.
- Consider IBSA as part of the broader offence of ‘intimate intrusion’ proposed by Clare McGlynn
- Urge other States to develop uniform laws for dealing with Image-Based Sexual Abuse/deepfake sexual abuse, including the measures above
- Provide funding for the development of a dedicated website in line with the recommendations of Dr. Gemma McKibbin and Georgia Naldrett, with resources and information about:
- Sex, pornography and harmful sexual behaviour
- Laws and social rules about sex
- CSAM and other illegal online behaviour
- ‘I Need Help Now’ resources
- Stories from children, survivors, victims
- Provide funding for the development of an online helpline and chat support, with experienced practitioners, to help children and young people
- Call on United Nations State parties to:
- Hold company directors and executives of social media platforms personally liable for harms occurring on their platforms, including trafficking, child sexual exploitation, sextortion, and exposure of children to pornography
- Urgently address the rapidly growing threat of sextortion on social media platforms and private messaging services
- Introduce a statutory Duty of Care for all digital services
- Require sites that host pornography to implement effective privacy-preserving age verification to reduce risks to children
- Address ongoing gaps in social media safety, and platforms that do not use safety by design principles
- Support global efforts, such as the Justice Defense Fund, to criminalise porn hosting platforms which have knowingly profited from rape, sex trafficking, child sexual abuse, violence, and degradation
Read our full submission for further detail.