"I have to live with the knowledge that my abuse will never end, and that every second of every day, someone could be – almost certainly is – watching my torture and abuse."
Associate Professor Michael Salter and Clinical Psychologist Dr Elly Hanson have documented how social media users attempt to report and prevent online child sexual exploitation and the circulation of child sexual abuse material (CSAM) where ISPs, social media platforms and governments have failed to implement effective regulation.
My chapter with Elly is now published - this is the first paper identifying that some child abuse survivors are being forced to seek out and report their own abuse images online, since platforms are not obliged to proactively detect and remove CSAM. https://t.co/iTm57PN769
— Michael Salter (@mike_salter) June 1, 2021
The book chapter, “I Need You All to Understand How Pervasive This Issue Is”: User Efforts to Regulate Child Sexual Offending on Social Media, published in The Emerald International Handbook of Technology Facilitated Violence and Abuse, exposes how survivors of child sexual abuse, some of them teenagers, are seeking out and reporting videos of their own abuse.
Salter and Hanson argue that a “cyberlibertarian” ideology, which prioritises individual liberties and freedom from state interference over responsibility to others, has played a significant role in “legitimising an anti-regulation ethos within industry and government despite recognition of the likely costs to children”.
Salter and Hanson note that the responsibility for protecting children is often deflected to parents and schools:
...the public condemnations of online abuse by industry figures too often segue into calls for more parental responsibility and internet safety programs for children, which effectively devolve responsibility for child safety to parents, schools, and children.
Necessarily, these strategies are most effective at reaching the least at-risk children, that is, the children with engaged parents who are regularly attending school. Research has consistently shown that the children who are most at risk of online abuse are those who are already vulnerable offline due to disadvantage and prior experiences of abuse and neglect (Jones, Mitchell, & Finkelhor, 2013). Furthermore, a significant proportion of CSAM is, in fact, created by parents and other family members; an inconvenient fact that has been consistently sidestepped by industry and government authorities for decades despite the cumulative evidence.
The paper cites child sexual abuse survivor Avri Sapir, who has found and reported videos of her childhood sexual abuse circulating on Twitter:
I have to live with the knowledge that my abuse will never end, and that every second of every day, someone could be – almost certainly is – watching my torture and abuse. Even once I’m dead, my degradation will continue. I will never be able to escape it. This trauma is infinite.
Child porn is not a victimless crime, and it is the worst form of torture a person can be forced to endure. We must wake up to the realities of this pervasive and wide-spread abuse, and push past the discomfort of the topic, because I don’t have that privilege. We must hold accountable the people, companies, policies, laws, and cultural beliefs that allow this type of abuse to continue and thrive. We must listen to and defend survivors and put an end to the gaslighting, victim-blaming, and excuses that allow this issue to be ignored.
Salter and Hanson argue tech companies have been unwilling to prioritise child safety above profits:
...the libertarian formulation of privacy in terms of the capability of users to share content (including CSAM) without consequence is prioritised by technology companies over the human rights of CSAM survivors to privacy, dignity, and safety. By (either explicitly or implicitly) adopting the cyberlibertarian narrative, industry, governments, and individuals have effectively given themselves license to see online harms, including child abuse and exploitation, as inevitable “necessary evils” that should be tolerated or ignored given the necessity of “online freedom.”