WIN! Gov to legislate against non-consensual deepfake p*rn
Since 2019, we've been highlighting the harms of deepfake and nudifying apps and calling for their removal.
Last week the Government finally announced a crackdown on the creation of deepfake p*rnography.
News.com reported: "That included legislation to outlaw the creation and non-consensual distribution of deepfake – a portmanteau of ‘deep learning’ and ‘fake’ – porn, and the sharing of sexually-explicit material using technology such as artificial intelligence."
From the PM's announcement:
The Albanese Government will introduce legislation to ban the creation and non-consensual distribution of deepfake p*rnography. Digitally created and altered sexually explicit material is a damaging form of abuse against women and girls that can inflict deep harm on victims. The reforms will make clear that creating and sharing sexually explicit material without consent, using technology like artificial intelligence, will be subject to serious criminal penalties.
Read more"No way of knowing if the child is fictional": How ‘virtual’ child sexual abuse material harms children
Child sexual abuse material (CSAM) refers to images and videos where children are sexually abused and exploited. It is the photographic or video record of the rape, sexual abuse and torture of children and infants.
Read more"Undress any girl you want": The explosion of Deepnude apps puts all women in danger
"Increasing prominence and accessibility of these services will very likely lead to further instances of online harm, such as the creation and dissemination of non-consensual nude images, targeted harassment campaigns, sextortion, and the generation of child sexual abuse material."
New research has found Deepnude apps are exploding in popularity, with undressing websites attracting 24 million unique visitors in September 2023.
Deepnude apps allow users to virtually "undress" a woman, using AI to manipulate an image or video to remove her clothing. Many of these apps only work on women.
Read more

The mainstreaming of child exploitation material on Instagram
Why Facebook must abandon plans for 'Instagram for kids'
Collective Shout with James Evans*
**Content warning**
How would you feel if you found out your neighbour had 10,000 images of underage girls wearing bikinis on his hard drive? What if I told you that Instagram hosts collections just like this, serving no other purpose than the sexual entertainment of men?
Read more