The app's creator pulled the software citing ethical concerns
You can help us fight things like this - sign up to Collective Shout: for a world free of sexploitation, here.
Triple J reported that the app, called 'Deepnude', which could render images of clothed women - but not men - into fake nude images, had been available for several months. After an article appeared on the tech website Motherboard, a surge of interest caused the developer's site to crash. This led to the developer pulling the app, citing the potential for abuse.
This is an odd statement to make given that the abuse - which is also a crime - always occurs when the app is used exactly as intended.
"We created this project for user's [sic] entertainment a few months ago," the app's Twitter account tweeted on Friday morning.
"We thought we were selling a few sales every month in a controlled manner. We never never thought it would become viral."
"If 500,000 people use it, the probability that people mis-use it will be high."
The fact that this app was created in the first place is indicative of an indifference to women's privacy and autonomy. That so many visitors to the site caused it to crash shows that in a porn-saturated culture, where thousands of images of women are available online, there is still huge market demand for images of non-consenting women. It doesn't matter if a woman isn't willing to pose nude; anyone using this app could virtually force her to appear as though she has.
As reported by Triple J, using this app to create fake nude images amounts to image-based abuse. This is a crime.
Placing a person's face onto a nude body and then showing this to others is image-based abuse. Since May last year, it's been illegal in New South Wales - punishable by up to three years' jail and an $11,000 fine. South Australia, the ACT and WA have followed suit, while in Victoria it could come under the criminal offence of image-based sexual abuse.
Many jurisdictions also have laws that ban the use of a carrier service to menace or harass, which can be used to prosecute cases of revenge porn. In August last year, the Commonwealth passed federal revenge porn legislation, introducing civil penalties of up to $105,000 for individuals and up to $525,000 for corporations if they do not remove an image when requested to by the eSafety Commissioner.
A culture of sexual objectification harms women - you can help us change it. Sign up to Collective Shout today.