"Undress any girl you want": The explosion of Deepnude apps puts all women in danger
"Increasing prominence and accessibility of these services will very likely lead to further instances of online harm, such as the creation and dissemination of non-consensual nude images, targeted harassment campaigns, sextortion, and the generation of child sexual abuse material."
New research has found Deepnude apps are exploding in popularity, with undressing websites attracting 24 million unique visitors in September 2023.
Deepnude apps allow users to virtually undress a woman, using AI to manipulate an image or video so that her clothing is removed. Many of these apps work only on images of women.
Read more: Should boys be allowed to bark at girls at school every day?
During my last eight-week tour across five states before the break, more girls told me that they had been subjected to boys barking at them.
There are differing interpretations of this (anti)social behaviour: the boys may be conveying that they think the girl is hot by acting like an animal, or the opposite, intimating that she is ugly (a dog). Alternatively, the boys may think it's just a funny thing to do with their mates: "just a joke".
Read more: New Deepnude app allows men to produce nude images of women
A new way to prey on women.
Read more: Women targeted with fake porn videos
Fake-porn videos are being weaponised to harass and humiliate women: ‘Everybody is a potential target’
By Drew Harwell, as published at The Washington Post.
The video showed the woman in a pink off-the-shoulder top, sitting on a bed, smiling a convincing smile.
It was her face. But it had been seamlessly grafted, without her knowledge or consent, onto someone else’s body: a young pornography actress, just beginning to disrobe for the start of a graphic sex scene. A crowd of unknown users had been passing it around online.
She felt nauseated and mortified: What if her co-workers saw it? Her family, her friends? Would it change how they thought of her? Would they believe it was a fake?
“I feel violated — this icky kind of violation,” said the woman, who is in her 40s and spoke on the condition of anonymity because she worried that the video could hurt her marriage or career. “It’s this weird feeling, like you want to tear everything off the Internet. But you know you can’t.”
Airbrushing and Photoshop long ago opened photos to easy manipulation. Now, videos are becoming just as vulnerable to fakes that look deceptively real. Supercharged by powerful and widely available artificial-intelligence software developed by Google, these lifelike “deepfake” videos have quickly multiplied across the Internet, blurring the line between truth and lie.
But the videos have also been weaponised disproportionately against women, representing a new and degrading means of humiliation, harassment and abuse. The fakes are explicitly detailed, posted on popular porn sites and increasingly challenging to detect. And although their legality hasn’t been tested in court, experts say they may be protected by the First Amendment — even though they might also qualify as defamation, identity theft or fraud.
Disturbingly realistic fakes have been made with the faces of both celebrities and women who don’t live in the spotlight, and actress Scarlett Johansson told The Washington Post she worries that “it’s just a matter of time before any one person is targeted” by a lurid forgery.
Johansson has been superimposed into dozens of graphic sex scenes over the past year that have circulated across the Web: One video, falsely described as real “leaked” footage, has been watched on a major porn site more than 1.5 million times. She said she worries it may already be too late for women and children to protect themselves against the “virtually lawless (online) abyss”.
“Nothing can stop someone from cutting and pasting my image or anyone else’s onto a different body and making it look as eerily realistic as desired,” she said. “The fact is that trying to protect yourself from the Internet and its depravity is basically a lost cause. . . . The Internet is a vast wormhole of darkness that eats itself.”
Read the full article at The Washington Post.
It’s time to criminalise the street harassment of children and young people
I was walking … to school on my own … all of a sudden this car drove by on the main road and some guy stopped at the red light, wound down his window, stuck his head out and started whistling at me.
This is an account by an 11-year-old girl of everyday sexism and harassment – and there are countless others like it. But few children or young people know what to do if they are whistled at, beeped at or stared at when they are out and about in public.
“Street harassment” is defined by the activist group Hollaback! as unwelcome comments, gestures and incidents in public, including on public transport. For those who experience it, walking to school, going out with friends, and using the bus, tram, train or tube can become extremely stressful. Children and young people can worry about going out in public. They can even think that something is wrong with them.
What can be done?
There are some, if limited, reporting options for children and young people who experience street harassment. But it is unlikely that such an incident could be successfully prosecuted under the current relevant UK law, the Public Order Act 1986.
Image: Shutterstock