Image editing software like Adobe Photoshop can alter images in almost any way imaginable, which made it a popular tool for 'undressing' women.
But chances are, most people aren't skilled at image manipulation in the first place. That gap is what a programmer exploited when he created an app that uses AI to remove clothing from images of women, making them look realistically nude.
It was called DeepNude.
When deepfakes were introduced by a Reddit user and took the internet by storm in 2017, researchers, the media, and the government were all concerned about the dangers the technology could pose, from revenge porn to a tool for creating misinformation.
However, deepfakes were most widely used, and most popular, for undressing women without their consent. This fueled a trend of people maliciously spreading nonconsensual porn on the internet.
DeepNude was an evolution of deepfakes. While it only worked on images of women, the technology introduced an easy way for anyone to claim ownership over women's bodies.
As an app, DeepNude was very easy to use, and even more accessible than deepfakes had ever been.
Whereas deepfakes required real technical expertise and huge data sets, DeepNude was an easy-to-install app that could generate a fake nude image in about 30 seconds, with a single click of a button.
DeepNude was first launched as a website to show a sample of how it worked. A downloadable version of the software was made available for Windows and Linux on June 23rd.
"This is absolutely terrifying," said Katelyn Bowden, founder and CEO of revenge porn activism organization Badass. "Now anyone could find themselves a victim of revenge porn, without ever having taken a nude photo. This tech should not be available to the public."
This is an “invasion of sexual privacy,” said Danielle Citron, professor of law at the University of Maryland Carey School of Law, who had previously testified to Congress about the deepfake threat.
“Yes, it isn’t your actual vagina, but... others think that they are seeing you naked,” she said. “As a deepfake victim said to me—it felt like thousands saw her naked, she felt her body wasn’t her own anymore.”
According to the DeepNude creator, who goes by the alias Alberto, the app is based on pix2pix, an open-source algorithm that uses generative adversarial networks (GANs) to train AI on a large data set of images.
To power DeepNude, the algorithm was trained on more than 10,000 nude images of women.
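In broad strokes, a pix2pix-style system trains a generator that translates an input image into an output image, against a discriminator that judges (input, output) pairs, combining an adversarial loss with an L1 reconstruction loss. The toy sketch below illustrates that training loop in NumPy on tiny synthetic "images," with linear stand-ins for pix2pix's real U-Net generator and PatchGAN discriminator; all names, the toy target mapping, and the learning-rate settings are invented for illustration and have nothing to do with DeepNude's actual code or data.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 16  # toy "images" are flattened 4x4 grayscale vectors

def make_pair():
    """One (input, target) training pair; the target mapping here is
    just pixel reversal, chosen so a linear generator can learn it."""
    x = rng.random(N)
    return x, x[::-1]

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Linear stand-ins for pix2pix's U-Net generator and PatchGAN discriminator.
G = rng.normal(0.0, 0.1, (N, N))   # generator weights: input -> fake target
D = rng.normal(0.0, 0.1, 2 * N)    # discriminator scores an (input, output) pair

def d_score(x, y):
    return sigmoid(np.concatenate([x, y]) @ D)

lr, lam = 0.01, 1.0  # lam weights the L1 term in the combined pix2pix loss
for _ in range(2000):
    x, y = make_pair()
    fake = x @ G

    # Discriminator step: push real pairs toward 1, generated pairs toward 0.
    s_real, s_fake = d_score(x, y), d_score(x, fake)
    D += lr * ((1 - s_real) * np.concatenate([x, y])
               - s_fake * np.concatenate([x, fake]))

    # Generator step: non-saturating adversarial gradient plus L1 gradient.
    g_adv = (1 - d_score(x, fake)) * np.outer(x, D[N:])
    g_l1 = np.outer(x, np.sign(y - fake))
    G += lr * (g_adv + lam * g_l1)

# After training, the generator approximately reverses unseen inputs.
x, y = make_pair()
l1_error = np.mean(np.abs(x @ G - y))
```

The key idea the sketch preserves is the combined objective: the L1 term pulls generated outputs toward the paired ground truth, while the adversarial term rewards outputs the discriminator cannot tell from real pairs. In the real pix2pix, both networks are deep convolutional models trained on actual image pairs, which is what let DeepNude map a clothed photo to a plausible nude one.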
The algorithm only worked on images of women, Alberto said, because images of nude women are easier to find on the internet. He was hoping to create a version for men, too.
His idea for DeepNude was inspired by ads for X-ray glasses he had seen in magazines from the 1960s and 1970s, a reference echoed in the DeepNude logo: a man wearing spiral-lensed glasses.
"I'm not a voyeur, I'm a technology enthusiast,” he said, explaining that he created DeepNude for fun and curiosity.
"Continuing to improve the algorithm. Recently, also due to previous failures (other startups) and economic problems, I asked myself if I could have an economic return from this algorithm. That's why I created DeepNude."
DeepNude was only available for less than a month.
Its Twitter account announced that the app was dead and that no other versions would be released. Access was also cut off for anyone who wished to use it.
"We created this project for users' entertainment months ago," said Alberto. "We thought we were selling a few sales every month in a controlled manner... We never thought it would become viral and we would not be able to control traffic."
Alberto was also struggling with questions about morality and ethical use of his product.
"Is this right? Can it hurt someone?" he said he asked himself. "I think that what you can do with DeepNude, you can do it very well with Photoshop (after a few hours of tutorial)," he said. If the technology existed, he reasoned, someone would eventually create this.
According to the statement, he ultimately decided that he didn't want to be the one responsible for this technology.
"We don't want to make money this way," the statement said. "Surely some copies of DeepNude will be shared on the web, but we don't want to be the ones to sell it.”
"The world is not yet ready for DeepNude," the statement concluded.