Source Code Copies Of Deepfake App 'DeepNude' Are Not Allowed On GitHub

With computers, we can create things that are otherwise impossible.

When deepfakes were introduced by a Reddit user, people started undressing women without their consent. This created a trend of maliciously spreading nonconsensual porn on the internet.

They also started sharing fake videos of politicians and other high-profile figures using deepfakes, which in turn worried governments about the technology's use to spread misinformation.

Fake news is already plaguing the web and social media in general. And here, GitHub, the platform hosting open-source projects from developers, doesn't want to be part of this trend.

This is why the company stated that deepfakes aren't welcome on its platform. And that includes DeepNude, the app that was used to create fake nude pictures of women with just the click of a button.

The Microsoft-owned software development platform confirmed that DeepNude projects aren't welcome. According to GitHub, the code violates its rules against “sexually obscene content”.

Following this announcement, GitHub started removing multiple repositories, including one that was officially run by DeepNude’s creator.

DeepNude was originally a paid app, created by a developer who goes by the name Alberto. It allowed anyone to create nonconsensual nude pictures of women using artificial intelligence similar to the approach used by deepfakes.

While the demand was extremely high, the creator decided to shut it down after Motherboard’s report, saying that “the probability that people will misuse it is too high”.

At that time, the source code for DeepNude was still on GitHub.

Just like with any other popular software, when the official version ceased to exist, others tried to revive it. As a result, many developers started to reverse engineer the original app to create their own versions of DeepNude, and also made copies of the code available on GitHub.

While some intended to create imitations of DeepNude, others aimed to solve broader AI problems, including helping researchers and developers in fields like fashion, cinema and visual effects.

But still, these projects don't appeal to GitHub. Although its guidelines say that “non-pornographic sexual content may be a part of your project, or may be presented for educational or artistic purposes”, it won't allow deepfakes or any related projects on the platform, deeming them “pornographic” and “obscene”.

The DeepNude website, when it was still showcasing its product

For as long as humans have lived on Earth, women have been a representation of beauty. And from there, they became a source of erotica.

Nude and partly-nude women can be found etched as hieroglyphs by ancient Egyptians, painted by famous painters, and sculpted countless times, far more often than men. After humans created computers, the ways to create nudity became increasingly easier.

With tools like Adobe Photoshop, for example, people skilled in image manipulation have been able to create fake nude photos for decades. This image manipulation technique sparked the evolution of nonconsensual nudity.

With deepfakes, technology has made the process even easier. With AI, people with no knowledge or experience can also undress women.

However, creating deepfakes requires time, knowledge, expertise and huge data sets. DeepNude, on the other hand, eliminated all those burdens. In other words, DeepNude didn't actually invent the concept of fake nude photos; it streamlined it.

But the consequences are inevitable, as the presence of DeepNude would allow practically anyone to create fake nude images of women, enabling them to threaten victims.

The impact of such technology can also be devastating beyond just fake images of nude women. Governments around the world have raised alarms about deepfakes and other technologies based on them. They worry about the technology's potential political impact.

Virginia, in the U.S., has taken steps to ban the use of this technology for harassment by classifying it as a form of nonconsensual revenge porn, making it one of the first states to ban deepfakes by law.