DeepNude, the app that has been described as disturbing, has now been taken offline.

The app faced backlash when users realized that it used artificial intelligence to produce fake nude images of women.

In a series of tweets, the anonymous creator, who goes by the name ‘Alberto’, explained that the decision to take the app down stemmed from its potential to be abused.


‘Despite the safety measures adopted (watermarks) if 500,000 people use it, the probability that people will misuse it is too high,’ read the tweet. 

The creators say the app will no longer be offered, free or otherwise, that they will not release any other version, and that they will not grant its usage to others.

The app worked by digitally removing all clothing from any uploaded image of a woman, sparking fears that it could be used to blackmail unsuspecting victims with threats of fake revenge porn.

Shortly after the app was highlighted by the media, following initial reporting by Vice, it was taken offline.

The creators claimed that they couldn’t handle the volume of interest, saying they needed ‘to fix some bugs’ but would be back online shortly.


While a free version of the app covered the output images with a large watermark, a paid version, which cost $50, removed the watermark but added a stamp saying ‘FAKE’ in the upper-left corner.

Vice tested the app on dozens of photos and got the most convincing results on high-resolution images from Sports Illustrated issues.

Supermodel Tyra Banks’ iconic Sports Illustrated snap is one of the images used as an example by the publication.


Other images depict fake ‘nudes’ of Kim Kardashian, singer Taylor Swift and Wonder Woman star Gal Gadot.

The app is not capable of producing nude images of men; when Vice attempted it, the software created a picture of a man with a vulva.

DeepNude’s website describes the app as ‘an Offline automated software that transforms photos, creating fake nude’.

The anonymous creator said that the software is based on pix2pix, an open-source algorithm developed by University of California, Berkeley researchers in 2017. 


Pix2pix uses generative adversarial networks (GANs), which work by training an algorithm on a huge data set of images.

In the case of DeepNude, the programmer said, that data set was more than 10,000 nude photos of women. The system then tries to improve against itself: one network generates images while a second judges whether they look real, and each round of this competition makes the fakes more convincing.
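To make the ‘improving against itself’ idea concrete, here is a minimal, illustrative sketch of a GAN training loop in PyTorch. The layer sizes, noise dimension, learning rates and the random stand-in ‘photos’ are assumptions chosen for brevity, not DeepNude’s actual model or data; pix2pix itself is a conditional, image-to-image GAN built on convolutional networks and paired training data, so this toy version only demonstrates the core generator-versus-discriminator dynamic.

```python
# Minimal GAN sketch (illustrative only; sizes and data are placeholders).
import torch
import torch.nn as nn

IMG_DIM = 64 * 64    # flattened toy image size (assumption)
NOISE_DIM = 100      # latent noise vector size (assumption)
BATCH = 32

# Generator: maps random noise to a fake "image" in [-1, 1].
G = nn.Sequential(
    nn.Linear(NOISE_DIM, 256), nn.ReLU(),
    nn.Linear(256, IMG_DIM), nn.Tanh(),
)

# Discriminator: scores how "real" an image looks (raw logit).
D = nn.Sequential(
    nn.Linear(IMG_DIM, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1),
)

opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
loss_fn = nn.BCEWithLogitsLoss()

for step in range(1000):
    # Stand-in for a batch of real training photos, scaled to [-1, 1].
    real = torch.rand(BATCH, IMG_DIM) * 2 - 1
    fake = G(torch.randn(BATCH, NOISE_DIM))

    # 1) Train the discriminator to tell real images from generated ones.
    opt_d.zero_grad()
    d_loss = (loss_fn(D(real), torch.ones(BATCH, 1))
              + loss_fn(D(fake.detach()), torch.zeros(BATCH, 1)))
    d_loss.backward()
    opt_d.step()

    # 2) Train the generator to fool the discriminator -- the
    #    "improving against itself" step described above.
    opt_g.zero_grad()
    g_loss = loss_fn(D(fake), torch.ones(BATCH, 1))
    g_loss.backward()
    opt_g.step()
```

Over many such rounds the generator’s output drifts toward images the discriminator can no longer distinguish from the training set, which is how systems in this family produce convincing fakes.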


A similar algorithm is used in deepfake videos, which are made to trick viewers into thinking someone did something they didn’t, and in the deep-learning systems that power self-driving cars.

Alberto told Vice that he was inspired by X-ray specs: ‘Like everyone, I was fascinated by the idea that they could really exist and this memory remained.’


The fake images are easy to mistake for the real thing, and could cause untold damage to individuals’ lives. 

Cropping out the ‘FAKE’ stamp or removing it with Photoshop would be very easy.

Motherboard tested it on images of women and men in varying states of dress, from fully clothed to string bikinis, and with a variety of skin tones.

When fed a well-lit, high-resolution image of a woman in a bikini facing the camera directly, the app produced fake nudes that were passably realistic.


But it’s not flawless. Most images, and low-resolution images especially, produced some visual artifacts.  

As with revenge porn, these images can be used to shame, harass, intimidate, and silence women. 

There are forums where men can pay experts to create deepfakes of co-workers, friends, or family members, but tools like DeepNude make it easy to create such images in private, and at the touch of a button.

-Daily Mail