
The DeepNude app "undresses" a person in a photo. How it works, with some examples


Brief test results: the neural network runs fast, "strips" only women, and adds female sexual characteristics to men

Unknown programmers have created DeepNude, an application that "removes" clothing from photos using neural networks. This makes it possible to produce fake but fairly realistic nude pictures of women. TJ tested the algorithm on a few examples.

Updated at 14:20: the DeepNude site is experiencing problems. According to the developers, the outage is caused by a large number of visitors. The company has promised to restore service within a few days.

DeepNude was first reported on by Vice. Reporters called the application an evolution of the DeepFake idea, applied to photos rather than video. The new algorithm is also much easier to use: a fake photo can be made in literally a couple of clicks.

The application launched on June 23. For now it works only on Windows and Linux. The developer, who has a Twitter profile, has not announced any further plans. If the biography in that account is to be believed, the author of DeepNude lives in Estonia.

A TJ editor downloaded and tested the application. In the free version, a huge watermark covers most of the "stripped" photo. The paid version fixes this: for $50 the watermark is reduced to a small "Fake" mark in the corner, and for $100 the user gets a photo without any marks at all.

DeepNude's creator recommends using photos of girls whose bodies are already mostly bare, for example in a swimsuit or underwear. TJ's testing also showed that the application is sensitive to poses: it is better not to use images where the person stands half-turned or with their back to the camera.

Under "ideal" conditions the algorithm does its job well, filling in the areas hidden by clothing.

 

Sometimes the app produces noticeable glitches: for example, it fails to draw nipples, or adds an extra naked body in a random location. In some cases the neural network also makes the figure slimmer than in the original.

If you upload a photo of a man, the algorithm will still try to place a woman's body under the clothing; breasts may appear, for example, in place of the penis. Evidently, the algorithm was not trained on nude pictures of men.

Vice journalists paid $50 to test the application. They also contacted the developer: he introduced himself as Alberto and said the neural network was trained on 10 thousand pictures of naked women. According to him, an "undressing" function for men is in his plans.

Alberto said the idea was inspired by the "X-ray glasses" of his childhood. For his project he used the pix2pix algorithm, which is often used to "draw in" missing parts of objects.
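pix2pix (Isola et al., 2017) is a conditional GAN for image-to-image translation: a generator is trained to fool a discriminator while also staying close to the target image pixel by pixel. DeepNude's actual code is unpublished, so as a rough illustration only, here is a toy sketch of the pix2pix generator objective; the function name and the toy "images" are my own, not from the app:

```python
import math

def pix2pix_generator_loss(disc_score_on_fake, fake_img, target_img, lam=100.0):
    """Generator loss = adversarial term + lam * L1 reconstruction term.

    disc_score_on_fake: discriminator's probability (0..1) that the fake is real.
    lam: weight of the L1 term (100 in the original pix2pix paper).
    """
    # Non-saturating GAN loss: small when the discriminator is fooled.
    adversarial = -math.log(disc_score_on_fake + 1e-12)
    # Mean absolute pixel difference between generated and target images.
    l1 = sum(abs(f - t) for f, t in zip(fake_img, target_img)) / len(fake_img)
    return adversarial + lam * l1

# Toy "images": flattened pixel intensities in [0, 1].
target = [1.0, 1.0, 1.0, 1.0]
fake = [0.9, 0.9, 0.9, 0.9]  # generator output, slightly off everywhere
loss = pix2pix_generator_loss(0.5, fake, target)
```

The heavy L1 weight is why pix2pix outputs look plausibly "filled in" where the input is masked (here, by clothing): the generator is rewarded far more for matching the training targets than for merely fooling the discriminator.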

How to "strip" a person in a photo

  1. Go to the application's site and download it; the program only works on Windows and Linux. The site offers an online version, but it is just a demonstration of already processed images;
  2. After installation, the algorithm takes a few more minutes to download the necessary libraries, after which DeepNude is ready to use;
  3. The app has only two buttons: upload a photo and remove watermarks. After uploading an image, zoom in and center the person, if necessary;
  4. After about 30 seconds to a minute, the "stripped" picture is ready. Downloading it in full resolution is not free and requires a premium account, but you can take a screenshot.

After the Vice article, Alberto warned that photos may take longer to process due to high load. Meanwhile, human rights organizations have already taken notice of the algorithm, angered by this "invasion of a person's sexual privacy." Vice called the application "terrifying."

The publication's authors showed DeepNude to Hany Farid, a professor of computer science at Berkeley. According to the journalists, he was shocked by how easy it was to create a fake picture.

We need to learn to detect these fakes better, and scientists and researchers will have to think harder about how to protect their work from such "harmful" uses. Social networks, in turn, should seriously discuss how to change their rules to prevent the spread of such content.

Our legislators will also have to think about how to regulate "deepfakes."

Hany Farid
Professor
