
We have tested ClothOff, the artificial intelligence that undresses women. It is as easy to use as it is worrying


Although artificial intelligence brings many advantages to our daily lives, it also has uses that should alarm us. The clearest recent example comes from a town in Extremadura, where minors between 12 and 14 years old took photographs of girls from their town and "undressed" them with an AI to share among themselves.

This is a serious problem, and we have been able to test an artificial intelligence that allows exactly this. The result left us stunned because, unlike other tools such as DeepNude, there is no message stating that the image is a fake nude. It is simply generated from the uploaded photograph and can be downloaded without any problem, ready to be shared.

AIs that undress people are a real business

As we said, we have been able to test one of the most famous artificial intelligences for undressing people. The result is quite convincing: skin tones are preserved and the girl appears completely without clothes (because yes, these services are almost always advertised as AI that only undresses women). We talk only about women because this AI does not undress men (as also happens with other tools); in that case the result is very poor and not at all realistic.

Image created with Stable Diffusion to produce the demo nude.

The result is therefore quite realistic, and one could believe the girl was actually posing without any clothing. The only notice that the image is a fake is a watermark at the bottom showing the URL of the website used to create these montages. But obviously that can be cropped out very easily.

The requirements a photograph must meet in order to be completely "stripped" are very easy to satisfy. The most important is that the subject should not be wearing a heavy coat; the clothing should be fairly light. In addition, only one woman can appear in the photograph, since with several people it will not work correctly. Finally, good lighting and sharp focus are essential for the AI to detect the model and undress her.

But obviously this AI is not open to everyone, and the promise of undressing anyone is big business for its developers. To use it, you have to buy virtual coins that are exchanged for each nude, although the first one is free so you can see how the tool works and what it is capable of.

For example, four attempts to obtain a full nude cost two dollars, and there are various offers to buy more coins at a lower price, which is logically a big business for the developers. Coins can also be earned for free by sharing the tool with friends. What is truly clear is that proper regulation is needed to prevent this kind of abuse, which in the end is criminal.

ClothOff 'washes its hands' of the use of these images

If we look at the terms and conditions of this artificial intelligence tool, what stands out above all is that 'they wash their hands'. They take no responsibility for how the generated images are used (for example, to extort someone). The user "is responsible for not using the application for illegal or unauthorized purposes," the website specifically states, making clear that it wants no legal liability for what is done with its technology.

Another important point is how all the uploaded or generated images are processed. In many parts of the page, before content is generated, there are messages reminding users that no data of any kind is stored. This would imply that once the input image has been used by the AI, it is deleted from the service's database.

But as we always say: it is best to distrust everything. We do not recommend uploading personal or real images, since you can never know how they are really handled.

Using these images can cost you dearly… if they are shared

As mentioned previously, these tools can be used precisely to extort a person by creating a fake nude (which can look very real). We have the example of the minors in a town in Extremadura who allegedly used similar tools for this purpose. And that constitutes a crime.

Specifically, Spanish legislation provides for up to two years in prison for these types of acts, as a professor and lawyer specializing in Digital Law explains. What truly matters in the eyes of a judge is what is done with those photographs, not how they are made.

In Spain, prison sentences can reach nine years in cases involving child pornography, but that sentence does not necessarily apply here: it all depends on the age of those who commit the offense, and when they are minors the penalty imposed is much milder.