As generative AI tools like Undress continue to encroach on internet users’ right to privacy, regulations are struggling to keep up with the innovation.
In one month, the Undress app hit over 7.6 million visits, with users spending an average of 21 minutes per session. By comparison, users of TikTok, one of the world’s most engaging social media platforms, spend an average of just over 10 minutes per session. Undress is a generative AI tool that lets users upload a photo of anyone and receive a version of that photo with the person’s clothes removed. The site also lets users enter specifications such as preferred height, skin tone and body type, producing a deepfake nude of the person in the uploaded image.
Over the past three months, Undress’s global ranking has risen from 22,464 to 5,469. On search engines, the keywords “undress app” and “undress ai” have a combined search volume of over 200,000 searches per month, showing the level of demand for the tool.
On its website, whose slogan is “Undress any girl for free”, the app’s creators state that they “take no responsibility for any of the images created using the website.” This means that victims whose photos have been stripped without consent cannot contact the site with complaints or removal requests.
Some reports also state that fraudulent loan apps gain access to victims’ phone galleries, use tools like Undress to morph their photos into nudes, and then use the resulting images to extort money from them.
According to the Economic Times, Undress is just one generative AI application in a cesspool of similar tools. Google Trends has classified such sites as ‘breakout’ searches, meaning they have seen a ‘huge increase, probably because these queries are new and had few searches before’ the boom in generative AI tools.
The bad news for victims, who have no way to prevent inappropriate use of their photos, is that these tools only keep getting better. According to one expert, the resulting images will eventually become so convincing that there is no way to distinguish them from real ones.
“Children between the ages of 11-16 are particularly vulnerable. Advanced tools can easily transform or create deepfakes with these images, leading to unintended and often harmful consequences. Once these manipulated images find their way onto various websites, it can be difficult and sometimes impossible to remove them,” said public policy expert Kanishk Gaur.
According to Jaspreet Bindra, founder of Tech Whisperer, a technology consulting firm, regulation should start with having ‘classification’ technology that differentiates between real and fake.
“The solution has to be two-pronged: technology and regulation,” he said. “We must have classifiers to identify what is real and what is not. Similarly, the government must mandate that anything AI-generated be clearly labeled as such.”
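To make the ‘classification’ idea concrete, here is a minimal sketch of a real-vs-fake image classifier, assuming images have been reduced to simple numeric features. Everything here is hypothetical: the three feature dimensions stand in for statistics (noise patterns, frequency artifacts) that real deepfake detectors learn from data, and the synthetic training set is fabricated for illustration. A production detector would use learned deep features, not hand-made vectors.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_features(n, generated):
    # Hypothetical 3-D feature vectors: AI-generated images get a
    # shifted mean to mimic detectable statistical artifacts.
    base = np.array([0.8, 0.3, 0.5]) if generated else np.array([0.2, 0.7, 0.4])
    return base + 0.1 * rng.standard_normal((n, 3))

# Synthetic training set: 200 "real" and 200 "AI-generated" examples.
X = np.vstack([make_features(200, False), make_features(200, True)])
y = np.concatenate([np.zeros(200), np.ones(200)])  # 1 = AI-generated

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Train a logistic-regression classifier with plain gradient descent.
w, b = np.zeros(3), 0.0
for _ in range(500):
    p = sigmoid(X @ w + b)
    w -= 0.5 * (X.T @ (p - y)) / len(y)
    b -= 0.5 * np.mean(p - y)

preds = (sigmoid(X @ w + b) > 0.5).astype(float)
accuracy = np.mean(preds == y)
print(f"training accuracy: {accuracy:.2f}")
```

The toy model separates the two classes easily because the synthetic features are well separated; the hard part in practice is finding features that remain discriminative as generators improve, which is why experts warn the detection problem gets harder over time.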
With debates about what a regulatory framework for AI would look like still raging, generative AI tools like Undress demonstrate the need to accelerate this process. Deepfakes have already demonstrated the damage they can do by spreading misinformation and fake news in the political sphere, and tools like Undress show that this threat is now moving from only affecting politicians, celebrities and influencers to ordinary people, especially women, who innocently upload their photos to their social media profiles.
#case #undressing #app #regulation