Apps Using AI to Undress Women Gain Popularity, Sparking Concerns

Apps and websites that use artificial intelligence (AI) to "undress" women in photos are surging in popularity, raising alarm among researchers.


Marketing on Social Networks

According to the social network analysis company Graphika, traffic to undressing websites has risen sharply: in September alone, 24 million people visited these platforms.

The researchers at Graphika noted that many of these undressing services, also known as 'nudify' apps, use social networks like X and Reddit for marketing purposes. They observed a drastic rise in the number of links advertising such apps on social media, with an increase of more than 2,400% since the beginning of the year.

These services use AI to alter and recreate images so that the person depicted appears nude. Most of them target women specifically.

Concerns about Non-Consensual Pornography

The emergence of these apps highlights a troubling trend in non-consensual pornography, particularly deepfake pornography, enabled by advances in AI. Deepfake pornography is fabricated media built from real images, often taken from social media without the subject's consent or knowledge.

Legal and ethical challenges arise due to the unauthorized distribution of these images. The ease with which deepfake software can be accessed and used poses a significant threat to privacy and personal security.

The availability of open-source diffusion models, which can generate highly realistic fake images, has fueled the proliferation of such apps. Many of these apps are free to use, making them even more accessible.

Legal Concerns and Inadequate Enforcement

Advertisements for undressing apps often use language that encourages harassment and the non-consensual sharing of images. Some apps have even run sponsored content on platforms like Google's YouTube, further normalizing their presence.

Despite the harmful impact of deepfake pornography, there is currently no federal law in the United States explicitly banning its creation. Law enforcement agencies often struggle to address these cases, leaving victims with limited avenues for legal action.

Privacy experts stress the urgency of implementing stronger regulations and enforcement mechanisms to combat the misuse of AI technology in the creation and dissemination of non-consensual pornography.