Dozens of AI “nudify” apps that digitally undress women are still available on Google Play and Apple’s App Store, a new report finds, raising troubling questions about how such apps passed review, stayed live, and earned millions despite clear bans.

New Delhi: Apple and Google are facing renewed scrutiny after a new investigation revealed that dozens of AI-powered “nudify” apps capable of digitally undressing women remain available on their app stores, despite explicit bans on such content.
The findings have reignited concerns about the misuse of artificial intelligence to create non-consensual sexualised images and the failure of big tech platforms to enforce their own safety policies.
The report was published by the Tech Transparency Project (TTP), a nonprofit watchdog that monitors major technology companies. Its investigation comes amid growing alarm over AI tools being weaponised to exploit women and children online.
TTP researchers searched app marketplaces using simple keywords such as “nudify” and “undress” and found that these apps were neither hidden nor difficult to access.
According to the report, 55 apps on the Google Play Store and 47 on Apple’s App Store were capable of removing clothing from images or videos of women. Some generated fully nude images, while others replaced outfits with bikinis or underwear.
Researchers noted that even the free versions of the apps could produce sexualised images within seconds. Importantly, the tests were conducted using AI-generated images rather than images of real people.
These apps are far from obscure. Collectively, they have amassed more than 705 million downloads worldwide and generated an estimated USD 117 million in revenue.
Since both Apple and Google take a commission on in-app purchases, the report argues that the companies are financially benefiting from apps that enable non-consensual sexual imagery, content they publicly claim to prohibit.
The investigation found that some apps were rated suitable for children as young as nine or marked as appropriate for “all ages.” In many cases, users could generate undressed images with a single tap or text prompt. There were no consent verification mechanisms, warnings, or restrictions to prevent misuse.
Following the report, Apple told CNBC that it had removed 28 apps and issued warnings to other developers. Google said it had suspended or removed over 30 apps after reviewing the findings. However, the Tech Transparency Project maintains that numerous similar apps remain accessible on both platforms.
As more cases emerge involving AI tools being used to digitally undress women and children, regulators in the US and Europe are intensifying scrutiny of tech companies’ enforcement practices.
The report concludes that inconsistent policy enforcement by Apple and Google continues to leave users vulnerable to harassment, abuse, and humiliation in the age of AI.