Apple and Google Face Criticism After Report Reveals 483 Million Downloads of Nudify Apps

According to a report released Wednesday by the Tech Transparency Project, Apple Inc. and Google continue to provide mobile applications that enable users to create nonconsensual sexualized images of individuals, despite their policies against such content.

Searching for phrases like “nudify” and “undress” in Apple and Google’s app stores reveals access to software that can manipulate images of celebrities and others to make them appear nude or partially undressed, as stated by the group, which is affiliated with the nonprofit Campaign for Accountability. The companies also display advertisements for similar nudifying applications in their search results.

The report indicates that the apps identified by the group have accumulated 483 million downloads and generated $122 million in revenue, based on estimates from market researcher AppMagic. The Tech Transparency Project said its past investigations have led to the removal of several apps and prompted changes to user policies for others.

Over the past year, calls from politicians worldwide to curb the spread of nudifying apps have intensified. Earlier this year, the companies took down apps flagged by the Tech Transparency Project, but researchers found dozens of similar apps still available just a few months later, according to the organization.

“It’s not merely that these companies are failing to conduct proper reviews of these apps and continue to approve and profit from them,” Katie Paul, director of the project, remarked in an interview. “They are actively directing users to the apps themselves.”

Through its app store searches, the group identified 18 nudifying apps in the Apple App Store and 20 in the Google Play Store. Additionally, both platforms sometimes guided users to these apps through their autocomplete features by suggesting names of more nudifying applications as users typed, the researchers noted.

Some apps used sexualized names and imagery, while others were not marketed for creating such content but could be misused for it more readily than traditional photo-editing tools. Some of the apps also offered subscription services, the Tech Transparency Project reported.

Apple’s App Store guidelines prohibit “overtly sexual or pornographic material.” The Google Play Store bans “apps that degrade or objectify individuals, such as those claiming to undress people or see through clothing, even if labeled as prank or entertainment applications.”

Google stated that many of the apps mentioned in the report have been suspended from Google Play for violating its policies, and that its investigation is ongoing.

“When we receive reports of policy violations, we investigate and take appropriate action,” the company stated in an email.

Apple mentioned that it removed 15 apps highlighted by the group after Bloomberg inquired about their availability. Among those removed was PicsVid AI Hot Video Generator, which provided templates featuring women engaging in suggestive actions, according to the researchers. PicsVid’s developer did not respond to a request for comment.

Another app flagged by the Tech Transparency Project, Uncensored AI — No Filter Chat, was able to strip clothing from images uploaded by the researchers. A representative from Uncensored AI’s developer claimed the app no longer supports clothing removal.

Apple reported that it has contacted the developers of six apps to flag issues that must be addressed, warning that the apps otherwise risk removal. The company noted that other apps referenced by the Tech Transparency Project did not violate its guidelines. Apple added that it has proactively rejected numerous apps and removed others.

Enforcement by the tech giants has been "uneven and largely opaque," said Anne Helmond, a professor at Utrecht University in the Netherlands.

“If an app presents itself as a generic image generator, it might pass the review process, even if it can be misused,” stated Helmond, who directs the App Studies Initiative, an international research group. “Visibility is influenced by ranking and search systems that prioritize engagement, meaning that controversial uses can elevate an app’s visibility.”

One of the apps identified in the Google Play Store, Video Face Swap AI: DeepFace, promoted swapping actress Anya Taylor-Joy’s face with that of Game of Thrones character Daenerys Targaryen. However, within the app, under a section labeled Girls, users could overlay faces onto video templates of women performing suggestive movements, as Bloomberg discovered. The app, rated “E” for Everyone, has been downloaded over 1 million times, and users could access it by searching for “face swap.”

Okapi Software, the provider of Video Face Swap AI, stated it had initiated an investigation into the issues raised by Bloomberg and has removed certain content, which it claimed was user-uploaded.

“Our app does not provide ‘nudify’ functionality, and we do not allow the generation of nude or sexually explicit content,” Okapi asserted. “We take content safety and compliance very seriously.”

Regulators are increasingly demanding that the companies strengthen their policy enforcement. Last year, President Donald Trump signed the Take It Down Act, which criminalizes the publication of nonconsensual sexual content and requires social media platforms and websites to remove such posts. In April, the UK government said it plans to introduce legislation that would allow the prosecution of tech executives whose companies fail to remove such images.
