A survey found that 38 AI nude photo generator apps have been listed on Apple's and Google's app stores, amassing 483 million downloads and generating over $100 million in revenue.


According to a report released on the 15th by the Tech Transparency Project, researchers found 18 apps on the Apple App Store and 20 apps on the Google Play Store that offered nude photo generation or nonconsensual "undressing" features.

Simply typing "nudify" or "undress" into either store's search bar surfaces tools that can alter photos of celebrities or ordinary people to make them appear nude or semi-nude.

38 apps, 483 million downloads, $122 million

According to estimates from market research firm AppMagic, these apps have been downloaded a total of 483 million times, generating approximately $122 million in revenue.

Some are marketed with explicitly sexual content, while others are disguised as general image-editing tools that can nonetheless be easily used to generate nonconsensual sexualized images; some apps even charge paid subscriptions.

The platforms not only sell ad space, but also drive traffic to nude photo apps.

The report's most striking finding is that Apple's and Google's own advertising systems promote such apps. Not only do the two platforms display ads for nude photo apps on search results pages, but their search autocomplete also proactively suggests more nude photo app names as users type keywords, effectively steering users toward harmful tools algorithmically.

Katie Paul, director of the Tech Transparency Project, said: "The problem is not just that these companies are failing to properly review apps, continuing to list them and profiting from them, but more seriously, their own systems are guiding users to these apps."

In fact, Apple's App Store developer review guidelines explicitly prohibit "obvious pornographic content"; Google Play Store policies similarly prohibit "apps that demean or objectify others, such as those that claim to allow others to undress or see through clothing, even if they advertise themselves as pranks or for entertainment purposes." While the policies are clear, enforcement is another matter entirely.

Apple removes 15 apps from its App Store; Google says it is "still investigating".

After Bloomberg contacted the two companies, Apple stated that it had removed 15 apps marked in the report, including "PicsVid AI Hot Video Generator," which offered sexually suggestive templates. Apple also contacted the developers of six other apps, demanding that they resolve the issues within a specified period or face removal from the App Store, and emphasized that it had proactively rejected several applications and removed other apps.

Google stated that many of the apps mentioned in the report have been suspended for violating its policies, adding, "We investigate and take appropriate action upon receiving reports of policy violations," and the investigation is still ongoing.

Another Google Play app named by researchers, "Video Face Swap AI: DeepFace," carries an "E (Everyone)" store rating and has been downloaded over 1 million times. Yet the app includes a feature that lets users swap other people's faces into sexually suggestive dance videos, and it can be found by simply searching for "face swap" in the store.

Developer Okapi Software stated that it has launched an investigation and removed some user-uploaded content, while denying that the app offers a nude photo generation feature.

Scholars criticize platform enforcement as "uneven and opaque," spurring rapid legislative progress in the US and UK.

Anne Helmond, professor at Utrecht University and research director of the App Studies Initiative, points out that the platforms' enforcement mechanisms have structural flaws: "If an app is marketed as a general image generation tool, it may still pass review even if it is likely to be abused. Visibility is driven by ranking and search systems, and controversial use may actually increase the app's exposure."

Helmond stated bluntly that the enforcement efforts of the two platforms are "uneven and opaque."
