
Meta Takes Legal Action Against AI Apps That Generate Fake Nude Images

As Meta continues to encourage the creation of content via its own AI generation tools, it’s also seeing more harmful AI-generated images and videos, and the tools that create them, filtering through to its apps, which it’s now taking legal measures to stamp out.

Today, Meta announced that it’s pursuing legal enforcement against a company called “Joy Timeline HK Limited,” the maker of an app called “CrushAI,” which enables users to create AI-generated nude or sexually explicit images of individuals without their consent.

As explained by Meta:

Across the internet, we’re seeing a concerning growth of so-called ‘nudify’ apps, which use AI to create fake non-consensual nude or sexually explicit images. Meta has longstanding rules against non-consensual intimate imagery, and over a year ago we updated these policies to make it even clearer that we don’t allow the promotion of nudify apps or similar services. We remove ads, Facebook Pages and Instagram accounts promoting these services when we become aware of them, block links to websites hosting them so they can’t be accessed from Meta platforms, and restrict search terms like ‘nudify’, ‘undress’ and ‘delete clothing’ on Facebook and Instagram so they don’t show results.

But some of these tools are still getting through Meta’s systems, either via user posts or promotions.

So now, Meta’s taking aim at the developers themselves, with this marking its first legal action against a “nudify” app maker.

We’ve filed a lawsuit in Hong Kong, where Joy Timeline HK Limited is based, to prevent them from advertising CrushAI apps on Meta platforms. This follows multiple attempts by Joy Timeline HK Limited to circumvent Meta’s ad review process and continue placing these ads, after they were repeatedly removed for breaking our rules.

It’s a difficult area for Meta because, as noted, on one hand it’s pushing people to use its own AI visual creation tools at every opportunity, yet on the other, it doesn’t want people using such tools for less savory purposes.

Which is going to happen. If the expansion of the internet has taught us anything, it’s that the worst elements will be amplified by every innovation, regardless of its intended purpose, and generative AI is proving no different.

Indeed, just last month, researchers from the University of Florida reported a significant rise in AI-generated sexually explicit images created without the subject’s consent.

Even worse, based on UF’s analysis of 20 AI “nudification” websites, the technology is also being used to create images of minors, while women are disproportionately targeted by these apps.

This is why there’s now a big push to support the National Center for Missing and Exploited Children’s (NCMEC) Take It Down Act, which aims to outlaw non-consensual intimate images via official legislation, among other measures to combat AI misuse.

Meta has put its support behind this push, with this latest legal effort being another step to discourage, and ideally eliminate, the use of such tools.

But they’ll never be culled entirely. Again, the history of the internet tells us that people will always find a way to use the latest technology for questionable purposes, and the capacity to generate adult images with AI will remain problematic.

But ideally, this will at least help to reduce the prevalence of such content and the availability of nudify apps.
