Meta’s online ad library shows the company is hosting thousands of ads for AI-generated, NSFW companion or “girlfriend” apps on Facebook, Instagram, and Messenger. They promote chatbots offering sexually explicit images and text, using NSFW chat samples and AI images of partially clothed, unbelievably shaped simulated women.
Many of the virtual women seen in ads reviewed by WIRED are lifelike—if somewhat uncanny—young, and stereotypically pornographic. Prospective customers are invited to role-play with an AI “stepmom,” connect with a computer-generated teen in a hijab, or chat with avatars who promise to “get you off in one minute.”
The ads appear to be thriving despite Meta’s ad policies clearly barring “adult content,” including “depictions of people in explicit or suggestive positions, or activities that are overly suggestive or sexually provocative.”
That’s created a new front in debates over the clash between AI and conventional labor. Some human sex workers complain that Meta is letting chatbots multiply, while unfairly shutting their older profession out of its platforms by over-enforcing rules about adult content.
“As a sex worker, if I put anything like ‘I will do anything for you, I will make you come in a minute’ I would be deleted in an instant,” says Gemma Rose, director of the Pole Dance Stripper Movement, a UK-based sex-worker rights and pole-dance event organization.
Meta’s policies forbid users from showing nudity or sexual activity and from selling sex, including sexting. Rose and other sex-worker advocates say the company seems to apply a double standard, permitting chatbot apps to promote NSFW experiences while barring human sex workers from doing the same.
People who post about sex education, sex positivity, or sex work have for years complained that the platform unfairly quashes their content. Meta has restricted some of Rose’s posts from being shown to non-followers, according to screenshots seen by WIRED. Her personal Instagram account and one for her organization have previously been suspended for violating Meta policies.
“Not that I agree with a lot of the community guidelines and rules and regulations, but these [ads] blatantly go against their own policies,” says Rose of the sexual chatbots promoted on Meta platforms. “And yet we’re not allowed to be uncensored on the internet or just exist and make a living.”
WIRED surveyed chatbot ads using Meta’s ad library, a transparency tool that can be used to see all the ads currently running across its platforms, all ads shown in the EU in the past year, and ads from the past seven years related to elections, politics, or social issues. Searches showed that at least 29,000 ads had been published on Meta platforms for explicit AI “girlfriends,” with most using suggestive, sex-related messaging. There were also at least 19,000 ads using the term “NSFW” and 14,000 offering “NSFW AI.”
Some 2,700 ads were active when WIRED contacted Meta last week. A few days later Meta spokesperson Ryan Daniels said that the company prohibits ads that contain adult content and was reviewing the ads and removing those that violated its policies. “When we identify violating ads we work quickly to remove them, as we’re doing here,” he said. “We continue to improve our systems, including how we detect ads and behavior that go against our policies.”
However, 3,000 ads for “AI girlfriends” and 1,100 containing “NSFW” were live on April 23, according to Meta’s ad library.
WIRED’s initial review found that Hush, an AI girlfriend app downloaded more than 100,000 times from Google’s Play store, had published 1,700 ads across Meta platforms, several of which promise “NSFW” chats and “secret photos” from a range of lifelike female characters, anime women, and cartoon animals.
One shows an AI woman locked into medieval prison stocks by the neck and wrists, pledging, “Help me, I will do anything for you.” Another ad, targeted using Meta’s technology at men aged 18 to 65, features an anime character and the text “Want to see more of NSFW pics?”
Several of the 980 Meta ads WIRED found for “personalized AI companion” app Rosytalk promise around-the-clock chats with very-young-looking AI-generated women. They used tags including “#barelylegal,” “#goodgirls,” and “teens.” Rosytalk also ran 990 ads under at least nine brand names on Meta platforms, including Rosygirl, Rosy Role Play Chat, and AI Chat GPT.
At least 13 other apps for AI “girlfriends” have promoted similar services in Meta ads, including “nudifying” features that allow a user to “undress” their AI girlfriend and download the images. A handful of the girlfriend ads had already been removed for violating Meta’s advertising standards. “Undressing” apps have also been marketed on mainstream social platforms, according to social media research firm Graphika, and on LinkedIn, the Daily Mail recently reported.
Some users of so-called AI companions say they can help combat loneliness, and some report that the bots feel like a real partner. Not all of the ads found by WIRED promote only titillation; some also suggest that an explicit AI chatbot could provide emotional support. “Talk to anyone! You’re not alone!” reads one of Hush’s ads on Meta platforms.
Carolina Are, an innovation fellow researching social media censorship at the Center for Digital Citizens at Northumbria University in the UK, says that Meta makes it extremely difficult for human sex workers to advertise on its platforms. “When people are trying to work through and profit off their own body, they are forbidden,” says Are, who has helped sex workers reactivate lost and unfairly suspended accounts on Meta platforms. “While AI companies mostly powered by bros that exploit images already out there are able to do that.”
Are says the sexually suggestive AI girlfriends remind her of the unsophisticated and generic early days of internet porn. “Sex workers engage with their customers, subscribers, and followers in a way that is more personalized,” she says. “This is a lot of work and emotional labor beyond the sharing of nude images.”
Limited information is available about how the AI apps are built or how the underlying text- and image-generation algorithms were trained. One used the name Sora, apparently to suggest a connection to OpenAI’s video generator of that name, which has not been publicly released.
The developers behind the apps advertising explicit AI girlfriends are shadowy. None of the developers listed on Google Play or Facebook as creating the apps promoted on Meta’s platforms responded to requests for comment.
Mike Stabile, director of public affairs at the Free Speech Coalition, an adult-industry nonprofit trade association, sees the apps promising explicit AI girlfriends and their advertising tactics as “scammy.” While the adult industry is banned from advertising online, AI apps are “flooding the zone,” he says. “That’s the paradox of censorship: You end up censoring or silencing an actual sex worker and allowing all these weeds to flourish in their place.”
Anti-sex-trafficking legislation signed into US law in 2018 called FOSTA-SESTA made platforms responsible for what is posted online, vastly limiting adult content. However, it resulted in consensual sex work being treated as trafficking in the digital world, shutting adult content creators out of online life and making already marginalized sex workers more vulnerable.
If Meta wipes the AI girlfriend ads from its platforms, the purge may resemble its past sweeps of human sex workers’ accounts. Despite the organization diligently trying to follow Meta’s guidelines, the Pole Dance Stripper Movement’s account was banned “without warning” during a wave of removals of at least 45 sexuality-related accounts in June 2023, Rose says. Meta eventually rolled back some of the deletions, citing an error. But for sex workers on social media, such events are a recurring feature.
Rose’s personal account and its backup were also deleted in June 2021, during the Covid pandemic, after she shared a photo of a pole-dancing workshop, she says. She was hosting online pole-dancing classes and posting on the adult subscription site OnlyFans at the time. “My business was gone overnight,” she says. “I didn’t have a way to sustain myself.”
“OK, so I got deleted,” Rose adds. “But these companies are allowed to put out this kind of shit that sex workers aren’t allowed to? It makes no sense.”
Updated: 4/25/2024, 10 AM EDT: WIRED has clarified a quote to more accurately reflect Carolina Are's stance.