Nov 20, 2024 6:00 AM
Inside the Booming ‘AI Pimping’ Industry
The practice, first reported by 404 Media in April, has since exploded in popularity, showing that Instagram is unable or unwilling to stop the flood of AI-generated content on its platform and to protect the human creators who say they are now competing with AI content in a way that is hurting their ability to make a living.
According to our review of more than 1,000 AI-generated Instagram accounts, Discord channels where the people who make this content share tips and discuss strategy, and several guides that explain how to make money by “AI pimping,” it is now trivially easy to make these accounts and monetize them using an assortment of off-the-shelf AI tools and apps. Some of these apps are hosted on the Apple App Store and Google Play. Our investigation shows that what was once a niche problem on the platform has industrialized in scale, and it offers a preview of what social media may become in the near future: a space where AI-generated content eclipses that of humans.
Elaina St James, an adult content creator who promotes her work on Instagram, said she and other adult content creators are now directly competing with these AI rip-off accounts, many of which use photographs and videos stolen from adult content creators and Instagram models. She said that while other changes to Instagram’s algorithm could have contributed, since the explosion of AI-generated influencer accounts on the platform her “reach went down tremendously,” from a typical 1 million to 5 million views a month to not cracking a million in the last 10 months, sometimes coming in under 500,000 views.
“This is probably one of the reasons my views are going down,” St James told us in an interview. “It’s because I’m competing with something that’s unnatural.”
Alexios Mantzarlis, the director of the security, trust, and safety initiative at Cornell Tech and formerly principal of trust and safety intelligence at Google, compiled a list of around 900 accounts that 404 Media reviewed in its investigation. Mantzarlis, who stumbled on one of these accounts while casually using Instagram, said he started researching the AI-generated influencer accounts because they might show where AI-generated content is taking social media and the internet more broadly, where he sees “a rising blended unreality.” Mantzarlis believes he could have easily found 900 more accounts, and that the only reason he didn’t is that Instagram restricted the account he used to scrape the platform.
“It felt like a possible sign of what social media is going to look like in five years,” Mantzarlis said in an interview. “Because this may be coming to other parts of the internet, not just the attractive-people niche on Instagram. This is probably a sign that it’s going to be pretty bad.”
The Accounts
Out of more than 1,000 AI-generated Instagram influencer accounts we reviewed, 100 included at least some deepfake content: existing videos, usually of models and adult entertainment performers, with the original faces replaced by an AI-generated face so the videos appear to be new, original content consistent with the other AI-generated images and videos the account shares. The other 900 accounts shared images that were entirely AI-generated rather than edited photographs or videos, though in some cases they appeared to draw on real photographs or were made to look like celebrities.
Out of those 100 accounts that shared deepfake or face-swapped videos, 60 self-identify as being AI-generated, writing in their bios that they are a “virtual model & influencer” or stating “all photos crafted with AI and apps.” The other 40 do not include any disclaimer stating that they are AI-generated.
One of the bigger accounts in the latter category was “Chloe Johnson,” a verified Instagram account with 171,000 followers that Meta deleted within the past few weeks.
Using Google Lens, which finds similar images on the web, a 404 Media reader was able to find the original source videos that nine of Johnson’s Instagram posts were based on. That showed the person running Johnson’s account is swapping the face of the AI influencer onto the bodies of real women, including the models Tana Rain, Skyler Simpson, and Kyla Yesenosky. Other videos were stolen from TikTok and Instagram users who are not famous and who have a small number of followers (Ulia Nova, Annabella Sinclair). We’ve also seen face-swapping accounts source their videos from swimwear runway shows and Getty’s stock image and video site iStock.
Johnson’s Instagram links to an account on Fanvue, a site very much like OnlyFans where fans can pay to access a creator’s account. Here, Johnson says, people can buy “pay-per-view full nudes and fucking videos.” Many other AI influencer accounts we reviewed also monetize their content on Fanvue. Johnson’s Instagram also links to another website where nude photos and hardcore porn videos can be bought individually for between $3 and $22.
Like other content hustles, the industry is full of people who are trying to get rich in part by following guides and instructional courses being sold by people who have run successful AI influencers. 404 Media purchased two guides—one PDF instruction manual called Instagram Mastery by an AI influencer agency called Digital Divas and one called AI Influencer Accelerator made by someone who calls themselves Professor EP, who says they operate the Emily Pellegrini AI influencer Instagram account, which has 253,000 followers.
Professor EP was also a judge in the first “Miss AI” contest, which was run in partnership with Fanvue. The Daily Mail called Pellegrini the “World’s Hottest Model” in January. Since then, the person who operates the Emily Pellegrini Instagram account has pivoted from posting content as if they were Emily Pellegrini to posting content as Professor EP, an account that promotes instruction manuals for what they described as “AI Pimping.” Professor EP claims to have made more than a million dollars in six months, and in July, while operating the Emily Pellegrini account and judging the Miss AI contest, he claimed to have made $100,000 on Fanvue alone, a statement Fanvue appears to have endorsed by including it on the Miss AI site.
A Fanvue representative told 404 Media, “Yes, Emily had earned this revenue on Fanvue.” It added, “Fanvue has no affiliation with the course posted on Instagram or any other significant association with Emily’s team and their marketing decisions. While Emily does still have an active account on Fanvue where all content is verified and currently meets the platform’s strict terms and conditions, usage is extremely low. This reflects her team’s decision to opt for a different marketing strategy, which Fanvue does not support.”
Fanvue said that it is going to remove Emily Pellegrini’s image from future Miss AI contests “as part of an overhaul for the next award.” Fanvue’s website says it does “not allow content that has been stolen or created via deepfaking” and said it uses a tool called Hive Moderation and has a “human compliance team who conducts manual systematic checks” for deepfaked content.
Instagram did not respond to a detailed list of questions for this story, and the company deleted two of the four accounts that we provided as examples of accounts that used face swapping to steal from other creators. Instagram said it will take action against AI-generated content that violates its Community Guidelines, which state that users may only share “photos and videos that you’ve taken or have the right to share” and allow users to report other accounts for impersonation.
Instagram said that it would take action on accounts only if they’re reported by the rights owner or someone authorized to report on their behalf (like an attorney or agent). St James told us this process hasn’t worked for her, and often doesn’t work for creators, because there are too many impersonation accounts to track and because reporting such accounts could backfire on adult content creators and cause Instagram to ban their legitimate accounts. It can also be difficult to find instances where videos have been taken from an influencer, because reverse-image search tools do not reliably work with videos, and tools like Google Lens are hit or miss. Finding stolen videos often requires an influencer or one of their fans to recognize their body with a different, AI-generated face on it.
The Digital Divas “Instagram Mastery” manual costs $50 and is a mix of technical tips and social engineering strategy.
“The thing that most AI girls mess up is that they think they are in the porn business. Take off the clothes and they will drown in money. This is 100% wrong,” the Digital Divas guide reads. “You are actually in the loneliness business. There are hundreds of millions of lonely men out there on social media and they hate being lonely, they will do just about anything to make that feeling go away … You need to make people who want to see boobs not just want to see any boobs, make them want to see your boobs. That’s the true path to success.”
Digital Divas is made up of three AI influencers. One of them, Aika Kittie, told 404 Media that a different human operates each of these influencers but declined to give any information about who the real people running these accounts are other than saying that they are all based in the US. “While we strongly advocate for transparency around the fictional nature of AI identities, we also believe in maintaining a bit of mystery behind the scenes,” Aika said.
The Digital Divas guide suggests using a series of off-the-shelf tools that are widely known among people who make AI art. Several of the AI-generated accounts we reviewed appear to use and promote an app called HelloFace, which until recently was available on the Apple App Store and Google Play.
Each of these tools is used for a different part of the creation process. Most guides we’ve seen recommend generating faces in Leonardo, then refining them in another AI tool to “remove flaws.” Images can then be made with AI-generation apps, and the AI face can be swapped onto those images using any of a host of face-swap apps.
In addition to this collection of tools and manuals that explain how to bring them all together to create AI-generated influencers, there are now also several sites that serve as a one-stop shop for creating and monetizing AI-generated influencers, like Glambase, SynthLife, and Genfluence, all of which are also promoted by AI-generated influencers on Instagram.
In one Discord channel dedicated to jailbreaking AI tools in order to generate adult or NSFW content, a user who goes by BabaYaga shared his journey of creating one of these deepfake accounts using a variety of free online services, mainly the AI image generator Krea. The user created Instagram, TikTok, and Twitter accounts for the same AI-generated influencer, all of which link back to a Fanvue account where he sold AI-generated nude images of her. BabaYaga also created an OnlyFans account for the AI-generated influencer but has yet to post content there.
“Wow those images are pretty damn good, you can fool a lot of incels with this lol,” one user in the Discord server said in reaction to images of BabaYaga’s AI-generated influencer.
“Yup, I want sugar mamies or dadies xD loool,” BabaYaga said.
BabaYaga also shared what he claimed was a direct message sent to his AI influencer account over Instagram, in which a follower complimented her looks and offered to “spoil” her and pay her rent.
“Let start make [sic] some money,” BabaYaga said in the message to the Discord channel where he shared the direct message.
“All users, including fans, must be over 18 and comply with our terms of service, which prohibit deceptive or inappropriate content, including that generated or altered by AI without additional disclosures (such as tagging content as #AI),” OnlyFans told us in a statement. OnlyFans also removed BabaYaga’s AI-generated influencer account after we reached out for comment.
St James told us that the fact that many of these AI-generated influencer accounts are being operated by men adds insult to injury.
“We as women in the world make less money. We are at a disadvantage in a lot of ways,” she said. “One area of the world where we do have an advantage is the influencer and the modeling thing, so it’s just an extra layer. It kind of makes me a little pissed off that it’s a guy that is making money pretending to be a woman.”
Aika from Digital Divas said that in their agency, “We don’t look real and we never will as much as we try. This puts us more in a niche, like a genre of adult entertainment (think hentai or other digitally created media). I think the point about gender is more of an assumption. Yes, there are more men doing this, but there are women doing it too. It’s just like how both men and women can draw hentai. Suggesting otherwise comes across as sexist—both sides should be able to express their sexuality.”
The “AI Influencer Accelerator” by Professor EP is a series of instructional videos and PDFs that costs $220. In these videos, Professor EP speaks with either an AI voice or a voice changer and is represented by stock footage of a person wearing a suit and a silver Guy Fawkes mask. Professor EP’s classes begin by pointing out the “monumental financial success” of Andrew Tate, who has “put in the necessary work to build a huge brand on the internet” rather than “sitting around and wasting time in [his] teenage years and twenties.” (Tate was arrested most recently in August on sex trafficking charges.) The parallel between Tate and AI influencers, Professor EP says, is that an AI influencer can allow for Tate-level success while the person behind the account “operates from the shadows” without showing their real face.
“Unfortunately not everyone can [be a real influencer],” Professor EP says. “This is simply due to the fact that not every person has the physical attributes combined with the consistency and professionalism necessary to build a personal brand and scale that to making millions per month, but that is where AI comes in.”
“I’m going to expose how I created Emily Pellegrini, the world’s most-known AI creator. I will show you how I made over $1 million in less than six months, making her face known among millions of people while keeping my true identity buried behind a mask,” he says.
Professor EP tells students that AI models do not have the limitations of human influencers, because human influencers need to sleep, eat, travel, and spend money. “By launching AI model after AI model, you can build a lineup of AI influencers at your disposal, each capable of churning out personalized content around the clock,” he says. “AI models do not have the same issues humans do.”
The guide also includes tips for setting up a Fanvue account and for messaging with “lonely people.”
“It is important to warm them up a bit. Engage in some small talk to create a bond between you and the user,” he says. “Start with a price that is not off-putting. This could be an underwear picture for $6, and if he buys it and likes it you can go further and increase the intensity of the chat in combination with the pictures … continue with a $14.90 image of boobs. Build on that with a $26 to $30 image of the naked full body. Continue with a $35 leg spread nude image. Finish off with a content piece priced at $80, like masturbation, videos, sex tapes, etc. The most important thing is the connection between you and the fan and not the intention to make a quick buck.”
The Emily Pellegrini account was built in part with deepfake videos on Instagram, and in a recent update to the guide, Professor EP teaches people specifically how to make deepfake face swaps using other people’s videos.
“Face-swapping content from other accounts without permission seems to work well for a lot of AI influencers,” instructions on the video read. “Although I do not recommend it 😉 … you could also use the software to face swap nude videos. I do not recommend you use the software for this purpose, I just show what is possible. Use this information as you wish. By the end of this module, you will have a better understanding of how to create a reel using face swaps.”
Digital Divas, in its Discord, says it is an “anti-deepfake server,” that “celebrity deep fakes and content theft is frowned upon,” and asks members to “please not use your peers” as the source image for a face swap.
“The best thing we can do is openly criticize deepfake content, which a lot of us already do,” Aika told 404 Media. “I’ve had to have difficult conversations with new members from time to time, but it happens. We want to set clear boundaries between ethical AI content and unethical deepfakes.”
Instead, Digital Divas recommends finding inspiration in images that other influencers have posted and then generating similar but transformative images.
Professor EP, meanwhile, tells students to think “Who are your favorite celebrities?” and then attempt to create hybrid influencers who are a blend of different real people: “If, for example, you like Ariana Grande’s eyes and Kylie Jenner’s lips, you can create an image which combines these attributes by including these specifications in your prompt.”
In a supplementary PDF, he explains how he would make an AI influencer who is a mix of Madison Beer and Ariana Grande. He then asks ChatGPT to create an entire persona, background, and personality for the model, and uses the example: “Dream Car: Ferrari 488. Favorite Brand: Chanel. Breast Size: 34C. Parents: Mother: Sophia Lavante (Fashion Designer), Father: Alessandro Lavante (Architect). Life Goals and Aspirations: To launch her own fashion line and promote sustainable fashion.”
Professor EP recommends that people use Leonardo to generate a face, then use another tool to “eliminate flaws” like “blurred eyes, uneven teeth, and dropping corners of the mouth.” It recommends using a specific iPhone app that allows for the creation of NSFW content and which has image-to-image capabilities.
Professor EP’s guide then recommends several face-swap apps, including a massive Discord plug-in called InsightFace—which is being used in 965,000 different servers—to take the influencer’s face generated from Leonardo and put it onto bodies that are generated in another app that allows for NSFW content. Professor EP recommends creating a private Discord server for yourself and then adding InsightFace directly to it, which may explain why it has been installed on so many servers. This also means that it is possible to do face swaps on Discord essentially in private.
Another of the recommended apps, HelloFace, is full of videos of real women dancing in bikinis or tagged “sexy” that come ready-made for people to swap faces onto. The app’s Discord has run theme weeks for the best face swaps and generations onto images and videos of women. Theme weeks have included “latex fashion,” “bunny girls,” and “summer in Miami.”
“With the summer almost here, it’s time to show off that bikini you’ve been waiting all winter to wear! Whether you’re beach-bound or poolside-ready, you will require fiery photo shots, and that’s what we have for you in today’s collection!” a message posted by the Discord administrators reads. The Discord also has a room called “#sharing-is-caring” that is full of face swaps people have done to videos of women dancing.
Apple, which so far has failed to solve the “dual use” problem of face swapping apps that can be used innocently but are often used to create nonconsensual content, removed HelloFace from the App Store after we reached out for comment. Apple pointed to the App Store Review Guidelines, which state that “Apps should not include content that is offensive, insensitive, upsetting, intended to disgust, in exceptionally poor taste, or just plain creepy.” The guidelines also state: “Third-Party Sites/Services: If your app uses, accesses, monetizes access to, or displays content from a third-party service, ensure that you are specifically permitted to do so under the service’s terms of use. Authorization must be provided upon request.”
Google did not respond to a request for comment, but HelloFace is no longer available to download from Google Play.
Finally, Professor EP recommends that people simply create as many AI models as they can: “It’s not that hard. Create a new face, new content, new account setups, and repeat the process over and over again. Hire chatting teams to monetize the accounts, set up processes and workflows to automate content creation and uploading, and eventually hire people to do it for you.”
Professor EP and Emily Pellegrini promote an AI “agency” called Calu, which advertises AI-powered chatbot services for OnlyFans models as well as AI model creation and management. One of the most difficult parts of understanding how large this industry actually is, how it works, and who is making these influencers is the fact that multiple “influencers” are being operated by individual people, who themselves operate anonymously on the internet.
Professor EP did not respond to a request for comment. A phone number listed on the Calu website was disconnected.
“Instagram Can Sell This as Traffic”
St James thinks the fact that so much AI-generated content is monetizing work stolen from adult content creators is not a coincidence but rather a direct result of Instagram’s years-long policy of marginalizing adult content creators, sex workers, and sex educators on its platform.
Unlike other celebrities and people with large followings on Instagram who have to deal with impersonators, most adult content creators and sex workers use stage names and pseudonyms to protect themselves from stalkers or persecution from people who demonize their profession. Instagram refuses to verify them unless they provide identification with their real name, which adult creators worry will leak to the public and lead to doxing and harassment.
Adult content creators and sex workers have developed several methods over the years to deal with Instagram’s draconian policies against any sexual content. In the age of generative AI, these strategies can make it harder for users to sort out which accounts are real and which accounts are stealing and reposting content.
Because Instagram regularly bans sex workers without warning, even if they follow all of Instagram’s strict rules against sexual content, adult content creators now regularly have multiple accounts, commonly referred to as “backups,” that link to each other in bios. The goal is to encourage their audience on the primary account to follow the inactive backup account; if Instagram bans the primary account, they can quickly reconnect with their audience and not have to rebuild their following entirely from scratch.
A side effect of this strategy is that it’s common for sex workers to have multiple legitimate accounts with slightly different usernames, none of which are verified, making them an easy target for content theft and imitation.
Both of the AI influencer guides we reviewed also talk about how to avoid being banned on Instagram. “Use a non-realistic bio picture and avoid including false location information in your bio to reduce the chances of being suspended for Inauthentic Identity,” the Digital Divas guide says. “If your picture is cartoonish and you are a digital creator, it’s less likely to be considered inauthentic.”
Professor EP, meanwhile, tells people to use a separate email account for each influencer and to make sure it’s “clean” and doesn’t connect to the person operating it or any of their other accounts. “Let’s say one of your accounts gets a ban. Having separate email addresses eliminates the risk that Instagram makes a connection between the banned account and your other accounts,” Professor EP says in the guide.
“Avoid account bans by using images that are visually appealing but not too provocative. We recommend that you adhere to the following criteria: visible face, amateur style” and “to prevent shadow bans from the get-go, you should warm up the account for the first eight weeks” by logging in daily and commenting on other people’s images to demonstrate “human activity,” Professor EP’s guide says.
St James said that even reporting accounts she knows for a fact are stealing from her is risky and could put her legitimate accounts at risk.
“Anytime that we as creators report these fakes, we tend to get in trouble,” she said. “It seems like Instagram says, ‘Oh, you’re tattling on this imposter? Let’s take a closer look at your account and see if we can find problems with it.’ So a lot of the time we don’t even report them. Sometimes we pay services to report fakes, but it’s like playing whack-a-mole. It’s never-ending.”
Mantzarlis, the director of the security, trust, and safety initiative at Cornell Tech, and St James agreed that it’s not clear whether Instagram has the means to remove or label these accounts as AI-generated, but the fact that the company isn’t doing either at the moment appears to work to its benefit.
“People are clicking, liking, and interacting with these accounts, and some of that engagement is real, and some isn’t,” Mantzarlis said. “Instagram can sell this as traffic. It can sell ads against this. So, is there a future where actual, real human accounts are almost like an elite, smaller percentage of Instagram? I think yes.”
“If all of a sudden they got rid of all the bots, the dead accounts, the fake accounts, the imposter accounts, what happens to their advertising?” St James said.