TikTok has an AI conspiracy theory problem
Content creators claim users can make “thousands” off of AI-generated conspiracy theory content through TikTok’s Creativity Program
Written by Abbie Richards
Research contributions from Carly Evans
Users are exploiting TikTok’s Creativity Program for profit, pumping out viral conspiracy theory videos built from AI-generated images and voices.
While scrolling the popular video-sharing platform, users may come across conspiracy theories seemingly read by an AI-generated voice and presented alongside a series of apparently AI-generated images. These videos range from baseless and outrageous claims that the U.S. government has captured mythical or fictional creatures like vampires, wendigos, or King Kong to theories that advanced ancient civilizations have been systematically erased from history.
While the videos come from a wide array of faceless accounts, they share many traits. They often feature AI-generated narration, usually one of the free voices offered by text-to-speech programs like ElevenLabs, and some use AI-generated copies of celebrity voices. The AI-generated images have typical tells like extra fingers, asymmetry, and distortions.
In one video, an apparently AI-generated copy of Joe Rogan’s voice plays over footage of Rogan speaking on his podcast, giving the impression that the clip is actually from his show. The fake Rogan proceeds to say: “We are all probably going to die in the next few years. Did you hear about this? There’s this asteroid that is on a collision course with Earth. Pull it up, Jamie.”
Some of these videos also pair real images with AI-generated art to accompany their conspiracy theories. The AI art isn’t necessarily meant to depict reality, though; instead, it’s a storytelling tool to keep viewers engaged. For instance, one video, posted by an account with 367,000 followers and a bio touting the “Art of Ai Storytelling,” seemingly uses AI-generated images of an “explorer and researcher” named “Tommy Brady,” who supposedly discovered a secret cave system where the U.S. government researches mythical creatures. The video garnered over 1.9 million views.
These videos are proliferating across TikTok’s English-language and Spanish-language spheres, receiving millions of views. A video of Rogan purportedly claiming that “scientists captured the wendigo and tried to keep it a secret” has 21.6 million views. A video from the account “forbiddencombo” claims a “scientist just got caught hiding the last saber-toothed tiger” and has 32.3 million views. A Spanish-language video claiming the U.S. government captured a giant kraken has 23 million views.
Some of the accounts posting these videos appear to be affiliated with each other. As of February 1, “forbiddencombo” has received over 342 million views since it began posting in February 2023, according to Media Matters’ analysis, while the Spanish-language account “prohibidocombo” (which translates to “forbidden combo” and has the same profile picture as the English-language account) has received over 329 million views since it began posting in September 2023.
TikTok’s Creativity Program incentivizes conspiratorial content
In addition to their apparently AI-generated art and voices, these videos have something else in common: They all run longer than 60 seconds. That is no accident; TikTok’s Creativity Program Beta, which is designed to pay creators for the content they generate on the platform, compensates them only for videos longer than 60 seconds.
AI-generated conspiracy theory content is emerging as a popular “niche” for making money online. Content creation gurus on YouTube promise that you can make “thousands” by tapping into the conspiracy theory niche on TikTok. Medium articles ranking TikTok content by revenue per 1,000 views (or RPM) list conspiracy theory content as some of the most profitable around.
Conspiracy theories have proven to be financially rewarding for those who peddle them, and they especially perform well in engagement-driven algorithms that respond to the strong emotions they evoke. From the outlandish conspiratorial claims to the overstimulating editing tactics they use, these videos are designed to keep viewers engaged for as long as possible. The longer people are engaged, the more TikTok’s algorithm recommends the video to other users and the higher the RPM, both of which increase the payout for the anonymous person running the account.
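For a rough sense of the math: creator payouts under RPM-based programs are commonly described as (qualified views ÷ 1,000) × RPM. TikTok does not publish its exact Creativity Program formula or rates, so the sketch below is purely illustrative, with hypothetical RPM values chosen only to show how a higher-RPM niche and a viral view count compound.

```python
# Illustrative sketch of RPM-based payout arithmetic (hypothetical numbers).
# TikTok does not publish its exact Creativity Program formula or rates;
# the RPM values below are assumptions for illustration only.

def estimated_payout(qualified_views: int, rpm_usd: float) -> float:
    """Estimate earnings as (qualified views / 1,000) * RPM in dollars."""
    return qualified_views / 1000 * rpm_usd

views = 23_000_000  # view count of one video cited above
for label, rpm in [("hypothetical low-RPM niche", 0.20),
                   ("hypothetical high-RPM niche", 1.00)]:
    print(f"{label}: ~${estimated_payout(views, rpm):,.0f}")
```

Under these assumed rates, a single 23-million-view video would return a few thousand to tens of thousands of dollars, which is why engagement time and niche RPM both matter so much to these accounts.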
The AI-generated TikTok content industry
These AI-generated conspiracy theory videos are part of a larger cottage industry that has emerged around TikTok’s Creativity Program. Other supposedly profitable niches include motivational content, history content, and horror stories. Successful AI-generated content accounts frequently run Discord servers where members discuss how to best use AI to profit off TikTok’s Creativity Program.
In these servers, users in the “conspiracy theory niche” discuss strategies for increasing their profits. One user suggested: “If it’s a conspiracy channel, post podcast clips about how they’re poisoning the food supply and then link an affiliate product that is meant to detoxify the body.”
Another asked whether their conspiracy theory account’s audience demographic of 30- to 50-year-olds skewed too old and whether targeting a younger audience would be more profitable.
Users also recommended using AI services like ChatGPT to create conspiracy theory scripts for TikTok. One user prompted: “Im a short form content creator looking to monetize my account, I run a conspiracy theory account. Generate me a script for a video about the moon landing being fake. Follow this format: 1. Captivating hook 2. Proof 3. Outro.”
Another user posted that they were seeking to recruit “experienced professionals” for work focused on “Motivational and Conspiracy Theories Niches.”
The profitability of this AI-generated content cottage industry extends beyond TikTok’s Creativity Program, too, with popular AI content creators selling their expertise as coaches, offering weekly calls, instructional videos, and video ideas to those willing to pay for expensive premium memberships. These content creation gurus also participate in a vast web of affiliate marketing around various AI services.
Incentivizing low-quality viral content
TikTok’s community guidelines ban “inaccurate, misleading, or false content that may cause significant harm” to people, including “specific conspiracy theories that name and attack individual people” or “material that has been edited, spliced, or combined (such as video and audio) in a way that may mislead a person about real-world events.” The guidelines also require AI-generated images and other “synthetic media” to be “disclosed” as such when they depict “realistic scenes” or “the likeness (visual or audio) of a real person.”
But the AI-generated conspiracy theory content monetized through the Creativity Program likely doesn’t qualify as causing “significant harm” the way conspiracy theories like QAnon do. Similarly, the AI images and voices typically aren’t trying to depict reality; instead, they exist to maximize engagement (and the corresponding profits).
Financially incentivizing content that is both highly engaging and cheap to manufacture creates an environment for conspiracy theories to thrive. While it’s good that TikTok is compensating creators for their labor, the platform should not be financially rewarding fictional content disguised as “theories” designed to hijack viewers’ attention.
As we head into a crucial election year, we should be especially wary of systems that reward misinformation. With a huge portion of the electorate already primed to believe election-related conspiracy theories, the combination of AI’s accessibility and TikTok’s strong financial incentive for creators to churn out conspiracy theory content could be a recipe for disaster.