Some users on X have found a lucrative niche sharing content that includes election misinformation, AI-generated images, and unfounded conspiracy theories, with some claiming to earn “thousands of dollars” from their posts. A BBC investigation uncovered networks of accounts that reshare each other’s content several times a day, circulating a mix of true, false, and misleading information to boost their visibility and revenue on the platform.

Within these networks, individuals report incomes ranging from a few hundred to several thousand dollars. They coordinate through forums and group chats to amplify each other’s posts, a collaborative effort to maximize engagement. The profiles vary in political allegiance: some support Donald Trump, others back Kamala Harris, and some operate independently. Certain users have even been approached by politicians seeking supportive content, suggesting a potential intertwining of social media monetization and political strategy.

Recently, X modified its payment system, shifting from a model based on ad revenue to one that pays high-reach accounts based on engagement metrics such as likes, shares, and comments. As a result, the incentive structure now rewards engagement itself, regardless of whether a post is true.

While many social platforms restrict the monetization of accounts that post misinformation, X lacks comparable policies, raising questions about its influence on political discourse during a critical election period in the US. The BBC cross-checked users’ reported earnings against what their interaction metrics would be expected to generate, and found their claims of profitability credible.

Among the disinformation spread through these networks are allegations of election fraud already debunked by authorities and extreme conspiracies targeting presidential candidates. Some content, which originally gained traction on X, has ricocheted onto larger platforms like Facebook and TikTok, increasing its reach.

For example, one user fabricated an image of Kamala Harris purportedly working at McDonald’s, which stirred discussions about perceived manipulations by the Democratic Party. Unfounded conspiracy theories surrounding the assassination attempt on Donald Trump have also gained traction from X to other platforms.

The users interviewed often show no hesitation about their content’s authenticity. Freedom Uncut, a prominent X user known for AI-generated imagery, spends up to 16 hours a day on the platform, relishing both the creative side of his posts and the financial rewards he has reaped since monetizing his account. He views his provocative images as a form of art and insists they spark important dialogue, claiming it has become significantly easier to earn money on the platform than before.

He acknowledges that content perceived as controversial garners the most interactions, equating this to traditional media’s approach to sensationalism. While asserting that he does not aim to deceive, he recognizes that other users profit from knowingly sharing false information.

Conversely, Brown Eyed Susan, another large account, this one supporting Kamala Harris, recounts that she never set out to monetize her posts but found herself earning a modest income after receiving a blue tick, which made her content eligible for monetization under the platform’s new engagement-based payment system.

Despite acknowledging that misinformation shapes public opinion and could affect election outcomes, Freedom Uncut dismisses concerns about the implications of fabricated or misleading content. He expresses confidence that independent media will earn trust over mainstream outlets, even as accounts like his lean heavily on AI-generated content and misinformation in pursuit of visibility.

With misinformation and digital manipulation on the rise, the effect of sensational claims on election dynamics is an open question. User Blake, who shared a doctored image of Kamala Harris, remarked that people often prefer narratives that reinforce their beliefs over factual verification, a sentiment that underscores the complexities of how misinformation is consumed and shared.

X asserts its commitment to user voices by applying labels to manipulated media, yet little action appears to have been taken against the current political landscape’s disinformation epidemic. Given that social media platforms can significantly influence election outcomes, the role of X and its monetization policies remains crucial as voters prepare to head to the polls.