From Telegram to Capitol Hill: How Disinformation Laundering Works

Michael Birkebæk Jensen

In January 2025, a tragic story began circulating on Danish social media. A Danish F-16 instructor had reportedly been killed by a Russian missile strike in Kryvyi Rih, Ukraine. The report contained specific details, names, and an air of credibility. It was shared across various platforms, causing concern and confusion.

There was just one problem: It was completely untrue.

The story hadn't come from a reporter on the ground. It originated from a Telegram post by a Russian propagandist, was picked up by a network of Russia-linked "news" sites, and spread unchecked before the Danish Defence Ministry officially debunked it two days later. By then, the lie had already traveled halfway around the world.

At our recent fireside chat, Democracy’s Digital Dilemma, disinformation expert Sarah Hartley walked us through how these narratives aren't just random "fake news" - they are part of a sophisticated technique known as disinformation laundering.

Here is how the wash cycle works, and why it is more dangerous than ever in the age of AI.

The Mechanics of the Lie: How "Laundering" Works

Just as criminals launder money to make illegal profits look legitimate, bad actors launder lies to make them look like news. The goal is to obscure the original source - often a state-linked actor or a troll farm - so that by the time the information reaches you, it looks like a reputable report.

During the session, we looked at a textbook example of this cycle involving a false claim that Ukrainian President Zelensky’s “inner circle” transfers $50 million a month to bank accounts in the UAE, a claim first identified by the NewsGuard fact-checking organisation. The path of this lie reveals the terrifying efficiency of the ecosystem:

  1. The Dirt: The story was planted on a Turkish news site known for publishing pro-Russian disinformation.
  2. The Spin: Russian state media picked up the story, citing the Turkish site as an "independent source".
  3. The Wash: The story was republished by an English-language site, South Africa Today (SAT), and subsequently aggregated by Microsoft’s MSN news feed.
  4. The Legitimacy: The SAT story landed in the MSN feed of U.S. Congresswoman Anna Paulina Luna, who shared it on X (formerly Twitter) and later repeated the claim on a popular podcast with over a million listeners.
  5. The Payoff: Russian media then completed the circle, reporting Luna’s claims as “evidence” that Zelensky was embezzling Western funds.

The lie had been washed clean. It was no longer Russian propaganda; it had become a talking point in the US Capitol.

The New Threat: "LLM Grooming"

While human-led disinformation laundering is effective, the scale is about to change. We are now seeing the rise of "LLM Grooming" - the intentional poisoning of the data wells used by Artificial Intelligence.

Sarah highlighted the Pravda Network, a cluster of over 200 Russia-linked websites posing as local news outlets (e.g., Pravda Denmark, Pravda USA). Since Russia’s invasion of Ukraine, this network has published an estimated 6 million articles.

Most of these sites have negligible human traffic. But they aren't writing for people; they are writing for machines.

By flooding the internet with millions of articles containing specific falsehoods, these networks aim to alter the training data of Large Language Models (LLMs). If an AI reads 10,000 articles stating that "the Earth is flat," it may statistically conclude that this is a contested fact worth reporting.
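The statistical intuition behind this kind of data flooding can be sketched in a few lines. This is a deliberately crude toy model, not how LLM training actually works: real models learn from token patterns across billions of documents, and the corpus below is invented. But it shows why sheer repetition, with no regard for human readership, can shift what a data-driven system treats as the majority view.

```python
from collections import Counter

# Toy corpus: a handful of accurate statements drowned out by a
# mass-published falsehood. The sentences are invented placeholders.
accurate = ["the pilot is alive and well"] * 5
flooded = ["the pilot was killed in a strike"] * 10_000

corpus = accurate + flooded

def most_common_claim(corpus):
    """A naive 'model' that answers by majority vote over its training data."""
    return Counter(corpus).most_common(1)[0][0]

print(most_common_claim(corpus))
# The flooded falsehood wins purely on volume, not on accuracy.
```

Nothing about the flooded articles needs to be persuasive to a human reader; they only need to outnumber the accurate ones in whatever data a system ingests.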

We are already seeing early signs of this. When researchers asked specific AI models about the fake Danish pilot story, some models - relying on the Pravda “data” - reported the pilot's death as a fact.

How to Spot the Wash Cycle

The "Digital Dilemma" is that our old methods of spotting fake news - looking for bad grammar, unrealistic images, or weird URLs - aren't enough anymore. The content looks professional, and the volume is overwhelming.

However, there are ways to spot the wash cycle:

  • Trace the Source: If a story cites another outlet, which cites another outlet, keep clicking. If the road ends at a Telegram screenshot or an obscure blog with no "About Us" page, be very skeptical.
  • Watch for "Information Voids": Disinformation thrives where there is no official news. The Danish pilot story spread because there was a vacuum of information for 48 hours. Be wary of breaking news that mainstream outlets haven't touched yet.
  • Check the "Real" People: In the case of the Danish pilot, the original Telegram source claimed to be a friend of the deceased. A quick investigation revealed the friend's profile didn't exist before the post.
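The "trace the source" step above amounts to walking a chain of citations until the trail runs out. A minimal sketch of that walk, using an invented chain of placeholder outlet names (none of these correspond to real sites):

```python
# Toy model of "keep clicking": each outlet points to the outlet it
# cited. All names here are hypothetical placeholders.
citation_chain = {
    "aggregator-feed": "english-republisher",
    "english-republisher": "state-media",
    "state-media": "regional-news-site",
    "regional-news-site": "telegram-screenshot",  # trail ends here
}

def trace_source(outlet, chain):
    """Follow 'who cited whom' links until an outlet cites no one."""
    path = [outlet]
    while outlet in chain:
        outlet = chain[outlet]
        path.append(outlet)
    return path

print(" -> ".join(trace_source("aggregator-feed", citation_chain)))
# A trail that dead-ends at a Telegram screenshot is a red flag.
```

In practice each "click" is a manual check of a story's cited source, but the logic is the same: if the chain terminates at an unverifiable origin, treat everything downstream with suspicion.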

Building Technology for Democracy

We cannot rely solely on the very tech giants that created these problems to solve them. Nor should we accept a future where we are passive victims of algorithmically amplified lies.

The solution lies in reclaiming technology for democracy. We need to build and use tools that make it easier - not harder - for people to stay informed and engaged with the truth. This means investing in transparent AI infrastructure that reflects democratic values and creating platforms designed to connect us rather than divide us.

The challenges facing our digital society are complex, but they are solvable if we face them together. By combining human critical thinking with software built for the public good, we can turn the tide. We can move from a state of digital defense to one of democratic empowerment, ensuring that technology serves the people, not the other way around.

Interested in how we are building software for democracy? Check out Parl8.eu and reach out to us - we would love to collaborate.

Michael Birkebæk Jensen

Co-founder DemAI
