How the U.S. Can Counter Disinformation from Russia and China

In the Russian disinformation example, the kernel of truth is that the United States does help Ukraine and other former Soviet republics secure their former Kremlin-operated labs under the Biological Threat Reduction Program. Someone who encounters the campaign does not need to know how bioweapons are made or how U.S. policy addresses them, because the false narrative explains these matters in a compelling way. The lie, that the United States is developing bioweapons in Ukraine, then shifts the target's unconscious beliefs and, ideally, triggers behaviors that favor the Russian government, such as protesting U.S. weapons development in Ukraine, sending money to support a protest, or simply reposting the false narrative.

Amplify the Narrative
After the narrative is planted, it is essential that sources trusted by the target audience amplify it. These can include internet forums, social media websites, news sources, and false personas operated by Russia or its supporters and proxies. In addition, there can be unsuspecting yet credible spokespeople, deemed “useful idiots” in the disinformation literature.

Amplification occurs through restatement and variation. For example, NewsGuard has identified 200 false claims about the Russia-Ukraine war across 473 websites. In addition to the core false claim about U.S.-operated bioweapons labs in Ukraine, other fabrications held that the United States developed bioweapons to target ethnic Russians; that North Atlantic Treaty Organization (NATO) advisors were hiding out in a bioweapons lab underneath a steel plant in Mariupol, Ukraine; and that Ukraine conducted infectious disease experiments on its military personnel in U.S.-run biological laboratories.

Obfuscate the Source
Successfully spreading disinformation requires obscuring the provenance of the false narrative. Obfuscation is aided when numerous sources repeat the false claim, often with variations. Repeatability plus specificity equals believability. Thus, by ensuring the false narrative is repeated by diverse sources, including “useful idiots,” often with false granular detail, and that organic sharing or reposting occurs as well, the lie eventually rings true to its audience.

The repetition of the narrative makes tracing its true source difficult. Hundreds of Russian-sourced online statements, retweets, posts, and news reports all circle back to each other. It is virtually impossible for the average consumer to trace the origin of these claims or understand how they spread.

To reinforce the believability of a false narrative, its originators leverage influence principles and unconscious bias. Decades of research demonstrate that these tactics are fundamentally effective and difficult to thwart. They unconsciously bind the repeated narrative to audiences' beliefs, which leads to changes in behavior: audiences will naturally act in ways consistent with their beliefs, particularly if they have articulated or documented those beliefs by reposting them on social media. Such behavior change is the ultimate goal of the false narrative's author.

How to ‘Pre-bunk’
“Pre-bunking” a narrative (detecting and countering it before it is amplified), combined with increased influence immunity, is the most powerful way to prevent a disinformation campaign from taking hold in the first place. Once initiated, strong disinformation campaigns are difficult to counter. However, social science research demonstrates that early countering of a false narrative is more likely to be effective if it provides the targeted audience an alternative, true narrative. This narrative should be detailed, remind the audience of the false narrative it is correcting, and be repeated, much like the amplification of a false narrative. Research shows that repeating the false narrative in this corrective context does not reinforce audiences' belief in the disinformation. Additionally, making people aware of their vulnerability to false narratives and of the originator's nefarious motivations can increase the effectiveness of debunking efforts.

A New Warning from China?
China is quickly catching up to Russia as an effective proliferator of disinformation. In April 2023, NewsGuard analysts spotted a false claim about a supposed U.S. bioweapons lab in Kazakhstan in a video created by China Daily, the Beijing-controlled English-language publication. The professionally produced video accused the United States of operating the laboratory to conduct secret research on the transmission of viruses to Chinese people from camels. Much of the purported “evidence” in the video was based on unsubstantiated claims first propagated by Russian disinformation websites that had stated that mysterious “mass deaths have happened” in Kazakhstan.

This accusation echoes the Russian claim about U.S. labs that was pre-positioned on YouTube before Russia invaded Ukraine. Its kernel of truth is that the United States and Kazakhstan are working to eliminate bioweapons labs in the former Soviet republic as part of a 1995 agreement to destroy infrastructure used to create weapons of mass destruction. This Chinese disinformation appears to be a pre-positioned false claim as well. Policymakers in the United States and allied countries should take heed of this effort by Beijing and what it could augur.

It is unclear why China would create this false narrative, but it meets the criteria of an advance strategic warning of an incoming disinformation campaign. Explaining the true situation in a way that resonates with the target audience could be the best way to undermine the false narrative before it sows discord, creates uncertainty, or deepens community divides. Examining and pre-bunking obfuscated false claims as early as possible is essential in countering disinformation, as these early false narratives could serve as indicators of cyber or physical attacks to come.

Dana S. LaFon is the 2023–24 National Intelligence Fellow at CFR. This article is published courtesy of the Council on Foreign Relations (CFR). This work represents the views and opinions solely of the author. The Council on Foreign Relations is an independent, nonpartisan membership organization, think tank, and publisher, and takes no institutional positions on matters of policy.