Tech fixes cannot protect us from disinformation campaigns

“The Russian Facebook advertisements during the 2016 election in the United States are a perfect example,” Nisbet said. “Many of these ads tried to inflame racial resentment in the country.”

Another disinformation strategy is information gaslighting, in which a country is flooded with false or misleading information through social media, blogs, fake news, online comments and advertising.

A recent Ohio State study found that social media has only a small influence on how much people believe fake news. But the goal of information gaslighting is not so much to persuade the audience as to distract it and sow uncertainty, Nisbet said.

A third kind of disinformation campaign simply aims to increase a foreign audience’s everyday, incidental exposure to “fake news.”

State-controlled news portals, like Russia’s Sputnik, may spread false information that sometimes is even picked up by legitimate news outlets.

“The more people are exposed to some piece of false information, the more familiar it becomes, and the more willing they are to accept it,” Kamenchuk said. “If citizens can’t tell fact from fiction, at some point they give up trying.”

These three types of disinformation campaigns can be difficult to combat, Nisbet said.

“It sometimes seems easier to point to the technology and criticize Facebook or Twitter or Instagram, rather than take on the larger issues, like our psychological vulnerabilities or societal polarization,” he said.

But there are ways to use psychology to battle disinformation campaigns, Kamenchuk and Nisbet said.

One way is to turn the tables and use technology for good. Online or social-media games such as Post-Facto, Bad News and The News Hero teach online fact-checking skills or the basic design principles of disinformation campaigns.

Because campaigns to spread false information often depend on stoking negative emotions, one tactic is to deploy “emotional dampening” tools. Such tools could include apps and online platforms that push for constructive and civil conversations about controversial topics.

More generally, diplomats and policymakers must work to address the political and social conditions that allow disinformation to succeed, such as the loss of confidence in democratic institutions.

“We can’t let the public believe that things are so bad that nothing can be done,” Kamenchuk said.

“We have to give citizens faith that what they think matters and that they can help change the system for the better.”

— Read more in Eric C. Nisbet and Olga Kamenchuk, “The Psychology of State-Sponsored Disinformation Campaigns and Implications for Public Diplomacy,” The Hague Journal of Diplomacy (22 April 2019)