Regulation or research? Searching for solutions to reduce Truth Decay in the media

Published 31 May 2018

What is social media’s role in the decline of trust in the media? Is government intervention needed to help stop the spread of misinformation on these platforms? These questions were the focus of a recent RAND Corporation event in Boston on the connection between the media and Truth Decay.

RAND says the panel of researchers agreed that, to begin solving the problem, more data, greater access to how social media platforms work, and increased transparency are needed.

Jennifer Kavanagh, a political scientist at RAND, opened the talk by defining what RAND researchers call “Truth Decay”—the diminishing role of facts and analysis in American public life. The panelists addressed how changes in the information system, including the rise of social media and the use of algorithms for news gathering, are driving Truth Decay. Kavanagh was joined by David Lazer, a professor of political science at Northeastern University, and Claire Wardle, a research fellow at the Shorenstein Center at Harvard Kennedy School and executive director of the nonprofit First Draft.


Everything you see on social media platforms is “algorithmically mediated,” Lazer said, but there is very little research on the role algorithms play.

A lot of disinformation is being shared globally through encrypted messaging apps and text messaging services, according to Wardle. Without access to the type and volume of content spread on these closed systems, she said researchers are missing a huge part of the ecosystem. More understanding of how these platforms work is needed before society moves toward “regulation with a capital ‘R’,” she said.

Increasing transparency would be a step in the right direction, said Kavanagh. Social media platforms could provide clarity on where their advertising money comes from, open their application programming interfaces, and work to identify and monitor bots on their systems. But the companies need incentives and encouragement to make these types of changes, which run counter to their business model. “Whether that’s regulation or the threat of regulation remains to be seen,” she said.

“Transparency does create its own kind of incentives,” Lazer added.

Kavanagh stressed that social media users and consumers of media need to be part of any solution to Truth Decay. “We can implement all the regulations that we want, but if people aren’t willing to look for facts and take the time to identify what is a fact, then I don’t think it makes a difference,” she said. “There has to be an understanding of why facts matter—and why it’s important to be an informed participant in democracy—if democracy is what you want.”