Social Media Posts Have Power, and So Do You
1. Reading Across Sources (Lateral Reading)
What Is the Problem?
It is important for individuals to know the difference between trustworthy and false or misleading information. However, identifying what is trustworthy can be difficult and is often influenced by the way information is presented online. For example, if someone reads about a product using only sources that are paid to promote it, that person might not know about any negative side effects. This practice can lead to one-sided understanding and put the person at risk. The same concept holds true when it comes to information about politics or current events.
In response to our survey, older U.S. adults reported that they are not always confident in identifying whether or not information is based on fact. Older generations, in particular, may feel unsure about how to know which information to trust. But there are practical approaches to teasing out fact-based information from the rest. Lateral reading is one simple strategy that has proven particularly effective for checking the trustworthiness of information.[8]
What Is the Solution?
Lateral reading is when you check whether information you found online is trustworthy by looking for corroborating evidence through other websites or sources. Lateral reading has been shown to be more effective than trying to determine whether something is trustworthy based only on clues from the information itself.[9] Before you share information, either online or in person, you should pause. The few minutes it takes to verify details across sources through lateral reading may keep you from contributing to the problem of false information.
After you have identified the information that you want to verify, follow these basic steps:
1. Open a new tab in your browser (e.g., Safari, Firefox, Google Chrome, Edge) on your computer or smartphone and navigate to a search engine, such as Google.
2. In the search engine, enter text about the topic in which you are interested. The search engine will bring up news stories and other websites that discuss the same topic.
3. Skim those additional sources to expand your understanding. Focus on information from organizations that do not seem like they are trying to influence you and news sites that do not aim to shock or excite readers.
4. Search for the author and the organization from which the information came. Look for answers to such questions as the following: Who paid to produce this work? What kinds of expertise do the author and the organization have? Do other people seem to think they are credible, and why?
Now that you are armed with more context, think critically about whether the information you found is based on facts. Practicing this process — opening a new tab in your browser and searching for additional sources — can help prevent you from accidentally sharing false or misleading information.
____________________________________________
What Is Prebunking?
This tool and its accompanying videos offer strategies commonly referred to as prebunking techniques. Prebunking is the process of exposing false or misleading information before it can be passed on to other people. Individuals can become skilled at prebunking by learning about common tricks used to spread bad information; by being aware of these tricks, people can better resist false information when they come across it. Research has shown that preventative messaging can be more effective than trying to correct inaccurate information after it has been spread.[7] Share this tool and these videos with your friends and family to help them learn prebunking strategies to resist false and misleading information.
____________________________________________
2. Resisting Emotional Manipulation
What Is the Problem?
Emotionally charged information travels fast online. Social media can make people feel strong emotions that lead them to share things quickly.[10] This concept is built into the design of social media platforms: Appealing to people’s emotions is one of the most certain ways to capture their time and attention online.[11]
Bad actors understand that people are more likely to interact with emotionally charged content. As a result, some people share things that are meant to make readers feel a certain way — even if details are exaggerated or untrue.[12] People are more likely to believe false information when they are in a heightened emotional state.[13]
Engaging with false and misleading information because of the emotions it evokes can have tremendously damaging effects. For instance, if someone believes false information about a group of people, that information may contribute to treating that group unfairly or even causing direct harm to the group.
What Is the Solution?
Although reading across sources is an important step in stopping the spread of false information, some misleading information cannot be confirmed as true or false. Such information might instead contain an opinion or an unprovable assertion. It may be sensational or accompany a shocking image. Because this kind of information often relies on emotional manipulation rather than verifiable claims, fact-checking alone is not always enough to curb its spread. To do that, people must be prepared to both fact-check and resist emotionally manipulative content.
To prevent the spread of emotionally manipulative information, take the following steps:
1. Before liking, sharing, or commenting on information online, ask yourself why you want to do so — that is, is it because of an emotional reaction or for another reason?
2. Take a moment to reflect and step back from any emotional response you notice after reading.
3. Think critically about the information at hand and about what you would accomplish by liking, sharing, or commenting on it.
Studies show that engaging in critical thinking rather than reacting solely based on emotion can reduce the chances of spreading false information.[14] By taking these steps, you can resist emotional manipulation and help prevent the spread of false and misleading content.
3. Taking Personal Responsibility
What Is the Problem?
Our survey showed that most people aged 55 years and older are concerned about false and misleading information spreading in the United States. However, our respondents were more worried about other people spreading bad information than about doing so themselves. Other studies show that, compared with younger people, older generations are less likely to feel that the spread of misinformation and disinformation is their personal responsibility.[15]
Some survey respondents said that they felt that social media companies and government agencies should be responsible for stopping the spread of false and misleading content, at least to an extent. However, people cannot always rely on these groups for protection from misleading posts online. Some information might be manipulative but not break any rules, so it would not be removed by companies or the government. Although some information categorized as “fake news” is filtered out of feeds or flagged by social media companies, their systems cannot filter everything. It is simply not enough to believe that someone else will address the problem.
What Is the Solution?
The solution may be shifting your mindset and sharing what you have learned. By taking the following steps, you can remind yourself of your powerful role in addressing the problem of spreading false or misleading information:
1. Remember that you play a crucial role in stopping the spread of false and misleading information.
2. Read across sources before interacting with content.
3. Reflect on your emotional state before engaging.
4. Initiate conversations with friends and family about stopping the spread of false and misleading information.
You can multiply your positive impact by discussing these strategies with family and friends, particularly those whom you notice sharing false or misleading information. Research shows that people are more likely to correct missteps in sharing false information when someone they trust responds with accurate information.[16]
A Call to Action: Limit the Influence of False and Misleading Information
The spread of false and misleading information can lead to a lack of trust in institutions, a breakdown in civil discourse, and increased polarization. By being responsible and using such strategies as checking multiple sources and avoiding emotional manipulation, people can reduce the impact of bad information during the next election cycle. By working together, individuals can create a more-informed society and protect democracy. Take action by using these strategies and discussing them with others.
Think before you share, because posts have power. And so do you.
Notes
[1] Stephan Lewandowsky, Ullrich K. H. Ecker, and John Cook, “Beyond Misinformation: Understanding and Coping with the ‘Post-Truth’ Era,” Journal of Applied Research in Memory and Cognition, Vol. 6, No. 4, 2017.
[2] Tanya Notley, Simon Chambers, Sora Park, and Michael Dezuanni, Adult Media Literacy in Australia: Attitudes, Experiences and Needs, Western Sydney University, Queensland University of Technology, and University of Canberra, 2021.
[3] Andrew Guess, Jonathan Nagler, and Joshua Tucker, “Less Than You Think: Prevalence and Predictors of Fake News Dissemination on Facebook,” Science Advances, Vol. 5, No. 1, 2019; Sander van der Linden, Jon Roozenbeek, and Josh Compton, “Inoculating Against Fake News About COVID-19,” Frontiers in Psychology, Vol. 11, 2020.
[4] Jacob Fabina and Zachary Scherer, “Voting and Registration in the Election of November 2020: Population Characteristics,” Current Population Reports, U.S. Census Bureau, P20-585, January 2022.
[5] John Cook, Stephan Lewandowsky, and Ullrich K. H. Ecker, “Neutralizing Misinformation Through Inoculation: Exposing Misleading Argumentation Techniques Reduces Their Influence,” PLOS One, Vol. 12, No. 5, 2017; Andrew M. Guess, Michael Lerner, Benjamin Lyons, Jacob M. Montgomery, Brendan Nyhan, Jason Reifler, and Neelanjan Sircar, “A Digital Media Literacy Intervention Increases Discernment Between Mainstream and False News in the United States and India,” Proceedings of the National Academy of Sciences, Vol. 117, No. 27, 2020; Jon Roozenbeek and Sander van der Linden, “The Fake News Game: Actively Inoculating Against the Risk of Misinformation,” Journal of Risk Research, Vol. 22, No. 5, 2019.
[6] Information about our survey methods is included in the box entitled “How This Survey Was Conducted.”
[7] Toby Bolsen and James N. Druckman, “Counteracting the Politicization of Science,” Journal of Communication, Vol. 65, No. 5, 2015.
[8] Sam Wineburg and Sarah McGrew, “Lateral Reading and the Nature of Expertise: Reading Less and Learning More When Evaluating Digital Information,” Teachers College Record, Vol. 121, No. 11, 2019.
[9] Wineburg and McGrew, 2019.
[10] Adam D. Kramer, Jamie E. Guillory, and Jeffrey T. Hancock, “Experimental Evidence of Massive-Scale Emotional Contagion Through Social Networks,” Proceedings of the National Academy of Sciences of the United States of America, Vol. 111, No. 24, 2014.
[11] Vian Bakir and Andrew McStay, “Fake News and the Economy of Emotions: Problems, Causes, Solutions,” Digital Journalism, Vol. 6, No. 2, 2018; Soroush Vosoughi, Deb Roy, and Sinan Aral, “The Spread of True and False News Online,” Science, Vol. 359, No. 6380, 2018.
[12] Bakir and McStay, 2018.
[13] Cameron Martel, Gordon Pennycook, and David G. Rand, “Reliance on Emotion Promotes Belief in Fake News,” Cognitive Research: Principles and Implications, Vol. 5, No. 47, 2020.
[14] Daniel A. Effron and Medha Raj, “Misinformation and Morality: Encountering Fake-News Headlines Makes Them Seem Less Unethical to Publish and Share,” Psychological Science, Vol. 31, No. 1, 2020.
[15] Notley et al., 2021.
[16] Leticia Bode and Emily K. Vraga, “See Something, Say Something: Correction of Global Health Misinformation on Social Media,” Health Communication, Vol. 33, No. 9, 2018; Leticia Bode, Emily K. Vraga, and Melissa Tully, “Do the Right Thing: Tone May Not Affect Correction of Misinformation on Social Media,” Harvard Kennedy School Misinformation Review, 2020.
Alice Huguet is codirector, Center for Qualitative and Mixed Methods; policy researcher; professor of policy analysis, Pardee RAND Graduate School; Julia H. Kaufman is associate research department director, Behavioral and Policy Sciences Department; codirector, American Educator Panels; senior policy researcher; professor of policy analysis, Pardee RAND Graduate School; Melissa Kay Diliberti is assistant policy researcher, RAND, and Ph.D. candidate, Pardee RAND Graduate School. This article is published courtesy of RAND.