Flagging False Facebook Posts as Satire Helps Reduce Belief

Published 9 October 2019


If you want to convince people not to trust an inaccurate political post on Facebook, labeling it as satire can help, a new study finds.

Researchers at The Ohio State University found that flagging inaccurate political posts as having been disputed by fact-checkers or by fellow Facebook users was less effective at reducing belief in the falsehoods or at stopping people from sharing them.

However, labeling inaccurate posts as being humor, parody or a hoax did reduce Facebook users’ belief in the falsehoods and resulted in significantly less willingness to share the posts.

“We thought that fact-checking flags might work pretty well on Facebook, but that’s not what we found,” said R. Kelly Garrett, lead author of the study and professor of communication at Ohio State.

“It only helped to have flags for satirical posts. This raises some really interesting questions about why people are moved to disbelieve a claim when you tell them it is a hoax or satire, but not when journalists or even their peers say there is something wrong with the story.”

Garrett conducted the study with Shannon Poulsen, a doctoral student in communication at Ohio State. The results are published online in the Journal of Computer-Mediated Communication.

The researchers conducted two separate studies.

In the first, involving 218 adults from across the country in early 2018, participants completed a brief questionnaire that included measures of their positions, knowledge and beliefs about several political topics, as well as their political ideology, party affiliation and demographics.

One question asked how much they believed each of two inaccurate claims that had been prevalent on social media: one more likely to be believed by Republicans (“millions of illegal votes were cast in the 2016 presidential election”) and one more likely to be believed by Democrats (“Russia tampered with vote tallies in order to get Donald Trump elected president”).

The researchers also engaged in a subtle deception.

Participants gave consent and then were asked to sign into their Facebook account, granting researchers access to their user profile. But the researchers did not record their login information or access their accounts.

About two weeks later, the participants were contacted again. The researchers told them they would be shown “real Facebook content” that had appeared in their Facebook feeds. In reality, the research team created all the posts.