TRUTH DECAY
The Dynamics That Polarize Us on Social Media Are About to Get Worse

By Colin M. Fisher

Published 14 January 2025

Meta founder and CEO Mark Zuckerberg has announced that Facebook, Instagram and Threads, instead of relying on independent third-party factcheckers, will now emulate Elon Musk’s X in using “community notes”. But research shows that political polarization can prevent community fact-checking from even occurring, and, worse, that community-notes systems are vulnerable to manipulation by well-organized groups and foreign governments with political agendas.

Meta founder and CEO Mark Zuckerberg has announced big changes in how the company addresses misinformation across Facebook, Instagram and Threads. Instead of relying on independent third-party factcheckers, Meta will now emulate Elon Musk’s X (formerly Twitter) in using “community notes”. These crowdsourced contributions allow users to flag content they believe is questionable.

Zuckerberg claimed these changes promote “free expression”. But some experts worry he’s bowing to right-wing political pressure, and will effectively allow a deluge of hate speech and lies to spread on Meta platforms.

Research on the group dynamics of social media suggests those experts have a point.

At first glance, community notes might seem democratic, reflecting values of free speech and collective decisions. Crowdsourced systems such as Wikipedia, Metaculus and PredictIt, though imperfect, often succeed at harnessing the wisdom of crowds — where the collective judgement of many can sometimes outperform even experts.

Research shows that diverse groups that pool independent judgements and estimates can be surprisingly effective at discerning the truth. However, wise crowds seldom have to contend with social media algorithms.
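To see why pooling independent judgements works, consider a minimal simulation (not from the article; the true value, crowd size and noise level are all invented for illustration):

```python
import random

# Hypothetical wisdom-of-crowds demo: many independent, noisy guesses
# of a true quantity. Individuals err, but their errors cancel out.
random.seed(42)
TRUE_VALUE = 100.0   # the quantity being estimated (invented)
CROWD_SIZE = 1_000   # number of independent judges (invented)

# Each guess is unbiased but noisy (standard deviation of 25).
guesses = [random.gauss(TRUE_VALUE, 25) for _ in range(CROWD_SIZE)]

crowd_estimate = sum(guesses) / len(guesses)
mean_individual_error = sum(abs(g - TRUE_VALUE) for g in guesses) / len(guesses)

print(f"crowd estimate:        {crowd_estimate:.1f}")            # close to 100
print(f"crowd error:           {abs(crowd_estimate - TRUE_VALUE):.1f}")
print(f"mean individual error: {mean_individual_error:.1f}")     # around 20
```

The cancellation depends on the judgements being independent. If everyone’s errors are correlated, for instance because an algorithm has shown them all the same misleading post, averaging no longer removes the error. That is exactly the problem the rest of this article describes.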

Many people rely on platforms such as Facebook for their news, risking exposure to misinformation and biased sources. Relying on social media users to police information accuracy could further polarize platforms and amplify extreme voices.

Two tendencies rooted in our psychological need to sort ourselves and others into groups are of particular concern: in-group/out-group bias and acrophily (love of extremes).

In-group/Out-group Bias
Humans are biased in how they evaluate information. People are more likely to trust and remember information from their in-group — those who share their identities — while distrusting information from perceived out-groups. This bias leads to echo chambers, where like-minded people reinforce shared beliefs, regardless of accuracy.

It may feel rational to trust family, friends or colleagues over strangers. But in-group sources often hold similar perspectives and experiences, offering little new information. Out-group members, on the other hand, are more likely to provide diverse viewpoints. This diversity is critical to the wisdom of crowds.

But too much disagreement between groups can prevent community fact-checking from even occurring. Many community notes on X, such as those related to COVID vaccines, were likely never shown publicly because users disagreed with one another. The benefit of third-party factchecking was that it provided an objective outside source, rather than requiring widespread agreement among users across a network.
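For a concrete sense of how cross-group disagreement keeps notes hidden, here is a highly simplified sketch of the gating idea. The function name, viewpoint labels and threshold below are invented; X’s real scoring uses a more elaborate model, but it likewise requires agreement from raters with differing perspectives before a note is displayed:

```python
# Hypothetical sketch of the "agreement gate" behind community notes:
# a note is shown only if raters from *different* viewpoints agree it
# is helpful. All names and thresholds are invented for illustration;
# X's actual system uses a more elaborate statistical model.

def note_is_shown(ratings, min_helpfulness=0.7):
    """ratings: list of (viewpoint, is_helpful) pairs, e.g. ("left", True)."""
    by_side = {}
    for viewpoint, is_helpful in ratings:
        by_side.setdefault(viewpoint, []).append(is_helpful)
    # Every viewpoint must, on average, rate the note helpful,
    # and raters from at least two viewpoints must have weighed in.
    return (
        all(sum(votes) / len(votes) >= min_helpfulness
            for votes in by_side.values())
        and len(by_side) >= 2
    )

# A note praised by one side and panned by the other never appears:
polarized = [("left", True)] * 9 + [("right", False)] * 9
print(note_is_shown(polarized))   # False

# Only broad cross-group agreement clears the gate:
consensus = [("left", True)] * 9 + [("right", True)] * 8 + [("right", False)]
print(note_is_shown(consensus))   # True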