The Dynamics That Polarize Us on Social Media Are About to Get Worse
Worse, such systems are vulnerable to manipulation by well-organized groups with political agendas. For instance, Chinese nationalists reportedly mounted a campaign to edit Wikipedia entries on China–Taiwan relations to make them more favorable to China.
Political Polarization and Acrophily
Indeed, politics intensifies these dynamics. In the US, political identity increasingly dominates how people define their social groups.
Political groups are motivated to define “the truth” in ways that advantage them and disadvantage their political opponents. It’s easy to see how organized efforts to spread politically motivated lies and discredit inconvenient truths could corrupt the wisdom of crowds in Meta’s community notes.
Social media accelerates this problem through a phenomenon called acrophily, or a preference for the extreme. Research shows that people tend to engage with posts slightly more extreme than their own views.
These increasingly extreme posts are more likely to be negative than positive. Psychologists have known for decades that bad is more engaging than good: we are hardwired to pay more attention to negative experiences and information than to positive ones.
On social media, this means negative posts – about violence, disasters and crises – get more attention, often at the expense of more neutral or positive content.
Those who express these extreme, negative views gain status within their groups, attracting more followers and amplifying their influence. Over time, people come to think of these slightly more extreme negative views as normal, slowly moving their own views toward the poles.
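To see how this feedback loop compounds, consider a toy simulation (a minimal sketch only; the agent count, step count, extremity offset and shift size below are illustrative assumptions, not figures from the research cited here). Each simulated user repeatedly engages with a post slightly more extreme than their own view and shifts a small fraction of the way toward it:

```python
import random

# Toy model of acrophily drift (illustrative only; every parameter is an
# assumption made for this sketch, not a value from the cited studies).
# Each agent holds an opinion in [-1, 1]. At each step the agent engages
# with a post slightly more extreme than its own view and moves a small
# fraction of the way toward that post.

N_AGENTS = 1000   # simulated users (assumed)
STEPS = 200       # rounds of engagement (assumed)
EXTREMITY = 0.1   # how much more extreme the engaged post is (assumed)
PULL = 0.05       # fraction of the gap each engagement closes (assumed)

random.seed(0)
opinions = [random.uniform(-1, 1) for _ in range(N_AGENTS)]

def slightly_more_extreme(opinion):
    """Return a post a bit further from zero than the agent's own view."""
    direction = 1 if opinion >= 0 else -1
    return max(-1.0, min(1.0, opinion + direction * EXTREMITY))

for step in range(STEPS + 1):
    if step % 50 == 0:
        mean_extremity = sum(abs(o) for o in opinions) / N_AGENTS
        print(f"step {step:3d}: mean |opinion| = {mean_extremity:.2f}")
    opinions = [o + PULL * (slightly_more_extreme(o) - o) for o in opinions]
```

Even with a tiny nudge per engagement, the population's average extremity climbs steadily toward the poles: the acrophily dynamic in miniature.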
A recent study of 2.7 million posts on Facebook and Twitter found that messages containing words such as “hate”, “attack” and “destroy” were shared and liked at higher rates than almost any other content. This suggests that social media isn’t just amplifying extreme views — it’s fostering a culture of out-group hate that undermines the collaboration and trust needed for a system like community notes to work.
The Path Forward
The combination of negativity bias, in-group/out-group bias and acrophily supercharges one of the greatest challenges of our time: polarization. Through polarization, extreme views become normalized, eroding the potential for shared understanding across group divides.
The best solutions, which I examine in my forthcoming book, The Collective Edge, start with diversifying our information sources. First, people need to engage with — and collaborate across — different groups to break down barriers of mistrust. Second, they must seek information from multiple, reliable news and information outlets, not just social media.
However, social media algorithms often work against these solutions, creating echo chambers and trapping people’s attention. For community notes to work, these algorithms would need to prioritize diverse, reliable sources of information.
While community notes could theoretically harness the wisdom of crowds, their success depends on overcoming these psychological vulnerabilities. Perhaps increased awareness of these biases can help us design better systems — or empower users to use community notes to promote dialogue across divides. Only then can platforms move closer to solving the misinformation problem.
Colin M. Fisher is Associate Professor of Organizations and Innovation at UCL and the author of “The Collective Edge: Unlocking the Secret Power of Groups”. This article is published courtesy of The Conversation.