ARGUMENT: Social Media & Societal Harm

Nearing the Tipping Point Needed to Reform Facebook, Other Social Media?

Published 1 October 2021

The recent series of five articles from the Wall Street Journal exposed Facebook's complicity in spreading toxic content. Yet social media platforms continue to enjoy free rein despite playing what many consider to be an outsized and destabilizing role in delivering content to billions of individuals worldwide. No one said reining in social media was going to be easy. But the harm caused by social media is simply too great for us to fail to act.

The recent series of five articles from the Wall Street Journal exposing Facebook’s complicity in spreading toxic content underscores the need to recognize the harm caused by Facebook. Yet the company and other social media platforms continue to enjoy free rein despite playing outsized and destabilizing roles in determining what content is served up to billions of individuals worldwide.

The WSJ’s “Facebook Files” were based on internal documents, often obtained from company employees, that acknowledged the massive harm caused by the platform and the company’s failure to fix the problems. We already knew about Facebook’s deceptive, misleading, and inconsistent approach to content moderation. However, the articles offered still more proof—and more insider confirmation—that the company was not only aware of the dangers posed to its users, but worked to bury the troubling findings of its own researchers.

As many have pointed out, freedom of speech is not the same as freedom of reach. We have learned over and over again that we cannot trust the tech sector to self-regulate.

Repeatedly, these companies—and Facebook in particular—have been shown to play key roles in undermining democracy, casting doubt on research-backed science, facilitating human rights abuses, exacerbating angry echo chambers that deepen political rifts, and fomenting mental health crises.

Facebook has consistently blocked or sought to undermine non-governmental attempts to independently assess the prevalence of disinformation and other harmful content on its sites. For example, Facebook dismissed widely accepted scholarly research linking social media use to increased rates of depression as “inconclusive.” In this regard, Facebook is not unlike the tobacco company Philip Morris, the asbestos manufacturer Manville (formerly Johns-Manville), or the pharmaceutical giant Purdue Pharma. Like Facebook, these companies possessed internal research showing that their products were harmful, but buried those findings, attacked the work and credibility of outside experts whose conclusions they knew to be accurate, and put out information designed to create confusion or specious debate. Ultimately, all these companies faced, or are facing, legal accountability. Big Tech should not be exempt from consequences.