The Danger Is Real: Why We’re All Wired for ‘Constructive Conspiracism’

Shermer notes that Jared Diamond, the UCLA geographer and Pulitzer Prize-winning author of Guns, Germs, and Steel, has identified what he calls "constructive paranoia," or "the importance of being attentive to hazards that carry a low risk each time but are encountered frequently." While out in the rain forest with local colleagues in Papua New Guinea one night, Diamond proposed that they pitch their tents under a big tree. "To my surprise," wrote Diamond, "my New Guinea friends absolutely refused. They explained that the tree was dead and might fall on us."

At first, Diamond thought them paranoid. Over the years, however, he formed a different opinion: “I came to realize that every night that I camped in a New Guinea forest, I heard a tree falling. And when I did a frequency/risk calculation, I understood their point of view.” If the odds of a tree falling on you any given night are only one in 1,000, but you sleep under trees every night, “you’ll be dead within a few years.”
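Diamond's frequency/risk reasoning can be sketched with a quick calculation. The one-in-1,000 nightly odds are the figure quoted above; treating each night as an independent trial is my own simplifying assumption:

```python
# Cumulative risk of a low-probability hazard encountered nightly.
# Assumes each night is an independent 1-in-1,000 trial.
def survival_probability(p_per_night: float, nights: int) -> float:
    """Chance of getting through all the nights unharmed."""
    return (1 - p_per_night) ** nights

p = 1 / 1000
for years in (1, 5, 10):
    nights = years * 365
    risk = 1 - survival_probability(p, nights)
    print(f"{years:2d} years of sleeping under trees: {risk:.0%} cumulative risk")
```

After a single year the cumulative risk is already around 30 percent, and within a decade it approaches certainty, which is exactly the point Diamond's New Guinea friends were making.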

"I would like to adapt Diamond's idea to what I call constructive conspiracism: Sometimes 'they' really are out to get you, so it pays to be careful," Shermer writes, adding:

In his 2018 book Enlightenment Now: The Case for Reason, Science, Humanism, and Progress, Harvard psychologist Steven Pinker argued that, in our ancestral environment, the cost of overreacting to a threat was less than the cost of underreacting, so we have become programmed to err on the side of overreaction. That is, we expect the worst.

Pinker traces the blame for our evolved constructive paranoia all the way back to the Second Law of Thermodynamics, which states that the total entropy (or disorder) of a closed system (one neither losing nor gaining mass or energy, such as through human intervention) cannot decrease over time; it can only remain constant or increase. Systems tend to move from order to disorder, from organization to disorganization, from structured to unstructured. Absent outside intervention, metal rusts; wood rots; weeds overwhelm gardens; bedrooms get cluttered; and social, political and economic systems fall apart. Which is to say, the laws of physics dictate that it is far easier for things to go bad than good (except in regard to those few tasks that leverage entropy to human advantage, such as the passive evacuation of heat from a rolled metal ingot or forged object). As Pinker puts it, "the Second Law defines the ultimate purpose of life, mind, and human striving: to deploy energy and information to fight back the tide of entropy and carve out refuges of beneficial order."

The ne plus ultra explanation for entropy can be found on that ubiquitous bumper sticker, "Shit Happens." So-called "misfortunes," like accidents, disease and famine, typically have no purposeful agency behind them—no gods, demons or witches, intending us evil—just entropy taking its course. But people do tend to seek out hidden sources of agency as a means to explain the presence of misfortune in our lives (for reasons described in the paragraphs below), which is why we attribute many of life's outcomes to far-fetched conspiracies. While a totalizing immersion in conspiracism can destroy one's perspective and rational faculties, our susceptibility to conspiracy theories isn't some stray programming bug that infects our cognition. It's a systematic habit rooted directly in mental reflexes that served us well in our ancestral environment.

In my 2011 book The Believing Brain: From Ghosts and Gods to Politics and Conspiracies—How We Construct Beliefs and Reinforce Them as Truths, I discussed a quality we all share called patternicity, which is the tendency to find meaningful patterns in data that might well be completely random.

To explain why we evolved this feature in our thinking, let's start with a thought experiment: Imagine you lived three million years ago on the plains of Africa as a small-brained bipedal primate that was highly vulnerable to the region's many terrifying predators. You hear a rustle in the grass. Is it just the wind or is it a dangerous animal? If you assume that the rustle in the grass is a dangerous predator, but it turns out that it's just the wind, you have generated a "false positive"—believing something is real when it isn't. There's no harm, though: You simply move away and become more alert and cautious. But if you assume that the rustle in the grass is just the wind, and it turns out to be a dangerous predator, you have generated a "false negative," while the predator has gained a meal. Over the course of many such meals, those primates susceptible to false negatives will enter the fossil record before they can reproduce.
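The asymmetry in this thought experiment can be made concrete as a comparison of expected costs. The specific numbers below are illustrative assumptions, not figures from the text; the logic is just that a false negative is vastly more expensive than a false positive:

```python
# Signal-detection sketch of the rustle-in-the-grass dilemma.
# All payoff numbers are made-up assumptions for illustration.
P_PREDATOR = 0.01          # chance a given rustle really is a predator
COST_FALSE_POSITIVE = 1    # wasted energy fleeing from the wind
COST_FALSE_NEGATIVE = 1000 # being eaten

def expected_cost(assume_predator: bool) -> float:
    if assume_predator:
        # Always flee: you pay only the small cost, when it was just wind.
        return (1 - P_PREDATOR) * COST_FALSE_POSITIVE
    # Always ignore: you pay the catastrophic cost, when it was a predator.
    return P_PREDATOR * COST_FALSE_NEGATIVE

print(expected_cost(True))   # always-flee strategy  -> 0.99
print(expected_cost(False))  # always-ignore strategy -> 10.0
```

Even when predators account for only one rustle in a hundred, the skittish strategy is roughly ten times cheaper in expectation, which is why selection favors erring toward the false positive.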

This is what I mean by constructive conspiracism: If it turns out there is no danger, no harm is done and little energy is expended in indulging these momentary spasms of paranoia. If it turns out that there is danger, on the other hand, being constructively paranoid pays off.

Examples of how these mental reflexes project themselves onto grand narratives abound. And in all cases, they involve the historical equivalent of grass rustling—a princess who died in a Paris car crash; a man opening up an umbrella on a sunny day in Dallas; a BBC newscaster announcing the fall of a World Trade Center building before it actually fell. Conspiracy theorists have dedicated their lives to searching the long grass for the still-hidden creatures that supposedly engineered these tragedies. They'll never find them because they don't exist. But given all the very real predators that have tried to devour us over the eons, you can't blame them for looking.