As I post this, the world is supposed to be ending, according to a fervent group of Christian cultists. Despite the insignificant size of their membership, the group has attracted an enormous amount of press attention and internet buzz – mostly, I think, because of the remarkable self-confidence with which they peddle their lunatic project. How, we wonder, could someone believe something so baseless – and embrace it so fervently?
We should not be so smug. Erroneous theories aren’t just the province of the lunatic fringe. They’re part of everyone’s basic cognitive legacy. We are hardwired for a phenomenon I call “theory lock,” a predilection rooted in the fact that there’s one concept that the human brain finds almost impossible to grasp: “I don’t know.”
Our minds recoil from uncertainty. We are wired to find order in randomness and chaos. We look at clouds and see sheep. We look at stock price charts and detect patterns. We read our horoscope and think “yes, that totally applies to me!”
In evolutionary terms, this can be a useful feature. After all, when it comes to making decisions, we’re helpless without a theory, a way to make sense of the situation that we’re in. Powerlessness is a deeply upsetting and stressful condition. So when a theory, even a weak one, presents itself amid an explanatory vacuum, we instinctively seize hold and hang on for dear life.
Once we have a theory in our grasp, we begin to see everything through its lens. Information that might otherwise seem ambiguous, or even contradictory to the theory, gets understood within its framework. And so, just by holding a belief, we tend to gradually strengthen our conviction that it is true, a tendency that psychologists dub “confirmation bias.”
It’s hard to overstate the power of this effect. We might expect, for instance, that after the world fails to end, the doomsday cultists will be chastened and suffer a painful return to reality. But as Vaughan Bell points out in a recent Slate piece, past doomsday cultists have not only survived the failure of the world to end, but actually strengthened their faith, by finding in events further evidence for the rightness of their worldview.
Confirmation bias is one of those psychological phenomena that, when you hear about it, you can’t help but think: “Wow, that’s cool. But I’m pretty sure I don’t do that kind of thing.” For an everyday manifestation, though, consider the experience of getting lost. Heading from Point A to Point B, we misidentify a landmark, miss an intersection, get turned around – suddenly, we don’t know where we are. But of course, we don’t know that we don’t know. We believe we know exactly where we are, and we see plenty of evidence confirming that we’re right. In his entertaining book “You Are Here,” experimental psychologist Colin Ellard reports that many lost hikers who manage to return to their starting point do so utterly convinced that they were never lost in the first place. They’re the lucky ones: many less fortunate people who get lost in the wilderness are led by their self-confidence to leave the trail network and travel cross-country, a mistake that can have fatal consequences, especially in cold weather.
The need to have a theory is instinctive, but it is not entirely irrational. Search-and-rescue teams looking for a lost person or a missing aircraft always have to start out with a theory of what might have happened, and begin their search based on that supposition. Take the case, for instance, of Air France 447, the passenger flight that disappeared over the Atlantic two years ago. When the plane failed to arrive at its destination, searchers began by asking: Where was its last known position? How long did it likely remain in the air after that point? How fast was it flying? Once it hit the water, how far was the debris carried, and in what direction? To answer these questions, the French aeronautical authority called on the expertise of no fewer than 11 international bodies, and from their pooled wisdom defined a search zone.
Unfortunately, the same thing that happens to lost individuals also happened to the international search team: their theory was wrong. After nearly two years of searching, the French recognized that something was amiss. They consulted a new group of experts, came up with a new theory, and began looking in a new search zone. This time, they were able to find the wreckage in less than a week.
“The key thing when conducting a search,” says search-and-rescue trainer Craig McClure, “is not to get locked into your theory. If it’s not producing results, you have to go back to basics and come up with a different theory.”
This can be difficult, even for a group of highly trained experts. For the individual, it can be impossible. Although I personally am horrified by parents who blame vaccines for their children’s autism, I can understand the intensity of their beliefs. Science just doesn’t yet understand what causes autism, and this failure of understanding only adds to the exhaustion and stress that these parents suffer. And so they seize on the first seemingly plausible explanation that they come across. Jenny McCarthy has described how she began her impassioned advocacy by typing the word “autism” into Google – and the rest was history.
The more I’ve thought about theory lock, the more I’ve come to believe that it’s a fundamental force that affects the most basic ways we experience our world. The other day, I was talking about religion with my son’s nanny, an evangelical Christian. When I told her that I am an atheist, she hit me with a question that she clearly felt would be an irrefutable torpedo to my godlessness: “If God doesn’t exist, then who made the universe?” I told her that we simply don’t know. She clearly found this response deeply unsatisfying.
I understand her point of view. Atheism at heart isn’t a rejection of God; it’s a rejection of certainty in the face of the unknowable, an embrace of “I don’t know.” While rational, it’s an answer that goes against the grain of our nature. We want, we demand, a theory – any theory.