Why Your Brain’s Wrong About Danger

The world is full of potential hazards. Unfortunately, the things we're afraid of often aren't the things most likely to hurt us. Our brains' admirably lightning-fast danger-detection system evolved in an environment very different from the modern world, so it tends to sound the alarm when the situation isn't really that dire. Worse, it can fail to activate when we face a genuine and present danger, leaving us prone to walk right into harmful situations. That doesn't mean we're totally SOL, however. With a little effort we can consciously arm ourselves with rational risk-benefit analysis and try to override our erroneous automatic impulses.

For example? Here are some perceived dangers that get blown out of proportion:

Cell phone radiation. Uncertainty is a powerful trigger of anxiety. When new technologies reach a broad public, they can seem uncomfortably mysterious, and hence become ready targets for health-scare furors. Remember Alar, silicone breast implants, electromagnetic fields from power lines? In the case of cell phones, the word “radiation” carries extra emotional baggage. The fact is, numerous studies have failed to turn up any conclusive evidence that cell phone emissions pose health risks, and the physics suggests such harm is most likely impossible.

Vaccinations. We tend to be less afraid of things that provide an immediate, palpable benefit. The problem with vaccinations is that we never really know if they’ve helped us or not. What’s more, we’re intuitively suspicious of things we perceive as artificial rather than natural. These biases have helped stoke widespread panic about childhood vaccinations, which in turn expose kids to very real risk.

Terrorism. Specific, reliable information is a powerful defense against fear, while vague or untrustworthy information increases anxiety. Though the government has provided no evidence that organized Muslim terrorists are operating within the US, air travelers are still told that the threat level is “orange”—whatever that means.

Conversely, some kinds of threats consistently fail to trigger our inner alarm system, so we’re apt to endanger ourselves by overlooking them. Some of the more pressing:

Driving. A sense of being in control suppresses fear, so most of us feel invincible when we’re behind the wheel. In reality, we’re at risk from other drivers and other hazards. Last year, an average of 82 Americans died every day on our roads.

Obesity. Because we don’t directly perceive the link between taking that extra bite and the catastrophic lethality of a heart attack or stroke, we find it hard to generate an emotional reaction to the dangers of obesity. Worse, because of a psychological phenomenon called “optimism bias,” we tend to overestimate our ability to change our eating habits. That’s unfortunate, because obesity is one of the worst and fastest-growing health problems facing this country.

Nuclear war. For decades, Americans worried that they might be wiped out in a nuclear attack. After the fall of Communism, that danger seemed to melt away. But thousands of nukes remain on standby. Many experts in the field of global risk assessment rank the danger of nuclear war as one of the few threats that could credibly end civilization in the near future. But since hardly anyone talks about it anymore, we can’t easily imagine an Armageddon scenario, fear it, or push for something to be done about it.

In his book How Risky Is It, Really?, risk management expert (and fellow PT blogger) David Ropeik dubs the difference between fear and actual risk “the Perception Gap.” This discrepancy can not only lead us to engage in dangerous behavior, he says, but can also foment health-damaging stress. The first step to responding more intelligently is to recognize how instinct steers us wrong. “Just as we use a seat belt to protect ourselves in a dangerous environment,” Ropeik says, “we can use knowledge of the cognitive biases to help protect us from dangerous misjudgment.”

That’s what he believes, at any rate—it could just be the optimism bias talking.