About three years ago, Search and Rescue professionals started to notice a change in the kind of emergency calls that were coming in. Typically, a rescue mission would start when a hiker failed to report in by a designated time. But then, with increasing frequency, the calls started coming from the missing people themselves: “Hi, I can’t find my trail, I don’t know where I am, I don’t have anything to eat, and it’s getting dark. Can you come get me?” The reason for the change was simple: more people were getting cell phones. Today, of course, everyone has one, and the majority of missing-person searches are now initiated by the missing.
This has turned out to be a mixed blessing for Search and Rescue. On the one hand, lost hikers with cell phones don’t have to wait a day or two before a search gets underway. On the other hand, more people are getting into trouble, because they know that if worst comes to worst, they’re just a phone call away from help. You don’t really need to study a map before you go into the backcountry, or carry a first aid kit, or enough food and clothes to get you through the night. Just make sure your cell phone bill’s paid up. People head off into the wilderness without thinking about it very much, because they feel that they don’t have to.
The dumbing down of backcountry rescue is just one example of a trend that’s affecting almost every aspect of our cognitive life. I call it mental outsourcing. More and more we’re using technology, especially smartphones, as auxiliary brains, delegating to them mental functions—such as memory, sense of direction, and problem-solving—that we used to routinely perform ourselves. Which is perfectly understandable: Why do things the hard way when the easy way is right there at your fingertips? But a growing body of research suggests that the more we offload mental effort, the more we lose the ability to perform those functions for ourselves, with measurable degradation of the corresponding brain regions. Our clever gadgets, in other words, are making us dumb.
Consider memory. The human brain is capable of storing incredible amounts of information. London cab drivers have long been required to spend years studying “the Knowledge,” an encyclopedic understanding of London’s dense road system. A 2000 study found that these cabbies experienced a marked increase in the size of the hippocampus, a crucial memory center. But thanks to mobile computing, the value of old-fashioned memorization is waning. It’s faster and easier to carry around the information you need on a memory chip. The notion of memorizing your friends’ phone numbers seems as antiquated as rotary dialing. Hippocampus, you’re fired.
Then there’s cognition. Once, daily life meant solving a never-ending series of minor mental puzzles, such as calculating tips and planning driving routes, that engage the brain’s cognitive functions. Now there are apps for these sorts of tasks. Using them is the mental equivalent of taking the elevator instead of the stairs. Taking the easy way once in a while makes no difference, but if you make a habit of it, you’re going to get flabby. A 2010 brain-scan study by researchers at McGill University found that people who use GPS instead of going through the effort of mentally mapping out their location have less gray matter in the corresponding region of the brain. The implication, the researchers suggest, is that heavy reliance on GPS can cause significant cognitive decline and might even spur the early onset of dementia.
Finally, consider the left inferior frontal junction, the region of the brain involved in switching attention from one task to another. It’s the part you use when dealing with a never-ending flood of email, texts, tweets, and phone calls. Mobile computing has turned us into a nation of 24/7 multitaskers, but unfortunately we’re much worse at it than we realize. Drivers who have a phone conversation while driving function as though at a blood-alcohol level of .08, the equivalent of being legally drunk. It turns out that frequent multitasking boosts activity in the left inferior frontal junction, but that doesn’t mean we do it better. On the contrary, in a 2009 paper a team of Stanford researchers reported that “heavy media multitaskers are more susceptible to interference from irrelevant environmental stimuli and from irrelevant representations in memory.” To paraphrase: Hello, ADHD.
I’m not saying that this new technology is an overall detriment to humanity. I’ve got an iPhone, too, and I use it all the time. I’m grateful for its convenience and power, and I believe that when used correctly it can boost brain power instead of melting it. The trick is to use it as a tool, not a crutch. So when I call my wife, I dial her digits from memory. When faced with a simple arithmetic problem, I run the numbers in my head. And when I head out into the woods, I make sure that I study a trail map first.
[This article originally appeared in the August 2012 edition of Red Bulletin magazine.]