Deadly Mind Traps
Simple cognitive errors can have disastrous consequences—unless you know how to watch out for them.
By Jeff Wise published January 1, 2012 - last reviewed on June 9, 2016
The hiker who leaves a well-marked trail and wanders off, cross-country. The pilot who flies his perfectly maintained airplane into the ground. The kayaker who dives into a hydraulic whitewater "grinder" even though he's just seen it suck three buddies to their doom. "Gee," you think when you hear such tales, "I'd never do something like that."
But would you? We like to think of ourselves as pretty rational, but that's hardly how we seem from the perspective of accident investigators and search-and-rescue crews.
People who deal with the aftermath of human error can tell you all too well that otherwise normal, healthy individuals are exceptionally predisposed to making the kind of mistake best described as boneheaded.
Intriguingly, research into this kind of self-defeating behavior shows that it is usually far from random. When we make mistakes, we tend to make them in ways that cluster under a few categories of screwup. There's a method to our mindlessness. Most of the time, we're on autopilot, relying on habit and time-saving rules of thumb known as heuristics.
For the most part, these rules work just fine, and when they don't, the penalty is nothing worse than a scraped knee or a bruised ego. But when the stakes are higher, when a career is in jeopardy or a life is on the line, they can lead us into mental traps from which there is no escape. One slipup leads to another, and to another, in an ever-worsening spiral. The pressure ratchets up, and our ability to make sound decisions withers.
These cognitive errors are most dangerous in a potentially lethal environment like the wilderness or the cockpit of an aircraft, but versions of them can crop up in everyday life, too, such as when making decisions about what to eat, whom to date, or how to invest. The best defense? Just knowing they exist. When you recognize yourself starting to glide into one of these mind traps, stop, take a breath, and turn on your rational brain.
1: Redlining
Mountain climbing at high altitudes is a race against time. Human endurance is severely limited in the face of extreme cold and limited oxygen, and windows of good weather can shut abruptly. Lingering too long is an invitation to disaster, so when preparing a final push to the summit, mountaineers need to set a turnaround time and strictly abide by it.
The consequence of failing to heed this sacred rule was made gruesomely manifest on May 10, 1996. On that date an unprecedented number of climbers were preparing to make the final stage of their ascent of Everest, including two commercial teams of 16 customers who had paid as much as $65,000 each to reach the top of the world. For expedition leader Rob Hall, getting his clients safely to the top and back meant abiding by a turnaround time of 2 p.m. But all morning, miscommunication slowed the climbers' progress.
The turnaround time came and went. One by one, climbers straggled to the top, briefly celebrated, then descended. Hall remained, waiting for the last of his clients to summit. Finally, at 4 p.m., the final straggler arrived, and Hall headed down. But it was too late. Already, a deadly storm system had begun to close in, lashing the mountain with hurricane-force winds and whiteout snow. Stuck on Everest's exposed face, eight climbers died, one by one. Hall was one of the last to succumb. Trapped a few hundred feet below the summit, paralyzed by the cold and a lack of oxygen, he radioed his colleagues at base camp and was patched through via satellite to his wife back home in New Zealand. "Sleep well, my sweetheart," he told her. "Please don't worry too much." Today his body remains where he sat.
Hall fell victim to a simple but insidious cognitive error common to many types of high-pressure undertakings. I call it "redlining." Anytime we plan a mission that requires us to set a safety parameter, there's a risk that in the heat of the moment we'll be tempted to overstep it. Divers see an interesting wreck or coral formation just beyond the maximum limit of their dive tables. Airplane pilots descend through clouds to their minimum safe altitude, fail to see the runway, and decide to go just a little bit lower.
It's easy to think: I'll just go over the redline a little bit. What difference will it make? The problem is that once we do, there are no more cues reminding us that we're heading in the wrong direction. A little bit becomes a little bit more, and at some point it becomes too much. Nothing's calling you back to the safe side.
A related phenomenon has been dubbed the "what-the-hell effect," which can occur when dieters try to control their impulses by setting hard-and-fast daily limits on their eating, a kind of nutritional redline. One day, they slip up, eat a sundae, and boom—they're over the line. "Now they're in no-man's-land," says Art Markman, professor of psychology at the University of Texas at Austin, "so they're just going to blow the diet completely. They're going to binge."
As in mountain climbing, the best response to passing a redline is to recognize what you've done, stop, and calmly steer yourself back toward the right side. "Focus on the outcome," says Markman. "For dieters, what's important is the long-term process, not what happens on any individual day."
2: The Domino Effect
The problem began with a minor malfunction. Scott Showalter, a 34-year-old Virginia dairy farmer, was trying to transfer manure from one holding pit to another when the pipe between them became clogged. As he'd done before, he climbed down to free the obstruction. What he neither saw nor sensed was the invisible layer of methane gas that filled the bottom of the pit. Deprived of oxygen, he keeled over within seconds. When an employee, Amous Stoltzfus, climbed down to Showalter's aid, he too succumbed, but not before his shouts drew the attention of Showalter's wife and two of their daughters, aged 9 and 11. One by one, each climbed down to rescue the others, and each one died in turn. Within minutes, five people were dead. "It was a domino effect," Sheriff Don Farley later told reporters.
Similar tragedies play out time and again when people try to rescue companions. A teen jumps from a dangerous waterfall and disappears; his buddies follow, one after the other, until they all drown. A firefighter goes into a burning building to rescue a comrade; another goes in after him, then another.
In each case, the domino effect results from a deep-seated emotion: the need to help others. Altruism offers an evolutionary advantage but can compel us to throw our lives away for little purpose. "In stressful situations, you see a failure in the working memory, which is involved in inhibiting impulses," says Sian Beilock, a psychology professor at the University of Chicago. "People lose the ability to think about the long-term consequences of their actions."
If you ever find yourself in an unfolding tragedy like the Showalters', Beilock recommends pausing for a moment and taking a deep breath. "Even taking one step back sometimes allows you to see it in a different light, to maybe think, My efforts would be better spent running to get help. I imagine that in these situations, that's an alternative that isn't even considered."
Something similar unfolds in some romantic relationships, when partners, perhaps unwittingly, enable or get sucked into their partner's addictions or narcissism. "You end up doing things for the other person even though it's not in your own best interest," Beilock says, "or even in the interest of the relationship." That kind of love is like a manure pit: The only way you can save yourself is to get the hell out.
3: Situational Blindness
In December 2009, John Rhoads and his wife, Starry Bush-Rhoads, headed back to their home in Nevada after a visit to Portland, Oregon. Following the directions of their GPS, they drove south on U.S. Highway 97 through Bend, then turned left on Oregon Highway 31, passing through a dramatic high desert landscape before connecting with the highway to Reno near the California border.
Near the town of Silver Lake, Oregon, their GPS told them to turn off the highway, onto a little-used forest road. If they'd continued straight, they'd have been home in five hours. But the GPS was set to "shortest route," not "fastest." The dirt road took them into ever-deepening snow. After driving more than 30 miles, they got stuck, managed to dig themselves out, drove farther, then got stuck again. They tried calling 911, but couldn't get cell phone reception. For three days, they huddled for warmth, until they finally managed to get a signal and call for help. A sheriff's deputy came to winch out their car. "Who knows what would have happened if they would have been up there for a few more days?" the deputy told reporters.
As GPS units and satellite navigation apps have flourished over the past few years, there's been a spate of similar cases, in which travelers follow their devices blindly and wind up getting badly lost. In each case, the underlying mistake is not merely technological but perceptual: the failure to remain aware of one's environment, what aviation psychologists call situational awareness, or SA. People have always had difficulties maintaining SA, psychologists say, but the proliferation of electronic devices, and our blind faith that they will keep us safe, has led to an epidemic of absentmindedness.
"A big element in SA is paying attention to cues," says Jason Kring, president of The Society for Human Performance in Extreme Environments. "If you're focusing just on that GPS unit, and you see that little icon moving down the road, and say to yourself, OK, I know where I am, technically, that can be a big problem, because you're not looking at the world passing by your windshield."
Full situational awareness requires incorporating outside information into a model of your environment, and using that model to predict how the situation might change. If all you're doing is following the line of the GPS, and it turns out to be wrong, you'll be completely clueless about what to do next.
In daily life we rely on what Beth Blickensderfer, a professor of applied psychology at Embry-Riddle Aeronautical University, calls "social SA" to navigate our way through the human maze. When you miss social cues, says Blickensderfer, an embarrassing faux pas can occur. "Using swear words is completely fine in some settings," she says. "In others, it's not." As the stranger in a crowd, you'll have to pay attention to figure out what's appropriate.
4: Double or Nothing
In February 2003, a group of foreign tourists visiting northern California prepared to watch a hot-air balloon take off at the Domaine Chandon vineyard near Yountville. Shortly before 8 a.m., the ground crew were untethering the inflated balloon when one of the tourists, a 33-year-old Scot named Brian Stevenson, grabbed hold of the basket, apparently in an attempt to help. The pilot lit the propane burners, and with a roar of gas and flame the balloon began to rise.
Stevenson held on, despite a chorus of shouts from the ground urging him to let go. The balloon rose quickly: 10 feet, 20, 40, 100. The empty air below Stevenson's dangling feet stretched to a horrifying distance. At 300 feet, he could hold on no longer. His fellow tourists watched helplessly as he plummeted to his death.
If a balloon unexpectedly begins to rise, a person hanging on can follow a deadly logic: When he's only been lifted a foot or two into the air, he may think, Oh, that's no big deal, I can just step down if I need to. Then suddenly he's at six feet, and thinks, I could twist an ankle, I'd better hang on and wait until it gets lower. Before he knows it, he's at 25 feet, realizing that a jump would cause serious injury at best.
To avoid this predicament, balloon ground-handling crews are trained always to observe an inviolable rule: Never let both feet leave the ground.
The runaway-balloon problem is a manifestation of our irrational assessment of risks and rewards. As Daniel Kahneman and Amos Tversky first pointed out back in 1979, we tend to avoid risk when contemplating potential gains but seek risk to avoid losses. For instance, if you offer people a choice between a certain loss of $1,000 and a 50-50 chance of losing $2,500, the majority will opt for the riskier option, to avoid a definite financial hit. From the perspective of someone dangling 20 feet in the air, the gamble that they might be able to ride the gondola safely back down to the ground seems preferable to a guaranteed pair of broken legs. But in the moment, they can't factor in the price they'll pay if they lose.
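The arithmetic behind Kahneman and Tversky's example is worth making explicit: on a pure expected-value basis, the gamble is the worse option, yet most people choose it anyway. A minimal sketch (using only the numbers given above) shows the comparison:

```python
def expected_value(outcomes):
    """Probability-weighted average of (probability, payoff) pairs."""
    return sum(p * x for p, x in outcomes)

# A certain loss of $1,000 versus a 50-50 chance of losing $2,500.
sure_loss = expected_value([(1.0, -1000)])            # -1000.0
gamble = expected_value([(0.5, -2500), (0.5, 0)])     # -1250.0

# The gamble costs $250 more on average, yet most people take it
# for the chance of losing nothing at all.
print(sure_loss, gamble)
```

The point is that risk-seeking over losses isn't a failure to do the math; it's that the sting of a guaranteed loss outweighs the coldly computed average.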
Casinos make a good profit from our propensity for risk-seeking behavior. Gamblers wind up in a hole, then instinctively take bigger and bigger risks in an attempt to recoup the losses. Most go in hoping for the best, but to a veteran in the field of applied psychology, it's a foregone conclusion. Says Markman: "I always tell my students, if you're tempted to go to Vegas, just write me a check instead."
5: Bending the Map
Our minds recoil from uncertainty; we are wired to find order in randomness. We look at clouds and see sheep. This can be a useful trait when it comes to making decisions, since we're helpless without a theory that makes sense of our quandary. Unfortunately, once we form a theory, we tend to see everything through its lens. It's hard to let go of a fixed belief. A consequence is that when people get lost in the back country, they can convince themselves that they know exactly where they are, a problem known in the search-and-rescue community as "bending the map."
A few years ago three twentysomething skiers went out-of-bounds at the Teton Springs Ski Area in Idaho. Looking for fresh powder in Rock Creek Canyon, they took a wrong turn, headed north instead of south, and wound up at the bottom of Granite Canyon. If they'd been where they thought they were, the stream should have been flowing right to left, and heading left would have taken them back to the ski area. Instead, they found the stream flowing left to right. They knew they needed to go left to get home, but based on the topography of where they thought they were, they also had to go downhill. Eventually, they decided on a solution: In this particular case, the water had to be flowing uphill.
They marched upstream, away from the ski area, and wound up having to spend the night in the snow without any survival gear. The next morning, they reconsidered their earlier logic, only to reconfirm it: yes, the stream must indeed be flowing uphill. They bushwhacked another quarter mile in the wrong direction before a rescue helicopter found them and flew them to safety.
Such errors of overconfidence are due to a phenomenon psychologists call confirmation bias. "When trying to solve a problem or troubleshoot a problem, we get fixated on a specific option or hypothesis," explains Kring, "and ignore contradictory evidence and other information that could help us make a better decision."
A vast collective error of confirmation bias unfolded in the past decade as investors, analysts, and financial advisers all managed to convince themselves that legions of financial derivatives based on subprime mortgages were fundamentally sound. There was plenty of evidence to the contrary, and many commentators pointed out the facts. But the money was so good that too many found it easier to keep believing. They kept convincing themselves right up until the roof caved in.
How can you avoid confirmation bias? You can employ some of the same strategies for sidestepping other mind traps. Take yourself off autopilot. Become aware of your environment. Make a habit of skepticism, including skepticism toward your own assumptions and gut feelings. "Don't use your intuition to convince yourself that things are going right, use it to alert you to potential problems," says Jeff Haack, search-and-rescue specialist for Emergency Management British Columbia. "Listen to those niggling doubts."