Bending the Map: Our minds are wired to find order in randomness. We look at clouds and see sheep. This can be useful for making decisions, since we’re helpless without a theory that makes sense of our quandary. But once we form a theory, we tend to see everything through it. A consequence is that when people get lost in the backcountry, they can convince themselves they know exactly where they are, a problem sometimes called bending the map.
A few years ago, three twentysomething skiers went out-of-bounds at the Jackson Hole Mountain Resort at Teton Village in Wyoming. Looking for fresh powder in Rock Springs Bowl, they took a wrong turn, headed north instead of south, and wound up at the bottom of Granite Canyon. If they’d been where they thought they were, the stream should have been flowing right to left, and heading left would have taken them back to the ski area. Instead, they found the stream flowing left to right. They knew they needed to go left to get home, but based on the topography of where they thought they were, they also had to go downhill. Eventually, they decided on a solution: In this particular case, the water had to be flowing uphill.
The group marched upstream, away from the ski area, and wound up spending the night in the snow without any survival gear. The next morning, they reconsidered their earlier logic and decided that, yes, the stream must indeed be flowing uphill. They bushwhacked another quarter mile in the wrong direction before a rescue helicopter found them and flew them to safety.
Such errors of overconfidence are due to a phenomenon psychologists call confirmation bias. “When trying to solve a problem, we get fixated on a specific option or hypothesis,” explains Jason Kring, president of the Society for Human Performance in Extreme Environments, “and ignore contradictory evidence and other information that could help us make a better decision.”
A vast collective error of confirmation bias unfolded in the past decade as investors, analysts, and financial advisers convinced themselves that legions of financial derivatives based on subprime mortgages were fundamentally sound. There was plenty of evidence to the contrary, but the money was so good that many found it easier to believe. They kept convincing themselves right up until the roof caved in.
Avoid the trick: To outsmart confirmation bias, make a habit of skepticism, including skepticism toward your own gut feelings and assumptions. If you’re part of a group that seems prone to agreement, play devil’s advocate to encourage others to share different points of view. “Don’t use your intuition to convince yourself that things are going right; use it to alert yourself to potential problems,” says Jeff Haack, a former search-and-rescue specialist for Emergency Management British Columbia. “Listen to those nagging doubts.”
Redlining: Mountain climbing at high altitudes is a race against time. Our endurance is severely limited in the face of extreme cold and limited oxygen, and windows of good weather can shut abruptly. Lingering too long is an invitation to disaster, so when mountaineers prepare to summit, they need to set a turnaround time and strictly abide by it.
The consequence of failing to heed this sacred rule was made gruesomely manifest on May 10, 1996. On that date, an unprecedented number of climbers were preparing for the final stage of their ascent of Mount Everest, including some who had paid as much as $65,000 each. For expedition leader Rob Hall, getting his clients safely to the top and back meant a turnaround time of 2 p.m. But the turnaround time came and went. Finally, at 4 p.m., the last straggler arrived at the summit, and Hall headed down. But it was too late.
Already a deadly storm had begun, lashing the mountain with hurricane-force winds and whiteout snow. Stuck on Everest’s exposed face, eight climbers died, one by one. Hall was one of the last to succumb. Trapped a few hundred feet below the summit, paralyzed by the cold and a lack of oxygen, he radioed base camp and was patched through via satellite to his wife at home in New Zealand. “Sleep well, my sweetheart,” he told her. “Please don’t worry too much.” Today his body remains where he sat.
Hall fell victim to a simple but insidious cognitive error that I call redlining. Anytime we plan a mission that requires setting a safety parameter, there’s a risk that in the heat of the moment, we’ll be tempted to overstep it. Divers see an interesting wreck just beyond the limit of their dive tables. Pilots descend through clouds to their minimum safe altitude, fail to see the runway, and go just a little bit lower.
It’s easy to think, I’ll just go over the redline a little bit. What’s the big deal? The problem is that once we do, there are no more cues reminding us that we’re heading in the wrong direction. A little bit becomes a little bit more, and at some point, it becomes too much. Nothing’s calling you back to the safe side.
A similar phenomenon has been dubbed the what-the-hell effect, such as when dieters control impulses with strict limits on their eating, a nutritional redline. One day, they slip up, eat a sundae, and boom—they’re over the line. “Now they’re in no-man’s-land,” says Markman, “so they just blow the diet completely. They binge.”
Avoid the trick: As in mountain climbing, the best response to crossing a redline is to recognize what you’ve done, stop, and calmly steer yourself back toward the safe side. Even when the stakes aren’t life-or-death, accept that the temptation to redline is real, and guard against it as best you can.