Deadly Mind Tricks

Updated: Apr. 06, 2020

However solid our intuition may feel, our gut instincts can lead us into fatal errors. Learn about the brain science behind five mental traps and how to avoid them.

MIND TRICK: “Similar tragedies play out time and again when people try to rescue companions.”

Domino Effect: The problem began with a minor malfunction. Scott Showalter, a 34-year-old Virginia dairy farmer, was trying to transfer manure from one holding pit to another when the pipe between them became clogged. As he’d done before, he climbed down to free the obstruction. But what he neither saw nor sensed was the invisible layer of hydrogen sulfide gas that filled the bottom of the pit. He keeled over within seconds. When an employee, Amous Stolzfus, climbed down to Showalter’s aid, he too succumbed, but not before his shouts drew the attention of Showalter’s wife and two of their daughters, ages 9 and 11. One by one, each climbed down to rescue the others, and each one died in turn.

Similar tragedies play out time and again when people try to rescue companions. A teen jumps from a dangerous waterfall and disappears; his buddies follow, one after the other, until they all drown. A firefighter goes into a burning building to rescue a comrade; another goes in after him, then another.

In each case, the domino effect results from a deep-seated emotion: the need to help others. The fear response shuts down areas of the brain that handle complex thoughts and planning, but it doesn’t affect simple emotions or well-learned habits like altruism. So we’re driven to think about helping others instead of rationally identifying potential hazards, like invisible poison gas or an underwater hydraulic. “People lose the ability to think about the long-term consequences of their actions,” says Sian Beilock, PhD, a professor of psychology at the University of Chicago.

Avoid the trick: If you ever find yourself in an unfolding tragedy like the Showalters’, Beilock recommends pausing for a moment to take a deep breath and think about what’s going on. “Even taking one step back sometimes allows you to see it in a different light, to maybe think, My efforts would be better spent running to get help,” she says. Of course, it’s extremely difficult to separate rational thought from emotion during an unfamiliar crisis. Planning for potential dangers can help; for instance, every family should practice a fire drill routine in their home.

MIND TRICK: “When the balloon began to rise, he held on, despite a chorus of shouts from the ground urging him to let go.”

Double or Nothing: In February 2003, a group of foreign tourists visiting Northern California prepared to watch a hot-air balloon take off at the Domaine Chandon vineyard near Yountville. Shortly before 8 a.m., the ground crew was repositioning the inflated balloon when one of the tourists, a 33-year-old Scot named Brian Stevenson, grabbed hold of the basket, perhaps in an attempt to help.

But when the balloon began to rise, Stevenson held on, despite a chorus of shouts from the ground urging him to let go. The balloon rose quickly: 10 feet, 20, 40, 100. The empty air below Stevenson’s dangling feet stretched to a horrifying distance; soon, he could hold on no longer. His fellow tourists watched as he fell to his death.

If a balloon unexpectedly begins to rise, a person hanging on can follow a deadly logic: When he’s only been lifted a foot or two in the air, he may think, Oh, that’s no big deal. I can just step down if I need to. Then suddenly he’s at six feet and thinks, I could twist an ankle, I’d better hang on and wait until it gets lower. Before he knows it, he’s at 25 feet, realizing that a jump would cause serious injury at best.

The runaway-balloon problem is a manifestation of our irrational assessment of risks and rewards. We tend to avoid risk when we’re contemplating potential gains but seek risk to avoid losses. For instance, if you offer people a choice between a certain loss of $1,000 and a fifty-fifty chance of losing $2,500, the majority will opt for the riskier option, to avoid a definite financial hit. From the perspective of someone dangling 20 feet in the air, the gamble that he might be able to ride the gondola safely back to the ground seems preferable to a guaranteed pair of broken legs. But in the moment, he can’t factor in the price he’ll pay if he loses.
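The math behind this trap is simple enough to check. Here is a minimal sketch in Python using the article’s own figures; it compares the sure loss against the gamble’s expected loss.

```python
# Minimal sketch: comparing the article's two options by expected value.
# Option A: a certain loss of $1,000.
# Option B: a fifty-fifty gamble on losing $2,500 (or nothing).

certain_loss = 1_000
gamble_loss, p = 2_500, 0.5

expected_gamble = p * gamble_loss  # 0.5 * 2,500 = $1,250 on average

print(f"Sure loss:             ${certain_loss:,}")
print(f"Gamble, expected loss: ${expected_gamble:,.0f}")

# On average the gamble costs $250 more, yet most people choose it:
# the slim chance of escaping with no loss at all outweighs the
# arithmetic. That's risk-seeking in the domain of losses.
```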

Avoid the trick: Casinos make a good profit from our flawed ability to calculate true risk. Gamblers wind up in a hole, then instinctively take bigger and bigger risks in an attempt to recoup the losses. To a veteran in the field of applied psychology, it’s a foregone conclusion. “I always tell my students, if you’re tempted to go to Vegas, just write me a check instead,” says Art Markman, PhD, a professor of psychology at the University of Texas at Austin.

MIND TRICK: “The narrow road took them into ever-deepening snow.”

Situational Blindness: In December 2009, John Rhoads and his wife, Starry Bush-Rhoads, headed back to their home in Nevada after a visit to Portland, Oregon. Following the directions of their GPS, they drove south on U.S. Highway 97 through Bend, then turned left onto Oregon Highway 31, passing through a dramatically beautiful high-desert landscape on a route that would connect with the highway to Reno near the California border.

Near the town of Silver Lake, Oregon, their GPS told them to turn off the highway onto a little-used forest road. If they’d continued straight, they’d have been home in under six hours. But their GPS was programmed to take the “shortest route,” not the “fastest.” The narrow road took them into ever-deepening snow. After driving more than 30 miles, they got stuck, managed to dig themselves out, drove farther, and then got stuck again. They tried calling 911 but couldn’t get cell phone reception. For three days, the couple huddled for warmth until they finally managed to get a cell phone signal and call for help. A sheriff’s deputy came to winch out their car.
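That one settings choice, minimizing distance instead of travel time, is enough to flip a route recommendation. The sketch below shows the idea with a toy Dijkstra route search; the road graph, mileages, and times are invented for illustration, not the couple’s actual route data.

```python
import heapq

# A toy road network. Every edge stores (miles, minutes); the forest road
# is shorter in miles but far slower than the highway. All names and
# numbers are hypothetical.
graph = {
    "Silver Lake": {"forest road": (5, 10), "Hwy 31 south": (8, 8)},
    "forest road": {"Reno": (170, 480)},    # short, slow, snowbound
    "Hwy 31 south": {"Reno": (220, 240)},   # long, fast
}

def best_route(src, dst, metric):
    """Dijkstra's algorithm. metric=0 minimizes miles ('shortest route');
    metric=1 minimizes minutes ('fastest route')."""
    frontier, visited = [(0, src, [src])], set()
    while frontier:
        cost, node, path = heapq.heappop(frontier)
        if node == dst:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for nxt, weights in graph.get(node, {}).items():
            heapq.heappush(frontier, (cost + weights[metric], nxt, path + [nxt]))
    return None

print(best_route("Silver Lake", "Reno", metric=0))  # shortest: via the forest road
print(best_route("Silver Lake", "Reno", metric=1))  # fastest: via the highway
```

Same map, same destination; only the cost function differs. The device did exactly what it was configured to do, which is why the failure is perceptual rather than technological.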

As GPS units and satellite-navigation smartphone apps have flourished in recent years, there’s been a spate of similar cases in which travelers follow their devices blindly and wind up badly lost. The underlying mistake is not merely technological but perceptual: the failure to remain aware of one’s environment, what aviation psychologists call situational awareness, or SA. People have always had difficulty maintaining SA, psychologists say, but the proliferation of electronics, and our blind faith that these devices will keep us safe, has led to an epidemic of absentmindedness.

Avoid the trick: Full situational awareness requires incorporating outside information into a model of your environment and using that model to predict how the situation might change. If all you’re doing is following the lines of the GPS, and it turns out to be wrong, you’ll be completely clueless about what to do next.

In daily life, we rely on what Beth Blickensderfer, PhD, a professor of applied psychology at Embry-Riddle Aeronautical University, calls social SA to navigate our way through the human maze. It’s especially relevant when you’re traveling in another country. If you’re not paying attention, you might not realize that in some cultures it’s considered unacceptable for a man to speak to a woman, or to refuse to eat a delicacy, and you can wind up committing a serious faux pas that ruins the occasion.

MIND TRICK: “Once we form a theory, we tend to see everything through it.”

Bending the Map: Our minds are wired to find order in randomness. We look at clouds and see sheep. This can be useful for making decisions, since we’re helpless without a theory that makes sense of our quandary. But once we form a theory, we tend to see everything through it. A consequence is that when people get lost in the backcountry, they can convince themselves they know exactly where they are, a problem sometimes called bending the map.

A few years ago, three twentysomething skiers went out-of-bounds at the Jackson Hole Mountain Resort at Teton Village in Wyoming. Looking for fresh powder in Rock Springs Bowl, they took a wrong turn, headed north instead of south, and wound up at the bottom of Granite Canyon. If they’d been where they thought they were, the stream should have been flowing right to left, and heading left would have taken them back to the ski area. Instead, they found the stream flowing left to right. They knew they needed to go left to get home, but based on the topography of where they thought they were, they also had to go downhill. Eventually, they decided on a solution: In this particular case, the water had to be flowing uphill.

The group marched upstream, away from the ski area, and wound up spending the night in the snow without any survival gear. The next morning, they reconsidered their earlier logic and decided that, yes, the stream must indeed be flowing uphill. They bushwhacked another quarter mile in the wrong direction before a rescue helicopter found them and flew them to safety.

Such errors of overconfidence are due to a phenomenon psychologists call confirmation bias. “When trying to solve a problem, we get fixated on a specific option or hypothesis,” explains Jason Kring, president of the Society for Human Performance in Extreme Environments, “and ignore contradictory evidence and other information that could help us make a better decision.”

A vast collective error of confirmation bias unfolded in the past decade as investors, analysts, and financial advisers convinced themselves that legions of financial derivatives based on subprime mortgages were fundamentally sound. There was plenty of evidence to the contrary, but the money was so good that many found it easier to believe. They kept convincing themselves right up until the roof caved in.

Avoid the trick: To outsmart confirmation bias, make a habit of skepticism, including skepticism toward your own gut feelings and assumptions. If you’re part of a group that seems prone to agreement, play devil’s advocate to encourage others to share different points of view. “Don’t use your intuition to convince yourself that things are going right; use it to alert yourself to potential problems,” says Jeff Haack, a former search-and-rescue specialist for Emergency Management British Columbia. “Listen to those nagging doubts.”

MIND TRICK: “There’s a risk that in the heat of the moment, we’ll be tempted to overstep our safety parameters.”

Redlining: Mountain climbing at high altitudes is a race against time. Our endurance is severely limited in the face of extreme cold and limited oxygen, and windows of good weather can shut abruptly. Lingering too long is an invitation to disaster, so when mountaineers prepare to summit, they need to set a turnaround time and strictly abide by it.

The consequence of failing to heed this sacred rule was made gruesomely manifest on May 10, 1996. On that date, an unprecedented number of climbers were preparing to make the final stage of their ascent of Mount Everest, including some who had paid as much as $65,000 each. For expedition leader Rob Hall, getting his clients safely to the top and back meant a turnaround time of 2 p.m. But the turnaround time came and went. Finally, at 4 p.m., the last straggler arrived at the summit, and Hall headed down. But it was too late.

Already a deadly storm had begun, lashing the mountain with hurricane-force winds and whiteout snow. Stuck on Everest’s exposed face, eight climbers died, one by one. Hall was one of the last to succumb. Trapped a few hundred feet below the summit, paralyzed by the cold and a lack of oxygen, he radioed base camp and was patched through via satellite to his wife at home in New Zealand. “Sleep well, my sweetheart,” he told her. “Please don’t worry too much.” Today his body remains where he sat.

Hall fell victim to a simple but insidious cognitive error that I call redlining. Anytime we plan a mission that requires setting a safety parameter, there’s a risk that in the heat of the moment, we’ll be tempted to overstep it. Divers see an interesting wreck just beyond the limit of their dive tables. Pilots descend through clouds to their minimum safe altitude, fail to see the runway, and go just a little bit lower.

It’s easy to think, I’ll just go over the redline a little bit. What’s the big deal? The problem is that once we do, there are no more cues reminding us that we’re heading in the wrong direction. A little bit becomes a little bit more, and at some point, it becomes too much. Nothing’s calling you back to the safe side.

A similar phenomenon has been dubbed the what-the-hell effect. Dieters, for example, often control their impulses by setting strict limits on their eating, a nutritional redline. One day, they slip up, eat a sundae, and boom—they’re over the line. “Now they’re in no-man’s-land,” says Markman, “so they just blow the diet completely. They binge.”

Avoid the trick: As in mountain climbing, the best response to crossing a redline is to recognize what you’ve done, stop, and calmly steer yourself back to the safe side. When the stakes aren’t life-or-death, simply acknowledge that redlining is a real temptation, and check it as best you can.

Originally Published in Reader's Digest