"Fear is a very powerful weapon. Fear doesn't give you the freedom to decide... Don't act out of fear." —The Sea Inside (2004)
Decision-Making: Cognitive and Emotional
How do we make decisions? Many of us like to imagine that we’re rational creatures, deciding what to do by weighing the anticipated risks against the benefits of our actions. But 50 years of psychology and behavioral economics research have told a much different story.
Princeton University psychologist Daniel Kahneman won a Nobel Prize in 2002 for the work he did in the 1970s with colleague Amos Tversky on “prospect theory,” based on the finding that decision-making often occurs through cognitive shortcuts called “heuristics” that have the potential to cause errors in how we estimate risk.1 For example, our brains use an “availability heuristic” to make judgments based on information that’s easily recalled due to recent exposure or because it’s based on personal experience. We might therefore overestimate the risk of getting killed in a drunk-driving accident if a family member died that way.
As of 2020, it has been proposed that there are nearly 200 such heuristics, or “cognitive biases,” that have evolved to make decision-making more efficient but have also resulted in making us prone to errors in accurately judging the risks and benefits of our actions2,3 (for a summary, see the comprehensive “cognitive bias codex” developed by Buster Benson and John Manoogian and this shorter list of 50 common cognitive biases). Understanding how cognitive biases can lead to both good and bad decisions reveals a more complex, if sobering, view of human decision-making.
For all that cognitive biases have revealed about how we make decisions, a criticism has been that cognitive models are based too much “in the head” and not enough “in the heart,” neglecting emotions, which clearly also play an important role in guiding our actions. Obvious examples abound—for instance, while we might deliberate over the pros and cons of marrying someone, few of us make that decision without being deeply influenced by how we feel.
In his 2011 book, Thinking, Fast and Slow, Kahneman proposed two different modes of decisional thinking—an automatic, fast judgment based on instinct and emotion and a slower, more rational, and deliberative process—that optimally work together, but often come into conflict.4
Sometimes, when one mode of decision-making wins out over the other, the results can be problematic. For example, the impulsivity of quick, instinctive thinking can result in us “jumping the gun” and making bad decisions based on inaccurate prejudices. Thinking carefully and deliberately about the best course of action might be wise for some decisions, but not for those where swift action is needed, like jumping out of the way of a car.
How Does Fear Affect Assessments of Risk?
Over the past few decades, researchers have furthered our knowledge about how emotions act as their own heuristics, influencing how we estimate risk within other “dual-process” models of decision-making like Kahneman’s.5,6 When it comes to assessments of risk, there may be no more pertinent emotion than fear. In a 2001 paper, behavioral economist George Loewenstein and his colleagues wrote:
“Fear causes us to slam on the brakes instead of steering into the skid, immobilizes us when we have greatest need for strength, causes sexual dysfunction, insomnia, ulcers, and gives us dry mouth and jitters at the very moment when there is the greatest premium on clarity and eloquence.”5
In 2004, “decision researcher” Paul Slovic and his colleagues likewise wrote, “feelings of dread [are] the major determiner of public perception and acceptance of risk for a wide range of hazards,” and more specifically noted that negative emotions like fear of things like nuclear power tend to result in greater assessments of perceived risk and lower assessments of perceived benefits.7 More recent research has demonstrated that fear is also associated with greater pessimism and feelings of unpredictability about the future as well as lower feelings of self-control.8
That fear would make us more cognizant of risk should come as little surprise. And to be clear, these findings don’t mean that fear is always a pathological emotion that leads to overestimates of risk and overreaction. Fear is extraordinarily useful when it’s optimally matched to a given threat. If we’re being held at gunpoint during a robbery, or are surprised by a rattlesnake at our feet during a hike, our fear-based risk assessment, and even the paralysis it might bring on, may be completely appropriate and life-saving.
But fear can become problematic when it’s disproportionate to actual risk. This can often happen when actual risks are unknown. For example, fear of the uncertain, or the imagined, is a frequent saboteur of relationships in their early stages.
Fear can also lead to overestimates of risk for infrequent but catastrophic outcomes, such as plane crashes, mass shootings, or acts of terrorism.9,10 Harvard Law School professor Cass Sunstein has used the term “probability neglect” to describe how we tend to overestimate the chances of such worst-case scenarios, with evidence that we sometimes do so to the point of our protective responses causing more harm than good.10,11 For example, Sunstein has often argued that after 9/11, the fear-based implementation of burdensome airport screening measures may have resulted in people avoiding air travel and instead dying in car accidents as a result.
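Sunstein’s driving-versus-flying argument is, at bottom, a comparison of expected risks: the feared outcome (a plane crash) is far less probable per trip than the substitute risk (a fatal car accident), so shifting travel from air to road can raise total expected deaths. The sketch below illustrates that arithmetic. Every rate and trip count in it is a made-up placeholder for illustration, not actuarial data.

```python
# Illustrative sketch of "probability neglect": comparing expected fatalities
# when fear shifts travel from a rarer risk (flying) to a more common one
# (driving). All numbers are hypothetical placeholders, not real statistics.

def expected_deaths(trips: int, deaths_per_trip: float) -> float:
    """Expected fatalities = number of trips x per-trip fatality probability."""
    return trips * deaths_per_trip

# Hypothetical per-trip fatality probabilities (placeholders only)
P_FLIGHT = 1e-7  # assumed risk of dying on one commercial flight
P_DRIVE = 1e-5   # assumed risk of dying on one comparable road trip

# Suppose fear of flying shifts one million trips from air to road
shifted_trips = 1_000_000

air_deaths = expected_deaths(shifted_trips, P_FLIGHT)
road_deaths = expected_deaths(shifted_trips, P_DRIVE)

print(f"Expected deaths if flown:  {air_deaths:.1f}")
print(f"Expected deaths if driven: {road_deaths:.1f}")
```

Under these assumed rates, avoiding the feared but rarer risk produces roughly a hundred times more expected deaths—the structure of Sunstein’s claim, whatever the true numbers turn out to be.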
More recently, Sunstein made similar claims early on about COVID-19, but later backed off as the realities of the pandemic threat became more clear, highlighting the importance of adjusting our fear and our responses to it based on actual, or actuarial, risks.
How Does Fear Affect Perceptions of Interventions Designed to Reduce Risk?
Another key finding from research on fear and decision-making is that perceived risks are often inversely proportional to perceived benefits.6,7 This helps us understand why we tend to underestimate risk when we have positive feelings about a particular activity, such as having sex or sending and receiving text messages. Along with influence from optimism bias and another “positive illusion” of personal control—cognitive biases that tend to make us overestimate good outcomes and the degree to which we can avoid bad outcomes for ourselves—we’re more likely to discount risks when we envision a significant benefit or reward, leading to dangerous behaviors like having unprotected sex with strangers or texting while driving. This may be especially true for teenagers who are often more willing to engage in risk-taking, in spite of fear, due to lack of personal experience, the irresistible allure of novelty, and peer pressure.
While risk and benefits are often inversely correlated for a particular activity, fear is often associated not only with high and potentially overestimated perceptions of risk, but also overestimates of the benefits of protective measures. Again, Sunstein cites the illustrative example of the largely ineffectual and ceremonial Transportation Security Administration (TSA) measures that remain in place nearly two decades after 9/11. I've made a similar argument about how fears of mass shootings can lead to overestimated benefits of interventions on both sides of the polarized gun debate, whether bans or open carry.12 Research has also shown that patients often overestimate the effectiveness of tests and treatments for medical illnesses while underestimating their harms (this finding has recently been illustrated by unwarranted faith in interventions like hydroxychloroquine for the treatment of COVID-19).13,14
How Much Fear Is "Just Right"?
If fear is adaptive and self-protective on the one hand, but problematic when it’s disproportionate to actual risks on the other hand, is it possible to achieve the “Goldilocksian” ideal of avoiding too much and too little fear in favor of just the right amount? What, if anything, can we do to increase healthy fear and decrease irrational fear?
The answer is complicated by several issues. First, we often don’t know the actual risks, especially when faced with a threat from a new technology, pandemic, or terrorist group. Second, even when we do have actuarial statistics about known risks, attempts to educate ourselves about them and adjust our levels of fear accordingly are often thwarted by the fact that fear doesn’t just inflate our intuitive estimates of risk; it can lead to biased interpretations of presented statistics.11,15 Depending on cognitive biases and emotions like fear, we might also alternately pay too much attention, or not enough, to relevant risk factors that might alter our individual risk compared to statistical averages. So, knowing the numbers alone might not be enough to put them into proper perspective and react accordingly.
In addition, the amount of fear that each of us has is at least partially “dispositional”—that is, based on some combination of genetics and life experience such that some of us are inherently more anxious, fearful, and risk-averse while others are undaunted, risk-accepting, and even thrill-seeking as “adrenaline junkies.” Accordingly, some of us might benefit from being more fearful and some less. As for how much we can actually modify that through self-help or psychotherapy, as the well-known disclaimer goes, “your mileage may vary.”
If the major effect of fear on decision-making is to overestimate both risk and the effectiveness of interventions intended to keep us from danger, it could be argued that interventions that decrease fear but do little to reduce actual risk are justified. Sunstein has argued as much, suggesting that ineffective interventions that can reduce “baseless fear” may be a “social good” in their own right.10,11 That argument carries more weight if reducing fear can also lead to more rational and risk-accepting behavior if it’s warranted.
Whenever possible though, we should aspire to the Goldilocksian ideal of just the right amount of fear by educating ourselves about actual risks, putting them into proper perspective relative to other known threats, and collectively developing interventions that decrease both fear and actual danger.
But of course, that’s easier said than done, whether for an individual or for society. After all, even in the best telling of the well-known cautionary tale for children, Goldilocks’ quest for perfection ends with her escaping with her life but developing an intractable phobia of bears and the forest in the process.
1. Tversky A, Kahneman D. Judgment under uncertainty: heuristics and biases. Science 1974; 185:1124-1131.
2. Hilbert M. Toward a synthesis of cognitive biases: how noisy information processing can bias human decision making. Psychological Bulletin 2012; 138:211-237.
3. Haselton MG, Nettle D. The paranoid optimist: an integrative evolutionary model of cognitive biases. Personality and Social Psychology Review 2006; 10:47-66.
4. Kahneman D. Thinking, fast and slow. New York: Farrar, Straus and Giroux, 2011.
5. Loewenstein GF, Weber EU, Hsee CK, et al. Risk as feelings. Psychological Bulletin 2001; 127:267-286.
6. Slovic P, Finucane ML, Peters E, et al. The affect heuristic. European Journal of Operational Research 2007; 177:1333-1352.
7. Slovic P, Finucane ML, Peters E, et al. Risk as analysis and risk as feelings: some thoughts about affect, reason, risk, and rationality. Risk Analysis 2004; 24:311-322.
8. Lerner JS, Li Y, Valdesolo P, et al. Emotion and decision making. Annual Review of Psychology 2015; 66:799-823.
9. Chanel O, Chichilnisky G. The influence of fear in decisions: experimental evidence. Journal of Risk and Uncertainty 2009; 39:271-298.
10. Sunstein C. Terrorism and probability neglect. The Journal of Risk and Uncertainty 2003; 26:121-36.
11. Sunstein C, Zeckhauser R. Overreaction to fearsome risks. Environmental and Resource Economics 2011; 48:435-449.
12. Pierre JM. The psychology of guns: risk, fear, and motivated reasoning. Palgrave Communications 2019; 5:159.
13. Hoffmann TC, Del Mar C. Patients’ expectations of the benefits and harms of treatments, screening, and tests. JAMA Internal Medicine 2015; 175:274-286.
14. Hanoch Y, Rolison J, Freund AM. Reaping the benefits and avoiding the risks: unrealistic optimism in the health domain. Risk Analysis 2019; 39:792-804.
15. Slovic P, Monahan J, MacGregor DG. Violence risk assessment and risk communication: the effects of using actual cases, providing instruction, and employing probability versus frequency formats. Law and Human Behavior 2000; 24:271-296.