This essay holds up a mirror to one of the most influential movements in psychology — the Heuristics and Biases (HB) framework initiated by Danny Kahneman and Amos Tversky in 1969, almost half a century ago. 

The HB community can boast many important accomplishments, many discoveries, many counter-intuitive demonstrations. It has spawned the field of Behavioral Economics, with its record of nudging people to make wise choices.

I have no intention of denigrating the HB framework. I simply want to describe the tendencies and inclinations of the members of the HB community. That's how I am defining bias in this essay: not errors, but tendencies, as we might describe a bias for action or a bias for justice.

Any community of practice is bound together by attitudes, preferences, and reactions. Often these inclinations become so automatic that the members of the community don't even notice them. I think that these biases, these tendencies and reactions, are part of the makeup of the HB community no less than its formal axioms and body of research.

By holding up a mirror, this essay can help those outside the HB community coordinate with them. And it can help those inside the community understand themselves.

So I have assembled a list of eight biases found in the HB community.

One. Eliminate errors. Obviously, we all want to reduce if not eliminate errors; no argument there. However, this bias has a downside: reducing errors is necessary but not sufficient. We also need to make discoveries, and an over-emphasis on errors can reduce the chance to gain insights. That's why it's a balancing act, taking actions that cut down on mistakes without going so far that performance suffers. Because of their tendency to fixate on errors, HB researchers are troubled by one of the causes of errors: the use of heuristics.

Two. Discourage heuristics. The HB community has convincingly shown that we use heuristics. The researchers set up conditions in which the heuristics result in sub-optimal performance and demonstrate that people use the heuristics anyway. So heuristics aren't perfect; they're not algorithms, and they open the door to making errors. However, the HB community hasn't done cost/benefit analyses to gauge how much we gain and how much we lose by using heuristics. It hasn't examined the value of heuristics, even though without heuristics none of us would be able to function very well in complex settings. Even Tversky and Kahneman stated, "In general, these heuristics are quite useful, but sometimes they lead to severe and systematic errors." Yet HB researchers look only at the disadvantages of heuristics, and they conclude that because people rely on heuristics, they will necessarily exhibit cognitive weaknesses.

Three. Look for the flaws in cognitive performance. There is nothing wrong with this skeptical mindset, which has helped the HB community generate a long list of cognitive limitations. However, HB researchers are generally insensitive to the strengths of cognitive performance. Their glass is half-empty. Instead of appreciating our ability to make difficult decisions under time pressure and uncertainty, HB researchers catalog the ways that our intuitions, based on heuristics, can get us in trouble.

Four. Distrust intuitive judgments. When can you trust intuition? The HB community would say, "Never," and I agree that intuition isn't infallible. But in most settings it works well enough, and it is the best we have. Even if we shouldn't put total trust in intuitions, we should at least listen to them because they reflect our experience. Kahneman and Klein identified conditions under which we can develop skilled intuitions: a sufficiently stable environment and the chance to get timely and accurate feedback. Nevertheless, many if not most members of the HB community have a deep-seated antipathy toward intuitions, even the intuitions of experts.

Five. Distrust experts and expertise. Experts aren't infallible, and they sometimes fall into the traps that HB researchers set in their experiments. But in many domains experts do an impressive job; Ericsson et al. have documented the varieties of expertise that have been demonstrated. What concerns me is the pleasure that members of the HB community take in stories of expert comeuppance. The HB community happily cites work by Paul Meehl and others showing that expert judgments do worse than simple statistical formulas, omitting the fact that those formulas are built from the very factors the experts identified. The primary advantage of the statistical methods is to provide consistency in the use of these factors. How important is consistency?

Six.  Maximize consistency. The HB community holds this bias very strongly, and consistency is definitely a virtue. Noise — random variation — detracts from performance. But is consistency more important than accuracy? Further, some variation, some inconsistency, may help individuals and teams explore alternatives and become less rigid. Biological evolution depends on variation. Individual and team variation promotes adaptability.  

Seven. Rely on rational analysis. The HB community places great value on rational arguments, relying on principles such as deductive logic and on analytical methods such as Bayesian statistics. These methods are certainly powerful, but they don't apply to most of the judgments and decisions we face; they aren't suited to ambiguity and complexity. If principles such as deductive logic and Bayesian statistics were so essential, we would expect people who systematically violated them to pay the price. Yet the violators actually have greater success in life than those who align themselves with the principles of rational analysis.

Eight. Rely on procedures and checklists. The HB community also values procedural guides and checklists, in part because these tools impose consistency and can be empirically validated. There is no denying their utility. But as I pointed out in an earlier essay, some managers, influenced by the HB community, seek to substitute rational analysis, procedures, and checklists for expertise. These managers envision a workplace in which people no longer make decisions. However, very few complex tasks can be reduced to procedures or to steps in a checklist. The unrealistic dream of doing away with experts is insensitive to the tacit knowledge needed to get most jobs done.

Each of these eight biases is valid and has merit. But when pressed too far, each becomes counter-productive. I admire the HB community for taking bold positions, for making the most extreme statements it believes it can defend. I prefer that stance to one of avoiding dramatic assertions or claims that might be incorrect. I appreciate what the HB community has attempted, and what it has accomplished. The extreme positions the community has adopted, the biases it has embraced, can allow the rest of us to find a more reasonable and balanced position.
