This essay explores the way we do anticipatory thinking: imagining how unexpected events may affect our plans. Anticipatory thinking lets us recognize and prepare for difficult challenges. It’s different from making predictions because we don’t necessarily expect events to play out the way we imagine — complex situations are too hard to predict. Instead, we are getting ourselves ready, bracing ourselves, preparing ourselves. We particularly need anticipatory thinking to flag low-probability events that pose severe threats.

Anticipatory thinking (AT) is not trying to guess the future; it is trying to adapt to possible futures. It’s about readying ourselves. And it’s about guiding our attention — gambling with our attention based on what we are preparing ourselves to handle.

The topic of anticipatory thinking has been coming up for me a lot recently. At the end of January 2017 I put on a two-day workshop at the Laboratory for Analytic Sciences at North Carolina State University in Raleigh, NC.  They perform research for the intelligence community. They have gotten very interested in AT and how it works because AT is obviously essential for being a good intelligence analyst. During the workshop, we noted that AT was also essential for social workers engaged in Child Protective Services. When caseworkers do a site visit, they rely on AT to judge whether the home will be sufficiently safe for the children living there. Any cognitive process that is critical to both intelligence analysts and Child Protective Service caseworkers is probably worth thinking about.

In 2011 I published a chapter on AT with Dave Snowden and Chew Lock Pin, so when I put on the AT workshop in North Carolina I wanted to build on the 2011 material and press beyond it.

Three topics, in particular, seem important for making progress on the nature of AT: how AT works, the role of expertise in AT, and the dysfunctional tendencies that interfere with AT.

How does AT work? It must engage our ability to generate expectancies, and to draw on our mental models. For me, the causal factors we learn to appreciate are the core of our mental models, allowing us to perform the mental simulations that transform our understanding of what is happening right now into what may happen in the future.

AT is a sensemaking activity, and perhaps the Data/Frame model of sensemaking can inform our understanding of how AT works. In addition, my research on insight identified three pathways for gaining insights: a connection path (which aligns with the process of convergence in the 2011 paper), a contradiction path, and a correction path. These pathways may describe different means of achieving AT.

The role of expertise in AT. Expertise allows us to appreciate more causal factors and build more sophisticated mental models. It also helps us navigate the cognitive challenges to AT: managing ambiguity and uncertainty, making diagnoses, gathering information, determining whether to trust data, spotting leverage points, and appreciating the normal range of variation so that we can detect the anomalies that spark AT. 

In addition, we can imagine how AT could be overdone — people paralyzed and indecisive as they consider all kinds of consequences. Therefore, there must be a meta-skill governing when and how thoroughly to perform AT.

The dysfunctional tendencies that interfere with AT. The 2011 paper on AT by Klein, Snowden and Chew listed a few types of breakdowns at the individual level: fixation and weak mental models. I think it can be fruitful to expand our analysis of dysfunctional tendencies. What gets in our way when we need to do AT? Here are six dysfunctional tendencies we identified at the NCSU workshop.

a) We may stop monitoring a situation once we make a critical decision, failing to notice subsequent events that might alter the conditions. These subsequent events should initiate some degree of AT.

b) We may be overconfident in our abilities. In his book Fundamental Surprises, Zvi Lanir has argued that much of the Israeli Defense Forces’ failures at the beginning of the 1973 Yom Kippur War stemmed from an unrealistic confidence in their capability to withstand an Egyptian attack.

c) Fixation. We fail to revise our assumptions when the data do not fit. DeKeyser and Woods have examined fixation errors in dynamic situations. In my discussions with physicians, I have learned that a common reason for diagnostic failures in medicine is that the attending physician makes an initial determination based on the salient cues that are available, but then locks into this determination rather than reconsidering it as new data are received. The physician then fails to perform the necessary AT.

d) Knowledge Shields. Feltovich et al. (2001) have described the range of strategies we use to hold on to our initial beliefs by explaining away inconvenient counter-indicators.

e) Oversimplification. In separate work, Feltovich et al. have described the ways we oversimplify situations — what they call the reductive tendency. Of course, some simplification is necessary in order to manage complex situations. So the challenge is to simplify skillfully and in ways that do not excessively distort the dynamics of a situation.

f) Mindsets. Effective AT seems to depend on making some critical shifts in our mindset. People who fail to make these shifts should have trouble engaging in AT. Here are five mindset shifts we identified at the NCSU workshop:

From a procedural mindset to a problem-solving mindset. The dysfunction is to remain locked into a mindset that a job consists of following procedures, rather than being alert to anomalies.

From a reactive mindset to an anticipatory mindset. Novices, in particular, seem content to react to events rather than trying to get ahead of the curve, but even people with years of experience sometimes retain this passive, reactive mindset.

From a mindless stance to an active and curious stance. In my research on insight, I found that the people who gained insights were actively curious and speculating, whereas those who had the same information but missed the insight were passive and stopped speculating.

Need for Closure. A mindset that tries to tie everything down as quickly as possible is likely to be insensitive to unexpected events that should trigger AT and that open up new possibilities and vulnerabilities.

In addition, designers need to shift from a mindset of how devices work to a mindset of how devices might fail. It is too easy to fall into the rut of imagining how a device, or a plan, can succeed rather than engaging in AT to imagine the things that might go wrong. (The PreMortem method might help here.)

Notice that the dysfunctional tendencies and immature mindsets are all about failures to initiate AT. In contrast, people whose AT is too shallow probably suffer from weak mental models, which stems from a lack of expertise and is not a dysfunctional tendency.

One possible research approach is to do a contrast study, comparing people who successfully performed AT versus those who had the same opportunity but failed to engage in AT. I used this paradigm, success/failure contrast pairs, in my research on insight, and I believe the paradigm could be useful in studying the factors that contribute to success and failure in anticipatory thinking.


Feltovich, P., Coulson, R., & Spiro, R. (2001). Learners' (mis)understanding of important and difficult concepts: A challenge to smart machines in education. In K. Forbus & P. Feltovich (Eds.), Smart Machines in Education. Cambridge, MA: AAAI/MIT Press.
