The Mental Model Matrix
Important aspects of mental models often get ignored.
Posted January 27, 2021 | Reviewed by Gary Drevitch
The simple definition of a mental model is a description of how something works. We all have mental models for different types of systems, machines, organizations, and even protocols for social interactions. Our mental models provide us with a blueprint for how the device or the interaction produces its results. They let us describe a system’s form, explain how it functions, and predict its future states (Rouse & Morris, 1986).
But maybe our simple definition is too simple. Maybe it misses some of the most important features of our mental models. That’s the conclusion my colleagues Joseph Borders and Ron Besuijen and I reached after we conducted a field study sponsored by the Center for Operator Performance (Borders, Klein & Besuijen, 2019). This post is a continued collaboration between the three of us.
We observed and interviewed eight qualified panel operators in a petrochemical plant as they responded to upset scenarios on a high-fidelity training simulator of a distillation unit separating ethylene and ethane. Some of the operators had more than a decade of experience, but most had less than three years and one had only six months; they averaged 4.5 years on the panel. The scenarios were very demanding; no two operators approached them in the same way.
How the system works. As expected, we found that the operators relied on a set of beliefs about how the system worked. Sometimes these beliefs were limited in ways the operators didn’t appreciate, and sometimes they were flawed, but generally they were accurate and the operators usually were able to diagnose their own confusions.
How the system fails. We also found that operators understood ways that the system could fail — its limitations and its vulnerabilities to breakdown. These “negative” beliefs were a very important aspect of the operators’ mental models — providing them with ideas about what might be going wrong.
Being able to consider and anticipate system limitations and failures is obviously important for troubleshooting. It is also a very important aspect of system design: imagining how a system might fail rather than just considering how it is supposed to work. A premortem exercise might come in handy here. Too many designers fixate on delivering a system that meets the requirements and don’t stop to imagine where the system might break down, such as the conditions under which a commercial airliner might crash. Mumaw et al. (2000) found that workers monitoring a nuclear power plant couldn’t just rely on the schematics. They had to appraise the plant’s performance against a noisy background. They had to be alert to recent developments such as valves that were sticking or sensors that were acting up.
Workarounds. The operators had beliefs about how to do workarounds to overcome limitations and failures. These workarounds were important for recovering from upsets. Knowing how to perform workarounds is obviously important for adapting to unexpected situations. The more experience operators had, the more sophisticated were their ideas for keeping the system running.
Confusions. Finally, the concept of a mental model should include beliefs about the limitations of people, such as the users of a system — the ways they can become confused.
For example, someone might direct us to a location (e.g., Go two blocks, turn left, etc.), but a person with a stronger mental model of the route and of our navigation abilities might anticipate where we could get confused or make a mistake and annotate the directions accordingly (e.g., Go two blocks and turn left; it’s a narrow street and there's no street sign, so it might look like a driveway, but there’s a little antique store on the far corner). Here, the mental model is about our limitations and potential failures, not those of a system. It’s really impressive when people can anticipate the ways that others might get confused and make the appropriate adjustments.
The diagram illustrates this mental model matrix account:

[Figure: The Mental Model Matrix]
The initial concept of a mental model, a set of beliefs about how a system works, is not wrong. But it is incomplete. It misses the kinds of beliefs, gained through experience, that underpin an expert’s mental model.
References
Borders, J., Klein, G., & Besuijen, R. (2019). An operational account of mental models: A pilot study. International Conference on Naturalistic Decision Making, San Francisco, CA.
Mumaw, R.J., Roth, E.M., Vicente, K.J., & Burns, C.M. (2000). There is more to monitoring a nuclear power plant than meets the eye. Human Factors, 42(1), 36-55.
Rouse, W.B., & Morris, N.M. (1986). On looking into the black box: Prospects and limits in the search for mental models. Psychological Bulletin, 100(3), 349-363.