When Do We Change Our Minds? Think of a Jenga Tower
One metaphor that sheds light on when we change our minds.
Posted Oct 18, 2017
I recently read Philip Tetlock’s excellent book Superforecasting: The art and science of prediction.(1) It is about improving our ability to make forecasts, which includes updating our beliefs as we learn new information about how the future may unfold.
There is a section of the book that discusses belief updating, and Tetlock (along with his co-author, Dan Gardner) uses a helpful metaphor for when we will or will not change our minds when confronted with new information. They say:
“Commitment can come in many forms, but a useful way to think of it is to visualize the children’s game Jenga, which starts with building blocks stacked one on top of another to form a little tower. Players take turns removing building blocks until someone removes the block that topples the tower. Our beliefs about ourselves and the world are built on each other in a Jenga-like fashion” (p. 162).
Using this metaphor, they discuss how, when forecasting within your specialty, you may be more reluctant to discard certain beliefs if the domain is tightly connected to your identity and self-worth.
I like the metaphor of beliefs as a Jenga Tower because it’s easy to be pessimistic that we’ll never change our minds when confronted with information that conflicts with our beliefs. Political scientists have even found a strengthening of prior beliefs in the face of conflicting evidence, called the “backfire” effect (Nyhan & Reifler, 2010). For example, if we are strong supporters of gun control or another hot-button issue, and we receive information that doesn’t support our preexisting beliefs, we’ll argue against it, thereby strengthening our prior beliefs. This is certainly true in many circumstances, but occasionally we will update our views (even those close to the base of the Jenga tower) as “incongruency builds” over time (see Redlawsk, Civettini, & Emmerson, 2010). Yes, for many issues we won’t budge (at least in the short term), because doing so would bring the whole Jenga Tower crashing down, but for other beliefs near the top, we’ll slowly remove and update some of them in accordance with the evidence we are presented with.
So, the next time you find yourself in an intractable conflict, think about where you are in the Jenga Tower (and what this says about belief centrality). If you are near the base of someone’s self-worth and deeply held identity, you have little chance of dislodging a core belief (at least in the moment). Instead, move up the Jenga Tower toward less foundational assumptions. For example, we can discuss whether tax credits for low-emission vehicles are a worthy policy to combat pollution without having to debate the causes of climate change (see Lewandowsky, Ecker, Seifert, Schwarz, & Cook, 2012). This may be a more fruitful (and hopeful) way of conversing in domains that seem intractable, and eventually of changing our (and others’) minds.
(1) Don’t be put off by the heroically dramatic title. The book is highly nuanced and impressively grounded. I’d put it in the same category as Kahneman’s masterful Thinking, Fast and Slow.
Lewandowsky, S., Ecker, U. K. H., Seifert, C. M., Schwarz, N., & Cook, J. (2012). Misinformation and its correction: Continued influence and successful debiasing. Psychological Science in the Public Interest, 13(3), 106-131.
Nyhan, B., & Reifler, J. (2010). When corrections fail: The persistence of political misperceptions. Political Behavior, 32(2), 303-330.
Redlawsk, D. P., Civettini, A. J. W., & Emmerson, K. M. (2010). The affective tipping point: Do motivated reasoners ever “get it”? Political Psychology, 31(4), 563-593.
Tetlock, P. E. & Gardner, D. (2015). Superforecasting: The art and science of prediction. New York: Crown Publishers.