Reasoning is often thought of as the exact opposite of intuition. A typical example of intuition is the first impression we form when we meet someone new. It comes to mind quickly and spontaneously and, in many cases, we can't quite pinpoint why we think that this person is nice while that one is likely to be a jerk. By contrast, when people think of reasoning, they think of, say, solving math problems in the classroom: a slow, effortful, conscious process. People -- Westerners at least -- also think that reasoning is more effective than intuition; after all, why go through all that trouble to reason if the result is no better than our intuitions?
This commonsensical distinction between intuition and reasoning has been turned into elaborate psychological theories, which rebrand it as "analytic" vs. "heuristic," "associative" vs. "rule-based" or, more generally, "System 1" (the intuitions) vs. "System 2" (reasoning). Hiding behind this technical vocabulary is something not terribly different from the lay view of intuition and reasoning. Each is usually characterized by a list of traits: intuitions are supposed to be fast, effortless, and unconscious, to rely little on working memory, and to be prone to mistakes and biases; reasoning is supposed to be slow, effortful, and conscious, to rely crucially on working memory, and to be able to correct the mistakes and biases of intuitions.
Despite being widespread and indeed quite 'intuitive', I want to argue that this distinction, this opposition in fact, mostly stems from a 'sampling mistake'. While the characterization of intuitions is more or less spot on, that of reasoning relies on a highly artificial use of reasoning. Imagine that you had to characterize memory. You could think of the conscious, strenuous exercise of trying to remember a long string of random numbers. Or you could think of the automatic recollection of how to get to your house, or of what your name is. Most intuitions can be made conscious, effortful, and taxing on working memory: reading, if you try to decipher some very poor handwriting; visual search, if you're looking for a particular face in a large crowd; etc. Yet the basic, simple form of the intuition is what we should focus on: it is the mechanism that makes the more effortful version possible. To be fair to reasoning, we should likewise look at its simplest expression, the smallest step that still qualifies as reasoning.
Margo and Simon disagree about the movie they should see tonight. Simon says: "Last week you picked the movie, so this week it's my turn." Margo replies: "Fair enough, your turn to pick." This exchange is quite trivial, but it still requires reasoning. Simon has to be able to find a reason why he should be the one to decide which movie to see. Margo has to be able to evaluate this reason and decide that it's good enough for her to concede the point.
Looking at this minimal sample of reasoning, we realize that it is in fact very much like an intuition. It happens very quickly: neither Simon nor Margo needs to stop for a few minutes to ponder the strength of "Last week you picked the movie, so this week it's my turn." It doesn't take much effort or working memory to come up with such an argument, and even less to evaluate it. Importantly, people don't really know why this argument is persuasive. It relies on intuitions of fairness that we can't easily make explicit, and that psychologists are still trying to figure out. Even though the reason is consciously processed, the way it is processed stays under the hood.
Beyond the fact that it can be fast, effortless, and partly unconscious, reasoning shares another crucial trait with intuitions: its pattern of performance. Far from being foolproof, reasoning is subject to systematic biases, most importantly the confirmation bias, the subject of a later post. In fact, reasoning is so much like intuition that it is more accurate to say that reasoning is mostly intuitive. Or, rather, that reasoning relies on a set of intuitions: reasoning taps into intuitions about what counts as a good reason to accept a given conclusion. We have an intuition that if Margo picked the movie last week, Simon can use that as a reason to pick the movie this week. Obviously, we also have intuitions about what is not a good argument. The "your argument is invalid" meme, illustrated above, works because it is a "superstimulus," an extreme form of a normal stimulus: an absolutely ludicrous argument tapping into our intuitions for merely poor arguments. Here as well, recognizing that the argument is absurd is fast, effortless, and hard to explain by anything deeper than "it has nothing to do with anything."
The picture of reasoning that is most easily conjured, the strenuous solving of math problems, is misleading. When people reason on their own, reasoning can indeed be slow and effortful. But when they argue, finding and evaluating reasons comes very spontaneously, sometimes all too spontaneously: we can all think of cases in which we kept arguing long after we should have given up. This difference is no accident. If reasoning comes much more easily in the context of a discussion, it's simply because it is designed to work in such a context, and not when we engage in private ratiocination. But that will be the topic of (many) later posts.
The take on reasoning I've developed with Dan Sperber is laid out here:
Mercier, H., & Sperber, D. (2009). Intuitive and reflective inferences. In J. S. B. T. Evans & K. Frankish (Eds.), In Two Minds. New York: Oxford University Press.
Mercier, H., & Sperber, D. (2011). Why do humans reason? Arguments for an argumentative theory. Behavioral and Brain Sciences, 34(2), 57-74.
A paper on people's intuitions about when to use intuitions and reasoning:
Buchtel, E. E., & Norenzayan, A. (2008). Which should you use, intuition or logic? Cultural differences in injunctive norms about reasoning. Asian Journal of Social Psychology, 11(4), 264-273.
Here are some sources for dual process views of the mind:
Evans, J. S. B. T. (2007). Hypothetical Thinking: Dual Processes in Reasoning and Judgment. Hove: Psychology Press.
Kahneman, D. (2003). A perspective on judgment and choice: Mapping bounded rationality. American Psychologist, 58(9), 697-720.
Sloman, S. A. (1996). The empirical case for two systems of reasoning. Psychological Bulletin, 119(1), 3-22.
Stanovich, K. E. (2004). The Robot's Rebellion. Chicago: University of Chicago Press.
On intuitions of fairness, see Nicolas Baumard's work.
This post is part of a "reasoning" series.