Delusions are fixed, false personal beliefs that resist change even in the light of conflicting evidence. They are the extreme case of irrational beliefs: preoccupying, and a source of emotional distress.
A delusional belief matters deeply to the person who holds it. That is why the person is blind to counter-evidence: he or she does not want to change the belief. For example, when we are passionate about the superiority of our preferred political candidate, we tend to stick to that belief despite mounting counter-evidence and counter-arguments.
Delusions exist on a continuum with irrational beliefs (Bortolotti, 2010). Even otherwise rational people can believe bizarre things that are not true. To some degree, we are all sensitive to being watched, talked about, or deceived by someone. For example, about 10 to 15 percent of the general population regularly experiences paranoid thoughts involving suspicion and mistrust of others (Freeman & Freeman, 2008).
In order to explain any delusion, we need to answer two questions (McKay et al., 2007). The first is: What brought the delusional idea to mind in the first place? The second is: Why is the idea not rejected when so much evidence against its truth is available to the person?
The dual-process framework of decision-making can provide some insight into delusional belief (Kahneman, 2011). This framework posits two systems of thought. Most of our thinking is System 1 thought: effortless and intuitive, producing quick, automatic answers to decision-making dilemmas. In contrast, System 2 is slow, far more analytical, effortful, and conscious in its approach to the decision-making task.
Delusional reasoning can be described as an over-reliance on instinctive thinking (rapid and non-reflective) and an under-reliance on analytical thinking (deliberative and effortful). People with delusions are prone to making snap judgments and may form decisions quickly on the basis of little evidence. They jump to conclusions because they crave a decisive solution to the task. For example, a person may see two people whispering and jump to the conclusion that they are plotting against him or her.
In the intact mind, System 2 is responsible for belief formation and evaluation. Belief evaluation involves System 2 inhibiting reflexive reactions. Odd ideas occur to all of us, but we prevent them from becoming odd beliefs by using the deliberative mind (System 2). For example, one person may hear a crackling sound on the telephone and assume there is simply a bad connection, whereas another person may hear the same crackling sound and believe the phone has been bugged so that someone can eavesdrop on the conversation.
The tendency to fall back on System 1 thinking may stem from a depletion of cognitive resources brought on by distress (De Neys, 2006). When cognitive resources are depleted, people tend to act on System 1 impulses and lose the ability to be reflective. For example, a reliance on hasty judgment may be intensified by anxiety, which makes System 2 thought more difficult. We become more vulnerable to conspiracy theories when events feel complex or beyond our control: we see patterns and causal connections that are not there, and we quickly settle on a single interpretation (e.g., that big events like economic recessions and election outcomes are controlled by small groups of people) (Miller et al., 2016).
Cognitive behavioral therapy (CBT) can be effective in treating delusions by encouraging patients to evaluate their beliefs. The goal is to promote System 2 analytic reasoning so as to modify the conclusions derived from System 1 processes (Galbraith, 2015). The essence of this therapeutic technique is to ask people to evaluate their ideas and consider whether there may be another way of seeing the situation.
Bortolotti, L. (2010). Delusions and Other Irrational Beliefs. Oxford University Press.
De Neys, W. (2006). Dual processing in reasoning: Two systems but one reasoner. Psychological Science, 17(5), 428-433.
Freeman, D., & Freeman, J. (2008). Paranoia: The 21st Century Fear. Oxford University Press.
Galbraith, N. (Ed.). (2015). Aberrant Beliefs and Reasoning. Psychology Press.
Kahneman, D. (2011). Thinking, Fast and Slow. Farrar, Straus and Giroux.
McKay, R., Langdon, R., & Coltheart, M. (2007). Models of misbelief: Integrating motivational and deficit theories of delusions. Consciousness and Cognition, 16(4), 932-941.
Miller, J. M., Saunders, K. L., & Farhart, C. E. (2016). Conspiracy endorsement as motivated reasoning: The moderating roles of political knowledge and trust. American Journal of Political Science, 60(4), 824-844.