Why Do People Rationalise Poor Decisions?
There’s no "rational" in rationalisation.
Posted July 26, 2019 | Reviewed by Abigail Fagan
We’ve all rationalised bad decisions. Whether it was buying something you couldn’t afford or didn’t need, doing something you knew was wrong, or opting out of something you should have done, we always seem to be able to justify our decisions. But when we make these kinds of decisions, we deceive ourselves into believing that there’s a logic behind them. Sometimes, we’re oblivious to this deception and sometimes, deep down, we know what we’re doing.
There are two factors to consider when asking why we rationalise poor decisions: (1) we are all emotional beings who, more often than not, act on intuitive, gut-level decision-making; and (2) we don't like to be wrong.
But don't be fooled: this second factor is directly linked to the first. We don't like to be wrong, sure, but why? It's about protecting our ego, that is, the manner in which we perceive ourselves and how we feel about ourselves. With that in mind, a useful way of looking at rationalisation is as a defence mechanism used to facilitate what we want or how we want to feel, while at the same time preserving a positive self-perception in the face of a poor decision.
Likewise, there are two kinds of rationalising to consider: prospective and retrospective. Prospective rationalising refers to rationalising a decision before making it, whereas retrospective rationalising refers to rationalising a decision after the fact. The distinction is important for understanding the basis of our rationalisation, particularly if we know, deep down, what we're doing. If we're aware of our poor choice before acting on it, then we really have no excuse at all: the decision was made simply because we wanted to make it (i.e. it was emotion-based).
For example, "I know I can’t afford this luxury item, but I never buy myself anything nice and this is something I really want". On the other hand, retrospectively rationalising a decision might be about saving face (i.e. protecting our self-perception). For example, you know now that you have made a bad decision, because you’ve experienced the outcome(s); so, you explain, as reasonably as you can, why that decision was initially made (e.g. "Yeah it wound up being wrong, but had it not been for x, y or z, then the mistake would never have been made!").
Again, rationalising our decisions boils down to engaging emotion in the decision-making process. Deep down, we know this. But, at the same time, we don't want to admit it, because when it comes to decision-making, we want to be right and, likewise, to be perceived as right. Emotion-based decisions do not facilitate this the way logic does, because a logical decision is one based on evaluating inferential relationships among various objective justifications.
However, emotion can be dressed up. As discussed at length throughout this blog, many people (incorrectly) overvalue gut-based decision-making, yet there is often no justification for how a decision or conclusion reached in this manner was inferred. "It felt right" or "Something inside me told me this was the right way to go" are common, though completely illogical, justifications for such decisions. Simply put, this is emotion dressed up as something to value: gut-based decision-making. The latter sounds far more attractive, which is perhaps why so many seem to have embraced it. Why? Again, we are all emotional beings who, more often than not, act on our emotions.
What sustains belief in this perspective is that, in reality, our intuitive judgments are right much of the time (going with your gut does work), but not all of the time. If, on the other hand, you apply emotion-free thinking, you significantly increase the chances of making a correct decision. So, if we like to be right, why don't we use logic and critical thinking? There are a number of answers to that, including the potential for decision fatigue, the length of time it can take to draw a logical conclusion, and the possibility that the person isn't skilled enough in reasoning for the relevant dilemma. In the context of rationalisation, however, there is one reason in particular: the person doesn't really want to be right.
To understand why the person doesn't want to be right, we must first consider that, in addition to being dressed up as gut-level intuition, emotion can also be disguised as logic. We rationalise when we know we are wrong so that the poor decision seems to result not from poor thinking but from a forgiving line of reasoning disguised as logic. For example, "I bought this new tablet because my old one was going to die soon and this one was on sale." The reasoning is forgiving because, being generally kind to ourselves, we accidentally-on-purpose omit a caveat or two from what could otherwise have been good reasoning (e.g. "Though it was on sale, there were still cheaper, equally suitable options" or "Do I really need a tablet when I have a phone and a laptop?"), caveats which, we know deep down, might debunk the logic behind our actions.
Notably, recognising confirmation bias as an underlying mechanism of rationalisation should reinforce for us how much emotion influences the decision-making process: once we have decided what we want, we seek out reasons that support it. Perhaps, if we all acknowledged what we want or feel when making a decision, addressing that emotion honestly could facilitate better reasoning. But then the question regarding rationalisation becomes: who are we lying to, others or ourselves?
In conclusion, to make better decisions we need to rely less on emotion and the fear of being wrong, and more on logic and critical thinking. Sure, gut-level intuitive judgment is an attractive concept: it's easy and it often works. But if the decision is important, why take the chance? If you genuinely care about the outcomes of your decisions, then engage logic and think critically about all the relevant factors.