Cognitive Bias Is the Loose Screw in Critical Thinking
Recognizing your biases enhances understanding and communication.
Posted May 17, 2021 | Reviewed by Jessica Schrader
- People cannot think critically unless they are aware of their cognitive biases, which can alter their perception of reality.
- Cognitive biases are mental shortcuts people take in order to process the mass of information they receive daily.
- Cognitive biases include confirmation bias, anchoring bias, bandwagon effect, and negativity bias.
When I was a kid, I was enamored of cigarette-smoking movie stars. When I was a teenager, some of my friends began to smoke; I wanted to smoke too, but my parents forbade it. I was also intimidated by the ubiquitous anti-smoking commercials I saw on television warning me that smoking causes cancer. As much as I wanted to smoke, I was afraid of it.
When I started college as a pre-med major, I also started working in a hospital emergency room. I was shocked to see that more than 90% of the nurses working there were smokers, but that was not quite enough to convince me that smoking was OK. It was the doctors: 11 of the 12 emergency room physicians I worked with were smokers. That was all the convincing I needed. If actual medical doctors thought smoking was safe, then so did I. I started smoking without concern because I had fallen prey to an authority bias, which is a type of cognitive bias. Fortunately for my health, I wised up and quit smoking 10 years later.
It's Likely You're Unaware of These Habits
Have you ever thought someone was intelligent simply because they were attractive? Have you ever dismissed a news story because it ran in a media source you didn’t like? Have you ever thought or said, “I knew that was going to happen!” in reference to a team winning, a stock going up in value, or some other unpredictable event occurring? If you replied “yes” to any of these, then you may be guilty of relying on a cognitive bias.
In my last post, I wrote about the importance of critical thinking, and how in today’s information age, no one has an excuse for living in ignorance. Since then, I recalled a huge impediment to critical thinking: cognitive bias. We all lean on these mental crutches, even though we don’t do it intentionally.
What Are Cognitive Biases?
The Cambridge English Dictionary defines cognitive bias as the way a particular person understands events, facts, and other people, which is based on their own particular set of beliefs and experiences and may not be reasonable or accurate.
PhilosophyTerms.com calls it a bad mental habit that gets in the way of logical thinking.
PositivePsychology.com describes it this way: “We are often presented with situations in life when we need to make a decision with imperfect information, and we unknowingly rely on prejudices or biases.”
And, according to Alleydog.com, a cognitive bias is an involuntary pattern of thinking that produces distorted perceptions of people, surroundings, and situations around us.
In brief, a cognitive bias is a shortcut to thinking. And, it’s completely understandable; the onslaught of information that we are exposed to every day necessitates some kind of time-saving method. It is simply impossible to process everything, so we make quick decisions. Most people don’t have the time to thoroughly think through everything they are told. Nevertheless, as understandable as depending on biases may be, it is still a severe deterrent to critical thinking.
Here's What to Watch Out For
Wikipedia lists 197 different cognitive biases. I am going to share a few of the more common ones with you so that, in the future, you will recognize the ones you may be using.
Confirmation bias is when you prefer media and information sources that align with your current beliefs. People do this because it helps maintain their confidence and self-esteem when the information they receive supports what they already know. Exposing oneself to opposing views and opinions can cause cognitive dissonance and mental stress. On the other hand, exposing yourself to new information and different viewpoints helps open up new neural pathways in your brain, which will enable you to think more creatively (see my post: Surprise: Creativity Is a Skill, Not a Gift!).
Anchoring bias occurs when you become committed or attached to the first thing you learn about a particular subject. A first impression of something or someone is a good example (see my post: Sometimes You Have to Rip the Cover Off). Similar to anchoring is the halo effect, which is when you assume that a person’s positive or negative traits in one area will be the same in some other aspect of their personality. For example, you might think that an attractive person will also be intelligent without seeing any proof to support it.
Hindsight bias is the inclination to see some events as more predictable than they are; also known as the “I knew it all along” reaction. Examples of this bias would be believing that you knew who was going to win an election, a football or baseball game, or even a coin toss after it occurred.
Misinformation effect is when your memories of an event become affected or influenced by information you received after the event occurred. Researchers have shown that memory is unreliable because it is vulnerable to revision when you receive new information.
Actor-observer bias is when you attribute your actions to external influences and other people's actions to internal ones. You might think you missed a business opportunity because your car broke down, but your colleague failed to get a promotion because of incompetence.
False consensus effect is when you assume more people agree with your opinions and share your values than actually do. This happens because you tend to spend most of your time with others, such as family and friends, who actually do share beliefs similar to yours.
Availability bias occurs when you believe the information you possess is more important than it actually is. This happens when you watch or listen to media news sources that tend to run dramatic stories without sharing any balancing statistics on how rare such events may be. For example, if you see several stories on fiery plane crashes, you might start to fear flying because you assume they occur with greater frequency than they actually do.
Bandwagon effect, also known as herd mentality or groupthink, is the propensity to accept beliefs or values because many other people hold them. This is a conformity bias that occurs because most people desire acceptance, connection, and belonging with others, and fear rejection if they hold opposing beliefs. Most people will not think through an opinion and will assume it is correct because so many others agree with it.
Authority bias is when you accept the opinion of an authority figure because you believe they know more than you. You might assume that they have already thought through an issue and reached the right conclusion. And, because they are an authority in their field, you grant their viewpoint more credibility than you would anyone else's. This is especially true in medicine, where experts are frequently seen as infallible. An example would be an advertiser showing a doctor, wearing a lab coat, touting their product.
Negativity bias is when you pay more attention to bad news than good. This is a natural bias that dates back to humanity’s prehistoric days, when noticing threats, risks, and other lethal dangers could save your life. In today’s civilized world, this bias is not as necessary (see my post: Fear: Lifesaver or Manipulator).
Illusion of control is the belief that you have more control over a situation than you actually do. An example of this is when a gambler believes he or she can influence a game of chance.
Understand More and Communicate Better
Learning these biases, and staying alert for them whenever you decide whether to accept a belief or opinion, will help you become more effective at critical thinking.