
You're Really Not That Smart: The Dunning-Kruger Effect

How do you get through to someone who thinks they know it all?

The Dunning-Kruger Effect has weighty implications for how we live. (Photo: Adi Goldstein on Unsplash)

When I posted a podcast episode on imposter syndrome on social media, a follower raised a good question: What's the name for the opposite of imposter syndrome, when someone thinks they know more than they actually do? It turns out there is a name for it: the Dunning-Kruger Effect.

The Dunning-Kruger Effect is a cognitive bias, named for the two researchers who first described it (Kruger & Dunning, 1999), in which people overestimate their competence or cognitive abilities and fail to notice that they are doing so. The issue isn't just that people make poor choices based on erroneous conclusions; it's that they are unable to recognize those errors.

Components of the Dunning-Kruger Effect

To understand the Dunning-Kruger Effect, let's break it down into parts.

Lack of Knowledge

In the Dunning-Kruger Effect, the less someone knows about a topic, the more likely they are to hold strong opinions about it. This means that if you are arguing a point with someone whose beliefs are not rooted in facts, that person will stick to their opinion even when presented with evidence to the contrary. That person will also disregard expert opinions, perhaps telling you that the expert opinion is "fake" or that the expert was "paid off," or simply talking over you so your facts can't be heard. Since you can't prove that something didn't happen, you walk away frustrated, while the other person becomes even more entrenched in his or her beliefs.

Misinformation Endorsement

The Dunning-Kruger Effect also means that people will endorse erroneous information if it fits their existing opinion. Misinformation endorsement means that someone doesn't do the work of checking whether sources are legitimate. It also means that no independent research is done, so beliefs are never challenged by other information.

Reinforcement By Social Groups

If you are surrounded by others who hold the same views as you, you are more likely to stick to those views. A community's opinions can become social norms. At the root of much human behavior is a desire to belong, and going against the predominant view of one's group risks ostracism.

They Think They Know More Than Experts

In a study by Motta, Callaghan, and Sylvester (2018), one-third of subjects thought they knew as much as or more than doctors and scientists about the causes of autism. This rate was highest in people who had low levels of knowledge about autism and high levels of misinformation endorsement. The researchers also found that low knowledge and high misinformation endorsement were correlated with opposition to mandatory vaccine policies and with greater support for non-experts over experts on matters of policy.

Caveats of the Dunning-Kruger Effect

Arguing with people experiencing the Dunning-Kruger Effect is rarely effective. And you are not immune to it yourself, even if you think you're well-read and unbiased.

Challenging Them Just Reinforces Their Beliefs

When you provide proof that a belief is incorrect, a person experiencing the Dunning-Kruger Effect tends to become more entrenched in that belief, and even more so if your counterargument is emotionally charged (Nyhan, Reifler, Richey, & Freed, 2014).

Anyone Is Susceptible

Think you're immune to the Dunning-Kruger Effect? Nope. Being prone to it has little to do with intelligence or morals; anyone is susceptible to this cognitive bias. And working in a given profession does not mean you follow the counsel of fellow professionals: physicians were only slightly more likely than non-physicians to follow a medication regimen (Frakes, Gruber, & Jena, 2019).

How to Conquer It

Information from professionals doesn't seem to make a dent in entrenched beliefs. Neither does presenting empirical evidence in a rational way. And presenting evidence with an air of disbelief that the other person could possibly reject the facts definitely doesn't help.

So what does work when someone is unable to admit that they lack information or are incorrect? Find out why the person believes what they believe. Meet them where they are. It's not so much about the data as about how it is presented.

A great example of this can be found in Buster Benson's book Why Are We Yelling?: The Art of Productive Disagreement. Benson gathered a group of people who held a range of opinions on gun control. By the end of the evening, everyone had expressed their views, and they all left on friendly terms. How did he do it?

First, he provided food; never underestimate the power of a shared meal to bring a group together. He then asked each person to share their personal history with guns and how it had shaped their current opinion. Asking people to share their personal experience is a powerful springboard to meaningful discussion. Again, meet people where they are.

He then asked everyone, "How will we know we will have unquestionably fixed the problem of guns?" Through the phrasing of this question, Benson sent a message that he was not making value judgments—and he also helped the group work towards a common goal.

There are some people whose opinions are so firmly entrenched that no matter what data you present, they will not change their minds. But most people are willing to have a discussion about the issues; it is all in how those issues are presented. A common theme of respect prevails in productive discussions. And if you don't change someone's mind, it's time to move on.

Copyright 2019 Sarkis Media

For more, please check out my website.

References

Benson, B. (2019). Why are we yelling? The art of productive disagreement. New York: 750 Words LLC.

Frakes, M. D., Gruber, J., & Jena, A. (2019). Is great information good enough? Evidence from physicians as patients (No. w26038). National Bureau of Economic Research.

Kruger, J., & Dunning, D. (1999). Unskilled and unaware of it: How difficulties in recognizing one's own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77(6), 1121.

Motta, M., Callaghan, T., & Sylvester, S. (2018). Knowing less but presuming more: Dunning-Kruger effects and the endorsement of anti-vaccine policy attitudes. Social Science & Medicine, 211, 274-281.

Nyhan, B., Reifler, J., Richey, S., & Freed, G. L. (2014). Effective messages in vaccine promotion: A randomized trial. Pediatrics, 133(4), e835-e842.
