Knowing Just Enough to Be Dangerous
How overconfidence subverts rational thinking
Posted Oct 07, 2017
By Joel Lehman
“The fool doth think he is wise, but the wise man knows himself to be a fool.” —Shakespeare
If you've ever watched the early-round auditions on the talent show American Idol, you've probably seen a man confidently take the stage, a spark in his eyes. He thinks he's destined for greatness; this is his moment.
The music starts, and, quaking with anticipation, he smiles, full of composure, and boldly unleashes — a cracked and cringe-worthy voice! Clearly, he couldn't carry a tune if his life depended on it. The judges prepare their cutthroat feedback, and you wonder: How could he have been so wrong about his ability?
Stop wondering: You might be setting yourself up to be in his place. This overconfident lack of self-awareness isn't unique to reality-show auditions. In general, our reasoning tends to be motivated, so that the world appears well-aligned with our pre-existing story about how it should work.
Overconfidence and Dunning-Kruger
Do you think you're a good driver — at least better than average? A full ninety-three percent of Americans say they're better-than-average drivers. In another study, college students who performed the worst on tests of logic, humor, or grammar still believed they were better than most. Most of us are the heroes of our own story, even when the reality is more ordinary.
These studies shed light on the Dunning-Kruger effect, which is one of many predictable flaws in how we think. The main idea is that when we know just a little about something, we often struggle to evaluate just how little we know. Maybe you know a friend who read one article on a topic and now believes he’s an expert. That’s the Dunning-Kruger effect in action.
A similar thinking flaw is the overconfidence effect, in which we take our feeling of certainty as proof that we actually have the right answer. There's an important difference between how well we actually understand something and how certain about it we feel. Maybe you've noticed this in your friends, when you've had to fact-check a disagreement over who sings the tune currently playing over the loudspeakers. Both sides are 100 percent sure of their conflicting answers, but one friend is blindsided by what you find googling on your cell phone.
Science shows that this isn't an isolated case. For example, one study found that when people were completely confident they knew the right spelling of a difficult word, they were actually wrong 20 percent of the time. Think about what that means, if it's true in general — you may be overconfident about many "sure things."
Okay, so maybe it's not completely surprising that we're often overconfident, but does it matter? It's sometimes amusing when we're blindly optimistic about our own charisma and musical ability. But it's tragic when the same mistaken confidence can hurt the world, like when it distorts beliefs about political policy.
A Case Study: Minimum Wage
Take, for example, the idea of raising the minimum wage. One side argues it would raise the standard of living for hard-working citizens (everyone should get a living wage!), while the other argues it would actually make the poorest worse off (a higher minimum wage will destroy jobs!). A policy change like this can have huge effects, bringing serious consequences to the lives of real people, for good or for ill.
Generally, liberals support big boosts to the minimum wage, while conservatives tend to oppose such a move. And every year, liberal and conservative politicians rehash similar battles over taxes and government spending. These economic issues likely have better and worse answers, although in the public debate it seems we never get any closer to finding out which is which. Why should that be? And can the Dunning-Kruger effect help us understand it?
Isn't it strange that Democrats and Republicans have near-opposite opinions about the economy, and yet neither group seemingly has any more economic expertise than the other? Perhaps most of the political soundbites we hear about raising (or lowering) the minimum wage, or taxes, or stimulus plans are just oversimplified noise. So often we depend upon our favorite politicians to teach us about economics. Yet they are clearly one-sidedly invested in their party's answer, and rarely have any strong qualifications. Why not take lessons from the community of experts who study economics instead?
The base platforms of the Democratic and Republican parties seem to pre-commit them to particular economic policies, no matter what evidence from economics actually suggests. Is either a Democrat running to lower taxes on the wealthy, or a Republican running to raise them, likely to be nominated or elected? It's interesting, then, that individual Democratic and Republican voters each find the theory of their preferred politician and political party so plausible, even though those theories are so different.
One explanation is that it's the Dunning-Kruger effect biting us. Each of us rarely recognizes how little we actually know. We latch onto simple stories that make sense on their surface but don't respect the complexity of the real world. We tend to parrot the "folk economics" (simple, common-sense theories) of our political bandwagon. There's some truth to the idea that raising the minimum wage could ease the lives of hard-working people struggling to raise a family. And there's also truth to the idea that a large minimum wage increase could cause struggles for small businesses. But the problem with folk economics is that both of these simple stories sound reasonable, and yet they contradict each other.
There are many different simple stories that could be right, but the real world works in only one particular way, and likely has complicated wrinkles. To uncover truer stories, we need to be rooted in evidence. But most of us don't look to the evidence. Instead, we're satisfied with whatever folk economic theory appeals to how we think the world should work. The poet Alexander Pope anticipated the Dunning-Kruger effect in 1709 when he wrote, "A little learning is a dangerous thing." We feel certain we understand minimum wage policy and its effects, even though the economy is hugely complicated and there is so much we personally don't know.
As a result, we make poor use of the knowledge and evidence accumulated by the hard work of economists — some of whom have studied questions like the minimum wage for their entire careers. The Dunning-Kruger trap leads our country to debate the minimum wage, again and again, through sound bites and folk arguments largely divorced from reality. Most of us simply don't understand economics well enough to weigh in with any real authority, and yet the overconfidence effect still gives us the certain feeling that our political team obviously has the right answer.
If we stop and think, we can see through the illusion. If we notice ourselves unwavering in our opinion on the minimum wage or another controversial policy, we can summon the courage to question our true knowledge. This requires some self-awareness, but it's something you can develop through practice, if you really care about truth. We can learn to notice overconfidence and become curious enough to search for possible weaknesses in our cozy beliefs. We can explore first-hand whether there's any difference between what feels right and what the evidence truly suggests.
One rule of thumb is to go on high alert when we're dealing with a complex issue and notice a pleasant feeling of certainty. That pleasant, confident feeling might only signal that we're deceiving ourselves with the folk theory of our political tribe. So, the next time you listen to your favorite show with a political bent, whether it's The Daily Show or Tucker Carlson, pay attention to how good it feels when the host sticks it to the other side. Ask yourself whether it's possible that you're getting a partisan sketch of reality, one designed not to cut toward truth, but to reinforce the cozy things your tribe already believes.
The inconvenient truth is that reality is often much more complicated than we'd like it to be, and it requires real knowledge and expertise to understand correctly. So, if we actually want to put into place political policies that do good instead of those that just feel good, then we need to wrestle with that complexity. We also need to wrestle with the unpleasant truth that most political discussions rest on folksy, Dunning-Krugerish assumptions about the world, ones that miss the heart of the matter.
To conclude, a sad facet of human nature is that we're often too sunny in gauging our own competence. Having just a little bit of knowledge, like a simple folk theory, can easily plunge us into a misleading feeling of deep understanding. What's worse, this kind of overconfidence plays a big role in corrupting rational discussion of big-impact political policies. But each of us can strive to bring more self-awareness to situations ripe for this kind of delusion — whenever there's a huge, angry divide over a complicated issue, for example.
Of course, it's not fun to face the music: We don't know as much as we think we do. But really, there's no shame in it. Our world is astoundingly complicated. It's only ego that makes us expect to understand it in its entirety, anyway. In the end, we're all better off the more honest we can be with ourselves, so we can home in on the actual truth together. The beauty of truth in politics is that it leads us to policies with a good chance of working, which means a better world for all of us.
Here are some final questions to think about:
- What divisive political issues might I be overconfident about?
- What did it feel like the last time I noticed myself being overconfident?
- How can I better respond when I feel unbudging certainty?
P.S. Care about the truth in politics? Take the Pro-Truth Pledge to get politicians and other public figures to stop lying!