Why Do People Believe Things That Aren’t True?
Most of us harbor false beliefs, even though we don’t know it. Do you?
Posted May 12, 2017
As Trump blasts past his first 100 days in office, he continues to change the face of American politics. In addition to many executive orders and political controversies, his administration has brought to prominence disconcerting new concepts like “fake news” and “alternative facts.” The media and the president have criticized one another for propagating falsehoods.
The fact-checking organization Politifact, for instance, rates only 16 percent of the President’s statements as true or mostly true. For his part, Trump has accused outlets like The New York Times, CNN, and others of spreading similar falsehoods, including what the White House recently called a “false narrative” regarding Russian interference in the 2016 election.
All of this should give us pause, particularly considering that many of the most important issues of our day hinge on people having an accurate understanding of the facts. Is voter fraud widespread or not? Is crime on the rise or lower than ever? Are immigrants a drain on the economy or a net plus? Do refugees from certain countries really pose a greater risk to national security than those from other parts of the world? Plus, it can be really annoying when people believe things that seem obviously false, particularly when we’re confident we’ve got all the facts straight.
According to research, however, whether we realize it or not, most of us harbor at least some false beliefs.
Moving away from the political arena for a moment, consider whether the following statements are true or false:
- We only use 10 percent of our brains.
- We lose most of our body heat through our heads.
- If you swallow chewing gum, it will stay in your system for seven years.
- Cracking your knuckles will give you arthritis.
If you answered “true” to any of these, you’re guilty of believing falsehoods. Don’t feel too bad, however. According to the British Medical Journal, even doctors endorse many of these so-called "facts," and they show up frequently in both the popular press and medical publications. Of course, it never hurt anyone to believe that we only use 10 percent of our brain capacity. When it comes to the hot-button political issues of the day, however, falsehoods can be harmful. Ultimately, our beliefs influence the way we vote, whom we elect, and what policies are enacted.
Why do people so easily believe false things?
There are probably as many answers to this question as there are people who have ever believed falsehoods. Nonetheless, psychologists have shown that a relatively small set of cognitive biases or mental shortcuts can explain a lot about how false notions take root. One of the most agreed-upon ideas in the field of psychology is that people routinely use mental shortcuts to understand what happens around them. All kinds of things occur in the world around us, and we don't always have the time or energy to sit down and carefully examine all of them. So, we tend to use quick and largely unconscious rules of thumb to determine what we should believe—and these shortcuts sometimes steer us in the wrong direction. Here are some of the culprits:
The Availability Heuristic
Which job is more dangerous—working as a police officer or a fisherman? If you guessed police officer, you’re wrong. According to figures from the U.S. Bureau of Labor Statistics, fishing workers are 10 times more likely than police to be killed on the job. This doesn't make police work any less important, of course, though it does mean that many of us have underestimated how dangerous other jobs are in comparison. Most of us believe that police officers are more likely to die at work because of the availability heuristic, a mental shortcut that can lead us to overestimate the frequency of an event when that event is more “available” or vivid in our memory. When a police officer is killed in the line of duty, it’s rightly widely reported in the news and sticks with us in memory, so we tend to believe it must be more common than deaths in other professions.
The availability heuristic is also the reason why doctors sometimes believe that diseases are more widespread than they really are—their jobs naturally fill their memories with vivid examples. In fact, when any of us read or watch a news story about an instance of terrorism, voter fraud, or other crime, we’re likely to overestimate how common such events are. Unless we’re careful, the vivid nature of the news story in our memory can unconsciously bias our estimate of how often these events actually happen.
Emotional Reasoning
Whether we like it or not, all of us can be powerfully swayed by emotions. We'd like to think that our beliefs are driven by logic and reason, particularly when it comes to politics. Unfortunately, this relationship is often reversed. Sometimes we end up using our reasoning ability to justify or defend a conclusion that we’ve already drawn based on our emotions. This phenomenon, called emotional reasoning, can lead us astray without our ever knowing.
Psychiatrist Aaron T. Beck first noticed this in depressed patients. He observed that many patients drew obviously untrue conclusions about themselves based on how they felt, rather than the actual facts. "If I feel depressed,” one of his patients might say, "then there must be something objectively wrong with my job, my marriage, my children, or other parts of my life."
But feelings are just feelings, even when they're powerful, and they can sometimes lie to us. Even in those of us who aren’t depressed, this tendency can affect our beliefs about virtually any emotionally charged topic, whether we’re talking about sexuality, religion, money, crime, or war. When we feel scared, angry, anxious, or even just uneasy about a topic, we can easily jump to the conclusion that the topic is somehow objectively bad or dangerous. Next time a topic makes you feel uncomfortable, that’s probably a reason to keep an open mind, not to draw a conclusion.
The Confirmation Bias
Once we have a belief, we tend to cling to it, even when it’s untrue. The confirmation bias is the tendency to seek out information that supports what we already believe. We do this in two important ways. First, we tend to surround ourselves with messages that confirm our pre-existing opinions. This is why, in the U.S., conservatives tend to get their news from sources like Fox, whereas liberals tune into MSNBC.
Second, we tend to ignore or discount messages that disprove our beliefs. If we’re sure that climate change is a hoax and someone shows us a research study disputing this belief, we might dismiss the study’s findings by saying that the researcher is obviously biased or corrupt. This protects us from having to change our beliefs. When our ideas are true, this probably isn’t such a bad thing. Unfortunately, it can also keep us firmly believing things that are false.
While it’s clear that some people lie out of expedience or spite, most of us value the truth. We genuinely desire to accurately understand the facts and help others to do the same. As flawed human beings, however, none of us is a perfect barometer of the truth. Despite our best intentions, it’s easy to unconsciously buy into beliefs that feel right, even though they’re not. But it’s precisely when we’re sure that we’ve cornered the truth that we should take a step back, breathe deeply, and open our minds as far as we can. If we were all able to take this basic truth about human nature to heart, perhaps we could come together more effectively during times of political strife.