


Why People Ignore Facts

When it comes to reasoning, identity trumps truth.

Source: chayka1270/Pixabay

Our ability to reason did not develop simply to help us find the truth. Instead, reasoning evolved to fulfill fundamentally social functions, like cooperating in large groups and communicating with others.

This is one of the arguments advanced in “The Enigma of Reason,” a book by cognitive scientists Hugo Mercier and Dan Sperber. According to their theory, reason’s primary strengths are justifying beliefs we already hold and making arguments to convince others. While this kind of reasoning helps us cooperate in a social environment, it does not make us particularly good at truth-seeking. It also makes us fall prey to a number of cognitive biases, like confirmation bias—the tendency to search for information that confirms what we already believe.

Their ideas also help explain why politics seems to make us so bad at reasoning. If most of reasoning is for social cohesion instead of truth-seeking, then belonging to a particular political party should distort our reasoning and make us pretty bad at finding the truth.

A number of studies document the many ways in which our political party distorts our reasoning. One study found that people who had strong math skills were only good at solving a math problem if the solution to the problem conformed to their political beliefs. Liberals were only good at solving a math problem, for instance, if the answer to that problem showed that gun control reduced crime. Conservatives were only good at solving this problem if the solution showed that gun control increased crime. Another study found that the higher an individual’s IQ, the better they are at coming up with reasons to support a position—but only a position that they agree with.

Belonging to a particular political party can also shape our perception. In one study, participants were asked to watch a video of protestors. Half of the participants were told the people in the video were protesting the military’s “Don’t Ask, Don’t Tell” policy. The other half were told that the people were protesting an abortion clinic. Liberals rated the protestors as more violent and disruptive if they were told they were watching abortion clinic protestors, and the opposite was true for conservatives—even though everyone was watching the same video.

Why does political identity shape our thinking and perception so dramatically? NYU psychology professor Jay Van Bavel explains the results of studies like these with his “identity-based” model of political belief: Oftentimes, the actual consequences of particular party positions matter less to our daily lives than the social consequences of believing in these party positions. Our desire to hold identity-consistent beliefs often far outweighs our goal of holding accurate beliefs. This may be because being part of a political party or social group fulfills fundamental needs, like the need for belonging, which supersede our need to search for the truth.

A desire for identity consistency may help explain why we can be so uncomfortable engaging with opinions that challenge our beliefs. One recent study found that we are even willing to give up the chance to earn money to avoid reading opinions we disagree with. Participants had the choice to read opinions they agreed with about political topics like same-sex marriage, guns, or abortion for the chance to win $7. Alternatively, they could read opinions they disagreed with for the chance to win $10. About two thirds of participants chose to read the opinions they agreed with, giving up the chance to earn more money. And this tendency isn’t something you can simply pin on the other political party—the researchers found that both Democrats and Republicans were equally likely to avoid information they disagree with.

Blocking out information we disagree with—through creating social media echo chambers, reading partisan news, or only surrounding ourselves with friends who agree with us—can also lead to our opinions becoming more extreme. A number of psychological studies have shown that group discussions can lead people to hold more extreme beliefs than they would on their own—a phenomenon known as group polarization. Our tendency to surround ourselves with only like-minded opinions may be one of the reasons why Republicans and Democrats are rapidly becoming more polarized.

But, even if people do expose themselves to beliefs they disagree with, that won’t necessarily make things better. More exposure to the other side can sometimes backfire and cause people to become more entrenched in their own beliefs. One study paid Twitter users to follow accounts that would retweet tweets from their political opponents—liberals would see conservative tweets, and conservatives would see liberal tweets. It didn’t cause people to open their minds to the other side. Instead, liberals became more liberal, and conservatives became more conservative.

We often react to opinions we disagree with defensively, viewing them as threats to our identity. We also do the same with facts: When confronted with facts we disagree with, we often do not change our perceptions. Past research suggested the possibility that fact-checking could lead to a “backfire effect,” causing people to double down and become even more stubborn in their beliefs. Facebook discovered, for instance, that warning users that an article was false caused people to share that article even more. While the notion of a “backfire effect” is alarming, more recent research undercuts the idea, suggesting that fact-checking, if done properly, can often successfully correct misperceptions.

However, research suggests that correcting misperceptions isn’t enough to change behavior. For instance, one study found that successfully correcting the false belief that vaccines cause autism didn’t actually encourage some parents to vaccinate their children. Other studies found that correcting false beliefs about Trump caused people to change their beliefs, but this did not change how much they supported Trump. In other words, while you can get people to understand the facts, the facts don’t always matter.

So, how can we make them matter? Jay Van Bavel, in keeping with his identity-based model of political belief, suggests that we can make accuracy goals an important part of an identity in the same way scientists or investigative journalists do. Some research suggests this might be useful: People who are high in a trait called “scientific curiosity,” or who seek out scientific information for the pleasure of finding out novel things, are less likely to engage in politically motivated reasoning. Other research suggests that making accuracy goals more important by either paying people for accurate responses or holding people accountable can make people less likely to engage in politically motivated reasoning.

Van Bavel also proposes that making people feel more secure in their individual identity may make them more open to accepting information they would otherwise reject. Some evidence suggests that self-affirmation exercises, which allow people to reflect on values that are meaningful to their identity, make them more willing to accept information that goes against their political identities. Other evidence finds that validating people’s beliefs in conspiracy theories can make them more willing to accept information that contradicts those theories. However, some recent research on self-affirmations points to mixed findings, so more work is needed to determine how misinformation and identity threats interact.

Additionally, in-person conversations with people who have different beliefs may help us get outside our partisan echo chambers, as long as they are positive and nuanced. Psychologist Peter T. Coleman runs the Difficult Conversations Lab at Columbia University, where he studies how people who deeply disagree with each other can have productive conversations. He has found that if people build up goodwill for each other and have more positive interactions than negative ones, they tend to have more complex, nuanced discussions. Additionally, presenting people with information about an issue in a nuanced way, rather than as simplified pro-con arguments, can lead people to have more complex and satisfying conversations about contentious issues. While stories in the media often present simplified narratives, and social media typically does not promote positive interactions with our political opponents, a little bit of complexity and goodwill could make our political discussions more satisfying.

While our ability to reason is one of our greatest strengths, human reasoning can be flawed, especially when we are highly motivated to use reason to support our team. However, psychology research provides us with a few ways to overcome these flaws, like making accuracy goals an important part of our identity, being curious about the other side, or embracing complexity and nuance. These strategies may help open ourselves up to the facts—even when they are inconvenient.

More from Steve Rathje