There are a lot of psychological terms for the fact that people don't like to change their minds: "motivated reasoning," "confirmation bias," "cognitive dissonance." But you don't need academic jargon to know that trying to get somebody to see things your way is tough if they go into the argument with another point of view. You argue the facts, as thoughtfully and non-confrontationally as you can, but the facts don't seem to get you anywhere. The wall of the other person's opinion doesn't move. They don't seem to WANT it to move.
What's going on there? Why do people so tenaciously stick to the views they've already formed? Shouldn't a thinking mind be open to evidence...to the facts...to reason? Well, that's hopeful but naïve, and it ignores a vast body of social science research showing that facts, by themselves, are meaningless. They are ones and zeroes to your mental computer, raw blank data that only take on meaning when run through the software of your feelings. Melissa Finucane, Paul Slovic, and others call this "the affect heuristic": the subconscious process of taking information and filtering it through our feelings and instincts and life circumstances and experiences...anything that gives the facts valence, or meaning...which turns raw meaningless data into our judgments and views and opinions.
Okay, but why do we cling to our views so tenaciously after they are formed? Interesting clues come from two areas of study: self-affirmation and Cultural Cognition. Both suggest that we cling to our views because the walls of our opinions are like battlements that keep the good guys inside (us) safe from the enemy without (all those dopes with different opinions than ours). Quite literally, our views and opinions may help protect us, keep us safe, even help us survive. Small wonder, then, that we fight so hard to keep those walls strong and tall.
Self-affirmation studies find that if, before you try to change somebody's mind, you first ask them to remember something that gave them a positive view of themselves, they're more likely to be open to facts and to change their opinions. People who feel good about themselves are more likely to be open-minded! (That's far more simplistic than any academic would ever put it!) One study, now in press, was conducted back in 2008 and asked people about withdrawing troops from Iraq. Most Republicans at the time thought the troops should stay. Two separate groups of Republicans were shown statistics about the dramatic reduction of violence in Iraq following the "surge" in American troops. One group was asked to do a self-affirmation exercise first (they were asked to remember a time when they felt good about themselves for living up to a moral value they held). The other group was just shown the violence statistics, with no self-affirmation. Then both groups were asked whether the dramatic reduction in violence in Iraq was a reason to withdraw U.S. troops. The Republicans who did the self-affirmation exercise, the folks who were primed to feel good about themselves, were more likely to change their minds and say that the reduction in violence was a reason to begin pulling out of Iraq. The group that had not done the self-affirmation remained adamant that the troops should stay.
Cultural Cognition is the theory that we shape our opinions to conform to the views of the groups with which we most strongly identify. That does two things. It creates solidarity in the group, which increases the chances that our group's views will prevail in society (e.g., our party is in power). And it strengthens the group's acceptance of us as members in good standing. (Like the litmus test some conservative Republicans have proposed that candidates must pass, making sure their views conform to conservative doctrine before those candidates get party support.)
Strengthening the group, helping it win dominance, and having the group accept us matters. A lot. Humans are social animals. We depend on our groups, our tribes, literally for our survival. When our group's views prevail, and our group accepts us, our survival chances go up. So the Cultural Cognition motivation to conform our opinions to those of the groups/tribes with which we identify is powerful. Consistent with that interpretation, the more threatened we feel, by economic uncertainty, or threats of terrorism, or environmental doom and gloom, the more we circle the wagons of our opinions to keep the tribe together and keep ourselves safe...and the fiercer grow the inflexible "Culture War" polarities that impede compromise and progress. The self-affirmation research seems to support this. It appears that the less threatened we feel, the more flexible our opinions are likely to be.
So the next time you want to have a truly open-minded conversation on a contentious topic with someone who disagrees with you, don't launch right into the facts. Ask them to tell you about some wonderful thing they did, or a success they had, or positive feedback they got for something. And try to remember something like that about yourself. Then you might actually have a conversation, instead of the argument you're headed for.
The psychology of risk perception referred to above is described in detail in David Ropeik's new book, How Risky Is It, Really? Why Our Fears Don't Match the Facts.