Are Your Morals Reasonable?

Our morals may actually be guided by emotion.

Posted Nov 14, 2016

Can your moral compass be trusted?
Source: © Nevit Dilmen, CC BY-SA 3.0

What is the right thing to do? When faced with a moral problem, many people believe that they rely on reason to decide what is right or wrong. This view is mistaken. In fact, our moral judgments often come from an immediate feeling rather than from reason alone.

In an article on moral reasoning, Paxton and Greene (2010) argue that moral reasoning is possible. They define it as “conscious mental activity through which one evaluates a moral judgment for its (in)consistency with other moral commitments, where these commitments are to one or more moral principles and (in some cases) particular moral judgments.” On this view, moral reasoning amounts to checking whether your moral judgments are consistent with your broader moral principles.

To illustrate their definition, they offer a conversation between two people, Adam and Greg. Adam is a consequentialist, meaning that his moral judgments are based on maximizing happiness and reducing suffering. He is also a vegetarian who wants to persuade Greg to stop eating meat. Adam first gets Greg to agree that increasing overall happiness in the world and reducing suffering is a good rule to follow. He then argues that if Greg accepts this rule, Greg should oppose eating meat, because becoming a vegetarian would reduce the suffering of animals. Consistency of action with principles is the key.

Still, just because you are consistent does not mean that you are reasoning your way toward a moral decision. The psychologist Leon Festinger developed cognitive dissonance theory after recognizing that people experience discomfort when they hold two contradictory ideas in mind. The theory, Festinger writes, “centers around the idea that if a person knows various things that are not psychologically consistent with one another, he will, in a variety of ways, try to make them more consistent.” He adds, “Dissonance produces discomfort and, correspondingly, there will arise pressures to reduce or eliminate the dissonance.” In other words, it is mildly painful to realize that a moral judgment you have made is not consistent with your principles.

A study by Ditto, Pizarro, and Tannenbaum (2009) illustrates this process in action. The researchers recruited college students, asked them their political affiliations, and divided them into two groups. In one group, participants were given the choice to push a man named “Chip Ellsworth III” onto a set of train tracks to save “100 members of the Harlem Jazz Orchestra.” In the other group, participants were given the option to push a man named “Tyrone Payton” onto the tracks to save “100 members of the New York Philharmonic.” The names and group memberships were meant to convey that participants were sacrificing one White person to save 100 mostly African American lives, or one African American person to save 100 mostly White lives. Liberals tended to favor sacrificing Chip to save the mostly African American lives, whereas conservatives showed no bias in either scenario.

But in a variation of the first study, when liberals were asked to respond to one scenario immediately after the other, their responses were suddenly consistent. A liberal who elected to sacrifice “Chip Ellsworth III” in the first scenario was highly likely (a .98 correlation between responses) to sacrifice “Tyrone Payton” when given the second scenario immediately afterward.

The participants made sure that their responses to the two scenarios matched. A participant may have initially felt reluctant to push “Tyrone Payton” onto the tracks, but then, recognizing that this reluctance ran counter to her previous judgment, felt emotional discomfort. That discomfort then guided her reasoning to override the initial intuition and produce a consistent response.

One could argue that we are still capable of moral reasoning without actually experiencing discomfort. Someone might not feel any pain that prompts her to reason. Instead, she might forecast that psychological pain is on the horizon if she makes an inconsistent moral judgment. Rather than experiencing discomfort and recruiting her powers of reason to escape it, she determines in advance that a particular judgment will make her feel discomfort. One might argue that this expectation of pain is an instance of reason rather than emotion.

However, even anticipating psychological discomfort and acting to avoid it is driven by emotion. When a person recognizes that producing an intuitive but inconsistent moral judgment will cause her discomfort, she overrides her intuition to avoid that discomfort. She is not directly experiencing an emotion, but the prospect of one is still motivating her moral reasoning.

We aim for consistent moral judgments to satisfy our desire for comfort and to avoid psychological pain. Reason plays a role, but emotion is what guides our moral judgments. Using our powers of reason to be morally consistent is driven not by a desire for truth but by a desire to avoid discomfort. There is nothing wrong with this, but we should recognize that reason alone is not how we arrive at our judgments of right and wrong.

You can follow me on Twitter here: @robkhenderson.


References

Ditto, P. H., Pizarro, D. A., & Tannenbaum, D. (2009). Motivated moral reasoning. Psychology of Learning and Motivation, 50, 307-338.

Festinger, L. (1962). A theory of cognitive dissonance. Stanford, CA: Stanford University Press.

Paxton, J. M., & Greene, J. D. (2010). Moral reasoning: Hints and allegations. Topics in Cognitive Science, 2(3), 511-527.