According to a recent study by Harvard psychologists, telling the truth can be harder than lying, but only for people who are also willing to cheat.
Researchers invited participants to play a game in which they could, if they wanted to, lie for profit. In every round, participants had to report whether they had correctly guessed the answer, but only after they saw what the answer was. Correct guesses earned money.
As participants played, the researchers tracked changes in brain activation. They were particularly interested in brain areas related to self-control (e.g., the anterior cingulate cortex and dorsolateral prefrontal cortex).
The researchers found that participants who were generally honest in the game showed no increased activation in these brain areas when they answered honestly. For them, telling the truth appeared to be an automatic process.
In contrast, people who were occasionally dishonest showed increased activation in self-control regions when they didn't lie. This pattern of activation resembles what occurs when someone actively resists a tempting reward or tries to override an automatic behavior. In other words, participants who sometimes cheated had to consciously resist the temptation to lie, while participants who never cheated didn't seem to be overcoming any temptation or instinct to lie.
Does this mean that there are two kinds of people in the world: those who are naturally good, and those who must struggle to be good? Not necessarily. What this study may have found is that participants who kept the option to lie open had to struggle to be honest. Participants who had a principled stance against cheating didn't have to consider the benefits of lying each time they gave their answer. Those who had to exert self-control to be honest weren't necessarily "dishonest" people, but people who had a more flexible set of responses in this situation. Because cheating was an option, they had to override the instinct to take the easy reward.
This kind of flexible morality is more common than many of us would like to admit. Most of us strive to be truthful and trustworthy, but our sense of moral obligation has its limits. We may try to get away with more when the chances of getting caught are low, the potential benefits are high, or the person being duped is a stranger. It's in these scenarios, when we carefully weigh the costs and benefits of lying, that telling the truth becomes an act of self-control.
But if you have a standing commitment to honesty, and don't weigh the pros and cons of each opportunity to lie for your own benefit, it's not nearly as difficult to tell the truth. This is one reason so many religions and philosophies prescribe an absolute policy of truth-telling, whether it's the biblical commandment against bearing false witness or Yoga philosophy's core principle of satya (truthfulness). If you don't deliberate on the value of lying in each situation, you're less likely to lie at all.
Honesty is not the only behavior this applies to. It goes for any behavior you're likely to talk yourself into (e.g., smoking, snacking, shopping) or out of (e.g., exercising, getting up early, tackling the procrastination pile on your desk) because it's easier or more rewarding in the short term. In these cases, the freedom to choose just makes it more likely that you will choose temptation and fail at your long-term goal.
What's the best strategy, then, for making moral decisions or sticking to a behavior change? Take a principled stance that sets automatic restrictions on your behavior. Weighing the risks and benefits of each situation may seem like the more logical approach, but for most people it's more effective to commit broadly and then stop deliberating over each individual opportunity.
If there's something in your life that you want to stick to but keep talking yourself out of, try reframing the decision not as a series of individual choices but as a single choice between always sticking to your goal or always giving in. Framed this way, each choice carries not just the immediate risks and benefits, but the long-term consequences of being someone who consistently makes that choice.
Study cited: Greene, J. D., & Paxton, J. M. (2009). Patterns of neural activity associated with honest and dishonest moral decisions. Proc Natl Acad Sci U S A. Epub ahead of print.