Ulterior Motives: Just Don't Do It
Is it better to get up and act or do nothing?
Published July 1, 2010; last reviewed on June 9, 2016
Tragically, the baby was dead by the time Hu returned to the car, many hours later. Under pressure from a zealous district attorney, an Austin grand jury indicted him on a felony charge. But public opinion about the prosecution was highly critical of the DA, and ultimately Hu was allowed to plead guilty to a misdemeanor. Hu was terribly negligent, but he hadn't intended to hurt his child.
Hu's fate is consistent with what psychologists have discovered about the way we assign credit and blame. Say a boy plays catch with the girl next door. He throws the ball high. It goes over her head and breaks a neighbor's car window.
The boy is definitely going to get in trouble. He threw the ball and the ball crashed through the glass. It's an easy call, right?
What about the girl, though? She watched as the ball passed over her head. She didn't jump for it or try to swat it away. Perhaps she could have done something that would have stopped the ball from hitting the window. Yet no one is going to send her to bed without dinner.
Jonathan Baron and his colleagues at the University of Pennsylvania dubbed this tendency to blame outcomes on actions rather than inactions the omission bias: The boy took an action—he threw the ball. The girl failed to act. So the boy gets the blame.
The omission bias creeps into our judgment calls on domestic arguments, work mishaps, and even national policy discussions. In March, President Obama pushed Congress to enact sweeping health care reforms. Republicans hope that voters will blame Democrats for any problems that arise after the law is enacted. But since there were problems with health care already, can they really expect that future outcomes will be blamed on Democrats, who passed new laws, rather than Republicans, who opposed them? Yes, they can—the omission bias is on their side.
The Virtues of Nothingness
Actions are more obvious than inactions, which partly accounts for the omission bias. It's easy to notice when someone does something. It's hard to take note of something that does not happen.
Also, the results of actions are usually more certain than the results of inactions. The boy threw the ball, it hit the window, the window broke. That much is known. We can't know the result of an action that was not taken. Perhaps the girl would have saved the window with a leaping catch. Even if she had tried, though, the ball still might have eluded her grasp.
Finally, people's actions are often a better gauge of their intentions than their inactions. When you perform an action, you usually mean to do it. You may not have intended the outcome, but you did mean to start the action.
The intention behind inaction is harder to discern. Perhaps the girl was secretly happy to watch the ball crash into her cranky neighbor's car. However, she might not have noticed the ball at all.
What Did You Mean By That?
Intentions are particularly important to our notion of blame. Our judgments are harsher when we believe that an action (or even an inaction) was intentional rather than accidental. A study by Jason Plaks, Nicole McNichols, and Jennifer Fortune pointed out that there are two kinds of intentions: abstract intentions to cause an outcome and specific plans for causing that outcome. Both play a role in moral reckonings.
Imagine a CEO whose company pollutes the environment via its factories' smokestacks. She may decide that she wants to clean up the environment and so has her engineers design a system to scrub dangerous particles from the smoke. If this plan succeeds, the CEO gets the credit (the positive side of blame) for her heroic green act.
But what if the CEO believes that the company is losing money because people think it's harming the environment? She implements the same smoke-cleaning plan, but to improve the bottom line. In this case, she gets some credit for cleaning up the environment, but not as much as in the previous example. The outcome is the same (the company pollutes less), but the CEO's heart wasn't in it.
Finger-Pointing for the Future
We use credit and blame to help us shape people's behavior. By assigning blame to actions, we lead people to think about the consequences of what they do.
Intentions are important because they signal the kinds of actions that a person may well take in the future. The profit-minded CEO did the morally correct thing, but we give her little credit, because her desire to increase profits may later lead to morally suspect behaviors. Likewise, the ecologically minded CEO gets credit for helping the environment, because her abstract intention makes it likely that she will continue to engage in good behaviors.
Generally speaking, the omission bias is a good thing. We would like to assign credit and blame based on what people mean to do in addition to the outcomes of their actions. But no brain scan will ever fully reveal inner thoughts. People's actions are the only guide we have to their souls.
It's worth keeping the omission bias in mind when you're trying to get credit: Make sure your actions are in line with your intentions, for all to see. And if you launch a new product at work that tanks, or your kitchen renovation goes awry, try pointing out that at least you attempted something—while others sat back, avoiding both credit and blame.
What Would You Do?
- You are walking up the street in San Francisco when you see a trolley careening out of control. It is about to hit and kill 5 people. You happen to be standing next to a switch that could divert the trolley to a second track where it would kill only one person. Do you flip the switch?
- You are standing on a bridge in San Francisco when you see a trolley below you careening out of control. It is about to hit and kill 5 people. You’re standing next to a very fat man who is leaning over the railing to see what is happening. If you push him off the bridge, he will fall on the track. He’ll be killed, but his weight will stop the trolley, saving the 5 people. Do you push him?
- You are a hospital director. A 5-year-old boy named Timmy needs a rare medical procedure that only your hospital can provide. The surgery costs $1.2 million, but it will save his life. He doesn’t have insurance, so the hospital would have to bear the expense. Taking funds from the treasury would force you to delay renovating an operating room, which could put patients in that OR at risk. Do you save Timmy?
Here’s what other people would do:
When confronted with the first dilemma, people generally agree to throw the switch. But those given the second version generally don’t elect to push the man, because directly causing a death seems much worse than indirectly doing so.
In the third case, people say the hospital should save Timmy, because we normally respond to one human’s predicament more strongly than we do to general suffering. Furthermore, people don’t want the director to contemplate the problem for long, because putting a monetary value on human life makes us all uncomfortable.