Verified by Psychology Today


Pro-Blame Bias: The Don Corleone Principle

Pro-blame biases evolved to minimize the costly error of underblaming.

Key points

  • People view others as more responsible for bringing about harmful outcomes (relative to neutral or positive outcomes).
  • Underblaming (a false negative error) signals exploitability and is thus costlier than overblaming.
  • Overblaming (a false positive error) may be evolutionarily advantageous when the amount of blame warranted is ambiguous.

Consider the following scenario (or if you are a moral psychologist or philosopher, you may skip ahead):

The vice president of a company went to the chairman of the board and said, 'We are thinking of starting a new program. It will help us increase profits, but it will also harm the environment.'

The chairman of the board answered, 'I don't care at all about harming the environment. I just want to make as much profit as I can. Let's start the new program.'

They started the new program. Sure enough, the environment was harmed.

Did the chairman intentionally harm the environment?

Pro-Blame Bias

In 2003, philosopher Joshua Knobe discovered that the majority of people judged that the chairman intentionally harmed the environment. But when people were presented with the exact same scenario, except that the side effect helped rather than harmed the environment, most said he did not intentionally help the environment. People ascribed more intentionality to harmful than to helpful side effects of otherwise identical actions.

This finding has been replicated numerous times, and conceptually similar work (dating back to at least the 1960s) finds the same pattern: people are particularly inclined to attribute responsibility for harmful or immoral actions. For example, people assigned more responsibility to a man whose parked car rolled down a hill after its brake failed when the car caused an accident than when a fortunately located tree stump stopped it; people attributed more causal culpability to a speeding driver in a car accident when he was speeding home to hide a vial of cocaine than when he was speeding home to hide an anniversary present; and people attributed more free will to a person who stole someone’s valuables than to a person who stole the contents of someone’s recycling bin.

In general, people have a pro-blame bias such that they are more likely to view people as responsible when their behavior results in harmful outcomes than when their behavior results in helpful, neutral, or less harmful outcomes.

Error Management Theory

Scholars have forwarded numerous explanations for these findings, but error management theory may be especially useful here. Error management theory contends that people exhibit cognitive biases (small tendencies to favor particular judgments or conclusions in ambiguous information environments) in domains where the costs of false-positive and false-negative errors were asymmetrical over evolutionary history. In other words, when people have incomplete information upon which to base a judgment, they err on the side of the less costly error.
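The core logic of error management theory can be sketched as a simple expected-cost comparison. A minimal illustration (the probabilities and cost numbers below are hypothetical placeholders, not estimates from the literature):

```python
# Error management in miniature: under uncertainty, favor the judgment
# whose expected cost is lower. All numbers are illustrative only.

def expected_cost(p_guilty: float, cost_false_neg: float, cost_false_pos: float):
    """Return (expected cost of withholding blame, expected cost of blaming)."""
    # Withholding blame risks a false negative (underblaming) if the person is guilty.
    cost_if_withhold = p_guilty * cost_false_neg
    # Blaming risks a false positive (overblaming) if the person is innocent.
    cost_if_blame = (1 - p_guilty) * cost_false_pos
    return cost_if_withhold, cost_if_blame

# An ambiguous case: the evidence of ill intent is only 50/50, but underblaming
# (signaling exploitability) is assumed far costlier than overblaming.
withhold, blame = expected_cost(p_guilty=0.5, cost_false_neg=10.0, cost_false_pos=2.0)
print("blame" if blame < withhold else "withhold")  # prints "blame"
```

Even with the evidence perfectly balanced, the asymmetry in error costs alone tips the judgment toward blame; a symmetric cost structure would leave the judgment at the mercy of the evidence.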

For example, whereas men tend to overestimate women’s sexual interest in them, women tend to underestimate men’s commitment to them. For a man, it may be costlier to miss a genuine mating opportunity than to make an unwanted sexual advance, whereas for a woman, it may be costlier to risk pregnancy with a man who will abandon her and any resulting offspring than to miss out on a man who might commit to her. Consequently, men err on the side of assuming a woman is sexually interested, whereas women err on the side of assuming a man is not committed.

When it comes to blaming a harmdoer, the costs seem asymmetric. Letting people off the hook for their transgressions may signal to others that one is easily exploitable, which could be a highly costly false-negative error. In contrast, being a bit too harsh in one’s moral judgments may not be particularly costly, and might even be evolutionarily advantageous to the extent that people work extra hard to avoid crossing harsh punishers (a less costly false-positive error). Thus, people may have evolved to err on the side of blaming harmdoers. I call this the Don Corleone Principle:


I'm a superstitious man, and if some unlucky accident should befall him ... if he should be shot in the head by a police officer, or if he should hang himself in his jail cell, or if he's struck by a bolt of lightning, then I'm going to blame some of the people in this room, and that I do not forgive.

–Don Corleone, The Godfather

By warning that he will seek vengeance if harm befalls his youngest son, the Godfather likely intimidates potential antagonists. Of course, Don Corleone is fictional and the example is a tad hyperbolic: blaming someone for a lightning strike could hurt one’s credibility as a reasonable person and backfire with a loss of status and affiliations. Biases reveal themselves most strongly in ambiguous information environments, where third-party observers are unlikely to know whether blame was excessive (and thus unlikely to counter-blame the blamer for it). And so people should err on the side of blame precisely in those cases where it is unclear how much blame is warranted.

Of course, moral judgments are often ambiguous—one cannot read minds or know with certainty whether a person intended to cause harm or harbored ill will—and so such judgments should frequently be vulnerable to biased interpretation. Consequently, people likely attribute more ill will to others than actually exists.

This error management perspective may help explain why (1) many demonstrations of pro-blame biases involve ambiguous cases, such as side effects of actions or accidental harms; (2) people judge completely ambiguous actions as immoral; and (3) pro-blame biases are stronger for harms directed toward the self or toward good others than toward bad others, in concrete rather than abstract cases, and in one’s own social world rather than in other worlds. The costlier it would be to fail to ascribe moral responsibility, the more inclined people should be to err on the side of ascribing it.

In contrast, for morally good and especially morally neutral actions, the error costs seem more symmetric (and relatively low). When another person does something morally good, failing to ascribe responsibility might reduce their future altruistic behavior, while overestimating their responsibility might create an overly positive impression of their character. And no obvious consequences come to mind for under- or overestimating responsibility for neutral outcomes. Corresponding to these relative costs, people attribute slightly more free will and responsibility to good than to neutral actions, but less to good than to bad ones.

The Rationality of Biases

Biases are often contrasted with rationality because modern humans value the fair application of principles. It may seem unjust or incoherent to hold people more responsible when they had the bad luck of a neutral action (e.g., parking a car on a hill) producing a negative outcome (e.g., the car causing an accident) rather than a neutral one (e.g., a tree stump stopping the car). But where biases evolved, they likely did so because they benefited human fitness. People may have evolved to ascribe more responsibility to behaviors that cause harm than to otherwise identical behaviors that do not, in order to minimize the costly error of underblaming. Because accidents don’t happen to people who take accidents as a personal insult.


Alicke, M. D. (1992). Culpable causation. Journal of Personality and Social Psychology, 63(3), 368-378.

Clark, C. J. (in press). The blame efficiency hypothesis: An evolutionary framework to resolve rationalist and intuitionist theories of moral condemnation. In T. Nadelhoffer & A. Monroe (Eds.), Advances in Experimental Philosophy of Free Will and Responsibility. London, UK: Bloomsbury Publishing.

Clark, C. J., Bauman, C. W., Kamble, S. V., & Knowles, E. D. (2017). Intentional sin and accidental virtue? Cultural differences in moral systems influence perceived intentionality. Social Psychological and Personality Science, 8(1), 74-82.

Clark, C. J., Luguri, J. B., Ditto, P. H., Knobe, J., Shariff, A. F., & Baumeister, R. F. (2014). Free to punish: a motivated account of free will belief. Journal of Personality and Social Psychology, 106(4), 501-513.

Clark, C. J., Shniderman, A., Luguri, J. B., Baumeister, R. F., & Ditto, P. H. (2018). Are morally good actions ever free?. Consciousness and Cognition, 63, 161-182.

Clark, C. J., & Winegard, B. M. (2020). Tribalism in war and peace: The nature and evolution of ideological epistemology and its significance for modern social science. Psychological Inquiry, 31(1), 1-22.

Clark, C. J., Winegard, B. M., & Shariff, A. F. (2019). Motivated free will beliefs: The theory, new (preregistered) studies, and three meta-analyses. Journal of Experimental Psychology: General.

Cushman, F., Knobe, J., & Sinnott-Armstrong, W. (2008). Moral appraisals affect doing/allowing judgments. Cognition, 108(1), 281-289.

Everett, J. A. C., Clark, C. J., Meindl, P., Luguri, J. B., Earp, B. D., Graham, J., ... & Shariff, A. F. (2021). Political differences in free will belief are associated with differences in moralization. Journal of Personality and Social Psychology, 120(2), 461-483.

Haselton, M. G., & Buss, D. M. (2000). Error management theory: A new perspective on biases in cross-sex mind reading. Journal of Personality and Social Psychology, 78(1), 81-91.

Hester, N., Payne, B. K., & Gray, K. (2020). Promiscuous condemnation: People assume ambiguous actions are immoral. Journal of Experimental Social Psychology, 86, 103910.

Knobe, J. (2003). Intentional action and side effects in ordinary language. Analysis, 63(3), 190-194.

Monroe, A. E., & Malle, B. F. (2019). People systematically update moral judgments of blame. Journal of Personality and Social Psychology, 116(2), 215-236.

Roskies, A. L., & Nichols, S. (2008). Bringing moral responsibility down to earth. The Journal of Philosophy, 105(7), 371-388.

Struchiner, N., De Almeida, G. D. F., & Hannikainen, I. R. (2020). Legal decision-making and the abstract/concrete paradox. Cognition, 205, 104421.

Walster, E. (1966). Assignment of responsibility for an accident. Journal of Personality and Social Psychology, 3(1), 73-79.

More from Cory Clark Ph.D.