Is There Such a Thing as an Honest Mistake?

New research suggests how to judge whether a mistake is actually a lie.

Posted Aug 18, 2020

Everyone’s committed a so-called honest mistake, in which you engage in some less-than-reputable behavior for no other reason than that you, well, made a mistake. Perhaps you’ve sent a company an online payment for a service they performed and typed in “$50” instead of “$60” on your smartphone app. You know that you meant to pay the right amount, so your error wasn’t due to a desire to cheat or get away with something. On the other hand, how does the person you’re paying know this? From the company’s perspective, you’ve just tried to shave off 10 dollars, but from your perspective, your finger mistakenly slipped before you pushed “send.”

Little errors like this happen all the time, and because of this, most people are willing to accept the explanation that there was no intent to cheat. However, it’s also possible that a little mistake such as the $10 underpayment is part of a person’s larger pattern of dishonesty. Ten dollars here and ten dollars there can add up over time, and because there’s a good chance that deceptive people can get away with this in the larger scheme of things, some figure it’s a chance worth taking, reasoning that even if they were caught in the act, they would be unlikely to face serious consequences.

Indeed, when you’re dealing with someone whose mistake might seem forgivable, there’s an inner metric you can apply before deciding whether it was ultimately honest. According to Maastricht University’s Gianna Lois and Johannes Gutenberg-University Mainz’s Michèle Wessa (2020), “everyday life is full of examples where morally ambiguous contexts lead to trivial acts of dishonesty” (p. 1). The prevalence of these “trivial acts” may depend on the strictness of a society’s social norms when it comes to dishonesty.

What the authors describe as the “theory of bounded ethicality” proposes that when a society fails to clearly differentiate honest from dishonest behavior, people will have fewer guideposts to follow. They’ll become “ethically blind” and unable to judge the “moral repercussions” of their own behavior (p. 2). Consider what happens during a social or historical period marked by large-scale deceptions on the part of political leaders. You become so used to hearing elected officials utter falsehoods that your own sense of what’s a lie and what’s the truth can start to fade into a general cloud of confusion.

Another contributor to the individual’s tendency to behave dishonestly comes from the ways that people rationalize their own unethical behavior. Again, the more ambiguous the social norms, the more this is likely to happen. The so-called self-serving justifications you may engage in after behaving dishonestly stem from your desire to see yourself in a positive light. Without clear social norms saying that it’s wrong to cheat, you’ll have a much easier time continuing to see yourself as honest even when your behavior is anything but.

As the Dutch-German author team points out, there are two types of norms that help to govern honest behavior. A “descriptive norm” tells you what people actually do, and an “injunctive norm” tells you what people should do. Ambiguous situations set up vague injunctive norms, making it more likely that you’ll look around at what others are doing before deciding how to act.

The current pandemic has created exactly the type of ambiguous moral situation the authors studied. Many governments count on warnings to get citizens to follow COVID-related facemask, social distancing, and quarantining guidelines. Yet, as you undoubtedly know, these warnings don’t consistently have any teeth to them, leaving the injunctive norms ambiguous. As a result, people will do what others do, and if everyone else pays no attention to the warnings, they won’t either. When, and if, a local government decides to fine facemask scofflaws, that behavior might change.

Lois and Wessa created their own ambiguous moral situations in a set of two lab experiments in which participants performed a difficult cognitive task across three rounds of seven trials each. Participants believed they were competing against others, even though they weren’t. The rule was that if participants felt the task on a particular round had become too difficult, they could ask for extra rewards (experimental points). However, some of the rounds were not that difficult, so participants could make an honest mistake by asking for the reward on one of these easier rounds if they mistakenly recalled it as difficult.

The dilemma involving descriptive norms was created by experimentally manipulating the feedback participants received about their supposed competitors. In one condition, participants were told that their competitors had asked for more rewards than they themselves had. In the second condition, the researchers told participants that competitors had asked for the same amount, and in the third, the researchers simply reminded participants of the rule for asking for extra rewards. The question was how participants would behave on subsequent trials. Would they commit more dishonest mistakes by asking for more rewards than they deserved, or would they continue to behave honestly?

Adding another element to the study, Lois and Wessa also administered a measure of psychopathy to their participants to see whether personality would enter into the equation of who committed these rule violations. People high in this trait should violate the rules regardless of the experimental setup.

The findings showed that when participants were led to believe that others had violated the experimental rule, they too asked for more rewards than were warranted, what the authors referred to as a self-serving strategy. However, in some cases, participants asked for no rewards on rounds that were indeed difficult, referred to as a self-hurting strategy. Participants who copied the rule-breakers were responding, the authors suggested, to “conscious or unconscious self-serving biases” (p. 11). You’ll rationalize your own cheating, in other words, when you see everyone else doing so as well.

Other factors also entered into the rule-flouting equation. As you might expect, people higher in psychopathy were indeed more likely to cheat by asking for more than they deserved. Beyond personality, though, the question remains whether people cheat because they don’t know they’re violating a rule. That issue of rule ambiguity now comes into play.

Consider once again the facemask situation. People from a locality or country in which facemasks aren’t required may not even realize they’ve violated the regulations of the place they’re visiting. This could be an honest mistake. However, if they are aware of the local guidelines, they may use, as the authors point out, a “self-maintenance strategy” in which they pretend that they didn’t know about these guidelines. They can retain their “moral self-image” even while their behavior suggests the opposite. These mental gymnastics are even more likely to occur when the other facemask-less people in the vicinity also flagrantly violate the guidelines. By this logic, a combination of more widely advertised guidelines and a society that actually follows them should lead to greater compliance.

Returning to the question of how you decide whether someone’s mistake is honest or not, the factors to consider include the extent to which people know the ethical path, whether others are following that path, and whether the individual is someone who, in your experience, has tried to sidestep the moral high ground before. By the same token, you also need to be willing to take a hard look at your own potentially dishonest behavior by examining whether your own self-serving biases have given you that “ethical blindness” described by Lois and Wessa.

To sum up, an honest mistake may be just that. Deciding whom to believe is a complex process, but one that can ultimately help you establish more trusting and fulfilling relationships.

References

Lois, G., & Wessa, M. (2020). Honest mistake or perhaps not: The role of descriptive and injunctive norms on the magnitude of dishonesty. Journal of Behavioral Decision Making. https://doi.org/10.1002/bdm.2196