Ethics and Morality
How Our Morals Might Get in the Way of Behavior Change
It's not just what behavior change interventions do; it's how they do it.
Updated September 29, 2025 · Reviewed by Abigail Fagan
Key points
- People resist behavior tools that clash with beliefs about willpower and personal responsibility.
- We value internal willpower over external strategies — even when the latter are more effective.
- Designing effective interventions requires understanding the values people feel are being violated.
In 1936, the psychologist Kurt Lewin proposed a deceptively simple idea: Behavior is a function of the person and their environment. He even gave it an equation: B = f(P, E). Your actions are shaped not just by who you are, but also by the situation you’re in, and how the two interact.
Since then, decades of behavioral science have confirmed and extended this insight. In the 1960s and 70s, researchers showed that while people’s actions are heavily influenced by the context around them, we tend to explain behavior by focusing on internal traits. This tendency, for example, to say someone was rude because they are a rude person, rather than because they were in a stressful situation, is called the Fundamental Attribution Error. We pay less attention to the context and attribute behavior to the content of a person’s character.
We also learned something else: that changing behavior can lead to changes in attitude. If you start recycling because it's required in your building, over time, you may come to believe it’s morally important. This runs counter to our intuition that beliefs come first and behaviors follow. But the evidence, especially from research on cognitive dissonance, shows that behavior change can, and often does, lead to lasting attitude changes.
These ideas have been around for over 50 years. And yet… people still seem to resist them.
Why?
Because we don’t just have theories about behavior — we have moral intuitions about what should shape behavior. And those intuitions can get in the way of doing what works.
In my research, I find that many people, whether they realize it or not, believe that:
- Behavior should come from the person, not from the environment.
- Attitude change should precede behavior change: changing people's beliefs first is necessary for meaningful, long-term behavior change.
In other words, we judge not just what people do, but how they come to do it. And this has consequences.
We see this in how people evaluate interventions designed to help others behave differently — even when those interventions are effective. For example, precommitment tools (like apps that block distracting websites) are often judged as “crutches,” despite their proven ability to help people focus and stay productive. In my own research, I’ve found that people see others who use these tools as having less integrity — as if shaping your environment to support your goals is somehow morally inferior to resisting temptation with pure willpower.
We also see this when it comes to diversity training. Companies spend billions of dollars a year on trainings that, at best, have been shown to be ineffective and, at worst, cause backlash and prove counterproductive. Meanwhile, structural approaches, such as redesigning policies and making work systems fairer, are more effective and do not rely on changing individual beliefs.
This is where the psychology of taboo trade-offs and sacred values comes in. Some things, like personal responsibility, free will, or even intrinsic motivation, have come to be seen as morally sacred. When we violate these sacred values, even in the service of effective behavior change, it can feel wrong.
And so we end up avoiding the more effective fix, rejecting tools that "manipulate" behavior, and clinging instead to interventions that feel morally upright, even if they don't work.
As a behavioral scientist, I'm interested in these moral intuitions not just as obstacles, but as data. If we want to design interventions people will actually use, we need to understand the values they're trying to protect, even when those values make behavior change harder. Once we understand those values, we can design interventions that work with them rather than against them.
