

How Do You Decide Things Are Getting Worse?

It takes less evidence to conclude that things are getting worse than that they are getting better.

Source: FotoCuisinette/Shutterstock

Lots of things in life happen in sequences: The unemployment rate will grow for a few months and then decline for a month or two. A sports team will go on a losing streak and then win a few games in a row. You will have several frustrating days at work, and then a project will start to go well.

Sometimes, you chalk up the change to random variation—what we usually call dumb luck. But sometimes you believe the change reflects that something significant has shifted and that the streak is likely to continue.

How much evidence do you need to decide that something fundamental has changed? This question is addressed in a paper in the February 2017 issue of the Journal of Personality and Social Psychology by Ed O’Brien and Nadav Klein. The authors present a series of findings suggesting that people need less evidence to decide that things are getting worse than they require to conclude that things are getting better.

In an initial series of studies, participants were asked to imagine streaks across several domains, such as sports, the economy, and health. Some were asked to imagine that things were going very well, then to envision 10 more events and report how many of those 10 would have to be bad before they would believe there had been a lasting change for the worse. Others were asked to imagine that things were going badly and to report how many of the next 10 events would have to be good before they would believe there had been a lasting change for the better.

When the starting point was good, people needed about five of the upcoming events to be bad in order to conclude that things were getting worse. When the starting point was bad, though, they needed about 6.5 good events on average to believe that things were getting better. The researchers found similar differences across a number of different formulations of the question, including modifying the duration of the changes and the size of the changes.

Another study presented participants with a graph of an economic indicator. For some participants, the graph started high and then went lower. Some participants were told that the economic indicator was one in which high values indicated that the economy was healthy. Other participants were told that the indicator was one for which high values indicated that the economy was unhealthy. So the falling bars on the graph represented good things to some participants and bad things to others.

The beauty of this study is that all participants were seeing exactly the same graph. When participants were asked whether the graph indicated a fundamental shift in the economy, they were more likely to see a small change as indicating a fundamental shift when it meant that things were getting worse rather than that things were getting better.

So, why does this happen?

The researchers ruled out many alternative explanations in studies I won't describe in detail here. For example, the effect does not seem to be driven by people being more alarmed by decreases than by increases, or by whether the change is happening to oneself or to someone else.

The team suggests instead an explanation based on the physical concept of entropy. The basic idea is that maintaining order requires energy: keeping a desk clean takes effort, and after a while, many desks (like mine) return to a state of disorder. Similarly, many people believe that improving the state of the world requires energy, and, correspondingly, that without that energy the state of the world will get worse. For example, I play the saxophone. Because I continue to practice, I continue to get better as a player. If I stopped, my playing would get worse.

To test this idea, the researchers conducted a final experiment in which participants were told about a game that people could learn to play. In one version, the game was quite difficult and required real effort to get better at. In the other version, the game tapped natural abilities that every human has, so players would improve simply by playing it more. Participants then evaluated sequences of performance, either improving or declining, and had to judge when a change in a player's performance reflected a real underlying change.

Participants who were told that the game was difficult and required effort showed the same pattern as in all of the other studies: They required less negative evidence to judge that a player was really getting worse than positive evidence to judge that a player was getting better. For the version in which the game tapped natural human abilities, though, the pattern reversed: Participants now required less positive evidence to decide that someone was getting better than negative evidence to conclude that someone was getting worse.

In general, of course, we live in a world in which things get better because effort has been put into them. As a result, this pattern is probably helpful—because we assume things are getting worse from just a few negative observations, we may intervene quickly to try to reverse declines. And, because we require more positive evidence to judge that things are getting better, we may continue to put in effort even after we see some positive results.

References

O'Brien, E., & Klein, N. (2017). The tipping point of perceived change: Asymmetric thresholds in diagnosing improvement versus decline. Journal of Personality and Social Psychology, 112(2), 161-185.
