Eric Horowitz

How Knowledge Can Make You Stupid

What you know can bias your beliefs about what others know.

The human ability to infer what other people are thinking is a big reason we're able to understand and cooperate with others. Along with the ability to take pictures of our food, it's what separates us from lesser primates.

But we're not born with this ability. Experiments involving what's called the "change-of-location" or "false-belief" task show that it tends to develop between the ages of three and five. In these experiments children observe or are told about a person who hides an object and then leaves the room. While this protagonist is gone, a second person comes in and moves the object. Children are then asked where the protagonist will look for the object when they return. Younger children are unable to separate their own knowledge of the new location from the knowledge of the protagonist, and thus they tend to say the person will look in the object's new location. Older children are able to understand that the protagonist can hold a false belief, and thus they tend to correctly say the protagonist will look in the object's original location.

For years, that's all there was to it. Once kids reached elementary school, we assumed they could keep their own knowledge separate from what they perceived others to know. But a new study led by Jessica Sommerville of the University of Washington throws some variation into the change-of-location task, and the results suggest that we may not grow out of this stage as much as we think.

While the standard false-belief task involves putting the object in one of a handful of discrete locations (a table, a cupboard, a closet), Sommerville and her team created a continuous range of locations by conducting the experiment within a sandbox. This allowed the researchers to detect subtler degrees of influence: rather than having to pick the outright wrong location to show bias, participants only needed to miss the correct location by a matter of centimeters. As the researchers explain:

Whereas the classic change-of-location task is designed to assess whether participants appreciate that a protagonist can hold a false belief, our Sandbox task focuses on a different but related issue. The goal of our task is to test the degree to which participants’ knowledge of the object in its new location biases their representation of where the protagonist thinks the object is located. Thus, our task is designed to focus on the amount of bias (measured in centimeters) that the participants’ own privileged knowledge exerts on their representation of another person’s belief about a location in space.

Sommerville's experiments followed the standard change-of-location format. In the "false-belief" condition, the experimenter narrated and acted out the story, first placing the object in its initial location in the sandbox and then moving it to a second location while the protagonist in the story would have been absent. Participants were then asked to predict where the protagonist would look for the object. Participants also completed a control condition in which they were simply asked to recall where the object was initially placed.
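To make the measure concrete, here's a rough sketch in Python of how a bias score like this could be computed. The numbers and variable names are my own illustration, not the study's data or analysis code:

```python
# Illustrative sketch (not the study's code) of a sandbox bias score.
# Positive bias means an answer drifted toward the object's new location.

def bias_cm(response_cm, initial_cm, new_cm):
    """Signed displacement of a response from the initial location,
    counted as positive when it moves toward the new location."""
    direction = 1 if new_cm >= initial_cm else -1
    return (response_cm - initial_cm) * direction

# Hypothetical participant: object buried at 50 cm, then moved to 110 cm.
memory_answer = 52        # control: "where was it first buried?"
false_belief_answer = 63  # "where will the protagonist look?"

print(bias_cm(memory_answer, 50, 110))        # 2 cm of drift
print(bias_cm(false_belief_answer, 50, 110))  # 13 cm of drift
```

The study's comparison is, in spirit, whether false-belief answers drift farther toward the new location than plain memory answers do.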

The researchers found that not only did the 3- and 5-year-olds show bias in the false-belief condition relative to the control, adults did too. That is, when adults were asked where the protagonist would look for the object, they chose a spot that, compared to their memory of the object's initial location, was significantly closer to the object's new location. It appears that adults' own knowledge of where the object was hidden influenced where they thought the person would look for it.

Within the lab, this may not seem like a big deal, but in other contexts this kind of bias can be problematic. For example, imagine that instead of understanding how somebody can falsely believe an object is four feet from the end of the sandbox when it's really two feet from the end, you understand how your neighbor can believe your toaster is worth $20 when you know it's worth $40. If the neighbor wants to buy your toaster, this understanding leads to the conclusion that $30 is a fair compromise.

But Sommerville's research suggests that things may not work that smoothly. Even if all objective evidence points to your neighbor thinking that the toaster is worth $20, your own knowledge that it's *really* worth $40 can bias your estimate of his belief. So instead of $30 being the fair compromise, you might think he actually believes the toaster is worth $22 or $23, and therefore $31 or $32 is the fair compromise. At the margin, this makes reaching an agreement more difficult.
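To see the arithmetic, here's a toy calculation. The dollar figures come from the example above; the biased estimate of $23 is a hypothetical amount of drift, not a measured one:

```python
# Toy arithmetic for the toaster negotiation example.
my_value = 40          # what I know the toaster is worth
his_true_belief = 20   # what my neighbor actually thinks it's worth

unbiased_midpoint = (my_value + his_true_belief) / 2        # 30.0

# If my own knowledge drags my estimate of his belief upward...
my_estimate_of_his_belief = 23                              # instead of 20
biased_midpoint = (my_value + my_estimate_of_his_belief) / 2  # 31.5

print(unbiased_midpoint, biased_midpoint)
```

A dollar and a half on a toaster is trivial, but the same drift scales with the stakes of the negotiation.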

Now imagine that instead of toaster prices, one person is a senator who knows the optimal tax rate on income over $250,000 is 39.3%, and the other person is a senator known to believe the optimal rate is 36.4%. Suddenly the prospect of the first senator's belief influencing his perception of what his colleague believes is a pretty big deal.

Of course this is all a rather lengthy extrapolation from a single study, and it's unclear how biased estimates about an object buried in a sandbox will translate to more natural settings. Furthermore, when it comes to high-stakes negotiations, there are so many factors involved that it's debatable whether these biases would even have a marginal impact.

Nevertheless, Sommerville's research is important because it reveals that we may never quite grow out of the phase where we're unable to keep our own beliefs from influencing how we perceive the beliefs of others. Being unable to assess somebody else's beliefs with 100% accuracy is a problem, and if it's your own knowledge that gets in the way, it becomes even more important to ensure that the beliefs you hold are the correct ones.

------------------------------------------------------------------------------------------

Follow me on Twitter

About the Author
Eric Horowitz

Eric Horowitz is a social science writer and education researcher.
