How Language Shapes Our World
What you think is what you see
Posted September 10, 2015
When you open your eyes, you see a colorful three-dimensional world of objects and events. Add to the mixture a panoply of sounds, smells, tastes, and bodily sensations, and you’ve got the consciousness we all take for granted. We think we’re experiencing the world as it is, but we’re not—it’s a virtual reality constructed inside our heads.
Our perception of the world seems to be a passive process. Light comes into our eyes, sound comes into our ears, and our brains sort it out to create a conscious experience that more or less mirrors reality. But our brains aren't just passive receivers of information.
Instead, our brains are constantly making predictions about what’s out there. Our perceptions, then, are more about what the brain expects to encounter than what is truly there. In fact, the brain will blatantly disregard the information it receives from the senses, especially in situations where that sensory input doesn’t match up with years of experience.
The Cornsweet illusion is a good example. The upper tile in the picture appears to be darker than the lower tile, but in fact they’re the same color. You can verify this by placing your finger along the edge between them.
Now that you know the two tiles are the same color, you’d think the apparent difference would disappear when you removed your finger. But no! Your brain stubbornly goes back to its original interpretation. But why?
Here’s the reason for the Cornsweet illusion. For all your life, your brain has been dealing with patterns of lighting and shading. If both tiles are sending the same amount of light to your eyes, then the top tile must actually be darker, because it’s in the light. Likewise, the bottom one must actually be lighter, since it’s in the shade. This is how your brain reasons. That is, your brain shows you what it thinks the colors should be, not what your eyes say they are.
Most of the visual illusions you encounter on the internet or in Psychology 101 classes can be explained in terms of “top-down” perception, in which you experience the world as the brain expects it to be. This is in contrast to “bottom-up” perception, in which the brain represents the sensory input more or less faithfully.
Scientists have known about top-down perception for a century now. But psychologists Gary Lupyan of the University of Wisconsin and Andy Clark of the University of Edinburgh argue that top-down perception isn’t just limited to low-level processing like adjusting for lighting and shading. Instead, they argue that this is the brain’s default method for engaging with the world.
Imagine navigating through your house in the dark. You can do this not because you can see the furniture and the stairs, but because you know where they are and how to navigate around them. Likewise, consider backing out of your driveway. You can’t see the edge of the garage door or the mailbox, but you know how to maneuver the car to miss them.
The brain uses information flowing in from the senses to verify its predictions. And when the brain isn’t confident about its expectations, as when driving down an unfamiliar road, it depends much more on bottom-up perception.
Perception-as-prediction influences all sorts of daily experiences. For example, we enjoy music not because the sounds coming in are inherently pleasing. Rather, we’re pleased by the music because it matches our expectations. That’s why music from other cultures (or other generations) can be hard to listen to. Since it’s unfamiliar, we can’t form good predictions about it.
Expectations also play an important role in our interpersonal relationships. We’re constantly making predictions about what others will say or do, and we’re right so much of the time that we don’t even notice it. It’s only when they behave contrary to our expectations that they catch our attention.
While language perception is driven by expectations, Lupyan and Clark point out that language also creates expectations that influence our perception of the world more generally. This is because we don’t just use language to communicate with others, we use it to think to ourselves.
Psychologists and linguists have debated since the early twentieth century whether the language we speak can influence the way we perceive the world. For example, different languages divide the color spectrum somewhat differently. Supporters of the linguistic relativity hypothesis take this as evidence that language influences thought and perception.
As long as we view perception as mainly a bottom-up process, with information mostly flowing from the eyes and ears to the brain, it’s hard to imagine how language could have an influence. But Lupyan and Clark’s model of perception-as-prediction explains why linguistic relativity effects occur.
Each language carves up the world somewhat differently. So each language provides its speakers with a particular worldview that won’t be quite the same as the one that speakers of other languages have. In other words, we see the world according to the framework our language imposes on us.
Lupyan, G., & Clark, A. (2015). Words and the world: Predictive coding and the language-perception-cognition interface. Current Directions in Psychological Science, 24(4), 279–284.
David Ludden is the author of The Psychology of Language: An Integrated Approach (SAGE Publications).