Hearing With Our Eyes, Seeing With Our Ears
How the brain combines sights and sounds to create conscious experience.
Posted November 19, 2015 | Reviewed by Jessica Schrader
Although we depend on both vision and hearing to interact with our environment, we generally consider blindness a greater disability than deafness. At least when we’ve lost our hearing, we can still see to navigate through the world, and we can learn sign language or lip reading to communicate. Blindness, by contrast, leaves our language abilities intact but greatly limits our mobility and independence.
So goes the received wisdom, but Daniel Kish doesn’t think of his blindness as a disability. Having lost both eyes to retinal cancer before the age of two, Kish has no memory of vision. He was also too young to understand that blindness was an impairment. So instead he taught himself to see by using his ears.
Kish makes clicking sounds as he moves through the world, and his brain uses the echoes to create a three-dimensional image of his environment. These images, he insists, are rich in shape and texture. He can’t perceive color, yet in another way his image of the world is more developed than those of sighted people. This is because he not only “sees” what’s in front of him but what’s behind him as well. In the following YouTube video, you can watch Kish riding his bicycle along a suburban street.
For sighted people as well, hearing takes center stage in our attention when the visual input is unclear. Imagine you’re deep in a forest and it's getting dark. All you can see are trees and shadows, but a rich panoply of sounds will tell you what’s going on around you. I don’t know if it’s an illusion or reality, but the woods behind my house sound much noisier after dark.
We don’t experience our senses individually. Rather, our brain meshes our vision and hearing together to create our conscious experience of the world. What you see can influence what you hear, and likewise hearing can affect vision.
Although speech is perceived through the ears, what we see can change what we hear. In the following YouTube video, a man produces the same syllable over and over again. If you watch his mouth, you’ll hear the syllable “fah,” but if you look away you’ll hear “bah.” Although your ears hear “bah,” your eyes see “fah,” and even in speech your brain trusts vision over hearing. This phenomenon is known as the McGurk effect.
Hearing can also affect what you see. If an image flashing once on screen is accompanied by two beeps, you’ll see the image flashing twice. Likewise, two dots crossing diagonally on a screen appear to pass each other if there’s no sound, but they appear to bounce off each other if you hear a “boing” at the precise moment the two dots overlap.
For sighted people, vision dominates conscious experience. We focus our attention on what we look at. Hearing is relegated to a secondary role, mainly to monitor the environment for potential threats or opportunities that call on us to shift our visual attention. Right now you’re reading this blog post, but if you heard a loud noise behind you, you’d turn to look for its source.
People with profound hearing loss, however, need to use their vision both for focused attention and for monitoring the environment. As a result, the brains of deaf people process more information from a single glance than the brains of hearing people do. This widened span of visual perception has unexpected consequences.
Literacy rates among the deaf are much lower than they are among the hearing population. This is especially true for those whose mother tongue is American Sign Language. After all, English is a foreign language to them, and they’re being asked to read it without speaking it. You’ll see deaf children signing as they read, just as hearing children read aloud. But the vocabulary and grammar of ASL and English are quite different, so there’s often no word-for-word correspondence between the two.
However, about 5 percent of deaf adults excel at reading English, and they do so at a faster rate than do their hearing peers while maintaining the same level of comprehension. When you read, it seems as though your eyes move smoothly along each line of print. But in fact your eyes jump from point to point, taking in a narrow slice of print with each glance. Since deaf people develop a wider span of visual perception, they take in more information with each glance, and as a result they can read faster.
Our intuition tells us that our senses are separate streams of information. We see with our eyes, hear with our ears, feel with our skin, smell with our nose, taste with our tongue. In actuality, though, the brain uses the imperfect information from each sense to generate a virtual reality that we call consciousness. It’s our brain’s best guess as to what’s out there in the world. But that best guess isn’t always right.
I am the author of The Psychology of Language: An Integrated Approach (SAGE Publications).
Bélanger, N. N., & Rayner, K. (2015). What eye movements reveal about deaf readers. Current Directions in Psychological Science, 24, 220-226.