The Language You Speak Influences What You Look At
Psycholinguistic experiments reveal how the mind works.
Posted Dec 14, 2019
Psycholinguistics is a field at the intersection of psychology and linguistics, and one of its recent discoveries is that the languages we speak influence our eye movements. For example, English speakers who hear the word candle often look at candy because the two words share the first syllable. Research with speakers of different languages revealed that bilingual speakers look not only at words that share sounds in one language, but also at words that share sounds across their two languages. When Russian-English bilinguals hear the English word marker, they also look at a stamp because the Russian word for a stamp is "marka."
Even more striking, speakers of different languages differ in their patterns of eye movements even when no language is used at all. In a simple visual search task in which people had to find a previously seen object among other objects, their eyes moved differently depending on the languages they knew.
For example, when looking for a clock, English speakers also looked at a cloud. Spanish speakers, on the other hand, when looking for the same clock, looked at a present, because the Spanish names for clock and present—reloj and regalo—overlap at word onset.
The story doesn’t end there. Not only do the words we hear activate other similar-sounding words, and not only do we look at objects whose names share sounds or letters even when no language is heard, but in speakers of more than one language, the translations of those names become activated as well. For example, when Spanish-English bilinguals hear the word duck in English, they also look at a shovel, because the translations of duck and shovel—pato and pala, respectively—overlap in Spanish.
Because of the way our brains organize and process linguistic and non-linguistic information, a single word can set off a domino effect that cascades throughout the cognitive system. And this interactivity and co-activation are not limited to spoken languages. Bilinguals of spoken and signed languages show co-activation as well. For example, bilinguals who know English and American Sign Language look at "cheese" when they hear the English word "paper" because cheese and paper share three of the four sign components in ASL (handshape, location, and orientation, but not motion).
What do findings like these tell us? Not only is the language system thoroughly interactive, with a high degree of co-activation across words and concepts, but it also shapes our processing in other domains—vision, attention, and cognitive control. As we go about our everyday lives, how our eyes move, what we look at, and what we pay attention to are influenced in direct and measurable ways by the languages we speak.
The implications of these findings for applied settings range from consumer behavior (what we look at in a store) to the military (visual search in complex scenes) to art (what our eyes are drawn to). In other words, it is safe to say that the language you speak influences how you see the world not only figuratively, but also quite literally, down to the mechanics of your eye movements.
(An earlier version of this article appeared in Scientific American in December 2019.)
Tanenhaus, M., Spivey-Knowlton, M., Eberhard, K., & Sedivy, J. (1995). Integration of visual and linguistic information during spoken language comprehension. Science, 268, 1632–1634.

Marian, V., & Spivey, M. (2003). Competing activation in bilingual language processing: Within- and between-language competition. Bilingualism: Language and Cognition, 6(2), 97–115.

Chabal, S., & Marian, V. (2015). Speakers of different languages process the visual world differently. Journal of Experimental Psychology: General, 144(3), 539–550.

Shook, A., & Marian, V. (2018). Covert co-activation of bilinguals’ non-target language: Phonological competition from translations. Linguistic Approaches to Bilingualism.

Shook, A., & Marian, V. (2012). Bimodal bilinguals co-activate both languages during spoken comprehension. Cognition, 124, 314–324.