The Seduction of Syntax
Looking for language in all the wrong places
Posted Feb 06, 2015
A vervet can warn its fellow monkeys about an approaching predator by giving a leopard or eagle call, and everyone will take appropriate evasive action. But one vervet can’t tell another vervet, “I saw a leopard down by the river the other day,” or “If you see an eagle flying overhead, keep a close eye on the kiddies.” Perhaps oversimply put, vervet-speak has words but not sentences.
Language-trained primates will spontaneously produce strings of words that convey meanings far more complex than vervet alarm calls: “Banana, Koko, want, want, banana, want, Koko.” We get it—Koko wants a banana. But there’s something distinctly un-human about the way she’s expressed herself. Even a toddler wouldn’t talk that way.
A sentence is more than just a string of related words in random order. Rather, a sentence consists of words that are ordered according to syntax1. Furthermore, syntax adds a layer of meaning on top of what each word contributes to the sentence. “Koko want banana” and “Banana want Koko” don’t mean the same thing to a human speaker of English. But that distinction is beyond the grasp of a nonhuman primate.
Influential linguist Noam Chomsky maintains that syntax is the key defining feature of language. In fact, Chomsky defines a language as a set of sentences. Syntax, then, is the set of rules that generates all and only those sentences in that language. Such an abstract definition allows linguists to express their ideas in the mathematical notation of set theory, giving linguistics the patina of being a hard science.
Syntax is sexy if you’re a language geek. I recall the thrill of syntactic analysis in graduate school—manipulating symbols, drawing sentence trees, trying to predict what a native speaker would say in such-and-such a situation. We were code breakers tackling the world’s greatest enigma.
Half a century into the Chomskyan research program, linguists are no closer to understanding the structure of language than when their Great Leader launched his scathing attack on famous psychologist B. F. Skinner and his behaviorist approach to language development in 1959. In hindsight, the reason for this failure is obvious.
Perhaps no one has yet fallen in love with his cell phone’s operating system, as did Joaquin Phoenix’s character in the 2013 movie Her. Yet we’re all accustomed to talking to our computers, and to having them talk back. However, natural language processing had a rocky start.
In the early days of artificial intelligence, computer scientists tried programming the syntax of human language (mainly English) directly into the computer as a set of rules. Yet no matter how complex and detailed those rules were, computers just couldn’t handle all the intricacies of the language.
A more productive approach has been to let a computer program figure out the structure of the language on its own. By statistically analyzing a corpus of texts consisting of millions of words, a computer program can learn the patterns of the language and perform reasonably well on natural language tasks such as grammar checking, question answering, and even machine translation.
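To make the idea concrete, here is a minimal, illustrative sketch of pattern learning from a corpus—a toy bigram counter in Python. A real system would train on millions of words and use far more sophisticated statistics; the tiny corpus and word choices below are invented for the example.

```python
from collections import Counter, defaultdict

# A toy corpus standing in for the millions of words a real system analyzes.
corpus = (
    "the dog chased the cat . "
    "the cat chased the mouse . "
    "the mouse ate the cheese ."
).split()

# Count how often each word follows each other word (bigram frequencies).
# No grammar rules are programmed in; the structure emerges from counting.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

# The "pattern" the program has extracted: which words tend to follow "the"?
print(following["the"].most_common())
```

Even this crude tally captures a real regularity of English—nouns follow “the,” verbs don’t—without any rule ever being stated.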
Human brains are amazing statistical engines, continuously tracking frequencies and extracting patterns from our experiences with the world. Like language-trained computers, human children are exposed to texts—in the form of spoken dialogue—consisting of millions of words. And like their machine counterparts, they perform a statistical analysis, teasing out the patterns of the language. The research on early language development clearly shows that young children rely on patterns—not rules—for speaking and understanding language. And the same is true for adults.
The distinction between rule and pattern is subtle but important. A rule is absolute, and any exception must be explicitly stated—i before e, except after c…2. A pattern, in contrast, is flexible, its boundaries fuzzy by nature. Some patterns are so pervasive that they influence the structure of virtually every sentence, while others are relevant only to a thin slice of the language.
Chomskyan linguistics has failed because it has been searching for something that was never there. There are no rules of language, only patterns. And given what we now know about how the brain operates, it couldn’t be otherwise. Human brains are simply not very good at following step-by-step programs—the algorithms that computers run on. They are, however, exceptionally good at extracting patterns from the data of our experience, and they’re also very tolerant of uncertainty, making things up on the fly as needed.
Languages are structured to fit the ways that the brain processes information. There is no language acquisition device containing the genetically encoded rules of universal grammar, as Chomsky claims. Rather, language processing runs through the same neural circuits that the brain has been using to perform other information processing tasks for millions of years.
1Many people use the terms grammar and syntax interchangeably. But strictly speaking, syntax is about word order, while morphology is about word form. These two together form the grammar of a language.
2The poem goes on for six more lines, listing all the other exceptions to the rule. I memorized it in the sixth grade, and I can still recite it today. What can I say? I’m a language geek.
Arnon, I., & Clark, E. V. (2011). Why brush your teeth is better than teeth: Children’s word production is facilitated in familiar sentence-frames. Language Learning and Development, 7, 107–129.
Bonvillian, J. D., & Patterson, F. G. (1993). Early sign language acquisition in children and gorillas: Vocabulary content and sign iconicity. First Language, 13, 315–338.
Chomsky, N. (1957). Syntactic structures. The Hague: Mouton.
Chomsky, N. (1959). A review of B. F. Skinner’s Verbal Behavior. Language, 35, 26–58.
Chomsky, N. (1980). Rules and representations. New York: Columbia University Press.
Chomsky, N. (2011). Language and other cognitive systems. What is special about language? Language Learning and Development, 7, 263–278.
Ellison, M. (Producer), Jonze, S. (Producer, Director), & Landay, V. (Producer). (2013). Her [Motion picture]. United States: Warner Bros. Pictures.
Estigarribia, B. (2010). Facilitation by variation: Right-to-left learning of English yes/no questions. Cognitive Science, 34, 68–93.
Higginbotham, D. J., Lesher, G. W., Moulton, B. J., & Roark, B. (2012). The application of natural language processing to augmentative and alternative communication. Assistive Technology, 24, 14–24.
Keren-Portnoy, T., & Keren, M. (2011). The dynamics of syntax acquisition: Facilitation between syntactic structures. Journal of Child Language, 38, 404–432.
Patterson, F. G. (1978). The gesture of a gorilla: Language acquisition in another pongid. Brain and Language, 5, 72–97.
Seyfarth, R. M., Cheney, D. L., & Marler, P. (1980a). Monkey responses to three different alarm calls: Evidence of predator classification and semantic communication. Science, 210, 801–803.
Seyfarth, R. M., Cheney, D. L., & Marler, P. (1980b). Vervet monkey alarm calls: Semantic communication in a free-ranging primate. Animal Behaviour, 28, 1070–1094.
Zuberbühler, K., Cheney, D. L., & Seyfarth, R. M. (1999). Conceptual semantics in a nonhuman primate. Journal of Comparative Psychology, 113, 33–42.
David Ludden is the author of The Psychology of Language: An Integrated Approach (SAGE Publications).