How Does Communication Work?

Part 3: Six ways in which non-verbal cues provide meaning.

Posted Jan 29, 2020

In this third and final part of my series on the nature of communication, I examine the main functions of non-verbal cues. Recall from Part I and Part II that non-verbal cues involve kinesics (namely body language, which includes facial expressions, eye gaze and gestures) and paralanguage (which includes features of tone of voice such as intonation and prosody, but also involuntary sounds such as laughter, groans and sobbing, which I haven't focused on in this series).

It turns out that there are six main ways in which non-verbal cues enhance meaning in face-to-face spoken interaction.

Substitution. The first is substitution. I can say "yes"; or, I can nod. Here we have a case where body language can straightforwardly replace a verbal expression. A clear case where non-verbal cues can replace language is a type of gesture known as an emblem. Examples include the OK sign or the thumbs-up gesture. Emblematic gestures like these speak for themselves: they have a stable meaning and are instantly recognisable without verbal support, in more or less the same way as an English word, such as "cat", is instantly recognisable to other English speakers. Moreover, gestures of this kind may be culture-specific. In Japanese culture, for instance, two hands placed firmly together means "please" or "thank you", while in Western culture this is a symbol of prayer.

[Image: The OK sign. Source: Wikipedia Commons, used with permission]

Reinforcement. On some occasions, however, we use two modes to say the same thing, giving rise to the second function: reinforcement. I can say "no" while also shaking my head. Here the kinesic cue repeats the verbal cue.

Mixed message. But sometimes the two modes are deployed simultaneously to give a mixed message, the third function; and this can be both deliberate and communicatively valuable. For instance, the linguistic expression "This will be fun," said with a monotone delivery and a facial grimace, deploys paralinguistic and kinesic cues in order to contradict the verbal expression. The function here is to create a humorous effect by using non-verbal cues to prompt an ironic reading.

Complementing. The fourth function, a particularly important one, of non-verbal cues is to complement the verbal expression: to add information not otherwise present in the spoken words. For instance, if someone is offered a glass of wine, they might respond, “Yes, please,” while using a gesture involving the thumb and forefinger parallel, just a couple of centimetres apart, to indicate just a small amount of wine. In this case, the gesture is providing supplementary information, indicating the amount of wine required and otherwise not available from the spoken response. This complements and clarifies the information contained in the spoken utterance, helping out the addressee in the process.

Many uses of speech prosody complement the linguistic mode, sometimes even changing the meaning of the words in the process. Here's an example I've sometimes used in media interviews to make this point. Take the utterance "I love you." In standard American or British English, with falling pitch, the utterance is a declaration of undying love. But with rising pitch, said as a question, it becomes a derisive counterblast that can serve as an ironic put-down (and is best not practised out loud, if you wish to keep your nearest as your dearest). In fact, "I love you," with rising pitch, doesn't do what it says on the tin: it means the opposite of what it says.

Speech intonation also tells us about a speaker's attitudes and emotional responses, another example of the complementing function. A fall from high pitch on the first syllable of a word, for instance, can signal greater excitement or warmth. Try experimenting by adjusting the pitch contour of "hello." Starting with higher pitch on the "hel" of "hello" is typically judged as warmer (as when greeting an old friend) than starting from low pitch (as when answering an annoying phone call while engrossed in another task).

Emphasis. The fifth way in which non-verbal cues support verbal expression is to provide an emphasising or accenting function. With regard to kinesic cues, a common way this is achieved is the use of so-called "beat" gestures; these are gesticulations that provide simple, rhythmic motions, made with the hands or fingers, as when someone repeatedly points at the air with their forefinger or repeatedly strikes their palm with their forefinger when enumerating a spoken list of issues to be addressed. These add emphasis to what’s being said or express our emotional state. For instance, as psychologist Daniel Casasanto explains: "[f]ast staccato beats may show agitation; precise beats may show determination or sincerity; large, forceful beats may show either frustration or enthusiasm."

Paralinguistic cues also signal the relative importance of different parts of an utterance. For instance, falling pitch can be used to signal what is new information. The sentence "I saw a ↘ burglar" might answer the question "What did you see?" Here, falling pitch (signalled by the arrow) from the previous word onto "burglar" emphasises the answer—the new information.

Discourse management. Finally, non-verbal cues also play an important function in the management and flow of ongoing discourse. Let's start with gestures: a gesture made at an earlier point in the conversation can be repeated at a later point, which works to tie the two points in the conversation together. We also use gestures when we can't find the word we're looking for, such as the finger placed against the lips to denote thinking.

Other sorts of kinesics can be used to regulate our communicative exchanges. Nodding and shaking the head, at appropriate places in a conversation, provide backchannel support, showing the other person that we are following and engaging with what they are saying, and whether we agree or disagree.

As I noted earlier, eye gaze is also known to have an important regulatory function. Speakers avert their gaze from an addressee when talking but establish eye contact to signal the end of their utterance. 

And of course, paralinguistic cues also play a ubiquitous role in managing the flow of our spoken interactions. For instance, intonation has an important function in regulating the way in which we interact with one another in conversation. In written text, we know where one word ends and another begins due to the white spaces between words. And punctuation conventions, like the comma, full stop (or period), exclamation and question marks, signal how written sentences should be parsed, divided up and interpreted. Think about it for a second: in spoken language, there are no question marks; and there are no spaces between words. Instead, we have the melodic rise and fall of pitch contours, acceleration and deceleration of the speech tempo, the variation in loudness, the emphasis or stress placed on specific syllables, all of which collectively punctuate our spoken language.

In essence, and as I stated in Part II of this series, the following is true: Language, paralanguage and kinesics form a unified communicative signal. Non-verbal cues are as important as spoken language in effective face-to-face communication.