Why non-verbal behavior matters

Basil Preisig and his colleagues at the University of Bern, Switzerland, were especially interested in how people with language disorders such as aphasia perceive the non-verbal side of communication.

Published on Wed 21 Jan 2015

Take a typical conversation: "Hello, how are you?" "Fine! How are you?"

{pause}

"Fine! Shall we…" Et cetera.

The opening of this conversation is highly structured, isn’t it? Turn-taking is a crucial and cooperative aspect of conversational speech. Gestures are also of great importance: when your feet are already turned toward the door, you are signaling that you are about to leave.

Try to read your own signals and you will see!

Conversation closers

In a conversation we include cues such as conversation openers, closers, and shifters. It’s a feeling we’re all familiar with: you’re trying desperately to end a conversation, and the other person keeps on talking, missing every cue you give.

You say: “Anyway…” {Person B keeps talking.} “See you Thursday, all right?” and edge toward the door… {Person B talks over you.} “Thursday at noon, right?” {Person B still doesn’t take the hint.} And so on. In this frustrating exchange, Person B repeatedly ignores your conversation closers, both verbal (“anyway”, “all right”, “right”) and nonverbal.

Social interactions

Social interactions are shaped by culture, but some behaviors, especially those related to conversation, remain remarkably consistent across the world.

What methods do researchers use to gain insight into verbal behavior?

Nonverbal cues are an equally important part of conversation, as they provide additional information between interlocutors. Basil Preisig and his colleagues at the University of Bern, Switzerland, were especially interested in communicative differences between healthy control subjects and people with language disorders like aphasia.

Why non-verbal behavior matters: Understanding the message

In his study, Preisig tracked eye movement in aphasic patients and concluded that co-speech gestures helped guide the observer’s attention towards the speaker’s face. As the source of semantic input, the speaker’s face was the natural focal point for both healthy and aphasic subjects; but in some cases, accompanying nonverbal signals were needed to direct the listener’s focus. Preisig’s project was aimed specifically at gaining more insight into the visual exploration strategies of aphasic patients.

Measuring focus

A total of 16 aphasic patients and 23 healthy control subjects participated in the study. Preisig and colleagues presented each subject with videos of two people engaging in spontaneous dialogue. An infrared eye-tracking device enabled the researchers to follow each subject’s gaze behavior and calculate the amount of time they fixated on the speaker’s face, hands, and body.
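
The study does not publish its analysis code, but as a rough illustration of the kind of aggregation involved, the Python sketch below sums hypothetical gaze samples into total fixation time per area of interest (face, hands, body). The sample format, sampling rate, and function name are assumptions for this example, not the study’s actual eye-tracker output.

```python
# Illustrative sketch only: aggregating hypothetical gaze samples into
# fixation time per area of interest (AOI). The sample format is assumed,
# not taken from the study's actual eye-tracker export.
from collections import defaultdict

def fixation_time_per_aoi(samples):
    """samples: list of (timestamp_ms, aoi) tuples, ordered by time.
    aoi is e.g. 'face', 'hands', 'body', or None when gaze is elsewhere."""
    totals = defaultdict(float)
    for (t0, aoi), (t1, _) in zip(samples, samples[1:]):
        if aoi is not None:
            totals[aoi] += t1 - t0  # credit the interval to the current AOI
    return dict(totals)

# Toy sequence with coarse 40 ms steps, purely for demonstration
samples = [(0, "face"), (40, "face"), (80, "hands"),
           (120, "face"), (160, None), (200, None)]
print(fixation_time_per_aoi(samples))  # {'face': 120.0, 'hands': 40.0}
```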

Furthermore, eye movement recordings were synchronized with event data – co-speech gestures and voice activity of the actors – to get the complete picture.
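
To give a sense of what such synchronization makes possible, here is a minimal sketch that intersects fixation intervals with event intervals (for example, co-speech gestures) on a shared timeline. The interval format, names, and values are invented for illustration and are not drawn from the study’s data.

```python
# Illustrative sketch only: intersecting fixation intervals with co-speech
# gesture intervals once eye movements and event data (gestures, voice
# activity) share a common timeline. All names and numbers are assumed.

def overlap_ms(fixations, events):
    """fixations, events: lists of (start_ms, end_ms) intervals.
    Returns total fixation time that falls inside any event interval."""
    total = 0.0
    for f_start, f_end in fixations:
        for e_start, e_end in events:
            lo, hi = max(f_start, e_start), min(f_end, e_end)
            if hi > lo:
                total += hi - lo
    return total

face_fixations = [(100, 600), (900, 1500)]  # hypothetical fixations on the face
gesture_events = [(400, 1200)]              # hypothetical co-speech gesture span
print(overlap_ms(face_fixations, gesture_events))  # 500.0
```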

Eye movement and co-speech gestures

The presence of gestures was rated frame by frame in the video using The Observer XT software. As expected, aphasic patients fixated less on the speaker’s face than healthy controls did. The researchers also found that the presence of co-speech gestures greatly encouraged subjects to look at the speaker; however, the aphasic patients did not seem to fixate on the gestures themselves.
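
As a hedged illustration of how a frame-by-frame coding can feed into such an analysis, the sketch below converts per-frame gesture codes into time intervals that could then be intersected with gaze data as above. This is generic code under an assumed frame rate, not The Observer XT’s export format or API.

```python
# Illustrative sketch only: turning a per-frame gesture coding (e.g. exported
# from an annotation tool) into time intervals. The frame rate and coding
# scheme are assumptions, not The Observer XT's own format.

def frames_to_intervals(frame_codes, fps=25):
    """frame_codes: list of booleans, one per video frame (True = gesture).
    Returns (start_s, end_s) intervals of continuous gesture presence."""
    intervals, start = [], None
    for i, present in enumerate(frame_codes):
        if present and start is None:
            start = i
        elif not present and start is not None:
            intervals.append((start / fps, i / fps))
            start = None
    if start is not None:  # gesture still ongoing at the last frame
        intervals.append((start / fps, len(frame_codes) / fps))
    return intervals

codes = [False, True, True, True, False, False, True, True]
print(frames_to_intervals(codes))  # [(0.04, 0.16), (0.24, 0.32)]
```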

According to Preisig and colleagues, people with aphasia tend to have limited attentional resources, which they allocate primarily to the acoustic speech signal rather than the visual one. The authors cite other studies that support this interpretation.

Guiding attention

In conversation, gestures help guide attention to the speaker, the source of semantic input. However, it may also be that the visual signal interferes with the auditory signal and competes for the listener’s limited attentional resources. This is an interesting question for further research.

References

Preisig, B.C.; Eggenberger, N.; Zito, G.; Vanbellingen, T.; Schumacher, R.; Hopfner, S.; Nyffeler, T.; Gutbrod, K.; Annoni, J.; Bohlhalter, S.; Müri, R.M. (2015). Perception of co-speech gestures in aphasic patients: A visual exploration study during the observation of dyadic conversations. Cortex, 64, 157-168.
