Using Baby FaceReader for automated analysis of infant emotions

What if you had a way to understand a baby's unspoken needs? Analyzing facial expressions in infants helps researchers and caregivers understand more about the early development of emotions and social interactions.

Posted by Annemieke Hendriks on Wed 21 Feb. 2024


Facial expression analysis is vital in understanding human emotions and social interactions. It can be even more valuable when studying infants, as they have not yet developed verbal communication skills. Babies have to rely on nonverbal cues to convey their feelings. 

What if you had a way to understand their unspoken emotions and needs? Analyzing facial expressions in infants helps researchers and caregivers understand more about the early development of emotions and social interactions.

FREE WHITE PAPER: Tools for Infant Studies

Download the free white paper to learn more about the software tools available for infant studies.

  • Video observations to capture behaviors
  • Coding behaviors accurately
  • Unobtrusive emotion analysis

Why automated facial expression analysis is better

Traditionally, experts relied on manual observation to study emotions. This approach is highly time-consuming and can yield inconsistent results. The development of automated facial expression analysis promises a significant leap in measuring emotions.

That's because using dedicated software for facial expression analysis - like Baby FaceReader - not only speeds up the process but also provides a much higher level of consistency and objectivity. Moreover, it makes it easier to obtain large datasets and to replicate earlier studies.

Martina Zaharieva explains more about the validation of Baby FaceReader in this Noldus webinar: Validating Baby FaceReader 9 Analysis of Infant Affective Facial Expressions in Face-to-Face Interactions. Watch the webinar on demand now!

Technical innovation: Baby FaceReader 

Baby FaceReader represents a significant milestone in automated facial expression analysis. Developed with advanced algorithms and machine learning techniques, it uses a convolutional neural network (CNN) to identify and interpret various facial action units (AUs). The software specializes in infant facial expressions and is based on Oster's Baby Facial Action Coding System (Baby FACS).

Baby FaceReader uses AUs to calculate overall emotional valence. This way, the software can detect subtle changes in facial expressions, providing nuanced insights into an infant's emotional state. 
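To illustrate the idea of deriving valence from action units, here is a minimal sketch. It is not Baby FaceReader's actual algorithm: the AU names, weights, and the weighted-sum approach are all illustrative assumptions, with smiling (AU12) treated as a positive cue and brow lowering (AU3, AU4) as a negative one.

```python
# Illustrative sketch (NOT Baby FaceReader's actual method): derive a simple
# valence score from hypothetical action-unit intensities in [0, 1].

# Hypothetical weights: positive AUs raise valence, negative AUs lower it.
AU_WEIGHTS = {
    "AU12_lip_corner_puller": +1.0,  # smiling
    "AU3_brow_knitting": -0.5,       # Baby FACS brow cue
    "AU4_brow_lowerer": -0.5,
}

def valence(au_intensities: dict[str, float]) -> float:
    """Weighted sum of AU intensities, clipped to [-1, 1]."""
    score = sum(AU_WEIGHTS.get(au, 0.0) * v for au, v in au_intensities.items())
    return max(-1.0, min(1.0, score))

frame = {"AU12_lip_corner_puller": 0.8, "AU4_brow_lowerer": 0.1}
print(valence(frame))  # 0.75
```

A frame-by-frame valence signal like this is what lets the software track subtle shifts in expression over the course of an interaction.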

Unlike manual coding, which can be prone to subjective interpretation, Baby FaceReader offers a standardized approach to understanding facial expressions. To use this software in research more confidently, it is vital to explore its reliability and validity in more detail.

Baby FaceReader is a state-of-the-art system that automatically detects infant facial expressions, helping to address questions in developmental psychology.

Comparing Baby FaceReader with manual coding

That's why researcher Zaharieva and her team evaluated how well Baby FaceReader measures emotions compared to manual coding. In their study, they observed 58 infants at 4 and 8 months of age during naturalistic face-to-face interactions with different caregivers. Facial expressiveness evolves rapidly during these early stages of life, making them an ideal time to study emotional communication.

The researchers paid close attention to the software's ability to differentiate between positive, negative, and neutral expressions, understanding that accurately interpreting these nuances is crucial in developmental psychology. For their manual coding, they used The Observer XT.

Better results with positive emotions

The results of the study provided mixed insights. Compared to manually coded facial expressions, Baby FaceReader demonstrated reasonable accuracy in distinguishing positive expressions from negative or neutral ones. However, it had more difficulty differentiating between negative and neutral expressions.

Therefore, the researchers studied the performance of specific action units in discriminating between positive, negative, and neutral expressions. They found that automated detection of smiling (AU12) performed well in discriminating positive expressions from negative or neutral ones. And by analyzing brow lowering (AU3 + AU4) in the software, the team was able to reliably distinguish negative facial expressions from neutral ones.
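The two findings above suggest a simple decision rule: use AU12 to separate positive expressions from the rest, then AU3 + AU4 to separate negative from neutral. The sketch below is a hypothetical illustration of that logic; the thresholds and function names are assumptions, not taken from the paper or the software.

```python
# Hedged sketch of the discrimination logic the study points to:
# AU12 (smiling) separates positive from non-positive expressions, and
# AU3 + AU4 (brow lowering) separates negative from neutral.
# Thresholds are illustrative only.

SMILE_THRESHOLD = 0.5
BROW_THRESHOLD = 0.5

def classify_expression(au12: float, au3_plus_au4: float) -> str:
    """Classify one frame as positive, negative, or neutral from two AU cues."""
    if au12 >= SMILE_THRESHOLD:
        return "positive"   # smiling present
    if au3_plus_au4 >= BROW_THRESHOLD:
        return "negative"   # brow lowering present, no smile
    return "neutral"

print(classify_expression(au12=0.8, au3_plus_au4=0.0))  # positive
print(classify_expression(au12=0.1, au3_plus_au4=0.7))  # negative
print(classify_expression(au12=0.1, au3_plus_au4=0.2))  # neutral
```

In practice, a validated system would learn such decision boundaries from coded data rather than use fixed thresholds, but the two-cue structure mirrors what the study found to work.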


Advancing infant research with Baby FaceReader

The introduction of automated facial expression analysis in developmental psychology and pediatric care is a significant step forward. This technology offers new insights into the development of infant emotional expression, leading to better-informed caregiving and treatments of developmental disorders. As the technology continues to evolve, it will help us to understand the earliest stages of human emotion and communication even better.

References

  1. Oster, H. (2006). Baby FACS: Facial action coding system for infants and young children. Unpublished monograph and coding manual.
  2. Zaharieva, M., Salvadori, E., Messinger, D., Visser, I., & Colonnesi, C. (2024). Automated facial expression measurement in a longitudinal sample of 4- and 8-month-olds: Baby FaceReader 9 and manual coding of affective expressions. Behavior Research Methods. https://doi.org/10.3758/s13428-023-02301-3
