FaceReader and different scientific theories on emotion

In this blog post, Tess den Uyl, PhD, Peter Lewinski, PhD, and Amogh Gudi, PhD, from VicarVision outline how FaceReader is designed with scientific rigor and in accordance with responsible AI principles.

Posted by Guest Blogger on Mon 26 Jun. 2023


Emotion has been a popular research topic for decades. This is evident from the vast number of papers on the topic and from the scientific debate surrounding it. A well-known discussion, for example, concerns the existence of a basic set of emotional expressions and whether emotions are universally similar or culturally diverse [1].

There is still no single theory of emotion that is widely accepted. Some critics also cast doubt on the general idea of classifying emotions based on facial expressions [2]. We understand that this raises questions about the scientific foundation of FaceReader.

In this post, we outline how FaceReader is designed with scientific rigor and in accordance with responsible AI principles, which allows researchers to use FaceReader within the scientific school of thought they subscribe to.

Scientific theories of emotion

There are many different approaches to researching emotions. This is nicely outlined in Figure 1, created by Gross & Barrett (2011) [3]. Notice the position on the spectrum of the theories of Ekman (red) and Russell (green). They hail from different schools of thought: Basic Emotion Theory and Psychological Construction Theory. These main theoretical perspectives have different viewpoints on what constitutes an emotion, what causes an emotion, how emotions are categorized, and how they are researched.

FaceReader offers flexibility to accommodate alternate theories of emotion, enabling users to apply the software within diverse theoretical frameworks and research contexts, thereby broadening its applicability and relevance across various fields of study.

Figure 1. Representative theorists on emotions grouped into four major perspectives (from Gross & Barrett, 2011).

Facial expressions and emotional responses

Although these different theories address emotions from a wider perspective, we focus mostly on the relation to facial expressions. Facial expressions are a prominent type of emotional response and the main focus of FaceReader. Many researchers have already indicated that the relation between facial expression and emotion is not certain, but rather one of probability [4]. Some researchers focus on the overlapping information and see the universality of emotions (both across individuals and cultures), while others focus on the differences and highlight the diversity within emotional responses.

Currently, researchers are still theorizing from all of these emotion perspectives. Basic Emotion Theory may be the most openly critiqued, but there are still researchers who continue along its lines [5, 6].

In general, the relevance of facial expression research is shown by the large variety of research on emotional expressions [7]. There is an abundance of research on the recognition and production of facial expressions, and it has been dominated by primary emotions such as fear and happiness. The output that FaceReader generates allows researchers to study emotions from the theoretical perspective of their choice.

Build your own interpretation in FaceReader

FaceReader is an automated facial coding system (AFCS). It derives its output from a combination of facial expressions, eye movements, changes in the absorption of light in the face (remote photoplethysmography, RPPG), and other auxiliary face parameters. FaceReader can then present that output in four arrangements. The first two are theory-driven, the third is theory-agnostic, and the fourth is user-defined:

  1. Basic Emotion Theory (Ekman)
  2. Circumplex model of emotions with arousal and valence (Russell)
  3. Facial Action Coding System (FACS) - the most cited agnostic measure of facial expressions
  4. Custom expressions - together with inputs such as gaze tracking, relative distance to the screen, and RPPG, the user can build their own output in a GUI*-based module

* A graphical user interface (GUI) is a digital interface in which a user interacts with graphical components such as icons, buttons, and menus.
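To make these four arrangements concrete, here is a minimal, hypothetical Python sketch of how one might inspect a single exported FaceReader log under each of them. The file name and column names are assumptions for illustration only; the actual headers depend on your export settings.

```python
# Hypothetical sketch: one exported FaceReader log viewed through the four
# output arrangements described above. Column names are assumed, not the
# software's guaranteed headers.
import pandas as pd

log = pd.read_csv("participant_01_detailed.txt", sep="\t")  # assumed export file

# 1. Basic Emotion Theory: intensities per discrete emotion
basic = log[["Happy", "Sad", "Angry", "Surprised", "Scared", "Disgusted", "Neutral"]]

# 2. Circumplex model: a two-dimensional valence/arousal trajectory
circumplex = log[["Valence", "Arousal"]]

# 3. FACS: Action Unit activations, with no emotion label attached
action_units = log[[c for c in log.columns if c.startswith("AU")]]

# 4. Custom expression: a user-defined combination of the signals above
#    (placeholder column name for a custom-expression output)
custom = log.get("MyCustomExpression")

print(basic.describe())
print(circumplex.mean())
```

The same recording can thus be summarized per discrete emotion, as a valence/arousal trajectory, as raw Action Unit activity, or through a user-defined measure, without re-running the analysis.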


By creating custom expressions, researchers can build their own output. They can draw on existing research on established Action Unit combinations that are associated with different facial expressions of emotions. Alternatively, they can create additional dimensions (e.g., approach vs. avoidance based on head position) or report physical signals, such as smiling, without interpretation labels such as 'happiness'. After all, the fact that someone is smiling does not necessarily mean that person is feeling happy.
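As a minimal sketch of the kind of rule a custom expression encodes, the snippet below combines Action Unit 6 (cheek raiser) and Action Unit 12 (lip corner puller) into a "Duchenne smile" signal, reported as a physical signal rather than as "happiness". The 0-to-1 intensity scale and the threshold are assumptions for illustration.

```python
# Sketch of a custom-expression rule: a Duchenne smile signal built from
# AU6 (cheek raiser) and AU12 (lip corner puller), without an emotion label.
# Intensities are assumed to be normalized to a 0..1 scale.

def duchenne_smile(au06: float, au12: float, threshold: float = 0.2) -> float:
    """Return a smile intensity if both AU6 and AU12 are active, else 0."""
    if au06 >= threshold and au12 >= threshold:
        return (au06 + au12) / 2.0
    return 0.0

print(duchenne_smile(au06=0.6, au12=0.8))  # 0.7 -> clear smile signal
print(duchenne_smile(au06=0.1, au12=0.9))  # 0.0 -> lip corners only, no cheek raise
```

Whether that smile signal should be interpreted as happiness, politeness, or something else remains a decision for the researcher.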

The user carries final responsibility

By constructing FaceReader in a modular way, users can build their own (emotion) classification system. For example, eagerness (how badly someone wants to buy something) is an emotion closely linked to interest, but perhaps you want to measure something slightly different. In FaceReader, it is possible to build this specific measure.
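The sketch below shows one hypothetical way such an "eagerness" measure could be composed from signals of the kind FaceReader exposes: an interest-like expression score, gaze on the product stimulus, and approach (leaning toward the screen). The field names and weights are assumptions, not FaceReader defaults; in practice this composition would be built in the drag-and-drop GUI.

```python
# Hypothetical composition of an "eagerness" measure from per-frame signals.
# Names and weights are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class FrameSignals:
    interest: float         # 0..1 expression score
    gaze_on_stimulus: bool  # eye tracking: is the participant looking at the product?
    approach: float         # 0..1, higher = leaning toward the screen

def eagerness(frame: FrameSignals) -> float:
    """Weighted combination; only counts frames where the stimulus is watched."""
    if not frame.gaze_on_stimulus:
        return 0.0
    return 0.6 * frame.interest + 0.4 * frame.approach

print(eagerness(FrameSignals(interest=0.8, gaze_on_stimulus=True, approach=0.5)))  # 0.68
```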

The main responsibility thus rests with the user: the software provides the three most common ways to interpret the output (basic emotions, the circumplex model, and FACS), as well as a structured drag-and-drop GUI for building one's own recognition system.

Also read: Creating a custom expression for Engagement: A validation study with FaceReader.

Context is necessary for interpretation

In addition, the design of an experiment gives researchers considerable knowledge of the context (e.g., the stimuli or the interaction), which helps with interpretation. Moreover, researchers can perform additional measurements on participants, such as physiology and eye movements, to further shed light on the experienced emotion.

FaceReader: from responsible science to responsible AI

There are different approaches to researching emotions. FaceReader allows researchers to choose the theoretical framework they operate in. In this way, FaceReader is built from a responsible science approach, in which the user remains in full control of how they interpret and label the expressions and signals from their participants' faces.

Facial expressions are not directly related to the underlying emotional process. Facial expressions are signals, and, as with any measurement tool, there can be large signal variability. Whether the signal is relevant for a certain process, population, or context should be investigated and validated within each specific research field, using multiple tools.

Furthermore, FaceReader is engineered with responsible AI principles in mind, striving for privacy by design and fair algorithms. For example, it does not use sensitive input (e.g., ethnicity) and it cannot perform face recognition (i.e., identifying a person as a specific individual). In addition, care is taken to limit bias by gathering training datasets with a balanced representation of gender, age, and ethnicity, and by testing repeatedly for the degree of bias.

In conclusion, the use of different (theoretical) perspectives and a lively discussion are very normal and important in science. FaceReader can be a useful tool to help the field.

References

E.g., as indicated by almost 100,000 papers on facial + expression + emotion on Scopus and approximately 25,000 papers on facial + expression + emotion + fear/happiness/anger.
