Learn more about FaceReader

Want to learn more about FaceReader? Discover in-depth information in our white papers, product videos, and customer success stories.

You'll also find relevant publications, as well as product overviews for your research area.

FaceReader customer success stories

Studying user behavior and interactions

Researchers at the Social Media Lab analyze how users interact with social media.

Learn what your users need

This type of research helps civil society organizations and other professionals identify their customers' needs and implement new tools.

Efficient coding of social behavior

At the Social Behavior Lab at Western University, Dr. Erin Heerey explores human behavior during social interactions.

Save hours of manual coding

Frame-by-frame expression analysis of her project would have taken 800 hours of manual coding. FaceReader™ did it in only 14 hours!

The role of sensory evaluation

At Virginia Tech Food Science & Technology, researchers use FaceReader to capture how people respond to food products.

Observe unconscious responses

Studying unconscious responses helps researchers gain insight into the effects of flavor, sensory quality, and nutritional value of food.

FaceReader webinars

Project Analysis Module and custom expressions in FaceReader

In this webinar, you'll learn more about what you can achieve with FaceReader.

  • Learn about new modeling methods
  • Watch a demonstration of the Project Analysis Module
  • Learn how you can use custom expressions

The psychophysiology of emotion

In this webinar, you'll learn about the relationship between emotional states and psychophysiological measures.

  • Discover how evolution shaped our brains
  • Learn how to measure heart rate and skin conductance
  • Observe patterns in physiological responses to certain emotions

Reading materials

FaceReader methodology

Learn more about how FaceReader classifies facial expressions.

You'll also discover what types of data you can collect for your research and how the software is validated.

Facial Action Coding System

The Facial Action Coding System (FACS) was developed by Paul Ekman and Wallace Friesen.

This model describes distinct Action Units: the smallest visible units of muscular activity in the face.

Custom expressions

In FaceReader, you can combine different metrics to create your own expressions.

Learn how to create custom expressions and get inspired by examples from other researchers.
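
To make this concrete, here is a minimal sketch in Python of what a custom expression can look like: a weighted combination of Action Unit intensities. The AU choices follow the associations listed further down this page (AU 17 and AU 23 both contribute to confusion), but the weights, the 0.0-1.0 intensity scale, and the function itself are illustrative assumptions, not FaceReader's actual formulas or API.

```python
# Minimal sketch of a custom expression as a weighted combination of
# Action Unit intensities. The weights and the 0.0-1.0 intensity scale
# are assumptions for illustration, not FaceReader's internal logic.

def confusion_score(au_intensities: dict[str, float]) -> float:
    """Combine AU 17 (Chin Raiser) and AU 23 (Lip Tightener) intensities
    into a single 'confusion' score between 0.0 and 1.0."""
    score = (
        0.6 * au_intensities.get("AU17", 0.0)    # Chin Raiser
        + 0.4 * au_intensities.get("AU23", 0.0)  # Lip Tightener
    )
    return min(max(score, 0.0), 1.0)  # clamp, in case weights change

# Example: one analyzed frame with hypothetical AU intensities.
frame = {"AU17": 0.7, "AU23": 0.5}
print(f"Confusion: {confusion_score(frame):.2f}")  # prints: Confusion: 0.62
```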

FaceReader Online

Looking for a way to study participants remotely, from any location?

Discover the benefits of FaceReader Online and read about best practices when designing your study.

Stay ahead in behavioral research!

Subscribe to Noldus Newsline

  • Discover innovations – Be the first to know about new tools and software to advance your research and get insights that help you stay ahead.
  • See real research in action – Get inspired by real-world studies from our community and discover fresh approaches for your work.
  • Access exclusive perks – Receive early updates on promotions, product releases, and events crafted for you.

Featured blog posts

How emotions are made

Neuroscience research shows that emotions are created in our brains.

Creating emotions is how our brains give meaning to our experiences and sensations. Learn more in this blog post.

5 tips to optimize your facial expression analyses

Emotion data allows researchers to gain in-depth insights into complex human behaviors.

These 5 tips will help you get the best results from your facial expression analysis!

Using Baby FaceReader for automated analysis of infant emotions

What if you had a way to understand a baby's unspoken needs?

This study highlights the benefits of analyzing facial expressions in infants.

Facial Action Units

Below you can see the 20 Action Units offered in FaceReader, as well as some frequently occurring or difficult Action Unit combinations.

Some images are zoomed in on the area of interest to show exactly which muscle movement corresponds to the specific Action Unit.

AU 1. Inner Brow Raiser

Contributes to sadness, surprise, and fear. Muscular basis: frontalis (pars medialis).

AU 2. Outer Brow Raiser

Contributes to surprise and fear. Muscular basis: frontalis (pars lateralis).

AU 4. Brow Lowerer

Contributes to sadness, fear, and anger. Muscular basis: depressor glabellae, depressor supercilii, corrugator supercilii.

AU 5. Upper Lid Raiser

Contributes to surprise, fear, and anger. Muscular basis: levator palpebrae superioris, superior tarsal muscle.

AU 6. Cheek Raiser

Contributes to happiness. Muscular basis: orbicularis oculi (pars orbitalis).

AU 7. Lid Tightener

Contributes to fear and anger. Muscular basis: orbicularis oculi (pars palpebralis).

AU 9. Nose Wrinkler

Contributes to disgust. Muscular basis: levator labii superioris alaeque nasi.

AU 10. Upper Lip Raiser

Muscular basis: levator labii superioris, caput infraorbitalis.

AU 12. Lip Corner Puller

Contributes to happiness and contempt. Muscular basis: zygomaticus major.

AU 14. Dimpler

Contributes to contempt and boredom. Muscular basis: buccinator.

AU 15. Lip Corner Depressor

Contributes to sadness and disgust. Muscular basis: depressor anguli oris.

AU 17. Chin Raiser

Contributes to the affective attitudes interest and confusion. Muscular basis: mentalis.

AU 18. Lip Pucker

Muscular basis: incisivii labii superioris, incisivii labii inferioris.

AU 20. Lip Stretcher

Contributes to the emotion fear. Muscular basis: risorius with platysma.

AU 23. Lip Tightener

Contributes to the emotion anger, and to the affective attitudes confusion and boredom. Muscular basis: orbicularis oris.

AU 24. Lip Pressor

Contributes to the affective attitude boredom. Muscular basis: orbicularis oris.

AU 25. Lips Part

Muscular basis: depressor labii inferioris, or relaxation of mentalis or orbicularis oris.

AU 26. Jaw Drop

Contributes to the emotions surprise and fear. Muscular basis: masseter; relaxed temporalis and internal pterygoid.

AU 27. Mouth Stretch

Muscular basis: pterygoids, digastric.

AU 43. Eyes Closed

Contributes to the affective attitude boredom. Muscular basis: relaxation of levator palpebrae superioris.

Combinations of Action Units

AU 1 - 2 - 4

Contributes to the emotion fear and can be recognized by the wavy pattern of the wrinkles across the forehead.

AU 1 - 2

Contributes to the emotion surprise and can be recognized by a smooth line formed by the wrinkles across the forehead.

AU 1 - 4

Contributes to sadness. Recognizable by a wavy pattern of the wrinkles in the center of the forehead; the eyebrows are drawn together and upward.

AU 4 - 5

Contributes to the emotion anger.

AU 6 - 12

Contributes to happiness. Notice the wrinkles around the eyes caused by cheek raising, also known as the "Duchenne Marker".

AU 10 - 25

Contributes to the emotion disgust. When AU 10 is activated intensely, it causes the lips to part as the upper lip is raised.

AU 18 - 23

Often mistaken for AU 18 alone. Notice how the lips almost appear to be pulled outward by a single string (AU 18) and then tightened (AU 23).

AU 23 - 24

The AUs marking lip movements are often the hardest to code. Here, the lips are pressed together (AU 24) and tightened (AU 23).
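
As a rough illustration of how these combinations can be read programmatically, the Python sketch below maps the combinations listed above to the expressions they contribute to and reports which ones are fully active in a frame. The input format (a set of active AU numbers) is an assumption for the example, not FaceReader's data model.

```python
# Illustrative sketch: maps the AU combinations listed above to the
# expressions they contribute to. The set-of-active-AUs input format is
# an assumption, not FaceReader's internal representation.

COMBINATIONS = {
    frozenset({1, 2, 4}): "fear",
    frozenset({1, 2}): "surprise",
    frozenset({1, 4}): "sadness",
    frozenset({4, 5}): "anger",
    frozenset({6, 12}): "happiness (Duchenne smile)",
    frozenset({10, 25}): "disgust",
}

def matching_expressions(active_aus: set[int]) -> list[str]:
    """Return expressions whose full AU combination is active."""
    return [label for combo, label in COMBINATIONS.items()
            if combo <= active_aus]  # subset test: every AU in combo is active

# Example: cheek raiser (AU 6), lip corner puller (AU 12), lips part (AU 25).
print(matching_expressions({6, 12, 25}))  # ['happiness (Duchenne smile)']
```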

FaceReader videos

FaceReader classifications demo

See for yourself how FaceReader classifies facial expressions!

Affective attitudes in FaceReader

Measure interest, boredom, and confusion with FaceReader's Action Unit Module.

Baby FaceReader

Measure the facial expressions of an infant automatically.

Relevant publications

Krause, F.; Franke, N. (2023). Understanding Consumer Self-Design Abandonment: A Dynamic Perspective. Journal of Marketing. https://doi.org/10.1080/00140139.2022.2157493.
De Wijk, R.; Kaneko, D.; Dijksterhuis, G.; van Bergen, G.; Vingerhoeds, M.; Visalli, M.; Zandstra, E. (2022). A preliminary investigation on the effect of immersive consumption contexts on food-evoked emotions using facial expressions and subjective ratings. Food Quality and Preference. https://doi.org/10.1016/j.foodqual.2022.104572.
Märtin, C.; Bissinger, B.C.; Asta, P. (2021). Optimizing the digital customer journey: Improving user experience by exploiting emotions, personas and situations for individualized user interface adaptations. Journal of Consumer Behaviour, 1-12. https://doi.org/10.1002/cb.1964.
Talen, L.; den Uyl, T.E. (2021). Complex Website Tasks Increase the Expression Anger Measured with FaceReader Online. International Journal of Human–Computer Interaction. https://doi.org/10.1080/10447318.2021.1938390.
Bourret, M.; Ratelle, C.F.; Plamondon, A.; Boisclair Châteauvert, G. (2023). Dynamics of parent-adolescent interactions during a discussion on career choice: The role of parental behaviors and emotions. Journal of Vocational Behavior, 141. https://doi.org/10.1016/j.jvb.2022.103837.
Liu, S.; Wang, Y.; Song, Y. (2023). Atypical facial mimicry for basic emotions in children with autism spectrum disorder. Autism Research, 16, 1375-1388.
Zaharieva, M.; Salvadori, E.; Messinger, D.; Visser, I.; Colonnesi, C. (2024). Automated facial expression measurement in a longitudinal sample of 4 and 8 month olds: Baby FaceReader 9 and manual coding of affective expressions. Behavior Research Methods. https://doi.org/10.3758/s13428-023-02301-3.
Malfait, A.; Puyvelde, M.; Detaille, F.; Neyt, X.; Waroquier, F.; Pattyn, N. (2023). Unveiling Readiness of Medical First Responders in Simulation Trainings: Insights beyond Queries. In: Jay Kalra (ed.), Emerging Technologies in Healthcare and Medicine. AHFE International Conference, AHFE Open Access, vol. 116. AHFE International, USA. https://doi.org/10.1177/01939459241233360.
Meng, Q. et al. (2020). On the effectiveness of facial expression recognition for evaluation of urban sound perception. Science of The Total Environment, 710, 135484. https://doi.org/10.1016/j.scitotenv.2019.135484.
Yang, L.; Chen, X.; Guo, Q.; Zhang, J.; Luo, M.; Chen, Y.; Zou, X.; Xu, F. (2022). Changes in facial expressions in patients with Parkinson’s disease. Computer Speech & Language, 72(3). https://doi.org/10.1016/j.csl.2021.101286.