Emotion analysis can be beneficial to researchers in decision making

David Schindler and colleagues developed a software, µCap (muCap), which is capable of creating a link between video footage and phases of the experiment, suitable for automated analysis in FaceReader.

Posted by Guest Blogger
Published on Wed 13 Jan. 2016


By David Schindler

Introducing µCap

David Schindler is the laboratory manager of MELESSA, the interdisciplinary decision-making laboratory at the University of Munich, and is currently visiting the University of Pennsylvania.

Many researchers interested in human behavior have used Noldus FaceReader to determine the emotional states of their subjects. By now it has become somewhat of a standard tool for analyzing emotions in small-scale psychological studies, for gaining insight into consumer behavior from a marketing perspective, and in many other areas.

In need of larger sample sizes?

Very recently, experimental economists and experimental psychologists who need larger sample sizes (i.e. up to 400 subjects) have taken an interest in analyzing the role of emotions in decision making. These studies are typically conducted in computerized laboratories, where subjects, across several sessions, sit in small cubicles at separate computer workstations and, with the help of experimental software, make decisions or work on different kinds of tasks.

While FaceReader is in general capable of handling large numbers of participants, it has no built-in routines to link it to the standard experimental software packages (in economics, the top dog is z-Tree [1], while psychologists use a more diverse set of tools, of which E-Prime and Qualtrics are among the most common).

FaceReader and z-Tree

The recent interest in emotional responses as correlates of decision making in economics has sparked a series of papers that use FaceReader in connection with z-Tree, e.g. [2,3,4]. This growing interest led us [5] to develop µCap (muCap), a piece of software that creates a link between video footage and phases of the experiment, making the footage suitable for automated analysis in FaceReader.


µCap

The idea behind µCap is very simple: the program constantly reads out a specific pixel in the top left corner of the screen. Whenever this pixel changes color, µCap creates a timestamp in a .csv file. This timestamp contains the exact time at which the change of color happens (on the client computer) and reflects a point of interest to the researcher.
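The core mechanism can be sketched in a few lines. The snippet below is an illustrative reconstruction, not µCap's actual code: a polling loop watches a pixel color and appends a timestamp row to a CSV whenever the color changes. For testability, the pixel is read through a stub function rather than a real screen capture (in practice a library such as Pillow's `ImageGrab` would supply the pixel).

```python
import csv
import io
from datetime import datetime

def watch_pixel(pixel_reader, clock, out, n_polls):
    """Poll a pixel color repeatedly; write a timestamp row whenever it changes.

    pixel_reader -- callable returning the current (R, G, B) of the watched pixel
    clock        -- callable returning the current local time (client computer)
    out          -- file-like object receiving the CSV rows
    """
    writer = csv.writer(out)
    last = None
    for _ in range(n_polls):
        color = pixel_reader()
        if last is not None and color != last:
            # Log the exact local time of the color change plus the new color.
            writer.writerow([clock().isoformat(), *color])
        last = color

# Simulated run: the corner pixel is white, then turns red when a phase of
# interest begins; exactly one timestamp row should be written.
colors = iter([(255, 255, 255), (255, 255, 255), (255, 0, 0), (255, 0, 0)])
buf = io.StringIO()
watch_pixel(lambda: next(colors), datetime.now, buf, 4)
print(buf.getvalue())
```

Because only *changes* are logged, the CSV stays small no matter how long a phase lasts; each row marks the instant a new phase began.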

While our release version of µCap contains a sample file showing how this color change can be triggered in z-Tree, µCap is not limited to z-Tree. Any experimental software that can change the color of a specific screen area can be used to connect to FaceReader.
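Conceptually, the experimental software only needs to map each phase of interest to a distinct corner color. The mapping below is purely hypothetical (the phase names and color codes are illustrative, not part of µCap or z-Tree):

```python
# Hypothetical phase-to-color mapping: each phase of interest gets a distinct
# color for the watched corner pixel, so the timestamp log can tell them apart.
PHASE_COLORS = {
    "instructions":    (255, 255, 255),  # white: nothing of interest
    "risky_vs_safe":   (255, 0, 0),      # red: the decision screen
    "payoff_feedback": (0, 0, 255),      # blue: the feedback screen
}

def corner_color(phase):
    """Color to paint in the top-left corner for a given experiment phase."""
    return PHASE_COLORS.get(phase, (255, 255, 255))

print(corner_color("risky_vs_safe"))  # (255, 0, 0)
```

Unknown phases fall back to white, so only explicitly marked screens produce timestamps.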

As an example, consider a decision-making experiment in which subjects choose between a risky option (e.g. receive either $5 or $0 with equal probability) and a safe option (e.g. receive $2) after having gone through several other decision problems. Just by looking at video footage of the participants’ faces, the experimenter cannot determine at which point in time subjects saw this particular decision problem. With µCap, the experimenter can trigger a color change when the decision problem of interest is first displayed, record the exact time at which this happened, and link it to the footage.

µConfig, µCap, µProject

µCap consists of three independent tools that together allow fully automated recording and FaceReader analysis of a virtually unlimited number of subjects:

  • µConfig — a visual interface in which experimenters define the color codes at which timestamps should be recorded.
  • µCap — the main tool, which records the videos and creates the timestamps.
  • µProject — automatically creates a new project in FaceReader, so that the analysis can be started with a single click.

µCap can be used free of charge in academic contexts; the latest downloadable version and the corresponding paper are available at http://mucap.david-schindler.de/.

Analyze emotions

The µCap software makes it feasible to run medium- to large-scale experiments and analyze emotions using FaceReader with only a few clicks. While we hope that the research community will benefit from our tool, we also hope that future versions of the software can benefit from input from our user base.

References

  1. Fischbacher, U. (2007). z-Tree: Zurich toolbox for ready-made economic experiments. Experimental Economics, 10(2), 171-178.
  2. Joffily, M., Masclet, D., Noussair, C. N., & Villeval, M. C. (2014). Emotions, Sanctions, and Cooperation. Southern Economic Journal, 80(4), 1002-1027.
  3. Nguyen, Y., & Noussair, C. N. (2014). Risk aversion and emotions. Pacific Economic Review, 19(3), 296-312.
  4. Breaban, A., & Noussair, C. N. (2013). Emotional state and market behavior, Working Paper.
  5. Doyle, L., & Schindler, D. (2015). MuCap: Connecting FaceReader™ to z-Tree. Munich Discussion Paper (No. 2015-4).
