How to use facial expression analysis in pain research
This blog post was written by Victor Lemain, who completed his Master's degree in Science, Management & Innovation at Radboud University in Nijmegen, The Netherlands. For his thesis, he did a market research internship at Noldus.
As the world continuously looks for new, innovative ways to conduct research as efficiently as possible, automated facial expression analysis tools can already be a big help when studying facial expressions. What if we were to use these tools in pain assessment?
Challenges in pain assessment
Assessing and measuring physical pain can be accomplished through a variety of methods that take into account factors such as behavioral cues and physiological responses.
Currently, the gold standard for pain measurement relies on self-reporting, a subjective process that creates challenges for individuals with limited verbal communication skills, such as infants, individuals with aphasia, or patients suffering from Alzheimer's disease (AD) [1][2].
Facial expression analysis in pain research
Facial expression analysis is used most in pain research with four key target groups: children under one year of age, patients with dementia, patients with Parkinson's disease, and non-communicative people with nociceptive pain [1-5].
We analyzed publications to find out more about trends in these research fields. For example, in research focused on children under one year old, more than 350 authors published a paper in 2022. Interestingly, we observed an increasing number of authors exploring this field from 2020 onwards. This trend suggests a rising recognition of the importance of facial expression analysis in understanding pain.
Understanding the needs of pain researchers
We set up a project to explore the feasibility of tools like FaceReader for applications in pain research. This included a series of interviews with experts in the field of pain research, such as dementia researchers and those studying young children. A key objective was to identify potential obstacles or challenges to using automated facial expression analysis in pain research.
To successfully apply automated facial expression analysis in pain research, validation must be addressed. This means verifying that the facial expression analysis algorithms and deep neural networks detect and interpret pain-related facial expressions accurately and reliably, and that they do so across diverse populations and settings, so that the results can be trusted.
Secondly, it's important to address the lack of specificity in pain measurement: the inability to precisely quantify and differentiate between various intensities and types of pain through facial expressions alone.
Quick and objective measurements with FaceReader
The results from interviews with pain researchers indicated that they perceived facial expression analysis tools like FaceReader as very useful and necessary in this field.
Currently, manual coding is the standard. From the interviews it became clear that this time-consuming process is a major bottleneck when studying facial expressions in pain research.
Another important advantage of automated facial expression analysis they mentioned is the objectivity of such a tool. After all, as described in the literature, biases among doctors or other pain assessors can result in different measurements of pain levels [6].
Also read: The 5 best ways to use FaceReader: a systematic review.
Next steps in pain research using FaceReader
This study investigated the feasibility of facial expression analysis in four fields of pain research. Trend analysis and interviews showed that pain researchers view automated facial expression tools as highly useful, especially for assessing pain in infants.
A promising direction for pain research is the use of custom expressions in facial expression analysis. FaceReader's newest versions allow users to create their own expression algorithms by combining parameters such as Action Units, valence, gaze direction, and head orientation. Using these custom expressions can help researchers gain more insight into pain-related facial expressions, as sketched below.
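As an illustration of what such a custom expression could look like, the sketch below combines Action Unit intensities into a pain-related composite score modeled on the Prkachin and Solomon Pain Intensity (PSPI), a widely used facial-action-based pain index. This is a generic Python sketch, not FaceReader's actual API or scripting interface; the input format and the example values are hypothetical, and any such expression would still need to be validated for the target population.

```python
# Illustrative sketch: combining Action Unit (AU) intensities into a
# pain-related composite score in the spirit of the Prkachin-Solomon
# Pain Intensity (PSPI). The per-frame dicts of AU intensities (0-5 scale,
# AU43 coded 0/1) are hypothetical stand-ins for whatever your facial
# expression analysis software exports.

def pspi_score(au: dict[str, float]) -> float:
    """PSPI = AU4 + max(AU6, AU7) + max(AU9, AU10) + AU43."""
    return (
        au.get("AU4", 0.0)                                # brow lowerer
        + max(au.get("AU6", 0.0), au.get("AU7", 0.0))     # cheek raiser / lid tightener
        + max(au.get("AU9", 0.0), au.get("AU10", 0.0))    # nose wrinkler / upper lip raiser
        + au.get("AU43", 0.0)                             # eyes closed (0 or 1)
    )

# Example: per-frame AU intensities (hypothetical values)
frames = [
    {"AU4": 2.0, "AU6": 1.5, "AU7": 2.5, "AU9": 0.5, "AU10": 1.0, "AU43": 0.0},
    {"AU4": 3.5, "AU6": 2.0, "AU7": 3.0, "AU9": 2.5, "AU10": 2.0, "AU43": 1.0},
]

scores = [pspi_score(f) for f in frames]
print(scores)  # [5.5, 10.0] -> higher values indicate a stronger pain expression
```

In practice, a researcher would tune which Action Units and thresholds go into such a composite for their specific population (for example, infants versus patients with dementia), which is exactly where custom expressions and careful validation come together.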
To implement this type of tool in pain research, it's vital to focus on FaceReader's validation for pain assessment. This will help researchers obtain the reliable results they need and move the field of pain research forward.
References
1. T. Hadjistavropoulos et al., "A biopsychosocial formulation of pain communication," Psychological Bulletin, vol. 137, no. 6, p. 910, 2011.
2. S. Lautenbacher and M. Kunz, "Facial pain expression in dementia: a review of the experimental and clinical evidence," Current Alzheimer Research, vol. 14, no. 5, pp. 501-505, 2017.
3. Z. Chen, R. Ansari, and D. Wilkie, "Automated pain detection from facial expressions using FACS: A review," arXiv preprint arXiv:1811.07988, 2018.
4. E. Pegolo, D. Volpe, A. Cucca, L. Ricciardi, and Z. Sawacha, "Quantitative evaluation of hypomimia in Parkinson's disease: A face tracking approach," Sensors (Basel), vol. 22, no. 4, 2022, doi: 10.3390/s22041358.
5. A. C. Williams, "Facial expression of pain: an evolutionary account," Behavioral and Brain Sciences, vol. 25, no. 4, pp. 439-455; discussion 455-488, 2002, doi: 10.1017/s0140525x02000080.
6. T. R. Dawes et al., "Objectively measuring pain using facial expression: Is the technology finally ready?," Pain Management, vol. 8, no. 2, pp. 105-113, 2018, doi: 10.2217/pmt-2017-0049.