Personalized Automatic Estimation of Self-reported Pain Intensity from Facial Expressions
Pain is a personal, subjective experience that is commonly evaluated through
visual analog scales (VAS). While this is often convenient and useful,
automatic pain detection systems can reduce pain score acquisition effort in
large-scale studies by estimating pain intensity directly from participants'
facial expressions. In this paper, we propose a novel two-stage learning approach for
VAS estimation: first, our algorithm employs Recurrent Neural Networks (RNNs)
to automatically estimate Prkachin and Solomon Pain Intensity (PSPI) levels
from face images. The estimated scores are then fed into personalized
Hidden Conditional Random Fields (HCRFs), which estimate the VAS reported by
each person. Personalization of the model is performed using a newly introduced
facial expressiveness score, unique for each person. To the best of our
knowledge, this is the first approach to automatically estimate VAS from face
images. We show the benefits of the proposed personalized approach over a
traditional non-personalized approach on a benchmark dataset for pain analysis
from face images. Comment: Computer Vision and Pattern Recognition Conference,
The 1st International Workshop on Deep Affective Learning and Context Modeling
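The two-stage pipeline above can be sketched in a few lines of numpy. This is a minimal illustration under stated assumptions, not the paper's method: the RNN over face images is replaced by a linear map over hypothetical per-frame feature vectors, and the personalized HCRF is replaced by a scaled linear readout; all dimensions, weights, and the expressiveness score are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Stage 1 (stand-in): frame-level PSPI from facial features.
# The paper uses an RNN over face images; a linear map over hypothetical
# per-frame features keeps the sketch short. PSPI ranges over [0, 16].
def estimate_pspi(frames, w):
    """frames: (T, D) feature matrix -> (T,) PSPI estimates."""
    return np.clip(frames @ w, 0.0, 16.0)

# --- Stage 2 (stand-in): sequence-level VAS from PSPI statistics,
# personalized by a per-subject expressiveness score (the paper's HCRF
# is replaced here by a scaled linear readout for illustration).
def estimate_vas(pspi, expressiveness):
    stats = np.array([pspi.mean(), pspi.max(), pspi.std()])
    raw = stats @ np.array([0.4, 0.5, 0.1])  # illustrative weights
    # A less expressive face yields lower PSPI for the same felt pain,
    # so divide by the expressiveness score to compensate.
    return float(np.clip(raw / max(expressiveness, 1e-6), 0.0, 10.0))

frames = rng.normal(size=(120, 8))   # 120 frames, 8-dim toy features
w = rng.normal(size=8) * 0.3         # hypothetical stage-1 weights
pspi = estimate_pspi(frames, w)
vas = estimate_vas(pspi, expressiveness=0.8)
print(round(vas, 2))
```

The key design point the sketch preserves is the decoupling: frame-level PSPI is estimated first, and only its sequence-level summary, adjusted per person, is mapped to the self-reported VAS.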
Multi-task multiple kernel machines for personalized pain recognition from functional near-infrared spectroscopy brain signals
Currently there is no validated objective measure of pain. Recent
neuroimaging studies have explored the feasibility of using functional
near-infrared spectroscopy (fNIRS) to measure alterations in brain function in
evoked and ongoing pain. In this study, we applied multi-task machine learning
methods to derive a practical pain-detection algorithm from fNIRS signals in
healthy volunteers exposed to a painful stimulus. In particular, we
employed multi-task multiple kernel learning to account for the inter-subject
variability in pain response. Our results support the use of fNIRS and machine
learning techniques for developing objective pain detection, and highlight
the importance of adopting personalized analysis in the process. Comment: International Conference on Pattern Recognition (ICPR)
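The core idea of multiple kernel learning, combining several base kernels with shared weights and then fitting one model per subject ("task"), can be sketched as follows. This is a simplified stand-in, not the paper's algorithm: kernel weights are fixed rather than learned jointly, the SVM-style learner is replaced by kernel ridge regression, and the fNIRS features and labels are synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)

def rbf_kernel(X, Y, gamma):
    """Gaussian (RBF) base kernel between row vectors of X and Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# Hypothetical fNIRS data: 3 subjects ("tasks"), 30 trials each with
# 5-channel features; labels are pain (1) vs. no-pain (0).
subjects = [(rng.normal(size=(30, 5)),
             rng.integers(0, 2, 30).astype(float)) for _ in range(3)]

gammas = [0.1, 1.0]          # one base kernel per bandwidth
beta = np.array([0.5, 0.5])  # kernel weights, shared across subjects;
                             # MT-MKL would learn these jointly

def fit_predict(X, y, lam=1e-1):
    # Combined kernel K = sum_m beta_m K_m, then kernel ridge regression
    # (a stand-in for the SVM-style learner used with MKL).
    K = sum(b * rbf_kernel(X, X, g) for b, g in zip(beta, gammas))
    alpha = np.linalg.solve(K + lam * np.eye(len(y)), y)
    return K @ alpha

accs = []
for X, y in subjects:
    pred = fit_predict(X, y)
    accs.append(float(((pred > 0.5) == (y > 0.5)).mean()))
print([round(a, 2) for a in accs])
```

The multi-task aspect enters through what is shared: the base kernels and their weights are common to all subjects, while each subject gets their own decision function, which is one way to absorb inter-subject variability in pain response.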
Automatic Estimation of Self-Reported Pain by Interpretable Representations of Motion Dynamics
We propose an automatic method for pain intensity measurement from video. For
each video, pain intensity was measured from the dynamics of facial movement,
captured by 66 facial points. A Gram-matrix formulation was used to represent
the facial point trajectories on the Riemannian manifold of symmetric positive
semi-definite matrices of fixed rank. Curve fitting and temporal alignment were
then used to smooth the extracted trajectories. A Support Vector Regression
model was then trained to encode the extracted trajectories into ten pain
intensity levels consistent with the Visual Analogue Scale for pain intensity
measurement. The proposed approach was evaluated using the UNBC-McMaster
Shoulder Pain Archive and was compared to the state-of-the-art on the same
data. Using both 5-fold cross-validation and leave-one-subject-out
cross-validation, our results are competitive with state-of-the-art
methods. Comment: accepted at the ICPR 2020 Conference
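The Gram-matrix representation above is easy to make concrete: centering the 66 landmarks removes translation, and G = P Pᵀ is a rank-≤2 positive semi-definite matrix that is invariant to rotation. The sketch below follows that idea with invented data; the paper's curve fitting, temporal alignment, and Support Vector Regression are replaced by a simple temporal mean and kernel ridge regression, so it illustrates the representation rather than reproducing the method.

```python
import numpy as np

rng = np.random.default_rng(2)

def gram(landmarks):
    """landmarks: (66, 2) facial points -> (66, 66) Gram matrix.
    Centering removes translation; G = P @ P.T is rotation-invariant
    and lies on the manifold of PSD matrices of rank <= 2."""
    P = landmarks - landmarks.mean(axis=0)
    return P @ P.T

# Hypothetical data: 40 videos, 20 frames each, 66 2-D landmarks,
# with VAS-style labels on a 0-10 scale.
videos = rng.normal(size=(40, 20, 66, 2))
labels = rng.integers(0, 11, size=40).astype(float)

# Trajectory descriptor: mean Gram matrix over frames, upper triangle
# only (a crude stand-in for curve fitting + temporal alignment).
iu = np.triu_indices(66)
def describe(video):
    G = np.mean([gram(f) for f in video], axis=0)
    return G[iu]

X = np.stack([describe(v) for v in videos])  # (40, 2211)

# Kernel ridge regression (linear kernel, dual form) as a stand-in
# for the paper's Support Vector Regression.
lam = 1.0
K = X @ X.T                                   # (40, 40)
alpha = np.linalg.solve(K + lam * np.eye(len(labels)), labels)
pred = np.clip(K @ alpha, 0.0, 10.0)
print(pred.shape)
```

Working in the dual (a 40x40 system instead of a 2211x2211 one) is just a computational convenience; the representation itself, Gram matrices of centered landmark configurations, is the part taken from the abstract.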
Associations between facial expressions and observational pain in residents with dementia and chronic pain
Aim: To identify specific facial expressions associated with pain behaviors using the PainChek application in residents with dementia.

Design: This is a secondary analysis from a study exploring the feasibility of PainChek to evaluate the effectiveness of a social robot (PARO) intervention on pain for residents with dementia from June to November 2021.

Methods: Participants experienced PARO individually five days per week for 15 min (once or twice) per day for three consecutive weeks. The PainChek app assessed each resident's pain levels before and after each session. The association between nine facial expressions and the adjusted PainChek scores was analyzed using a linear mixed model.

Results: A total of 1820 assessments were completed with 46 residents. Six facial expressions were significantly associated with a higher adjusted PainChek score. Horizontal mouth stretch showed the strongest association with the score, followed by brow lowering, parting lips, wrinkling of the nose, raising of the upper lip and closing eyes. However, the presence of cheek raising, tightening of eyelids and pulling at the corner lip were not significantly associated with the score. Limitations of using the PainChek app were identified.

Conclusion: Six specific facial expressions were associated with observational pain scores in residents with dementia. Results indicate that automated real-time facial analysis is a promising approach to assessing pain in people with dementia. However, it requires further validation by human observers before it can be used for decision-making in clinical practice.

Impact: Pain is common in people with dementia, while assessing pain is challenging in this group. This study generated new evidence of facial expressions of pain in residents with dementia. Results will inform the development of valid artificial intelligence-based algorithms that will support healthcare professionals in identifying pain in people with dementia in clinical situations.

Reporting Method: The study adheres to the CONSORT reporting guidelines.

Patient or Public Contribution: One resident with dementia and two family members of people with dementia were consulted and involved in the study design, where they provided advice on the protocol, information sheets and consent forms, and offered valuable insights to ensure research quality and relevance.
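The analysis in this study, associating binary facial-expression indicators with an observational pain score, can be sketched with synthetic data. This is a simplified illustration, not the study's model: ordinary least squares stands in for the linear mixed model (which additionally includes per-resident random effects), and every number below, including the nine indicator columns and the coefficient values, is invented.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical data: 1820 assessments, 9 binary facial-action indicators
# (e.g. brow lowering, parting lips) and a continuous pain score.
n, p = 1820, 9
X = rng.integers(0, 2, size=(n, p)).astype(float)

# Invented ground truth: the first six actions raise the score (the
# first most strongly), the last three have no effect.
true_beta = np.array([0.9, 0.5, 0.4, 0.3, 0.3, 0.2, 0.0, 0.0, 0.0])
score = 1.0 + X @ true_beta + rng.normal(scale=0.5, size=n)

# Ordinary least squares with an intercept, as a stand-in for the
# study's linear mixed model.
Z = np.column_stack([np.ones(n), X])
beta_hat, *_ = np.linalg.lstsq(Z, score, rcond=None)
coefs = beta_hat[1:]
print(np.argmax(coefs))  # index of the most strongly associated action
```

The real analysis needs the mixed-model structure because the 1820 assessments are repeated measures nested within 46 residents; a per-resident random intercept (e.g. via `statsmodels` MixedLM) would account for that correlation, which plain OLS ignores.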