Automated detection of pain levels using deep feature extraction from shutter blinds‑based dynamic‑sized horizontal patches with facial images
Pain intensity classification using facial images is a challenging problem in computer vision research.
This work proposes a patch- and transfer-learning-based model to classify various pain intensities
using facial images. The input facial images were segmented into dynamic-sized horizontal patches,
or "shutter blinds". A lightweight deep network, DarkNet19, pre-trained on ImageNet1K, was used
to generate deep features from the shutter blinds and from the undivided, resized segmented input facial
image. The most discriminative of these deep features were selected using iterative
neighborhood component analysis and fed to a standard shallow fine k-nearest-neighbor
classifier, evaluated with tenfold cross-validation. The proposed shutter blinds-based model
was trained and tested on datasets derived from two public databases—University of Northern
British Columbia-McMaster Shoulder Pain Expression Archive Database and Denver Intensity of
Spontaneous Facial Action Database—which both comprised four pain intensity classes that had
been labeled by human experts using validated facial action coding system methodology. Our shutter
blinds-based classification model attained more than 95% overall accuracy rates on both datasets.
The excellent performance suggests that this automated pain intensity classification model can be
deployed to assist doctors in the non-verbal detection of pain from facial images in various situations
(e.g., non-communicative patients or during surgery), facilitating timely detection and
management of pain.
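The pipeline described above can be sketched with scikit-learn, substituting plain neighborhood component analysis for the paper's iterative variant and random arrays for the DarkNet19 patch features; all sizes, names, and the choice of k are illustrative assumptions, not the authors' code.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier, NeighborhoodComponentsAnalysis
from sklearn.pipeline import Pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_images, n_patches, feat_dim = 120, 4, 16   # toy sizes
# Random stand-ins for deep features of each "shutter blind" patch
# concatenated with features of the whole resized face.
X = rng.normal(size=(n_images, (n_patches + 1) * feat_dim))
y = rng.integers(0, 4, size=n_images)        # four pain-intensity classes

clf = Pipeline([
    # plain NCA as a stand-in for the paper's iterative NCA selection
    ("nca", NeighborhoodComponentsAnalysis(n_components=16, random_state=0)),
    ("knn", KNeighborsClassifier(n_neighbors=1)),  # "fine" kNN: small k
])
scores = cross_val_score(clf, X, y, cv=10)   # tenfold cross-validation
```

On real patch features the per-fold accuracies in `scores` would be averaged to report the overall rate.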
Facial Expression Recognition Based on Deep Learning Convolution Neural Network: A Review
Facial emotional processing is one of the most important activities in affective computing, human-computer interaction, machine vision, video-game testing, and consumer research. Facial expressions are a form of nonverbal communication, as they reveal a person's inner feelings and emotions. Facial Expression Recognition (FER) has recently received extensive attention, as facial expressions are considered the fastest medium for communicating any kind of information. Facial expression recognition gives a better understanding of a person's thoughts or views, and the currently trending deep learning methods have raised accuracy rates sharply compared to traditional state-of-the-art systems. This article provides a brief overview of the different FER fields of application and the publicly accessible databases used in FER, and surveys the latest and current reviews of FER using Convolutional Neural Network (CNN) algorithms. Finally, it is observed that all the surveyed works reached good results, especially in terms of accuracy, though with different rates and on different datasets, which affects how the results can be compared.
Thermal Emotion Recognition
To improve computer-human interactions in the areas of healthcare, e-learning and video
games, many researchers have studied recognizing emotions from text, speech, facial
expressions, or electroencephalography (EEG) signals. Among them,
emotion recognition using EEG has achieved satisfying accuracy. However, wearing
electroencephalography devices limits the range of user movement, so a noninvasive method
is required to facilitate emotion detection and its applications. We therefore proposed using a
thermal camera to capture skin-temperature changes and then applying machine learning
algorithms to classify emotion changes accordingly. This thesis contains two studies on thermal
emotion detection in comparison with EEG-based emotion detection. One was to identify
thermal emotion-detection profiles by comparison with EEG-based emotion detection technology;
the other was to implement an application with deep machine learning algorithms to visually
display both thermal and EEG based emotion detection accuracy and performance. In the first
research, we applied an HMM to thermal emotion recognition and, after comparing with EEG-based
emotion detection, identified emotion-related skin-temperature features in terms of intensity
and rapidity. In the second research, we implemented an emotion detection application
supporting both thermal and EEG-based emotion detection by applying the
deep machine learning methods Convolutional Neural Network (CNN) and Long Short-Term
Memory (LSTM). Thermal-image-based emotion detection achieved an accuracy of 52.59%,
and EEG-based detection achieved 67.05%. In further studies, we will do more
research on adjusting the machine learning algorithms to improve thermal emotion detection
accuracy.
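The recurrent half of the CNN/LSTM pairing above can be illustrated with a minimal NumPy forward pass of a single LSTM cell over a toy skin-temperature feature sequence; the weights are random and all sizes and names are illustrative assumptions, with training and the CNN branch omitted.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_forward(x_seq, W, U, b, h0, c0):
    """Run one LSTM layer over a sequence of feature vectors."""
    h, c = h0, c0
    H = h0.shape[0]
    for x in x_seq:
        z = W @ x + U @ h + b        # all four gate pre-activations at once
        i = sigmoid(z[0:H])          # input gate
        f = sigmoid(z[H:2 * H])      # forget gate
        o = sigmoid(z[2 * H:3 * H])  # output gate
        g = np.tanh(z[3 * H:4 * H])  # candidate cell state
        c = f * c + i * g
        h = o * np.tanh(c)
    return h

rng = np.random.default_rng(0)
D, H, T = 8, 16, 30                  # feature dim, hidden size, timesteps
x_seq = rng.normal(size=(T, D))      # toy thermal feature sequence
W = rng.normal(scale=0.1, size=(4 * H, D))
U = rng.normal(scale=0.1, size=(4 * H, H))
b = np.zeros(4 * H)
h_final = lstm_forward(x_seq, W, U, b, np.zeros(H), np.zeros(H))
# h_final would feed a softmax layer over the emotion classes
```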
Application of Texture Descriptors to Facial Emotion Recognition in Infants
The recognition of facial emotions is an important issue in computer vision and artificial intelligence due to its significant academic and commercial potential. In the health sector, the ability to detect and monitor patients' emotions, mainly pain, is a fundamental objective within any medical service. Nowadays, the evaluation of pain in patients depends mainly on continuous monitoring by the medical staff when the patient is unable to express his/her experience of pain verbally, as is the case for patients under sedation or babies. It is therefore necessary to provide alternative methods for its evaluation and detection. Facial expressions can be considered a valid indicator of a person's degree of pain. Consequently, this paper presents a monitoring system for babies that uses an automatic pain detection system by means of image analysis, which could be accessed through wearable or mobile devices. To do this, the paper makes use of three different texture descriptors for pain detection: Local Binary Patterns, Local Ternary Patterns, and Radon Barcodes. These descriptors are used together with Support Vector Machines (SVM) for classification. The experimental results show that the proposed features give a very promising classification accuracy of around 95% on the Infant COPE database, which proves the validity of the proposed method. This work has been partially supported by the Spanish Research Agency (AEI) and the European Regional Development Fund (FEDER) under project CloudDriver4Industry TIN2017-89266-R, and by the Conselleria de Educación, Investigación, Cultura y Deporte of the Community of Valencia, Spain, within the program of support for research, under project AICO/2017/134.
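The Local Binary Patterns + SVM route described above can be sketched with a plain-NumPy 8-neighbor LBP feeding a scikit-learn SVM; the descriptor parameters, image sizes, and labels below are illustrative stand-ins, not the paper's actual setup.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def lbp_codes(img):
    """8-neighbor LBP: threshold each neighbor against the center pixel."""
    c = img[1:-1, 1:-1]
    neighbors = [img[:-2, :-2], img[:-2, 1:-1], img[:-2, 2:],
                 img[1:-1, 2:], img[2:, 2:], img[2:, 1:-1],
                 img[2:, :-2], img[1:-1, :-2]]
    code = np.zeros(c.shape, dtype=np.int64)
    for bit, n in enumerate(neighbors):
        code += (n >= c).astype(np.int64) << bit  # one bit per neighbor
    return code

def lbp_histogram(img):
    """Normalized 256-bin histogram of LBP codes as the texture descriptor."""
    hist, _ = np.histogram(lbp_codes(img), bins=256, range=(0, 256))
    return hist / hist.sum()

rng = np.random.default_rng(0)
imgs = rng.random((60, 32, 32))      # toy stand-ins for infant face crops
y = rng.integers(0, 2, size=60)      # pain / no-pain labels
X = np.array([lbp_histogram(im) for im in imgs])
scores = cross_val_score(SVC(kernel="rbf"), X, y, cv=5)  # LBP + SVM
```

The other two descriptors (Local Ternary Patterns, Radon Barcodes) would simply replace `lbp_histogram` as the feature extractor.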
Chronic-Pain Protective Behavior Detection with Deep Learning
In chronic pain rehabilitation, physiotherapists adapt physical activity to
patients' performance based on their expression of protective behavior,
gradually exposing them to feared but harmless and essential everyday
activities. As rehabilitation moves outside the clinic, technology should
automatically detect such behavior to provide similar support. Previous works
have shown the feasibility of automatic protective behavior detection (PBD)
within a specific activity. In this paper, we investigate the use of deep
learning for PBD across activity types, using wearable motion capture and
surface electromyography data collected from healthy participants and people
with chronic pain. We approach the problem by continuously detecting protective
behavior within an activity rather than estimating its overall presence. The
best performance reaches a mean F1 score of 0.82 with leave-one-subject-out
cross-validation. When protective behavior is modelled per activity type, the mean F1
scores are 0.77 for bend-down, 0.81 for one-leg-stand, 0.72 for
sit-to-stand, 0.83 for stand-to-sit, and 0.67 for reach-forward. This
performance reaches an excellent level of agreement with the average experts'
rating performance, suggesting potential for personalized chronic pain
management at home. We analyze various parameters characterizing our approach
to understand how the results could generalize to other PBD datasets and
different levels of ground truth granularity.
Comment: 24 pages, 12 figures, 7 tables. Accepted by ACM Transactions on Computing for Healthcare.
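The leave-one-subject-out evaluation with a mean F1 score can be sketched with scikit-learn; here random windows stand in for the motion-capture and sEMG frames, and a logistic-regression model stands in for the deep network (all sizes and model choices are illustrative assumptions).

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.metrics import f1_score

rng = np.random.default_rng(0)
n_windows, n_feats, n_subjects = 200, 20, 10
X = rng.normal(size=(n_windows, n_feats))      # toy movement/sEMG windows
y = rng.integers(0, 2, size=n_windows)         # protective / not protective
groups = rng.integers(0, n_subjects, size=n_windows)  # subject ids

# Hold out every window of one subject at a time, then average F1.
f1s = []
for train, test in LeaveOneGroupOut().split(X, y, groups):
    clf = LogisticRegression(max_iter=1000).fit(X[train], y[train])
    f1s.append(f1_score(y[test], clf.predict(X[test]), zero_division=0))
mean_f1 = float(np.mean(f1s))
```

Grouping the splits by subject, rather than shuffling windows freely, is what prevents a model from being tested on movement patterns of a person it has already seen.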
The role of HG in the analysis of temporal iteration and interaural correlation