    Automatic Pain Assessment Through Facial Expressions

    Pain is a strong symptom of disease. Being an involuntary unpleasant feeling, it can be considered a reliable indicator of health issues. Pain has traditionally been expressed verbally, but in some cases patient self-reporting is not effective. On one hand, there are patients with neurological disorders who cannot express themselves accurately, as well as patients who suddenly lose consciousness due to abrupt fainting. On the other hand, medical staff working in crowded hospitals need to focus on emergencies and would benefit from automating the monitoring of hospitalized patients throughout their stay, in order to notice any pain-related emergency. These issues can be tackled with deep learning. Since pain is generally accompanied by spontaneous facial behaviors, facial expressions can be used as a substitute for verbal reporting to express pain. That is, with the help of image processing techniques, an automatic pain assessment system can be implemented to analyze facial expressions and detect existing pain. In this project, a convolutional neural network model was built and trained to detect pain through patients’ facial expressions, using the UNBC-McMaster Shoulder Pain dataset [25]. First, faces were detected in images using the Haarcascade Frontal Face Detector [12], provided by OpenCV [26], and preprocessed through grayscale conversion, histogram equalization, image cropping, mean filtering, and normalization. Next, the preprocessed images were fed into a CNN model built on a modified version of the VGG16 architecture. Finally, the model was iteratively evaluated and fine-tuned based on its accuracy.
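    The preprocessing pipeline described in the abstract (grayscale conversion, histogram equalization, mean filtering, normalization) could be sketched roughly as follows. This is a minimal NumPy illustration, not the authors' implementation; the function names are illustrative, and the face crop is assumed to have already been produced by OpenCV's Haar cascade detector:

    ```python
    import numpy as np

    def to_grayscale(img):
        """Convert an H x W x 3 RGB image to grayscale (luminosity weights)."""
        return (img @ np.array([0.299, 0.587, 0.114])).astype(np.uint8)

    def equalize_histogram(gray):
        """Spread pixel intensities over the full 0-255 range via the CDF."""
        hist = np.bincount(gray.ravel(), minlength=256)
        cdf = hist.cumsum()
        cdf_min = cdf[cdf > 0][0]
        lut = np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * 255).astype(np.uint8)
        return lut[gray]

    def mean_filter(gray, k=3):
        """Smooth with a k x k box filter; borders handled by edge padding."""
        padded = np.pad(gray.astype(np.float64), k // 2, mode="edge")
        out = np.zeros(gray.shape, dtype=np.float64)
        for dy in range(k):
            for dx in range(k):
                out += padded[dy:dy + gray.shape[0], dx:dx + gray.shape[1]]
        return (out / (k * k)).astype(np.uint8)

    def preprocess(face_crop):
        """Full pipeline on an already-cropped face region (hypothetical)."""
        gray = to_grayscale(face_crop)
        gray = equalize_histogram(gray)
        gray = mean_filter(gray)
        return gray.astype(np.float32) / 255.0  # normalize to [0, 1]
    ```

    The normalized output would then be resized to the input shape expected by the VGG16-style CNN before training.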

    Real-time pain detection in facial expressions for health robotics

    Automatic pain detection is an important challenge in health computing. In this paper we report on our efforts to develop a real-time, real-world pain detection system based on human facial expressions. Although many studies have addressed this challenge, most of them use the same dataset for training and testing. There is no cross-check with other datasets, nor any real-time deployment to check performance on new data. This is problematic, as evidenced in this paper, because the classifiers overfit to dataset-specific features, which limits real-time, real-world usage. In this paper, we investigate different methods of real-time pain detection. Unlike previous work, our training data combines pain and emotion datasets. The best model achieves an accuracy of 88.4% on a dataset comprising pain and 7 non-pain emotional expressions. The results suggest that convolutional neural networks (CNNs) are not the best method in some cases, as they easily overfit when the dataset is biased. Finally, we implemented our pain detection method on a humanoid robot for physiotherapy. Our work highlights the importance of cross-corpus evaluation and real-time testing, as well as the need for a well-balanced and ecologically valid pain dataset.
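    The cross-corpus evaluation advocated above amounts to training on one dataset and measuring accuracy on a different one. A minimal sketch of such a harness, using a simple nearest-centroid classifier as a stand-in for the paper's actual models (all names and the classifier choice are illustrative assumptions):

    ```python
    import numpy as np

    def train_centroids(features, labels):
        """Fit a nearest-centroid classifier: one mean vector per class."""
        classes = np.unique(labels)
        centroids = np.stack([features[labels == c].mean(axis=0) for c in classes])
        return classes, centroids

    def accuracy(classes, centroids, features, labels):
        """Classify each sample by its nearest centroid; return accuracy."""
        dists = np.linalg.norm(features[:, None, :] - centroids[None, :, :], axis=2)
        preds = classes[np.argmin(dists, axis=1)]
        return float((preds == labels).mean())

    def cross_corpus_eval(corpus_a, corpus_b):
        """Train on corpus A; report within-corpus vs. cross-corpus accuracy.

        A large gap between the two numbers signals that the model has
        latched onto dataset-specific features rather than pain cues.
        """
        classes, centroids = train_centroids(*corpus_a)
        within = accuracy(classes, centroids, *corpus_a)
        cross = accuracy(classes, centroids, *corpus_b)
        return within, cross
    ```

    The same harness applies unchanged to a CNN: only `train_centroids` and the prediction step would be swapped for the model's fit/predict calls.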