
    Brain processing of contagious itch in patients with atopic dermatitis

    Several studies show that itch and scratching can be induced not only by pruritogens such as histamine or cowhage, but also by certain (audio-)visual stimuli, such as pictures of crawling insects or videos of other people scratching. This phenomenon is termed contagious itch (CI). Because CI is more pronounced in patients with the chronic itchy skin disease atopic dermatitis (AD), studying brain processing of CI in this group is highly relevant. Knowledge of the brain areas involved in CI in AD patients can provide useful hints regarding non-invasive treatments from which AD patients could profit when confronted with itch-inducing situations in daily life. This study therefore investigated the brain processing of CI in AD patients. Eleven AD patients underwent fMRI scans during the presentation of an itch-inducing experimental video (EV) and a non-itch-inducing control video (CV). Perfusion-based brain activity was measured using arterial spin labeling functional MRI. As expected, the EV compared to the CV led to an increase in itch and scratching (p < 0.05). CI led to a significant increase in brain activity in the supplementary motor area, left ventral striatum and right orbitofrontal cortex (threshold: p < 0.001; cluster size k > 50). Moreover, itch induced by watching the EV was by trend correlated with activity in memory-related regions including the temporal cortex and the (pre-)cuneus, as well as the posterior operculum, a brain region involved in itch processing (threshold: p < 0.005; cluster size k > 50). These findings suggest that the fronto-striatal circuit, which is associated with the desire to scratch, might be a target region for non-invasive treatments in AD patients. © 2017 Schut, Mochizuki, Grossman, Lin, Conklin, Mohamed, Gieler, Kupfer and Yosipovitch

    Personalized Automatic Estimation of Self-reported Pain Intensity from Facial Expressions

    Pain is a personal, subjective experience that is commonly evaluated through visual analog scales (VAS). While this is often convenient and useful, automatic pain detection systems can reduce pain score acquisition efforts in large-scale studies by estimating it directly from participants' facial expressions. In this paper, we propose a novel two-stage learning approach for VAS estimation: first, our algorithm employs Recurrent Neural Networks (RNNs) to automatically estimate Prkachin and Solomon Pain Intensity (PSPI) levels from face images. The estimated scores are then fed into personalized Hidden Conditional Random Fields (HCRFs), which estimate the VAS reported by each person. Personalization of the model is performed using a newly introduced facial expressiveness score, unique to each person. To the best of our knowledge, this is the first approach to automatically estimate VAS from face images. We show the benefits of the proposed personalized approach over a traditional non-personalized approach on a benchmark dataset for pain analysis from face images. Comment: Computer Vision and Pattern Recognition Conference, The 1st International Workshop on Deep Affective Learning and Context Modeling
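    The two-stage pipeline described in this abstract (frame-level PSPI estimation by an RNN, then a personalized model mapping the PSPI sequence to a single VAS score) can be sketched as follows. This is a minimal illustration only: the weights, the simple Elman-style recurrence, and the linear per-person calibration (`personalized_vas`, scaled by an expressiveness factor) are stand-ins for the paper's trained RNN and personalized HCRFs, not the authors' implementation.

    ```python
    import numpy as np

    def rnn_pspi(frames, Wx, Wh, w_out):
        """Stage 1 sketch: a minimal Elman-style recurrent pass that emits
        one PSPI estimate per face-frame feature vector."""
        h = np.zeros(Wh.shape[0])
        pspi = []
        for x in frames:
            h = np.tanh(Wx @ x + Wh @ h)  # update hidden state from frame + history
            pspi.append(w_out @ h)        # scalar PSPI estimate for this frame
        return np.array(pspi)

    def personalized_vas(pspi_seq, expressiveness, a=0.5, b=0.0):
        """Stage 2 stand-in: map the PSPI sequence to one VAS score, scaled by a
        per-person expressiveness factor. The paper uses personalized HCRFs here;
        this linear calibration is purely illustrative."""
        return a * pspi_seq.mean() / max(expressiveness, 1e-6) + b
    ```

    A highly expressive person (large `expressiveness`) needs a smaller VAS per unit of observed PSPI, which is the intuition the division encodes; the actual HCRF learns this mapping per person rather than assuming a linear form.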

    Embodying functionally relevant action sounds in patients with spinal cord injury

    Growing evidence indicates that perceptual-motor codes may be associated with and influenced by actual bodily states. Following a spinal cord injury (SCI), for example, individuals exhibit reduced visual sensitivity to biological motion. However, a dearth of direct evidence exists about whether profound alterations in sensorimotor traffic between the body and brain influence audio-motor representations. We tested 20 wheelchair-bound individuals with lower skeletal-level SCI who were unable to feel and move their lower limbs but had retained upper limb function. In a two-choice, matching-to-sample auditory discrimination task, the participants were asked to determine which of two action sounds matched a sample action sound presented previously. We tested aural discrimination ability using sounds that arose from wheelchair, upper limb, lower limb, and animal actions. Our results indicate that an inability to move the lower limbs did not lead to impairment in the discrimination of lower limb-related action sounds in SCI patients. Importantly, patients with SCI discriminated wheelchair sounds more quickly than individuals with comparable auditory experience (i.e. physical therapists) and inexperienced, able-bodied subjects. Audio-motor associations appear to be modified and enhanced to incorporate external salient tools that now represent extensions of the body schema

    Towards a comprehensive 3D dynamic facial expression database

    Human faces play an important role in everyday life, including the expression of person identity, emotion and intentionality, along with a range of biological functions. The human face has also become the subject of considerable research effort, and there has been a shift towards understanding it using stimuli of increasingly more realistic formats. In the current work, we outline progress made in the production of a database of facial expressions in arguably the most realistic format, 3D dynamic. A suitable architecture for capturing such 3D dynamic image sequences is described and then used to record seven expressions (fear, disgust, anger, happiness, surprise, sadness and pain) by 10 actors at 3 levels of intensity (mild, normal and extreme). We also present details of a psychological experiment that was used to formally evaluate the accuracy of the expressions in a 2D dynamic format. The result is an initial, validated database for researchers and practitioners. The goal is to scale up the work with more actors and expression types

    Pain Level Detection From Facial Image Captured by Smartphone

    Accurate, regular reporting of cancer patients' symptoms is a major concern for medical service providers, since it informs clinical decisions such as medication adjustment. Because patients have limited ability to provide self-reported symptoms, we investigated how a mobile phone application can help in this setting. We used facial images captured by smartphone to detect pain level accurately. The pain detection process builds on existing algorithms and infrastructure so that the system remains low-cost and user-friendly for cancer patients. To our knowledge, this pain management solution is the first mobile-based study of its kind. The proposed algorithm classifies faces, each represented as a weighted combination of Eigenfaces, using angular distance and support vector machines (SVMs) for the classification system. In this study, longitudinal data were collected for six months in Bangladesh, and cross-sectional pain images were collected from three countries: Bangladesh, Nepal and the United States. We found that a personalized model performs better for automatic pain assessment, and that the training set should contain varying levels of pain in each group: low, medium and high
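    The representation step named in this abstract (a face expressed as a weighted combination of Eigenfaces, compared by angular distance) can be sketched as below. This is an illustrative reconstruction, not the paper's code: the PCA-via-SVD basis, the nearest-class rule, and the synthetic class weights are assumptions, and the nearest-centroid step stands in for the SVM classifier the abstract describes.

    ```python
    import numpy as np

    def eigenfaces(train_faces, k=3):
        """PCA on flattened face vectors: returns the mean face and the
        top-k principal directions ('eigenfaces')."""
        mean = train_faces.mean(axis=0)
        centered = train_faces - mean
        _, _, Vt = np.linalg.svd(centered, full_matrices=False)
        return mean, Vt[:k]

    def project(face, mean, basis):
        """Represent a face as its weights on the eigenface basis."""
        return basis @ (face - mean)

    def angular_distance(u, v):
        """Angle between two weight vectors; small angle = similar faces."""
        cos = u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12)
        return np.arccos(np.clip(cos, -1.0, 1.0))

    def classify(face, mean, basis, class_weights):
        """Assign the pain-level class whose reference weights are nearest in
        angular distance (an SVM could replace this step, as in the abstract)."""
        w = project(face, mean, basis)
        return min(class_weights, key=lambda c: angular_distance(w, class_weights[c]))
    ```

    Angular distance is scale-invariant, so two faces with the same expression pattern but different overall contrast still match; that property is one common reason it is preferred over Euclidean distance in Eigenface pipelines.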