54 research outputs found

    Increased pain intensity is associated with greater verbal communication difficulty and increased production of speech and co-speech gestures

    Effective pain communication is essential if adequate treatment and support are to be provided. Pain communication is often multimodal, with sufferers utilising speech, nonverbal behaviours (such as facial expressions), and co-speech gestures (bodily movements, primarily of the hands and arms, that accompany speech and can convey semantic information) to communicate their experience. Research suggests that the production of nonverbal pain behaviours is positively associated with pain intensity, but it is not known whether this is also the case for speech and co-speech gestures. The present study explored whether increased pain intensity is associated with greater speech and gesture production during face-to-face communication about acute, experimental pain. Participants (N = 26) were exposed to experimentally elicited pressure pain to the fingernail bed at high and low intensities and took part in video-recorded semi-structured interviews. Despite rating more intense pain as more difficult to communicate (t(25) = 2.21, p = .037), participants produced significantly longer verbal pain descriptions and more co-speech gestures in the high-intensity pain condition (Words: t(25) = 3.57, p = .001; Gestures: t(25) = 3.66, p = .001). This suggests that spoken and gestural communication about pain is enhanced when pain is more intense. Thus, in addition to conveying detailed semantic information about pain, speech and co-speech gestures may provide a cue to pain intensity, with implications for the treatment and support received by pain sufferers. Future work should consider whether these findings are applicable within the context of clinical interactions about pain.
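
    The paired comparisons reported above can be reproduced with standard within-participant t-tests. The sketch below, in Python, uses synthetic placeholder counts for 26 participants (not the study's data) purely to illustrate the form of the analysis; only the test itself mirrors the reported contrasts.

    import numpy as np
    from scipy import stats

    # Synthetic per-participant word counts for illustration only; the study's
    # actual data are not reproduced here.
    rng = np.random.default_rng(0)
    n = 26
    words_low = rng.poisson(120, n)              # word count per participant, low-intensity pain
    words_high = words_low + rng.poisson(30, n)  # word count per participant, high-intensity pain

    # Paired (within-participant) comparison of the two pain conditions,
    # as in the reported Words and Gestures contrasts.
    t_stat, p_value = stats.ttest_rel(words_high, words_low)
    print(f"Words: t({n - 1}) = {t_stat:.2f}, p = {p_value:.3f}")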

    Does patient-physiotherapist agreement influence the outcome of low back pain? A prospective cohort study

    BACKGROUND: Recent research suggests that agreement between patients' and health professionals' perceptions may influence the outcome of various painful conditions. This issue has received little attention in the context of low back pain and physiotherapy interventions. The current study aimed to explore the relationship between patient-physiotherapist agreement on baseline low back pain intensity and related functional limitations, and changes in patient outcomes four weeks later. METHODS: Seventy-eight patient-physiotherapist dyads were included in the study. At baseline, patients and physiotherapists completed a Numerical Rating Scale and the Roland-Morris Disability Questionnaire. Patients' perceptions were reassessed over the phone at follow-up. RESULTS: Using multiple regression, baseline level of patient-physiotherapist agreement on pain intensity was associated with both outcome measures at follow-up. Agreement on functional limitations had no impact on outcomes. CONCLUSION: The results of this study indicate that patient-physiotherapist agreement has some impact on the short-term outcomes of low back pain. Further research is needed to confirm these findings.
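
    One way to operationalize the analysis described above is to score baseline agreement as the (negative) absolute difference between the patient's and physiotherapist's ratings and enter it into a regression on the follow-up outcome. The sketch below is a hypothetical illustration: the column names and the synthetic data are assumptions, not the study's variables or results.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Synthetic dyad-level data for illustration only (78 dyads, 0-10 NRS ratings).
    rng = np.random.default_rng(1)
    n = 78
    df = pd.DataFrame({
        "patient_nrs_t0": rng.integers(0, 11, n),  # patient pain rating at baseline
        "physio_nrs_t0": rng.integers(0, 11, n),   # physiotherapist rating of the same patient
        "patient_nrs_t4": rng.integers(0, 11, n),  # patient pain rating at 4-week follow-up
    })

    # Agreement scored as the negative absolute disagreement on baseline pain.
    df["pain_agreement_t0"] = -(df["patient_nrs_t0"] - df["physio_nrs_t0"]).abs()

    # Multiple regression: follow-up pain predicted by baseline pain and agreement.
    model = smf.ols("patient_nrs_t4 ~ patient_nrs_t0 + pain_agreement_t0", data=df).fit()
    print(model.summary())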

    Paramedic assessment of pain in the cognitively impaired adult patient

    Background: Paramedics are often a first point of contact for people experiencing pain in the community. Wherever possible the patient's self-report of pain should be sought to guide the assessment and management of this complaint. Communication difficulty or disability, such as cognitive impairment associated with dementia, may limit the patient's ability to report their pain experience, and this has the potential to affect the quality of care. The primary objective of this study was to systematically locate evidence relating to the use of pain assessment tools that have been validated for use with cognitively impaired adults and to identify those that have been recommended for use by paramedics. Methods: A systematic search of health databases for evidence relating to the use of pain assessment tools that have been validated for use with cognitively impaired adults was undertaken using specific search criteria. An extended search included position statements and clinical practice guidelines developed by health agencies to identify evidence-based recommendations regarding pain assessment in older adults. Results: Two systematic reviews met study inclusion criteria. Weaknesses in tools evaluated by these studies limited their application in assessing pain in the population of interest. Only one tool was designed to assess pain in acute care settings. No tools were located that are designed for paramedic use. Conclusion: The reviews of pain assessment tools found that the majority were developed to assess chronic pain in aged care, hospital or hospice settings. An analysis of the characteristics of these pain assessment tools identified attributes that may limit their use in paramedic practice. One tool - the Abbey Pain Scale - may have application in paramedic assessment of pain, but clinical evaluation is required to validate this tool in the paramedic practice setting. Further research is recommended to evaluate the Abbey Pain Scale and to evaluate the effectiveness of paramedic pain management practice in older adults to ensure that the care of all patients is unaffected by age or disability.

    Hearing Feelings: Affective Categorization of Music and Speech in Alexithymia, an ERP Study

    Background: Alexithymia, a condition characterized by deficits in interpreting and regulating feelings, is a risk factor for a variety of psychiatric conditions. Little is known about how alexithymia influences the processing of emotions in music and speech. Appreciation of such emotional qualities in auditory material is fundamental to human experience and has profound consequences for functioning in daily life. We investigated the neural signature of such emotional processing in alexithymia by means of event-related potentials. Methodology: Affective music and speech prosody were presented as targets following affectively congruent or incongruent visual word primes in two conditions. In two further conditions, affective music and speech prosody served as primes and visually presented words with affective connotations were presented as targets. Thirty-two participants (16 male) judged the affective valence of the targets. We tested the influence of alexithymia on cross-modal affective priming and on N400 amplitudes, indicative of individual sensitivity to an affective mismatch between words, prosody, and music. Our results indicate that the affective priming effect for prosody targets tended to be reduced with increasing scores on alexithymia, while no behavioral differences were observed for music and word targets. At the electrophysiological level, alexithymia was associated with significantly smaller N400 amplitudes in response to affectively incongruent music and speech targets, but not to incongruent word targets. Conclusions: Our results suggest a reduced sensitivity for the emotional qualities of speech and music in alexithymia during affective categorization. This deficit becomes evident primarily in situations in which a verbalization of emotional information is required.

    Automated Pain Detection in Facial Videos of Children using Human-Assisted Transfer Learning.

    Accurately determining pain levels in children is difficult, even for trained professionals and parents. Facial activity provides sensitive and specific information about pain, and computer vision algorithms have been developed to automatically detect Facial Action Units (AUs) defined by the Facial Action Coding System (FACS). Our prior work utilized information from computer vision, i.e., automatically detected facial AUs, to develop classifiers to distinguish between pain and no-pain conditions. However, application of pain/no-pain classifiers based on automated AU codings across different environmental domains results in diminished performance. In contrast, classifiers based on manually coded AUs demonstrate reduced environmentally based variability in performance. In this paper, we train a machine learning model to recognize pain using AUs coded by a computer vision system embedded in a software package called iMotions. We also study the relationship between iMotions (automatically) and human (manually) coded AUs. We find that AUs coded automatically are different from those coded by a human trained in the FACS system, and that the human coder is less sensitive to environmental changes. To improve classification performance in the current work, we applied transfer learning by training another machine learning model to map automated AU codings to a subspace of manual AU codings, enabling more robust pain recognition when only automatically coded AUs are available for the test data. With this transfer learning method, we improved the Area Under the ROC Curve (AUC) on independent data from new participants in our target domain from 0.67 to 0.72.
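
    The transfer-learning step described above - mapping automatically coded AUs into the space of manually coded AUs before classifying pain - could look roughly like the sketch below. The specific models (ridge regression for the mapping, logistic regression for the classifier) and the synthetic data are assumptions for illustration; the paper's exact pipeline is not reproduced here.

    import numpy as np
    from sklearn.linear_model import Ridge, LogisticRegression
    from sklearn.metrics import roc_auc_score

    # Synthetic stand-ins for the real features: rows are video segments,
    # columns are AU intensities (automated vs. manually coded) plus pain labels.
    rng = np.random.default_rng(2)
    n_train, n_test, n_auto, n_manual = 200, 80, 20, 10
    X_auto = rng.random((n_train, n_auto))      # automatically detected AUs (e.g., iMotions)
    X_manual = rng.random((n_train, n_manual))  # FACS-trained human coder's AUs, same segments
    y = rng.integers(0, 2, n_train)             # pain / no-pain labels

    # Step 1: learn a mapping from automated AUs to the manual-AU subspace.
    au_mapper = Ridge(alpha=1.0).fit(X_auto, X_manual)

    # Step 2: train the pain classifier on manually coded AUs.
    clf = LogisticRegression(max_iter=1000).fit(X_manual, y)

    # At test time (new participants in the target domain) only automated AUs
    # are available, so they are first projected into the manual-AU space.
    X_auto_test = rng.random((n_test, n_auto))
    y_test = rng.integers(0, 2, n_test)
    X_mapped = au_mapper.predict(X_auto_test)
    auc = roc_auc_score(y_test, clf.predict_proba(X_mapped)[:, 1])
    print(f"AUC on held-out target-domain data: {auc:.2f}")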