
    The development of emotion recognition from facial expressions and non-linguistic vocalizations during childhood

    Sensitivity to facial and vocal emotion is fundamental to children's social competence. Previous research has focused on children's facial emotion recognition, and few studies have investigated non-linguistic vocal emotion processing in childhood. We compared facial and vocal emotion recognition and processing biases in 4- to 11-year-olds and adults. Eighty-eight 4- to 11-year-olds and 21 adults participated. Participants viewed faces and listened to voices (angry, happy, and sad) at three intensity levels (50%, 75%, and 100%); the vocal stimuli were non-linguistic tones. For each modality, participants completed an emotion identification task. Accuracy and bias for each emotion and modality were compared across 4- to 5-, 6- to 9- and 10- to 11-year-olds and adults. Children's emotion recognition improved with age; preschoolers were less accurate than the other groups. Facial emotion recognition reached adult levels by 11 years, whereas vocal emotion recognition continued to develop in late childhood. Response bias decreased with age. For both modalities, sadness recognition was delayed across development relative to anger and happiness. These results demonstrate that developmental trajectories of emotion processing differ as a function of emotion type and stimulus modality, and that vocal emotion processing follows a more protracted developmental trajectory than facial emotion processing. The findings have important implications for programmes aiming to improve children's socio-emotional competence.
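    The abstract does not spell out how accuracy and response bias were scored, but for a forced-choice identification task like this one, a standard approach derives both from the confusion matrix of presented versus chosen emotions. A minimal sketch with made-up counts (the matrix and scoring convention are illustrative assumptions, not the study's data):

    ```python
    import numpy as np

    emotions = ["angry", "happy", "sad"]

    # Hypothetical confusion matrix for one age group and one modality:
    # rows = presented emotion, columns = response chosen.
    conf = np.array([[40,  5, 15],
                     [ 3, 50,  7],
                     [12,  8, 40]])

    accuracy = conf.diagonal() / conf.sum(axis=1)  # hit rate per presented emotion
    bias = conf.sum(axis=0) / conf.sum()           # how often each label is chosen overall
    for e, a, b in zip(emotions, accuracy, bias):
        print(f"{e}: accuracy {a:.2f}, response bias {b:.2f} (unbiased = 0.33)")
    ```

    Under this convention, a bias value above 1/3 means a label is over-used relative to chance, which is the kind of over-selection that the study reports decreasing with age.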

    Understanding the link between emotional recognition and awareness, therapy, and training: a thesis presented in partial fulfilment of the requirements for the degree of Doctorate in Clinical Psychology at Massey University, Manawatū, New Zealand

    Therapy is an emotionally laden event, both for individuals seeking therapeutic intervention and the therapists who provide it. While emotion recognition in the general population has been a popular research topic, very little research has examined the emotional competencies of therapists, specifically their emotion recognition and awareness. In addition, there are few studies on the effectiveness of emotion recognition training for therapists' emotional competencies, which is surprising given the innately emotional moments that clients and therapists experience during therapeutic work. This study aimed to address these gaps by investigating the association between emotion recognition, awareness, practice, and training. Fifty-five therapists, comprising clinical psychologists, counsellors, and a psychotherapist, completed an online session consisting of a social-emotional oriented questionnaire and an emotion recognition task. Of these 55 participants, 26 completed emotion recognition training before repeating the same task two weeks later, while the remaining 29 participants undertook no emotion recognition training. The results revealed that, compared with the no-treatment condition, those who received emotion recognition training were more accurate in their recognition of emotions and also reported higher use of therapeutic emotional practice. Unexpectedly, participants who completed emotion recognition training reported less emotional awareness than the control group. Relatedly, an inverse relationship was found between emotion recognition ability and self-reported emotional awareness, with some support for an inverse relationship between emotion recognition ability and self-reported use of emotional practice. This research has two implications: first, emotion recognition training increases therapists' accuracy in emotion recognition; second, because participants' actual and perceived emotional awareness were inversely related, therapists may need to receive feedback on their emotional practice through channels other than supervision or client outcomes. Future research should therefore consider social-emotional practices and client outcomes. The limitations of the study and areas for future research are also discussed.

    Emotion recognition and intellectual disability: development of the kinetic emotion recognition assessment and evaluation of the emotion specificity hypothesis: a thesis presented in partial fulfilment of the requirements for the degree of Doctor of Clinical Psychology at Massey University, Albany, New Zealand

    Deficits in social adaptive functioning are a defining criterion of intellectual disability (ID) (American Psychiatric Association, 2013), and a key predictor of social inclusion and subsequent quality of life (Kozma, Mansell, & Beadle-Brown, 2009). Impairment in facial emotion recognition is often cited as the component skill responsible for the social difficulties observed. This position has been formally conceptualised in the emotion specificity hypothesis (ESH; Rojahn, Rabold, & Schneider, 1995), which proposes that individuals with ID manifest a specific deficit in facial emotion recognition beyond what can be explained by difficulties in general intellectual functioning. Despite apparent widespread acceptance, there is not yet sufficient evidence to substantiate these claims. Moore (2001) proposes that emotion perception capacities may be intact in people with ID, and that reported deficits are instead due to emotion recognition tasks making extensive cognitive demands that disadvantage those with lesser cognitive abilities. The aim of the present study was to clarify the nature of facial emotion recognition abilities in adults with mild ID. To this end, the Kinetic Emotion Recognition Assessment (KERA), a video-based measure of facial emotion recognition, was developed and a pilot study completed. The measure was designed to assess emotion recognition abilities while reducing information-processing demands beyond those required to perceive the emotional content of the stimuli. The new instrument was assessed for its psychometric properties in individuals with ID and in neurotypical control participants. Initial findings supported the interrater reliability and overarching construct validity of the measure, offering strong evidence of content, convergent, and predictive validity. Item difficulty and discrimination analyses confirmed that the KERA includes items of an appropriate level of difficulty to capture the range of emotion recognition capacities expected of individuals with mild ID. The secondary focus of the study was to assess how subtle methodological changes in the assessment of emotion recognition may affect performance, and in turn provide insight into how existing ESH literature might be reinterpreted. To this end, the KERA was also applied in an investigation of the potential moderating effects of dynamic cues and emotion intensity, in addition to the assessment of the ESH. The results offer strong evidence that individuals with ID experience relative impairment in emotion recognition when compared with typically developing controls. However, it remains to be seen whether the observed difficulties are specific to emotional expression or reflect more generalised face processing. Preliminary findings also suggest that, like their typically developing peers, individuals with ID benefit from higher-intensity emotional displays, while, in contrast, they gain no advantage from the addition of movement cues. Finally, the overarching motivation for reassessing and improving measurement of the ESH was to improve the real-world outcomes associated with emotion recognition capacities. Accordingly, emotion recognition data were also interpreted in the context of three measures of social functioning to explore the link between social competence and emotion recognition ability. Results indicated that emotion recognition abilities are linked to outcomes in social adaptive functioning, particularly for females.
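    The thesis does not state which item-analysis formulas were used, but the item difficulty and discrimination statistics it reports are conventionally computed under classical test theory as the per-item proportion correct and the corrected item-total correlation. A minimal sketch under that assumption (the response matrix is invented for illustration):

    ```python
    import numpy as np

    def item_analysis(responses: np.ndarray):
        """Classical test theory item statistics.

        responses: binary matrix (participants x items), 1 = correct.
        Returns per-item difficulty (proportion correct) and
        discrimination (corrected item-total correlation).
        """
        n_items = responses.shape[1]
        difficulty = responses.mean(axis=0)  # proportion passing each item
        total = responses.sum(axis=1)
        discrimination = np.empty(n_items)
        for i in range(n_items):
            rest = total - responses[:, i]   # total score excluding this item
            discrimination[i] = np.corrcoef(responses[:, i], rest)[0, 1]
        return difficulty, discrimination

    # Illustrative data: 6 participants x 4 items
    resp = np.array([[1, 1, 0, 1],
                     [1, 0, 0, 0],
                     [1, 1, 1, 1],
                     [0, 1, 0, 1],
                     [1, 1, 0, 0],
                     [1, 0, 1, 1]])
    diff, disc = item_analysis(resp)
    print("difficulty:", diff)
    print("discrimination:", disc)
    ```

    Items with mid-range difficulty and positive discrimination are the ones that separate ability levels, which is the property the KERA analysis confirmed for its target population.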

    Does comorbid anxiety counteract emotion recognition deficits in conduct disorder?

    Background: Previous research has reported altered emotion recognition in both conduct disorder (CD) and anxiety disorders (ADs), but these effects appear to be of different kinds. Adolescents with CD often show a generalised pattern of deficits, while those with ADs show hypersensitivity to specific negative emotions. Although these conditions often co-occur, little is known about emotion recognition performance in comorbid CD+ADs. Here, we test the hypothesis that in the comorbid case, anxiety-related emotion hypersensitivity counteracts the emotion recognition deficits typically observed in CD. Method: We compared facial emotion recognition across four groups of adolescents aged 12-18 years: those with CD alone (n = 28), ADs alone (n = 23), co-occurring CD+ADs (n = 20) and typically developing controls (n = 28). The emotion recognition task systematically manipulated the emotional intensity of facial expressions as well as fixation location (eye, nose or mouth region). Results: Conduct disorder was associated with a generalised impairment in emotion recognition; however, this may have been modulated by group differences in IQ. AD was associated with increased sensitivity to low-intensity happiness, disgust and sadness. In general, the comorbid CD+ADs group performed similarly to typically developing controls. Conclusions: Although CD alone was associated with emotion recognition impairments, ADs and comorbid CD+ADs were associated with normal or enhanced emotion recognition performance. The presence of comorbid ADs appeared to counteract the effects of CD, suggesting a potentially protective role, although future research should examine the contribution of IQ and gender to these effects.

    Damage to Association Fiber Tracts Impairs Recognition of the Facial Expression of Emotion

    An array of cortical and subcortical structures has been implicated in the recognition of emotion from facial expressions. It remains unknown how these regions communicate as parts of a system to achieve recognition, but white matter tracts are likely critical to this process. We hypothesized that (1) damage to white matter tracts would be associated with recognition impairment and (2) the degree of disconnection of the association fiber tracts connecting the visual cortex with emotion-related regions [the inferior longitudinal fasciculus (ILF) and/or inferior fronto-occipital fasciculus (IFOF)] would negatively correlate with recognition performance. One hundred three patients with focal, stable brain lesions mapped onto a reference brain were tested on their recognition of six basic emotional facial expressions. Association fiber tracts from a probabilistic atlas were coregistered to the reference brain. Parameters estimating disconnection were entered into a general linear model to predict emotion recognition impairments, accounting for lesion size and cortical damage. Damage associated with the right IFOF significantly predicted an overall facial emotion recognition impairment and specific impairments for sadness, anger, and fear. One subject had a pure white matter lesion in the location of the right IFOF and ILF; he showed specific, unequivocal emotion recognition impairments. Additional analysis suggested that impairment in fear recognition can result from damage to the IFOF rather than the amygdala. Our findings demonstrate the key role of white matter association tracts in recognizing facial expressions of emotion and identify the specific tracts that may be most critical.
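    The general linear model described here regresses recognition performance on tract disconnection while controlling for lesion size and cortical damage. A minimal sketch of that design using statsmodels, with synthetic data standing in for the patient table (all variable names, scales, and simulated effects are assumptions for illustration, not the study's data):

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Synthetic stand-in for the per-patient table (names/values hypothetical)
    rng = np.random.default_rng(0)
    n = 103  # matches the reported sample size
    df = pd.DataFrame({
        "ifof_disconn": rng.uniform(0, 1, n),    # estimated right-IFOF disconnection
        "ilf_disconn": rng.uniform(0, 1, n),     # estimated right-ILF disconnection
        "lesion_volume": rng.uniform(1, 80, n),  # lesion size covariate (cm^3)
        "cortical_dmg": rng.uniform(0, 1, n),    # cortical damage covariate
    })
    # Simulate recognition scores that decline with IFOF disconnection
    df["recognition"] = 90 - 15 * df["ifof_disconn"] + rng.normal(0, 5, n)

    # General linear model: does disconnection predict impairment
    # after accounting for lesion size and cortical damage?
    model = smf.ols(
        "recognition ~ ifof_disconn + ilf_disconn + lesion_volume + cortical_dmg",
        data=df,
    ).fit()
    print(model.summary())
    ```

    A significant negative coefficient on the disconnection term, surviving the covariates, is the pattern the study reports for the right IFOF.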

    Facial emotion recognition using min-max similarity classifier

    Recognition of human emotions from imaging templates is useful in a wide variety of human-computer interaction and intelligent systems applications. However, automatic recognition of facial expressions using image template matching suffers from natural variability in facial features and recording conditions. Despite the progress achieved in facial emotion recognition in recent years, an effective and computationally simple feature selection and classification technique for emotion recognition remains an open problem. In this paper, we propose an efficient and straightforward facial emotion recognition algorithm that reduces inter-class pixel mismatch during classification. The proposed method applies pixel normalization to remove intensity offsets, followed by a Min-Max metric in a nearest neighbor classifier that is capable of suppressing feature outliers. The results indicate an improvement in recognition performance from 92.85% to 98.57% for the proposed Min-Max classification method when tested on the JAFFE database. The proposed emotion recognition technique outperforms existing template matching methods.
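    As described, the method pairs per-image intensity normalization with a min-max similarity measure inside a nearest-neighbour classifier. A minimal sketch of that pipeline; the paper's exact normalization and metric may differ, and this uses the common sum-of-minima over sum-of-maxima form of min-max similarity:

    ```python
    import numpy as np

    def normalize(img: np.ndarray) -> np.ndarray:
        """Rescale pixel intensities to [0, 1] to remove per-image offsets."""
        img = img.astype(float)
        lo, hi = img.min(), img.max()
        return (img - lo) / (hi - lo) if hi > lo else np.zeros_like(img)

    def min_max_similarity(a: np.ndarray, b: np.ndarray) -> float:
        """Sum of element-wise minima over sum of element-wise maxima.

        Equals 1 for identical templates; a single outlying pixel inflates
        numerator and denominator together, which dampens its influence
        compared with a Euclidean distance.
        """
        return np.minimum(a, b).sum() / np.maximum(a, b).sum()

    def classify(test_img, templates, labels):
        """Nearest-neighbour classification under min-max similarity."""
        t = normalize(test_img).ravel()
        sims = [min_max_similarity(t, normalize(tpl).ravel()) for tpl in templates]
        return labels[int(np.argmax(sims))]

    # Illustrative 2x2 "images" standing in for face templates
    templates = [np.array([[10, 200], [30, 220]]),
                 np.array([[200, 10], [220, 30]])]
    labels = ["happy", "angry"]
    test = np.array([[12, 190], [28, 210]])
    print(classify(test, templates, labels))  # -> "happy"
    ```

    The outlier-suppression property of the ratio form is what makes the metric attractive for template matching, where a few mismatched pixels would otherwise dominate the comparison.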

    How preserved is emotion recognition in Alzheimer disease compared with behavioral variant frontotemporal dementia?

    Background: Emotion deficits are a recognised biomarker for behavioural variant frontotemporal dementia (bvFTD), but recent studies have reported emotion deficits in Alzheimer's disease (AD) as well. Methods: One hundred and twenty-three participants (33 AD, 60 bvFTD, 30 controls) were administered a facial emotion recognition test to investigate the clinical factors influencing the diagnostic distinction on this measure. Binomial regression analysis revealed that facial emotion recognition in AD was influenced by disease duration and MMSE, whereas the same was not true for bvFTD. Based on this information, we median-split the AD group on disease duration (3 years) or MMSE (24) and compared the facial emotion recognition performance of very mild-AD, mild/moderate-AD and bvFTD patients and controls. Results: Very mild-AD patients performed consistently at control levels for all emotions. By contrast, mild/moderate-AD and bvFTD patients were impaired compared to controls on most emotions. Interestingly, mild/moderate-AD patients were significantly impaired compared to very mild-AD on the total score and on the anger and sadness subscores. Logistic regression analyses corroborated these findings, with ~94% of very mild-AD patients successfully distinguished from bvFTD at presentation, while this distinction was reduced to ~78% for mild/moderate-AD. Conclusions: Facial emotion recognition in AD is influenced by disease progression, with very mild-AD patients being virtually intact in emotion performance. Mild/moderate-AD and bvFTD show consistent impairment in emotion recognition, with bvFTD being worse. A disease duration of over 3 years or an MMSE below 24 should warrant caution against placing too much emphasis on emotion recognition performance in the diagnostic distinction between AD and bvFTD.
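    The diagnostic-discrimination step reported above fits a logistic regression separating the dementia groups from emotion-recognition scores. A minimal sketch of that analysis on synthetic data (group means, scales, and the single-predictor setup are illustrative assumptions, not the study's variables):

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)

    # Synthetic emotion-recognition total scores on a hypothetical scale:
    # AD patients score closer to controls; bvFTD patients score lower.
    ad_scores = rng.normal(30, 3, (33, 1))     # 33 AD patients
    bvftd_scores = rng.normal(22, 4, (60, 1))  # 60 bvFTD patients
    X = np.vstack([ad_scores, bvftd_scores])
    y = np.array([0] * 33 + [1] * 60)          # 0 = AD, 1 = bvFTD

    # How well does emotion recognition alone separate the two groups?
    clf = LogisticRegression()
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"cross-validated classification accuracy: {acc:.0%}")
    ```

    In the study's data, this separation was strong for very mild-AD versus bvFTD (~94%) and weaker once AD had progressed (~78%), consistent with emotion deficits emerging in AD over time.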