
    Speech-based recognition of self-reported and observed emotion in a dimensional space

    The differences between self-reported and observed emotion have only marginally been investigated in the context of speech-based automatic emotion recognition. We address this issue by comparing self-reported emotion ratings to observed emotion ratings and look at how differences between these two types of ratings affect the development and performance of automatic emotion recognizers developed with these ratings. A dimensional approach to emotion modeling is adopted: the ratings are based on continuous arousal and valence scales. We describe the TNO-Gaming Corpus that contains spontaneous vocal and facial expressions elicited via a multiplayer videogame and that includes emotion annotations obtained via self-report and observation by outside observers. Comparisons show that there are discrepancies between self-reported and observed emotion ratings which are also reflected in the performance of the emotion recognizers developed. Using Support Vector Regression in combination with acoustic and textual features, recognizers of arousal and valence are developed that can predict points in a 2-dimensional arousal-valence space. The results of these recognizers show that the self-reported emotion is much harder to recognize than the observed emotion, and that averaging ratings from multiple observers improves performance.
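    The regression setup described above can be sketched with scikit-learn's SVR: one regressor per affective dimension, each trained on feature vectors and jointly predicting a point in the arousal-valence plane. The features and ratings below are synthetic placeholders, not the TNO-Gaming Corpus data or the paper's actual feature set.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Synthetic stand-ins for acoustic/textual feature vectors (illustrative only).
X = rng.normal(size=(200, 10))
# Synthetic continuous arousal/valence ratings tied loosely to the features.
y_arousal = 0.8 * X[:, 0] + rng.normal(scale=0.1, size=200)
y_valence = 0.6 * X[:, 1] + rng.normal(scale=0.1, size=200)

# One regressor per dimension, matching the 2-D arousal-valence approach.
model_arousal = SVR(kernel="rbf").fit(X, y_arousal)
model_valence = SVR(kernel="rbf").fit(X, y_valence)

# Predict a point in the arousal-valence plane for a new utterance.
x_new = rng.normal(size=(1, 10))
point = (model_arousal.predict(x_new)[0], model_valence.predict(x_new)[0])
print(point)
```

Averaging ratings from multiple observers, as the abstract reports, would simply mean training each SVR on the per-item mean of the observers' scales instead of a single rater's values.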

    The development of emotion recognition from facial expressions and non-linguistic vocalizations during childhood

    Sensitivity to facial and vocal emotion is fundamental to children's social competence. Previous research has focused on children's facial emotion recognition, and few studies have investigated non-linguistic vocal emotion processing in childhood. We compared facial and vocal emotion recognition and processing biases in 4- to 11-year-olds and adults. Eighty-eight 4- to 11-year-olds and 21 adults participated. Participants viewed/listened to faces and voices (angry, happy, and sad) at three intensity levels (50%, 75%, and 100%). Non-linguistic tones were used. For each modality, participants completed an emotion identification task. Accuracy and bias for each emotion and modality were compared across 4- to 5-, 6- to 9- and 10- to 11-year-olds and adults. The results showed that children's emotion recognition improved with age; preschoolers were less accurate than other groups. Facial emotion recognition reached adult levels by 11 years, whereas vocal emotion recognition continued to develop in late childhood. Response bias decreased with age. For both modalities, sadness recognition was delayed across development relative to anger and happiness. The results demonstrate that developmental trajectories of emotion processing differ as a function of emotion type and stimulus modality. In addition, vocal emotion processing showed a more protracted developmental trajectory, compared to facial emotion processing. The results have important implications for programmes aiming to improve children's socio-emotional competence.

    Coping strategies as mediators within the relationship between emotion-regulation and perceived stress in teachers

    The aim of the present study was to examine whether different coping strategies (focus on the positive, support coping, active coping, and evasive coping) mediate the relationship between emotion regulation (i.e., emotion acceptance skills, emotion resilience skills, and emotion regulation skills) and perceived stress in physical education (PE) teachers. The sample consisted of 457 PE pre-service teachers. Results show that evasive coping strategies partly and negatively mediate the relationship between emotion resilience skills and emotion regulation on the one hand and perceived stress on the other. Emotion regulation might therefore protect against the use of evasive coping strategies, which previous studies have found to be related to higher stress.
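    The mediation logic in this abstract (emotion regulation → evasive coping → stress) can be illustrated as a toy indirect-effect computation, indirect = a × b: the a-path regresses the mediator on the predictor, and the b-path regresses the outcome on the mediator while controlling for the predictor. The data and effect sizes below are invented for illustration; the study itself would have used a dedicated mediation or SEM framework.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 457  # sample size reported in the abstract

# Synthetic data: emotion-regulation skill (X), evasive coping (M), stress (Y).
X = rng.normal(size=n)
M = -0.4 * X + rng.normal(size=n)   # better regulation -> less evasive coping
Y = 0.5 * M + rng.normal(size=n)    # more evasive coping -> more stress

def slope(x, y):
    """OLS slope of y on x (both centered)."""
    x = x - x.mean()
    return (x @ (y - y.mean())) / (x @ x)

a = slope(X, M)  # a-path: predictor -> mediator

# b-path: regress Y on intercept, M, and X; take the M coefficient.
Z = np.column_stack([np.ones(n), M, X])
b = np.linalg.lstsq(Z, Y, rcond=None)[0][1]

indirect = a * b  # negative: regulation lowers stress *via* less evasive coping
print(indirect)
```

A negative indirect effect is exactly the "partly and negatively mediate" pattern the abstract reports.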

    EMIR: A novel emotion-based music retrieval system

    Music is inherently expressive of emotional meaning and affects people's mood. In this paper, we present a novel EMIR (Emotional Music Information Retrieval) system that uses latent emotion elements in both music and non-descriptive queries (NDQs) to detect implicit emotional associations between users and music, enhancing Music Information Retrieval (MIR). We try to understand the latent emotional intent of queries via machine learning for emotion classification and compare the performance of emotion-detection approaches on different feature sets. For this purpose, we extract music emotion features from lyrics and social tags crawled from the Internet, label a subset for training, model them in a high-dimensional emotion space, and recognize the latent emotion of users through query emotion analysis. The similarity between queries and music is computed with a BM25 model.
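    The final ranking step, scoring music documents (e.g. tokenized lyrics or tags) against a query, can be sketched with a plain Okapi BM25 implementation; the paper's exact BM25 variant is not reproduced here, and the documents below are invented examples.

```python
import math
from collections import Counter

def bm25_scores(query_terms, docs, k1=1.5, b=0.75):
    """Score each tokenized document against the query with Okapi BM25."""
    N = len(docs)
    avgdl = sum(len(d) for d in docs) / N
    df = Counter(t for d in docs for t in set(d))  # document frequencies
    scores = []
    for d in docs:
        tf = Counter(d)
        s = 0.0
        for t in query_terms:
            if t not in tf:
                continue
            idf = math.log(1 + (N - df[t] + 0.5) / (df[t] + 0.5))
            s += idf * tf[t] * (k1 + 1) / (
                tf[t] + k1 * (1 - b + b * len(d) / avgdl))
        scores.append(s)
    return scores

# Toy "music documents" built from lyric/tag tokens.
docs = [["sad", "rainy", "ballad"], ["happy", "upbeat", "dance"], ["sad", "piano"]]
print(bm25_scores(["sad", "piano"], docs))
```

The third document matches both query terms and therefore ranks first, while the emotionally unrelated second document scores zero.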

    Emotional Chatting Machine: Emotional Conversation Generation with Internal and External Memory

    Perception and expression of emotion are key factors in the success of dialogue systems or conversational agents. However, this problem has not been studied in large-scale conversation generation so far. In this paper, we propose the Emotional Chatting Machine (ECM), which can generate responses that are appropriate not only in content (relevant and grammatical) but also in emotion (emotionally consistent). To the best of our knowledge, this is the first work that addresses the emotion factor in large-scale conversation generation. ECM addresses the factor using three new mechanisms that respectively (1) model the high-level abstraction of emotion expressions by embedding emotion categories, (2) capture the change of implicit internal emotion states, and (3) use explicit emotion expressions with an external emotion vocabulary. Experiments show that the proposed model can generate responses appropriate not only in content but also in emotion. (Comment: accepted at AAAI 2018.)
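    ECM's first mechanism, embedding emotion categories so that decoding is conditioned on a target emotion, can be sketched as a lookup-and-concatenate step on the decoder input. The category names, dimensions, and random vectors below are illustrative placeholders, not ECM's actual configuration (in ECM the embedding table is learned end-to-end).

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative emotion categories; ECM embeds each category as a vector.
emotions = ["happy", "sad", "angry"]
emo_dim, word_dim = 4, 8
emotion_embed = rng.normal(size=(len(emotions), emo_dim))  # mechanism (1)

def decoder_input(word_vec, emotion):
    """Concatenate the word embedding with the emotion-category embedding,
    so every decoding step is conditioned on the target emotion."""
    e = emotion_embed[emotions.index(emotion)]
    return np.concatenate([word_vec, e])

x = decoder_input(rng.normal(size=word_dim), "sad")
print(x.shape)  # (word_dim + emo_dim,)
```

Mechanisms (2) and (3), the decaying internal emotion state and the external emotion-word vocabulary, would sit on top of this conditioning inside the decoder loop.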

    Neural correlates of early deliberate emotion regulation: Young children's responses to interpersonal scaffolding.

    Deliberate emotion regulation, the ability to willfully modulate emotional experiences, is shaped through interpersonal scaffolding and forecasts later functioning in multiple domains. However, nascent deliberate emotion regulation in early childhood is poorly understood due to a paucity of studies that simulate interpersonal scaffolding of this skill and measure its occurrence in multiple modalities. Our goal was to identify neural and behavioral components of early deliberate emotion regulation to identify patterns of competent and deficient responses. A novel probe was developed to assess deliberate emotion regulation in young children. Sixty children (age 4-6 years) were randomly assigned to deliberate emotion regulation or control conditions. Children completed a frustration task while lateral prefrontal cortex (LPFC) activation was recorded via functional near-infrared spectroscopy (fNIRS). Facial expressions were video recorded and children self-rated their emotions. Parents rated their child's temperamental emotion regulation. Deliberate emotion regulation interpersonal scaffolding predicted a significant increase in frustration-related LPFC activation not seen in controls. Better temperamental emotion regulation predicted larger LPFC activation increases post-scaffolding among children who engaged in deliberate emotion regulation interpersonal scaffolding. A capacity to increase LPFC activation in response to interpersonal scaffolding may be a crucial neural correlate of early deliberate emotion regulation.

    Emotion Differentiation as a Protective Factor Against Nonsuicidal Self-Injury in Borderline Personality Disorder

    Evidence that nonsuicidal self-injury (NSSI) serves a maladaptive emotion regulation function in borderline personality disorder (BPD) has drawn attention to processes that may increase risk for NSSI by exacerbating negative emotion, such as rumination. However, more adaptive forms of emotion processing, including differentiating broad emotional experiences into nuanced emotion categories, might serve as a protective factor against NSSI. Using an experience-sampling diary, the present study tested whether differentiation of negative emotion was associated with lower frequency of NSSI acts and urges in 38 individuals with BPD who reported histories of NSSI. Participants completed a dispositional measure of rumination and a 21-day experience-sampling diary, which yielded an index of negative emotion differentiation and frequency of NSSI acts and urges. A significant rumination by negative emotion differentiation interaction revealed that rumination predicted higher rates of NSSI acts and urges in participants with difficulty differentiating their negative emotions. The results extend research on emotion differentiation into the clinical literature and provide empirical support for clinical theories that suggest emotion identification and labeling underlie strategies for adaptive self-regulation and decreased NSSI risk in BPD.
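    An emotion differentiation index from diary data is commonly operationalized as the (reversed) average intercorrelation of negative-emotion ratings across entries, so that lower co-movement among emotions means higher differentiation. The sketch below follows that convention on invented data; it is not necessarily the exact index used in the study (which is often an intraclass correlation instead).

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic 21-day diary: rows = entries, columns = negative emotions
# (e.g. anger, sadness, fear, shame); values are intensity ratings 1-5.
ratings = rng.integers(1, 6, size=(21, 4)).astype(float)

def differentiation_index(r):
    """Reverse-scored mean pairwise correlation among emotion ratings:
    higher values mean more differentiated (less blended) emotions."""
    corr = np.corrcoef(r, rowvar=False)
    upper = corr[np.triu_indices_from(corr, k=1)]  # unique emotion pairs
    return 1.0 - upper.mean()

print(differentiation_index(ratings))
```

In the study's design, such a per-person index would then be entered into the rumination × differentiation interaction predicting NSSI acts and urges.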