
    The emotional recall task: juxtaposing recall and recognition-based affect scales

    Existing affect scales typically involve recognition of emotions from a predetermined emotion checklist. However, a recognition-based checklist may fail to capture sufficient breadth and specificity of an individual’s recalled emotional experiences and may therefore miss emotions that frequently come to mind. More generally, how do recalled emotions differ from recognized emotions? To address these issues, we present and evaluate an affect scale based on recalled emotions. Participants are asked to produce 10 words that best describe their emotions over the past month and then to rate each emotion for how often it was experienced. We show that the average weighted valence of the words produced in this task, the Emotional Recall Task (ERT), is strongly correlated with scales related to general affect, such as the PANAS, Ryff’s Scales of Psychological Well-being, the Satisfaction with Life Scale, the Depression Anxiety and Stress Scales, and a few other related scales. We further show that the Emotional Recall Task captures a breadth and specificity of emotions not available in other scales but nonetheless commonly reported as experienced emotions. We test a general version of the ERT (the ERT General) that is language neutral and can be used across cultures. Finally, we show that the ERT is valid in a test-retest paradigm. In sum, the ERT measures affect based on emotion terms relevant to an individual’s idiosyncratic experience. It is consistent with recognition-based scales but also offers a new direction towards enriching our understanding of individual differences in recalled and recognized emotions.
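
    As a rough illustration of the scoring just described, the sketch below computes a frequency-weighted mean valence over a participant's recalled words. The function name, the valence norms, and the rating ranges are illustrative assumptions, not the authors' published implementation.

```python
# Hypothetical sketch of ERT-style scoring: each recalled emotion word
# carries a frequency-of-experience rating; the score is the
# frequency-weighted mean of the words' valence ratings.

def ert_score(responses, valence_norms):
    """responses: list of (word, frequency) pairs produced by a participant.
    valence_norms: dict mapping emotion words to valence ratings, e.g.
    from published affective word norms. Both structures are assumptions."""
    weighted_sum = 0.0
    total_weight = 0.0
    for word, frequency in responses:
        if word in valence_norms:  # skip words missing from the norms
            weighted_sum += valence_norms[word] * frequency
            total_weight += frequency
    return weighted_sum / total_weight if total_weight else None

# Example with assumed ranges: valence on 1-9, frequency on 1-7.
norms = {"happy": 8.2, "anxious": 2.7, "calm": 6.9}
print(ert_score([("happy", 5), ("anxious", 3), ("calm", 4)], norms))
```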

    Shared acoustic codes underlie emotional communication in music and speech—Evidence from deep transfer learning

    Music and speech exhibit striking similarities in the communication of emotions in the acoustic domain: the communication of specific emotions is achieved, at least to a certain extent, by means of shared acoustic patterns. From an affective-sciences point of view, determining the degree of overlap between the two domains is fundamental to understanding the shared mechanisms underlying this phenomenon. From a machine-learning perspective, the overlap between acoustic codes for emotional expression in music and speech opens new possibilities for enlarging the amount of data available to develop music and speech emotion recognition systems. In this article, we investigate time-continuous predictions of emotion (arousal and valence) in music and speech, and transfer learning between these domains. We establish a comparative framework including intra-domain (i.e., models trained and tested on the same modality, either music or speech) and cross-domain experiments (i.e., models trained on one modality and tested on the other). In the cross-domain context, we evaluated two strategies: direct transfer between domains, and the contribution of transfer-learning techniques (feature-representation transfer based on denoising autoencoders) to reducing the gap between the feature-space distributions. Our results demonstrate excellent cross-domain generalisation performance with and without feature-representation transfer in both directions. In the case of music, cross-domain approaches outperformed intra-domain models for valence estimation, whereas for speech, intra-domain models achieved the best performance. This is the first demonstration of shared acoustic codes for emotional expression in music and speech in the time-continuous domain.
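
    To make the feature-representation-transfer strategy concrete, the sketch below shows a denoising autoencoder trained to reconstruct clean acoustic feature vectors from noise-corrupted ones, so that its hidden layer can serve as a shared music/speech representation. The layer sizes, noise level, optimizer settings, and stand-in data are assumptions, not the paper's configuration.

```python
import torch
import torch.nn as nn

# Hypothetical denoising autoencoder for feature-representation transfer:
# corrupt acoustic feature vectors with Gaussian noise, train to
# reconstruct the clean input, then reuse the encoder as a feature map
# shared across the music and speech domains.
class DenoisingAutoencoder(nn.Module):
    def __init__(self, n_features=384, n_hidden=128, noise_std=0.1):
        super().__init__()
        self.noise_std = noise_std
        self.encoder = nn.Sequential(nn.Linear(n_features, n_hidden), nn.Tanh())
        self.decoder = nn.Linear(n_hidden, n_features)

    def forward(self, x):
        corrupted = x + self.noise_std * torch.randn_like(x)
        return self.decoder(self.encoder(corrupted))

# Train on pooled music and speech features so the hidden layer learns a
# domain-bridging representation (feature set and sizes are assumed).
model = DenoisingAutoencoder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
features = torch.randn(256, 384)  # stand-in for pooled acoustic features
for _ in range(10):
    optimizer.zero_grad()
    loss = loss_fn(model(features), features)  # reconstruct the clean input
    loss.backward()
    optimizer.step()
```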

    Creativity and the Brain

    A neurocognitive approach to higher cognitive functions, bridging the gap between the psychological and neural levels of description, is introduced. Relevant facts about the brain, working memory, and the representation of symbols in the brain are summarized. Putative brain processes responsible for problem solving, intuition, skill learning, and automatization are described. The role of the non-dominant brain hemisphere in solving problems requiring insight is conjectured. Two factors seem to be essential for creativity: imagination constrained by experience, and filtering that selects the most interesting solutions. Experiments with paired-word association are analyzed in detail, and evidence for stochastic resonance effects is found. Brain activity during the invention of novel words is proposed as the simplest way to study creativity using experimental and computational means. Perspectives on computational models of creativity are discussed.
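
    Because the abstract appeals to stochastic resonance, a toy numerical demonstration may help: a subthreshold signal is detected best at an intermediate noise level, with detection degrading when noise is either too weak or too strong. The signal, threshold, and noise values below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Toy stochastic resonance demo: a subthreshold sine wave crosses a fixed
# detection threshold only when helped by a moderate amount of noise.
rng = np.random.default_rng(0)
t = np.linspace(0, 10, 5000)
signal = 0.8 * np.sin(2 * np.pi * t)   # peak 0.8, below threshold of 1.0
threshold = 1.0

for noise_std in (0.2, 0.5, 3.0):
    detections = (signal + rng.normal(0, noise_std, t.size)) > threshold
    # Correlate the detection train with the underlying signal.
    corr = np.corrcoef(detections.astype(float), signal)[0, 1]
    print(f"noise_std={noise_std:.1f}  detection-signal correlation={corr:.3f}")
```

    Running this typically shows the detection-signal correlation peaking at the middle noise level, which is the signature of stochastic resonance.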

    Embodied Robot Models for Interdisciplinary Emotion Research

    Due to their complex nature, emotions cannot be properly understood from the perspective of a single discipline. In this paper, I discuss how the use of robots as models is beneficial for interdisciplinary emotion research. Addressing this issue through the lens of my own research, I focus on a critical analysis of embodied robot models of different aspects of emotion, relate them to theories in psychology and neuroscience, and provide representative examples. I discuss concrete ways in which embodied robot models can be used to carry out interdisciplinary emotion research, assessing their contributions as hypothetical models and as operational models of specific emotional phenomena, of general emotion principles, and of specific emotion “dimensions”. I conclude by discussing the advantages of using embodied robot models over other models.

    The role of automaticity and attention in neural processes underlying empathy for happiness, sadness, and anxiety.

    Although many studies have examined the neural basis of empathy, relatively little is known about how empathic processes are affected by different attentional conditions. Thus, we examined whether instructions to empathize might amplify responses in empathy-related regions and whether cognitive load would diminish the involvement of these regions. Thirty-two participants completed a functional magnetic resonance imaging session assessing empathic responses to individuals experiencing happy, sad, and anxious events. Stimuli were presented under three conditions: watching naturally, actively empathizing, and under cognitive load. Across analyses, we found evidence for a core set of neural regions that support empathic processes (dorsomedial prefrontal cortex, DMPFC; medial prefrontal cortex, MPFC; temporoparietal junction, TPJ; amygdala; ventral anterior insula, AI; and septal area, SA). Two key regions, the ventral AI and SA, were consistently active across all attentional conditions, suggesting that they are automatically engaged during empathy. In addition, watching vs. empathizing with targets was not markedly different and instead led to similar subjective and neural responses to others' emotional experiences. In contrast, cognitive load reduced the subjective experience of empathy and diminished neural responses in several regions related to empathy and social cognition (DMPFC, MPFC, TPJ, and amygdala). The results reveal how attention impacts empathic processes and provide insight into how empathy may unfold in everyday interactions.

    Affective neuroscience, emotional regulation, and international relations

    International relations (IR) has witnessed an emerging interest in neuroscience, particularly for its relevance to a now widespread scholarship on emotions. Contributing to this scholarship, this article draws on the subfields of affective neuroscience and neuropsychology, which remain largely unexplored in IR. First, the article draws on affective neuroscience to illuminate affect's defining role in consciousness and its omnipresence in social behavior, challenging the continuing elision of emotions in mainstream approaches. Second, it applies theories of depth neuropsychology, which suggest a neural predisposition, originating in the brain's higher cortical regions, to attenuate emotional arousal and limit affective consciousness. This predisposition works to preserve individuals' self-coherence, countering implicit assumptions about rationality and motivation within IR theory. Third, it outlines three key implications for IR theory: affective neuroscience and neuropsychology offer a route towards deep theorizing of ontologies and motivations; they prompt a reassessment of the social regulation of emotions, particularly as observed in institutions, including the state; and they invite a productive engagement with constructivist and poststructuralist approaches by addressing the agency of the body in social relations. The article concludes by sketching the potential for a therapeutically attuned approach to IR.

    EMPATH: A Neural Network that Categorizes Facial Expressions

    There are two competing theories of facial expression recognition. Some researchers have suggested that it is an example of "categorical perception." In this view, expression categories are considered to be discrete entities with sharp boundaries, and discrimination of nearby pairs of expressive faces is enhanced near those boundaries. Other researchers, however, suggest that facial expression perception is more graded and that facial expressions are best thought of as points in a continuous, low-dimensional space, where, for instance, "surprise" expressions lie between "happiness" and "fear" expressions due to their perceptual similarity. In this article, we show that a simple yet biologically plausible neural network model, trained to classify facial expressions into six basic emotions, predicts data used to support both of these theories. Without any parameter tuning, the model matches a variety of psychological data on categorization, similarity, reaction times, discrimination, and recognition difficulty, both qualitatively and quantitatively. We thus explain many of the seemingly complex psychological phenomena related to facial expression perception as natural consequences of the task's implementation in the brain.
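
    For a concrete picture of such a model, the sketch below maps a face's feature vector to six basic-emotion outputs; the softmax vector gives the graded, continuous reading, while its argmax gives the discrete category. The feature dimensionality, layer sizes, and random stand-in input are assumptions; the published model's preprocessing and exact architecture are not reproduced here.

```python
import torch
import torch.nn as nn

# Hypothetical six-way facial-expression classifier in the spirit of the
# model described above, operating on precomputed face features.
EMOTIONS = ["happiness", "sadness", "fear", "anger", "surprise", "disgust"]

classifier = nn.Sequential(
    nn.Linear(512, 64),            # 512-dim face features: an assumed size
    nn.ReLU(),
    nn.Linear(64, len(EMOTIONS)),  # one output unit per basic emotion
)

features = torch.randn(1, 512)     # stand-in for one face's feature vector
probs = torch.softmax(classifier(features), dim=1)
# Graded outputs: the probability vector places the face in a continuous
# emotion space, while argmax yields a discrete category label.
print(dict(zip(EMOTIONS, probs.squeeze().tolist())))
print("category:", EMOTIONS[probs.argmax()])
```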