    Neural correlates of processing valence and arousal in affective words

    Psychological frameworks conceptualize emotion along two dimensions, "valence" and "arousal." Arousal describes a single axis of intensity increasing from neutral to maximally arousing. Valence can be described variously as a bipolar continuum, as independent positive and negative dimensions, or as hedonic value (distance from neutral). In this study, we used functional magnetic resonance imaging to characterize neural activity correlating with arousal and with distinct models of valence during presentation of affective word stimuli. Our results extend observations in the chemosensory domain suggesting a double dissociation in which subregions of orbitofrontal cortex process valence, whereas the amygdala preferentially processes arousal. In addition, our data support the physiological validity of descriptions of valence along independent axes or as absolute distance from neutral, but fail to support the validity of descriptions of valence along a bipolar continuum.

    Sensations of skin infestation linked to abnormal frontolimbic brain reactivity and differences in self-representation

    Some patients experience skin sensations of infestation and contamination that elude proximate dermatological explanation. We undertook a functional magnetic resonance imaging study of the brain to demonstrate, for the first time, that central processing of infestation-relevant stimuli is altered in patients with such abnormal skin sensations. We show differences in neural activity within the amygdala, insula, middle temporal lobe and frontal cortices. Patients also demonstrated altered measures of self-representation, with poorer sensitivity to internal bodily (interoceptive) signals and greater susceptibility to an illusion of body ownership, the rubber hand illusion. Together, these findings highlight a potential model for the maintenance of abnormal skin sensations, encompassing heightened threat processing within the amygdala, increased salience of skin representations within the insula, and compromised prefrontal capacity for self-regulation and appraisal.

    Affective Man-Machine Interface: Unveiling human emotions through biosignals

    As has been known for centuries, humans exhibit an electrical profile. This profile is altered through various psychological and physiological processes, which can be measured through biosignals, e.g., electromyography (EMG) and electrodermal activity (EDA). These biosignals can reveal our emotions and, as such, can serve as an advanced man-machine interface (MMI) for empathic consumer products. However, such an MMI requires the correct classification of biosignals into emotion classes. This chapter starts with an introduction to biosignals for emotion detection. Next, a state-of-the-art review of automatic emotion classification is presented, followed by guidelines for affective MMI. Subsequently, a study is presented that explores the use of EDA and three facial EMG signals to determine neutral, positive, negative, and mixed emotions, using recordings of 21 people. A range of techniques is tested, resulting in a generic framework for automated emotion classification with up to 61.31% correct classification of the four emotion classes, without the need for personal profiles. Among various other directives for future research, the results emphasize the need for parallel processing of multiple biosignals.
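
    The abstract does not specify which classifier, features, or validation scheme produced the 61.31% figure; the Python sketch below is only an illustration of what a subject-independent ("generic") four-class emotion classifier over EDA and facial EMG features could look like. The feature matrix, labels, and subject identifiers are placeholders, not the authors' data.

        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC
        from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

        # Hypothetical feature matrix: one row per trial, columns are summary
        # statistics (e.g. mean level, variance) of EDA and three facial EMG channels.
        rng = np.random.default_rng(0)
        n_subjects, trials_per_subject, n_features = 21, 4, 8
        X = rng.normal(size=(n_subjects * trials_per_subject, n_features))
        y = rng.integers(0, 4, size=n_subjects * trials_per_subject)  # neutral/positive/negative/mixed
        subject_ids = np.repeat(np.arange(n_subjects), trials_per_subject)

        # "Generic" means no personal profile: leave-one-subject-out cross-validation
        # evaluates the classifier on subjects it has never seen during training.
        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
        scores = cross_val_score(clf, X, y, groups=subject_ids, cv=LeaveOneGroupOut())
        print(f"Mean accuracy across held-out subjects: {scores.mean():.2%}")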

    Model-based analyses: Promises, pitfalls, and example applications to the study of cognitive control

    We discuss a recent approach to investigating cognitive control that has the potential to address some of the challenges inherent in this endeavour. In a model-based approach, the researcher defines a formal, computational model that performs the task at hand and whose performance matches that of a research participant. The internal variables of such a model can then be taken as proxies for latent variables computed in the brain. We discuss the potential advantages and pitfalls of this approach for studying the neural underpinnings of cognitive control, and we make explicit the assumptions underlying the interpretation of data obtained with it.
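
    As a concrete illustration of this logic, the Python sketch below fits a simple reinforcement-learning model (a Rescorla-Wagner update with a softmax choice rule) to one participant's choices on a hypothetical two-armed bandit task, then replays the fitted model to obtain trial-by-trial prediction errors, the kind of internal variable that might serve as a proxy regressor for neural data. The task, model, and data are illustrative assumptions; the chapter itself is not tied to this particular model.

        import numpy as np
        from scipy.optimize import minimize

        # Placeholder behavioural data for one participant (not from the chapter).
        rng = np.random.default_rng(1)
        choices = rng.integers(0, 2, size=100)                # option chosen on each trial
        rewards = rng.integers(0, 2, size=100).astype(float)  # 1 = rewarded, 0 = not

        def negative_log_likelihood(params, choices, rewards):
            """Rescorla-Wagner learning with a softmax choice rule."""
            alpha, beta = params              # learning rate, inverse temperature
            q = np.zeros(2)                   # value estimates for the two options
            nll = 0.0
            for c, r in zip(choices, rewards):
                p = np.exp(beta * q) / np.exp(beta * q).sum()
                nll -= np.log(p[c] + 1e-12)
                q[c] += alpha * (r - q[c])    # prediction-error update
            return nll

        # Fit the model so that its behaviour matches the participant's choices.
        fit = minimize(negative_log_likelihood, x0=[0.3, 3.0],
                       args=(choices, rewards), bounds=[(0.01, 1.0), (0.1, 20.0)])
        alpha_hat, _ = fit.x

        # Replay the fitted model to extract trial-by-trial prediction errors:
        # these internal variables are candidate proxies for latent neural signals.
        q, prediction_errors = np.zeros(2), []
        for c, r in zip(choices, rewards):
            prediction_errors.append(r - q[c])
            q[c] += alpha_hat * (r - q[c])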

    From facial mimicry to emotional empathy: a role for norepinephrine?

    The tendency to mimic others' emotional facial expressions predicts empathy and may represent a physiological marker of psychopathy. Anatomical connectivity between the amygdala, cingulate motor cortex (M3, M4), and the facial nucleus demonstrates a potential neuroanatomical substrate for mimicry, though pharmacological influences on it are largely unknown. Norepinephrine modulation selectively impairs recognition of negative emotions, reflecting a potential role in processing empathy-eliciting facial expressions. We examined the effects of single doses of propranolol (a beta-adrenoceptor blocker) and reboxetine (a selective norepinephrine reuptake inhibitor) on automatic facial mimicry of sadness, anger, and happiness, and on the relationship between mimicry and empathy. Forty-five healthy volunteers were randomized to 40 mg propranolol or 4 mg reboxetine. Two hours after drug administration, subjects viewed and rated facial expressions of sadness, anger, and happiness while corrugator, zygomaticus, and mentalis EMG were recorded. Trait emotional empathy was measured using the Balanced Emotional Empathy Scale. EMG confirmed emotion-specific mimicry and the relationship between corrugator mimicry and empathy. Norepinephrine modulation did not alter mimicry of any expression or influence the relationship between mimicry and empathy. Corrugator but not zygomaticus mimicry predicted trait empathy, consistent with greater anatomical connectivity between the amygdala and M3, which codes upper facial muscle representations. Although it influences emotion perception, norepinephrine does not influence emotional facial mimicry or its relationship with trait empathy.

    Processing of observed pupil size modulates perception of sadness and predicts empathy

    Facial autonomic responses may contribute to emotional communication and reveal individual affective style. In this study, the authors examined how observed pupillary size modulates processing of facial expression, extending the finding that incidentally perceived pupils influence ratings of sad, but not happy, angry, or neutral, facial expressions. Healthy subjects rated the valence and arousal of photographs depicting facial muscular expressions of sadness, surprise, fear, and disgust. Pupil sizes within the stimuli were experimentally manipulated, and subjects were scored on an empathy questionnaire. Diminishing pupil size linearly enhanced intensity and valence judgments of sad expressions (but not of fear, surprise, or disgust). At debriefing, subjects were unaware of differences in pupil size across stimuli. These observations complement an earlier study showing that pupil size directly influences processing of sadness but not of other basic emotional facial expressions. Furthermore, across subjects, the degree to which pupil size influenced sadness processing correlated with individual differences in empathy score. Together, these data demonstrate a central role of sadness processing in empathetic emotion and highlight the salience of implicit autonomic signals in affective communication.

    Dynamic pupillary exchange engages brain regions encoding social salience

    Covert exchange of autonomic responses may shape social affective behavior, as observed in the mirroring of pupillary responses during sadness processing. We examined how, independent of facial emotional expression, dynamic coherence between one's own and another's pupil size modulates regional brain activity. Fourteen subjects viewed pairs of eye stimuli while undergoing fMRI. Using continuous pupillometry biofeedback, the size of the observed pupils was varied to correlate either positively or negatively with changes in the participants' own pupils. Viewing both static and dynamic stimuli activated the right fusiform gyrus. Observing dynamically changing pupils activated the superior temporal sulcus (STS) and amygdala, regions engaged by non-static and salient facial features. Discordance between observed and observers' pupillary changes enhanced activity within bilateral anterior insula, left amygdala, and anterior cingulate cortex. In contrast, processing positively correlated pupils enhanced activity within the left frontal operculum. Our findings suggest that pupillary signals are monitored continuously during social interactions and that incongruent changes activate brain regions involved in tracking motivational salience and attentionally meaningful information. Naturalistically, dynamic coherence in pupillary change follows fluctuations in ambient light; correspondingly, in social contexts, a discordant pupil response is likely to reflect divergence of dispositional state. Our data provide empirical evidence for an autonomically mediated extension of forward models of motor control into social interaction.