    An ERP Study of Responses to Emotional Facial Expressions: Morphing Effects on Early-Latency Valence Processing

    Early-latency theories of emotional processing state that at least coarse monitoring of the emotional valence (a pleasure-displeasure continuum) of facial expressions should be both rapid and highly automated (LeDoux, 1995; Russell, 1980). Research has largely substantiated early-latency differential processing of emotional versus non-emotional facial expressions; however, the effect of valence on early-latency processing of emotional facial expression remains unclear. In an effort to delineate the effects of valence on early-latency emotional facial expression processing, the current investigation compared ERP responses to positive (happy and surprise), neutral, and negative (afraid and sad) basic facial expression photographs as well as to positive (happy-surprise), neutral (afraid-surprise, happy-afraid, happy-sad, sad-surprise), and negative (sad-afraid) morph facial expression photographs during a valence-rating task. Morphing manipulations have been shown to decrease the familiarity of facial patterns and thus preclude any overlearned responses to specific facial codes. Accordingly, it was proposed that morph stimuli would disrupt more detailed emotional identification to reveal a valence response independent of a specific identifiable emotion (Balconi & Lucchiari, 2005; Schweinberger, Burton & Kelly, 1999). ERP results revealed early-latency differentiation between positive, neutral, and negative morph facial expressions approximately 108 milliseconds post-stimulus (P1) within the right electrode cluster; negative morph facial expressions continued to elicit significantly smaller ERP amplitudes than other valence categories approximately 164 milliseconds post-stimulus (N170). Consistent with previous imaging research on emotional facial expression processing, source localization revealed substantial dipole activation within regions of the mesolimbic dopamine system. Thus, these findings confirm rapid valence processing of facial expressions and suggest that negative valence processing may continue to modulate subsequent structural facial processing
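
    As an illustration of the component analysis described above, the following sketch averages epochs per valence category and extracts mean amplitudes in P1 (~108 ms) and N170 (~164 ms) windows over a right posterior electrode cluster using MNE-Python. The epochs file, condition labels, electrode picks, and window bounds are assumptions made for the example, not the authors' actual pipeline.

```python
# A minimal sketch, not the authors' pipeline: compare mean P1 (~108 ms) and
# N170 (~164 ms) amplitudes across valence categories over a right posterior
# electrode cluster. File name, condition labels, electrodes, and windows are assumed.
import mne

epochs = mne.read_epochs("valence_task-epo.fif")            # assumed preprocessed epochs
right_cluster = ["P8", "PO8", "P10"]                        # assumed right occipito-temporal electrodes
windows = {"P1": (0.088, 0.128), "N170": (0.144, 0.184)}    # 40 ms windows around the reported peaks

for condition in ["positive_morph", "neutral_morph", "negative_morph"]:  # assumed event labels
    evoked = epochs[condition].average().pick(right_cluster)
    for component, (tmin, tmax) in windows.items():
        mean_uv = evoked.copy().crop(tmin, tmax).data.mean() * 1e6       # mean amplitude in microvolts
        print(f"{condition:>15} {component}: {mean_uv:6.2f} µV")
```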

    The role of the amygdala in face perception and evaluation

    Faces are one of the most significant social stimuli, and the processes underlying face perception are at the intersection of cognition, affect, and motivation. Vision scientists have had tremendous success in mapping the regions for perceptual analysis of faces in posterior cortex. Based on evidence from (a) single-unit recording studies in monkeys and humans; (b) human functional localizer studies; and (c) meta-analyses of neuroimaging studies, I argue that faces automatically evoke responses not only in these regions but also in the amygdala. I also argue that (a) a key property of faces represented in the amygdala is their typicality; and (b) one of the functions of the amygdala is to bias attention to atypical faces, which are associated with higher uncertainty. This framework is consistent with a number of other amygdala findings not involving faces, suggesting a general account for the role of the amygdala in perception.

    Regional Brain Responses in Nulliparous Women to Emotional Infant Stimuli

    Infant cries and facial expressions influence social interactions and elicit caretaking behaviors from adults. Recent neuroimaging studies suggest that neural responses to infant stimuli involve brain regions that process rewards. However, these studies have yet to investigate individual differences in tendencies to engage or withdraw from motivationally relevant stimuli. To investigate this, we used event-related fMRI to scan 17 nulliparous women. Participants were presented with novel infant cries of two distress levels (low and high) and unknown infant faces of varying affect (happy, sad, and neutral) in a randomized, counter-balanced order. Brain activation was subsequently correlated with scores on the Behavioral Inhibition System/Behavioral Activation System scale. Infant cries activated bilateral superior and middle temporal gyri (STG and MTG) and precentral and postcentral gyri. Activation was greater in bilateral temporal cortices for low- relative to high-distress cries. Happy relative to neutral faces activated the ventral striatum, caudate, ventromedial prefrontal, and orbitofrontal cortices. Sad versus neutral faces activated the precuneus, cuneus, and posterior cingulate cortex, and behavioral activation drive correlated with occipital cortical activations in this contrast. Behavioral inhibition correlated with activation in the right STG for high- and low-distress cries relative to pink noise. Behavioral drive correlated inversely with putamen, caudate, and thalamic activations for the comparison of high-distress cries to pink noise. Reward-responsiveness correlated with activation in the left precentral gyrus during the perception of low-distress cries relative to pink noise. Our findings indicate that infant cry stimuli elicit activations in areas implicated in auditory processing and social cognition. Happy infant faces may be encoded as rewarding, whereas sad faces activate regions associated with empathic processing. Differences in motivational tendencies may modulate neural responses to infant cues
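
    The brain-behavior correlation step reported above can be sketched as follows, assuming per-subject contrast images and questionnaire scores are available; the file names, the putamen coordinate, and the use of nilearn and SciPy are illustrative assumptions rather than the study's actual pipeline.

```python
# A minimal sketch, assuming per-subject contrast images and BAS drive scores exist;
# the putamen coordinate, sphere radius, and file names are illustrative, not the study's.
import numpy as np
from scipy import stats
from nilearn.maskers import NiftiSpheresMasker

subjects = [f"sub-{i:02d}" for i in range(1, 18)]           # 17 nulliparous participants
bas_drive = np.loadtxt("bas_drive_scores.txt")              # assumed one score per subject, same order

# Mean contrast estimate (high-distress cry > pink noise) in a small putamen sphere.
masker = NiftiSpheresMasker(seeds=[(-24, 4, 2)], radius=6)  # assumed MNI coordinate
betas = np.array([
    masker.fit_transform(f"{sub}_highcry_gt_pinknoise_con.nii.gz").item()
    for sub in subjects
])

r, p = stats.pearsonr(bas_drive, betas)                     # an inverse correlation is expected per the abstract
print(f"BAS drive vs. putamen contrast: r = {r:.2f}, p = {p:.3f}")
```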

    Seeing fearful body expressions activates the fusiform cortex and amygdala

    Darwin's evolutionary approach to organisms' emotional states attributes a prominent role to expressions of emotion in whole-body actions. Researchers in social psychology [1, 2] and human development [3] have long emphasized the fact that emotional states are expressed through body movement, but cognitive neuroscientists have almost exclusively considered isolated facial expressions (for review, see [4]). Here we used high-field fMRI to determine the underlying neural mechanisms of perception of body expression of emotion. Subjects were presented with short blocks of body expressions of fear alternating with short blocks of emotionally neutral meaningful body gestures. All images had internal facial features blurred out to avoid confounds due to a face or facial expression. We show that exposure to body expressions of fear, as opposed to neutral body postures, activates the fusiform gyrus and the amygdala. The fact that these two areas have previously been associated with the processing of faces and facial expressions [5–8] suggests synergies between facial and body-action expressions of emotion. Our findings open a new area of investigation of the role of body expressions of emotion in adaptive behavior as well as the relation between processes of emotion recognition in the face and in the body.
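
    A minimal sketch of such a block-design contrast (fearful vs. neutral body postures) with a standard first-level GLM in nilearn is shown below; the paths, repetition time, smoothing, and condition names are assumptions, not the authors' analysis.

```python
# A minimal sketch of a fear-vs-neutral body contrast with a standard first-level GLM;
# paths, TR, smoothing, and condition names are assumptions, not the authors' pipeline.
import pandas as pd
from nilearn.glm.first_level import FirstLevelModel

events = pd.read_csv("body_blocks_events.tsv", sep="\t")    # assumed columns: onset, duration, trial_type
model = FirstLevelModel(t_r=2.0, hrf_model="spm", smoothing_fwhm=6.0)
model = model.fit("sub-01_bold.nii.gz", events=events)      # assumed single-run BOLD image

# Positive z-values where fearful bodies evoke stronger responses than neutral gestures
# (the fusiform gyrus and amygdala, in the reported findings).
z_map = model.compute_contrast("fear_body - neutral_body", output_type="z_score")
z_map.to_filename("fear_gt_neutral_zmap.nii.gz")
```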

    Neural Coding of Cooperative vs. Affective Human Interactions: 150 ms to Code the Action's Purpose

    The timing and neural processing underlying the understanding of social interactions were investigated by presenting scenes in which 2 people performed cooperative or affective actions. While the role of the human mirror neuron system (MNS) in understanding actions and intentions is widely accepted, little is known about the time course within which these aspects of visual information are automatically extracted. Event-Related Potentials were recorded in 35 university students perceiving 260 pictures of cooperative (e.g., 2 people dragging a box) or affective (e.g., 2 people smiling and holding hands) interactions. The action's goal was automatically discriminated at about 150–170 ms, as reflected by the occipito/temporal N170 response. The swLORETA inverse solution revealed the strongest sources in the right posterior cingulate cortex (CC) for affective actions and in the right pSTS for cooperative actions. A right-hemispheric asymmetry was found that involved the fusiform gyrus (BA37), the posterior CC, and the medial frontal gyrus (BA10/11) for the processing of affective interactions, particularly in the 155–175 ms time window. In a later time window (200–250 ms), the processing of cooperative interactions activated the left post-central gyrus (BA3), the left parahippocampal gyrus, the left superior frontal gyrus (BA10), as well as the right premotor cortex (BA6). Women showed a greater response discriminative of the action's goal than men at the P300 and anterior-negativity level (220–500 ms). These findings might be related to a greater responsiveness of the female vs. male MNS. In addition, the discriminative effect was bilateral in women and was smaller and left-sided in men. Evidence was provided that perceptually similar social interactions are discriminated on the basis of the agents' intentions quite early in neural processing, differentially activating regions devoted to face/body/action coding, the limbic system, and the MNS.
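
    The question of when the two interaction types first diverge can be illustrated with a cluster-based permutation test on posterior ERPs in MNE-Python; the epochs file, condition labels, and electrode picks below are assumptions made for the example.

```python
# A minimal sketch, assuming epoched EEG with 'cooperative' and 'affective' event labels:
# a cluster-based permutation test asks when the two conditions diverge over a right
# posterior cluster. File name and electrode picks are illustrative assumptions.
import mne
from mne.stats import permutation_cluster_test

epochs = mne.read_epochs("interactions-epo.fif").pick(["P8", "PO8", "P10"])

# Observations x times arrays, averaged over the selected electrodes, one per condition.
coop = epochs["cooperative"].get_data().mean(axis=1)
affect = epochs["affective"].get_data().mean(axis=1)

t_obs, clusters, cluster_pv, _ = permutation_cluster_test(
    [coop, affect], n_permutations=1000, tail=1, out_type="mask", seed=0
)

times = epochs.times
for mask, p in zip(clusters, cluster_pv):
    if p < 0.05:                                            # report significant temporal clusters
        print(f"Conditions differ from {times[mask].min() * 1000:.0f} "
              f"to {times[mask].max() * 1000:.0f} ms (p = {p:.3f})")
```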

    Facial identity and emotional expression as predictors during economic decisions

    Two sources of information most relevant to guide social decision making are the cooperative tendencies associated with different people and their facial emotional displays. This electrophysiological experiment aimed to study how the use of personal identity and emotional expressions as cues impacts different stages of face processing and their potential isolated or interactive processing. Participants played a modified trust game with 8 different alleged partners, and in separate blocks either the identity or the emotions carried information regarding potential trial outcomes (win or loss). Behaviorally, participants were faster to make decisions based on identity than on emotional expressions. Also, ignored (nonpredictive) emotions interfered with decisions based on identity in trials where these sources of information conflicted. Electrophysiological results showed that expectations based on emotions modulated processing earlier in time than those based on identity. Whereas emotion modulated the central N1 and VPP potentials, identity judgments heightened the amplitude of the N2 and P3b. In addition, the conflict generated by ignored emotions was reflected in the N170 and P3b potentials. Overall, our results indicate that using identity or emotional cues to predict cooperation tendencies recruits dissociable neural circuits from an early point in time, and that both sources of information generate early and late interactive patterns.

    Face Coding Is Bilateral in the Female Brain

    Background: It is currently believed that face processing predominantly activates the right hemisphere in humans, but the available literature is very inconsistent. Methodology/Principal Findings: In this study, ERPs were recorded in 50 right-handed women and men in response to 390 faces (of different age and sex), and 130 technological objects. Results showed no sex difference in the amplitude of N170 to objects; a much larger face-specific response over the right hemisphere in men and a bilateral response in women; a lack of face-age coding effect over the left hemisphere in men, with no differences in N170 to faces as a function of age; and a significant bilateral face-age coding effect in women. Conclusions/Significance: LORETA reconstruction showed a significant left and right asymmetry in the activation of the fusiform gyrus (BA19) in women and men, respectively. The present data reveal a lesser degree of lateralization of brain functions related to face coding in women than in men. In this light, they may provide an explanation of the inconsistencies in the available literature concerning the asymmetric activity of left and right occipito-temporal cortices devoted to face processing.
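
    The lateralization finding can be quantified per participant with a simple index contrasting left- and right-hemisphere N170 amplitudes, as in the MNE-Python sketch below; the file names, electrode pairs, and analysis window are assumptions made for illustration.

```python
# A minimal sketch, assuming per-subject evoked responses to faces are stored on disk:
# an N170 lateralization index, LI = (|right| - |left|) / (|right| + |left|), computed
# from assumed left/right posterior electrode pairs. File names are illustrative.
import numpy as np
import mne

def n170_amplitude(evoked, channels, tmin=0.15, tmax=0.19):
    """Mean amplitude (µV) over the given electrodes in an assumed N170 window."""
    return evoked.copy().pick(channels).crop(tmin, tmax).data.mean() * 1e6

lateralization = []
for sub in [f"sub-{i:02d}" for i in range(1, 51)]:          # 50 participants in the study
    evoked = mne.read_evokeds(f"{sub}_faces-ave.fif", condition="faces")
    left = abs(n170_amplitude(evoked, ["P7", "PO7"]))
    right = abs(n170_amplitude(evoked, ["P8", "PO8"]))
    lateralization.append((right - left) / (right + left))  # > 0 means right-lateralized

print(f"Mean N170 lateralization index: {np.mean(lateralization):.2f}")
```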

    A multimodal investigation of dynamic face perception using functional magnetic resonance imaging and magnetoencephalography

    Motion is an important aspect of face perception that has been largely neglected to date. Many of the established findings are based on studies that use static facial images, which do not reflect the unique temporal dynamics available from seeing a moving face. In the present thesis, a set of naturalistic dynamic facial emotional expressions was purposely created and used to investigate the neural structures involved in the perception of dynamic facial expressions of emotion, with both functional Magnetic Resonance Imaging (fMRI) and Magnetoencephalography (MEG). Through fMRI and connectivity analysis, a dynamic face perception network was identified that extends the distributed neural system for face perception (Haxby et al., 2000). Measures of effective connectivity between these regions revealed that dynamic facial stimuli were associated with specific increases in connectivity between early visual regions, such as inferior occipital gyri and superior temporal sulci, along with coupling between superior temporal sulci and amygdalae, as well as with inferior frontal gyri. MEG and Synthetic Aperture Magnetometry (SAM) were used to examine the spatiotemporal profile of neurophysiological activity within this dynamic face perception network. SAM analysis revealed a number of regions in the distributed face network showing differential activation to dynamic versus static faces, characterised by decreases in cortical oscillatory power in the beta band, which were spatially coincident with the regions previously identified with fMRI. These findings support the presence of a distributed network of cortical regions that mediate the perception of dynamic facial expressions, with the fMRI data providing the spatial coordinates and the MEG data indicating the temporal dynamics within this network. This integrated multimodal approach offers excellent spatial and temporal resolution, thereby providing an opportunity to explore dynamic brain activity and connectivity during face processing.
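
    As a rough stand-in for the SAM analysis, which MNE-Python does not implement, the sketch below contrasts beta-band (13-30 Hz) source power for dynamic versus static faces with a DICS beamformer; the epochs file, forward model, and parameters are assumptions made for illustration.

```python
# A minimal sketch: SAM is not available in MNE-Python, so a DICS beamformer on
# beta-band (13-30 Hz) power serves as a rough stand-in for the dynamic-vs-static
# contrast. Epochs file, forward model, and parameters are illustrative assumptions.
import mne
from mne.time_frequency import csd_multitaper
from mne.beamformer import make_dics, apply_dics_csd

epochs = mne.read_epochs("faces_task-epo.fif")              # assumed MEG epochs: 'dynamic', 'static'
fwd = mne.read_forward_solution("faces-fwd.fif")            # assumed forward solution

# Cross-spectral density in the beta band: all trials for the common filters,
# then each condition separately for the power estimates.
csd_all = csd_multitaper(epochs, fmin=13, fmax=30)
csd_dyn = csd_multitaper(epochs["dynamic"], fmin=13, fmax=30)
csd_sta = csd_multitaper(epochs["static"], fmin=13, fmax=30)

filters = make_dics(epochs.info, fwd, csd_all.mean(), reg=0.05)
power_dyn, _ = apply_dics_csd(csd_dyn.mean(), filters)
power_sta, _ = apply_dics_csd(csd_sta.mean(), filters)

# Negative values mark sources where beta power drops for dynamic relative to static
# faces, the direction of effect described in the thesis.
beta_change = (power_dyn.data - power_sta.data) / power_sta.data
```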