
    Superior Facial Expression, But Not Identity Recognition, in Mirror-Touch Synesthesia

    Simulation models of expression recognition contend that to understand another's facial expressions, individuals map the perceived expression onto the same sensorimotor representations that are active during the experience of the perceived emotion. To investigate this view, the present study examines facial expression and identity recognition abilities in a rare group of participants who show facilitated sensorimotor simulation (mirror-touch synesthetes). Mirror-touch synesthetes experience touch on their own body when observing touch to another person. These experiences have been linked to heightened sensorimotor simulation in the shared-touch network (brain regions active during the passive observation and experience of touch). Mirror-touch synesthetes outperformed nonsynesthetic participants on measures of facial expression recognition, but not on control measures of face memory or facial identity perception. These findings imply a role for sensorimotor simulation processes in the recognition of facial affect, but not facial identity.

    Suppressing sensorimotor activity modulates the discrimination of auditory emotions but not speaker identity

    Our ability to recognize the emotions of others is a crucial feature of human social cognition. Functional neuroimaging studies indicate that activity in sensorimotor cortices is evoked during the perception of emotion. In the visual domain, right somatosensory cortex activity has been shown to be critical for facial emotion recognition. However, the importance of sensorimotor representations in modalities outside of vision remains unknown. Here we use continuous theta-burst transcranial magnetic stimulation (cTBS) to investigate whether neural activity in the right postcentral gyrus (rPoG) and right lateral premotor cortex (rPM) is involved in nonverbal auditory emotion recognition. Three groups of participants completed same-different tasks on auditory stimuli, discriminating between the emotion expressed and the speakers' identities, before and following cTBS targeted at rPoG, rPM, or the vertex (control site). A task-selective deficit in auditory emotion discrimination was observed. Stimulation to rPoG and rPM resulted in a disruption of participants' abilities to discriminate emotion, but not identity, from vocal signals. These findings suggest that sensorimotor activity may be a modality-independent mechanism which aids emotion discrimination. Copyright © 2010 the authors.

    Beta event-related desynchronization as an index of individual differences in processing human facial expression: further investigations of autistic traits in typically developing adults

    The human mirror neuron system (hMNS) has been associated with various forms of social cognition and affective processing, including vicarious experience. It has also been proposed that a faulty hMNS may underlie some of the deficits seen in the autism spectrum disorders (ASDs). In the present study we set out to investigate whether emotional facial expressions could modulate a putative EEG index of hMNS activation (mu suppression) and, if so, whether this would differ according to the individual level of autistic traits [high versus low Autism Spectrum Quotient (AQ) score]. Participants were presented with 3 s films of actors opening and closing their hands (classic hMNS mu-suppression protocol) while simultaneously wearing happy, angry, or neutral expressions. Mu-suppression was measured in the alpha and low beta bands. The low AQ group displayed greater low beta event-related desynchronization (ERD) to both angry and neutral expressions. The high AQ group displayed greater low beta ERD to angry than to happy expressions. There was also significantly more low beta ERD to happy faces for the low than for the high AQ group. In conclusion, an interesting interaction between AQ group and emotional expression revealed that hMNS activation can be modulated by emotional facial expressions and that this is differentiated according to individual differences in the level of autistic traits. The EEG index of hMNS activation (mu suppression) seems to be a sensitive measure of the variability in facial processing in typically developing individuals with high and low self-reported traits of autism.
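    The ERD measure used above has a simple computational core. As an illustrative sketch on simulated data (not the study's actual recording or preprocessing pipeline; the sampling rate, window length, and band limits here are assumptions), low-beta ERD can be computed as the proportional drop in band power from a baseline window to an event window:

```python
import numpy as np

rng = np.random.default_rng(2)
fs = 250                          # sampling rate in Hz (hypothetical)
t = np.arange(0, 3.0, 1 / fs)     # one 3 s epoch, as in the films above

def band_power(x, lo, hi):
    """Total periodogram power between lo and hi Hz."""
    spec = np.abs(np.fft.rfft(x)) ** 2 / len(x)
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    return spec[(freqs >= lo) & (freqs <= hi)].sum()

# Simulated sensorimotor signal: a 20 Hz (low beta) oscillation that is
# attenuated during observation of movement relative to baseline.
baseline = np.sin(2 * np.pi * 20 * t) + 0.5 * rng.normal(size=t.size)
event = 0.5 * np.sin(2 * np.pi * 20 * t) + 0.5 * rng.normal(size=t.size)

# ERD% = (baseline power - event power) / baseline power * 100;
# a positive value indexes desynchronization (power suppression).
p_base = band_power(baseline, 13, 30)
p_event = band_power(event, 13, 30)
erd = (p_base - p_event) / p_base * 100
print(round(erd, 1))
```

    In real EEG analyses the periodogram would typically be replaced by an averaged, windowed spectral estimate over many trials, but the ERD ratio itself is computed the same way.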

    I feel your fear: shared touch between faces facilitates recognition of fearful facial expressions

    Embodied simulation accounts of emotion recognition claim that we vicariously activate somatosensory representations to simulate, and eventually understand, how others feel. Interestingly, mirror-touch synaesthetes, who experience touch when observing others being touched, show both enhanced somatosensory simulation and superior recognition of emotional facial expressions. We employed synchronous visuotactile stimulation to experimentally induce a similar experience of ‘mirror touch’ in non-synesthetic participants. Seeing someone else’s face being touched at the same time as one’s own face results in the ‘enfacement illusion’, which has been previously shown to blur self-other boundaries. We demonstrate that the enfacement illusion also facilitates emotion recognition, and, importantly, this facilitatory effect is specific to fearful facial expressions. Shared synchronous multisensory experiences may experimentally facilitate somatosensory simulation mechanisms involved in the recognition of fearful emotional expressions.

    Impaired recognition and regulation of disgust is associated with distinct but partially overlapping patterns of decreased gray matter volume in the ventroanterior insula

    Background: The ventroanterior insula is implicated in the experience, expression, and recognition of disgust; however, whether this brain region is required for recognizing disgust or regulating disgusting behaviors remains unknown. Methods: We examined the brain correlates of the presence of disgusting behavior and impaired recognition of disgust using voxel-based morphometry in a sample of 305 patients with heterogeneous patterns of neurodegeneration. Permutation-based analyses were used to determine regions of decreased gray matter volume at a significance level of p ≤ .05, corrected for family-wise error across the whole brain and within the insula. Results: Patients with behavioral variant frontotemporal dementia and semantic variant primary progressive aphasia were most likely to exhibit disgusting behaviors and were, on average, the most impaired at recognizing disgust in others. Imaging analysis revealed that patients who exhibited disgusting behaviors had significantly less gray matter volume bilaterally in the ventral anterior insula. A region of interest analysis restricted to behavioral variant frontotemporal dementia and semantic variant primary progressive aphasia patients alone confirmed this result. Moreover, impaired recognition of disgust was associated with decreased gray matter volume in the bilateral ventroanterior and ventral middle regions of the insula. There was an area of overlap in the bilateral anterior insula where decreased gray matter volume was associated with both the presence of disgusting behavior and impairments in recognizing disgust. Conclusions: These findings suggest that regulating disgusting behaviors and recognizing disgust in others involve two partially overlapping neural systems within the insula. Moreover, the ventral anterior insula is required for both processes.
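    The permutation-based family-wise-error correction named in the Methods has a standard form: the null distribution of the maximum statistic across all voxels is built by repeatedly shuffling group labels, and each voxel is compared against that single distribution. The sketch below illustrates the principle on simulated data (group sizes, effect size, and voxel count are arbitrary assumptions, not the study's actual pipeline, which operates on whole-brain gray matter maps):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated gray matter "volumes" at 50 voxels for two groups; the
# patient group has a true reduction at the first 5 voxels.
n_vox = 50
controls = rng.normal(1.0, 0.1, size=(30, n_vox))
patients = rng.normal(1.0, 0.1, size=(30, n_vox))
patients[:, :5] -= 0.3

def group_diff(a, b):
    """Mean group difference at every voxel."""
    return a.mean(axis=0) - b.mean(axis=0)

observed = group_diff(controls, patients)

# Build the permutation null of the MAXIMUM |difference| across voxels:
# comparing every voxel against this one distribution controls the
# family-wise error rate over all voxels simultaneously.
pooled = np.vstack([controls, patients])
n_a = controls.shape[0]
max_null = np.empty(2000)
for i in range(2000):
    perm = rng.permutation(pooled)         # shuffle group labels
    max_null[i] = np.abs(group_diff(perm[:n_a], perm[n_a:])).max()

# Voxel-wise FWE-corrected p-values (with the usual +1 correction).
exceed = (max_null[None, :] >= np.abs(observed)[:, None]).sum(axis=1)
p_fwe = (exceed + 1) / (max_null.size + 1)
sig = p_fwe <= 0.05
print(int(sig[:5].sum()), int(sig[5:].sum()))
```

    Restricting the same procedure to a mask (e.g. the insula) simply shrinks the voxel set over which the maximum is taken, which is why small-volume analyses gain sensitivity.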

    Spatially generalizable representations of facial expressions: Decoding across partial face samples

    A network of cortical and sub-cortical regions is known to be important in the processing of facial expression. However, to date no study has investigated whether representations of facial expressions present in this network permit generalization across independent samples of face information (e.g. eye region vs. mouth region). We presented participants with partial face samples of five expression categories in a rapid event-related fMRI experiment. We reveal a network of face-sensitive regions that contain information about facial expression categories regardless of which part of the face is presented. We further reveal that the neural information present in a subset of these regions: dorsal prefrontal cortex (dPFC), superior temporal sulcus (STS), lateral occipital and ventral temporal cortex, and even early visual cortex, enables reliable generalization across independent visual inputs (faces depicting the 'eyes only' versus 'eyes removed'). Furthermore, classification performance was correlated to behavioral performance in STS and dPFC. Our results demonstrate that both higher (e.g. STS, dPFC) and lower level cortical regions contain information useful for facial expression decoding that goes beyond the visual information presented, and implicate a key role for contextual mechanisms such as cortical feedback in facial expression perception under challenging conditions of visual occlusion.
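    The cross-decoding logic described here can be stated compactly: a classifier fit on patterns from one stimulus condition is tested on patterns from the other, so above-chance accuracy implies a representation shared across the two inputs. A minimal sketch on simulated voxel patterns (all sizes and noise levels are invented for illustration; the study used fMRI response patterns, not a nearest-centroid rule specifically):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical simulation: voxel patterns for five expression
# categories under two independent conditions ("eyes only" vs
# "eyes removed"). A shared category code is embedded in both;
# each condition also adds its own constant offset.
n_cat, n_trials, n_vox = 5, 40, 100
category_patterns = rng.normal(0, 1, size=(n_cat, n_vox))

def simulate(condition_offset):
    X = np.vstack([category_patterns[c] + condition_offset
                   + rng.normal(0, 2.0, size=(n_trials, n_vox))
                   for c in range(n_cat)])
    y = np.repeat(np.arange(n_cat), n_trials)
    return X, y

X_eyes, y_eyes = simulate(rng.normal(0, 0.5, n_vox))   # "eyes only"
X_rm, y_rm = simulate(rng.normal(0, 0.5, n_vox))       # "eyes removed"

# Cross-decoding: fit class centroids on one condition, classify
# trials from the other by nearest centroid. Accuracy above chance
# (0.2 for five categories) means the category code generalizes
# across independent visual inputs.
centroids = np.vstack([X_eyes[y_eyes == c].mean(axis=0)
                       for c in range(n_cat)])
dists = ((X_rm[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
acc = (dists.argmin(axis=1) == y_rm).mean()
print(round(acc, 3))
```

    Training and testing within a single condition would not distinguish a shared code from a condition-specific one; the train/test split across conditions is what licenses the generalization claim.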

    Emotional Complexity and the Neural Representation of Emotion in Motion

    According to theories of emotional complexity, individuals low in emotional complexity encode and represent emotions in visceral or action-oriented terms, whereas individuals high in emotional complexity encode and represent emotions in a differentiated way, using multiple emotion concepts. During functional magnetic resonance imaging, participants viewed valenced animated scenarios of simple ball-like figures, attending either to social or spatial aspects of the interactions. Participants’ emotional complexity was assessed using the Levels of Emotional Awareness Scale. We found that a distributed set of brain regions previously implicated in processing emotion from facial, vocal, and bodily cues, in processing social intentions, and in emotional response was sensitive to emotion conveyed by motion alone. Attention to social meaning amplified the influence of emotion in a subset of these regions. Critically, increased emotional complexity correlated with enhanced processing in a left temporal polar region implicated in detailed semantic knowledge; with a diminished effect of social attention; and with increased differentiation of brain activity between films of differing valence. Decreased emotional complexity was associated with increased activity in regions of pre-motor cortex. Thus, neural coding of emotion in semantic vs. action systems varies as a function of emotional complexity, helping reconcile puzzling inconsistencies in neuropsychological investigations of emotion recognition.

    Compensatory premotor activity during affective face processing in subclinical carriers of a single mutant Parkin allele

    Patients with Parkinson's disease suffer from significant motor impairments and accompanying cognitive and affective dysfunction due to progressive disturbances of basal ganglia–cortical gating loops. Parkinson's disease has a long presymptomatic stage, which indicates a substantial capacity of the human brain to compensate for dopaminergic nerve degeneration before clinical manifestation of the disease. Neuroimaging studies provide evidence that increased motor-related cortical activity can compensate for progressive dopaminergic nerve degeneration in carriers of a single mutant Parkin or PINK1 gene, who show a mild but significant reduction of dopamine metabolism in the basal ganglia in the complete absence of clinical motor signs. However, it is currently unknown whether similar compensatory mechanisms are effective in non-motor basal ganglia–cortical gating loops. Here, we ask whether asymptomatic Parkin mutation carriers show altered patterns of brain activity during processing of facial gestures, and whether this might compensate for latent facial emotion recognition deficits. Current theories in social neuroscience assume that execution and perception of facial gestures are linked by a special class of visuomotor neurons (‘mirror neurons’) in the ventrolateral premotor cortex/pars opercularis of the inferior frontal gyrus (Brodmann area 44/6). We hypothesized that asymptomatic Parkin mutation carriers would show increased activity in this area during processing of affective facial gestures, replicating the compensatory motor effects that have previously been observed in these individuals. Additionally, Parkin mutation carriers might show altered activity in other basal ganglia–cortical gating loops. Eight asymptomatic heterozygous Parkin mutation carriers and eight matched controls underwent functional magnetic resonance imaging and a subsequent facial emotion recognition task. 
As predicted, Parkin mutation carriers showed significantly stronger activity in the right ventrolateral premotor cortex during execution and perception of affective facial gestures than healthy controls. Furthermore, Parkin mutation carriers showed a slightly reduced ability to recognize facial emotions that was least severe in individuals who showed the strongest increase of ventrolateral premotor activity. In addition, Parkin mutation carriers showed a significantly weaker-than-normal increase of activity in the left lateral orbitofrontal cortex (inferior frontal gyrus pars orbitalis, Brodmann area 47), which was unrelated to facial emotion recognition ability. These findings are consistent with the hypothesis that compensatory activity in the ventrolateral premotor cortex during processing of affective facial gestures can reduce impairments in facial emotion recognition in subclinical Parkin mutation carriers. A breakdown of this compensatory mechanism might lead to the impairment of facial expressivity and facial emotion recognition observed in manifest Parkinson's disease.

    Affective resonance in response to others' emotional faces varies with affective ratings and psychopathic traits in amygdala and anterior insula

    Despite extensive research on the neural basis of empathic responses for pain and disgust, there are limited data about the brain regions that underpin affective response to other people's emotional facial expressions. Here, we addressed this question using event-related functional magnetic resonance imaging to assess neural responses to emotional faces, combined with online ratings of subjective state. When instructed to rate their own affective response to others' faces, participants recruited anterior insula, dorsal anterior cingulate, inferior frontal gyrus, and amygdala, regions consistently implicated in studies investigating empathy for disgust and pain, as well as emotional saliency. Importantly, responses in anterior insula and amygdala were modulated by trial-by-trial variations in subjective affective responses to the emotional facial stimuli. Furthermore, overall task-elicited activations in these regions were negatively associated with psychopathic personality traits, which are characterized by low affective empathy. Our findings suggest that anterior insula and amygdala play important roles in the generation of affective internal states in response to others' emotional cues and that attenuated function in these regions may underlie reduced empathy in individuals with high levels of psychopathic traits.