22 research outputs found

    Recruitment of Language-, Emotion- and Speech-Timing Associated Brain Regions for Expressing Emotional Prosody: Investigation of Functional Neuroanatomy with fMRI

    Get PDF
    We aimed to progress understanding of prosodic emotion expression by establishing brain regions active when expressing specific emotions, those activated irrespective of the target emotion, and those whose activation intensity varied depending on individual performance. BOLD contrast data were acquired whilst participants spoke nonsense words in happy, angry or neutral tones, or performed jaw movements. Emotion-specific analyses demonstrated that when expressing angry prosody, activated brain regions included the inferior frontal and superior temporal gyri, the insula, and the basal ganglia. When expressing happy prosody, the activated brain regions also included the superior temporal gyrus, insula, and basal ganglia, with additional activation in the anterior cingulate. Conjunction analysis confirmed that the superior temporal gyrus and basal ganglia were activated regardless of the specific emotion concerned. Nevertheless, disjunctive comparisons between the expression of angry and happy prosody established that anterior cingulate activity was significantly higher for angry prosody than for happy prosody production. The degree of inferior frontal gyrus activity correlated with the ability to express the target emotion through prosody. We conclude that expressing prosodic emotions (vs. neutral intonation) requires brain regions generically involved in comprehending numerous aspects of language, in emotion-related processes such as experiencing emotions, and in the time-critical integration of speech information.
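
    As an illustration of the conjunction logic mentioned above, the Python sketch below applies a minimum-statistic conjunction (a test of the conjunction null) to two simulated voxel-wise t-maps for the angry-vs-neutral and happy-vs-neutral contrasts. The array names, degrees of freedom and threshold are hypothetical; the study's actual analysis pipeline may differ.

        import numpy as np
        from scipy import stats

        def conjunction_min_t(t_map_a, t_map_b, df, alpha=0.001):
            # A voxel survives only if BOTH contrasts exceed the critical t
            # (minimum-statistic test of the conjunction null).
            t_crit = stats.t.ppf(1.0 - alpha, df)
            min_t = np.minimum(t_map_a, t_map_b)
            return min_t, min_t > t_crit

        # Simulated one-sided t-maps for the two emotion-vs-neutral contrasts
        rng = np.random.default_rng(0)
        angry_vs_neutral_t = rng.normal(1.0, 1.5, 10_000)
        happy_vs_neutral_t = rng.normal(1.0, 1.5, 10_000)
        _, conj_mask = conjunction_min_t(angry_vs_neutral_t, happy_vs_neutral_t, df=20)
        print(f"{conj_mask.sum()} voxels active for both emotions vs. neutral")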

    Psychosocial predictors for the success of lumbar disc surgery

    No full text

    Effects of intensity of facial expressions on amygdalar activation independently of valence

    Get PDF
    For several stimulus categories (e.g., pictures, odors and words), the arousal of both negative and positive stimuli has been shown to modulate amygdalar activation. In contrast, previous studies did not observe similar amygdalar effects in response to negative and positive facial expressions of varying intensity. Reasons for this discrepancy may relate to analytical strategies, experimental design and stimuli. Therefore, the present study aimed to re-investigate whether the intensity of facial expressions modulates amygdalar activation, while circumventing the limitations of previous research. Event-related functional magnetic resonance imaging (fMRI) was used to assess brain activation while participants observed a static neutral expression and positive (happy) and negative (angry) expressions of either high or low intensity from an ecologically valid, novel stimulus set. Ratings of arousal and intensity were highly correlated. We found that amygdalar activation followed a U-shaped pattern, with the highest activation for high-intensity facial expressions compared with low-intensity and neutral expressions irrespective of valence, suggesting a critical role of the amygdala in valence-independent arousal processing of facial expressions. Additionally, consistent with previous studies, intensity effects were also found in visual areas, and generally increased activation to angry vs. happy faces was found in visual cortex and insula, indicating enhanced visual representations of highly arousing facial expressions and increased visual and somatosensory representations of threat.
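
    To make the analysis logic concrete, the sketch below illustrates, on simulated data, the two checks described above: correlating arousal and intensity ratings, and testing the intensity effect on amygdala responses with valence collapsed. All variable names and values are hypothetical assumptions, not the authors' pipeline.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        n_subjects = 24  # hypothetical sample size

        # Simulated per-subject amygdala betas for the four emotional conditions
        beta_happy_high = rng.normal(0.6, 0.3, n_subjects)
        beta_happy_low = rng.normal(0.3, 0.3, n_subjects)
        beta_angry_high = rng.normal(0.6, 0.3, n_subjects)
        beta_angry_low = rng.normal(0.3, 0.3, n_subjects)

        # Intensity effect irrespective of valence: collapse over valence,
        # then compare high vs. low intensity with a paired t-test
        high = (beta_happy_high + beta_angry_high) / 2
        low = (beta_happy_low + beta_angry_low) / 2
        t, p = stats.ttest_rel(high, low)
        print(f"high vs. low intensity: t = {t:.2f}, p = {p:.4f}")

        # Arousal-intensity relationship across a simulated stimulus set
        intensity = rng.uniform(1, 7, 80)
        arousal = intensity + rng.normal(0, 0.5, 80)
        r, p_r = stats.pearsonr(intensity, arousal)
        print(f"arousal-intensity correlation: r = {r:.2f}, p = {p_r:.4f}")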

    Stimulus arousal drives amygdalar responses to emotional expressions across sensory modalities

    Full text link
    The factors that drive amygdalar responses to emotionally significant stimuli are still a matter of debate; in particular, the proneness of the amygdala to respond to negatively valenced stimuli remains controversial. Furthermore, it is uncertain whether the amygdala responds in a modality-general fashion or whether modality-specific idiosyncrasies exist. Therefore, the present functional magnetic resonance imaging (fMRI) study systematically investigated amygdalar responses to the valence and arousal of emotional expressions across the visual and auditory modalities. During scanning, participants performed a gender judgment task while prosodic and facial emotional expressions were presented. The stimuli varied in valence and arousal by including neutral, happy and angry expressions of high and low emotional intensity. The results demonstrate amygdalar activation as a function of stimulus arousal, and of the associated emotional intensity, regardless of stimulus valence. Furthermore, arousal-driven amygdalar responding did not depend on whether the emotional expressions were visual or auditory. Thus, the current results are consistent with the notion that the amygdala codes general stimulus relevance across visual and auditory modalities irrespective of valence. In addition, whole-brain analyses revealed that effects in visual and auditory areas were driven mainly by high-intensity emotional facial and vocal stimuli, respectively, suggesting modality-specific representations of emotional expressions in auditory and visual cortices.
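
    One simple way to probe whether an arousal effect depends on modality, as asked above, is to test the arousal-by-modality interaction in a regression; the sketch below does this on simulated trial data. The pooled-trial model, variable names and numbers are illustrative assumptions only, not the study's actual analysis.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(2)
        n = 200  # simulated trials pooled for illustration only

        df = pd.DataFrame({
            "arousal": rng.uniform(1, 9, n),
            "modality": rng.choice(["face", "voice"], n),
            "valence": rng.choice(["happy", "angry", "neutral"], n),
        })
        # Simulate an arousal-driven amygdala response with no modality dependence
        df["amygdala"] = 0.1 * df["arousal"] + rng.normal(0, 0.3, n)

        # A non-significant arousal x modality interaction is consistent with
        # modality-general arousal coding in the amygdala
        model = smf.ols("amygdala ~ arousal * C(modality) + C(valence)", data=df).fit()
        print(model.summary().tables[1])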

    Functional Evidence for a Dual Route to Amygdala

    No full text
    The amygdala plays a central role in evaluating the behavioral importance of sensory information. Anatomical subcortical pathways provide direct input to the amygdala from early sensory systems and may support an adaptively valuable rapid appraisal of salient information [1–3]. However, the functional significance of these subcortical inputs remains controversial [4]. We recorded magnetoencephalographic activity evoked by tones in the context of emotionally valent faces and tested two competing, biologically motivated dynamic causal models [5, 6] against these data: the dual and cortical models. The dual model comprised two parallel (cortical and subcortical) routes to the amygdala, whereas the cortical model excluded the subcortical path. We found that neuronal responses elicited by salient information were better explained when the subcortical pathway was included. In keeping with its putative functional role in rapid stimulus appraisal, the subcortical pathway was most important early in stimulus processing. However, its action was not limited, as is often assumed, to the context of fear, pointing to a more widespread information-processing role. Thus, our data support the idea that an expedited evaluation of sensory input is best explained by an architecture that includes a subcortical path to the amygdala.
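
    In dynamic causal modelling, competing models are typically compared through their (approximate log) model evidences. The sketch below shows only that comparison step, with made-up log-evidence values for the two architectures named above; fitting the actual MEG models would be done with dedicated tools (e.g., SPM's DCM routines) and is not reproduced here.

        import numpy as np

        # Hypothetical (made-up) log model evidences for the two architectures
        log_evidence = {"dual": -1234.0, "cortical": -1241.5}

        # Log Bayes factor in favour of the dual (cortical + subcortical) model
        log_bf = log_evidence["dual"] - log_evidence["cortical"]
        print(f"log Bayes factor (dual vs. cortical): {log_bf:.1f}")

        # Posterior model probabilities under equal priors (softmax of log evidence)
        log_ev = np.array(list(log_evidence.values()))
        post = np.exp(log_ev - log_ev.max())
        post /= post.sum()
        for name, prob in zip(log_evidence, post):
            print(f"P({name} | data) = {prob:.3f}")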

    Unimpaired discrimination of fearful prosody after amygdala lesion

    Get PDF
    Prosody (i.e. speech melody) is an important cue for inferring an interlocutor's emotional state, complementing information from facial expression and body posture. Inferring fear from facial expression is reported to be impaired after amygdala lesions. It remains unclear whether this deficit is specific to facial expression or reflects a more global fear-recognition deficit. Here, we report data from two twins with bilateral selective amygdala lesions and show that they are unimpaired in a multinomial emotional prosody classification task. In a two-alternative forced choice task, they demonstrate increased ability to discriminate fearful and neutral prosody, the opposite of what would be expected under a hypothesis of a global role for the amygdala in fear recognition. Hence, we provide evidence that the amygdala is not required for the recognition of fearful prosody.
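
    Discrimination ability in a two-alternative forced-choice task like the one above is often summarized as the sensitivity index d'. The sketch below computes d' from the proportion of correct responses using the standard 2AFC relation; the trial counts are hypothetical, not taken from the study.

        import numpy as np
        from scipy.stats import norm

        def dprime_2afc(n_correct, n_trials):
            # d' from 2AFC proportion correct: d' = sqrt(2) * z(Pc), with a
            # small correction so 0% or 100% correct does not give infinity
            pc = (n_correct + 0.5) / (n_trials + 1.0)
            return np.sqrt(2.0) * norm.ppf(pc)

        # Hypothetical example: 92 of 100 fearful-vs-neutral trials correct
        print(f"d' = {dprime_2afc(92, 100):.2f}")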