    Can’t Shake that Feeling: Event-Related fMRI Assessment of Sustained Amygdala Activity in Response to Emotional Information in Depressed Individuals

    Background: Depressed individuals engage in prolonged elaborative processing of emotional information. A computational neural network model of emotional information processing suggests this process involves sustained amygdala activity in response to the negative features of information. This study examined whether brain activity in response to emotional stimuli was sustained in depressed individuals, even following subsequent distracting stimuli. Methods: Seven depressed and 10 never-depressed individuals were studied with event-related functional magnetic resonance imaging during alternating 15-sec emotional processing (valence identification) and nonemotional processing (Sternberg memory) trials. Amygdala regions were traced on high-resolution structural scans and coregistered to the functional data. The time course of activity in these regions during emotional and nonemotional processing trials was examined. Results: During emotional processing trials, never-depressed individuals displayed amygdalar responses to all stimuli that decayed within 10 sec. In contrast, depressed individuals displayed sustained amygdala responses to negative words that lasted throughout the following nonemotional processing trials (25 sec later). The difference in sustained amygdala activity to negative versus positive words was moderately related to self-reported rumination. Conclusions: The results suggest that depression is associated with sustained activity in brain areas responsible for coding emotional features. Biol Psychiatry 2002;51

    The neural basis of authenticity recognition in laughter and crying

    Deciding whether others’ emotions are genuine is essential for successful communication and social relationships. While previous fMRI studies suggested that differentiation between authentic and acted emotional expressions involves higher-order brain areas, the time course of authenticity discrimination is still unknown. To address this gap, we tested the impact of authenticity discrimination on event-related potentials (ERPs) related to emotion, motivational salience, and higher-order cognitive processing (N100, P200, and the late positive complex, LPC), using vocalised non-verbal expressions of sadness (crying) and happiness (laughter) in a 32-participant, within-subject study. Using a repeated-measures two-factor (authenticity, emotion) ANOVA, we show that the N100 amplitude was larger in response to authentic than acted vocalisations, particularly for cries, while the P200 amplitude was larger in response to acted vocalisations, particularly for laughs. We suggest these results point to two different mechanisms: (1) a larger N100 in response to authentic vocalisations is consistent with its link to emotional content and arousal (putatively larger amplitude for genuine emotional expressions); (2) a larger P200 in response to acted ones is in line with evidence relating it to motivational salience (putatively larger for ambiguous emotional expressions). Complementarily, a significant main effect of emotion was found on P200 and LPC amplitudes, in that both were larger for laughs than cries, regardless of authenticity. Overall, we provide the first electroencephalographic examination of authenticity discrimination and propose that authenticity processing of others’ vocalisations is initiated early, alongside that of their emotional content or category, attesting to its evolutionary relevance for trust and bond formation.

    The role of spatial frequency information for ERP components sensitive to faces and emotional facial expression

    To investigate the impact of spatial frequency on emotional facial expression analysis, ERPs were recorded in response to low spatial frequency (LSF), high spatial frequency (HSF), and unfiltered broad spatial frequency (BSF) faces with fearful or neutral expressions, as well as houses and chairs. In line with previous findings, BSF fearful facial expressions elicited a greater frontal positivity than BSF neutral facial expressions, starting at about 150 ms after stimulus onset. In contrast, this emotional expression effect was absent for HSF and LSF faces. Given that some brain regions involved in emotion processing, such as the amygdala and connected structures, are selectively tuned to LSF visual inputs, these data suggest that ERP effects of emotional facial expression do not directly reflect activity in these regions. It is argued that higher-order neocortical brain systems are involved in the generation of emotion-specific waveform modulations. The face-sensitive N170 component was affected neither by emotional facial expression nor by spatial frequency information.

    Great nature’s second course: Introduction to the special issue on the behavioral neuroscience of sleep

    Sleep is necessary for normal psychological functioning, and psychological function in turn affects sleep integrity. Recent investigations delineate the relation of sleep to a broad array of processes ranging from learning and memory to emotional reactivity and mood, and use a variety of methodological approaches (imaging, electrophysiological, behavioral) to reveal the complex relations between sleep and the functioning of the awake brain. The articles in this issue advance our fundamental knowledge of the relation of sleep to psychological function. In addition, several of the articles discuss how sleep is affected by or affects human clinical conditions, including insomnia, epilepsy, mild cognitive impairment, bipolar disorder, and cancer. Together, the articles of this special issue highlight recent progress in understanding the behavioral neuroscience of sleep and identify promising areas for future research, including the possibility of sleep-based interventions to improve psychological health.

    Affective iconic words benefit from additional sound–meaning integration in the left amygdala

    Recent studies have shown that a similarity between the sound and meaning of a word (i.e., iconicity) can help language users access the meaning of that word more readily, but the neural mechanisms underlying this beneficial role of iconicity in semantic processing remain largely unknown. In an fMRI study, we focused on the affective domain and examined whether affective iconic words (e.g., high arousal in both sound and meaning) activate additional brain regions that integrate emotional information from different domains (i.e., sound and meaning). In line with our hypothesis, affective iconic words, compared to their non-iconic counterparts, elicited additional BOLD responses in the left amygdala, known for its role in the multimodal representation of emotions. Functional connectivity analyses revealed that the observed amygdalar activity was modulated by an interaction of iconic condition and activations in two hubs representative of processing the sound (left superior temporal gyrus) and meaning (left inferior frontal gyrus) of words. These results provide a neural explanation for the facilitative role of iconicity in language processing and indicate that language users are sensitive to the interaction between the sound and meaning aspects of words, suggesting that iconicity is a general property of human language.

    Left and right amygdala: mediofrontal cortical functional connectivity is differentially modulated by harm avoidance

    Background: The left and right amygdalae are key regions distinctly involved in emotion-regulation processes. Individual differences, such as personality features, may affect the implicated neurocircuits. The lateralized amygdala affective processing linked with the temperament dimension Harm Avoidance (HA) remains poorly understood. Resting-state functional connectivity (rsFC) imaging may provide more insight into these neuronal processes. Methods: In 56 drug-naive healthy female subjects, we examined the influence of the personality dimension HA on lateralized amygdala rsFC. Results: Across all subjects, the left and right amygdalae were connected with distinct regions mainly within the ipsilateral hemisphere. Females scoring higher on HA displayed stronger left amygdala rsFC with ventromedial prefrontal cortical (vmPFC) regions involved in affective disturbances. In high HA scorers, we also observed stronger right amygdala rsFC with the dorsomedial prefrontal cortex (dmPFC), which is implicated in negative affect regulation. Conclusions: In healthy females, the left and right amygdalae seem implicated in distinct mPFC brain networks related to HA, and these networks may represent a vulnerability marker for sensitivity to stress and anxiety (disorders).