26 research outputs found

    On the connection between level of education and the neural circuitry of emotion perception

    Through education, a social group transmits accumulated knowledge, skills, customs, and values to its members. So far, to the best of our knowledge, the association between educational attainment and neural correlates of emotion processing has been left unexplored. In a retrospective analysis of The Netherlands Study of Depression and Anxiety (NESDA) functional magnetic resonance imaging (fMRI) study, we compared two groups of fourteen healthy volunteers with intermediate and high educational attainment, matched for age and gender. The data concerned event-related fMRI of brain activation during perception of facial emotional expressions. The region of interest (ROI) analysis showed stronger right amygdala activation to facial expressions in participants with lower relative to higher educational attainment (HE). The psychophysiological interaction analysis revealed that participants with HE exhibited stronger right amygdala-right insula connectivity during perception of emotional and neutral facial expressions. This exploratory study suggests the relevance of educational attainment to the neural mechanisms of facial expression processing.

    Richness in Functional Connectivity Depends on the Neuronal Integrity within the Posterior Cingulate Cortex

    The brain's connectivity skeleton-a rich club of strongly interconnected members-was initially shown to exist in human structural networks, but recent evidence suggests a functional counterpart. This rich club typically includes key regions (or hubs) from multiple canonical networks, reducing the cost of inter-network communication. The posterior cingulate cortex (PCC), a hub node embedded within the default mode network, is known to facilitate communication between brain networks and is a key member of the "rich club." Here, we assessed how metabolic signatures of neuronal integrity and cortical thickness influence the global extent of a functional rich club as measured using the functional rich club coefficient (fRCC). Rich club estimation was performed on functional connectivity of resting state brain signals acquired at 3T in 48 healthy adult subjects. Magnetic resonance spectroscopy was measured in the same session using a point resolved spectroscopy sequence. We confirmed convergence of the functional rich club with a previously established structural rich club. N-acetyl aspartate (NAA) in the PCC is significantly correlated with age (p = 0.001), while the rich club coefficient showed no effect of age (p = 0.106). In addition, we found a significant quadratic relationship between fRCC and NAA concentration in the PCC (p = 0.009). Furthermore, cortical thinning in the PCC was correlated with a reduced rich club coefficient after accounting for age and NAA. In conclusion, we found that the fRCC is related to a marker of neuronal integrity in a key region of the cingulate cortex. Furthermore, cortical thinning in the same area was observed, suggesting that both cortical thinning and neuronal integrity in hub regions influence functional integration at the whole-brain level.
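    The rich-club coefficient used above is a standard graph metric: for a degree threshold k, it is the edge density of the subgraph induced by nodes with degree greater than k. A minimal sketch of the unnormalized computation using networkx (the toy graph is illustrative only, not the study's connectivity data; the study additionally normalizes against randomized networks):

    ```python
    import networkx as nx

    # Toy network: a densely interconnected "club" (complete graph on 4 nodes)
    # plus one peripheral node attached to a single hub.
    G = nx.complete_graph(4)   # nodes 0-3, each with degree 3
    G.add_edge(0, 4)           # node 4 is peripheral (degree 1); node 0 gains degree 4

    # Unnormalized rich-club coefficient phi(k): the density of the
    # subgraph induced by nodes whose degree exceeds k.
    rc = nx.rich_club_coefficient(G, normalized=False)

    for k, phi in sorted(rc.items()):
        print(f"phi({k}) = {phi:.2f}")
    # The four high-degree nodes form a fully connected core, so phi reaches
    # 1.0 once the peripheral node is excluded (k >= 1).
    ```

    In practice (as in the fRCC analysis described above) the raw coefficient is normalized by its value in degree-preserving randomized networks (`normalized=True` in networkx), since dense cores can arise by chance in any network with heterogeneous degrees.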

    Enhanced amygdala reactivity to emotional faces in adults reporting childhood emotional maltreatment

    In the context of chronic childhood emotional maltreatment (CEM; emotional abuse and/or neglect), adequately responding to facial expressions is an important skill. Over time, however, this adaptive response may lead to a persistent vigilance for emotional facial expressions. The amygdala and the medial prefrontal cortex (mPFC) are key regions in face processing. However, the neurobiological correlates of face processing in adults reporting CEM are yet unknown. We examined amygdala and mPFC reactivity to emotional faces (Angry, Fearful, Sad, Happy, Neutral) vs scrambled faces in healthy controls and unmedicated patients with depression and/or anxiety disorders reporting CEM before the age of 16 years (n = 60), and controls and patients who report no childhood abuse (n = 75). We found that CEM was associated with enhanced bilateral amygdala reactivity to emotional faces in general, independent of psychiatric status. Furthermore, we found no support for differential mPFC functioning, suggesting that amygdala hyper-responsivity to emotional facial perception in adults reporting CEM may be independent from top-down influences of the mPFC. These findings may be key in understanding the increased emotional sensitivity and interpersonal difficulties that have been reported in individuals with a history of CEM.

    Mood Modulates Auditory Laterality of Hemodynamic Mismatch Responses during Dichotic Listening

    Hemodynamic mismatch responses can be elicited by deviant stimuli in a sequence of standard stimuli even during cognitively demanding tasks. Emotional context is known to modulate lateralized processing. Right-hemispheric negative emotion processing may bias attention to the right and enhance processing of right-ear stimuli. The present study examined the influence of induced mood on lateralized pre-attentive auditory processing of dichotic stimuli using functional magnetic resonance imaging (fMRI). Faces expressing emotions (sad/happy/neutral) were presented in a blocked design while a dichotic oddball sequence with consonant-vowel (CV) syllables in an event-related design was simultaneously administered. Twenty healthy participants were instructed to feel the emotion perceived on the images and to ignore the syllables. Deviant sounds reliably activated bilateral auditory cortices, and attention effects were confirmed by modulation of visual activity. Sad mood induction activated visual, limbic and right prefrontal areas. A lateralization effect of the emotion-attention interaction was reflected in a stronger response to right-ear deviants in the right auditory cortex during sad mood. This imbalance of resources may be a neurophysiological correlate of laterality in sad mood and depression. Conceivably, the compensatory right-hemispheric enhancement of resources elicits increased ipsilateral processing.

    Does valence in the visual domain influence the spatial attention after auditory deviants? Exploratory data

    Auditory mismatch responses are elicited in the absence of directed attention but are thought to reflect attention-modulating effects. Little is known, however, about whether deviants in a stream of standards specifically direct attention across modalities and how they interact with other attention-directing signals such as emotions. We applied the well-established paradigm of left- or right-lateralized deviant syllables within a dichotic listening design. In a simple target detection paradigm with lateralized visual stimuli, we hypothesized that responses to visual stimuli would be speeded after ignored auditory deviants on the same side. Moreover, stimuli with negative valence in the visual domain could be expected to reduce this effect due to attention capture by this emotion, resulting in speeded responses to visual stimuli even when attention was directed to the opposite side by the preceding auditory deviant. Reaction times of 17 subjects confirmed the speeding of responses after deviant events. However, reduced facilitation was observed for positive targets at the left after incongruent deviants, i.e., at the right ear. In particular, significant interactions of valence and visual field and of valence and spatial congruency emerged. Pre-attentive auditory processing may modulate attention in a spatially selective way. However, negative valence processing in the right hemisphere may override this effect. Resource allocation such as spatial attention is regulated dynamically by multimodal and emotional information processing.

    Age- and Gender-Related Variations of Emotion Recognition in Pseudowords and Faces

    Background/Study Context: The ability to interpret emotionally salient stimuli is an important skill for successful social functioning at any age. The objective of the present study was to disentangle age and gender effects on emotion recognition ability in voices and faces. Methods: Three age groups of participants (young, age range: 18–35 years; middle-aged, age range: 36–55 years; and older, age range: 56–75 years) identified basic emotions presented in voices and faces in a forced-choice paradigm. Five emotions (angry, fearful, sad, disgusted, and happy) and a nonemotional category (neutral) were shown as encoded in color photographs of facial expressions and pseudowords spoken in affective prosody. Results: Overall, older participants had a lower accuracy rate in categorizing emotions than young and middle-aged participants. Females performed better than males in recognizing emotions from voices, and this gender difference emerged in middle-aged and older participants. The performance of emotion recognition in faces was significantly correlated with the performance in voices. Conclusion: The current study provides further evidence for a general age and gender effect on emotion recognition; the advantage of females seems to be age- and stimulus modality-dependent.

    Mapping revealed hemodynamic responses to deviant events at the left and the right auditory cortex (panel b; FWE-corrected p<.05, extent threshold 15 voxels).

    In the ROI analyses, (a) the responses at the left hemisphere showed a significant effect of presentation side and (c) the right auditory cortex exhibited a significant interaction of mood and presentation side. In particular, right-ear deviants elicited significantly higher activation in the right auditory cortex during sad mood as compared to neutral mood and as compared to left-ear stimuli (**: p<.01; *: p<.05; °: p<.1; mean±SE); a.u.: arbitrary units.

    Hemodynamic responses to a) sad, b) happy and c) neutral mood induction.

    Widespread activity in visual areas is due to the procedure using facial presentations. Notable are bilateral amygdala responses and right-lateralized frontal activation during sadness, as well as hippocampus responses during happiness, confirming the effectiveness of mood induction independently of the ongoing acoustic stimulation; height threshold T>4.59, extent threshold 50 voxels.

    Demographic characteristics of the sample (mean±SD).

    MWT-B: The Multiple-Choice Vocabulary Intelligence Test [Der Mehrfachwahl-Wortschatz-Intelligenztest, 26]; a verbal-intelligence screening that asks participants to identify existing German words among non-words in a multiple-choice task.