
    How Bodies and Voices Interact in Early Emotion Perception

    Successful social communication draws strongly on the correct interpretation of others' body and vocal expressions. Both can provide emotional information and often occur simultaneously, yet their interplay has hardly been studied. Using electroencephalography, we investigated the temporal dynamics underlying their neural interaction in auditory and visual perception. In particular, we tested whether this interaction qualifies as true integration following multisensory integration principles such as inverse effectiveness. Emotional vocalizations were embedded in either low or high levels of noise and presented with or without video clips of matching emotional body expressions. In both high and low noise conditions, a reduction in auditory N100 amplitude was observed for audiovisual stimuli. However, only under high noise did the N100 peak earlier in the audiovisual than in the auditory condition, suggesting facilitatory effects as predicted by the inverse effectiveness principle. Similarly, we observed earlier N100 peaks in response to emotional compared to neutral audiovisual stimuli; this was not the case in the unimodal auditory condition. Furthermore, suppression of beta-band oscillations (15–25 Hz), primarily reflecting biological motion perception, was modulated 200–400 ms after the vocalization. For emotional stimuli, the difference in suppression between audiovisual and audio-only stimuli was larger under high than under low noise, whereas no such difference was observed for neutral stimuli. This observation is in accordance with the inverse effectiveness principle and suggests a modulation of integration by emotional content. Overall, the results show that ecologically valid, complex stimuli such as combined body and vocal expressions are effectively integrated very early in processing.
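    A rough sense of the beta-band suppression measure reported above can be conveyed with a short sketch. The following is a minimal illustration, not the authors' pipeline: it band-pass filters single-channel epochs to 15–25 Hz, estimates power from the Hilbert envelope, and expresses the 200–400 ms window as percent change from a pre-stimulus baseline. Array shapes, window boundaries, and variable names are assumptions made for illustration.

```python
# Minimal sketch (assumed data layout, not the authors' pipeline):
# beta-band (15-25 Hz) power suppression relative to a pre-stimulus baseline.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def beta_suppression(epochs, times, sfreq, baseline=(-0.5, 0.0), window=(0.2, 0.4)):
    """epochs: (n_trials, n_samples) single-channel data; times in s, 0 = vocalization onset."""
    # Zero-phase band-pass filter in the beta range
    b, a = butter(4, [15.0, 25.0], btype="band", fs=sfreq)
    beta = filtfilt(b, a, epochs, axis=-1)

    # Instantaneous beta power from the analytic signal
    power = np.abs(hilbert(beta, axis=-1)) ** 2

    base = power[:, (times >= baseline[0]) & (times < baseline[1])].mean()
    post = power[:, (times >= window[0]) & (times < window[1])].mean(axis=-1)

    # Percent change from baseline; negative values indicate suppression
    return (post - base) / base * 100.0
```

    Comparing the mean of this measure between audiovisual and audio-only trials, separately for high- versus low-noise and emotional versus neutral conditions, would correspond to the interaction described in the abstract.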

    The Neuronal Correlates of Digits Backward Are Revealed by Voxel-Based Morphometry and Resting-State Functional Connectivity Analyses

    Digits backward (DB) is a widely used neuropsychological measure that is believed to be a simple and effective index of verbal working memory capacity. However, its neural correlates remain elusive. The aim of this study was to investigate the neural correlates of DB in 299 healthy young adults by combining voxel-based morphometry (VBM) and resting-state functional connectivity (rsFC) analyses. The VBM analysis showed positive correlations between the DB scores and the gray matter volumes in the right anterior superior temporal gyrus (STG), the right posterior STG, the left inferior frontal gyrus and the left Rolandic operculum, four critical areas in the auditory phonological loop of verbal working memory. Voxel-based correlation analysis was then performed between the positive rsFCs of these four clusters and the DB scores. We found that the DB scores were positively correlated with the rsFCs within the salience network (SN), that is, between the right anterior STG, the dorsal anterior cingulate cortex and the right fronto-insular cortex. We also found that the DB scores were negatively correlated with the rsFC within an anti-correlation network of the SN, between the right posterior STG and the left posterior insula. Our findings suggest that DB performance is related to the structural and functional organization of the brain areas that are involved in the auditory phonological loop and the SN.
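    As a hedged illustration of the voxel-based correlation step described above (not the authors' actual VBM/rsFC pipeline, which would also involve spatial smoothing, nuisance covariates, and multiple-comparison correction), a mass-univariate Pearson correlation between gray matter maps and DB scores might look like the following; the array shapes and names are assumptions.

```python
# Minimal sketch (illustrative shapes/names): voxel-wise Pearson correlation
# between gray matter volume and a behavioral score across subjects.
import numpy as np
from scipy import stats

def voxelwise_correlation(gm_maps, scores):
    """gm_maps: (n_subjects, n_voxels); scores: (n_subjects,), e.g. DB scores."""
    n = len(scores)
    gm_z = stats.zscore(gm_maps, axis=0)       # standardize each voxel over subjects
    sc_z = stats.zscore(scores)
    r = gm_z.T @ sc_z / n                      # Pearson r per voxel
    t = r * np.sqrt((n - 2) / (1.0 - r ** 2))  # convert to t statistics
    p = 2.0 * stats.t.sf(np.abs(t), df=n - 2)  # two-tailed p-values (uncorrected)
    return r, p
```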

    The role of the amygdala in face perception and evaluation

    Faces are one of the most significant social stimuli, and the processes underlying face perception lie at the intersection of cognition, affect, and motivation. Vision scientists have had tremendous success in mapping the regions for perceptual analysis of faces in posterior cortex. Based on evidence from (a) single-unit recording studies in monkeys and humans, (b) human functional localizer studies, and (c) meta-analyses of neuroimaging studies, I argue that faces automatically evoke responses not only in these regions but also in the amygdala. I also argue that (a) a key property of faces represented in the amygdala is their typicality, and (b) one of the functions of the amygdala is to bias attention to atypical faces, which are associated with higher uncertainty. This framework is consistent with a number of other amygdala findings not involving faces, suggesting a general account of the role of the amygdala in perception.

    Early Category-Specific Cortical Activation Revealed by Visual Stimulus Inversion

    Visual categorization may already start within the first 100 ms after stimulus onset, in contrast with the long-held view that during this early stage all complex stimuli are processed equally and that category-specific cortical activation occurs only at later stages. The neural basis of this proposed early stage of high-level analysis is, however, poorly understood. To address this question, we used magnetoencephalography and anatomically constrained distributed source modeling to monitor brain activity with millisecond resolution while subjects performed an orientation task on upright and upside-down images of three different stimulus categories: faces, houses and bodies. Significant inversion effects were found for all three stimulus categories between 70 and 100 ms after picture onset, with a highly category-specific cortical distribution. Differential responses between upright and inverted faces were found in well-established face-selective areas of the inferior occipital cortex and the right fusiform gyrus. In addition, early category-specific inversion effects were found well beyond visual areas. Our results provide the first direct evidence that category-specific processing in high-level category-sensitive cortical areas already takes place within the first 100 ms of visual processing, significantly earlier than previously thought, and suggest the existence of fast category-specific neocortical routes in the human brain.
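    To make the inversion-effect measure concrete, a minimal sketch is given below. It simply contrasts mean source amplitude for upright versus inverted stimuli in the 70–100 ms window; the data shapes and variable names are illustrative assumptions, and this is not the authors' MEG source-modeling pipeline.

```python
# Minimal sketch (assumed shapes/names): early inversion effect as the
# upright-vs-inverted difference in mean source amplitude at 70-100 ms.
import numpy as np

def inversion_effect(upright, inverted, times, window=(0.07, 0.10)):
    """upright, inverted: (n_trials, n_sources, n_samples); times in seconds."""
    mask = (times >= window[0]) & (times <= window[1])
    up = upright[:, :, mask].mean(axis=(0, 2))    # mean over trials and time
    inv = inverted[:, :, mask].mean(axis=(0, 2))
    return inv - up                               # per-source inversion effect
```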