
    Early visual ERPs show stable body-sensitive patterns over a 4-week test period

    Event-related potential (ERP) studies feature among the most cited papers in the field of body representation, and recent research highlights the potential of ERPs as neuropsychiatric biomarkers. Despite this, the question of how reliable early visual ERPs and body-sensitive effects are over time has been overlooked. This study therefore aimed to assess the stability of early body-sensitive effects and of visual P1, N1 and VPP responses. Participants identified pictures of their own bodies, other bodies and houses during an EEG test session completed at the same time of day, once a week, for four consecutive weeks. Results showed that the amplitude and latency of early visual components and their associated body-sensitive effects were stable over the 4-week period. Furthermore, correlational analyses revealed that VPP amplitude might be more reliable than VPP latency, and that specific electrode sites might be more robust indicators of body-sensitive cortical activity than others. These findings suggest that visual P1, N1 and VPP responses, alongside body-sensitive N1/VPP effects, are robust indices of neuronal activity. We conclude that these components are candidates for electrophysiological biomarkers relevant to body representation.
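    The kind of week-to-week stability analysis described above can be illustrated with a small simulated test-retest sketch (the data, sample size, and noise levels here are hypothetical, not taken from the study): a stable per-participant component amplitude plus small session-specific noise yields high correlations between weekly sessions.

    ```python
    import random
    import statistics

    # Hypothetical illustration, not the study's data or analysis pipeline:
    # each participant has a stable "trait" ERP amplitude (in microvolts),
    # and each of 4 weekly sessions measures it with small random noise.
    random.seed(0)

    n = 20
    trait = [random.gauss(5.0, 1.5) for _ in range(n)]        # stable amplitude per participant
    sessions = [[a + random.gauss(0.0, 0.5) for a in trait]   # 4 weekly measurements with noise
                for _ in range(4)]

    # Pairwise Pearson correlations between sessions serve as a simple
    # test-retest reliability index; high values indicate stable amplitudes.
    pairs = [(i, j) for i in range(4) for j in range(i + 1, 4)]
    rs = [statistics.correlation(sessions[i], sessions[j]) for i, j in pairs]
    ```

    With trait variability (SD 1.5) much larger than session noise (SD 0.5), the between-session correlations come out high, mirroring the pattern a stable component would show across weekly recordings.
    
    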

    Face Coding Is Bilateral in the Female Brain

    Background: It is currently believed that face processing predominantly activates the right hemisphere in humans, but the available literature is highly inconsistent. Methodology/Principal Findings: In this study, ERPs were recorded in 50 right-handed women and men in response to 390 faces (of varying age and sex) and 130 technological objects. Results showed no sex difference in the amplitude of N170 to objects; a much larger face-specific response over the right hemisphere in men but a bilateral response in women; a lack of face-age coding over the left hemisphere in men, with no differences in N170 to faces as a function of age; and a significant bilateral face-age coding effect in women. Conclusions/Significance: LORETA source reconstruction showed a significant left and right asymmetry in the activation of the fusiform gyrus (BA19) in women and men, respectively. The present data reveal a lesser degree of lateralization of brain functions related to face coding in women than in men. In this light, they may explain the inconsistencies in the available literature concerning the asymmetric activity of the left and right occipito-temporal cortices devoted to face processing.

    Gender differences in hemispheric asymmetry for face processing

    BACKGROUND: Current cognitive neuroscience models predict a right-hemispheric dominance for face processing in humans. However, neuroimaging and electromagnetic data in the literature provide conflicting evidence of a right-sided brain asymmetry for decoding the structural properties of faces. The purpose of this study was to investigate whether this inconsistency might be due to gender differences in hemispheric asymmetry. RESULTS: Event-related brain potentials (ERPs) were recorded in 40 healthy, strictly right-handed individuals (20 women and 20 men) while they observed infants' faces expressing a variety of emotions. Early face-sensitive P1 and N1 responses to neutral vs. affective expressions were measured over the occipito-temporal cortices, and the responses were analyzed according to viewer gender. Alongside a strong right-hemispheric dominance in men, the results showed a lack of asymmetry in the amplitude of the occipito-temporal N1 response in women, for both neutral and affective faces. CONCLUSION: Men showed asymmetric functioning of the visual cortex while decoding faces and expressions, whereas women showed more bilateral functioning. These results indicate the importance of gender effects in the lateralization of the occipito-temporal response during the processing of face identity, structure, familiarity, or affective content.

    How Bodies and Voices Interact in Early Emotion Perception

    Successful social communication draws strongly on the correct interpretation of others' body and vocal expressions. Both can provide emotional information and often occur simultaneously, yet their interplay has hardly been studied. Using electroencephalography, we investigated the temporal development underlying their neural interaction in auditory and visual perception. In particular, we tested whether this interaction qualifies as true integration following multisensory integration principles such as inverse effectiveness. Emotional vocalizations were embedded in either low or high levels of noise and presented with or without video clips of matching emotional body expressions. In both high and low noise conditions, a reduction in auditory N100 amplitude was observed for audiovisual stimuli. However, only under high noise did the N100 peak earlier in the audiovisual than in the auditory condition, suggesting facilitatory effects as predicted by the inverse effectiveness principle. Similarly, we observed earlier N100 peaks in response to emotional compared to neutral audiovisual stimuli; this was not the case in the unimodal auditory condition. Furthermore, suppression of beta-band oscillations (15–25 Hz), primarily reflecting biological motion perception, was modulated 200–400 ms after the vocalization. Larger differences in suppression between audiovisual and audio-only stimuli under high compared to low noise were found for emotional stimuli, but not for neutral stimuli. This observation is in accordance with the inverse effectiveness principle and suggests a modulation of integration by emotional content. Overall, the results show that ecologically valid, complex stimuli such as combined body and vocal expressions are effectively integrated very early in processing.

    Neurophysiological evidence for rapid processing of verbal and gestural information in understanding communicative actions

    During everyday social interaction, gestures are a fundamental part of human communication. The communicative-pragmatic role of hand gestures and their interaction with spoken language has been documented at the earliest stages of language development, in which two types of indexical gestures are most prominent: the pointing gesture for directing attention to objects and the give-me gesture for making requests. Here we study, in adult human participants, the neurophysiological signatures of gestural-linguistic acts communicating the pragmatic intentions of naming and requesting, by simultaneously presenting written words and gestures. Already at ~150 ms, brain responses diverged between naming and request actions expressed by word-gesture combinations, whereas the same gestures presented in isolation elicited their earliest neurophysiological dissociations significantly later (at ~210 ms). There was an early enhancement of request-evoked brain activity as compared with naming, attributable to sources in the frontocentral cortex and consistent with access to action knowledge in request understanding. In addition, an enhanced N400-like response indicated late semantic integration of the gesture-language interaction. The present study demonstrates that word-gesture combinations used to express communicative pragmatic intentions speed up the brain correlates of comprehension processes compared with gesture-only understanding, thereby calling into question current serial linguistic models that place pragmatic function decoding at the end of the language comprehension cascade. Instead, information about the social-interactive role of communicative acts is processed instantaneously.

    Vigabatrin in Low Doses Selectively Suppresses the Clonic Component of Audiogenically Kindled Seizures in Rats

    The effect of systemic administration of the gamma-aminobutyric acid (GABA)-transaminase inhibitor vigabatrin (VGB) on different components of convulsions was tested in the model of audiogenically kindled seizures, which consist of brainstem (running, tonus) and forebrain (clonus) elements. METHODS: Audiogenically susceptible rats of the Krushinsky-Molodkina (KM), Wistar, and WAG/Rij strains received repeated sound stimulation (60 dB, 10-80 kHz) until kindled audiogenic seizures were reliably elicited. Kindled audiogenic seizures consisted of running, tonic, and generalized clonic phases in KM rats (severe audiogenic seizures) and of running and Racine stage 5 facial/forelimb clonus in Wistar and WAG/Rij rats (moderate seizures). Vehicle or VGB (100 or 200 mg/kg) was injected intraperitoneally 2, 4, or 24 h before the induction of kindled audiogenic seizures. RESULTS: At both doses, VGB did not change the seizure latency or the duration of running and tonic convulsions, but it suppressed clonic convulsions in all strains. In KM rats, the mean duration of posttonic clonus was significantly reduced at 24 h after 100 mg/kg and from 4 h after 200 mg/kg. In Wistar and WAG/Rij rats, the mean duration of facial/forelimb clonus was reduced from 4 and 2 h after the 100- and 200-mg/kg administrations, respectively; 24 h after the high-dose injection, clonus was completely blocked in all rats of both strains. No difference in the efficacy of VGB between Wistar and WAG/Rij rats was observed. CONCLUSIONS: VGB suppresses clonic convulsions more effectively than running and tonic ones in audiogenically kindled rats. It is supposed that this selective anticonvulsive effect of VGB results from different sensitivities of the forebrain and brainstem epileptic networks to the presumed GABA enhancement.

    The EU-Emotion Stimulus Set: A validation study

    The EU-Emotion Stimulus Set is a newly developed collection of dynamic multimodal emotion and mental state representations. A total of 20 emotions and mental states are represented through facial expressions, vocal expressions, body gestures and contextual social scenes, portrayed by a multi-ethnic group of child and adult actors. Here we present the validation results, together with participant ratings of the emotional valence, arousal and intensity of the visual stimuli from this stimulus set. The EU-Emotion Stimulus Set is available for use by the scientific community, and the validation data are provided as a supplement available for download.