92 research outputs found

    Gender differences in hemispheric asymmetry for face processing

    BACKGROUND: Current cognitive neuroscience models predict a right-hemispheric dominance for face processing in humans. However, neuroimaging and electromagnetic data in the literature provide conflicting evidence of a right-sided brain asymmetry for decoding the structural properties of faces. The purpose of this study was to investigate whether this inconsistency might be due to gender differences in hemispheric asymmetry. RESULTS: In this study, event-related brain potentials (ERPs) were recorded in 40 healthy, strictly right-handed individuals (20 women and 20 men) while they observed infants' faces expressing a variety of emotions. Early face-sensitive P1 and N1 responses to neutral vs. affective expressions were measured over the occipital/temporal cortices, and the responses were analyzed according to viewer gender. The results showed a strong right-hemispheric dominance in men, but in women a lack of asymmetry in the amplitude of the occipito-temporal N1 response to both neutral and affective faces. CONCLUSION: Men showed an asymmetric functioning of the visual cortex while decoding faces and expressions, whereas women showed a more bilateral functioning. These results indicate the importance of gender effects in the lateralization of the occipito-temporal response during the processing of face identity, structure, familiarity, or affective content.
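    Hemispheric asymmetry of an ERP component such as the N1 is often summarized with a laterality index computed over homologous left/right electrode pairs. The sketch below is illustrative only: the electrode pair (P7/P8) and the amplitude values are assumptions for demonstration, not data from the study.

    ```python
    def laterality_index(left_amp: float, right_amp: float) -> float:
        """(R - L) / (R + L): positive values indicate rightward asymmetry,
        values near zero indicate bilateral (symmetric) responses."""
        return (right_amp - left_amp) / (right_amp + left_amp)

    # Hypothetical absolute N1 amplitudes (microvolts) at a homologous
    # occipito-temporal pair such as P7 (left) / P8 (right).
    n1 = {"P7": 4.0, "P8": 6.0}
    print(round(laterality_index(n1["P7"], n1["P8"]), 2))  # 0.2 -> rightward
    ```

    A symmetric response (equal amplitudes) yields an index of 0, matching the bilateral pattern described for women; a positive index matches the right-hemispheric dominance described for men.
    
    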

    Queen mandibular pheromone: questions that remain to be resolved

    The discovery of ‘queen substance’, and the subsequent identification and synthesis of key components of queen mandibular pheromone, has been of significant importance to beekeepers and to the beekeeping industry. Fifty years on, there is greater appreciation of the importance and complexity of queen pheromones, but many mysteries remain about the mechanisms through which pheromones operate. The discovery of sex pheromone communication in moths occurred within the same time period, but in this case, intense pressure to find better means of pest management resulted in a remarkable focusing of research activity on understanding pheromone detection mechanisms and the central processing of pheromone signals in the moth. We can benefit from this work, and here studies on moths are used to highlight some of the gaps in our knowledge of pheromone communication in bees. A better understanding of pheromone communication in honey bees promises improved strategies for the successful management of these extraordinary animals.

    Seeing Emotion with Your Ears: Emotional Prosody Implicitly Guides Visual Attention to Faces

    Interpersonal communication involves the processing of multimodal emotional cues, particularly facial expressions (visual modality) and emotional speech prosody (auditory modality), which can interact during information processing. Here, we investigated whether the implicit processing of emotional prosody systematically influences gaze behavior to facial expressions of emotion. We analyzed the eye movements of 31 participants as they scanned a visual array of four emotional faces portraying fear, anger, happiness, and neutrality, while listening to an emotionally inflected pseudo-utterance ("Someone migged the pazing") uttered in a congruent or incongruent tone. Participants heard the emotional utterance during the first 1250 milliseconds of a five-second visual array and then performed an immediate recall decision about the face they had just seen. The frequency and duration of first saccades and of total looks in three temporal windows ([0–1250 ms], [1250–2500 ms], [2500–5000 ms]) were analyzed according to the emotional content of faces and voices. Results showed that participants looked longer and more frequently at faces that matched the prosody in all three time windows (emotion congruency effect), although this effect was often emotion-specific (with greatest effects for fear). Effects of prosody on visual attention to faces persisted over time and could be detected long after the auditory information was no longer present. These data imply that emotional prosody is processed automatically during communication and that these cues play a critical role in how humans respond to related visual cues in the environment, such as facial expressions.
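    The windowed analysis described above amounts to summing, for each face, the portion of every fixation that overlaps each temporal window. A minimal sketch of that binning step, with made-up fixation data and window labels that are assumptions (the study only specifies the window boundaries):

    ```python
    # Analysis windows in milliseconds, as described in the abstract.
    WINDOWS = {"early": (0, 1250), "mid": (1250, 2500), "late": (2500, 5000)}

    def looks_per_window(fixations):
        """fixations: list of (face_label, onset_ms, offset_ms) tuples.
        Returns {window: {face: total_ms}}, crediting each fixation only
        with its overlap inside each window."""
        totals = {w: {} for w in WINDOWS}
        for face, onset, offset in fixations:
            for w, (lo, hi) in WINDOWS.items():
                overlap = min(offset, hi) - max(onset, lo)
                if overlap > 0:
                    totals[w][face] = totals[w].get(face, 0) + overlap
        return totals

    # Illustrative fixation record for one trial (not real data).
    trial = [("fear", 100, 700), ("happy", 1200, 1400), ("fear", 2600, 3100)]
    print(looks_per_window(trial))
    ```

    Note that a fixation spanning a window boundary (e.g. 1200–1400 ms) is split between the adjacent windows rather than assigned wholly to one, which keeps the per-window durations additive.
    
    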

    First valence, then arousal: the temporal dynamics of brain electric activity evoked by emotional stimuli

    The temporal dynamics of the neural activity that implements the dimensions valence and arousal during processing of emotional stimuli were studied in two multi-channel ERP experiments that used visually presented emotional words (experiment 1) and emotional pictures (experiment 2) as stimulus material. Thirty-two healthy subjects participated (mean age 26.8 ± 6.4 years, 24 women). The stimuli in both experiments were selected on the basis of verbal reports in such a way that we were able to map the temporal dynamics of one dimension while controlling for the other one. Words (pictures) were centrally presented for 450 (600) ms with interstimulus intervals of 1,550 (1,400) ms. ERP microstate analysis of the entire epochs of stimulus presentation parsed the data into sequential steps of information processing. The results revealed that in several microstates of both experiments, processing of pleasant and unpleasant valence (experiment 1, microstate #3: 118–162 ms, #6: 218–238 ms, #7: 238–266 ms, #8: 266–294 ms; experiment 2, microstate #5: 142–178 ms, #6: 178–226 ms, #7: 226–246 ms, #9: 262–302 ms, #10: 302–330 ms) as well as of low and high arousal (experiment 1, microstate #8: 266–294 ms, #9: 294–346 ms; experiment 2, microstate #10: 302–330 ms, #15: 562–600 ms) involved different neural assemblies. The results also revealed that in both experiments, information about valence was extracted before information about arousal. The last microstate of valence extraction was identical to the first microstate of arousal extraction.

    Investigation of amino acid residues involved in the interaction between MipZ and ParB in C. crescentus


    Number and distribution of antennal sensilla in the honey bee (Apis mellifera L.)


    Facial electroneurography:
