
    Gender differences in hemispheric asymmetry for face processing

    BACKGROUND: Current cognitive neuroscience models predict a right-hemispheric dominance for face processing in humans. However, neuroimaging and electromagnetic data in the literature provide conflicting evidence of a right-sided brain asymmetry for decoding the structural properties of faces. The purpose of this study was to investigate whether this inconsistency might be due to gender differences in hemispheric asymmetry. RESULTS: In this study, event-related brain potentials (ERPs) were recorded in 40 healthy, strictly right-handed individuals (20 women and 20 men) while they observed infants' faces expressing a variety of emotions. Early face-sensitive P1 and N1 responses to neutral vs. affective expressions were measured over the occipital/temporal cortices, and the responses were analyzed according to viewer gender. The results showed a strong right-hemispheric dominance for face processing in men, but a lack of asymmetry in the amplitude of the occipito-temporal N1 response in women to both neutral and affective faces. CONCLUSION: Men showed an asymmetric functioning of visual cortex while decoding faces and expressions, whereas women showed a more bilateral functioning. These results indicate the importance of gender effects in the lateralization of the occipito-temporal response during the processing of face identity, structure, familiarity, or affective content.

    Seeing Emotion with Your Ears: Emotional Prosody Implicitly Guides Visual Attention to Faces

    Interpersonal communication involves the processing of multimodal emotional cues, particularly facial expressions (visual modality) and emotional speech prosody (auditory modality), which can interact during information processing. Here, we investigated whether the implicit processing of emotional prosody systematically influences gaze behavior to facial expressions of emotion. We analyzed the eye movements of 31 participants as they scanned a visual array of four emotional faces portraying fear, anger, happiness, and neutrality, while listening to an emotionally inflected pseudo-utterance (Someone migged the pazing) uttered in a congruent or incongruent tone. Participants heard the emotional utterance during the first 1250 milliseconds of a five-second visual array and then performed an immediate recall decision about the face they had just seen. The frequency and duration of first saccades and of total looks in three temporal windows ([0–1250 ms], [1250–2500 ms], [2500–5000 ms]) were analyzed according to the emotional content of faces and voices. Results showed that participants looked longer and more frequently at faces that matched the prosody in all three time windows (emotion congruency effect), although this effect was often emotion-specific (with greatest effects for fear). Effects of prosody on visual attention to faces persisted over time and could be detected long after the auditory information was no longer present. These data imply that emotional prosody is processed automatically during communication and that these cues play a critical role in how humans respond to related visual cues in the environment, such as facial expressions.

    Queen mandibular pheromone: questions that remain to be resolved

    The discovery of ‘queen substance’, and the subsequent identification and synthesis of key components of queen mandibular pheromone, has been of significant importance to beekeepers and to the beekeeping industry. Fifty years on, there is greater appreciation of the importance and complexity of queen pheromones, but many mysteries remain about the mechanisms through which pheromones operate. The discovery of sex pheromone communication in moths occurred within the same time period, but in this case, intense pressure to find better means of pest management resulted in a remarkable focusing of research activity on understanding pheromone detection mechanisms and the central processing of pheromone signals in the moth. We can benefit from this work and here, studies on moths are used to highlight some of the gaps in our knowledge of pheromone communication in bees. A better understanding of pheromone communication in honey bees promises improved strategies for the successful management of these extraordinary animals.

    Rapid perceptual integration of facial expression and emotional body language

    In our natural world, a face is usually encountered not as an isolated object but as an integrated part of a whole body. The face and the body both normally contribute to conveying the emotional state of the individual. Here we show that observers judging a facial expression are strongly influenced by emotional body language. Photographs of fearful and angry faces and bodies were used to create face-body compound images, with either matched or mismatched emotional expressions. When face and body convey conflicting emotional information, judgment of facial expression is hampered and becomes biased toward the emotion expressed by the body. Electrical brain activity was recorded from the scalp while subjects attended to the face and judged its emotional expression. An enhancement of the occipital P1 component as early as 115 ms after presentation onset points to the existence of a rapid neural mechanism sensitive to the degree of agreement between simultaneously presented facial and bodily emotional expressions, even when the latter are unattended.