90 research outputs found
Neural correlates of perceiving emotional faces and bodies in developmental prosopagnosia: An event-related fMRI study
Many people experience transient difficulties in recognizing faces, but only a small number of them cannot recognize their family members when meeting them unexpectedly. Such face blindness is associated with serious problems in everyday life. A better understanding of the neuro-functional basis of impaired face recognition may be achieved by a careful comparison with an equally unique object category and by adding a more realistic setting involving neutral faces as well as facial expressions. We used event-related functional magnetic resonance imaging (fMRI) to investigate the neuro-functional basis of perceiving faces and bodies in three developmental prosopagnosics (DP) and matched healthy controls. Our approach involved materials consisting of neutral faces and bodies as well as faces and bodies expressing fear or happiness. The first main result is that the presence of emotional information has a different effect in the patient vs. the control group in the fusiform face area (FFA). Neutral faces trigger lower activation in the DP group, compared to the control group, while activation for facial expressions is the same in both groups. The second main result is that, compared to controls, DPs have increased activation for bodies in the inferior occipital gyrus (IOG) and for neutral faces in the extrastriate body area (EBA), indicating that body- and face-sensitive processes are less categorically segregated in DP. Taken together, our study shows the importance of using naturalistic emotional stimuli for a better understanding of developmental face deficits.
Social context influences recognition of bodily expressions
Previous studies have shown that recognition of facial expressions is influenced by the affective information provided by the surrounding scene. The goal of this study was to investigate whether similar effects could be obtained for bodily expressions. Images of emotional body postures were briefly presented as part of social scenes showing either neutral or emotional group actions. In Experiment 1, fearful and happy bodies were presented in fearful, happy, neutral and scrambled contexts. In Experiment 2, we compared happy with angry body expressions. In Experiments 3 and 4, we blurred the facial expressions of all people in the scene. This way, we were able to ascribe possible scene effects to the presence of body expressions visible in the scene, and we were able to measure the contribution of facial expressions to body expression recognition. In all experiments, we observed an effect of social scene context. Bodily expressions were better recognized when the actions in the scenes expressed an emotion congruent with the bodily expression of the target figure. The specific influence of facial expressions in the scene depended on the emotional expression but did not necessarily increase the congruency effect. Taken together, the results show that the social context influences our recognition of a person's bodily expression.
Context Modulation of Facial Emotion Perception Differed by Individual Difference
Background: Certain facial configurations are believed to be associated with distinct affective meanings (i.e. basic facial expressions), and such associations are common across cultures (i.e. universality of facial expressions). Recently, however, many studies have suggested that various types of contextual information, rather than facial configuration itself, are important factors for facial emotion perception. Methodology/Principal Findings: To examine systematically how contextual information influences individuals' facial emotion perception, the present study directly estimated observers' perceptual thresholds for detecting negative facial expressions via a forced-choice psychophysical procedure using faces embedded in various emotional contexts. We additionally measured individual differences in affective information-processing tendency (BIS/BAS) as a possible factor that may determine the extent to which contextual information is used in facial emotion perception. Contextual information was found to influence observers' perceptual thresholds for facial emotion. Importantly, individuals' affective information-processing tendencies modulated the extent to which they incorporated context information into their facial emotion perceptions. Conclusions/Significance: The findings of this study suggest that facial emotion perception depends not only on facial configuration, but also on the context in which the face appears. This contextual influence appeared differently with individual differences.
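The forced-choice threshold procedure described above can be illustrated with a simple adaptive staircase. This is a minimal sketch under assumed parameters (start intensity, step size, rule), not the authors' actual procedure; the 1-up/2-down rule shown here is one common way such thresholds are estimated.

```python
# Hypothetical 1-up/2-down staircase for forced-choice threshold
# estimation: intensity drops after two consecutive correct responses
# and rises after one error, converging near the ~70.7%-correct point
# of the psychometric function. All parameter values are illustrative.
def run_staircase(respond, start=0.5, step=0.05, n_trials=40):
    """Track stimulus intensity over trials; `respond(intensity)`
    returns True when the observer answers correctly."""
    intensity = start
    correct_streak = 0
    history = []
    for _ in range(n_trials):
        history.append(intensity)
        if respond(intensity):
            correct_streak += 1
            if correct_streak == 2:          # two correct in a row
                intensity = max(0.0, intensity - step)
                correct_streak = 0
        else:                                 # one error
            intensity += step
            correct_streak = 0
    return history

# Toy deterministic observer: correct whenever intensity exceeds 0.3,
# so the staircase should settle near that boundary.
trace = run_staircase(lambda x: x > 0.3)
```

In a real experiment the threshold is usually taken as the mean of the last few reversal points of `trace` rather than the final value.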
Face Coding Is Bilateral in the Female Brain
Background: It is currently believed that face processing predominantly activates the right hemisphere in humans, but the available literature is very inconsistent. Methodology/Principal Findings: In this study, ERPs were recorded in 50 right-handed women and men in response to 390 faces (of different age and sex) and 130 technological objects. Results showed no sex difference in the amplitude of N170 to objects; a much larger face-specific response over the right hemisphere in men, and a bilateral response in women; a lack of face-age coding effect over the left hemisphere in men, with no differences in N170 to faces as a function of age; and a significant bilateral face-age coding effect in women. Conclusions/Significance: LORETA reconstruction showed a significant left and right asymmetry in the activation of the fusiform gyrus (BA19) in women and men, respectively. The present data reveal a lesser degree of lateralization of brain functions related to face coding in women than in men. In this light, they may provide an explanation of the inconsistencies in the available literature concerning the asymmetric activity of left and right occipito-temporal cortices devoted to face processing.
Gender differences in hemispheric asymmetry for face processing
BACKGROUND: Current cognitive neuroscience models predict a right-hemispheric dominance for face processing in humans. However, neuroimaging and electromagnetic data in the literature provide conflicting evidence of a right-sided brain asymmetry for decoding the structural properties of faces. The purpose of this study was to investigate whether this inconsistency might be due to gender differences in hemispheric asymmetry. RESULTS: In this study, event-related brain potentials (ERPs) were recorded in 40 healthy, strictly right-handed individuals (20 women and 20 men) while they observed infants' faces expressing a variety of emotions. Early face-sensitive P1 and N1 responses to neutral vs. affective expressions were measured over the occipital/temporal cortices, and the responses were analyzed according to viewer gender. Along with a strong right-hemispheric dominance for men, the results showed a lack of asymmetry for face processing in the amplitude of the occipito-temporal N1 response in women to both neutral and affective faces. CONCLUSION: Men showed asymmetric functioning of visual cortex while decoding faces and expressions, whereas women showed more bilateral functioning. These results indicate the importance of gender effects in the lateralization of the occipito-temporal response during the processing of face identity, structure, familiarity, or affective content.
Early Left-Hemispheric Dysfunction of Face Processing in Congenital Prosopagnosia: An MEG Study
Electrophysiological research has demonstrated the relevance to face processing of a negative deflection peaking around 170 ms, labelled accordingly as N170 in the electroencephalogram (EEG) and M170 in magnetoencephalography (MEG). The M170 was shown to be sensitive to the inversion of faces and to familiarity, two factors that are assumed to be crucial for congenital prosopagnosia. In order to locate the cognitive dysfunction and its neural correlates, we investigated the time course of neural activity in response to these manipulations. Seven individuals with congenital prosopagnosia and seven matched controls participated in the experiment. To explore brain activity with high accuracy in time, we recorded evoked magnetic fields (275-channel whole-head MEG) while participants were looking at faces differing in familiarity (famous vs. unknown) and orientation (upright vs. inverted). The underlying neural sources were estimated by means of the least-squares minimum-norm-estimation (L2-MNE) approach. The behavioural data corroborate earlier findings on impaired configural processing in congenital prosopagnosia. For the M170, the overall results replicated earlier findings, with larger occipito-temporal brain responses to inverted than upright faces, and more right- than left-hemispheric activity. Compared to controls, participants with congenital prosopagnosia displayed a general decrease in brain activity, primarily over left occipito-temporal areas. This attenuation did not interact with familiarity or orientation. The study substantiates the finding of an early involvement of the left hemisphere in symptoms of prosopagnosia. This might be related to an efficient and overused featural processing strategy which serves as compensation for impaired configural processing.
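The L2 minimum-norm estimation mentioned above inverts a leadfield matrix that maps source currents to sensor readings. A minimal numpy sketch of the underlying computation, with an illustrative toy leadfield (the matrix sizes, regularization value, and function name are assumptions, not details from the study):

```python
import numpy as np

def minimum_norm_estimate(L, b, lam=1e-2):
    """L2 minimum-norm source estimate for sensor data b, given a
    leadfield L (sensors x sources):
        j_hat = L.T @ inv(L @ L.T + lam * I) @ b
    lam is a Tikhonov regularization parameter for noisy data."""
    n_sensors = L.shape[0]
    gram = L @ L.T + lam * np.eye(n_sensors)
    return L.T @ np.linalg.solve(gram, b)

# Toy example: 5 sensors, 20 candidate sources, one active source.
rng = np.random.default_rng(0)
L = rng.standard_normal((5, 20))
j_true = np.zeros(20)
j_true[3] = 1.0
b = L @ j_true                      # noiseless sensor data
j_hat = minimum_norm_estimate(L, b, lam=1e-6)
```

Because the problem is underdetermined (more sources than sensors), `j_hat` is the smallest-norm current distribution consistent with the measured field; it reproduces `b` but spreads activity across sources rather than recovering `j_true` exactly.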