Cross-modal face identity aftereffects and their relation to priming
We tested the magnitude of the face identity aftereffect following adaptation to different types of adaptor in four experiments. The perceptual midpoint between two morphed famous faces was measured pre- and post-adaptation. Significant aftereffects were observed for visual adaptors (faces) and non-visual adaptors (voices and names), but not for non-specific semantic information (e.g., occupations). Aftereffects were also observed following imagination and following adaptation to an associated person. The strongest aftereffects were found after adapting to facial caricatures. These results are discussed in terms of cross-modal adaptation occurring, analogously to priming, at various loci within the face-recognition system.
Sad people are more accurate at expression identification with a smaller own-ethnicity bias than happy people.
Sad individuals perform more accurately at face identity recognition (Hills, Werno, & Lewis, 2011), possibly because they scan more of the face during encoding. During expression identification tasks, sad individuals do not fixate on the eyes as much as happier individuals do (Wu, Pu, Allen, & Pauli, 2012). Fixating on features other than the eyes reduces the own-ethnicity bias (Hills & Lewis, 2006). This background suggests that sad individuals would view the eyes less than happy individuals, resulting in improved expression recognition and a reduced own-ethnicity bias. We tested this prediction using an expression identification task with eye tracking. We demonstrate that sad-induced participants show enhanced expression recognition and a smaller own-ethnicity bias than happy-induced participants, owing to their scanning more facial features. We conclude that mood affects eye movements and face encoding by producing a wider sampling strategy and deeper encoding of the facial features diagnostic for expression identification.
Eye-tracking the own-gender bias in face recognition: Other-gender faces are viewed differently to own-gender faces.
Research on the own-gender bias in face recognition has indicated an asymmetrical effect: an effect found only in women. We investigated the own-gender bias using an eye-tracker to examine whether it is associated with differential processing strategies. We found an own-gender bias in our female participants. Our eye-tracking analysis indicated different scanning behaviours when processing own- and other-gender faces, with longer and more fixations to the eyes when viewing own-gender faces. Our results favour the socio-cognitive model, whilst acknowledging a role for perceptual expertise in the own-gender bias.
Technological Developments in Road Pricing
Institute of Transport and Logistics Studies, Business School, The University of Sydney
Children process the self face using configural and featural encoding: Evidence from eye tracking.
Much is known about how the self-face is processed neurologically; however, there has been little work exploring how self, familiar, and unfamiliar faces are viewed differently. Eye-movement data provide insights into how these stimuli are encoded, and pupillometry provides information about the amount of effort expended when processing them. In this study, we used eye-tracking to explore differences in the encoding of the self-face, age- and gender-matched personally familiar faces, and age- and gender-matched unfamiliar faces in school-aged children. The self-face was processed using more fixations than familiar and unfamiliar faces, specifically to the most diagnostic features, indicating enhanced and efficient use of featural processing. Furthermore, the self-face was processed with more and longer central fixations than unfamiliar faces, indicating enhanced use of configural processing. Finally, the self-face appeared to be processed the most efficiently, as revealed by our pupillometry data. These results are incorporated into a model of self-face processing based on efficient and robust processing, consistent with the neurological data indicating that multiple brain areas are used to process faces.
Explaining Sad People's Memory Advantage for Faces.
Sad people recognize faces more accurately than happy people (Hills et al., 2011). We devised four hypotheses for this finding and tested between them in the current study. The four hypotheses are: (1) sad people engage more in the expert processing associated with face perception; (2) sad people are motivated to be more accurate than happy people in an attempt to repair their mood; (3) sad people have a defocused attentional strategy that allows more information about a face to be encoded; and (4) sad people scan more of the face than happy people, leading to more facial features being encoded. In Experiment 1, we found that dysphoria (sad mood often associated with depression) was correlated with defocused attention and recognition accuracy, but not with the face-inversion effect (a measure of expert processing) or with response times. Experiment 2 established that dysphoric participants detected changes made to more facial features than happy participants. In Experiment 3, using eye-tracking, we found that sad-induced participants sampled more of the face whilst avoiding the eyes. Experiment 4 showed that sad-induced people demonstrated a smaller own-ethnicity bias. These results indicate that sad people allocate attention to faces differently than happy and neutral people.
Carryover of scanning behaviour affects upright face recognition differently to inverted face recognition.
Face perception is characterized by a distinct scanpath. While eye movements are considered functional, there has been no direct evidence that disrupting this scanpath affects face recognition performance. The present experiment investigated the influence of an irrelevant letter-search task (with letter strings arranged horizontally, vertically, or randomly) on the subsequent scanning strategies used in processing upright and inverted famous faces. Participants' response time to identify the face and the direction of their eye movements were recorded. The orientation of the letter search influenced saccadic direction when viewing the face images, such that a direct carryover effect was observed. Following a vertically oriented letter-search task, recognition of famous faces was slower and less accurate for upright faces, and faster for inverted faces. These results extend the carryover findings of Thompson and Crundall into a novel domain. Crucially, they also indicate that upright and inverted faces are better processed by different eye movements, highlighting the importance of scanpaths in face recognition.
The Role of Extraversion, IQ and Contact in the Own-Ethnicity Face Recognition Bias.
While IQ is weakly related to overall face recognition (Shakeshaft & Plomin, 2015), it plays a larger role in the processing of misaligned faces in the composite face task (Zhu et al., 2010). These stimuli are relatively novel and may reflect the involvement of intelligence in the processing of infrequently encountered faces, such as those of other ethnicities. Extraversion is associated with increased eye contact, which entails less viewing of the features diagnostic for Black faces. Using an old/new recognition paradigm, we found that IQ negatively correlated with the magnitude of the own-ethnicity bias (OEB) and that this relationship was moderated by contact with people from another ethnicity. We interpret these results in terms of IQ enhancing the ability to process novel stimuli by utilising multiple forms of coding. Extraversion was positively correlated with the OEB in White participants and negatively correlated with the OEB in Black participants, suggesting that extraverts attend less to the diagnostic facial features of Black faces, leading to poorer recognition of Black faces in both White and Black participants and thereby contributing to the relative OEB in these participants. The OEB is thus dependent on participant variables such as intelligence and extraversion.
The combined influence of the own-age, -gender, and -ethnicity biases on face recognition
Whether the own-group (own-ethnicity, own-gender, and own-age) biases in face recognition are based on the same mechanism, and whether their effects are additive, are as yet unanswered questions. Employing a standard old/new recognition paradigm, we investigated the combined crossover effects of the own-ethnicity, own-gender, and own-age biases in a group of 160 participants. Results showed that while face recognition accuracy decreased as the number of out-group features increased, the own-ethnicity bias appeared to have more of a unique influence on face recognition than the other biases. Furthermore, we established that in a single group of participants these biases appear to be based on different mechanisms: the own-ethnicity bias is based on individuation, whereas the own-age and own-gender biases are based on motivation.
Correlations between holistic processing, Autism quotient, extraversion, and experience and the own-gender bias in face recognition.
The variability in the own-gender bias (OGB) in face recognition is thought to be based on experience and the engagement of expert face-processing mechanisms for own-gender faces. Experience is also associated with personality characteristics such as extraversion and Autism, yet the effects of these variables on the own-gender bias have not been explored. We ran a face recognition study exploring the relationships between own-gender experience, holistic processing (measured using the face-inversion effect, the composite face effect, and the parts-and-wholes test), personality characteristics (extraversion and Autism Quotient), and the OGB. Findings did not support a mediational account in which experience increases holistic processing, which in turn increases the OGB. Rather, there was a direct relationship between extraversion and Autism Quotient and the OGB. We interpret this as personality characteristics affecting the motivation to process own-gender faces more deeply than opposite-gender faces.