
    Estimating the Racial Composition of Groups of Faces: An Ensemble Other-Race Effect

    In the current study, we presented Asian and Caucasian participants with brief displays containing 16 faces and asked them to judge whether more Asians or more Caucasians were present. We varied the physical proportion of each race in the display using the method of constant stimuli and obtained estimates of the point of subjective equality (PSE) by fitting cumulative normal functions to individual data. Consistent with recent findings on "ensemble" face processing, participants were able to make group estimates quite accurately. However, the estimates from the two groups of participants did not overlap: Asian participants appeared to weight other-race faces more heavily than Caucasian participants did. To our knowledge, this is the first demonstration of an other-race effect in the context of groups of faces.
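    As a rough illustration of the analysis described above, the sketch below fits a cumulative normal psychometric function to hypothetical proportion-judgement data and reads off the PSE; the data values and variable names are assumptions, not the study's data.

```python
# Minimal sketch: estimating a point of subjective equality (PSE) by fitting a
# cumulative normal to proportion-of-"more Asians" responses (hypothetical data).
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Proportion of Asian faces in the 16-face display (method of constant stimuli)
proportion_asian = np.array([0.125, 0.25, 0.375, 0.5, 0.625, 0.75, 0.875])
# Hypothetical proportion of trials on which a participant responded "more Asians"
p_respond_asian = np.array([0.05, 0.15, 0.35, 0.55, 0.80, 0.92, 0.98])

def cumulative_normal(x, mu, sigma):
    """Psychometric function: mu is the PSE, sigma the slope parameter."""
    return norm.cdf(x, loc=mu, scale=sigma)

(pse, sigma), _ = curve_fit(cumulative_normal, proportion_asian, p_respond_asian,
                            p0=[0.5, 0.2])
print(f"Estimated PSE: {pse:.3f} (0.5 = physically balanced display)")
```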

    “There Is No (Where a) Face Like Home”: Recognition and Appraisal Responses to Masked Facial Dialects of Emotion in Four Different National Cultures

    The theory of universal emotions suggests that certain emotions, such as fear, anger, disgust, sadness, surprise and happiness, are encountered cross-culturally. These emotions are expressed using specific facial movements that enable human communication. More recently, theoretical and empirical models have proposed that universal emotions may be expressed via discretely different facial movements in different cultures, owing to the non-convergent social evolution that takes place in different geographical areas. This has prompted the suggestion that own-culture emotional faces carry distinct, evolutionarily important sociobiological value and can be processed automatically and without conscious awareness. In this paper, we tested this hypothesis using backward masking. In two experiments per country of origin, we showed backward-masked own- and other-culture emotional faces to participants in Britain, Chile, New Zealand and Singapore, and assessed detection and recognition performance along with self-reports of emotionality and familiarity. Using Bayesian assessment of non-parametric receiver operating characteristics and hit-versus-miss detection and recognition response analyses, we present thorough cross-cultural experimental evidence that masked faces showing own-culture dialects of emotion were rated higher for emotionality and familiarity than other-culture emotional faces, and that this effect involved conscious awareness.
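    The abstract's analysis is Bayesian, but the underlying non-parametric ROC idea can be illustrated with the classic A' index computed from hit and false-alarm rates; the sketch below uses made-up rates purely for illustration.

```python
# Minimal sketch: a non-parametric index of detection sensitivity (A'), computed
# from hit and false-alarm rates; the numbers below are illustrative only.
def a_prime(hit_rate: float, fa_rate: float) -> float:
    """Pollack & Norman's non-parametric estimate of the area under the ROC."""
    if hit_rate >= fa_rate:
        return 0.5 + ((hit_rate - fa_rate) * (1 + hit_rate - fa_rate)) / (4 * hit_rate * (1 - fa_rate))
    # Mirror-image formula when performance is below chance
    return 0.5 - ((fa_rate - hit_rate) * (1 + fa_rate - hit_rate)) / (4 * fa_rate * (1 - hit_rate))

# Hypothetical recognition data for own- vs other-culture masked faces
print(a_prime(hit_rate=0.72, fa_rate=0.28))  # own-culture
print(a_prime(hit_rate=0.61, fa_rate=0.33))  # other-culture
```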

    How well do computer-generated faces tap face expertise?

    The use of computer-generated (CG) stimuli in face processing research is proliferating because of the ease with which faces can be generated, standardised and manipulated. However, there has been surprisingly little research into whether CG faces are processed in the same way as photographs of real faces. The present study assessed how well CG faces tap face identity expertise by investigating whether two indicators of face expertise are reduced for CG faces compared with face photographs. These indicators were accuracy for identification of own-race faces and the other-race effect (ORE), the well-established finding that own-race faces are recognised more accurately than other-race faces. In Experiment 1, Caucasian and Asian participants completed a recognition memory task for own- and other-race real and CG faces. Overall accuracy for own-race faces was dramatically reduced for CG compared with real faces, and the ORE was significantly and substantially attenuated for CG faces. Experiment 2 investigated perceptual discrimination for own- and other-race real and CG faces with Caucasian and Asian participants. Here again, accuracy for own-race faces was significantly reduced for CG compared with real faces; however, the ORE was not affected by format. Together these results signal that CG faces of the type tested here do not fully tap face expertise. Technological advancement may, in the future, produce CG faces that are equivalent to real photographs; until then, caution is advised when interpreting results obtained using CG faces.
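    One common way to quantify recognition memory accuracy and the ORE in designs like Experiment 1 is the signal-detection measure d'; the sketch below shows that computation on invented hit/false-alarm rates (the condition names and numbers are assumptions, not the study's data).

```python
# Minimal sketch: quantifying recognition accuracy with d' = z(hits) - z(false alarms)
# for own- vs other-race faces in real-photograph vs computer-generated (CG) format.
from scipy.stats import norm

def d_prime(hit_rate: float, fa_rate: float) -> float:
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

# Hypothetical (hit rate, false-alarm rate) pairs per condition
conditions = {
    ("real", "own-race"):   (0.80, 0.20),
    ("real", "other-race"): (0.65, 0.30),
    ("CG",   "own-race"):   (0.68, 0.28),
    ("CG",   "other-race"): (0.60, 0.32),
}
for (fmt, race), (h, f) in conditions.items():
    print(f"{fmt:>4} {race:<11} d' = {d_prime(h, f):.2f}")

# The ORE per format is the own-race minus other-race d' difference.
```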

    Factors Associated with Revision Surgery after Internal Fixation of Hip Fractures

    Background: Femoral neck fractures are associated with high rates of revision surgery after management with internal fixation. Using data from the Fixation using Alternative Implants for the Treatment of Hip fractures (FAITH) trial, which evaluated methods of internal fixation in patients with femoral neck fractures, we investigated associations between baseline and surgical factors and the need for revision surgery to promote healing, relieve pain, treat infection or improve function over 24 months postsurgery. Additionally, we investigated factors associated with (1) hardware removal and (2) implant exchange from cancellous screws (CS) or a sliding hip screw (SHS) to total hip arthroplasty, hemiarthroplasty, or another internal fixation device. Methods: We identified a priori 15 potential factors that may be associated with revision surgery, 7 with hardware removal, and 14 with implant exchange. We used multivariable Cox proportional hazards analyses in our investigation. Results: Factors associated with an increased risk of revision surgery included: female sex [hazard ratio (HR) 1.79, 95% confidence interval (CI) 1.25-2.50; P = 0.001], higher body mass index (fo
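    For readers unfamiliar with the analysis named above, the following is a minimal sketch of fitting a multivariable Cox proportional hazards model with the lifelines library; the file and column names are hypothetical and do not come from the FAITH dataset.

```python
# Minimal sketch of a multivariable Cox proportional hazards model of time to
# revision surgery, using the lifelines library; all names below are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("faith_revision.csv")  # hypothetical file: one row per patient
# Expected columns (illustrative): months_to_revision, revision (0/1),
# female, bmi, displaced_fracture, implant_shs
cph = CoxPHFitter()
cph.fit(df, duration_col="months_to_revision", event_col="revision")
cph.print_summary()  # hazard ratios with 95% confidence intervals per predictor
```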

    EMAP Open Dataset

    EMAP is a dataset of 145 individuals' reactions to emotion-provoking film clips. It includes electroencephalographic (EEG) and peripheral physiological data as well as moment-by-moment ratings of emotional arousal, in addition to overall and categorical ratings. The dataset includes "raw" and "cleaned" EEG data versions, as well as skin conductance, heart rate and respiration data and subjective ratings.

    Folder and file labels and contents:
    - Both clean and raw data are available in CSV and EEGLAB .set file formats. The data labelled 'P1-50', 'P51-100' and 'P101-153' correspond to participants 1-50, 51-100 and 101-153, respectively.
    - The repository also includes already extracted EEG features, labelled 'Features', for researchers interested in using the features from our previous study.
    - Data for some participants are missing, which is why the dataset contains information for 145 participants rather than 153.
    - The accompanying "session" .csv files contain demographics and metadata, as well as detailed information on missing participants, trials, sensors, and data.
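    As a starting point, data in the formats listed above could be loaded roughly as in the sketch below; the exact file names and folder paths are assumptions based on the labels described, not the repository's actual layout.

```python
# Minimal sketch of loading EMAP-style files; file names/paths are assumptions.
import pandas as pd
import mne

# Session-level demographics and metadata (hypothetical file name)
session_info = pd.read_csv("EMAP_session_info.csv")

# One participant's cleaned EEG in EEGLAB .set format (hypothetical path)
raw = mne.io.read_raw_eeglab("Clean_P1-50/participant_001_clean.set", preload=True)
print(raw.info)

# The same data are also distributed as CSV, e.g. one file per participant
eeg_csv = pd.read_csv("Clean_P1-50/participant_001_clean.csv")
```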

    Perception and imagery of faces generate similar gender aftereffects

    Visual adaptation is known to bias perception away from the properties of the adapting stimulus, toward opposite properties, resulting in perceptual aftereffects. For example, prolonged exposure to a face has been shown to produce an identity aftereffect, biasing perception of a subsequent face toward the opposite identity. Such repulsive aftereffects have been observed for both visually perceived and visually imagined faces, suggesting that both perception and imagery yield typical aftereffects. However, recent studies have reported opposite patterns of aftereffects for perception and imagery of face gender. In these studies, visually perceived faces produced typical effects, in which perception of androgynous faces was biased away from the gender of the adaptor, whereas imagery of the same stimuli produced atypical aftereffects, biasing the perceived gender of androgynous faces toward the gender of the adaptor. These findings are highly unusual and warrant further research. The present study aimed to gather new evidence on the direction of gender aftereffects following perception and imagery of faces. In Experiment 1, participants viewed and imagined female and male faces of famous and non-famous individuals. To determine the effect of concomitant visual stimulation on imagery and adaptation, participants visualized faces both in the presence and in the absence of a visual input. In Experiment 2, participants were adapted to perceived and imagined faces of famous and non-famous actors matched on gender typicality. This manipulation allowed us to determine the effect of face familiarity on the magnitude of gender aftereffects. Contrary to evidence from previous studies, our results demonstrated that both perception and imagery produced typical aftereffects, biasing the perceived gender of androgynous faces in the direction opposite to the gender of the adaptor. Famous faces yielded the largest adaptation effects across tasks, and Experiment 2 confirmed that these effects depended on familiarity rather than on sexual dimorphism. In both experiments, the effect was greater for perception than for imagery. Additionally, imagery of famous faces produced the strongest aftereffects when performed in the absence of visual stimulation. The implications of these findings are discussed.

    Visual perception and visual mental imagery of emotional faces generate similar expression aftereffects

    What is the relationship between visual perception and visual mental imagery of emotional faces? We investigated this question using a within-emotion perceptual adaptation paradigm in which adaptation to a strong version of an expression was paired with a test face displaying a weak version of the same emotion category. We predicted that within-emotion adaptation to perception and imagery of expressions would generate similar aftereffects, biasing perception of weak emotional test faces toward a more neutral value. Our findings confirmed this prediction. Adaptation to mental images yielded aftereffects that inhibited emotion recognition of the test expressions, as participants were less accurate at recognising these stimuli than at baseline. The same inhibitory effect was observed when the expressions were visually perceived, although the size of the aftereffects was greater for perception than for imagery. These findings suggest the existence of expression-selective neural mechanisms that subserve both visual perception and visual mental imagery of emotional faces.

    Holistic processing for left–right composite faces in Chinese and Caucasian observers

    In Caucasian individuals, holistic processing and face recognition are lateralized to the right hemisphere, whereas part-based processing and word recognition are lateralized to the left hemisphere. Whether this hemispheric complementarity holds more generally is unclear. We compared the hemispheric basis of holistic processing of faces in Caucasian and Chinese observers (who, as readers of logographic script, may have different hemispheric organization). Participants made same/different judgements about the left or right halves of two sequentially presented composite faces (each comprising the left half of one face and the right half of another) when the halves were aligned or misaligned. There was a larger congruency effect for aligned than for misaligned faces, reflecting significant holistic processing, and this was equivalent for face halves judged in the right and left visual fields and for Caucasian and Chinese observers. The same result was replicated in a second study with Caucasian observers, in which the cue was presented simultaneously with the study face rather than with the test face. These findings reflect equal participation of both hemispheres in holistic face perception and suggest that orthographic experience does not necessarily affect the hemispheric basis of holistic processing.
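    The "congruency effect" compared above is typically computed from trial-level accuracy as a congruency-by-alignment difference score; the sketch below shows one way to do that, with hypothetical file and column names rather than the study's actual data structure.

```python
# Minimal sketch: a congruency-by-alignment index of holistic processing from
# trial-level composite-face data; the DataFrame columns are hypothetical.
import pandas as pd

# Expected columns (illustrative): alignment ('aligned'/'misaligned'),
# congruency ('congruent'/'incongruent'), correct (0/1)
trials = pd.read_csv("composite_task.csv")

acc = trials.groupby(["alignment", "congruency"])["correct"].mean()
congruency_effect = (acc.xs("congruent", level="congruency")
                     - acc.xs("incongruent", level="congruency"))
# Holistic processing: a larger congruency effect for aligned than misaligned faces
holistic_index = congruency_effect["aligned"] - congruency_effect["misaligned"]
print(holistic_index)
```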

    Prediction of moment-by-moment heart rate and skin conductance changes in the context of varying emotional arousal

    Autonomic nervous system (ANS) responses such as heart rate (HR) and galvanic skin response (GSR) have been linked with cerebral activity in the context of emotion. Although much work has focused on the summative effect of emotions on ANS responses, their interaction in a continuously changing context is less clear. Here, we used a multimodal dataset of human affective states, which includes electroencephalogram (EEG) and peripheral physiological signals of participants' moment-by-moment reactions to emotion-provoking video clips, and modeled HR and GSR changes using machine learning techniques, specifically long short-term memory (LSTM) networks, decision trees (DT), and linear regression (LR). We found that LSTM achieved a significantly lower error rate than DT and LR, owing to its inherent ability to handle sequential data. Importantly, the prediction error for DT and LR was significantly reduced when these algorithms were combined with particle swarm optimization to select relevant features. Unlike summative analyses, and contrary to expectations, we found a significantly lower error rate when predictions were made across different participants than within a participant. Moreover, the selected predictive features suggest that the patterns predictive of HR and GSR differed substantially across electrode sites and frequency bands. Overall, these results indicate that specific patterns of cerebral activity track autonomic bodily responses. Although individual cerebral differences are important, they may not be the only factors influencing moment-by-moment changes in ANS responses.
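    To make the modelling approach concrete, the sketch below shows a bare-bones LSTM regressor mapping short sequences of EEG features to a moment-by-moment heart-rate target; the shapes, layer sizes, and random data are illustrative assumptions, not the study's architecture.

```python
# Minimal sketch of an LSTM regressor predicting moment-by-moment heart rate
# from sequences of EEG features; all dimensions and data are illustrative.
import numpy as np
from tensorflow import keras

n_windows, seq_len, n_features = 1000, 20, 64  # hypothetical dimensions
X = np.random.randn(n_windows, seq_len, n_features).astype("float32")
y = np.random.randn(n_windows).astype("float32")  # heart-rate change per window

model = keras.Sequential([
    keras.layers.Input(shape=(seq_len, n_features)),
    keras.layers.LSTM(32),
    keras.layers.Dense(1),  # predicted heart-rate (or skin-conductance) change
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=32, validation_split=0.2)
```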