    The influence of facial blushing and paling on emotion perception and memory

    Emotion expressions facilitate interpersonal communication by conveying information about a person’s affective state. The current work investigates how facial coloration (i.e., subtle changes in chromaticity from baseline facial color) impacts the perception of, and memory for, emotion expressions, and whether these depend on dynamic (vs. static) representations of emotional behavior. Emotion expressive stimuli that either did or did not vary in facial coloration were shown to participants who were asked to categorize and rate the stimuli’s intensity (Exps. 1 & 2), as well as recall their degree of facial coloration (Exps. 3 & 4). Results showed that changes in facial coloration facilitated emotion categorization accuracy in dynamic (Exp. 1) but not static expressions (Exp. 2). Facial coloration further increased perceived emotion intensity, with participants misremembering the coloration of both dynamic and static expressions differently depending on emotion category prototype (Exps. 3 & 4). Together, these findings indicate that facial coloration conveys affective information to observers and contributes to biases in how emotion expressions are perceived and remembered.

    Neural correlates of the perception of dynamic versus static facial expressions of emotion

    Aim: This study investigated brain areas involved in the perception of dynamic facial expressions of emotion.

    The Look of Fear from the Eyes Varies with the Dynamic Sequence of Facial Actions

    Most research on the ability to interpret expressions from the eyes has utilized static information. This research investigates whether the dynamic sequence of facial actions in the eye region influences the judgments of perceivers. Dynamic fear expressions involving the eye region and eyebrows were created that systematically differed in the sequential occurrence of facial actions. Participants rated the intensity of sequential fear expressions, either in addition to a simultaneous, full-blown expression (Experiment 1) or in combination with different levels of eye gaze (Experiment 2). The results showed that the degree of attributed emotion and the appraisal ratings differed as a function of the sequence of facial expressions of fear, with direct gaze resulting in stronger subjective responses. The findings challenge current notions surrounding the study of static facial displays from the eyes and suggest that emotion perception is a dynamic process shaped by the time course of the facial actions of an expression. Possible implications for the field of affective computing and clinical research are discussed.

    Animating believable facial expressions: is it possible to choreograph perceptually valid emotional expressions?

    In psychology research, the nuances of facial movement have been investigated to determine the key characteristics of emotional expressions. Studies have shown that movement alone can communicate emotion, that particular facial regions have varying degrees of importance to expression recognition, and that temporal factors might affect the appearance of facial regions during expressions. Furthermore, researchers have suggested that emotional expressions could have temporal configurations. If facial expressions have perceptually valid (or invalid) configurations of facial actions over time, then a detailed study of how configuration manipulation affects perception could inform the practice of character animation. Studies of facial expression (most notably Ekman and Friesen’s Facial Action Coding System) have had an impact on animation research and application. However, most studies have focused on static expressions. A better understanding of the choreography of authentic dynamic expressions (where choreography could be described as the sequence, timing, and duration of regional facial movement within and between expressions) could be of more value to practicing animators. In this paper, the authors discuss ‘Emotional Avatars’, an interdisciplinary research project that aims to expand upon the findings of psychology-based methods in order to inform artistic practice. Drawing upon the experience of animators and psychologists, the primary aim of the project is to determine whether audiences perceive certain choreographies of facial movement to be more or less authentic. The current research concerns the sequence and timing of regional movement within and between expressions of emotion. Using existing resources as a basis for peak expression appearance, the authors systematically generate and manipulate animated expressions of emotion based on the generally accepted ‘universal expressions’ (happiness, sadness, anger, fear, disgust, and surprise). By producing a range of animations with variation in sequence, and then testing observer perception of the animations under controlled conditions, the goal of the current research is to identify potential valid and invalid emotional expression choreographies.

    A Book and its Cover: The effects of dynamic and static facial expressions on the perception of personality traits

    This study used three dynamic and three static images of older adult men depicting either smiling, scowling, or neutral facial expressions to examine the influence of motion on emotion identification and stereotype activation, specifically the Halo Effect, in older adults (55-85 years). To that end, two hypotheses emerged: 1) older adults would be more accurate in identifying facial expressions when viewing dynamic facial expressions than static facial expressions, and 2) participants exposed to the dynamic stimuli would experience greater levels of the Halo Effect, with the greatest levels in the smiling facial expression condition. A 2 (stimulus type: dynamic and static) x 3 (facial expression: smile, neutral, scowl) mixed design was used. Two hundred participants between the ages of 55 and 85 years viewed either a dynamic model exhibiting smiling, neutral, and scowling facial expressions, or a static model exhibiting the same expressions. To investigate the role of motion on emotion identification, an emotion accuracy question was used. Additionally, two measures assessed the presence of the Halo Effect: the Self-Assessment Manikin (e.g., arousal, dominance, and pleasure) and four social perception questions (e.g., attractiveness, honesty, pleasing to look at, and threatening). Results indicate that participants were more accurate when identifying static scowling and smiling facial expressions and the dynamic neutral facial expression. Participants also attributed more positive traits to static rather than dynamic facial expressions.

    Perception of dynamic facial expressions of emotion between dogs and humans

    Facial expressions are a core component of the emotional response of social mammals. In contrast to Darwin’s original proposition, expressive facial cues of emotion appear to have evolved to be species-specific. Faces trigger an automatic perceptual process, and so inter-specific emotion perception is potentially a challenge, since observers should not try to “read” heterospecific facial expressions in the same way that they do conspecific ones. Using dynamic spontaneous facial expression stimuli, we report the first inter-species eye-tracking study on fully unrestrained participants and without pre-experiment training to maintain attention to stimuli, to compare how two different species living in the same ecological niche, humans and dogs, perceive each other’s facial expressions of emotion. Humans and dogs showed different gaze distributions when viewing the same facial expressions of either humans or dogs. Humans modulated their gaze depending on the area of interest (AOI) being examined, emotion, and species observed, but dogs modulated their gaze depending on AOI only. We also analysed whether the gaze distribution was random across AOIs in both species: in humans, eye movements were not correlated with the diagnostic facial movements occurring in the emotional expression, and in dogs, there was only a partial relationship. This suggests that the scanning of facial expressions is a relatively automatic process. Thus, to read other species’ facial emotions successfully, individuals must overcome these automatic perceptual processes and employ learning strategies to appreciate the inter-species emotional repertoire.

    Equipping Social Robots with Culturally-Sensitive Facial Expressions of Emotion Using Data-Driven Methods

    Social robots must be able to generate realistic and recognizable facial expressions to engage their human users. Many social robots are equipped with standardized facial expressions of emotion that are widely considered to be universally recognized across all cultures. However, mounting evidence shows that these facial expressions are not universally recognized; for example, they elicit significantly lower recognition accuracy in East Asian cultures than they do in Western cultures. Therefore, without culturally sensitive facial expressions, state-of-the-art social robots are restricted in their ability to engage a culturally diverse range of human users, which in turn limits their global marketability. To develop culturally sensitive facial expressions, novel data-driven methods are used to model the dynamic face movement patterns that convey basic emotions (e.g., happy, sad, angry) in a given culture using cultural perception. Here, we tested whether such dynamic facial expression models, derived in an East Asian culture and transferred to a popular social robot, improved the social signalling generation capabilities of the social robot with East Asian participants. Results showed that, compared to the social robot's existing set of facial ‘universal’ expressions, the culturally-sensitive facial expression models were recognized with generally higher accuracy and judged as more human-like by East Asian participants. We also detail the specific dynamic face movements (Action Units) that are associated with high recognition accuracy and judgments of human-likeness, including those that further boost performance. Our results therefore demonstrate the utility of using data-driven methods that employ human cultural perception to derive culturally-sensitive facial expressions that improve the social face signal generation capabilities of social robots. We anticipate that these methods will continue to inform the design of social robots and broaden their usability and global marketability.

    A multimodal investigation of dynamic face perception using functional magnetic resonance imaging and magnetoencephalography

    Motion is an important aspect of face perception that has been largely neglected to date. Many of the established findings are based on studies that use static facial images, which do not reflect the unique temporal dynamics available from seeing a moving face. In the present thesis, a set of naturalistic dynamic facial emotional expressions was purposely created and used to investigate the neural structures involved in the perception of dynamic facial expressions of emotion, with both functional Magnetic Resonance Imaging (fMRI) and Magnetoencephalography (MEG). Through fMRI and connectivity analysis, a dynamic face perception network was identified, which is demonstrated to extend the distributed neural system for face perception (Haxby et al., 2000). Measures of effective connectivity between these regions revealed that dynamic facial stimuli were associated with specific increases in connectivity between early visual regions, such as inferior occipital gyri and superior temporal sulci, along with coupling between superior temporal sulci and amygdalae, as well as with inferior frontal gyri. MEG and Synthetic Aperture Magnetometry (SAM) were used to examine the spatiotemporal profile of neurophysiological activity within this dynamic face perception network. SAM analysis revealed a number of regions showing differential activation to dynamic versus static faces in the distributed face network, characterised by decreases in cortical oscillatory power in the beta band, which were spatially coincident with those regions that were previously identified with fMRI. These findings support the presence of a distributed network of cortical regions that mediate the perception of dynamic facial expressions, with the fMRI data providing information on the spatial co-ordinates paralleled by the MEG data, which indicate the temporal dynamics within this network. This integrated multimodal approach offers excellent spatial and temporal resolution, thereby providing an opportunity to explore dynamic brain activity and connectivity during face processing.

    The influence of facial motion on the neural response during emotion perception in typical and atypical development

    The ability to interpret emotional expressions is the key to understanding our social environment. In our everyday lives we are exposed to a huge variety of facial expressions which are constantly updated in response to environmental cues. The neural networks underpinning our cognitive ability to perceive dynamic emotional expressions are poorly understood. This thesis aims to address the effects of motion on our perception of emotional expression from a developmental perspective. The overall aim was to compare the neural correlates of emotion perception of static and dynamic images for the six basic facial expressions in typical and atypical development. Three populations were studied: 1) typically developed adults; 2) atypically developed adults, i.e. young adults who have undergone a surgical resection for paediatric temporal lobe epilepsy; and 3) typically developing infants (4-12-month-olds). Initially, morphed dynamic images for the six basic facial expressions were created, to be used in subsequent studies. These were validated, alongside static photographs, with ratings for accuracy, confidence and intensity. The first and second event-related potential (ERP) studies, involving typically developed adults and atypically developed adults respectively, explored the amplitude and latency of the P1 and N170 ERP components in response to observing static and dynamic images of facial expressions. The final study, involving typically developing infants, explored the amplitude and latency of the P1 and N290 (the N170 precursor). The impact of motion on the development of emotion perception is discussed in relation to the findings presented in this thesis.

    When facial expressions do and do not signal minds: the role of face inversion, expression dynamism, and emotion type

    Recent research has linked facial expressions to mind perception. Specifically, Bowling and Banissy (2017) found that ambiguous doll-human morphs were judged as more likely to have a mind when smiling. Herein, we investigate three key potential boundary conditions of this “expression-to-mind” effect. First, we demonstrate that face inversion impairs the ability of happy expressions to signal mindful states in static faces; however, inversion does not disrupt this effect for dynamic displays of emotion. Finally, we demonstrate that not all emotions have equivalent effects. Whereas happy faces generate more mind ascription than neutral faces, we find that expressions of disgust actually generate less mind ascription than those of happiness.