
    Facial actions as visual cues for personality

    What visual cues do human viewers use to assign personality characteristics to animated characters? While most facial animation systems associate facial actions with a limited set of emotional states or with speech content, the present paper explores this question by relating the perception of personality to a wide variety of facial actions (e.g., head tilting and turning, eyebrow raising) and emotional expressions (e.g., smiles and frowns). Animated characters exhibiting these actions and expressions were presented to human viewers in brief videos. The viewers rated the personalities of these characters using a well-standardized adjective rating system borrowed from the psychological literature. These personality descriptors are organized in a multidimensional space based on the orthogonal dimensions of Desire for Affiliation and Displays of Social Dominance. The main result of the rating data was that human viewers very reliably associated individual facial actions and emotional expressions with specific personality characteristics. In particular, dynamic facial actions such as head tilting and gaze aversion tended to spread ratings along the Dominance dimension, whereas facial expressions of contempt and smiling tended to spread ratings along the Affiliation dimension. Furthermore, increasing the frequency and intensity of the head actions increased the perceived Social Dominance of the characters. We interpret these results as pointing to a reliable link between animated facial actions/expressions and the personality attributions they evoke in human viewers. The paper shows how these findings are used in our facial animation system to create perceptually valid personality profiles, with Dominance and Affiliation as two parameters that control the facial actions of autonomous animated characters.
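The abstract describes a two-parameter control scheme in which Dominance and Affiliation settings drive a character's facial actions. A minimal sketch of such a mapping is given below; all parameter names, ranges, and scaling choices are illustrative assumptions, not the paper's actual implementation. The only behaviours taken from the abstract are the directions of the reported effects: more frequent/intense head actions raise perceived Dominance, while smiling vs. contempt shifts perceived Affiliation.

```python
def facial_action_profile(dominance: float, affiliation: float) -> dict:
    """Map two personality parameters in [-1, 1] to facial-action settings.

    Hypothetical sketch: the keys and the linear scalings are assumptions;
    only the sign of each effect follows the findings summarised above.
    """
    assert -1.0 <= dominance <= 1.0 and -1.0 <= affiliation <= 1.0
    return {
        # More frequent/intense head actions -> higher perceived Dominance.
        "head_action_rate": 0.5 * (1.0 + dominance),   # normalised frequency
        "head_action_intensity": 0.5 * (1.0 + dominance),
        # Smiling raises, contempt lowers, perceived Affiliation.
        "smile_intensity": max(0.0, affiliation),
        "contempt_intensity": max(0.0, -affiliation),
    }
```

A caller would sample or animate facial actions at the returned rates/intensities, so a single (Dominance, Affiliation) pair yields a consistent behavioural profile.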

    Detecting Low Rapport During Natural Interactions in Small Groups from Non-Verbal Behaviour

    Rapport, the close and harmonious relationship in which interaction partners are "in sync" with each other, has been shown to result in smoother social interactions, improved collaboration, and better interpersonal outcomes. In this work, we are the first to investigate automatic prediction of low rapport during natural interactions within small groups. This task is challenging given that rapport manifests only in subtle non-verbal signals that are, in addition, subject to the influences of group dynamics as well as interpersonal idiosyncrasies. We record videos of unscripted discussions among three to four people using a multi-view camera system and microphones. We analyse a rich set of non-verbal signals for rapport detection, namely facial expressions, hand motion, gaze, speaker turns, and speech prosody. Using facial features, we can detect low rapport with an average precision of 0.7 (chance level: 0.25), while incorporating prior knowledge of participants' personalities even enables early prediction without a drop in performance. We further provide a detailed analysis of different feature sets and of the amount of information contained in different temporal segments of the interactions.
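The headline metric here is average precision (0.7 vs. a chance level of 0.25, i.e. the fraction of low-rapport cases). For readers unfamiliar with the metric, a minimal self-contained implementation is sketched below; it is a standard definition, not code from the paper.

```python
def average_precision(scores, labels):
    """Average precision for a binary detection task.

    Rank examples by descending score, take the precision at each position
    where a positive (label 1) appears, and average those precisions.
    A random ranking yields roughly the positive-class fraction (chance).
    """
    ranked = sorted(zip(scores, labels), key=lambda pair: -pair[0])
    hits, precisions = 0, []
    for rank, (_, y) in enumerate(ranked, start=1):
        if y == 1:
            hits += 1
            precisions.append(hits / rank)  # precision at this recall point
    return sum(precisions) / len(precisions) if precisions else 0.0
```

A perfect ranking (all positives scored above all negatives) gives 1.0, which is why 0.7 against a 0.25 chance level is a substantial result.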

    First impressions: A survey on vision-based apparent personality trait analysis

    © 2019 IEEE. Personality analysis has been widely studied in psychology, neuropsychology, and signal processing, among other fields. In the past few years, it has also become an attractive research area in visual computing. From the computational point of view, speech and text have by far been the most considered cues for analyzing personality. Recently, however, there has been increasing interest from the computer vision community in analyzing personality from visual data. Recent computer vision approaches are able to accurately analyze human faces, body postures, and behaviors, and to use this information to infer apparent personality traits. Because of the overwhelming research interest in this topic, and of the potential impact such methods could have on society, this paper presents an up-to-date review of existing vision-based approaches to apparent personality trait recognition. We describe seminal and cutting-edge works on the subject, discussing and comparing their distinctive features and limitations. Future avenues of research in the field are identified and discussed. Furthermore, we review aspects of subjectivity in data labeling and evaluation, as well as current datasets and challenges organized to push research in the field forward.

    Body Motion Cues Drive First Impressions: Consensus, Truth and the Origins of Personality Trait Judgements based on Targets' Whole-Body Motion.

    Personality trait attribution is automatic, and first impressions can be lasting and can lead to important social decisions. Research on how facial cues affect person perception is plentiful, but less is known about how whole-body motion contributes to first impressions. This thesis presents results from experiments in which participants rated the traits of target individuals based solely on short, silent movie clips of those individuals performing actions or expressing emotions with their bodies, or simply walking. To isolate the contribution of body motion cues to trait attribution, the static form information of the body stimuli was degraded. Consensus at zero acquaintance is replicated throughout the thesis, as manifested by strong inter-rater agreement across all rating experiments and all displayed behaviours, indicating that body motion may contain visual cues that drive trait impressions. Further experiments identified motion parameters that predict personality trait impressions, and an experimental paradigm showed that computational manipulation of motion data can indeed change observer judgements of computerised models based on human motion data. No accuracy was found in the trait judgements: there was no link between how a target was judged and that individual's scores on a five-factor personality questionnaire. Underlying judgements driving personality trait impressions were identified: impressions of emotions, attractiveness, and masculinity appear to be intertwined with personality trait judgements. Finally, patterns in personality trait judgements based on body motion were consistent with findings from studies on face perception, reflecting a two-step judgement of a target person's intention and ability to cause harm. Differences were found depending on the display format of the stimuli, and interpretations for these discrepancies are offered.
The thesis shows that people go beyond the information available to them when forming personality trait impressions of strangers, and offers evidence that changes in body motion may indeed affect such trait impressions.

    Affective incoherence: when affective concepts and embodied reactions clash.

    In five studies, the authors examined the effects on cognitive performance of coherence and incoherence between conceptual and experiential sources of affective information. The studies crossed the priming of happy and sad concepts with affective experiences; in different experiments, these included approach or avoidance actions, happy or sad feelings, and happy or sad expressive behaviors. In all studies, coherence between affective concepts and affective experiences led to better recall of a story than did affective incoherence. The authors suggest that the experience of such experiential affective cues serves as evidence of the appropriateness of the affective concepts that come to mind. The results suggest that affective coherence has epistemic benefits and that incoherence is costly in terms of cognitive performance.