
    Participant responses to virtual agents in immersive virtual environments.

    This thesis is concerned with interaction between people and virtual humans in the context of highly immersive virtual environments (VEs). Empirical studies have shown that virtual humans (agents) with even minimal behavioural capabilities can have a significant emotional impact on participants in immersive virtual environments (IVEs), to the extent that such environments have been used in studies of mental health issues such as social phobia and paranoia. This thesis focuses on understanding how the behaviour of virtual humans, rather than their visual appearance, shapes people's responses. Three main research questions are addressed. First, the thesis considers which nonverbal behavioural cues are key to portraying a specific psychological state. Second, it determines the extent to which the underlying state of a virtual human is recognisable through the display of a key set of cues inferred from the behaviour of real humans. Finally, it considers the degree to which a perceived psychological state in a virtual human invokes responses from participants in immersive virtual environments that are similar to those observed in the physical world. These research questions were investigated through four experiments. The first experiment focused on the impact of visual fidelity and behavioural complexity on participant responses by implementing a model of gaze behaviour in virtual humans. The study concluded that participants expected more life-like behaviours from more visually realistic virtual humans. The second experiment investigated the detrimental effects on participant responses of interacting with virtual humans of low behavioural complexity. The third experiment investigated differences in participants' responses to virtual humans perceived to be in varying emotional states, portrayed using postural and facial cues. Results indicated that posture plays an important role in the portrayal of affect; however, the behavioural model used in the study did not fully cover the qualities of body movement associated with the emotions studied. The final experiment focused on the portrayal of affect through qualities of body movement such as the speed of gestures. The effectiveness of the virtual humans was gauged through a variety of participant responses, including subjective reports and objective physiological and behavioural measures. The results show that participants respond to virtual humans in a significant manner provided that an appropriate behavioural model is used.
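
    The abstract mentions a behavioural model of gaze implemented for the first experiment but does not specify it. As a purely illustrative sketch of what a minimal probabilistic gaze model for a virtual human can look like, the following Python fragment alternates gaze states with exponentially distributed dwell times; all states, timing parameters, and probabilities here are hypothetical and not taken from the thesis.

    ```python
    import random

    # Hypothetical gaze states for a virtual human; illustrative only.
    GAZE_STATES = ["look_at_participant", "look_away", "hold_mutual_gaze"]

    # Mean dwell time in seconds per state (placeholder values).
    MEAN_DWELL = {"look_at_participant": 2.0, "look_away": 1.2, "hold_mutual_gaze": 3.5}

    def next_gaze_event(current: str) -> tuple[str, float]:
        """Pick the next gaze state and a dwell time for it.

        Dwell times are drawn from an exponential distribution, a common
        first approximation for the timing of human gaze shifts.
        """
        state = random.choice([s for s in GAZE_STATES if s != current])
        return state, random.expovariate(1.0 / MEAN_DWELL[state])

    # Generate a short gaze schedule for the agent.
    state, t = "look_away", 0.0
    for _ in range(10):
        state, dwell = next_gaze_event(state)
        print(f"t={t:5.2f}s -> {state} for {dwell:.2f}s")
        t += dwell
    ```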

    Emotion Capture: Emotionally Expressive Characters for Games

    It has been shown that humans are sensitive to the portrayal of emotions in virtual characters. However, previous work in this area has often examined this sensitivity using extreme examples of facial or body animation. Less is known about how well people recognize emotions as they are expressed during conversational communication. In order to determine whether body or facial motion is a better indicator of emotional expression for game characters, we conduct a perceptual experiment using synchronized full-body and facial motion-capture data. We find that people can recognize emotions from either modality alone, but that combining facial and body motion is preferable for creating more expressive characters.

    Facial and Bodily Expressions for Control and Adaptation of Games (ECAG 2008)


    Moving beyond the lab: investigating empathy through the Empirical 5E approach

    Empathy is a complex and multifaceted phenomenon that plays a crucial role in human social interactions. Recent developments in social neuroscience have provided valuable insights into the neural underpinnings and bodily mechanisms underlying empathy. These methods often prioritize precision, replicability, internal validity, and confound control. However, a full understanding of the complexity of empathy seems unattainable when relying solely on artificial, controlled laboratory settings while overlooking the comprehensive view of empathy that an ecological experimental approach can provide. In this article, we propose an integrative theoretical and methodological framework based on the 5E approach (the “E”s stand for the embodied, embedded, enacted, emotional, and extended perspectives on empathy), highlighting the relevance of studying empathy as an active interaction between embodied agents embedded in a shared real-world environment. In addition, we illustrate how a novel multimodal approach combining mobile brain/body imaging (MoBI) with phenomenological methods, together with the implementation of interactive paradigms in a natural context, provides an adequate procedure for studying empathy from the 5E perspective. In doing so, we present the Empirical 5E approach (E5E) as an integrative scientific framework to bridge brain/body and phenomenological attributes in an interbody interactive setting. Progressing toward an E5E approach can be crucial for understanding empathy in accordance with the complexity of how it is experienced in the real world.

    Towards Computer-Assisted Regulation of Emotions

    Emotions are intimately connected with our lives. They are essential in motivating behaviour, in reasoning effectively, and in facilitating interactions with other people. Consequently, the ability to regulate the tone and intensity of emotions is important for leading a life of success and well-being. Intelligent computer perception of human emotions and effective expression of virtual emotions provide a basis for assisting emotion regulation with technology. State-of-the-art technologies already allow computers to recognize and imitate human social and emotional cues accurately and in great detail. In the present work, a regular-looking office chair fitted with pressure sensors was used to covertly measure human body-movement responses to artificial expressions of proximity and facial cues: participants leaned towards the computer characters presented to them. In general, such artificial cues from virtual agents were found to significantly affect heart, sweat gland, and facial muscle activity, as well as subjective experiences of emotion and attention. The perceptual and expressive capabilities were combined in a setup where a person regulated his or her more spontaneous reactions by voluntarily smiling or frowning at an approaching virtual humanlike character. These results highlight the potential of future emotion-sensitive technologies for creating supportive and even healthy interactions between humans and computers.
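
    The interactive setup described above is a closed perception-action loop: the system classifies the user's voluntary facial expression and uses it to steer the virtual character's behaviour. A rough sketch of such a loop follows; the classifier, the EMG thresholds, and the distance-update rule are illustrative assumptions, not the thesis implementation.

    ```python
    # Sketch of a closed loop in which a voluntary smile or frown steers a
    # virtual character's approach behaviour. All thresholds and step sizes
    # are placeholder assumptions.

    def classify_expression(zygomaticus: float, corrugator: float) -> str:
        """Crudely label a facial expression from two facial-EMG channels.

        Higher zygomaticus (cheek) activity suggests smiling; higher
        corrugator (brow) activity suggests frowning.
        """
        if zygomaticus > 0.5 and zygomaticus > corrugator:
            return "smile"
        if corrugator > 0.5:
            return "frown"
        return "neutral"

    def update_distance(distance: float, expression: str, step: float = 0.1) -> float:
        """Move the character closer on a smile, back it off on a frown."""
        if expression == "smile":
            return max(0.5, distance - step)  # approach, keeping a minimum distance
        if expression == "frown":
            return distance + step            # retreat
        return distance

    # One iteration of the loop with made-up sensor readings.
    distance = 2.0
    expression = classify_expression(zygomaticus=0.8, corrugator=0.1)
    distance = update_distance(distance, expression)
    print(expression, round(distance, 2))  # -> smile 1.9
    ```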

    Microanalysis of nonverbal communication: Development of a nonverbal research method using high-performance 3D character animation

    This work provides a novel research tool for the field of nonverbal communication, with the goal of transforming 3D motion data into metric measurements that allow the application of standard statistical methods such as analysis of variance, factor analysis, or multiple regression analysis. 3D motion data are automatically captured by motion capture systems or manually coded by humans using 3D character animation software. They precisely describe human movements, but without further data processing they cannot be meaningfully interpreted or statistically analyzed. To make this possible, three nonverbal coding systems describing static body postures, dynamic body movements, and specific body-part motions such as head nods have been developed. A geometrical model describing postures and movements as flexion angles of body parts on three clearly understandable and nonverbally relevant dimensions (the sagittal, the rotational, and the lateral) has been developed and provides the basis for mathematical formulas that transform motion capture data or 3D animation data into metric measures. Furthermore, formulas were developed to compute around 30 nonverbal cues described in the literature on kinesics that can be understood as geometrical features of body parts, such as openness, symmetry, and expansiveness of body postures, head position and head nods, gaze direction and body orientation, pointing behavior and relational gestures, interactional synchrony, proxemics, and touch, including dynamic features of movements such as rate, velocity, and acceleration. To obtain accurate measurements, the software APEx (Automatic Parameter Extraction) has been developed, with a number of convenient features, extracting more than 150 nonverbal parameters comprising 380 metric variables from the available motion data.
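
    Since the abstract describes transforming 3D motion data into metric variables such as flexion angles and their dynamics, here is an illustrative Python sketch of that kind of computation. The marker layout, the angle formula, and the finite-difference derivatives are generic motion-analysis conventions assumed for illustration, not the actual APEx implementation.

    ```python
    import numpy as np

    def flexion_angle(proximal, joint, distal):
        """Angle in degrees at `joint` between its two adjoining segments,
        e.g. shoulder-elbow-wrist for elbow flexion."""
        u = np.asarray(proximal) - np.asarray(joint)
        v = np.asarray(distal) - np.asarray(joint)
        cos_a = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
        return float(np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0))))

    def derivatives(series, dt):
        """Finite-difference velocity and acceleration of a per-frame measure."""
        vel = np.gradient(series, dt)
        return vel, np.gradient(vel, dt)

    # Example: elbow angle over three frames sampled at 100 Hz (metres).
    shoulder = np.array([[0.0, 1.4, 0.0]] * 3)
    elbow    = np.array([[0.0, 1.1, 0.0]] * 3)
    wrist    = np.array([[0.0, 0.8, 0.0], [0.1, 0.8, 0.0], [0.2, 0.85, 0.0]])

    angles = np.array([flexion_angle(s, e, w)
                       for s, e, w in zip(shoulder, elbow, wrist)])
    velocity, acceleration = derivatives(angles, dt=0.01)
    print(np.round(angles, 1), np.round(velocity, 1))
    ```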

    Classification et Caractérisation de l'Expression Corporelle des Emotions dans des Actions Quotidiennes

    The work conducted in this thesis can be summarized in four main steps. First, we proposed a multi-level body movement notation system that allows the description of expressive body movement across various body actions. Second, we collected a new database of emotional body expression in daily actions. This database constitutes a large repository of bodily expression of emotions, covering the expression of 8 emotions in 7 actions, combining video and motion-capture recordings, and resulting in more than 8000 sequences of expressive behaviors. Third, we explored the classification of emotions based on our multi-level body movement notation system, using a Random Forest approach. The advantage of using Random Forests in our work is twofold: (1) the reliability of the classification model, and (2) the possibility of selecting a subset of relevant features based on their relevance measures. We also compared the automatic classification of emotions with human perception of emotions expressed in different actions. Finally, we extracted the most relevant features capturing the expressive content of the motion, based on the relevance measure returned by the Random Forest model, and used this subset of features to explore the characterization of emotional body expression across different daily actions with a Decision Tree model.
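
    The pipeline outlined above (a Random Forest for classification and feature relevance, then a Decision Tree over the selected features) can be sketched with scikit-learn. The synthetic data, feature count, and top-10 cutoff below are assumptions for illustration; only the overall pipeline shape follows the abstract.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(0)
    X = rng.normal(size=(400, 30))    # 400 sequences x 30 notation features (synthetic)
    y = rng.integers(0, 8, size=400)  # 8 emotion labels

    # 1) Random Forest: classification plus a per-feature relevance measure.
    forest = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
    relevance = forest.feature_importances_

    # 2) Keep the most relevant features (top-10 cutoff is arbitrary here).
    top = np.argsort(relevance)[::-1][:10]

    # 3) Characterize expression with an interpretable, shallow Decision Tree.
    tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X[:, top], y)
    print("selected features:", top)
    print("tree depth:", tree.get_depth())
    ```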