Affective Computing
This book provides an overview of state-of-the-art research in Affective Computing. It presents new ideas, original results, and practical experiences in this increasingly important research field. The book consists of 23 chapters organized into four sections. Since facial expression is one of the most important means of human communication, the first section of this book (Chapters 1 to 7) presents research on the synthesis and recognition of facial expressions. Given that we express ourselves not only with the face but also with body movements, the second section (Chapters 8 to 11) presents research on the perception and generation of emotional expressions using full-body motion. The third section of the book (Chapters 12 to 16) presents computational models of emotion, as well as findings from neuroscience research. In the last section of the book (Chapters 17 to 22) we present applications related to affective computing.
Virtual humans: thirty years of research, what next?
In this paper, we present research results and future challenges in creating realistic and believable Virtual Humans. Real-time realistic representation is essential to these modeling goals, but we also need interactive and perceptive Virtual Humans to populate the Virtual Worlds. Three levels of modeling should be considered to create believable Virtual Humans: 1) realistic appearance modeling, 2) realistic, smooth, and flexible motion modeling, and 3) realistic high-level behavior modeling. We first illustrate the issues of creating virtual humans with better skeletons and realistic deformable bodies. To achieve a believable level of behavior, the challenges lie in generating, on the fly, flexible motion and complex behaviors of Virtual Humans inside their environments using realistic perception of those environments. Interactivity and group behaviors are also important for creating believable Virtual Humans, which raises the challenges of building believable relationships between real and virtual humans based on emotion and personality, and of simulating realistic and believable behaviors of groups and crowds. Finally, issues in generating realistically clothed and haired virtual people are presented.
Robotic Faces: Exploring Dynamical Patterns of Social Interaction between Humans and Robots
Thesis (Ph.D.) - Indiana University, Informatics, 2015. The purpose of this dissertation is two-fold: 1) to develop an empirically based design for an interactive robotic face, and 2) to understand how dynamical aspects of social interaction may be leveraged to design better interactive technologies and/or to further our understanding of social cognition.
Understanding the role that dynamics plays in social cognition is a challenging problem. This is particularly true in studying cognition via human-robot interaction, which entails both the natural social cognition of the human and the “artificial intelligence” of the robot. Clearly, humans who are interacting with other humans (or even other mammals such as dogs) are cognizant of the social nature of the interaction – their behavior in those cases differs from that when interacting with inanimate objects such as tools. Humans (and many other animals) have some awareness of “social”, some sense of other agents. However, it is not clear how or why.
Social interaction patterns vary across culture, context, and individual characteristics of the human interactor. These factors are subsumed into the larger interaction system, influencing the unfolding of the system over time (i.e. the dynamics). The overarching question is whether we can figure out how to utilize factors that influence the dynamics of the social interaction in order to imbue our interactive technologies (robots, clinical AI, decision support systems, etc.) with some "awareness of social", and potentially create more natural interaction paradigms for those technologies.
In this work, we explore the above questions across a range of studies, including lab-based experiments, field observations, and the placement of autonomous, interactive robotic faces in public spaces. We also discuss future work: how this research relates to making sense of what a robot "sees", creating data-driven models of robot social behavior, and developing robotic face personalities.
Emotion in Future Intelligent Machines
Over the past decades, research in cognitive and affective neuroscience has emphasized that emotion is crucial for human intelligence and in fact inseparable from cognition. Concurrently, there has been significantly growing interest in simulating and modeling emotion in robots and artificial agents. Yet existing models of emotion and their integration in cognitive architectures remain quite limited and frequently disconnected from neuroscientific evidence. We argue that a stronger integration of emotion in robot models is critical for the design of intelligent machines capable of tackling real-world problems. Drawing from current neuroscientific knowledge, we provide a set of guidelines for future research in artificial emotion and intelligent machines more generally.
Investigation of an emotional virtual human modelling method
This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University. In order to simulate virtual humans more realistically and give them life-like behaviours, several exploratory studies on emotion calculation, synthetic perception, and the decision-making process are discussed. A series of sub-modules has been designed, and simulation results are presented with discussion.
A vision-based synthetic perception system is proposed in this thesis, which allows virtual humans to detect the surrounding virtual environment through a collision-based synthetic vision system. It enables autonomous virtual humans to change their emotional states in real time according to stimuli. The synthetic perception system also allows virtual humans to remember limited information within their own first-in-first-out short-term virtual memory.
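The fixed-capacity first-in-first-out short-term memory described above can be sketched as follows. This is a minimal illustration of the idea, not the thesis's actual implementation; the class and method names are hypothetical.

```python
from collections import deque

class ShortTermMemory:
    """Fixed-capacity FIFO memory: when a new percept arrives and the
    memory is full, the oldest percept is evicted automatically."""

    def __init__(self, capacity=3):
        # deque with maxlen discards items from the opposite end on overflow
        self.slots = deque(maxlen=capacity)

    def perceive(self, stimulus):
        self.slots.append(stimulus)

    def recall(self):
        return list(self.slots)

mem = ShortTermMemory(capacity=3)
for s in ["door", "fire", "crowd", "exit"]:
    mem.perceive(s)
print(mem.recall())  # "door" has been evicted: ['fire', 'crowd', 'exit']
```

A bounded deque gives the eviction behavior for free; the virtual human simply consults `recall()` when updating its emotional state.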
The new emotion generation method includes a novel hierarchical emotion structure and a group of emotion calculation equations, which enable virtual humans to behave emotionally in real time according to internal and external factors. The emotion calculation equations used in this research were derived from psychological emotion measurements. Virtual humans can use the information in virtual memory together with the emotion calculation equations to generate their own numerical emotional states within the hierarchical emotion structure. These emotional states are important internal references for virtual humans when adopting appropriate behaviours, and also key cues for their decision making.
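A numerical emotional state driven by internal decay and external stimuli, as described above, can be sketched in a simple update step. The decay-plus-drive form and the parameter values are illustrative assumptions; the thesis's actual equations, derived from psychological measurements, will differ.

```python
def update_emotion(state, stimulus, decay=0.1, gain=0.5):
    """One illustrative update step: each emotion decays toward neutral
    and is driven by an external stimulus, clamped to [0, 1]."""
    new_state = {}
    for emotion, value in state.items():
        drive = gain * stimulus.get(emotion, 0.0)  # external factor
        value = (1.0 - decay) * value + drive      # internal decay + drive
        new_state[emotion] = max(0.0, min(1.0, value))
    return new_state

state = {"fear": 0.2, "joy": 0.5}
state = update_emotion(state, {"fear": 1.0})  # a frightening stimulus
print(state)  # fear rises toward 1, joy decays toward 0
```

Repeating this step each simulation tick yields a continuously evolving emotional state that behaviour selection can read as a cue.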
The work introduces a dynamic emotional motion database structure for virtual human modelling. To develop realistic virtual human behaviours, many subjects were motion-captured while performing emotional motions, with or without intent. The captured motions were transferred to virtual characters and implemented in different virtual scenarios to help evoke and verify design ideas and the possible consequences of simulations (such as a fire evacuation).
This work also introduces simple-heuristics theory into the decision-making process in order to make the virtual human's decision making more like that of real humans. Emotion values are proposed as a group of key cues for decision making under the simple-heuristic structures. A data interface connecting the emotion calculation and the decision-making structure has also been designed for the simulation system.
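One well-known simple heuristic is "take-the-best": cues are checked in a fixed priority order, and the first cue that discriminates between options decides. The sketch below shows how emotion values could serve as such cues; the cue names and the fire-evacuation scenario are illustrative assumptions, not taken from the thesis.

```python
def take_the_best(options, cue_order):
    """Check cues in priority order; the first cue on which exactly one
    option scores highest decides. No further cues are consulted."""
    for cue in cue_order:
        best = max(opt[cue] for opt in options)
        winners = [opt for opt in options if opt[cue] == best]
        if len(winners) == 1:
            return winners[0]
        options = winners  # tie: break it with the next cue
    return options[0]      # no cue discriminates: fall back to the first

# Hypothetical evacuation choice: emotion value ("fear" felt toward a
# route) is the top cue, a distance score breaks ties.
exits = [
    {"name": "north", "fear": 0.2, "distance_score": 0.9},
    {"name": "south", "fear": 0.2, "distance_score": 0.4},
]
choice = take_the_best(exits, ["fear", "distance_score"])
print(choice["name"])  # fear ties, so distance decides: north
```

Because the search stops at the first discriminating cue, the decision is fast and frugal, which is the appeal of simple heuristics over full utility maximization.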
Bridging the gap between emotion and joint action
Our daily human life is filled with a myriad of joint action moments, be it children playing, adults working together (e.g., team sports), or strangers navigating through a crowd. Joint action brings individuals (and the embodiment of their emotions) together, in space and in time. Yet little is known about how individual emotions propagate through embodied presence in a group, and how joint action changes individual emotion. In fact, the multi-agent component is largely missing from neuroscience-based approaches to emotion, and conversely joint action research has not yet found a way to include emotion as one of the key parameters for modelling socio-motor interaction. In this review, we first identify the gap and then stockpile evidence showing the strong entanglement between emotion and acting together from various branches of science. We propose an integrative approach to bridge the gap, highlight five research avenues for doing so in behavioral neuroscience and the digital sciences, and address some of the key challenges in the area faced by modern societies.
EMOTIONAL SYNCHRONIZATION-BASED HUMAN-ROBOT COMMUNICATION AND ITS EFFECTS
This paper presents a natural and comfortable communication system between humans and robots based on synchronization with the human emotional state using facial expression recognition. The system consists of three parts: human emotion recognition, robotic emotion generation, and robotic emotion expression. The robot recognizes human emotion through facial expressions, and robotic emotion is generated and synchronized with human emotion dynamically using a vector field of dynamics. The robot makes dynamically varying facial expressions to express its own emotions to the human. A communication experiment was conducted to examine the effectiveness of the proposed system. The authors found that subjects became much more comfortable after communicating with the robot with synchronized emotions, whereas subjects felt somewhat uncomfortable after communicating with the robot with non-synchronized emotions. During emotional synchronization, subjects communicated much more with the robot, and the communication time was double that during non-synchronization. Furthermore, in the case of emotional synchronization, subjects had good impressions of the robot, much better than in the non-synchronized case. This study confirmed that emotional synchronization in human-robot communication can be effective in making humans comfortable and makes the robot much more favorable and acceptable to humans.
International Journal of Humanoid Robotics, 10(1):1350014 (2013), journal article.
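The idea of driving the robot's emotion along a vector field toward the recognized human emotion can be sketched as a simple attractor dynamic integrated with Euler steps. This is a minimal sketch under that assumption; the paper's actual field and parameters are not reproduced here.

```python
def synchronize(robot_emotion, human_emotion, k=0.3, dt=0.1):
    """One Euler step of dx/dt = k * (h - x): the robot's emotion
    vector is pulled toward the recognized human emotion."""
    return [r + k * (h - r) * dt
            for r, h in zip(robot_emotion, human_emotion)]

robot = [0.0, 0.0]   # e.g. (valence, arousal), starting neutral
human = [1.0, 0.5]   # recognized from the human's facial expression
for _ in range(100):
    robot = synchronize(robot, human)
print(robot)  # converges toward the human's emotional state
```

Because the field always points toward the human state, the robot's expressed emotion tracks the human smoothly rather than jumping, which is what makes the synchronized condition feel natural.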