The perception of emotion in artificial agents
Given recent technological developments in robotics, artificial intelligence and virtual reality, it is perhaps unsurprising that the arrival of emotionally expressive and reactive artificial agents is imminent. However, if such agents are to become integrated into our social milieu, it is imperative to establish an understanding of whether and how humans perceive emotion in artificial agents. In this review, we incorporate recent findings from social robotics, virtual reality, psychology, and neuroscience to examine how people recognize and respond to emotions displayed by artificial agents. First, we review how people perceive emotions expressed by an artificial agent through channels such as facial and bodily expressions and vocal tone. Second, we evaluate the similarities and differences in the consequences of perceived emotions in artificial compared to human agents. Beyond accurately recognizing the emotional state of an artificial agent, it is critical to understand how humans respond to those emotions. Does interacting with an angry robot induce the same responses in people as interacting with an angry person? Similarly, does watching a robot rejoice when it wins a game elicit similar feelings of elation in the human observer? Here we provide an overview of the current state of emotion expression and perception in social robotics, as well as a clear articulation of the challenges and guiding principles to be addressed as we move ever closer to truly emotional artificial agents.
Towards the improvement of self-service systems via emotional virtual agents
Affective computing and emotional agents have been found to have a positive effect on human-computer interactions. In order to develop an acceptable emotional agent for use in a self-service interaction, two stages of research were identified and carried out: the first to determine which facial expressions are present in such an interaction, and the second to determine which emotional agent behaviours are perceived as appropriate during a problematic self-service shopping task. In the first stage, facial expressions associated with negative affect were found to occur during self-service shopping interactions, indicating that facial expression detection is suitable for detecting negative affective states during self-service interactions. In the second stage, user perceptions of the emotional facial expressions displayed by an emotional agent during a problematic self-service interaction were gathered. Overall, the expression of disgust was perceived as inappropriate while emotionally neutral behaviour was perceived as appropriate; however, gender differences emerged, with female participants also perceiving surprise as inappropriate. Results suggest that agents should adapt their behaviour and appearance to user characteristics such as gender.
Emotional Biosensing: Exploring Critical Alternatives
Emotional biosensing is rising in daily life: data and categories claim to know how people feel and suggest what they should do about it, while CSCW explores new biosensing possibilities. Prevalent approaches to emotional biosensing are too limited, focusing on the individual, optimization, and normative categorization. Conceptual shifts can help explore alternatives: toward materiality, from representation toward performativity, from inter-action to intra-action, shifting biopolitics, and shifting affect/desire. We contribute (1) a synthesis of wide-ranging conceptual lenses, with analysis connecting them to emotional biosensing design, (2) an analysis of selected design exemplars that applies these lenses to design research, and (3) our own recommendations for designers and design researchers. In particular, we suggest humility in knowledge claims with emotional biosensing, prioritizing care and affirmation over self-improvement, and exploring alternative desires. We call for critically questioning and generatively re-imagining the role of data in configuring sensing, feeling, ‘the good life,’ and everyday experience.
Which Emotional Behaviors are Actions?
There is a wide range of things we do out of emotion. For example, we smile with pleasure, our voices drop when we are sad, we recoil in shock or jump for joy, we apologize to others out of remorse. It is uncontroversial that some of these behaviors are actions. Clearly, apologizing is an action if anything is. Things seem less clear in the case of other emotional behaviors. Intuitively, the drop in a sad person’s voice is something that happens to her, rather than something she actively performs. Perhaps more interestingly, even jumping for joy can seem a problematic case: although its execution involves the active performance of certain movements, it has been argued to contrast, e.g., with an act of apology, in that it is not performed in order to achieve some end, such as repairing a relationship. This can make this behavior seem considerably different from paradigm actions.
Our central concern in this paper is with which emotional behaviors should be classed as actions and why.
A Trip to the Moon: Personalized Animated Movies for Self-reflection
Self-tracking physiological and psychological data poses the challenge of presentation and interpretation. Insightful narratives for self-tracking data can motivate the user toward constructive self-reflection. One powerful form of narrative that engages audiences across cultures and age groups is the animated movie. We collected a week of self-reported mood and behavior data from each user and created in Unity a personalized animation based on their data. We evaluated the impact of each video in a randomized controlled trial with a non-personalized animated video as control. We found that personalized videos tend to be more emotionally engaging, encouraging greater and lengthier writing indicative of self-reflection about moods and behaviors, compared to non-personalized control videos.