
    Artificial Companions with Personality and Social Role

    Subtitle: "Expectations from Users on the Design of Groups of Companions". Robots and virtual characters are becoming increasingly used in our everyday life. Yet, they are still far from being able to maintain long-term social relationships with users. It also remains unclear what future users will expect from these so-called "artificial companions" in terms of social roles and personality. These questions matter because users will be surrounded by multiple artificial companions, yet the issues of social roles and personality within a group of companions are seldom tackled in user studies. In this paper, we describe a study in which 94 participants reported the social roles and personalities they would expect from groups of companions. We explain how the results give insights for the design of future groups of companions endowed with social intelligence.

    Towards player-driven procedural content generation

    Generating immersive game content is one of the ultimate goals of game design. Achieving it requires recognizing that players' perception of the same game differs according to a number of factors, including personality, playing style, expertise, and cultural background. While one player might find the game immersive, another may quit after encountering a seemingly insoluble problem. One promising avenue towards optimizing the gameplay experience for individual players is to tailor player experience in real time via automatic game content generation. Identifying the aspects of the game that most influence the gameplay experience, characterizing the relationship between those aspects and each individual's experience, and defining a mechanism for tailoring the game content to each individual's needs are important steps towards player-driven content generation.
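    The real-time tailoring loop the abstract describes can be sketched minimally as a feedback rule on a single content parameter. The function name, the win-rate signal, and the target/gain values below are illustrative assumptions, not the paper's method:

    ```python
    def adapt_difficulty(difficulty, recent_win_rate, target=0.6, gain=0.5):
        """Nudge a difficulty parameter in [0, 1] so the player's recent
        win rate drifts towards a target: winning too often raises the
        difficulty, losing too often lowers it."""
        difficulty += gain * (recent_win_rate - target)
        # Clamp so the generator always receives a valid parameter.
        return min(max(difficulty, 0.0), 1.0)

    # A player winning 90% of recent encounters gets a harder game:
    harder = adapt_difficulty(0.5, 0.9)   # 0.65
    # A struggling player (30% win rate) gets an easier one:
    easier = adapt_difficulty(0.5, 0.3)   # 0.35
    ```

    A full player-driven generator would replace the scalar win rate with a richer experience model (personality, play style, self-reports), but the closed loop has this shape.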

    Affectiva-MIT Facial Expression Dataset (AM-FED): Naturalistic and Spontaneous Facial Expressions Collected In-the-Wild

    Computer classification of facial expressions requires large amounts of data, and this data needs to reflect the diversity of conditions seen in real applications. Public datasets help accelerate the progress of research by providing researchers with a benchmark resource. We present a comprehensively labeled dataset of ecologically valid spontaneous facial responses recorded in natural settings over the Internet. To collect the data, online viewers watched one of three intentionally amusing Super Bowl commercials and were simultaneously filmed using their webcam. They answered three self-report questions about their experience. A subset of viewers additionally gave consent for their data to be shared publicly with other researchers. This subset consists of 242 facial videos (168,359 frames) recorded in real-world conditions. The dataset is comprehensively labeled for the following: 1) frame-by-frame labels for the presence of 10 symmetrical FACS action units, 4 asymmetric (unilateral) FACS action units, 2 head movements, smile, general expressiveness, feature-tracker failures, and gender; 2) the location of 22 automatically detected landmark points; 3) self-report responses of familiarity with, liking of, and desire to watch again for the stimuli videos; and 4) baseline performance of detection algorithms on this dataset. The data is available for distribution to researchers online; the EULA can be found at: http://www.affectiva.com/facial-expression-dataset-am-fed/
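    Frame-by-frame binary labels of the kind listed above (action units, smile, tracker failures) are typically consumed as per-video tables. The sketch below assumes a hypothetical CSV layout and column names purely for illustration; the actual AM-FED release defines its own file format:

    ```python
    import csv
    import io

    # Hypothetical frame-level label file: one row per frame, binary columns.
    # Column names here are assumptions, not the dataset's real schema.
    sample = """frame,AU02,AU04,smile,tracker_fail
    0,0,0,1,0
    1,0,1,1,0
    2,0,0,0,1
    """

    def label_fraction(csv_text, column):
        """Fraction of frames in which a binary label column is active."""
        rows = list(csv.DictReader(io.StringIO(csv_text)))
        return sum(int(r[column]) for r in rows) / len(rows)

    # e.g. the viewer smiles in 2 of 3 frames of this toy clip:
    smile_rate = label_fraction(sample, "smile")
    ```

    Summaries like this (per-video activation rates, excluding tracker-failure frames) are a common first step before benchmarking detection algorithms on such data.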

    Using Emotions to Empower the Self-adaptation Capability of Software Services


    Exploiting the robot kinematic redundancy for emotion conveyance to humans as a lower priority task

    Current approaches do not allow robots to execute a task and simultaneously convey emotions to users using their body motions. This paper explores the capabilities of the Jacobian null space of a humanoid robot to convey emotions. A task-priority formulation has been implemented in a Pepper robot which allows the specification of a primary task (waving gesture, transportation of an object, etc.) and exploits the kinematic redundancy of the robot to convey emotions to humans as a lower-priority task. The emotions, defined by Mehrabian as points in the pleasure–arousal–dominance space, generate intermediate motion features (jerkiness, activity and gaze) that carry the emotional information. A map from these features to the joints of the robot is presented. A user study has been conducted in which emotional motions have been shown to 30 participants. The results show that happiness and sadness are very well conveyed to the user, calm is moderately well conveyed, and fear is not well conveyed. An analysis of the dependencies between the motion features and the emotions perceived by the participants shows that activity correlates positively with arousal, jerkiness is not perceived by the user, and gaze conveys dominance when activity is low. The results indicate a strong influence of the most energetic motions of the emotional task and point out new directions for further research. Overall, the results show that the null-space approach can be regarded as a promising means to convey emotions as a lower-priority task.
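    The task-priority idea the abstract relies on is the classical redundancy-resolution rule q̇ = J⁺ẋ + (I − J⁺J)q̇₀: the pseudoinverse term realizes the primary task, and the null-space projector lets a secondary joint-velocity preference (here, the emotion-conveying motion) act without disturbing it. A minimal numpy sketch, with an illustrative function name and a toy 3-DoF Jacobian as assumptions:

    ```python
    import numpy as np

    def task_priority_velocities(J, dx, dq_secondary):
        """Joint velocities realizing a primary task velocity dx exactly,
        while a secondary joint-velocity preference dq_secondary is
        projected into the null space of the task Jacobian J."""
        J_pinv = np.linalg.pinv(J)
        N = np.eye(J.shape[1]) - J_pinv @ J   # null-space projector
        return J_pinv @ dx + N @ dq_secondary

    # Toy redundant arm: the primary task constrains only the first two
    # of three joints, leaving the third free for the secondary motion.
    J = np.array([[1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0]])
    dq = task_priority_velocities(J, np.array([0.1, 0.2]),
                                  np.array([0.0, 0.0, 1.0]))
    # J @ dq still equals the primary task velocity [0.1, 0.2].
    ```

    Because the projector annihilates any component of the secondary motion that would affect the primary task, the emotional motion can by construction only use the leftover redundancy.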

    ACII 2009: Affective Computing and Intelligent Interaction. Proceedings of the Doctoral Consortium 2009
