
    Affective games: a multimodal classification system

    Affective gaming is a relatively new field of research that exploits human emotions to influence gameplay for an enhanced player experience. Changes in players' psychology are reflected in their behaviour and physiology, hence recognition of such variation is a core element of affective games. Complementary sources of affect offer more reliable recognition, especially in contexts where one modality is partial or unavailable. As multimodal recognition systems, affect-aware games are subject to the practical difficulties met by traditional trained classifiers. In addition, inherent game-related challenges in data collection and performance arise while attempting to sustain an acceptable level of immersion. Most existing scenarios employ sensors that offer limited freedom of movement, resulting in less realistic experiences. Recent advances now offer technology that allows players to communicate more freely and naturally with the game, and even to control it without input devices. However, the affective game industry is still in its infancy and has yet to catch up with the life-like level of adaptation provided by graphics and animation.
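    The multimodal point above lends itself to a small illustration. The following late-fusion sketch is purely an assumption (not a system described in this abstract): per-modality classifiers emit class probabilities, and the recogniser averages whichever modalities are currently available, so recognition degrades gracefully when a channel is partial or unavailable.

```python
# Hypothetical late-fusion sketch: average class probabilities from whichever
# affect modalities are available this frame. Labels and values are made up.
import numpy as np

LABELS = ["boredom", "engagement", "frustration"]

def fuse(modality_probs):
    """Return the label with the highest averaged probability across available modalities."""
    available = [p for p in modality_probs.values() if p is not None]
    if not available:
        return "unknown"
    fused = np.mean(available, axis=0)
    return LABELS[int(np.argmax(fused))]

# Example: the physiology sensor dropped out mid-game, so fusion falls back
# to the face and body channels only.
print(fuse({
    "face":       np.array([0.2, 0.7, 0.1]),
    "body":       np.array([0.1, 0.6, 0.3]),
    "physiology": None,
}))
```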

    Automatic Recognition of Affective Body Movement in a Video Game Scenario

    This study aims at recognizing the affective states of players from non-acted, non-repeated body movements in the context of a video game scenario. A motion capture system was used to collect the movements of participants while they played a Nintendo Wii tennis game. A combination of body movement features and a machine learning technique was then used to automatically recognize emotional states from body movements. The system was tested for its ability to generalize to new participants and to new body motion data using a sub-sampling validation technique. To train and evaluate the system, online evaluation surveys were created using the body movements collected from the motion capture system, and human observers were recruited to classify them into affective categories. The results showed that observer agreement levels were above chance and that the automatic recognition system achieved recognition rates comparable to the observers' benchmark. © 2012 ICST Institute for Computer Science, Social Informatics and Telecommunications Engineering.
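    As a rough illustration of the pipeline this abstract describes (movement features, a trained classifier, and validation on held-out data), here is a minimal sketch; the feature count, the random-forest classifier, the four-category label set and the leave-one-participant-out scheme are assumptions for illustration, not the authors' actual choices.

```python
# Illustrative sketch (not the study's code): classify affective states from
# per-clip body-movement features and test generalisation to unseen participants.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(0)

n_clips, n_features = 200, 12                 # e.g. speed, acceleration, limb extension...
X = rng.normal(size=(n_clips, n_features))    # stand-in for extracted movement features
y = rng.integers(0, 4, size=n_clips)          # affective categories from observer labels
groups = rng.integers(0, 10, size=n_clips)    # participant id for each clip

clf = RandomForestClassifier(n_estimators=200, random_state=0)

# Each fold holds out all clips from one participant, so the score reflects
# generalisation to new players rather than to new clips of known players.
scores = cross_val_score(clf, X, y, groups=groups, cv=LeaveOneGroupOut())
print(f"mean recognition rate over held-out participants: {scores.mean():.2f}")
```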

    The affective body argument in technology design

    In this paper, I argue that the affective body is underused in the design of interactive technology despite what it has to offer. Whilst the literature shows it to be a powerful affective communication channel, it is often ignored in favor of the more commonly studied facial and vocal expression modalities. This is despite it being as informative as, and in some situations even more reliable than, the other affective channels. In addition, due to the proliferation of increasingly cheap and ubiquitous movement-sensing technologies, the regulatory affective functions of the body could open new possibilities in various application areas. In this paper, after presenting a brief summary of the opportunities that the affective body offers to technology designers, I will use the case of physical rehabilitation to discuss how its use could lead to interesting new solutions and more effective therapies.

    Children interpretation of emotional body language displayed by a robot

    Previous results show that adults are able to interpret different key poses displayed by the robot and also that changing the head position affects the expressiveness of the key poses in a consistent way. Moving the head down leads to decreased arousal (the level of energy), valence (positive or negative) and stance (approaching or avoiding), whereas moving the head up produces an increase along these dimensions [1]. Hence, changing the head position should send intuitive signals that could be used during an interaction. The ALIZ-E target group are children between the ages of 8 and 11. Existing results suggest that they would be able to interpret human emotional body language [2, 3]. Based on these results, an experiment was conducted to test whether the results of [1] can be applied to children. If so, body postures and head position could be used to convey emotions during an interaction.

    Our own action kinematics predict the perceived affective states of others.

    Our movement kinematics provide useful cues about our affective states. Given that our experiences furnish models that help us to interpret our environment, and that a rich source of action experience comes from our own movements, the present study examined whether we use models of our own action kinematics to make judgments about the affective states of others. For example, relative to one's typical kinematics, anger is associated with fast movements. Therefore, the extent to which we perceive anger in others may be determined by the degree to which their movements are faster than our own typical movements. We related participants' walking kinematics in a neutral context to their judgments of the affective states conveyed by observed point-light walkers (PLWs). As predicted, we found a linear relationship between one's own walking kinematics and affective state judgments, such that faster participants rated slower emotions more intensely relative to their ratings for faster emotions. This relationship was absent when observing PLWs where differences in velocity between affective states were removed. These findings suggest that perception of affective states in others is predicted by one's own movement kinematics, with important implications for perception of, and interaction with, those who move differently.
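    The reported linear relationship can be written out, in illustrative notation that is not taken from the paper, as a simple regression of rating differences on the observer's own neutral walking speed:

```latex
% Illustrative notation (assumed, not the authors'): v_i is participant i's
% neutral walking velocity; R_i^{slow} and R_i^{fast} are their intensity
% ratings for the slower and faster PLW emotions, respectively.
\[
  R_i^{\mathrm{slow}} - R_i^{\mathrm{fast}} \;=\; \beta_0 + \beta_1\, v_i + \varepsilon_i,
  \qquad \hat{\beta}_1 > 0,
\]
% i.e. the faster one's own walking, the more intensely the slower emotions
% are rated relative to the faster ones; the effect vanishes when velocity
% differences between the PLW emotions are removed.
```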

    The perception of emotion in artificial agents

    Given recent technological developments in robotics, artificial intelligence and virtual reality, it is perhaps unsurprising that the arrival of emotionally expressive and reactive artificial agents is imminent. However, if such agents are to become integrated into our social milieu, it is imperative to establish an understanding of whether and how humans perceive emotion in artificial agents. In this review, we incorporate recent findings from social robotics, virtual reality, psychology, and neuroscience to examine how people recognize and respond to emotions displayed by artificial agents. First, we review how people perceive emotions expressed by an artificial agent through channels such as facial and bodily expressions and vocal tone. Second, we evaluate the similarities and differences in the consequences of perceived emotions in artificial compared to human agents. Besides accurately recognizing the emotional state of an artificial agent, it is critical to understand how humans respond to those emotions. Does interacting with an angry robot induce the same responses in people as interacting with an angry person? Similarly, does watching a robot rejoice when it wins a game elicit similar feelings of elation in the human observer? Here we provide an overview of the current state of emotion expression and perception in social robotics, as well as a clear articulation of the challenges and guiding principles to be addressed as we move ever closer to truly emotional artificial agents.

    A Database of Full Body Virtual Interactions Annotated with Expressivity Scores

    Recent technologies enable the exploitation of full body expressions in applications such as interactive arts, but are still limited in terms of subtle dyadic interaction patterns. Our project aims at full body expressive interactions between a user and an autonomous virtual agent. The currently available databases do not contain full body expressivity and interaction patterns via avatars. In this paper, we describe a protocol defined to collect a database for studying expressive full-body dyadic interactions. We detail the coding scheme for manually annotating the collected videos. Reliability measures for global annotations of expressivity and interaction are also provided.
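    One common way to compute the kind of inter-rater reliability mentioned here is Cohen's kappa; the snippet below is a generic illustration with invented annotations, not the paper's actual measure or data.

```python
# Generic reliability check for categorical annotations (illustrative data only).
from sklearn.metrics import cohen_kappa_score

# Two annotators' global expressivity labels for the same six video clips.
rater_a = ["high", "low", "low",    "high", "medium", "high"]
rater_b = ["high", "low", "medium", "high", "medium", "high"]

print(f"Cohen's kappa: {cohen_kappa_score(rater_a, rater_b):.2f}")
```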

    A mobile application to report and detect 3D body emotional poses

    Most research into automatic emotion recognition has focused on facial expressions or physiological signals, while the exploitation of body postures has scarcely been explored, although they can be useful for emotion detection. This paper first explores a mechanism for self-reporting body postures with a novel, easy-to-use mobile application called EmoPose. The app detects emotional states from self-reported poses, classifying them into the six basic emotions proposed by Ekman plus a neutral state. The poses identified by Schindler et al. have been used as a reference, and the nearest neighbor algorithm has been used for the classification of poses. Finally, the accuracy in detecting emotions has been assessed by means of poses reported by a sample of users.
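    The nearest-neighbour step can be sketched as follows; the joint count, the random vectors standing in for the Schindler et al. reference poses, and the Euclidean distance metric are assumptions for illustration, not EmoPose's actual implementation.

```python
# Illustrative nearest-neighbour pose classifier: match a reported 3D pose to
# the closest of seven reference poses (Ekman's six basic emotions + neutral).
import numpy as np

EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise", "neutral"]
N_JOINTS = 15  # assumed skeleton size; each pose is a flattened (x, y, z) vector

# Random stand-ins for the per-emotion reference poses.
rng = np.random.default_rng(0)
REFERENCE_POSES = {e: rng.normal(size=N_JOINTS * 3) for e in EMOTIONS}

def classify_pose(reported_pose):
    """Return the emotion whose reference pose is nearest in Euclidean distance."""
    distances = {e: np.linalg.norm(reported_pose - ref)
                 for e, ref in REFERENCE_POSES.items()}
    return min(distances, key=distances.get)

print(classify_pose(np.zeros(N_JOINTS * 3)))
```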

    Orecchio: Extending Body-Language through Actuated Static and Dynamic Auricular Postures

    In this paper, we propose using the auricle – the visible part of the ear – as a means of expressive output to extend body language and convey emotional states. An initial exploratory study provided a first set of dynamic and static auricular postures. Using these results, we examined the relationship between emotions and auricular postures, noting that dynamic postures involving stretching the top helix at fast (e.g., 2 Hz) and slow (1 Hz) speeds conveyed intense and mild pleasantness, while static postures involving bending the side or top helix towards the center of the ear were associated with intense and mild unpleasantness. Based on these results, we developed a prototype (called Orecchio) with miniature motors, custom-made robotic arms and other electronic components. A preliminary user evaluation showed that participants feel more comfortable using expressive auricular postures with people they are familiar with, and that it is a welcome addition to the vocabulary of human body language.
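    The emotion-to-posture mapping reported above can be restated as a small lookup that a controller for actuated ear postures might use; the function and posture names are hypothetical, and the abstract does not say which static bend (side or top helix) maps to intense versus mild unpleasantness, so that detail is left open.

```python
# Hypothetical mapping from (valence, intensity) to an auricular posture,
# following the associations reported in the abstract.
def auricular_posture(valence, intensity):
    """Return a posture name and actuation rate in Hz (0.0 means a static pose)."""
    if valence == "pleasant":
        # Dynamic: stretch the top helix; faster motion reads as more intense.
        return {"posture": "stretch_top_helix",
                "rate_hz": 2.0 if intensity == "intense" else 1.0}
    # Static: bend the side or top helix towards the centre of the ear.
    return {"posture": "bend_helix_inward", "rate_hz": 0.0}

print(auricular_posture("pleasant", "mild"))      # {'posture': 'stretch_top_helix', 'rate_hz': 1.0}
print(auricular_posture("unpleasant", "intense"))
```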