    Using the Audio Respiration Signal for Multimodal Discrimination of Expressive Movement Qualities

    In this paper we propose a multimodal approach to distinguishing between movements displaying three different expressive qualities: fluid, fragmented, and impulsive movements. Our approach is based on the Event Synchronization algorithm, which we apply to compute the amount of synchronization between two low-level features extracted from multimodal data. In more detail, we use the energy of the audio respiration signal, captured by a standard microphone placed near the mouth, and the whole-body kinetic energy, estimated from motion capture data. The method was evaluated on 90 movement segments performed by 5 dancers. Results show that fragmented movements display higher average synchronization than fluid and impulsive movements.
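    The Event Synchronization step described in this abstract can be sketched as follows. This is a minimal illustration, assuming events (e.g., energy peaks) have already been extracted from the two signals as time stamps; the event-detection rule and window `tau` are illustrative assumptions, not the paper's actual parameters.

    ```python
    import numpy as np

    def count_sync(events_a, events_b, tau):
        """Count events in events_a that follow an event in events_b within
        tau seconds (half credit for exactly simultaneous events), in the
        style of Quian Quiroga-type Event Synchronization."""
        c = 0.0
        for ta in events_a:
            for tb in events_b:
                d = ta - tb
                if 0 < d <= tau:
                    c += 1.0
                elif d == 0:
                    c += 0.5
        return c

    def event_sync(events_a, events_b, tau):
        """Symmetric synchronization index Q in [0, 1]."""
        n = np.sqrt(len(events_a) * len(events_b))
        return (count_sync(events_a, events_b, tau)
                + count_sync(events_b, events_a, tau)) / n if n else 0.0
    ```

    For example, with respiration-energy peaks at 1.0 s, 5.0 s, 9.0 s and kinetic-energy peaks at 1.2 s, 5.1 s, 20.0 s, `event_sync(..., tau=0.5)` yields Q = 2/3: two of three event pairs co-occur within the window.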

    The dancer in the eye: Towards a multi-layered computational framework of qualities in movement

    This paper presents a conceptual framework for the analysis of expressive qualities of movement. Our perspective is to model an observer of a dance performance. The conceptual framework consists of four layers, ranging from the physical signals that sensors capture to the qualities that movement communicates (e.g., in terms of emotions). The framework aims to provide a conceptual background that the development of computational systems can build upon, with particular reference to systems that analyze a vocabulary of expressive movement qualities and translate them to other sensory channels, such as the auditory modality. Such systems enable their users to "listen to a choreography" or to "feel a ballet", in a new kind of cross-modal mediated experience.

    Automated Analysis of Synchronization in Human Full-body Expressive Movement

    The research presented in this thesis focuses on the creation of computational models for the study of human full-body movement, in order to investigate human behavior and non-verbal communication. In particular, the research concerns the analysis of synchronization of expressive movements and gestures. Synchronization can be computed both on a single user (intra-personal), e.g., to measure the degree of coordination between the joints' velocities of a dancer, and on multiple users (inter-personal), e.g., to detect the level of coordination between multiple users in a group. Through a set of experiments and results, the thesis contributes to the investigation of both intra-personal and inter-personal synchronization in support of the study of movement expressivity, and improves the state of the art of the available methods by presenting a new algorithm for the analysis of synchronization.
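    As a minimal illustration of the intra-personal case mentioned in this abstract (coordination between a dancer's joint velocities), one could correlate the speed profiles of two joints extracted from motion capture data. The data layout, frame rate, and use of Pearson correlation as the coordination measure are assumptions for the sketch, not the thesis's actual pipeline.

    ```python
    import numpy as np

    def joint_speeds(positions, fps):
        """Per-frame speed of a joint from its (frames, 3) 3D trajectory:
        magnitude of the frame-to-frame displacement times the frame rate."""
        return np.linalg.norm(np.diff(positions, axis=0), axis=1) * fps

    def intra_sync(pos_a, pos_b, fps=100):
        """Pearson correlation between two joints' speed profiles, as a
        simple proxy for intra-personal coordination."""
        sa = joint_speeds(pos_a, fps)
        sb = joint_speeds(pos_b, fps)
        return np.corrcoef(sa, sb)[0, 1]
    ```

    Two joints accelerating in proportion to each other yield a coordination value of 1.0; uncorrelated speed profiles yield values near 0.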

    Affective Medicine: a review of Affective Computing efforts in Medical Informatics

    Background: Affective computing (AC) is concerned with emotional interactions performed with and through computers. It is defined as “computing that relates to, arises from, or deliberately influences emotions”. AC enables investigation and understanding of the relation between human emotions and health, as well as the application of assistive and useful technologies in the medical domain. Objectives: 1) To review the general state of the art in AC and its applications in medicine, and 2) to establish synergies between the research communities of AC and medical informatics. Methods: Aspects related to the human affective state as a determinant of human health are discussed, coupled with an illustration of significant AC research and related literature output. Moreover, affective communication channels are described and their range of application fields is explored through illustrative examples. Results: The presented conferences, European research projects, and research publications illustrate the recent increase of interest in the AC area by the medical community. Tele-home healthcare, AmI, ubiquitous monitoring, e-learning, and virtual communities with emotionally expressive characters for elderly or impaired people are a few areas where the potential of AC has been realized and applications have emerged. Conclusions: A number of gaps can potentially be overcome through the synergy of AC and medical informatics. The application of AC technologies parallels the advancement of the existing state of the art and the introduction of new methods. The body of work and projects reviewed in this paper attests to an ambitious and optimistic synergetic future for the field of affective medicine.

    Mobile experiences of historical place: a multimodal analysis of emotional engagement

    This article explores how to research the opportunities for emotional engagement that mobile technologies provide for the design and enactment of learning environments. In the context of mobile technologies that foster location-based linking, we make the case for the centrality of in situ real-time observational research on how emotional engagement unfolds and for the inclusion of bodily aspects of interaction. We propose that multimodal methods offer tools for observing emotion as a central facet of person–environment interaction and provide an example of these methods put into practice for a study of emotional engagement in mobile history learning. A multimodal analysis of video data from 16 pairs of 9- to 10-year-olds learning about the World War II history of their local Common is used to illustrate how students’ emotional engagement was supported by their use of mobile devices through multimodal layering and linking of stimuli, the creation of digital artifacts, and changes in pace. These findings are significant for understanding the role of digital augmentation in fostering emotional engagement in history learning, informing how digital augmentation can be designed to effectively foster emotional engagement for learning, and providing insight into the benefits of multimodality as an analytical approach for examining emotion through bodily interaction.

    Analysis of movement quality in full-body physical activities

    Full-body human movement is characterized by fine-grained expressive qualities that humans are easily capable of exhibiting and recognizing in others' movement. In sports (e.g., martial arts) and performing arts (e.g., dance), the same sequence of movements can be performed in a wide range of ways characterized by different qualities, often in terms of subtle (spatial and temporal) perturbations of the movement. Even a non-expert observer can distinguish between a top-level and an average performance by a dancer or martial artist. The difference lies not in the performed movements, which are the same in both cases, but in the "quality" of their performance. In this article, we present a computational framework aimed at an automated approximate measure of movement quality in full-body physical activities. Starting from motion capture data, the framework computes low-level (e.g., the velocity of a limb) and high-level (e.g., synchronization between different limbs) movement features. This vector of features is then integrated to compute a value intended to provide a quantitative assessment of movement quality, approximating the evaluation that an external expert observer would give of the same sequence of movements. Next, a system representing a concrete implementation of the framework is proposed, with karate adopted as a testbed. We selected two katas (i.e., detailed choreographies of movements in karate) characterized by different overall attitudes and expressions (aggressiveness, meditation), and asked seven athletes of varying experience levels and ages to perform them. Motion capture data were collected from the performances and analyzed with the system. The results of the automated analysis were compared with the scores given by 14 karate experts who rated the same performances. Results show that the movement-quality scores computed by the system and the ratings given by the human observers are highly correlated (Pearson's r = 0.84, p = 0.001 and r = 0.75, p = 0.005).
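    The integration and evaluation steps described in this abstract can be sketched as follows. The feature names and the weighted-sum integration rule are illustrative assumptions (the abstract does not specify either); the Pearson-correlation comparison against expert ratings is the one the article reports.

    ```python
    import numpy as np

    def quality_score(features, weights):
        """Integrate a vector of movement features (e.g., mean limb velocity,
        inter-limb synchronization) into a single movement-quality value.
        A weighted sum is used here as a simple, assumed integration rule."""
        return sum(weights[name] * value for name, value in features.items())

    def agreement(system_scores, expert_ratings):
        """Pearson correlation between the automated per-performance scores
        and the averaged expert ratings."""
        return np.corrcoef(system_scores, expert_ratings)[0, 1]
    ```

    With hypothetical features `{"mean_velocity": 0.8, "limb_sync": 0.6}` and equal weights, `quality_score` returns 0.7; `agreement` applied to the system's scores and the experts' ratings across performances gives the kind of correlation value (r) reported above.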