    INTERACTIVE SONIFICATION STRATEGIES FOR THE MOTION AND EMOTION OF DANCE PERFORMANCES

    The Immersive Interactive SOnification Platform, or iISoP for short, is a research platform for the creation of novel multimedia art, as well as for exploratory research in the fields of sonification, affective computing, and gesture-based user interfaces. The goal of the iISoP’s dancer sonification system is to “sonify the motion and emotion” of a dance performance via musical auditory display. An additional goal of this dissertation is to develop and evaluate musical strategies for adding a layer of emotional mappings to data sonification. A series of dancer sonification design exercises led to the development of a novel musical sonification framework. The overall design process is divided into three main iterative phases: requirement gathering, prototype generation, and system evaluation. In the first phase, dancers and musicians contributed in a participatory design fashion as domain experts in the field of non-verbal affective communication. Knowledge extraction took the form of semi-structured interviews, stimuli feature evaluation, workshops, and think-aloud protocols. In phase two, the expert dancers and musicians helped create testable stimuli for prototype evaluation. In phase three, system evaluation, experts (dancers, musicians, etc.) and novice participants were recruited to provide subjective feedback from the perspectives of both performer and audience. Based on the results of the iterative design process, a novel sonification framework that translates motion and emotion data into descriptive music is proposed and described.
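
    The core idea of "sonifying motion and emotion" can be sketched as a mapping from per-frame features to musical parameters. The feature names, ranges, and mapping rules below are illustrative assumptions for a minimal sketch, not the actual mappings of the iISoP framework.

```python
# Hypothetical sketch: map motion/emotion features of one frame of a
# dance performance to musical parameters. All feature names, ranges,
# and mapping rules here are illustrative assumptions.

def sonify_frame(speed, smoothness, arousal, valence):
    """Map one frame of normalized motion/emotion data (all in [0, 1])
    to a small set of musical parameters."""
    tempo_bpm = 60 + 120 * arousal                      # calmer -> slower
    pitch_midi = int(48 + 24 * speed)                   # faster -> higher register
    mode = "major" if valence >= 0.5 else "minor"       # pleasant -> major
    articulation = "legato" if smoothness > 0.5 else "staccato"
    return {"tempo_bpm": tempo_bpm, "pitch_midi": pitch_midi,
            "mode": mode, "articulation": articulation}
```

    A real system of this kind would evaluate such a mapping per analysis frame and drive a synthesizer or MIDI stream with the resulting parameters.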

    Data Musicalization

    Data musicalization is the process of automatically composing music based on given data, as an approach to perceptualizing information artistically. The aim of data musicalization is to evoke subjective experiences in relation to the information, rather than merely to convey unemotional information objectively. This paper is written as a tutorial for readers interested in data musicalization. We start by providing a systematic characterization of musicalization approaches, based on their inputs, methods, and outputs. We then illustrate data musicalization techniques with examples from several applications: one that perceptualizes physical sleep data as music, several that artistically compose music inspired by the sleep data, one that musicalizes on-line chat conversations to provide a perceptualization of the liveliness of a discussion, and one that uses musicalization in a game-like mobile application that allows its users to produce music. We additionally provide a number of electronic samples of music produced by the different musicalization applications.
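
    A minimal instance of such an input-method-output pipeline is quantizing a numeric series onto a musical scale. The pentatonic scale and two-octave range below are assumptions chosen for illustration, not a reconstruction of any specific system described in the paper.

```python
# Minimal data-musicalization sketch: quantize a numeric series onto a
# two-octave pentatonic scale. Scale and range are illustrative choices.

PENTATONIC = [0, 2, 4, 7, 9]  # semitone offsets within one octave

def musicalize(series, root_midi=60):
    """Return one MIDI note number per data point, with larger values
    mapped to higher scale degrees."""
    lo, hi = min(series), max(series)
    span = (hi - lo) or 1.0                  # avoid division by zero
    notes = []
    for x in series:
        pos = (x - lo) / span                # normalize to [0, 1]
        step = round(pos * (len(PENTATONIC) * 2 - 1))   # two octaves
        octave, degree = divmod(step, len(PENTATONIC))
        notes.append(root_midi + 12 * octave + PENTATONIC[degree])
    return notes
```

    The resulting note list can then be rendered as MIDI or audio; rhythm, dynamics, and harmony would be further mapping dimensions in a fuller system.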

    What do your footsteps sound like? An investigation on interactive footstep sounds adjustment

    This paper presents an experiment in which participants were asked to adjust, while walking, the spectral content and the amplitude of synthetic footstep sounds in order to match the sounds of their own footsteps. The sounds were interactively generated by means of a shoe-based system capable of tracking footfalls and delivering real-time auditory feedback via headphones. The results allowed identification of the mean value and the range of variation of the spectral centroid and peak level of footstep sounds simulating various combinations of shoe type and ground material. Results showed that the effect of ground material on centroid and peak level depended on the type of shoe; similarly, the effect of shoe type on the two variables depended on the type of ground material. In particular, participants produced greater amplitudes for hard-sole shoes than for soft-sole shoes on solid surfaces, while similar amplitudes for both shoe types were found for aggregate, hybrid, and liquid surfaces. No significant correlations were found between either of the two acoustic features and participants’ body size. This result might be explained by the fact that, while adjusting the sounds, participants did not primarily focus on the acoustic rendering of their own body. In addition, no significant differences were found between the values of the two acoustic features selected by the experimenters and those adjusted by participants. This result can therefore be taken as a measure of the goodness of the design choices made to synthesize the footstep sounds for a generic walker. More importantly, this study showed that the relationships between the ground-shoe combinations do not change when participants are actively walking. This represents the first active-listening confirmation of a result that had previously only been shown in passive listening studies.
    The results of this research can be used to design ecologically valid auditory rendering of foot-floor interactions in virtual environments.

    This work was supported partly by a grant from the Danish Council for Independent Research awarded to Luca Turchet (Grant No. 12-131985), and partly by a grant from the ESRC awarded to Ana Tajadura-Jiménez (Grant No. ES/K001477/1).
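
    The two acoustic features the participants adjusted have standard definitions that can be sketched directly; the synthesis engine itself is not reproduced here, and any mono signal array works as input.

```python
# Sketch of the two features adjusted in the experiment: spectral
# centroid and peak level of a (footstep) sound, computed from a mono
# signal array with standard textbook definitions.

import numpy as np

def spectral_centroid(signal, sample_rate):
    """Amplitude-weighted mean frequency of the magnitude spectrum, in Hz."""
    mags = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    return float(np.sum(freqs * mags) / np.sum(mags))

def peak_level_db(signal):
    """Peak amplitude in dB relative to full scale (dBFS)."""
    return float(20 * np.log10(np.max(np.abs(signal))))
```

    In an interactive adjustment task like the one described, these two values would be recomputed (or read back from the synthesis parameters) for each setting the walker selects.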

    As Light as You Aspire to Be: Changing body perception with sound to support physical activity

    Supporting exercise adherence through technology remains an important HCI challenge. Recent work showed that altering walking sounds leads people to perceive themselves as thinner/lighter and happier, and to walk more dynamically. While this novel approach shows potential for supporting physical activity, it raises critical questions that impact technology design. We ran two studies in the context of exertion (gym stepping, stair climbing) to investigate how individual factors modulate the effect of sound and the duration of the after-effects. The results confirm that the effects of sound on body perception occur even in physically demanding situations and through ubiquitous wearable devices. We also show that the effect of sound interacted with participants’ body weight and masculinity/femininity aspirations, but not with gender. Additionally, changes in body perception did not hold once the feedback stopped; however, changes in body feelings and behaviour appeared to persist for longer. We discuss the results in terms of the malleability of body perception and highlight opportunities for supporting exercise adherence.

    Plausible Auditory Augmentation of Physical Interaction

    Weger M, Hermann T, Höldrich R. Plausible Auditory Augmentation of Physical Interaction. In: Proceedings of the 24th International Conference on Auditory Display, Sonification as ADSR (ICAD 2018). Michigan: ICAD; 2018: 97-104.

    Interactions with physical objects usually evoke sounds, i.e., auditory feedback that depends on the interacting objects (e.g., table, hand, or pencil) and the interaction type (e.g., tapping or scratching). The continuous real-time adaptation of sound during interaction enables the manipulation/refinement of perceived characteristics (size, material) of physical objects. Furthermore, when controlled by unrelated external data, the resulting ambient sonifications can keep users aware of changing data. This article introduces the concept of plausibility to the topic of auditory augmentations of physical interactions, aiming at providing an experimentation platform for investigating surface-based physical interactions, understanding relevant acoustic cues, redefining these via auditory augmentation / blended sonification, and, in particular, empirically measuring the plausibility limits of such auditory augmentations. Besides conceptual contributions along the trade-off between plausibility and usability, a practical experimentation system is introduced, together with a first qualitative pilot study.

    Automated Analysis of Synchronization in Human Full-body Expressive Movement

    The research presented in this thesis focuses on the creation of computational models for the study of human full-body movement, in order to investigate human behavior and non-verbal communication. In particular, the research concerns the analysis of synchronization of expressive movements and gestures. Synchronization can be computed both for a single user (intra-personal), e.g., to measure the degree of coordination between the joints’ velocities of a dancer, and for multiple users (inter-personal), e.g., to detect the level of coordination between users in a group. Through a set of experiments and results, the thesis contributes to the investigation of both intra-personal and inter-personal synchronization applied to the study of movement expressivity, and improves the state of the art of the available methods by presenting a new algorithm to perform the analysis of synchronization.
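
    A simple baseline for the intra-personal case is the correlation between the velocity profiles of two joints. The sketch below uses plain Pearson correlation as a generic synchronization score; it is a textbook measure for illustration, not the new algorithm proposed in the thesis.

```python
# Illustrative baseline for intra-personal synchronization: Pearson
# correlation between the velocity time series of two joints of the
# same performer. A generic measure, not the thesis's algorithm.

import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def joint_sync(vel_a, vel_b):
    """Synchronization score in [-1, 1]: 1 = perfectly coordinated,
    -1 = perfectly anti-phase joint velocities."""
    return pearson(vel_a, vel_b)
```

    The inter-personal case follows the same pattern with the two series taken from different users; more refined methods typically add windowing, time-lag analysis, or phase-based measures on top of such a score.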