    Interactive dance choreography assistance

    Creative support tools are prevalent in many of the performing arts; for the art of dance, however, automated tools supporting creativity have been scarce. We describe ongoing research into (semi-)automatic creative choreography support. Based on the state of the art and a survey among 54 choreographers, we establish functionalities and requirements for a choreography assistance tool, including the semantic levels at which it should operate and communicate with end-users. We describe a user study with a prototype tool that presents choreography alternatives generated by various simple strategies in three dance styles. The results show that the needs for such a tool vary by dance discipline. In a second user study, we investigate methods of presenting choreography variations, evaluating four presentation methods: textual descriptions, 2D animations, 3D animations and auditory instructions, in two different dance styles. The outcome of the expert survey shows that the tool is effective in communicating the variations to the experts and that they prefer 3D animations. Based on these results, we propose a design for an interactive dance choreography assistance tool.

    Moving sounds and sonic moves: exploring interaction quality of embodied music mediation technologies through a user-centered perspective

    This research project deals with the user experience of embodied music mediation technologies. More specifically, it considers the adoption and policy problems surrounding new media (art) which arise from the usability issues that to date pervade new interfaces for musical expression. Since the emergence of new wireless mediators and control devices for musical expression, the creative industries and various research centers have explicitly aspired to embed such technologies in different areas of the cultural industries. The number of applications and uses has increased exponentially over the last decade. Conversely, many of the applications to date still suffer from severe usability problems, which not only hinder adoption by the cultural sector, but also make culture participants take a rather cautious, hesitant, or even downright negative stance towards these technologies. Therefore, this thesis takes a vantage point that is in part sociological in nature, yet has a link to cultural studies as well. It combines this with a musicological frame of reference, to which it introduces empirical user-oriented approaches, predominantly taken from the field of human-computer interaction studies. This interdisciplinary strategy is adopted to cope with the complex nature of digital embodied music controlling technologies. Within the Flemish cultural (and creative) industries, opportunities for systems affiliated with embodied interaction are created and examined.
This constitutes an epistemological jigsaw that looks into 1) “which stakeholders require what various levels of involvement, what interactive means and what artistic possibilities?”, 2) “the way in which artistic aspirations, cultural prerequisites and operational necessities of (prospective) users can be defined?”, 3) “how functional, artistic and aesthetic requirements can be accommodated?”, and 4) “how quality of use and quality of experience can be achieved, quantified, evaluated and, eventually, improved?”. Within this multi-faceted problem, the eventual aim is to assess the applicability of the aforementioned technology, on a theoretically and empirically sound basis, and to facilitate widening and enhancing the adoption of said technologies. Methodologically, this is achieved by 1) applied experimentation, 2) interview techniques, 3) self-reporting and survey research, 4) usability evaluation of existing devices, and 5) human-computer interaction methods applied – and attuned – to the specific case of embodied music mediation technologies. Within that scope, concepts related to usability, flow, presence, goal assessment and game enjoyment are scrutinized and applied, and both task- and experience-oriented heuristics and metrics are developed and tested. In the first part, covering three chapters, the general context of the thesis is given. In the first chapter, an introduction to the topic is offered and the current problems are enumerated. In the second chapter, a broader theoretical background is presented of the concepts that underpin the project, namely 1) the paradigm of embodiment and its connection to musicology, 2) the state of the art concerning new interfaces for musical expression, 3) an introduction to HCI usability and its application domain in systematic musicology, 4) an insight into user-centered digital design procedures, and 5) the challenges brought about by e-culture and digitization for the cultural-creative industries.
In the third chapter, the state of the art concerning the methodologies relevant to the thesis’ endeavor is discussed, a set of literature-based design guidelines is enumerated, and from this a conceptual model is deduced, which is gradually presented throughout the thesis and fully deployed in the “SoundField” project (as described in Chapter 9). The following chapters, contained in the second part of the thesis, give a quasi-chronological overview of how methodological concepts have been applied throughout the empirical case studies, aimed specifically at exploring the various aspects of the complex status quaestionis. In the fourth chapter, a series of application-based tests, predominantly revolving around interface evaluation, illustrate the complex relation between gestural interfaces and meaningful musical expression, advocating the adoption of a more user-centered development approach. In the fifth chapter, a multi-purpose questionnaire dubbed “What Moves You” is discussed, which aimed at creating a survey of the (prospective) end-users of embodied music mediation technologies. It therefore primarily focused on cultural background, musical profile and preferences, views on embodied interaction, literacy of and attitudes towards new technology, and participation in digital culture. In the sixth chapter, the ethnographical studies that accompanied the exhibition of two interactive art pieces, entitled “Heart as an Ocean” and “Lament”, are discussed. In these studies, the use of interview and questionnaire methodologies together with the presentation and reception of interactive art pieces are probed. In the seventh chapter, the development of the collaboratively controlled music game “Sync-In-Team” is presented, in which interface evaluation, presence, game enjoyment and goal assessment are the pivotal topics.
In the eighth chapter, two usability studies conducted on prototype systems/interfaces are considered, namely a heuristic evaluation of the “Virtual String” and a usability-metrics evaluation of the “Multi-Level Sonification Tool”. The findings of these two studies, in conjunction with the exploratory studies performed in association with the interactive art pieces, finally gave rise to the “SoundField” project, which is recounted in full throughout the ninth chapter. The integrated participatory design and evaluation method presented in the conceptual model is fully applied over the course of the “SoundField” project, in which technological opportunities and ecological validity and applicability are investigated through user-informed development of numerous use cases. The third and last part of the thesis renders the final conclusions of this research project. The tenth chapter sets out with an epilogue giving a brief overview of how the state of the art has evolved since the end of the project (as the research ended in 2012, but the research field has obviously moved on), and attempts to consolidate the implications of the research studies with some of the realities of the Flemish cultural-creative industries. Chapter eleven continues by discussing the strengths and weaknesses of the conceptual model throughout the various stages of the project. It also comprises the evaluation of the hypotheses, how the assumptions that were made held up, and how the research questions could eventually be assessed. Finally, the twelfth and last chapter concludes with the most important findings of the project. It also discusses some of the implications for cultural production and artistic research policy, and offers an outlook on future research beyond the scope of the “SoundField” project.

    From head to toe: body movement for human-computer interaction

    Our bodies are the medium through which we experience the world around us, so human-computer interaction can benefit greatly from the richness of body movements and postures as an input modality. In recent years, the widespread availability of inertial measurement units and depth sensors has led to the development of a plethora of applications for the body in human-computer interaction. However, the main focus of these works has been on using the upper body for explicit input. This thesis investigates the research space of full-body human-computer interaction through three propositions. The first proposition is that there is more to be inferred from users’ natural movements and postures, such as the quality of activities and psychological states. We develop this proposition in two domains. First, we explore how to support users in performing weight-lifting activities. We propose a system that classifies different ways of performing the same activity; an object-oriented model-based framework for formally specifying activities; and a system that automatically extracts an activity model by demonstration. Second, we explore how to automatically capture nonverbal cues for affective computing. We developed a system that annotates motion and gaze data according to the Body Action and Posture coding system. We show that quality analysis can add another layer of information to activity recognition, and that systems that support the communication of quality information should strive to support how we implicitly communicate movement through nonverbal communication. Further, we argue that, by working at a higher level of abstraction, affect recognition systems can more directly translate findings from other areas into their algorithms, and can also contribute new knowledge to these fields.
The second proposition is that the lower limbs can provide an effective means of interacting with computers beyond assistive technology. To address the problem of the dispersed literature on the topic, we conducted a comprehensive survey on the lower body in HCI, under the lenses of users, systems and interactions. To address the lack of a fundamental understanding of foot-based interactions, we conducted a series of studies that quantitatively characterise several aspects of foot-based interaction, including Fitts’s Law performance models, the effects of movement direction, foot dominance and visual feedback, and the overhead incurred by using the feet together with the hand. To enable all these studies, we developed a foot tracker based on a Kinect mounted under the desk. We show that the lower body can be used as a valuable complementary modality for computer input. Our third proposition is that by treating body movements as multiple modalities, rather than a single one, we can enable novel user experiences. We develop this proposition in the domain of 3D user interfaces, as it requires input with multiple degrees of freedom and offers a rich set of complex tasks. We propose an approach for tracking the whole body up close, by splitting the sensing of different body parts across multiple sensors. Our setup allows tracking gaze, head, mid-air gestures, multi-touch gestures, and foot movements. We investigate specific applications for multimodal combinations in the domain of 3DUI, specifically how gaze and mid-air gestures can be combined to improve selection and manipulation tasks; how the feet can support the canonical 3DUI tasks; and how a multimodal sensing platform can inspire new 3D game mechanics.
We show that the combination of multiple modalities can lead to enhanced task performance; that offloading certain tasks to alternative modalities not only frees the hands, but also allows simultaneous control of multiple degrees of freedom; and that by sensing different modalities separately, we achieve more detailed and precise full-body tracking.
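The Fitts’s Law performance models mentioned in this abstract follow the standard Shannon formulation of the law, which predicts movement time from target distance and width. A minimal sketch in Python, with hypothetical intercept and slope coefficients (the actual fitted values for foot-based input are reported in the thesis, not here):

```python
import math

def fitts_mt(distance, width, a=0.2, b=0.3):
    """Predict movement time (seconds) via the Shannon formulation of Fitts's Law.

    distance: movement amplitude to the target; width: target width (same units).
    a, b: empirically fitted intercept and slope -- hypothetical values here.
    """
    index_of_difficulty = math.log2(distance / width + 1)  # in bits
    return a + b * index_of_difficulty

# A farther, smaller target has a higher index of difficulty,
# so the model predicts a longer movement time.
mt_easy = fitts_mt(distance=100, width=50)  # ID = log2(3)  ~ 1.58 bits
mt_hard = fitts_mt(distance=400, width=20)  # ID = log2(21) ~ 4.39 bits
```

In a study like the ones described above, `a` and `b` would be obtained by linear regression of measured movement times against the index of difficulty, separately per condition (e.g. dominant vs. non-dominant foot, movement direction).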