
    Tools for expressive gesture recognition and mapping in rehearsal and performance

    Thesis (S.M.)--Massachusetts Institute of Technology, School of Architecture and Planning, Program in Media Arts and Sciences, 2010. Cataloged from PDF version of thesis. Includes bibliographical references (p. 97-101). As human movement is an incredibly rich mode of communication and expression, performance artists working with digital media often use performers' movement and gestures to control and shape that media as part of a theatrical, choreographic, or musical performance. In my own work, I have found that strong, semantically meaningful mappings between gesture and sound or visuals are necessary to create compelling performance interactions. However, the existing systems for developing mappings between incoming data streams and output media have extremely low-level concepts of "gesture." Their programming process focuses on low-level sensor data, such as the voltage values of a particular sensor, which constrains the user's thinking, requires significant programming experience, and loses the expressive, meaningful, and metaphor-rich content of the movement. To remedy these difficulties, I have created a new framework and development environment for gestural control of media in rehearsal and performance that allows users to create clear and intuitive mappings in a simple and flexible manner by using high-level descriptions of gestures and of gestural qualities. This approach, the Gestural Media Framework, recognizes continuous gesture and translates Laban Effort Notation into the realm of technological gesture analysis, allowing sensor data to be abstracted and encapsulated into movement descriptions. As part of the evaluation of this system, I choreographed four performance pieces that used it throughout the rehearsal and performance process to map dancers' movements to the manipulation of sound and visual elements. This work has been supported by the MIT Media Laboratory. by Elena Naomi Jessop. S.M.
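    The thesis's central move, abstracting raw sensor values into high-level movement descriptions that a mapping layer consumes, might look roughly like the sketch below. This is a minimal illustration, not the Gestural Media Framework's actual API: the function names, the two crude Effort estimates (Weight and Time), and the toy sound mapping are all assumptions.

    ```python
    # Hypothetical sketch: raw accelerometer samples are encapsulated into
    # two crude Laban Effort qualities, which a mapping layer then turns
    # into synthesis parameters. Names are illustrative, not the framework's.
    import math

    def effort_qualities(accel_window):
        """Estimate crude Weight and Time Effort qualities from a window
        of 3-axis accelerometer samples (list of (x, y, z) tuples in g)."""
        mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in accel_window]
        # Weight: a strong movement shows a high peak acceleration magnitude.
        weight = max(mags)
        # Time: a sudden movement shows large sample-to-sample change (jerk).
        jerk = [abs(b - a) for a, b in zip(mags, mags[1:])]
        time_quality = sum(jerk) / max(len(jerk), 1)
        return {"weight": weight, "time": time_quality}

    def map_to_sound(qualities):
        """Toy mapping layer: gesture qualities drive sound parameters."""
        return {
            "amplitude": min(qualities["weight"] / 3.0, 1.0),    # strong -> loud
            "attack_ms": 5 if qualities["time"] > 0.5 else 200,  # sudden -> sharp
        }

    window = [(0.0, 0.0, 1.0), (0.4, 0.1, 1.2), (1.8, 0.3, 2.1), (0.2, 0.0, 1.1)]
    print(map_to_sound(effort_qualities(window)))
    ```

    The point of the abstraction is that the mapping layer never sees voltages: a choreographer could swap sensors without touching the gesture-to-sound rules.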

    MMixte: a software architecture for Live Electronics with acoustic instruments : exemplary application cases

    MMixte is a middleware based on Max for mixed music with live electronics. It enables the programming of a "patcher concerto", that is, a platform for the management of live electronics, in just a few minutes and with extreme simplicity. Aimed at intermediate and expert users, MMixte enables true programming of live electronics in very little time while also making it easy to adapt previously developed modules to the needs of each piece. The architecture behind MMixte is based on a variation of the so-called "pipeline architecture"; analysis of the most widely used software architectures and of design patterns for programming graphical interfaces guided how communication between the various modules is organized, how they are used, and how they appear graphically. A survey of other state-of-the-art module collections and software programs dedicated to mixed music shows the absence of prior work on software architecture for mixed music. Application of MMixte to some of my personal works demonstrates its flexibility and ease of adaptation. Programming a piece of mixed music requires much that goes beyond the programming of audio signal processing; the present work seeks to provide an example of a solution to such needs.
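    To illustrate the "pipeline architecture" variant the abstract refers to, here is a minimal sketch transposed from Max to Python. MMixte itself is a collection of Max patches, so the Module, Gain, Delay, and Pipeline classes below are hypothetical illustrations of the pattern, not MMixte's modules.

    ```python
    # Minimal pipeline-architecture sketch: each module transforms an audio
    # block and hands it to the next, the way a live-electronics patch
    # routes an instrument signal through a chain of effects.
    class Module:
        def process(self, block):
            raise NotImplementedError

    class Gain(Module):
        def __init__(self, factor):
            self.factor = factor
        def process(self, block):
            return [s * self.factor for s in block]

    class Delay(Module):
        def __init__(self, samples):
            self.buffer = [0.0] * samples
        def process(self, block):
            out = []
            for s in block:
                out.append(self.buffer.pop(0))  # emit the delayed sample
                self.buffer.append(s)           # queue the current one
            return out

    class Pipeline(Module):
        """Chains modules so each one's output feeds the next."""
        def __init__(self, modules):
            self.modules = modules
        def process(self, block):
            for module in self.modules:
                block = module.process(block)
            return block

    chain = Pipeline([Gain(0.5), Delay(2), Gain(2.0)])
    print(chain.process([1.0, 0.0, 0.0, 0.0]))  # impulse comes out 2 samples late
    ```

    Because Pipeline is itself a Module, chains can be nested and previously developed modules reused by rearranging a list, which is the kind of quick reconfiguration the abstract claims for MMixte.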

    The Varieties of User Experience: Bridging Embodied Methodologies from Somatics and Performance to Human Computer Interaction

    Embodied Interaction continues to gain significance within the field of Human Computer Interaction (HCI). Its growing recognition and value are evidenced in part by a remarkable increase in systems design and publication focusing on various aspects of embodiment. The enduring need to interact through experience has spawned a variety of interdisciplinary bridging strategies in the hope of gaining a deeper understanding of human experience. Along with phenomenology, cognitive science, psychology, and the arts, recent interdisciplinary contributions to HCI include the knowledge-rich domains of Somatics and Performance, which carry long-standing traditions of embodied practice. The common ground between HCI and the fields of Somatics and Performance is the need to understand and model human experience. Yet Somatics and Performance differ from normative HCI in their epistemological frameworks of embodiment, which is particularly evident in their histories of knowledge construction and representation. The contributions of Somatics and Performance to the history of embodiment are not yet fully understood within HCI. These differing epistemologies and their resulting approaches to experience identify an under-theorized area of research and an opportunity to develop a richer base of knowledge and practice. This thesis examines that opportunity by comparing theories and practices of embodied experience between HCI and Somatics and Performance, and by analyzing the influences, values, and assumptions underlying their epistemological frameworks. The analysis results in a set of design strategies grounded in the embodied practices of Somatics and Performance. The application of these strategies is then examined through a series of interactive art installations that employ embodied interaction as a central expression of technology. Case studies provide evidence in the form of rigorously documented design processes that illustrate these strategies. This research exemplifies 'Research through Art' applied in the context of experience design for tangible, wearable, and social interaction.

    A forearm controller and tactile display

    Thesis (S.M.)--Massachusetts Institute of Technology, School of Architecture and Planning, Program in Media Arts and Sciences, 2005. Includes bibliographical references (leaves 187-192). This thesis discusses the design and implementation of ARMadillo, a simple virtual-environment interface in the form of a small wireless device worn on the forearm. Designed to be portable, intuitive, and low cost, the device tracks the orientation of the arm with accelerometers, magnetic field sensors, and gyroscopes, fusing the data with a quaternion-based Unscented Kalman Filter. The orientation estimate is mapped to a virtual space that is perceived through a tactile display containing an array of vibrating motors. The controller is driven by an 8051 microcontroller and includes a Bluetooth module and an extension slot for CompactFlash cards. The device was designed to be simple and modular, and can support a variety of interesting applications, some of which were implemented and are discussed here. These fall into two main classes. The first is a set of artistic applications, represented by a suite of virtual musical instruments that can be played with arm movements and felt through the tactile display. The second class involves utilitarian applications, including a custom Braille-like system called Arm Braille, and tactile guidance. A wearable Braille display intended for reading navigational signs and text messages was tested on two sight-impaired subjects, who were able to recognize Braille characters reliably after 25 minutes of training and to read words by the end of an hour. by David Matthew Sachs. S.M.
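    As a rough illustration of the sensing-to-tactile path described above: the thesis fuses accelerometer, magnetometer, and gyroscope data with a quaternion-based Unscented Kalman Filter, which is too involved to reproduce here, so the sketch below substitutes plain quaternion integration of gyroscope rates and a toy mapping from estimated heading to one motor in a circular tactor array. All names and parameters are illustrative assumptions, not ARMadillo's firmware.

    ```python
    # Simplified stand-in for the thesis's quaternion UKF: dead-reckon
    # orientation from gyro rates, then pick a vibration motor from yaw.
    import math

    def quat_multiply(q, r):
        w1, x1, y1, z1 = q
        w2, x2, y2, z2 = r
        return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
                w1*x2 + x1*w2 + y1*z2 - z1*y2,
                w1*y2 - x1*z2 + y1*w2 + z1*x2,
                w1*z2 + x1*y2 - y1*x2 + z1*w2)

    def integrate_gyro(q, rates, dt):
        """Advance orientation quaternion q by body rates (rad/s) over dt,
        using q_dot = 0.5 * q * (0, wx, wy, wz), then renormalize."""
        wx, wy, wz = rates
        w, x, y, z = quat_multiply(q, (0.0, wx, wy, wz))
        q = (q[0] + 0.5*dt*w, q[1] + 0.5*dt*x,
             q[2] + 0.5*dt*y, q[3] + 0.5*dt*z)
        norm = math.sqrt(sum(c * c for c in q))
        return tuple(c / norm for c in q)

    def heading_to_motor(q, n_motors=8):
        """Choose which motor in a circular array of n_motors to vibrate,
        based on the yaw angle extracted from the quaternion."""
        w, x, y, z = q
        yaw = math.atan2(2*(w*z + x*y), 1 - 2*(y*y + z*z))
        return int(round(yaw / (2*math.pi) * n_motors)) % n_motors

    q = (1.0, 0.0, 0.0, 0.0)
    for _ in range(100):  # simulate 1 s of slow turning about the z axis
        q = integrate_gyro(q, (0.0, 0.0, 0.8), 0.01)
    print("vibrate motor", heading_to_motor(q))
    ```

    The UKF in the thesis exists precisely because this naive integration drifts; fusing accelerometer and magnetometer measurements anchors the estimate to gravity and magnetic north.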