    Developing the Dance Jockey system for musical interaction with the Xsens MVN suit

    In this paper we present the Dance Jockey System, developed for using a full-body inertial motion capture suit (Xsens MVN) in music/dance performances. We present different strategies for extracting relevant postures and actions from the continuous data, and show how these postures and actions can be used to control sonic and musical features. The system has been used in several public performances, and we believe it has great potential for further exploration. However, to overcome the current practical and technical challenges of working with the system, it is important to further refine the tools and software in order to facilitate the making of new performance pieces.
    Proceedings of the 12th International Conference on New Interfaces for Musical Expression. University of Michigan Press, 2012. ISBN 978-0-9855720-1-3.
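
    A minimal sketch of the kind of posture-and-action extraction described above, in Python: each motion capture frame is classified into a coarse posture, and a sound-control callback fires only once the posture has been held. The feature names, thresholds, and hold time are illustrative assumptions, not the Dance Jockey implementation.

        # Minimal sketch of threshold-based posture extraction over a continuous
        # motion capture stream. Feature names, thresholds, and the hold time are
        # illustrative assumptions, not the Dance Jockey internals.

        def classify_frame(elbow_angle_deg, hand_height_m):
            """Map one frame of derived features to a coarse posture label."""
            if hand_height_m > 1.8:
                return "arms_raised"
            if elbow_angle_deg < 30.0:
                return "arms_folded"
            return "neutral"

        def run(frames, on_posture, hold_frames=30):
            """Fire the sound-control callback once a posture has been held for
            hold_frames consecutive frames, filtering out transient poses."""
            current, count = None, 0
            for frame in frames:  # frame: dict of features for one mocap frame
                label = classify_frame(frame["elbow_angle"], frame["hand_height"])
                count = count + 1 if label == current else 1
                current = label
                if count == hold_frames:
                    on_posture(label)  # e.g., trigger a sample or switch a scene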

    The Tesseract: using the body’s movement to shape my compositional practice.

    The aim of this research project is to challenge my practice as a popular music songwriter and producer. To achieve this, I have developed a theoretically informed dance/movement paradigm to shape sound and provoke alternative production methodologies within my compositional practice. My work questions how concepts from Modern Architecture, the Bauhaus, Laban Movement Theory, Dance Movement Therapy, and the psychology of movement can be coherently integrated into my technologically mediated creative practice. Inspired by Laban Movement Theory, I have created The Tesseract: the culmination of research investigating how I can create a compositional system that maps my body's movement to parametric control in custom audio effects, and what performative and psychological implications arise from this system. The Tesseract allows me to integrate my intuitive physical movement and embodied cognition to create an interactive and dynamic means of sculpting sound in space. It provides a compositional-kinaesthetic feedback tool that offers an auto-ethnographic reflection on my long-established compositional methods and how these have changed as a result of this research. The tactility and physicality of controlling sound through movement offers a fundamentally different experience of working in a studio: one in which the perception of what is occurring sonically at any given moment can be immediately reconfigured through movement projection. Ultimately, this research project facilitates the integration of The Tesseract as a theoretically informed way to shape sound through movement and a technologically mediated means for mindfulness. Through creating works in this way, I investigate connections between Dance Movement Therapy and psychotherapy as they relate to embodied cognition, memory, and trauma.

    Human-Computer interaction methodologies applied in the evaluation of haptic digital musical instruments

    Recent developments in interactive technologies have brought major changes in the manner in which artists, performers, and creative individuals interact with digital music technology, owing to the increasing variety of interactive technologies readily available today. Digital Musical Instruments (DMIs) present musicians with performance challenges that are unique to this form of computer music. One of the most significant deviations from conventional acoustic musical instruments is the level of physical feedback conveyed by the instrument to the user. Currently, new interfaces for musical expression are not designed to be as physically communicative as acoustic instruments. Specifically, DMIs are often devoid of haptic feedback and therefore lack the ability to impart important performance information to the user. Moreover, there is currently no standardised way to measure the effect of this lack of physical feedback. Best practice would expect a set of methods to effectively, repeatably, and quantifiably evaluate the functionality, usability, and user experience of DMIs. Earlier theoretical and technological applications of haptics have tried to address device performance issues associated with the lack of feedback in DMI designs, and it has been argued that the level of haptic feedback presented to a user can significantly affect the user's overall emotive feeling towards a musical device. The outcomes of the investigations contained within this thesis are intended to inform new haptic interface design.

    OSC Implementation and Evaluation of the Xsens MVN suit

    The paper presents research on implementing a full-body inertial motion capture system, the Xsens MVN suit, for musical interaction. Three different approaches for streaming real-time and prerecorded motion capture data with Open Sound Control have been implemented. Furthermore, we present technical performance details and our experience with the motion capture system in realistic practice.
    Part of Proceedings of the International Conference on New Interfaces for Musical Expression 2011. http://urn.nb.no/URN:NBN:no-2936
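
    As a rough feel for how such streaming can look in code, a minimal Python sketch using the python-osc package follows; the OSC address pattern, port, and per-segment layout are assumptions for illustration and do not reproduce the paper's three implementations.

        # Sketch: streaming one position message per body segment over OSC with
        # the python-osc package. The address pattern, port, and data layout are
        # assumptions for illustration, not the paper's actual scheme.
        from pythonosc.udp_client import SimpleUDPClient

        client = SimpleUDPClient("127.0.0.1", 7110)  # receiver host/port (assumed)

        def send_frame(segments):
            """segments: iterable of (segment_id, x, y, z) for one mocap frame."""
            for seg_id, x, y, z in segments:
                client.send_message(f"/mvn/segment/{seg_id}/position", [x, y, z])

        send_frame([(1, 0.0, 0.0, 1.72), (2, 0.12, 0.03, 1.48)])  # toy data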

    Computational Shifts in Theatrical Space

    This dissertation describes a set of research projects conducted between 2012 and 2014 to answer the question: how do computational ideas alter our understanding of place? Each project was produced in the context of the performing arts and included plays, dance performances, film, and installation work. For each project, new software and hardware systems were created as a means of exploring different types of mediated communication. These systems include a scalable depth-camera-based tracking system for performance on stage, a tool for manipulating live-streamed video incorporated into stage performance, a method of tracking performers' biometric data live during performance, and a game engine for creating interactive environments. Collectively, these experiments establish a framework for discussing the nature of the shifts caused by applying computational ideas to space. Finally, the results lay the foundation for further theoretical work concerning the creation of cultural artifacts that exist somewhere between the material and immaterial, the influence of computation on the nature of modeling, and the impact of ubiquitous computing on contemporary notions of performance and play.

    Three-level design in the healthcare domain. Motion capture for gait analysis in rehabilitation.

    In the healthcare domain, technologies such as motion capture, dynamometry, and surface electromyography, among others, offer broad possibilities for objectively assessing patients' musculoskeletal capacity, supporting diagnosis and the monitoring of the rehabilitation process. However, getting these technologies to adapt to and integrate properly into the context of healthcare services is a complex challenge. This thesis presents a compendium of publications that respond to different challenges identified during the design, development, and use of technologies for evaluating the musculoskeletal system in the biomedical domain. In particular, we focus on motion capture systems; their complexity at the operational level (placement of different elements on the body), the technological level (a multitude of wireless electronic devices), and the analysis level (generation of large volumes of information) highlights the need for this research.
    - The first publication presents the needs that motivated this thesis, laying out its foundations and objectives, which are framed within healthcare, biomechanics, and usability.
    - The second introduces Octopus, a methodology developed to support the design of motion capture systems. This research classifies and schematizes the factors that must be considered during design and proposes the idea of "three-level design": service, product, and software, the main lines of work when developing applications aimed at evaluating the musculoskeletal system. At this point, as a result of various collaborations between the research group and public hospitals in our region, as well as an exhaustive review of the state of the art, the research was directed toward a niche case study framed in the context of hospital rehabilitation: specifically, designing a clinical gait analysis test based on motion capture to monitor rehabilitation treatments through measurement sessions before and after treatment.
    - The result was a publication framed within the first Octopus level, service design. This research studies how to integrate the gait analysis test into hospital rehabilitation. Incorporating a micro-service (the gait analysis test) into a macro-service such as rehabilitation is not a simple task. A methodological approach is therefore proposed for qualitatively evaluating the gait test in its context; its application yielded design guidelines that provide multidisciplinary knowledge for integrating the test into rehabilitation.
    - Likewise, a publication was produced that addresses another Octopus level, product design. This work presents a motion capture system called Move-Human Sensors (MH) that enables gait analysis tests. The system responds to the needs identified in the service study and incorporates two key functionalities: an anatomical calibration procedure that avoids the magnetic disturbances which negatively affect inertial motion capture, and an algorithm that detects gait events from the motion data without requiring additional instrumentation.
    - Finally, a publication is developed within the last Octopus level, software design. This study proposes a method for managing the data resulting from the gait analysis test. The method makes it possible to compare pre- and post-treatment captures in order to monitor patients in rehabilitation, providing the clinician with visual, specific information that can support clinical decision-making.
    This thesis contributes to the field of design, offering a global perspective based on three levels: product, service, and software. Along these lines, it presents a motion capture system whose design and development spans all three levels and addresses the key challenges identified. As the gait analysis case study deepens, progress is made in wearables, biomechanics, motion capture technology, usability, and data analysis algorithms. The implications of this research go beyond the publications described, since it is framed within broader projects that give thematic unity to the chapters covered. Consequently, this work is supported by other scientific communications, developments, and collaborations carried out in parallel. Likewise, the results of this research are being extrapolated to other related areas, both within the healthcare sector itself (for functional assessment tests) and in the industrial sector (for ergonomic evaluation of workstations), where this technology can add value.
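
    The gait event detection described for the MH system (events recovered from the motion data alone, without extra instrumentation) can be illustrated with a simple peak-picking sketch in Python; the signal choice, sampling rate, and thresholds below are assumptions, not the published algorithm.

        # Sketch: heel-strike candidates from a foot-mounted gyroscope trace by
        # peak-picking. Signal choice, sampling rate, and thresholds are
        # assumptions, not the algorithm published for the MH system.
        import numpy as np
        from scipy.signal import find_peaks

        def detect_heel_strikes(gyro_sagittal, fs=100.0):
            """Return sample indices of heel-strike candidates.

            gyro_sagittal: 1-D array of sagittal-plane angular velocity (rad/s).
            Heel strike tends to follow the large swing-phase peak, so we take
            the deepest trough shortly after each swing peak."""
            swing_peaks, _ = find_peaks(gyro_sagittal, height=2.0,
                                        distance=int(0.5 * fs))  # >= 0.5 s apart
            events = []
            for p in swing_peaks:
                window = gyro_sagittal[p:p + int(0.4 * fs)]  # 0.4 s after the peak
                if len(window):
                    events.append(p + int(np.argmin(window)))
            return events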

    Mechanisms of Stability and Energy Expenditure in Human Locomotion.

    Although humans normally walk with both stability and energy economy, either feature may be challenging for persons with disabilities. For example, among patients with lower-limb amputation, falls are pervasive and may lead to activity avoidance. Similarly, their energy expenditure is higher than that of healthy subjects, which may deter patients from walking and reduce mobility. A better understanding of the fundamental principles of stability and economy could lead to better prostheses that increase quality of life for patients. When designing a mechanism to assist or mimic human gait, such as orthoses or walking robots, the stability and economy of the resulting gait should be considered. To further our understanding of these fundamental principles of gait, I explore a lesser-known balance mechanism, foot heading, as well as the role of muscle force production costs in gait. To investigate the stabilizing role of foot heading, I first characterize a method of measuring natural human gait variability outside of lab environments using foot-mounted inertial sensors. Accuracy is found to be comparable to motion capture, while allowing capture of gait in natural environments. Then, using both a simple model of walking and a variability analysis of human walking, I present evidence that humans stabilize gait laterally by altering foot heading step-to-step. I then consider the metabolic cost of force production in human locomotion. First, an optimization study of a simple model of locomotion shows that force fluctuation costs play a stronger role in determining gait than force amplitude costs. I then illustrate the connection between force fluctuation and a cost for calcium pumping in muscles using a simple muscle model. Finally, a human subject experiment altering force fluctuation in walking demonstrates the higher metabolic cost of fluctuating forces. While human locomotion is a complex activity involving many muscles, sensory systems, and neural circuitry, we can use basic mechanical models to study underlying principles of gait. A better understanding of stability and economy could have applications in many fields involving locomotion, such as the diagnosis of fall risk in elderly subjects, the development of rehabilitation techniques, the design of prostheses, and the creation of robust and practical walking machines.
    PhD, Mechanical Engineering, University of Michigan, Horace H. Rackham School of Graduate Studies. http://deepblue.lib.umich.edu/bitstream/2027.42/108908/1/jrebula_1.pd
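
    The step-to-step foot heading mechanism described above suggests a simple variability measure: estimate each step's heading angle and its spread about the mean walking direction. A minimal Python sketch under an assumed data layout (not the dissertation's method):

        # Sketch: step-to-step foot heading variability from per-step horizontal
        # displacement vectors. The data layout (one 2-D displacement per step,
        # e.g. from integrating a foot-mounted inertial sensor) is an assumption.
        import numpy as np

        def heading_variability(step_vectors):
            """step_vectors: (n_steps, 2) array of horizontal foot displacement
            per step. Returns the standard deviation (radians) of step heading
            about the mean walking direction."""
            headings = np.arctan2(step_vectors[:, 1], step_vectors[:, 0])
            # Circular mean direction, robust to angle wrap-around.
            mean_dir = np.arctan2(np.sin(headings).mean(), np.cos(headings).mean())
            deviations = np.angle(np.exp(1j * (headings - mean_dir)))
            return deviations.std()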

    Proceedings of the International Conference on New Interfaces for Musical Expression

    Get PDF
    Editors: Alexander Refsum Jensenius, Anders Tveit, Rolf Inge Godøy, Dan Overholt

    Table of Contents

    - Tellef Kvifte: Keynote Lecture 1: Musical Instrument User Interfaces: the Digital Background of the Analog Revolution - page 1
    - David Rokeby: Keynote Lecture 2: Adventures in Phy-gital Space - page 2
    - Sergi Jordà: Keynote Lecture 3: Digital Lutherie and Multithreaded Musical Performance: Artistic, Scientific and Commercial Perspectives - page 3

    Paper session A — Monday 30 May 11:00–12:30
    - Dan Overholt: The Overtone Fiddle: an Actuated Acoustic Instrument - page 4
    - Colby Leider, Matthew Montag, Stefan Sullivan and Scott Dickey: A Low-Cost, Low-Latency Multi-Touch Table with Haptic Feedback for Musical Applications - page 8
    - Greg Shear and Matthew Wright: The Electromagnetically Sustained Rhodes Piano - page 14
    - Laurel Pardue, Christine Southworth, Andrew Boch, Matt Boch and Alex Rigopulos: Gamelan Elektrika: An Electronic Balinese Gamelan - page 18
    - Jeong-Seob Lee and Woon Seung Yeo: Sonicstrument: A Musical Interface with Stereotypical Acoustic Transducers - page 24

    Poster session B — Monday 30 May 13:30–14:30
    - Scott Smallwood: Solar Sound Arts: Creating Instruments and Devices Powered by Photovoltaic Technologies - page 28
    - Niklas Klügel, Marc René Frieß and Georg Groh: An Approach to Collaborative Music Composition - page 32
    - Nicolas Gold and Roger Dannenberg: A Reference Architecture and Score Representation for Popular Music Human-Computer Music Performance Systems - page 36
    - Mark Bokowiec: V'OCT (Ritual): An Interactive Vocal Work for Bodycoder System and 8 Channel Spatialization - page 40
    - Florent Berthaut, Haruhiro Katayose, Hironori Wakama, Naoyuki Totani and Yuichi Sato: First Person Shooters as Collaborative Multiprocess Instruments - page 44
    - Tilo Hähnel and Axel Berndt: Studying Interdependencies in Music Performance: An Interactive Tool - page 48
    - Sinan Bokesoy and Patrick Adler: 1city 1001vibrations: development of an interactive sound installation with robotic instrument performance - page 52
    - Tim Murray-Browne, Di Mainstone, Nick Bryan-Kinns and Mark D. Plumbley: The medium is the message: Composing instruments and performing mappings - page 56
    - Seunghun Kim, Luke Keunhyung Kim, Songhee Jeong and Woon Seung Yeo: Clothesline as a Metaphor for a Musical Interface - page 60
    - Pietro Polotti and Maurizio Goina: EGGS in action - page 64
    - Berit Janssen: A Reverberation Instrument Based on Perceptual Mapping - page 68
    - Lauren Hayes: Vibrotactile Feedback-Assisted Performance - page 72
    - Daichi Ando: Improving User-Interface of Interactive EC for Composition-Aid by means of Shopping Basket Procedure - page 76
    - Ryan McGee, Yuan-Yi Fan and Reza Ali: BioRhythm: a Biologically-inspired Audio-Visual Installation - page 80
    - Jon Pigott: Vibration, Volts and Sonic Art: A practice and theory of electromechanical sound - page 84
    - George Sioros and Carlos Guedes: Automatic Rhythmic Performance in Max/MSP: the kin.rhythmicator - page 88
    - Andre Goncalves: Towards a Voltage-Controlled Computer — Control and Interaction Beyond an Embedded System - page 92
    - Tae Hun Kim, Satoru Fukayama, Takuya Nishimoto and Shigeki Sagayama: Polyhymnia: An automatic piano performance system with statistical modeling of polyphonic expression and musical symbol interpretation - page 96
    - Juan Pablo Carrascal and Sergi Jorda: Multitouch Interface for Audio Mixing - page 100
    - Nate Derbinsky and Georg Essl: Cognitive Architecture in Mobile Music Interactions - page 104
    - Benjamin D. Smith and Guy E. Garnett: The Self-Supervising Machine - page 108
    - Aaron Albin, Sertan Senturk, Akito Van Troyer, Brian Blosser, Oliver Jan and Gil Weinberg: Beatscape, a mixed virtual-physical environment for musical ensembles - page 112
    - Marco Fabiani, Gaël Dubus and Roberto Bresin: MoodifierLive: Interactive and collaborative expressive music performance on mobile devices - page 116
    - Benjamin Schroeder, Marc Ainger and Richard Parent: A Physically Based Sound Space for Procedural Agents - page 120
    - Francisco Garcia, Leny Vinceslas, Esteban Maestre and Josep Tubau: Acquisition and study of blowing pressure profiles in recorder playing - page 124
    - Anders Friberg and Anna Källblad: Experiences from video-controlled sound installations - page 128
    - Nicolas d'Alessandro, Roberto Calderon and Stefanie Müller: ROOM#81 — Agent-Based Instrument for Experiencing Architectural and Vocal Cues - page 132

    Demo session C — Monday 30 May 13:30–14:30
    - Yasuo Kuhara and Daiki Kobayashi: Kinetic Particles Synthesizer Using Multi-Touch Screen Interface of Mobile Devices - page 136
    - Christopher Carlson, Eli Marschner and Hunter Mccurry: The Sound Flinger: A Haptic Spatializer - page 138
    - Ravi Kondapalli and Benzhen Sung: Daft Datum – an Interface for Producing Music Through Foot-Based Interaction - page 140
    - Charles Martin and Chi-Hsia Lai: Strike on Stage: a percussion and media performance - page 142

    Paper session D — Monday 30 May 14:30–15:30
    - Baptiste Caramiaux, Patrick Susini, Tommaso Bianco, Frédéric Bevilacqua, Olivier Houix, Norbert Schnell and Nicolas Misdariis: Gestural Embodiment of Environmental Sounds: an Experimental Study - page 144
    - Sebastian Mealla, Aleksander Valjamae, Mathieu Bosi and Sergi Jorda: Listening to Your Brain: Implicit Interaction in Collaborative Music Performances - page 149
    - Dan Newton and Mark Marshall: Examining How Musicians Create Augmented Musical Instruments - page 155

    Paper session E — Monday 30 May 16:00–17:00
    - Zachary Seldess and Toshiro Yamada: Tahakum: A Multi-Purpose Audio Control Framework - page 161
    - Dawen Liang, Guangyu Xia and Roger Dannenberg: A Framework for Coordination and Synchronization of Media - page 167
    - Edgar Berdahl and Wendy Ju: Satellite CCRMA: A Musical Interaction and Sound Synthesis Platform - page 173

    Paper session F — Tuesday 31 May 09:00–10:50
    - Nicholas J. Bryan and Ge Wang: Two Turntables and a Mobile Phone - page 179
    - Nick Kruge and Ge Wang: MadPad: A Crowdsourcing System for Audiovisual Sampling - page 185
    - Patrick O'Keefe and Georg Essl: The Visual in Mobile Music Performance - page 191
    - Ge Wang, Jieun Oh and Tom Lieber: Designing for the iPad: Magic Fiddle - page 197
    - Benjamin Knapp and Brennon Bortz: MobileMuse: Integral Music Control Goes Mobile - page 203
    - Stephen Beck, Chris Branton, Sharath Maddineni, Brygg Ullmer and Shantenu Jha: Tangible Performance Management of Grid-based Laptop Orchestras - page 207

    Poster session G — Tuesday 31 May 13:30–14:30
    - Smilen Dimitrov and Stefania Serafin: Audio Arduino — an ALSA (Advanced Linux Sound Architecture) audio driver for FTDI-based Arduinos - page 211
    - Seunghun Kim and Woon Seung Yeo: Musical control of a pipe based on acoustic resonance - page 217
    - Anne-Marie Hansen, Hans Jørgen Andersen and Pirkko Raudaskoski: Play Fluency in Music Improvisation Games for Novices - page 220
    - Izzi Ramkissoon: The Bass Sleeve: A Real-time Multimedia Gestural Controller for Augmented Electric Bass Performance - page 224
    - Ajay Kapur, Michael Darling, James Murphy, Jordan Hochenbaum, Dimitri Diakopoulos and Trimpin: The KarmetiK NotomotoN: A New Breed of Musical Robot for Teaching and Performance - page 228
    - Adrian Barenca Aliaga and Giuseppe Torre: The Manipuller: Strings Manipulation and Multi-Dimensional Force Sensing - page 232
    - Alain Crevoisier and Cécile Picard-Limpens: Mapping Objects with the Surface Editor - page 236
    - Jordan Hochenbaum and Ajay Kapur: Adding Z-Depth and Pressure Expressivity to Tangible Tabletop Surfaces - page 240
    - Andrew Milne, Anna Xambó, Robin Laney, David B. Sharp, Anthony Prechtl and Simon Holland: Hex Player — A Virtual Musical Controller - page 244
    - Carl Haakon Waadeland: Rhythm Performance from a Spectral Point of View - page 248
    - Josep M Comajuncosas, Enric Guaus, Alex Barrachina and John O'Connell: Nuvolet: 3D gesture-driven collaborative audio mosaicing - page 252
    - Erwin Schoonderwaldt and Alexander Refsum Jensenius: Effective and expressive movements in a French-Canadian fiddler's performance - page 256
    - Daniel Bisig, Jan Schacher and Martin Neukom: Flowspace – A Hybrid Ecosystem - page 260
    - Marc Sosnick and William Hsu: Implementing a Finite Difference-Based Real-time Sound Synthesizer using GPUs - page 264
    - Axel Tidemann: An Artificial Intelligence Architecture for Musical Expressiveness that Learns by Imitation - page 268
    - Luke Dahl, Jorge Herrera and Carr Wilkerson: TweetDreams: Making music with the audience and the world using real-time Twitter data - page 272
    - Lawrence Fyfe, Adam Tindale and Sheelagh Carpendale: JunctionBox: A Toolkit for Creating Multi-touch Sound Control Interfaces - page 276
    - Andrew Johnston: Beyond Evaluation: Linking Practice and Theory in New Musical Interface Design - page 280
    - Phillip Popp and Matthew Wright: Intuitive Real-Time Control of Spectral Model Synthesis - page 284
    - Pablo Molina, Martin Haro and Sergi Jordà: BeatJockey: A new tool for enhancing DJ skills - page 288
    - Jan Schacher and Angela Stoecklin: Traces – Body, Motion and Sound - page 292
    - Grace Leslie and Tim Mullen: MoodMixer: EEG-based Collaborative Sonification - page 296
    - Ståle A. Skogstad, Kristian Nymoen, Yago de Quay and Alexander Refsum Jensenius: OSC Implementation and Evaluation of the Xsens MVN suit - page 300
    - Lonce Wyse, Norikazu Mitani and Suranga Nanayakkara: The effect of visualizing audio targets in a musical listening and performance task - page 304
    - Adrian Freed, John MacCallum and Andrew Schmeder: Composability for Musical Gesture Signal Processing using new OSC-based Object and Functional Programming Extensions to Max/MSP - page 308
    - Kristian Nymoen, Ståle A. Skogstad and Alexander Refsum Jensenius: SoundSaber — A Motion Capture Instrument - page 312
    - Øyvind Brandtsegg, Sigurd Saue and Thom Johansen: A modulation matrix for complex parameter sets - page 316

    Demo session H — Tuesday 31 May 13:30–14:30
    - Yu-Chung Tseng, Che-Wei Liu, Tzu-Heng Chi and Hui-Yu Wang: Sound Low Fun - page 320
    - Edgar Berdahl and Chris Chafe: Autonomous New Media Artefacts (AutoNMA) - page 322
    - Min-Joon Yoo, Jin-Wook Beak and In-Kwon Lee: Creating Musical Expression using Kinect - page 324
    - Staas de Jong: Making grains tangible: microtouch for microsound - page 326
    - Baptiste Caramiaux, Frederic Bevilacqua and Norbert Schnell: Sound Selection by Gestures - page 329

    Paper session I — Tuesday 31 May 14:30–15:30
    - Hernán Kerlleñevich, Manuel Eguia and Pablo Riera: An Open Source Interface based on Biological Neural Networks for Interactive Music Performance - page 331
    - Nicholas Gillian, R. Benjamin Knapp and Sile O'Modhrain: Recognition Of Multivariate Temporal Musical Gestures Using N-Dimensional Dynamic Time Warping - page 337
    - Nicholas Gillian, R. Benjamin Knapp and Sile O'Modhrain: A Machine Learning Toolbox For Musician Computer Interaction - page 343

    Paper session J — Tuesday 31 May 16:00–17:00
    - Elena Jessop, Peter Torpey and Benjamin Bloomberg: Music and Technology in Death and the Powers - page 349
    - Victor Zappi, Dario Mazzanti, Andrea Brogni and Darwin Caldwell: Design and Evaluation of a Hybrid Reality Performance - page 355
    - Jérémie Garcia, Theophanis Tsandilas, Carlos Agon and Wendy Mackay: InkSplorer: Exploring Musical Ideas on Paper and Computer - page 361

    Paper session K — Wednesday 1 June 09:00–10:30
    - Pedro Lopes, Alfredo Ferreira and Joao Madeiras Pereira: Battle of the DJs: an HCI perspective of Traditional, Virtual, Hybrid and Multitouch DJing - page 367
    - Adnan Marquez-Borbon, Michael Gurevich, A. Cavan Fyans and Paul Stapleton: Designing Digital Musical Interactions in Experimental Contexts - page 373
    - Jonathan Reus: Crackle: A mobile multitouch topology for exploratory sound interaction - page 377
    - Samuel Aaron, Alan F. Blackwell, Richard Hoadley and Tim Regan: A principled approach to developing new languages for live coding - page 381
    - Jamie Bullock, Daniel Beattie and Jerome Turner: Integra Live: a new graphical user interface for live electronic music - page 387

    Paper session L — Wednesday 1 June 11:00–12:30
    - Jung-Sim Roh, Yotam Mann, Adrian Freed and David Wessel: Robust and Reliable Fabric, Piezoresistive Multitouch Sensing Surfaces for Musical Controllers - page 393
    - Mark Marshall and Marcelo Wanderley: Examining the Effects of Embedded Vibrotactile Feedback on the Feel of a Digital Musical Instrument - page 399
    - Dimitri Diakopoulos and Ajay Kapur: HIDUINO: A firmware for building driverless USB-MIDI devices using the Arduino microcontroller - page 405
    - Emmanuel Flety and Côme Maestracci: Latency improvement in sensor wireless transmission using IEEE 802.15.4 - page 409
    - Jeff Snyder: The Snyderphonics Manta, a Novel USB Touch Controller - page 413

    Poster session M — Wednesday 1 June 13:30–14:30
    - William Hsu: On Movement, Structure and Abstraction in Generative Audiovisual Improvisation - page 417
    - Claudia Robles Angel: Creating Interactive Multimedia Works with Bio-data - page 421
    - Paula Ustarroz: TresnaNet: musical generation based on network protocols - page 425
    - Matti Luhtala, Tiina Kymäläinen and Johan Plomp: Designing a Music Performance Space for Persons with Intellectual Learning Disabilities - page 429
    - Tom Ahola, Teemu Ahmaniemi, Koray Tahiroglu, Fabio Belloni and Ville Ranki: Raja — A Multidisciplinary Artistic Performance - page 433
    - Emmanuelle Gallin and Marc Sirguy: Eobody3: A ready-to-use pre-mapped & multi-protocol sensor interface - page 437
    - Rasmus Bååth, Thomas Strandberg and Christian Balkenius: Eye Tapping: How to Beat Out an Accurate Rhythm using Eye Movements - page 441
    - Eric Rosenbaum: MelodyMorph: A Reconfigurable Musical Instrument - page 445
    - Karmen Franinovic: Flo)(ps: Between Habitual and Explorative Action-Sound Relationships - page 448
    - Margaret Schedel, Rebecca Fiebrink and Phoenix Perry: Wekinating 000000Swan: Using Machine Learning to Create and Control Complex Artistic Systems - page 453
    - Carles F. Julià, Daniel Gallardo and Sergi Jordà: MTCF: A framework for designing and coding musical tabletop applications directly in Pure Data - page 457
    - David Pirrò and Gerhard Eckel: Physical modelling enabling enaction: an example - page 461
    - Thomas Mitchell and Imogen Heap: SoundGrasp: A Gestural Interface for the Performance of Live Music - page 465
    - Tim Mullen, Richard Warp and Adam Jansch: Minding the (Transatlantic) Gap: An Internet-Enabled Acoustic Brain-Computer Music Interface - page 469
    - Stefano Papetti, Marco Civolani and Federico Fontana: Rhythm'n'Shoes: a wearable foot tapping interface with audio-tactile feedback - page 473
    - Cumhur Erkut, Antti Jylhä and Reha Dişçioğlu: A structured design and evaluation model with application to rhythmic interaction displays - page 477
    - Marco Marchini, Panos Papiotis, Alfonso Perez and Esteban Maestre: A Hair Ribbon Deflection Model for Low-Intrusiveness Measurement of Bow Force in Violin Performance - page 481
    - Jonathan Forsyth, Aron Glennon and Juan Bello: Random Access Remixing on the iPad - page 487
    - Erika Donald, Ben Duinker and Eliot Britton: Designing the EP trio: Instrument identities, control and performance practice in an electronic chamber music ensemble - page 491
    - Cavan Fyans and Michael Gurevich: Perceptions of Skill in Performances with Acoustic and Electronic Instruments - page 495
    - Hiroki Nishino: Cognitive Issues in Computer Music Programming - page 499
    - Roland Lamb and Andrew Robertson: Seaboard: a new piano keyboard-related interface combining discrete and continuous control - page 503
    - Gilbert Beyer and Max Meier: Music Interfaces for Novice Users: Composing Music on a Public Display with Hand Gestures - page 507
    - Birgitta Cappelen and Anders-Petter Andersson: Expanding the role of the instrument - page 511
    - Todor Todoroff: Wireless Digital/Analog Sensors for Music and Dance Performances - page 515
    - Trond Engum: Real-time control and creative convolution — exchanging techniques between distinct genres - page 519
    - Andreas Bergsland: The Six Fantasies Machine – an instrument modelling phrases from Paul Lansky's Six Fantasies - page 523

    Demo session N — Wednesday 1 June 13:30–14:30
    - Jan Trützschler von Falkenstein: Gliss: An Intuitive Sequencer for the iPhone and iPad - page 527
    - Jiffer Harriman, Locky Casey, Linden Melvin and Mike Repper: Quadrofeelia — A New Instrument for Sliding into Notes - page 529
    - Johnty Wang, Nicolas D'Alessandro, Sidney Fels and Bob Pritchard: SQUEEZY: Extending a Multi-touch Screen with Force Sensing Objects for Controlling Articulatory Synthesis - page 531
    - Souhwan Choe and Kyogu Lee: SWAF: Towards a Web Application Framework for Composition and Documentation of Soundscape - page 533
    - Norbert Schnell, Frederic Bevilacqua, Nicolas Rasamimana, Julien Blois, Fabrice Guedy and Emmanuel Flety: Playing the "MO" — Gestural Control and Re-Embodiment of Recorded Sound and Music - page 535
    - Bruno Zamborlin, Marco Liuni and Giorgio Partesana: (LAND)MOVES - page 537
    - Bill Verplank and Francesco Georg: Can Haptics make New Music? — Fader and Plank Demos - page 53