
    AEMI: The Actuated Embedded Musical Instrument

    This dissertation combines acoustic and electronic musical creation, bringing together acoustic and digital instruments. Part I is an original composition for orchestra with a new instrument as soloist: a work for AEMI (the Actuated Embedded Musical Instrument) and orchestra entitled “Meditation on Solids, Liquids, and Gas.” The composition is a dialogue between the orchestra and the instrument, set as an exchange of ideas; some ideas lead to conflict, others to resolution. It also serves to feature some of the musical capabilities of the new instrument. Part II examines AEMI and its influences. Chapter 1 discusses existing instruments whose features influenced the development of AEMI: the Theremin, Manta, JD-1, Buchla controller, EVI and EWI, and Chameleon Guitar. While AEMI does not share the performance mechanics of the Theremin, EVI, or EWI, understanding the physicality issues of an instrument like the Theremin provided insights into creating a versatile instrument that can be easily learned yet allows for virtuosity. Ultimately, embedding expressivity, such as subtlety and nuance, into the instrument proved one of the most difficult aspects of creating an instrument and demanded the largest amount of work. Chapter 2 describes the aesthetics, technical aspects, difficulties, and musical abilities of the instrument. Although attempts to combine acoustic and electronic music are not novel, the incorporation of acoustically driven resonance in electronically embedded instruments is new. The electroacoustic nature of this instrument differs from that of most electronic instruments: the controller and user interface are electronically driven, and its speakers/acoustic drivers are embedded within the instrument. This discussion may provide insights to musicians, composers, and instrument makers seeking new avenues of musical expression.

    Affordance of vibrational excitation for music composition and performance

    Mechanical vibrations have typically been used in the performance domain within feedback systems to inform musicians of system states or as communication channels between performers. In this paper, we propose the additional taxonomic category of vibrational excitation of musical instruments for sound generation. To explore the variety of possibilities associated with this extended taxonomy, we present the Oktopus, a multi-purpose wireless system capable of motorised vibrational excitation. The system can receive up to eight inputs and generates vibrations as outputs through eight motors that can be positioned accordingly to produce a wide range of sounds from an excited instrument. We demonstrate the usefulness of the proposed system and extended taxonomy through the development and performance of Live Mechanics, a composition for piano and interactive electronics.
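
    To make the eight-input, eight-motor routing concrete, here is a minimal Python sketch. It is not the authors' code; `MotorChannel` and `route` are invented names, and the mapping (input amplitude to clamped motor intensity) is an illustrative assumption.

        # Illustrative sketch only (not the Oktopus implementation): route up
        # to eight input signals to eight motor intensities.
        from dataclasses import dataclass

        NUM_CHANNELS = 8  # the paper describes eight inputs driving eight motors

        @dataclass
        class MotorChannel:
            """One vibration motor, driven by a normalised intensity in [0, 1]."""
            motor_id: int
            intensity: float = 0.0

        def route(inputs: list[float], motors: list[MotorChannel]) -> None:
            """Map each input sample to a motor intensity, clamped to [0, 1]."""
            for sample, motor in zip(inputs, motors):
                motor.intensity = min(abs(sample), 1.0)

        motors = [MotorChannel(i) for i in range(NUM_CHANNELS)]
        route([0.2, -0.9, 0.5, 0.0, 1.3, 0.1, 0.7, -0.4], motors)
        for m in motors:
            print(f"motor {m.motor_id}: intensity {m.intensity:.2f}")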

    Examining The Effects Of Embedded Vibrotactile Feedback On The Feel Of A Digital Musical Instrument

    This paper deals with the effects of integrated vibrotactile feedback on the "feel" of a digital musical instrument (DMI). Building on previous work developing a DMI with integrated vibrotactile feedback actuators, we discuss how to produce instrument-like vibrations, compare these simulated vibrations with those produced by an acoustic instrument, and examine how the integration of this feedback affects performer ratings of the instrument. We found that integrated vibrotactile feedback resulted in an increase in performer engagement with the instrument, but in a reduction in the perceived control of the instrument. We discuss these results and their implications for the design of new digital musical instruments.
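
    For intuition about what producing "instrument-like vibrations" might involve, here is a speculative Python sketch, not the paper's implementation: it builds a simple actuator drive signal from a note's fundamental with a decaying envelope, the kind of signal an embedded actuator could render (tactile sensitivity is concentrated in the low hundreds of Hz, so the fundamental alone is a plausible starting point).

        # Speculative example (not the paper's method): a fundamental-only
        # drive signal with an exponential decay, approximating the body
        # vibration of a plucked or struck note.
        import math

        SAMPLE_RATE = 44_100

        def vibrotactile_signal(f0: float, duration: float) -> list[float]:
            """Sine at f0 with an exponential decay envelope."""
            n = int(SAMPLE_RATE * duration)
            return [
                math.exp(-3.0 * (i / n)) * math.sin(2 * math.pi * f0 * i / SAMPLE_RATE)
                for i in range(n)
            ]

        samples = vibrotactile_signal(f0=220.0, duration=0.5)
        print(f"{len(samples)} samples, peak {max(abs(s) for s in samples):.3f}")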

    Human-Computer interaction methodologies applied in the evaluation of haptic digital musical instruments

    Recent developments in interactive technologies have seen major changes in the manner in which artists, performers, and creative individuals interact with digital music technology, owing to the increasing variety of interactive technologies that are readily available today. Digital Musical Instruments (DMIs) present musicians with performance challenges that are unique to this form of computer music. One of the most significant deviations from conventional acoustic musical instruments is the level of physical feedback conveyed by the instrument to the user. Currently, new interfaces for musical expression are not designed to be as physically communicative as acoustic instruments. Specifically, DMIs are often devoid of haptic feedback and therefore lack the ability to impart important performance information to the user. Moreover, there is currently no standardised way to measure the effect of this lack of physical feedback. Best practice suggests that there should be a set of methods to effectively, repeatedly, and quantifiably evaluate the functionality, usability, and user experience of DMIs. Earlier theoretical and technological applications of haptics have tried to address device performance issues associated with the lack of feedback in DMI designs, and it has been argued that the level of haptic feedback presented to a user can significantly affect the user's overall emotive feeling towards a musical device. The outcomes of the investigations contained within this thesis are intended to inform new haptic interface design.

    Embodied interaction with guitars: instruments, embodied practices and ecologies

    In this thesis I investigate the embodied performance preparation practices of guitarists to design and develop tools to support them. To do so, I employ a series of human-centred design methodologies such as design ethnography, participatory design, and soma design. The initial ethnographic study I conducted involved observing guitarists preparing to perform individually and with their bands in their habitual places of practice. I also interviewed these musicians about their preparation activities. Findings of this study allowed me to chart an ecology of tools and resources employed in the process, as well as pinpoint a series of design opportunities for augmenting guitars, namely supporting (1) encumbered interactions, (2) contextual interactions, and (3) connected interactions. Going forward with the design process, I focused on remediating encumbered interactions that emerge during performance preparation with multimedia devices, particularly during instrumental transcription. I then prepared and ran a series of hands-on co-design workshops with guitarists to discuss five media controller prototypes, namely instrument-mounted controls, pedal-based controls, voice-based controls, gesture-based controls, and “music-based” controls. This study highlighted the value that guitarists give to their guitars and to their existing practice spaces, tools, and resources, as participants critically reflected on how each interaction modality would support or disturb their existing embodied preparation practices with the instrument. In parallel with this study, I had the opportunity to participate in a soma design workshop (and then prepare my own) in which I harnessed my first-person perspective of guitar playing to guide the design process. By exploring a series of embodied ideation and somatic methods, as well as materials and sensors across several points of contact between our bodies and the guitar, we collaboratively ideated a series of design concepts for the guitar across both workshops, such as breathing guitars, stretchy straps, and soft pedals. I then continued to develop and refine the Stretchy Strap concept into a guitar strap augmented with electronic textile stretch sensors, harnessing it as an embodied media controller to remediate encumbered interaction during musical transcription with guitar when using secondary multimedia resources. The device was subsequently evaluated by guitarists in a home practice space, providing insights into nuanced aspects of its embodied use, such as how certain media control actions like play and pause are better supported by the bodily gestures enacted with the strap, whilst other actions, like rewinding the playback or setting in and out points for a loop, are better supported by existing peripherals like keyboards and mice, as these activities do not necessarily happen in the flow of the embodied practice of musical transcription.
Reflecting on the overall design process, I extract a series of considerations for designing embodied interactions with guitars: (1) considering the instrument and its potential for augmentation, i.e., the shape of the guitar, its materials, and its cultural identity; (2) considering the embodied practices with the instrument, i.e., the body and the subjective felt experience of the guitarist during their skilled embodied practices with the instrument, and how these determine its expert use within a particular instrumental tradition and/or musical practice; and (3) considering the practice ecology of the guitarist, i.e., the tools, resources, and spaces they use in their practice.
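
As a hypothetical sketch of the kind of mapping the Stretchy Strap evaluation describes, the Python below (all names and thresholds are illustrative assumptions, not the thesis's code) turns a normalised stretch reading into a single play/pause toggle per gesture, with hysteresis so a held stretch does not retrigger:

        # Hypothetical mapping, not the thesis's implementation: a stretch
        # reading in [0, 1] toggles play/pause once per stretch gesture.
        def make_strap_handler(on_toggle, press: float = 0.7, release: float = 0.3):
            """Return a per-sample handler; hysteresis between `press` and
            `release` means one gesture fires exactly one toggle."""
            armed = True

            def handle(reading: float) -> None:
                nonlocal armed
                if armed and reading > press:          # strap stretched: fire once
                    on_toggle()
                    armed = False
                elif not armed and reading < release:  # strap relaxed: re-arm
                    armed = True

            return handle

        state = {"playing": False}

        def toggle() -> None:
            state["playing"] = not state["playing"]
            print("play" if state["playing"] else "pause")

        handler = make_strap_handler(toggle)
        for reading in [0.1, 0.4, 0.8, 0.9, 0.5, 0.2, 0.85, 0.1]:
            handler(reading)  # prints "play", then "pause"

The two-threshold design mirrors how debouncing is commonly handled for continuous sensors: a single threshold would retrigger on noise around the boundary, which would be unusable mid-performance.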

    Proceedings of the International Conference on New Interfaces for Musical Expression

    Editors: Alexander Refsum Jensenius, Anders Tveit, Rolf Inge Godøy, Dan Overholt

    Table of Contents

    - Tellef Kvifte: Keynote Lecture 1: Musical Instrument User Interfaces: the Digital Background of the Analog Revolution - page 1
    - David Rokeby: Keynote Lecture 2: Adventures in Phy-gital Space - page 2
    - Sergi Jordà: Keynote Lecture 3: Digital Lutherie and Multithreaded Musical Performance: Artistic, Scientific and Commercial Perspectives - page 3

    Paper session A — Monday 30 May 11:00–12:30
    - Dan Overholt: The Overtone Fiddle: an Actuated Acoustic Instrument - page 4
    - Colby Leider, Matthew Montag, Stefan Sullivan and Scott Dickey: A Low-Cost, Low-Latency Multi-Touch Table with Haptic Feedback for Musical Applications - page 8
    - Greg Shear and Matthew Wright: The Electromagnetically Sustained Rhodes Piano - page 14
    - Laurel Pardue, Christine Southworth, Andrew Boch, Matt Boch and Alex Rigopulos: Gamelan Elektrika: An Electronic Balinese Gamelan - page 18
    - Jeong-Seob Lee and Woon Seung Yeo: Sonicstrument: A Musical Interface with Stereotypical Acoustic Transducers - page 24

    Poster session B — Monday 30 May 13:30–14:30
    - Scott Smallwood: Solar Sound Arts: Creating Instruments and Devices Powered by Photovoltaic Technologies - page 28
    - Niklas Klügel, Marc René Frieß and Georg Groh: An Approach to Collaborative Music Composition - page 32
    - Nicolas Gold and Roger Dannenberg: A Reference Architecture and Score Representation for Popular Music Human-Computer Music Performance Systems - page 36
    - Mark Bokowiec: V’OCT (Ritual): An Interactive Vocal Work for Bodycoder System and 8 Channel Spatialization - page 40
    - Florent Berthaut, Haruhiro Katayose, Hironori Wakama, Naoyuki Totani and Yuichi Sato: First Person Shooters as Collaborative Multiprocess Instruments - page 44
    - Tilo Hähnel and Axel Berndt: Studying Interdependencies in Music Performance: An Interactive Tool - page 48
    - Sinan Bokesoy and Patrick Adler: 1city 1001vibrations: development of a interactive sound installation with robotic instrument performance - page 52
    - Tim Murray-Browne, Di Mainstone, Nick Bryan-Kinns and Mark D. Plumbley: The medium is the message: Composing instruments and performing mappings - page 56
    - Seunghun Kim, Luke Keunhyung Kim, Songhee Jeong and Woon Seung Yeo: Clothesline as a Metaphor for a Musical Interface - page 60
    - Pietro Polotti and Maurizio Goina: EGGS in action - page 64
    - Berit Janssen: A Reverberation Instrument Based on Perceptual Mapping - page 68
    - Lauren Hayes: Vibrotactile Feedback-Assisted Performance - page 72
    - Daichi Ando: Improving User-Interface of Interactive EC for Composition-Aid by means of Shopping Basket Procedure - page 76
    - Ryan McGee, Yuan-Yi Fan and Reza Ali: BioRhythm: a Biologically-inspired Audio-Visual Installation - page 80
    - Jon Pigott: Vibration, Volts and Sonic Art: A practice and theory of electromechanical sound - page 84
    - George Sioros and Carlos Guedes: Automatic Rhythmic Performance in Max/MSP: the kin.rhythmicator - page 88
    - Andre Goncalves: Towards a Voltage-Controlled Computer — Control and Interaction Beyond an Embedded System - page 92
    - Tae Hun Kim, Satoru Fukayama, Takuya Nishimoto and Shigeki Sagayama: Polyhymnia: An automatic piano performance system with statistical modeling of polyphonic expression and musical symbol interpretation - page 96
    - Juan Pablo Carrascal and Sergi Jordà: Multitouch Interface for Audio Mixing - page 100
    - Nate Derbinsky and Georg Essl: Cognitive Architecture in Mobile Music Interactions - page 104
    - Benjamin D. Smith and Guy E. Garnett: The Self-Supervising Machine - page 108
    - Aaron Albin, Sertan Senturk, Akito Van Troyer, Brian Blosser, Oliver Jan and Gil Weinberg: Beatscape, a mixed virtual-physical environment for musical ensembles - page 112
    - Marco Fabiani, Gaël Dubus and Roberto Bresin: MoodifierLive: Interactive and collaborative expressive music performance on mobile devices - page 116
    - Benjamin Schroeder, Marc Ainger and Richard Parent: A Physically Based Sound Space for Procedural Agents - page 120
    - Francisco Garcia, Leny Vinceslas, Esteban Maestre and Josep Tubau: Acquisition and study of blowing pressure profiles in recorder playing - page 124
    - Anders Friberg and Anna Källblad: Experiences from video-controlled sound installations - page 128
    - Nicolas d’Alessandro, Roberto Calderon and Stefanie Müller: ROOM#81 — Agent-Based Instrument for Experiencing Architectural and Vocal Cues - page 132

    Demo session C — Monday 30 May 13:30–14:30
    - Yasuo Kuhara and Daiki Kobayashi: Kinetic Particles Synthesizer Using Multi-Touch Screen Interface of Mobile Devices - page 136
    - Christopher Carlson, Eli Marschner and Hunter Mccurry: The Sound Flinger: A Haptic Spatializer - page 138
    - Ravi Kondapalli and Benzhen Sung: Daft Datum – an Interface for Producing Music Through Foot-Based Interaction - page 140
    - Charles Martin and Chi-Hsia Lai: Strike on Stage: a percussion and media performance - page 142

    Paper session D — Monday 30 May 14:30–15:30
    - Baptiste Caramiaux, Patrick Susini, Tommaso Bianco, Frédéric Bevilacqua, Olivier Houix, Norbert Schnell and Nicolas Misdariis: Gestural Embodiment of Environmental Sounds: an Experimental Study - page 144
    - Sebastian Mealla, Aleksander Valjamae, Mathieu Bosi and Sergi Jordà: Listening to Your Brain: Implicit Interaction in Collaborative Music Performances - page 149
    - Dan Newton and Mark Marshall: Examining How Musicians Create Augmented Musical Instruments - page 155

    Paper session E — Monday 30 May 16:00–17:00
    - Zachary Seldess and Toshiro Yamada: Tahakum: A Multi-Purpose Audio Control Framework - page 161
    - Dawen Liang, Guangyu Xia and Roger Dannenberg: A Framework for Coordination and Synchronization of Media - page 167
    - Edgar Berdahl and Wendy Ju: Satellite CCRMA: A Musical Interaction and Sound Synthesis Platform - page 173

    Paper session F — Tuesday 31 May 09:00–10:50
    - Nicholas J. Bryan and Ge Wang: Two Turntables and a Mobile Phone - page 179
    - Nick Kruge and Ge Wang: MadPad: A Crowdsourcing System for Audiovisual Sampling - page 185
    - Patrick O’Keefe and Georg Essl: The Visual in Mobile Music Performance - page 191
    - Ge Wang, Jieun Oh and Tom Lieber: Designing for the iPad: Magic Fiddle - page 197
    - Benjamin Knapp and Brennon Bortz: MobileMuse: Integral Music Control Goes Mobile - page 203
    - Stephen Beck, Chris Branton, Sharath Maddineni, Brygg Ullmer and Shantenu Jha: Tangible Performance Management of Grid-based Laptop Orchestras - page 207

    Poster session G — Tuesday 31 May 13:30–14:30
    - Smilen Dimitrov and Stefania Serafin: Audio Arduino — an ALSA (Advanced Linux Sound Architecture) audio driver for FTDI-based Arduinos - page 211
    - Seunghun Kim and Woon Seung Yeo: Musical control of a pipe based on acoustic resonance - page 217
    - Anne-Marie Hansen, Hans Jørgen Andersen and Pirkko Raudaskoski: Play Fluency in Music Improvisation Games for Novices - page 220
    - Izzi Ramkissoon: The Bass Sleeve: A Real-time Multimedia Gestural Controller for Augmented Electric Bass Performance - page 224
    - Ajay Kapur, Michael Darling, James Murphy, Jordan Hochenbaum, Dimitri Diakopoulos and Trimpin: The KarmetiK NotomotoN: A New Breed of Musical Robot for Teaching and Performance - page 228
    - Adrian Barenca Aliaga and Giuseppe Torre: The Manipuller: Strings Manipulation and Multi-Dimensional Force Sensing - page 232
    - Alain Crevoisier and Cécile Picard-Limpens: Mapping Objects with the Surface Editor - page 236
    - Jordan Hochenbaum and Ajay Kapur: Adding Z-Depth and Pressure Expressivity to Tangible Tabletop Surfaces - page 240
    - Andrew Milne, Anna Xambó, Robin Laney, David B. Sharp, Anthony Prechtl and Simon Holland: Hex Player — A Virtual Musical Controller - page 244
    - Carl Haakon Waadeland: Rhythm Performance from a Spectral Point of View - page 248
    - Josep M Comajuncosas, Enric Guaus, Alex Barrachina and John O’Connell: Nuvolet: 3D gesture-driven collaborative audio mosaicing - page 252
    - Erwin Schoonderwaldt and Alexander Refsum Jensenius: Effective and expressive movements in a French-Canadian fiddler’s performance - page 256
    - Daniel Bisig, Jan Schacher and Martin Neukom: Flowspace – A Hybrid Ecosystem - page 260
    - Marc Sosnick and William Hsu: Implementing a Finite Difference-Based Real-time Sound Synthesizer using GPUs - page 264
    - Axel Tidemann: An Artificial Intelligence Architecture for Musical Expressiveness that Learns by Imitation - page 268
    - Luke Dahl, Jorge Herrera and Carr Wilkerson: TweetDreams: Making music with the audience and the world using real-time Twitter data - page 272
    - Lawrence Fyfe, Adam Tindale and Sheelagh Carpendale: JunctionBox: A Toolkit for Creating Multi-touch Sound Control Interfaces - page 276
    - Andrew Johnston: Beyond Evaluation: Linking Practice and Theory in New Musical Interface Design - page 280
    - Phillip Popp and Matthew Wright: Intuitive Real-Time Control of Spectral Model Synthesis - page 284
    - Pablo Molina, Martin Haro and Sergi Jordà: BeatJockey: A new tool for enhancing DJ skills - page 288
    - Jan Schacher and Angela Stoecklin: Traces – Body, Motion and Sound - page 292
    - Grace Leslie and Tim Mullen: MoodMixer: EEG-based Collaborative Sonification - page 296
    - Ståle A. Skogstad, Kristian Nymoen, Yago de Quay and Alexander Refsum Jensenius: OSC Implementation and Evaluation of the Xsens MVN suit - page 300
    - Lonce Wyse, Norikazu Mitani and Suranga Nanayakkara: The effect of visualizing audio targets in a musical listening and performance task - page 304
    - Adrian Freed, John MacCallum and Andrew Schmeder: Composability for Musical Gesture Signal Processing using new OSC-based Object and Functional Programming Extensions to Max/MSP - page 308
    - Kristian Nymoen, Ståle A. Skogstad and Alexander Refsum Jensenius: SoundSaber — A Motion Capture Instrument - page 312
    - Øyvind Brandtsegg, Sigurd Saue and Thom Johansen: A modulation matrix for complex parameter sets - page 316

    Demo session H — Tuesday 31 May 13:30–14:30
    - Yu-Chung Tseng, Che-Wei Liu, Tzu-Heng Chi and Hui-Yu Wang: Sound Low Fun - page 320
    - Edgar Berdahl and Chris Chafe: Autonomous New Media Artefacts (AutoNMA) - page 322
    - Min-Joon Yoo, Jin-Wook Beak and In-Kwon Lee: Creating Musical Expression using Kinect - page 324
    - Staas de Jong: Making grains tangible: microtouch for microsound - page 326
    - Baptiste Caramiaux, Frederic Bevilacqua and Norbert Schnell: Sound Selection by Gestures - page 329

    Paper session I — Tuesday 31 May 14:30–15:30
    - Hernán Kerlleñevich, Manuel Eguia and Pablo Riera: An Open Source Interface based on Biological Neural Networks for Interactive Music Performance - page 331
    - Nicholas Gillian, R. Benjamin Knapp and Sile O’Modhrain: Recognition Of Multivariate Temporal Musical Gestures Using N-Dimensional Dynamic Time Warping - page 337
    - Nicholas Gillian, R. Benjamin Knapp and Sile O’Modhrain: A Machine Learning Toolbox For Musician Computer Interaction - page 343

    Paper session J — Tuesday 31 May 16:00–17:00
    - Elena Jessop, Peter Torpey and Benjamin Bloomberg: Music and Technology in Death and the Powers - page 349
    - Victor Zappi, Dario Mazzanti, Andrea Brogni and Darwin Caldwell: Design and Evaluation of a Hybrid Reality Performance - page 355
    - Jérémie Garcia, Theophanis Tsandilas, Carlos Agon and Wendy Mackay: InkSplorer: Exploring Musical Ideas on Paper and Computer - page 361

    Paper session K — Wednesday 1 June 09:00–10:30
    - Pedro Lopes, Alfredo Ferreira and Joao Madeiras Pereira: Battle of the DJs: an HCI perspective of Traditional, Virtual, Hybrid and Multitouch DJing - page 367
    - Adnan Marquez-Borbon, Michael Gurevich, A. Cavan Fyans and Paul Stapleton: Designing Digital Musical Interactions in Experimental Contexts - page 373
    - Jonathan Reus: Crackle: A mobile multitouch topology for exploratory sound interaction - page 377
    - Samuel Aaron, Alan F. Blackwell, Richard Hoadley and Tim Regan: A principled approach to developing new languages for live coding - page 381
    - Jamie Bullock, Daniel Beattie and Jerome Turner: Integra Live: a new graphical user interface for live electronic music - page 387

    Paper session L — Wednesday 1 June 11:00–12:30
    - Jung-Sim Roh, Yotam Mann, Adrian Freed and David Wessel: Robust and Reliable Fabric, Piezoresistive Multitouch Sensing Surfaces for Musical Controllers - page 393
    - Mark Marshall and Marcelo Wanderley: Examining the Effects of Embedded Vibrotactile Feedback on the Feel of a Digital Musical Instrument - page 399
    - Dimitri Diakopoulos and Ajay Kapur: HIDUINO: A firmware for building driverless USB-MIDI devices using the Arduino microcontroller - page 405
    - Emmanuel Flety and Côme Maestracci: Latency improvement in sensor wireless transmission using IEEE 802.15.4 - page 409
    - Jeff Snyder: The Snyderphonics Manta, a Novel USB Touch Controller - page 413

    Poster session M — Wednesday 1 June 13:30–14:30
    - William Hsu: On Movement, Structure and Abstraction in Generative Audiovisual Improvisation - page 417
    - Claudia Robles Angel: Creating Interactive Multimedia Works with Bio-data - page 421
    - Paula Ustarroz: TresnaNet: musical generation based on network protocols - page 425
    - Matti Luhtala, Tiina Kymäläinen and Johan Plomp: Designing a Music Performance Space for Persons with Intellectual Learning Disabilities - page 429
    - Tom Ahola, Teemu Ahmaniemi, Koray Tahiroglu, Fabio Belloni and Ville Ranki: Raja — A Multidisciplinary Artistic Performance - page 433
    - Emmanuelle Gallin and Marc Sirguy: Eobody3: A ready-to-use pre-mapped & multi-protocol sensor interface - page 437
    - Rasmus Bååth, Thomas Strandberg and Christian Balkenius: Eye Tapping: How to Beat Out an Accurate Rhythm using Eye Movements - page 441
    - Eric Rosenbaum: MelodyMorph: A Reconfigurable Musical Instrument - page 445
    - Karmen Franinovic: Flo)(ps: Between Habitual and Explorative Action-Sound Relationships - page 448
    - Margaret Schedel, Rebecca Fiebrink and Phoenix Perry: Wekinating 000000Swan: Using Machine Learning to Create and Control Complex Artistic Systems - page 453
    - Carles F. Julià, Daniel Gallardo and Sergi Jordà: MTCF: A framework for designing and coding musical tabletop applications directly in Pure Data - page 457
    - David Pirrò and Gerhard Eckel: Physical modelling enabling enaction: an example - page 461
    - Thomas Mitchell and Imogen Heap: SoundGrasp: A Gestural Interface for the Performance of Live Music - page 465
    - Tim Mullen, Richard Warp and Adam Jansch: Minding the (Transatlantic) Gap: An Internet-Enabled Acoustic Brain-Computer Music Interface - page 469
    - Stefano Papetti, Marco Civolani and Federico Fontana: Rhythm’n’Shoes: a wearable foot tapping interface with audio-tactile feedback - page 473
    - Cumhur Erkut, Antti Jylhä and Reha Dişçioğlu: A structured design and evaluation model with application to rhythmic interaction displays - page 477
    - Marco Marchini, Panos Papiotis, Alfonso Perez and Esteban Maestre: A Hair Ribbon Deflection Model for Low-Intrusiveness Measurement of Bow Force in Violin Performance - page 481
    - Jonathan Forsyth, Aron Glennon and Juan Bello: Random Access Remixing on the iPad - page 487
    - Erika Donald, Ben Duinker and Eliot Britton: Designing the EP trio: Instrument identities, control and performance practice in an electronic chamber music ensemble - page 491
    - Cavan Fyans and Michael Gurevich: Perceptions of Skill in Performances with Acoustic and Electronic Instruments - page 495
    - Hiroki Nishino: Cognitive Issues in Computer Music Programming - page 499
    - Roland Lamb and Andrew Robertson: Seaboard: a new piano keyboard-related interface combining discrete and continuous control - page 503
    - Gilbert Beyer and Max Meier: Music Interfaces for Novice Users: Composing Music on a Public Display with Hand Gestures - page 507
    - Birgitta Cappelen and Anders-Petter Andersson: Expanding the role of the instrument - page 511
    - Todor Todoroff: Wireless Digital/Analog Sensors for Music and Dance Performances - page 515
    - Trond Engum: Real-time control and creative convolution — exchanging techniques between distinct genres - page 519
    - Andreas Bergsland: The Six Fantasies Machine – an instrument modelling phrases from Paul Lansky’s Six Fantasies - page 523

    Demo session N — Wednesday 1 June 13:30–14:30
    - Jan Trützschler von Falkenstein: Gliss: An Intuitive Sequencer for the iPhone and iPad - page 527
    - Jiffer Harriman, Locky Casey, Linden Melvin and Mike Repper: Quadrofeelia — A New Instrument for Sliding into Notes - page 529
    - Johnty Wang, Nicolas D’Alessandro, Sidney Fels and Bob Pritchard: SQUEEZY: Extending a Multi-touch Screen with Force Sensing Objects for Controlling Articulatory Synthesis - page 531
    - Souhwan Choe and Kyogu Lee: SWAF: Towards a Web Application Framework for Composition and Documentation of Soundscape - page 533
    - Norbert Schnell, Frederic Bevilacqua, Nicolas Rasamimana, Julien Blois, Fabrice Guedy and Emmanuel Flety: Playing the "MO" — Gestural Control and Re-Embodiment of Recorded Sound and Music - page 535
    - Bruno Zamborlin, Marco Liuni and Giorgio Partesana: (LAND)MOVES - page 537
    - Bill Verplank and Francesco Georg: Can Haptics make New Music? — Fader and Plank Demos - page 53