This paper was presented at the Live Interfaces conference 2012. Copyright © 2012 The Authors.

This paper presents a gesturally controlled, live-improvisation
system, developed for an experimental pianist and used
during a performance at the 2011 International Conference
on New Interfaces for Musical Expression. We describe
the gesture-recognition architecture used to recognize
the pianist’s real-time gestures, the audio infrastructure
developed specifically for this piece, and the core lessons
learned during the development of this performance
system.