5 research outputs found
Synchronizing Sequencing Software to a Live Drummer
Copyright 2013 Massachusetts Institute of Technology. MIT allows authors to archive published versions of their articles after an embargo period. The article is available at
Musicians and Machines: Bridging the Semantic Gap In Live Performance
PhD
This thesis explores the automatic extraction of musical information from
live performances, with the intention of using that information to create
novel, responsive and adaptive performance tools for musicians.
We focus specifically on two forms of musical analysis: harmonic analysis
and beat tracking. We present two harmonic analysis algorithms: a novel
chroma vector analysis technique, and a chord recognition algorithm that
takes the resulting chroma vectors as its input. We also present a
real-time beat tracker, based upon an extension of state-of-the-art
non-causal models, that is computationally efficient and performs
strongly compared to other models. Furthermore, through a modular study
of several beat tracking algorithms, we attempt to establish methods to
improve beat tracking and apply these lessons to our model.
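A real-time (causal) beat tracker must commit to predictions of future beats as the audio arrives. As an illustrative sketch only (not the thesis model), the next beat time can be extrapolated from the median of recent inter-beat intervals; the function name and window size are assumptions for the example:

```python
import statistics

def predict_next_beat(beat_times, window=8):
    """Predict the next beat time from recent beat history.

    beat_times: ascending list of detected beat times in seconds.
    window: how many recent inter-beat intervals to consider.
    Returns the last beat time plus the median recent interval,
    or None if fewer than two beats have been detected.
    """
    if len(beat_times) < 2:
        return None
    recent = beat_times[-(window + 1):]
    intervals = [b - a for a, b in zip(recent, recent[1:])]
    # The median makes the estimate robust to a single mistimed beat.
    return beat_times[-1] + statistics.median(intervals)
```

Using the median rather than the mean means one late or early detection does not drag the tempo estimate off the underlying pulse.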
Building upon this work, we show that these analyses can be combined
to create a beat-synchronous musical representation, with harmonic information
segmented at the level of the beat. We present a number of ways
of calculating these representations and discuss their relative merits.
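One simple way to compute such a beat-synchronous representation, sketched here under the assumption of a frame-level 12-bin chroma matrix and detected beat frame indices (the function name is illustrative, not the thesis code), is to average the chroma frames falling between consecutive beats:

```python
import numpy as np

def beat_sync_chroma(chroma, beat_frames):
    """Average chroma frames within each beat interval.

    chroma: (12, n_frames) array, one 12-bin chroma vector per frame.
    beat_frames: increasing analysis-frame indices of detected beats.
    Returns (12, n_beats - 1): one averaged chroma vector per beat span.
    """
    segments = []
    for start, end in zip(beat_frames[:-1], beat_frames[1:]):
        # Collapse all frames between two beats into a single vector.
        segments.append(chroma[:, start:end].mean(axis=1))
    return np.stack(segments, axis=1)
```

Averaging is only one option; median or peak aggregation over the same beat spans gives alternative representations with different robustness trade-offs.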
We proceed by introducing a technique, which we call Performance
Following, for recognising repeated patterns in live musical performances.
By examining the real-time beat-synchronous musical representation, this
technique predicts future harmonic content in a performance without any
prior knowledge in the form of a score.
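The core idea can be sketched very simply (this is a naive illustration under assumed names, not the thesis algorithm): match the most recent beats of the beat-synchronous chroma against earlier points in the performance, and predict that what followed the best match will happen again:

```python
import numpy as np

def predict_next_chroma(history, context=4):
    """Predict the next beat's chroma vector from repeated patterns.

    history: (12, n_beats) beat-synchronous chroma matrix, oldest first.
    context: how many recent beats to use as the matching query.
    Returns the chroma vector that followed the best-matching earlier
    occurrence of the current context, or None with too little history.
    """
    n = history.shape[1]
    if n <= context:
        return None
    query = history[:, -context:]              # the most recent beats
    best_score, best_next = -np.inf, None
    for start in range(n - context):           # earlier candidate contexts
        window = history[:, start:start + context]
        score = float((window * query).sum())  # unnormalised similarity
        if score > best_score:
            best_score = score
            best_next = history[:, start + context]
    return best_next
```

A practical system would normalise the similarity, weight recent beats more heavily, and handle tempo variation, but the repetition-based prediction principle is the same.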
Finally, we present a number of potential applications for live performances
that incorporate the real-time musical analysis techniques outlined
previously. The applications presented include audio effects informed by
beat tracking, a technique for synchronising video to a live performance,
the use of harmonic information to control visual displays and an automatic
accompaniment system based upon our performance following
technique.
Rhythmic analysis for real-time audio effects
We outline a set of audio effects that use rhythmic analysis, in particular the extraction of beat and tempo information, to automatically synchronise temporal parameters to the input signal. We demonstrate that this analysis, known as beat tracking, can be used to create adaptive parameters that adjust themselves according to changes in the properties of the input signal. We present common audio effects such as delay, tremolo and auto-wah augmented in this fashion, and discuss their real-time implementation as Audio Unit plug-ins and objects for Max/MSP.
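For example, a tempo-synchronised delay maps the tracked tempo onto the delay-line time. A minimal sketch of that mapping (the function name, defaults and clamping choice are assumptions for illustration, not the plug-in code):

```python
def tempo_synced_delay_time(bpm, subdivision=1.0, max_delay=2.0):
    """Map a tracked tempo to a delay time in seconds.

    bpm: current tempo estimate from the beat tracker.
    subdivision: fraction of a beat (e.g. 0.5 for an eighth-note delay
                 against a quarter-note pulse).
    max_delay: cap in seconds so slow tempi never exceed the buffer.
    """
    beat_period = 60.0 / bpm  # seconds per beat
    return min(beat_period * subdivision, max_delay)
```

As the beat tracker updates its tempo estimate, re-evaluating this mapping keeps the echoes locked to the performer rather than to a fixed knob setting.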