Improvising Linguistic Style: Social and Affective Bases for Agent Personality
This paper introduces Linguistic Style Improvisation, a theory and set of
algorithms for improvisation of spoken utterances by artificial agents, with
applications to interactive story and dialogue systems. We argue that
linguistic style is a key aspect of character, and show how speech act
representations common in AI can provide abstract representations from which
computer characters can improvise. We show that the mechanisms proposed
introduce the possibility of socially oriented agents, meet the requirements
that lifelike characters be believable, and satisfy particular criteria for
improvisation proposed by Hayes-Roth. Comment: 10 pages, uses aaai.sty, lingmacros.sty, psfig.sty
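The core idea of the abstract, realising one abstract speech act in different linguistic styles, can be sketched minimally. This is an illustration only, not the paper's actual algorithms: the speech acts, style labels, and surface forms below are invented for the example.

```python
# Hypothetical lookup table: (speech act, style) -> surface utterance.
# A real system would generate these forms rather than enumerate them.
STYLES = {
    ("request", "deferential"): "Would you mind opening the door, please?",
    ("request", "brusque"):     "Open the door.",
    ("inform", "deferential"):  "I believe the door is open, if that helps.",
    ("inform", "brusque"):      "Door's open.",
}

def realise(speech_act, style):
    """Map an abstract speech act plus a style parameter to an utterance."""
    return STYLES[(speech_act, style)]

# The same abstract act surfaces differently depending on character style.
utterance = realise("request", "brusque")
```

The point of the abstraction is that a character's dialogue content (the speech act) and its personality (the style parameter) vary independently.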
Computer improvisation of jazz solos
This thesis discusses the possibilities of using a computer to create jazz solos. Various implementations using stochastic and rule-based approaches were created and applied to analyze the original melody as well as the chord progressions. Based on the melodies generated, a rule-based approach that considered the original melody and the chord progression was found to produce the most musical results.
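The thesis's actual rules are not reproduced in the abstract, so the following is only a minimal sketch of the general idea of a chord-aware, rule-based generator: note choices are conditioned on the current chord, with chord tones preferred on strong beats.

```python
import random

def generate_solo(progression, beats_per_chord=4, seed=0):
    """Generate one pitch per beat, preferring chord tones on strong beats.

    progression: list of chords, each a set of pitch classes (0-11).
    """
    rng = random.Random(seed)
    scale = list(range(60, 72))  # one octave of MIDI pitches from middle C
    solo = []
    for chord in progression:
        chord_tones = [p for p in scale if p % 12 in chord]
        for beat in range(beats_per_chord):
            if beat % 2 == 0:                 # strong beat: stay on the chord
                solo.append(rng.choice(chord_tones))
            else:                             # weak beat: any scale tone
                solo.append(rng.choice(scale))
    return solo

# A ii-V-I in C major as pitch-class sets (Dm7, G7, Cmaj7)
solo = generate_solo([{2, 5, 9, 0}, {7, 11, 2, 5}, {0, 4, 7, 11}])
```

A stochastic baseline would drop the chord-tone rule entirely; the thesis's finding is that adding such harmonic constraints produces more musical results.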
Piano_prosthesis
This is one of a developing series of duos for a human and a machine performer. Both “musicians” adapt to each other through mutual listening (i.e., via audio only) and response as the performance develops. The human's improvisation is encoded by the computer through statistical analysis of extracted features and by cataloguing these in real time. Each observation made by the computer is assigned to a set of musical output behaviors. Recurring features of the player's improvisation can then be recognized by the computer. The machine “expresses” this recognition by developing, and modifying, its own musical output, just as another player might.
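The catalogue-and-recognise loop described above can be sketched as follows. The piece's real feature set and behaviour mapping are not described in the abstract, so the quantisation scheme, threshold, and behaviour names here are all assumptions for illustration.

```python
from collections import Counter

class ImprovisationListener:
    """Catalogue quantised audio-feature observations and flag recurring ones."""

    def __init__(self, threshold=3):
        self.catalogue = Counter()   # running tally of quantised observations
        self.threshold = threshold   # recurrences needed to count as "recognised"

    def observe(self, feature_vector):
        """Quantise one feature vector; return a behaviour label and its key."""
        key = tuple(round(x, 1) for x in feature_vector)
        self.catalogue[key] += 1
        if self.catalogue[key] >= self.threshold:
            return ("develop", key)  # recurring material: respond and develop it
        return ("listen", key)       # new material: keep listening

listener = ImprovisationListener(threshold=2)
listener.observe([0.52, 1.08])               # first hearing: just listen
action, _ = listener.observe([0.49, 1.11])   # quantises to the same key: recognised
```

Quantising before counting is what lets slightly varied repetitions of the same gesture register as the same recurring feature.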
Modeling Joint Improvisation between Human and Virtual Players in the Mirror Game
Joint improvisation is observed to emerge spontaneously among humans
performing joint action tasks, and has been associated with high levels of
movement synchrony and enhanced sense of social bonding. Exploring the
underlying cognitive and neural mechanisms behind the emergence of joint
improvisation is an open research challenge. This paper investigates the
emergence of jointly improvised movements between two participants in the
mirror game, a paradigmatic joint task example. A theoretical model based on
observations and analysis of experimental data is proposed to capture the main
features of their interaction. A set of experiments is carried out to test and
validate the model's ability to reproduce the experimental observations. Then,
the model is used to drive a computer avatar able to improvise joint motion
with a human participant in real time. Finally, a convergence analysis of the
proposed model is carried out to confirm its ability to reproduce the emergence
of joint movement between the participants.
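A virtual player of the kind described needs, at minimum, dynamics that couple its own motion to the observed motion of the human partner. The sketch below is a generic first-order coupling with Euler integration; the paper's actual model equations are not given in the abstract, so the coupling form and gain are assumptions for illustration only.

```python
def simulate(human_velocity, steps=200, dt=0.01, coupling=5.0):
    """Virtual player's velocity relaxes toward the human's observed velocity.

    human_velocity: function of time t returning the human's velocity.
    Returns the virtual player's velocity trace.
    """
    v = 0.0
    trace = []
    for k in range(steps):
        target = human_velocity(k * dt)    # observe the human's motion
        v += dt * coupling * (target - v)  # first-order velocity coupling
        trace.append(v)
    return trace

# A human moving at constant velocity: the virtual player converges to it,
# a crude analogue of the movement synchrony observed in the mirror game.
trace = simulate(lambda t: 1.0)
```

The convergence analysis mentioned in the abstract would, in a model of this shape, amount to showing that the velocity error contracts over time for the chosen coupling.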
Instruments and Sounds as Objects of Improvisation in Collective Computer Music Practice
This paper presents the authors' first attempt at a new (and unexpected) exercise: that of observing, contextualising and problematising their own collective Computer Music experiences. After two years practising emergent collective improvisation in private and public settings, which has led the authors to fundamentally reconsider both individual and collective musical creation, came the desire to methodologically deconstruct this process, one that they never anticipated and, until now, had never formalised. By starting from the very notions of performance and improvisation in the context of Computer Music, and crossing the prolific literature on these topics with humble observations from their own experience, the authors then elaborate on what appears to them as the most enticing perspective of this creative context: the systematic improvisation of both their tools and sounds in a unique flow.
Lindenmayer systems and the harmony of fractals
An interactive musical application is developed for real-time improvisation with a machine based on Lindenmayer systems. This has been used in an installation whose goal is to draw the attention of inexperienced users to the wealth of real-time applications in computer music. Issues of human-computer interaction and improvisation grammars had to be dealt with, as well as probabilistic strategies for musical variation. The choice of L-systems as a basis for machine composition is a consequence of their ability to create results that easily have aesthetic appeal, both in the realms of sound and image.
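An L-system generates material by rewriting every symbol of a string in parallel each generation, and a musical mapping then interprets the resulting string. The installation's actual grammar and sound mapping are not specified in the abstract, so the rules and note mapping below are illustrative only.

```python
def expand(axiom, rules, generations):
    """Apply context-free rewrite rules in parallel for n generations."""
    s = axiom
    for _ in range(generations):
        s = "".join(rules.get(c, c) for c in s)  # unknown symbols pass through
    return s

def to_pitches(s, base=60):
    """Interpret the string musically: A/B emit notes, +/- shift the register."""
    pitch, out = base, []
    for c in s:
        if c == "+":
            pitch += 2          # shift register up a whole tone
        elif c == "-":
            pitch -= 2          # shift register down a whole tone
        else:
            out.append(pitch + {"A": 0, "B": 4}[c])  # emit a MIDI pitch
    return out

# Three generations of a toy grammar, rendered as MIDI pitches.
melody = to_pitches(expand("A", {"A": "AB+", "B": "-A"}, 3))
```

Self-similarity in the rewritten string is what gives the resulting melodies their fractal character; a probabilistic variant, as the abstract suggests, would choose among several right-hand sides per symbol.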
MindMusic: Brain-Controlled Musical Improvisation
MindMusic explores a new form of creative expression through brain-controlled musical improvisation. Using EEG technology and a musical improviser system, Impro-Visor (Keller, 2018), MindMusic engages users in musical improvisation sessions controlled with their brainwaves. Brain-controlled musical improvisation offers a unique blend of mindfulness meditation, EEG biofeedback, and real-time music generation, and stands to assist with stress reduction and widen access to musical creativity.
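MindMusic's actual EEG-to-music mapping is not described in the abstract, so the following sketches one plausible scheme only: relative alpha-band power, a common relaxation index in EEG biofeedback, steering how densely the improviser plays. The function name and parameter ranges are assumptions.

```python
def alpha_to_density(alpha_power, total_power, min_notes=1, max_notes=8):
    """Map relative alpha-band power to notes per bar.

    Higher relative alpha (more relaxed) -> sparser improvisation.
    """
    relative_alpha = alpha_power / total_power if total_power > 0 else 0.0
    span = max_notes - min_notes
    return round(max_notes - span * relative_alpha)

# A moderately relaxed reading: 60% of total band power in the alpha band.
notes_per_bar = alpha_to_density(alpha_power=6.0, total_power=10.0)
```

Any continuous improviser parameter (tempo, register, harmonic tension) could be driven the same way from other band powers.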
Seeking out the spaces between: Using improvisation for collaborative composition and interactive technology
Copyright © 2010 ISAST. This article presents findings from experiments into piano performance with live electronics undertaken by the author since early 2007. The use of improvisation has infused every step of the process, both as a methodology to obtain meaningful results using interactive technology and as a way to generate and characterize a collaborative musical space with composers. The technology used has included pre-built MIDI interfaces such as the PianoBar, actuators such as miniature DC motors, and sensor interfaces including the iCube and the Wii controller. Collaborators have included researchers at the Centre for Digital Music (QMUL), Richard Barrett, Pierre Alexandre Tremblay and Atau Tanaka. In seeking to create responsive “performance environments” at the piano, I explore live, performative control of electronics to create better connections for both performer (providing the same level of interpretive freedom as with a “pure” instrumental performance) and audience (communicating clearly to them). I have been lucky to witness first-hand many live interactive performances and to work with various empathetic composers/performers in flexible working environments. Collaborating with experienced technologists and musicians, I have witnessed time and again what, for me, is a fundamental truth in interactive instrumental performance: as a living, spontaneous form it must be nurtured and informed by the performer's physicality and imagination as much as by the creativity or knowledge of the composer and/or technologist. Specifically in the case of sensors, their dependence on the detail of each person's body and reactions is so refined as to necessitate, I would argue, an entirely collaborative approach and therefore one that involves at least directed improvisation and, more likely, fairly extensive improvised exploration.
The fundamentally personal and intimate nature of sensor readings, such as the amount of tension created by each performer, the shape of the ancillary gestures or the level of emotional involvement (especially relevant when using galvanic skin response or EEG), makes creating pieces with sensors extremely difficult for a composer to do in isolation. Improvisation therefore provides a way for performer and composer to generate a common musical and gestural language. Related to these issues is the fact that the technical or notational parameters in interactive music are not yet (and may never be) standardized, thereby creating a very real and practical need for improvisation to figure at least somewhere in the process. This study is funded by the Arts and Humanities Research Council.