Systems control theory applied to natural and synthetic musical sounds
Systems control theory is a well-developed field for studying the stability, estimation, and control of dynamical systems. Once the physical behaviour of a musical instrument is described as a dynamical system, it can be controlled and numerically simulated for many purposes.
The aim of this paper is twofold: first, to provide the theoretical background on linear system theory, in both continuous and discrete time, mainly for systems with a finite number of degrees of freedom; second, to give illustrative examples on wind instruments, such as the vocal tract represented as a waveguide, and a sliding flute.
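As a minimal sketch of the linear, finite-degrees-of-freedom setting described above, the following simulates a single resonant mode as a discrete-time state-space system x[n+1] = A x[n] + B u[n], y[n] = C x[n]. The frequency, damping, and sample rate are illustrative values, not taken from the paper; the rotation-plus-decay form of A is one standard way to obtain a stable discrete-time resonator.

```python
import numpy as np

fs = 44100.0                 # sample rate (Hz), assumed
f0 = 440.0                   # modal frequency (Hz), assumed
zeta = 0.001                 # damping ratio, assumed
dt = 1.0 / fs
w0 = 2 * np.pi * f0
r = np.exp(-zeta * w0 * dt)  # per-sample amplitude decay (r < 1: stable)
theta = w0 * dt              # per-sample phase advance

# State matrix: a scaled rotation, so the state spirals inward at f0
A = r * np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
B = np.array([1.0, 0.0])
C = np.array([1.0, 0.0])

# Excite with an impulse and simulate the recursion
N = 1024
x = np.zeros(2)
y = np.zeros(N)
u = np.zeros(N)
u[0] = 1.0
for n in range(N):
    y[n] = C @ x
    x = A @ x + B * u[n]
# y now holds a slowly decaying sinusoid at roughly f0
```

Because the spectral radius of A is r < 1, the simulation is guaranteed stable, which is exactly the kind of property system theory lets one check before running anything.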
16th Sound and Music Computing Conference SMC 2019 (28–31 May 2019, Malaga, Spain)
The 16th Sound and Music Computing Conference (SMC 2019) took place in Malaga, Spain, 28–31 May 2019, organized by the Application of Information and Communication Technologies (ATIC) research group of the University of Malaga (UMA). The associated SMC 2019 Summer School took place 25–28 May 2019, and the First International Day of Women in Inclusive Engineering, Sound and Music Computing Research (WiSMC 2019) took place on 28 May 2019. The SMC 2019 topics of interest covered a wide selection of areas related to acoustics, psychoacoustics, music, technology for music, audio analysis, musicology, sonification, music games, machine learning, serious games, immersive audio, sound synthesis, etc.
Virtual-Acoustic Instrument Design: Exploring the Parameter Space of a String-Plate Model
Exploration is an intrinsic element of designing and engaging with acoustic as well as digital musical instruments. This paper reports on the ongoing development of a virtual-acoustic instrument based on a physical model of a string coupled nonlinearly to a plate. The performer drives the model through tactile interaction with a string-board controller fitted with piezoelectric sensors. The string-plate model is formulated in a way that prioritises its parametric explorability. Where the roles of creating performance gestures and designing instruments are traditionally separated, such a design provides a continuum across these domains. The string-plate model, its real-time implementation, and the control interface are described, and the system is preliminarily evaluated through informal observations of how musicians engage with it.
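The paper's actual string-plate formulation is not reproduced here; as a toy illustration of the general idea of nonlinear string-plate coupling, the following couples one "string" mode and one "plate" mode through a cubic spring and integrates with semi-implicit Euler. All frequencies, damping values, and the coupling stiffness are assumptions chosen only to keep the sketch stable.

```python
import numpy as np

fs = 44100.0
dt = 1.0 / fs
w_s = 2 * np.pi * 330.0   # string-mode frequency (assumed)
w_p = 2 * np.pi * 95.0    # plate-mode frequency (assumed)
d_s, d_p = 1.5, 4.0       # damping coefficients (assumed)
k3 = 1e12                 # cubic coupling stiffness (assumed)

xs, vs = 0.0, 1.0         # pluck: initial string-mode velocity
xp, vp = 0.0, 0.0
out = np.zeros(4096)
for n in range(out.size):
    f = k3 * (xs - xp) ** 3           # nonlinear coupling force
    a_s = -w_s**2 * xs - d_s * vs - f # string-mode acceleration
    a_p = -w_p**2 * xp - d_p * vp + f # plate-mode acceleration
    vs += dt * a_s                    # semi-implicit Euler update
    xs += dt * vs
    vp += dt * a_p
    xp += dt * vp
    out[n] = xp                       # "listen" at the plate
```

The cubic term transfers energy from the string to the plate and generates partials absent from either linear mode alone, which is the qualitative behaviour such a coupling is used for.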
Finding Music in Chaos: Designing and Composing with Virtual Instruments Inspired by Chaotic Equations
The use of chaos theory to design novel audio synthesis engines has been little explored in computer music. This may be because of the difficulty of obtaining harmonic tones, or the tendency of chaos-based synthesis engines to "explode" (diverge numerically), which then requires re-instantiating the engine before sound production can resume. This is undesirable when composing, because time is wasted fixing the synthesis engine rather than spent on the creative aspects of composition. One way to remedy these issues is to connect chaotic equations to individual parts of the synthesis engine instead of relying on chaos as the primary source of all sound-producing procedures. To do this, one can create a physically based synthesis model and connect chaotic equations to its individual parts.
The goal of this project is to design a physically inspired virtual instrument, based on a conceptual percussion-instrument model, that uses chaos theory in the synthesis engine to explore novel sounds in a reliable and repeatable way for other composers and performers. The project presents a two-movement composition using these concepts and a modular set of virtual instruments that can be used by anyone, playable both through a new electronic music controller called the Hexapad and through standard MIDI controllers. The physically inspired instrument created for the Hexapad controller is called the Ambi-Drum; standard MIDI controllers are used to control synthesis parameters and other virtual instruments.
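The Ambi-Drum itself is not described in code here; the following is a generic sketch of the strategy the abstract advocates: a chaotic system (the Lorenz equations, a common choice) modulates one parameter of an otherwise stable oscillator rather than generating the waveform directly, so the output cannot "explode" even when the modulator is chaotic. All constants are illustrative.

```python
import numpy as np

fs = 44100.0
dt_audio = 1.0 / fs
sigma, rho, beta = 10.0, 28.0, 8.0 / 3.0  # classic chaotic Lorenz parameters
x, y, z = 1.0, 1.0, 1.0                   # Lorenz state
dt_chaos = 0.01                           # Lorenz integration step

phase = 0.0
out = np.zeros(8192)
for n in range(out.size):
    if n % 64 == 0:                       # update the modulator at control rate
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        x += dt_chaos * dx
        y += dt_chaos * dy
        z += dt_chaos * dz
    f = 220.0 + 4.0 * x                   # chaos nudges the pitch by a few Hz
    phase += 2 * np.pi * f * dt_audio
    out[n] = np.sin(phase)                # the oscillator itself stays bounded
```

Because the Lorenz trajectory stays on a bounded attractor and only deviates the frequency, the output is always a well-behaved tone with chaotic micro-variation, which is the reliability-plus-novelty trade the project aims for.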
Nonlinear Dynamical Processes in Musical Interactions: investigating the role of nonlinear dynamics in supporting surprise and exploration in interactions with digital musical instruments
Nonlinear dynamical processes play a central role in many acoustic instruments, yet they rarely feature in digital instruments, and are little understood from an interaction design perspective. Such processes exhibit behaviours that are complex, time-dependent, and chaotic, yet in the context of acoustic instruments can facilitate interactions that are explorable, learnable and repeatable. This suggests that these processes merit deeper investigation for digital music interaction design.
Two studies are presented which investigate user interaction with nonlinear dynamical musical tools. A lab-based study used four purpose-built digital musical instruments to test interaction designs featuring nonlinear dynamical processes. Evaluations with 28 musicians demonstrated the potential for these processes to provoke creative surprises, and to support exploration without a corresponding loss of control. A subsequent ethnographically informed study with 24 musicians linked these findings to a mode of engagement which we term 'edge-like interaction'. Edge-like interactions draw on the complex, unpredictable behaviours found in nonlinear dynamical processes close to critical thresholds, facilitating creative exploration.
The two complementary studies provide evidence both for the importance of nonlinear dynamical processes in musical interactions with acoustic instruments, and for their potential in the development of new creative digital technologies, musical or otherwise.
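The "critical thresholds" mentioned above can be illustrated with the simplest textbook nonlinear dynamical system, the logistic map, which switches from periodic to chaotic behaviour as its parameter crosses roughly 3.5699. This is only a numerical illustration of the concept, not a model from the studies; the parameter values are chosen for demonstration.

```python
def logistic_orbit(r, x0=0.2, warmup=500, n=8):
    """Iterate x -> r*x*(1-x), discard a warmup, return n rounded values."""
    x = x0
    for _ in range(warmup):
        x = r * x * (1 - x)
    orbit = []
    for _ in range(n):
        x = r * x * (1 - x)
        orbit.append(round(x, 4))
    return orbit

print(logistic_orbit(3.5))  # below the threshold: settles onto a 4-cycle
print(logistic_orbit(3.9))  # above it: aperiodic, sensitive to x0
```

An instrument parameter sitting just below such a threshold behaves repeatably, while nudging it across yields surprise, which is one way to read the "edge-like" engagement the studies describe.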
Virtual acoustics displays
The real-time acoustic display capabilities developed for the Virtual Environment Workstation (VIEW) Project at NASA-Ames are described. The acoustic display can generate localized acoustic cues in real time over headphones. An auditory symbology, a related collection of representational auditory 'objects' or 'icons', can be designed using ACE (Auditory Cue Editor), which links both discrete and continuously varying acoustic parameters with information or events in the display. During a given display scenario, the symbology can be dynamically coordinated in real time with 3-D visual objects, speech, and gestural displays. The types of displays feasible with the system range from simple warnings and alarms to the acoustic representation of multidimensional data or events.
Reverberation still in business: Thickening and propagating micro-textures in physics-based sound modeling
Artificial reverberation is usually introduced, as a digital audio effect, to give a sense of enclosing architectural space. In this paper we argue for the effectiveness and usefulness of diffusive reverberators in physically inspired sound synthesis. Examples are given for the synthesis of textural sounds as they emerge from solid mechanical interactions, as well as from aerodynamic and liquid phenomena.
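Diffusive reverberators of the kind referred to above descend from Schroeder's classic comb-and-allpass designs, which turn a sparse impulse into a dense decaying tail. The following is a minimal sketch of that family (not the paper's own reverberator); delay lengths and gains are illustrative.

```python
import numpy as np

def comb(x, delay, g):
    """Feedback comb filter: y[n] = x[n] + g*y[n-delay]."""
    y = np.zeros_like(x)
    for n in range(len(x)):
        y[n] = x[n] + (g * y[n - delay] if n >= delay else 0.0)
    return y

def allpass(x, delay, g):
    """Schroeder allpass: y[n] = -g*x[n] + x[n-delay] + g*y[n-delay]."""
    y = np.zeros_like(x)
    for n in range(len(x)):
        xd = x[n - delay] if n >= delay else 0.0
        yd = y[n - delay] if n >= delay else 0.0
        y[n] = -g * x[n] + xd + g * yd
    return y

x = np.zeros(2000)
x[0] = 1.0                                   # a single "micro-texture" grain
wet = sum(comb(x, d, 0.8) for d in (347, 441, 557))  # parallel combs
wet = allpass(wet, 113, 0.5)                 # allpass thickens the echoes
# wet now contains a dense, decaying tail of echoes
```

Feeding such a structure with the short grains produced by contact, aerodynamic, or liquid models thickens them into sustained textures, which is the role the abstract argues diffusive reverberators can play in synthesis rather than as a room effect.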
Synthesis of Resonant Sounds with a Heterodyne Model
This paper considers the generation of resonant waveforms from a number of perspectives. Commencing with the well-known source-filter model, it introduces a more advantageous heterodyne interpretation. Some variations on the basic design, and comparisons with previous methods, are then given. An analysis of three different digital music filter structures for resonance synthesis follows, together with an example showing how timbrally rich frequency-modulated resonant waveforms can be synthesized.
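The paper's heterodyne model is not reproduced here; the following sketches only the core heterodyne idea: build a resonance by multiplying a slowly decaying baseband envelope with a carrier sinusoid at the resonance frequency, instead of filtering a source through a bandpass. All constants are assumptions.

```python
import numpy as np

fs = 44100.0
n = np.arange(4096)
fc = 500.0    # resonance (carrier) frequency, assumed
tau = 0.02    # envelope decay time constant in seconds, assumed

envelope = np.exp(-n / (tau * fs))         # baseband (lowpass-like) decay
carrier = np.cos(2 * np.pi * fc * n / fs)  # carrier at the resonance
resonant = envelope * carrier              # heterodyne: decaying sinusoid at fc
```

The multiplication shifts the baseband spectrum up to fc, so shaping the envelope (or modulating the carrier frequency) gives direct, independent control over the bandwidth and centre of the resonance, which is the advantage a heterodyne interpretation offers over a fixed filter structure.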