2 research outputs found

    The Artists who Say Ni!: Incorporating the Python programming language into creative coding for the realisation of musical works

    Even though Python is a very popular programming language with a wide range of applications, in the domain of music, and specifically electronic music, it is used far less than languages and programming environments built explicitly for musical creation, such as SuperCollider, Pure Data, Csound, Max, and ChucK. Since 2010, a Python module for digital signal processing (DSP) called Pyo has been available. This module provides a complete set of DSP algorithms, unit generators, filters, effects, and other tools for the creation of electronic music and sound, yet its community remains rather small. Being a Python module, Pyo can be combined with a wide variety of native and third-party Python modules for musical or extra-musical tasks, facilitating the realisation of interdisciplinary artworks focusing on music and sound. Starting a creative journey with this module, I was led to further Pythonic techniques for tasks other than music, such as mining tweets from Twitter or creating code poetry, which I incorporated into my musical activity. This practice-based research explores the creation of musical works based on Python by focusing on three works. The first is a live coding poetry opera whose libretto is written in Python. The second is a live algorithmic composition for an acoustic ensemble driven by input from Twitter. The third combines live coding with live patching on a hardware modular synthesiser system. The main objective of this thesis is to determine the creative potential of Python in music and mixed-media art by posing questions that are answered through these works. In doing so, this research aims to provide a conceptual framework for artistic creation that can serve as inspiration for other musicians and artists. The title of the thesis is based on one of the most popular lines of the Monty Python comedy troupe, “the Knights who say Ni!”, since Guido van Rossum, the creator of the Python programming language, named the language after Monty Python.
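    As a hedged illustration of the kind of patch Pyo makes possible (not an example taken from the thesis), the short sketch below uses only Pyo's basic Server and Sine objects; the frequencies and amplitudes are arbitrary choices.

        from pyo import *

        s = Server().boot()                    # boot the default audio server

        # A slow sine LFO wobbling around 440 Hz modulates an audible oscillator.
        lfo = Sine(freq=0.5, mul=20, add=440)  # slow modulator: 440 Hz +/- 20 Hz
        osc = Sine(freq=lfo, mul=0.3).out()    # audible sine, sent to the output

        s.gui(locals())                        # keep the script alive via pyo's small GUI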

    Brain-Computer Music Interfacing: Designing Practical Systems for Creative Applications

    Brain-computer music interfacing (BCMI) presents a novel approach to music making, as it requires only the brainwaves of a user to control musical parameters. This offers immediate benefits for users with motor disabilities that may otherwise prevent them from engaging in traditional musical activities such as composition, performance, or collaboration with other musicians. BCMI systems with active control, where a user can make cognitive choices that are detected within brain signals, provide a platform for developing new approaches to accomplishing these activities. BCMI systems with passive control present an interesting alternative, where control over music is achieved by harnessing brainwave patterns associated with subconscious mental states. Recent developments in brainwave measuring technologies, in particular electroencephalography (EEG), have made brainwave interaction with computer systems more affordable and accessible, and the time is ripe for research into the potential such technologies offer for creative applications for users of all abilities. This thesis presents an account of BCMI development that investigates active, passive, and hybrid (combining multiple control methods) approaches, spanning control of electronic music, acoustic instrumental music, multi-brain systems, and combinations of brainwave control techniques. In practice there are many obstacles to detecting useful brainwave signals, in particular when scaling systems designed for medical studies for use outside laboratory settings. Two key areas are addressed throughout this thesis: first, improving the accuracy of meaningful brain signal detection in BCMI; and second, exploring the creative scope of user control through the ways in which brainwaves can be mapped to musical features. Six BCMIs are presented in this thesis, each exploring a distinct aspect of user control. Four of these systems are designed for live BCMI concert performance, one evaluates a proof of concept through end-user testing, and one is designed as a musical composition tool. The thesis begins by exploring the field of brainwave detection and control and identifies the steady-state visually evoked potential (SSVEP) method of eliciting brainwave control as a suitable technique for use in BCMI. In an attempt to improve the signal accuracy of the SSVEP technique, a new modular hardware unit is presented that provides accurate SSVEP stimuli suitable for live music performance. Experimental data confirms the performance of the unit in tests across three different EEG hardware platforms. Results across 11 users indicate that a mean accuracy of 96% and an average response time of 3.88 seconds are attainable with the system. These results contribute to the development of the BCMI for Activating Memory, a multi-user system. Once a stable SSVEP platform is developed, control is extended through the integration of two further brainwave control techniques: affective (emotional) state detection and motor imagery response. To ascertain the suitability of the former, a pilot study confirms the accuracy of EEG in measuring affective states in response to music. This thesis demonstrates how a range of brainwave detection methods can be used for creative control in musical applications. Video and audio excerpts of BCMI pieces are also included in the Appendices.
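    The abstract does not describe how any of the six systems maps brainwaves to musical features, but the general idea can be sketched. The fragment below is purely illustrative: the synthetic signal, the 8-12 Hz alpha band, the normalisation, and the one-octave MIDI range are all assumptions, not parameters from the thesis.

        import numpy as np

        FS = 256                      # assumed EEG sampling rate (Hz)
        eeg = np.random.randn(FS)     # synthetic stand-in for one second of one EEG channel

        # Estimate alpha-band (8-12 Hz) power with a plain FFT.
        spectrum = np.abs(np.fft.rfft(eeg)) ** 2
        freqs = np.fft.rfftfreq(len(eeg), d=1.0 / FS)
        alpha_power = spectrum[(freqs >= 8) & (freqs <= 12)].mean()
        total_power = spectrum[freqs > 0].mean()
        relative_alpha = alpha_power / total_power

        # Map the (clipped) relative power onto one octave of MIDI notes, C4..C5.
        level = float(np.clip(relative_alpha, 0.0, 2.0)) / 2.0
        midi_note = int(round(60 + level * 12))
        print(f"relative alpha {relative_alpha:.2f} -> MIDI note {midi_note}")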