
    Data Sonification in Creative Practice

    Sonification is the process of conveying data through non-speech audio. While it is finding increasing acceptance as a scientific method, particularly where a visual representation of data is inadequate, it is still often derided as a ‘gimmick’. Composers have also shown growing interest in sonification as a compositional method. In both science and music, criticism of the method centres on poor aesthetics and gratuitous application. This thesis addresses these issues through an accompanying portfolio of pieces that use sonification as a compositional tool. It establishes the principles of ‘musification’, defined as a sonification organised by musical structures and principles. The practice-as-research portfolio explores a range of data sources, musical genres and science–music collaborations. The main contributions to knowledge derived from the project are a portfolio of compositions, a compositional framework for sonification and an evaluation framework for musification. The thesis demonstrates the validity of practice-as-research as a methodology in sonification research.

    Musical Borrowing in Sonification

    Sonification presents challenges in communicating information, particularly because of the large gap between the set of possible data-to-sound mappings and the set of cognitively valid ones. It is an information transmission process that can be described through the Shannon–Weaver mathematical theory of communication. Musical borrowing is proposed as a sonification method that can aid information transmission by drawing on the musical knowledge shared by composer and listener. This article describes the compositional process of Wasgiischwashäsch (2017), which uses Rossini’s William Tell Overture (1829) to sonify datasets relating to climate change in Switzerland. It concludes that audiences’ familiarity with the original piece, and the humorous effect produced by the distortion of a well-known work, contribute to a more effective transmission process.
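    A minimal sketch of the kind of mapping the abstract describes, under stated assumptions: the melody fragment and the anomaly series below are invented for illustration, and the semitone-per-degree mapping is a hypothetical choice, not the piece's actual method.

    ```python
    # Sonification by musical borrowing: distort a well-known melody in
    # proportion to the data, so listeners hear deviation from the familiar.

    # Illustrative pitch sequence (MIDI note numbers) and invented
    # temperature anomalies in degrees Celsius.
    MELODY = [64, 64, 64, 64, 67, 60, 62, 64]
    anomalies_c = [-0.2, 0.0, 0.3, 0.5, 0.8, 1.0, 1.4, 1.8]

    def musify(melody, anomalies, semitones_per_degree=2):
        """Transpose each note by an amount proportional to its data value."""
        return [round(note + a * semitones_per_degree)
                for note, a in zip(melody, anomalies)]

    distorted = musify(MELODY, anomalies_c)
    ```

    Because the listener already knows the borrowed melody, the detuning itself carries the data: larger anomalies produce larger, more audible departures from the expected pitch.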

    Application of Musical Computing to Creating a Dynamic Reconfigurable Multilayered Chamber Orchestra Composition

    With increasing virtualization and the recognition that today’s virtual computers are faster than the hardware computers of ten years ago, modes of computation are now limited only by the imagination. Pulsed Melodic Affective Processing (PMAP) is an unconventional computation protocol that makes affective computation more human-friendly by making it audible: data sounds like the emotion it carries. PMAP has been demonstrated in non-musical applications, e.g. quantum computer entanglement and stock market trading. This article presents a musical application and demonstration of PMAP: a dynamic reconfigurable score for acoustic orchestral performance, in which the orchestra acts as a PMAP half-adder to add two numbers.
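    A toy sketch of the half-adder idea, assuming a simplified pulse-stream encoding (dense pulses for 1, sparse for 0); the actual PMAP protocol and the orchestral score differ in detail.

    ```python
    # Sketch of PMAP-style computation: a bit is carried by a pulse
    # stream, and the half-adder combines two such streams.

    def encode_bit(b, length=8):
        """Encode a bit as a pulse train: dense pulses for 1, sparse for 0."""
        return [1] * length if b else [1] + [0] * (length - 1)

    def decode_bit(stream):
        """A majority of pulses in the stream decides the bit."""
        return int(sum(stream) > len(stream) // 2)

    def half_adder(stream_a, stream_b):
        """Decode two pulse streams, add them, re-encode sum and carry."""
        a, b = decode_bit(stream_a), decode_bit(stream_b)
        return encode_bit(a ^ b), encode_bit(a & b)  # sum = XOR, carry = AND

    s, c = half_adder(encode_bit(1), encode_bit(1))  # 1 + 1 = binary 10
    ```

    In the orchestral demonstration, each stream would be played by performers rather than computed in software; the sketch only shows the underlying logic.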

    A Continued Musical and Personal Dialogue with the Waves of Epilepsy

    This thesis is accompanied by EEG audio data and experiment examples, EEG epilepsy electronic music soundscapes and a Max/MSP synthesizer. In the early hours of the morning several years ago I awoke with paramedics leaning over me. In a state of confusion, my first conscious decision was to enter my music production studio while they attempted to lead me to the ambulance. Music was important to me even in a disorientated post-ictal state (an altered state of consciousness following a seizure). Two weeks later I awoke with paramedics standing over me again. I had started to experience multiple seizures. During the previous weeks I had also experienced numerous incidents of memory loss when delivering presentations at work, feelings of being returned to the room following an absence of consciousness, and temporal disorientation. I also experienced multiple episodes of déjà vu, aromas that were difficult to identify, visual distortions and waves of euphoria like momentary intoxication of unknown origin. These experiences increased in frequency until my first tonic-clonic seizure. Following medical tests, I was diagnosed with epilepsy. It was a confusing period: there was no history of epilepsy in my family, and no physiological cause could be identified. I viewed epilepsy as an overwhelming authority: it takes control of your life and asserts its power upon you, forcibly changing your reality in an instant. When I saw the EEG readouts from my tests I noticed how similar they were to sound waves. As an electronic musician, I am using this project as an artistic and cathartic opportunity to creatively transform the power of epilepsy and reassert my personal identity upon it, symbolically reclaiming personal control and creatively transforming the psychological perception of personal power that is lost through the experience of epilepsy.
    The project transforms epilepsy from an internal destructive force into an external and creative activity in my life, capturing its cultural and emotional experiences and turning them into cinematic electronic soundscapes through research and musical experimentation with EEG epilepsy signals. It is an existential exploration; the results will be tangible and accessible, transforming the EEG epilepsy recordings from the uncontrollable unconscious into the creative conscious. This project applies transposition, mathematics, research and creative exploration to map epilepsy EEG events into computer-synthesized soundscapes, transforming the passive nature of diagnosis and treatment into a proactive and creative process. This thesis shares an individual's research and experiences of epilepsy with a community that has an interest in transforming the passive sufferer into a creatively active and articulate patient. Professor Dan Lloyd (Thomas C. Brownell Professor of Philosophy at Trinity College) states that: “It is observed that fMRI (Brain) activity is more similar to music than it is to language...” Lloyd D. (2011). If, as Lloyd suggests, brain activity is more like music than language, then what might epilepsy be saying, or possibly singing, during these events? What are the audible timbres of these events? Researchers such as Wu et al., Psyche et al., Chafe and Parvizi have previously interpreted epilepsy EEG data to aid medical research, but that work did not explore the emotional timbre of epilepsy from a patient's perspective. The previous research derived musical notes from EEG signals to trigger MIDI instruments and modulate non-epilepsy-related audio sources for medical identification purposes. This project examines the possible timbres derived directly from the EEG data to explore and creatively describe the emotional and physical experience from a patient's perspective.
    This thesis presents the personal experience of epilepsy, the development of electroencephalography (EEG), the sociocultural history of epilepsy, the sonification and musification of EEG data, and the concepts involved in the design of timbre and sound effects. For this project, a bespoke granular synthesizer called ‘The Oblique-Granizer’ (programmed in Cycling ’74’s Max/MSP) has been constructed that employs EEG signals, converted to digital audio, to synthesize timbres that explore the description of human experience and emotions related to epilepsy. This thesis includes research into mathematical algorithms that generate musical notes and melodic information in electronic music compositions from EEG epilepsy seizure activity. The aim is to take back personal control by creatively transforming the EEG data and my psychological perception of epilepsy into electronic soundscapes and sonic textures through exploration of sonification and musification techniques.
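    Granular synthesis of the kind the Oblique-Granizer performs can be sketched in a few lines, under stated assumptions: the grain size, hop and stand-in input signal below are illustrative, and this is not the synthesizer's actual implementation.

    ```python
    import math

    # Minimal granular-synthesis sketch: slice a source signal -- here a
    # stand-in for an EEG trace converted to audio -- into short
    # Hann-windowed grains and overlap-add them into a new texture.

    def hann(n):
        """Hann window of n samples, tapering each grain to zero at its edges."""
        return [0.5 - 0.5 * math.cos(2 * math.pi * i / (n - 1)) for i in range(n)]

    def granulate(signal, grain_size=64, hop=32):
        window = hann(grain_size)
        out = [0.0] * len(signal)
        for start in range(0, len(signal) - grain_size, hop):
            for i in range(grain_size):
                out[start + i] += signal[start + i] * window[i]  # overlap-add
        return out

    # Stand-in trace (a slow sine); real seizure data is not reproduced here.
    eeg = [math.sin(2 * math.pi * 3 * t / 1000) for t in range(1000)]
    texture = granulate(eeg)
    ```

    Varying the grain size, hop and the region of the source being read is what produces the evolving timbres a granular instrument is known for; in the project those parameters would be driven by the EEG material itself.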

    Earth system music: music generated from the United Kingdom Earth System Model (UKESM1)

    Scientific data are almost always represented graphically in figures or in videos. With the ever-growing interest from the general public in understanding climate sciences, it is becoming increasingly important that scientists present this information in ways that are both accessible and engaging to non-experts. In this pilot study, we use time series data from the first United Kingdom Earth System Model (UKESM1) to create six procedurally generated musical pieces. Each piece presents a unique aspect of the ocean component of UKESM1, either in terms of a scientific principle or a practical aspect of modelling, and each is arranged using a different musical progression, style and tempo. The pieces were created in the Musical Instrument Digital Interface (MIDI) format and then performed by a digital piano synthesiser. An associated video was also created, showing the development of the data in time with the music. The music and video were published on the lead author's YouTube channel, alongside a brief description of the methodology. We also discuss the limitations of this pilot study and describe several approaches to extend and expand upon this work.
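    The general approach of mapping a model time series onto MIDI notes can be sketched as follows, under stated assumptions: the data values, the pentatonic scale and the scaling choices are illustrative, not UKESM1 output or the study's actual mapping.

    ```python
    # Sketch: scale a time series onto a pentatonic scale and emit MIDI
    # note numbers, one note per data point.

    PENTATONIC = [0, 2, 4, 7, 9]  # C major pentatonic degrees, in semitones

    def series_to_notes(values, base_note=60, octaves=2):
        """Map each value linearly onto scale steps above base_note (middle C)."""
        lo, hi = min(values), max(values)
        steps = len(PENTATONIC) * octaves
        notes = []
        for v in values:
            idx = round((v - lo) / (hi - lo) * (steps - 1)) if hi > lo else 0
            octave, degree = divmod(idx, len(PENTATONIC))
            notes.append(base_note + 12 * octave + PENTATONIC[degree])
        return notes

    sst = [14.1, 14.3, 14.2, 14.6, 14.9, 15.0]  # invented sea-surface temps
    notes = series_to_notes(sst)
    ```

    Constraining the output to a scale keeps the result musical regardless of the data's shape; rising values are heard as rising pitch, which is what lets a listener follow a trend by ear.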

    BCI for Music Making: Then, Now, and Next

    Brain–computer music interfacing (BCMI) is a growing field with a history of experimental applications derived from the cutting edge of BCI research as adapted to music making and performance. BCMI offers some unique possibilities over traditional music making, including applications for emotional music selection and emotionally driven music creation for individuals as communicative aids (either in cases where users might have physical or mental disabilities that otherwise preclude them from taking part in music making, or in music therapy cases where emotional communication between a therapist and a patient by means of traditional music making might otherwise be impossible). This chapter presents an overview of BCMI and its uses in such contexts, including existing techniques as they are adapted to musical control, from P300 and SSVEP (steady-state visually evoked potential) in EEG (electroencephalogram) to asymmetry, hybrid systems, and joint fMRI (functional magnetic resonance imaging) studies correlating affective induction (by means of music) with neurophysiological cues. Some suggestions for further work are also offered, including the development of collaborative platforms for music performance by means of BCMI.