12 research outputs found

    The Water Is Always Running: Vaporwave, Fluxus, and the Role of Defamiliarization in Music-Led Virtual Realities

    This thesis examines how a Fluxus approach to participatory music can leverage environmental metaphors and affordances, heighten users' awareness of their interactions through a making-strange approach to musical interaction, and balance functional and creative user experiences in meaningful, participatory music making. This research has been conducted using the virtual reality experience The Water Is Always Running. The work presents an unusual music-making environment, a 3D kitchen with a dishwashing simulation, to explore how a making-strange approach to musical interaction and participation can heighten the user's awareness of process. To create the work, ideas have been drawn from acoustic ecology, vaporwave, virtual reality musical instrument design, and human-centered design.

    Mobile Music Development Tools for Creative Coders

    This project is a body of work that facilitates the creation of musical mobile artworks. The project includes a code toolkit that enhances and simplifies the development of mobile music iOS applications, a flexible notation system designed for mobile musical interactions, and example apps and scored compositions that demonstrate the toolkit and notation system. The code library is designed to simplify the technical side of user-centered design and development, with a more direct connection between concept and deliverable. This simplification addresses learning problems (such as motivation, self-efficacy, and self-perceived understanding) by bridging the gap between idea and functional prototype and improving the ability to contextualize the development process for musicians and other creatives. The toolkit helps to circumvent the need to learn complex iOS development patterns and affords more readable code. CSPD (color, shape, pattern, density) notation is a pseudo-tablature that describes performance interactions. The system leverages visual density and patterns of both color and shape to describe types of gestures (physical or musical) and their relationships, rather than focusing on strict rhythmic or pitch/frequency content. The primary design goal is to visualize macro musical concepts that create middleground structure.
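
    A minimal sketch of how a CSPD event might be modelled in code follows (Python is used purely for illustration; the toolkit itself targets iOS). The field names and example values are assumptions: the abstract defines the four dimensions but not a concrete schema.

        # Hypothetical data model for CSPD (color, shape, pattern, density)
        # notation: each event describes a gesture and its relationships,
        # not exact pitch or rhythm.
        from dataclasses import dataclass

        @dataclass
        class CSPDEvent:
            color: str      # gesture family, e.g. "red" = percussive (assumed)
            shape: str      # gesture contour, e.g. "circle", "jagged" (assumed)
            pattern: str    # repetition behaviour, e.g. "pulse", "scatter" (assumed)
            density: float  # 0.0 (sparse) to 1.0 (saturated)

        # A short "score": middleground structure emerges from how the events'
        # colors, shapes, patterns and densities relate over time.
        score = [
            CSPDEvent("red", "jagged", "pulse", 0.8),
            CSPDEvent("blue", "circle", "scatter", 0.3),
        ]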

    La música visual del compositor José López-Montes

    153 pages. Master's thesis in Musical Heritage. Supervisor: Dr. Rafael Liñán Vallecillos. The main aim of this work has been to investigate the primary and secondary sources relating to the visual music output of the composer José López-Montes. To that end, we first offer an approach to the term "visual music" that establishes a point of departure; a brief survey of the historical traditions that give rise to the contemporary practice, in order to delimit the most important characteristics of newly created visual music; and a study of the most important publications on the contemporary practice of the genre, in order to establish an adequate theoretical context. In the body of the work we then study López-Montes's visual music output, offering a catalogue; a general overview that comparatively addresses the characteristic aspects of each piece, together with a review of the most important techniques and procedures employed; and, finally, analytical approaches to two of the pieces, in which we explore in detail their formal structures, the elements that construct the discourse, the methods used to set the visual material to music, and the types of audiovisual relationships that take place.

    Musicking with an interactive musical system: The effects of task motivation and user interface mode on non-musicians' creative engagement

    Creative engagement with novel musical interfaces can be rewarding for non-musicians. However, designing novel musical interfaces for non-musicians can be challenging because they lack conceptual and technical musical skills. In this paper we explore the effects of task motivation (experiential goal vs. utilitarian goal) and user interface mode (whether the content is editable, and whether content can be replayed) on non-musicians' creative engagement with novel musical interfaces. We show through an empirical study of twenty-four participants that an experiential exploratory goal encourages users' creative engagement compared to a utilitarian creative goal. We found that being able to replay recordings is less important when participants have an experiential exploratory goal than when they have a utilitarian creative goal. Results also indicate that allowing people to replay their musical ideas increased some aspects of their creative engagement, which was further increased when they were able to edit their creations. We also found that creative engagement increased when the interface supported users in planning ahead. A descriptive model of non-musicians' creative engagement with musical interfaces is proposed, including three modes of musicking. An optimal trajectory of creative engagement through these modes is suggested, and a description of inferred motivations, outputs, statuses and activities during creative processes is discussed. Design implications are proposed for supporting novices' creative engagement, taking into consideration their motivation and skills, and supporting insight and real-time activity.

    Development of an Augmented Reality musical instrument

    Nowadays, Augmented Reality (AR) and Virtual Reality (VR) are concepts of which people are becoming increasingly aware due to their application in the video-game industry (especially in the case of VR). This rise is partly due to the decreasing cost of Head Mounted Displays, which are consequently becoming more accessible to the public and to developers worldwide. These novelties, along with the frenetic development of information technologies applied to essentially all markets, have also made digital artists and manufacturers aware of the never-ending interaction possibilities these paradigms provide, and a variety of systems offering innovative creative capabilities have appeared. Given the author's personal interest in music and the technologies surrounding its creation by digital means, this document covers the application of Milgram's Virtuality-Reality Continuum paradigms (VR and AR) to the field of interfaces for musical expression. More precisely, it covers the development of an electronic drum kit that integrates Arduino-compatible hardware with a 3D visualisation application (built on Unity) to create a complete, functioning musical instrument. The hardware is embedded in a pair of drumsticks that detect strikes against any kind of surface and convert them into MIDI messages, which the sound-generation application uses to provide both auditory and visual feedback; the system thus favours an interaction based on physically hitting objects (e.g. a bed), whereas similar systems rely on "air-drumming". It also seeks to address some of the main inconveniences associated with drummers and their notoriously troublesome instrument, such as space limitations, the lack of flexibility in the sounds that can be generated, and the high cost of the equipment. Furthermore, the document provides an in-depth study of the technologies and areas ultimately applied to the target system: hardware concerns, timing requirements, approaches to the creation of NIMEs (New Interfaces for Musical Expression), Virtual Musical Instrument (VMI) design, musical-data transmission protocols (MIDI and OSC) and 3D modelling. A complete software engineering process is included to keep the development organised, and the methodology followed is discussed. The conclusions reflect on the difficulties encountered, the unfulfilled objectives, and the deviations from the initial concept that occurred during development. Finally, future lines of work are briefly outlined, together with personal comments and advice aimed at readers interested in tackling an ambitious project of their own.
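
    As a rough illustration of the hit-to-MIDI path the abstract describes (a sensor reading above a threshold becoming a note-on whose velocity scales with hit strength), here is a minimal Python sketch using the mido library. The threshold, note number and velocity scaling are assumptions, not the project's actual values.

        import mido  # requires a backend such as python-rtmidi

        THRESHOLD = 80   # minimum sensor value treated as a hit (assumed)
        DRUM_NOTE = 38   # General MIDI acoustic snare (assumed mapping)

        def piezo_to_velocity(reading, max_reading=1023):
            """Scale a 10-bit ADC reading to a MIDI velocity in 1..127."""
            return max(1, min(127, round(reading / max_reading * 127)))

        def on_sensor_reading(port, reading):
            """Send a note-on/note-off pair when the stick strikes a surface."""
            if reading >= THRESHOLD:
                velocity = piezo_to_velocity(reading)
                port.send(mido.Message('note_on', note=DRUM_NOTE, velocity=velocity))
                port.send(mido.Message('note_off', note=DRUM_NOTE, velocity=0))

        if __name__ == "__main__":
            with mido.open_output() as port:  # default MIDI output, if available
                on_sensor_reading(port, 600)  # simulated strike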

    "Knowing is Seeing:" The Digital Audio Workstation and the Visualization of Sound

    The computer's visual representation of sound has revolutionized the creation of music through the interface of Digital Audio Workstation (DAW) software. With the rise of DAW-based composition in popular music styles, many artists' sole experience of musical creation is through the computer screen. I assert that the particular sonic visualizations of the DAW propagate certain assumptions about music, influencing aesthetics and adding new visually based parameters to the creative process. I believe many of these new parameters are greatly indebted to the visual structures, interactional dictates and standardizations (such as the office metaphor depicted by operating systems like Apple's OS and Microsoft's Windows) of the Graphical User Interface (GUI). Whether manipulating text, video or audio, a user's interaction with the GUI is usually structured in the same manner: clicking on windows, icons and menus with a mouse-driven cursor. Focusing on dialogues from the Reddit communities of "Making hip-hop" and "EDM production", DAW user manuals, and interface design guidebooks, this dissertation addresses the ways these visualizations and methods of working affect the workflow, compositional style and musical conceptions of DAW-based producers.

    Real-time 3D Graphic Augmentation of Therapeutic Music Sessions for People on the Autism Spectrum

    This thesis covers the requirements analysis, design, development and evaluation of an application, CymaSense, as a means of improving the communicative behaviours of autistic participants in therapeutic music sessions through the addition of a visual modality. Autism spectrum condition (ASC) is a lifelong neurodevelopmental disorder that can affect people in a number of ways, commonly through difficulties in communication. Interactive audio-visual feedback can be an effective way to enhance music therapy for people on the autism spectrum: a multi-sensory approach encourages musical engagement in clients, increasing levels of communication and social interaction beyond the sessions. Cymatics describes the visible geometry of vibration through a variety of media, typically salt on a brass plate or water. The research reported in this thesis focuses on how an interactive audio-visual application based on cymatics might improve communication for people on the autism spectrum. A requirements analysis was conducted through interviews with four therapeutic music practitioners, aimed at identifying working practices with autistic clients. CymaSense was designed to explore effective audio-visual feedback for autistic users and to develop meaningful cross-modal mappings of practitioner-client musical communication. The CymaSense mappings were tested by 17 high-functioning autistic participants and by 30 neurotypical participants. The application was then trialled as a multimodal intervention with eight autistic participants over a 12-week series of therapeutic music sessions. The study captured the experiences of the users and identified resulting behavioural changes, including information on how CymaSense could be developed further. This dissertation contributes evidence that multimodal applications can be used within therapeutic music sessions as a tool to increase communicative behaviours of autistic participants.
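
    The kind of cross-modal mapping described, audio features driving a cymatics-style visual, could be sketched as below. The feature names and the specific mappings are illustrative assumptions, not CymaSense's actual implementation.

        def audio_to_visual(amplitude, pitch_hz):
            """Map amplitude (0..1) and pitch to hypothetical visual parameters."""
            return {
                "pattern_complexity": int(pitch_hz // 110),  # higher pitch -> finer pattern
                "brightness": round(amplitude, 2),           # louder -> brighter
                "scale": 0.5 + amplitude * 0.5,              # louder -> larger
            }

        print(audio_to_visual(amplitude=0.7, pitch_hz=440.0))
        # {'pattern_complexity': 4, 'brightness': 0.7, 'scale': 0.85}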

    Exploring Collaborative Music Making Experience in Shared Virtual Environments

    Virtual Environments (VEs), as media providing a high level of immersion, offer people an opportunity to mimic natural interpersonal interactions digitally. As the multi-player version of VEs, Shared Virtual Environments (SVEs) inherit VEs' advantages in enabling natural interactions and generating a high level of immersion, and will possibly play an increasingly important role in supporting digitally mediated collaboration. Though SVEs have been extensively explored for education, entertainment, work and training, few SVEs as yet exist to support creative collaboration, and as a result research on the creative aspects of collaboration in SVEs remains sparse. This raises questions about how to design the user experience to support creative collaboration in SVEs. This thesis starts with an introduction and related work. An SVE called Let's Move (LeMo), which allows two people to interact with each other and create music collaboratively in its virtual environment, is then introduced. Three studies based on LeMo are presented: Study I explores how free-form visual 3D annotations and work identity influence collaboration, while Studies II and III explore how working-space configurations affect it. Results indicate that: (1) 3D annotations can support people's collaborative music making (CMM) in SVEs through five classes of use; (2) group territory, personal territory and territorial behaviour emerge during collaborative music making in SVEs; and (3) manipulating the characteristics of personal space affected collaborative behaviour, the formation of territory, work efficiency, sense of contribution, preference, and so on. An overall discussion across the studies follows, and further implications for SVEs supporting collaborative music making (and other types of collaboration) are given. The findings of this thesis contribute towards the design of human-computer interaction in Shared Virtual Environments that support collaborative music making.