
    SonicDraw: a web-based tool for sketching sounds and drawings

    We present SonicDraw, a web browser tool that lies between a drawing interface and a sound design interface. Through this ambiguity we aim to explore new kinds of user interaction, as the creative process can be led either by sound or by visual feedback loops. We performed a user evaluation to assess how users negotiated the affordances of the system and how it supported their creativity. We measured the System Usability Scale (SUS) and the Creativity Support Index (CSI), and conducted an inductive thematic analysis of qualitative feedback. Results indicate that users find SonicDraw a very easy and intuitive tool which fosters the exploration of new, unexpected combinations of sounds and drawings. However, the tool seems to fall short of engaging highly skilled musicians or visual artists wanting to create more complex pieces. To infer knowledge about user interaction, we also propose a quantitative analysis of drawing dynamics. Two contrasting modes of interaction are likely occurring: one where sketches act as direct controls of sonic attributes (sound focus), and another where sketches feature semantic content (e.g. a house) that indirectly controls sound (visual focus).
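
    The SUS mentioned above has a standard scoring procedure, which the sketch below illustrates (a generic Python example, not the authors' code; the sample responses are invented): odd-numbered items contribute their rating minus one, even-numbered items contribute five minus their rating, and the sum is scaled by 2.5 to yield a 0-100 score.

```python
# Standard SUS scoring for a 10-item, 1-5 Likert questionnaire.
# Generic illustration; the sample responses below are invented.
from typing import Sequence

def sus_score(responses: Sequence[int]) -> float:
    """Convert ten 1-5 Likert responses into a 0-100 SUS score."""
    if len(responses) != 10:
        raise ValueError("SUS has exactly 10 items")
    total = 0
    for i, r in enumerate(responses):
        # 1st, 3rd, ... items are positively worded: rating - 1.
        # 2nd, 4th, ... items are negatively worded: 5 - rating.
        total += (r - 1) if i % 2 == 0 else (5 - r)
    return total * 2.5

print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # -> 85.0
```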

    User experience in an interactive music virtual reality system: An exploratory study

    The Objects VR interface and study explore interactive music and virtual reality, focusing on user experience, understanding of musical functionality, and interaction issues. Our system offers spatio-temporal music interaction using 3D geometric shapes and their designed relationships. Control is provided by hand tracking, and the experience is rendered on a head-mounted display with binaural sound presented over headphones. The evaluation of the system uses a mixed-methods approach based on semi-structured interviews, surveys, and video-based interaction analysis. On average, the system was positively received in terms of interview self-reports and metrics for spatial presence and creative support. Interaction analysis and interview thematic analysis also revealed instances of frustration with interaction and levels of confusion with system functionality. Our results allow reflection on design criteria and discussion of implications for facilitating music engagement in virtual reality. Finally, our work discusses the effectiveness of the measures with respect to future evaluation of novel interactive music systems in virtual reality.

    Co-design of a Smart Cajon

    The work of Luca Turchet is supported by a Marie-Curie Individual Fellowship of the European Union's Horizon 2020 research and innovation programme, under grant agreement No. 749561. Mathieu Barthet also acknowledges support from the EU H2020 Audio Commons grant (688382).

    Affective gaming using adaptive speed controlled by biofeedback

    This work is part of a larger project exploring how affective computing can support the design of player-adaptive video games. We investigate how controlling some of the game mechanics using biofeedback affects physiological reactions, performance, and the experience of the player. More specifically, we assess how different game speeds affect player physiological responses and game performance. We developed a game prototype with Unity which includes a biofeedback loop system based on the level of physiological activation, measured through skin resistance (SKR) with a smart wristband. In two conditions, the player's moving speed was driven by SKR, increasing (respectively, decreasing) speed when the player was less activated (SKR decreases). A control condition was also used in which player speed is not affected by SKR. We collected and synchronized biosignals (heart rate [HR], skin temperature [SKT] and SKR) and game information, such as the total time to complete a level, the number of enemy collisions, and their timestamps. Additionally, emotional profiling (TIPI, I-PANAS-SF) measured using Likert scales in a post-task questionnaire, and semi-open questions about the game experience, were used. The study involved 13 participants (10 males, 3 females) aged from 18 to 50 (M = 24.30, SD = 9.00). The results show that SKR was significantly higher in the speed-down condition, and game performance improved in the speed-up condition. Most of the participants felt engaged with the game (M = 6.46, SD = 0.96) and their level of immersion was not affected by wearing the prototype smartband. Thematic analysis (TA) revealed that game speed impacted the participants' stress levels: high speed was more stressful than hypothesized, and many participants described game level-specific effects in which they felt that their speed of movement reflected their level of stress or relaxation. Slowing down the participants indeed increased their stress levels but, counterintuitively, more stress was detected in high-speed situations.
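
    As a rough illustration of the kind of biofeedback loop described above, the following sketch maps smoothed skin resistance to player movement speed. This is a hypothetical Python rendering, not the authors' Unity implementation; read_skr(), the smoothing factor, gain, and speed limits are all invented for illustration.

```python
# Hypothetical biofeedback speed controller: player speed follows
# smoothed skin resistance (SKR). Not the authors' code; read_skr(),
# baseline, gain and clamping values are illustrative assumptions.
import random

def read_skr() -> float:
    """Stand-in for a smart-wristband SKR sample (kilo-ohms)."""
    return 150.0 + random.gauss(0.0, 10.0)

class BiofeedbackSpeed:
    def __init__(self, base_speed=5.0, gain=0.05, alpha=0.1, invert=False):
        self.base_speed = base_speed
        self.gain = gain        # speed units per kOhm of deviation
        self.alpha = alpha      # exponential smoothing factor
        self.invert = invert    # speed-down vs. speed-up condition
        self.baseline = read_skr()
        self.smoothed = self.baseline

    def update(self) -> float:
        """One game-loop tick: sample SKR, smooth it, map to speed."""
        self.smoothed += self.alpha * (read_skr() - self.smoothed)
        deviation = self.smoothed - self.baseline
        sign = -1.0 if self.invert else 1.0
        # Clamp so the game never stalls or becomes uncontrollable.
        return max(1.0, min(10.0, self.base_speed + sign * self.gain * deviation))

controller = BiofeedbackSpeed()
for _ in range(5):
    print(round(controller.update(), 2))
```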

    "It's cleaner, definitely": Collaborative Process in Audio Production.

    Working from vague client instructions, how do audio producers collaborate to diagnose what specifically is wrong with a piece of music, where the problem is, and what to do about it? This paper presents a design ethnography that uncovers some of the ways in which two music producers co-ordinate their understanding of complex representations of pieces of music while working together in a studio. Our analysis shows that audio producers constantly make judgements based on audio and visual evidence while working with complex digital tools, which can lead to ambiguity in assessments of issues. We show how multimodal conduct guides the process of work and how complex media objects are integrated as elements of interaction by the music producers. The findings provide an understanding of how people currently collaborate when producing audio, to support the design of better tools and systems for collaborative audio production in the future.

    Novel Methods in Facilitating Audience and Performer Interaction Using the Mood Conductor Framework

    While listeners’ emotional response to music is the subject of numerous studies, less attention is paid to the dynamic emotion variations due to the interaction between artists and audiences in live improvised music performances. By opening a direct communication channel from audience members to performers, the Mood Conductor system provides an experimental framework to study this phenomenon. Mood Conductor facilitates interactive performances and thus also has an inherent entertainment value. The framework allows audience members to send emotional directions using their mobile devices in order to “conduct” improvised performances. Emotion coordinates indicated by the audience in the arousal-valence space are aggregated and clustered to create a video projection, which is used by the musicians as guidance and provides visual feedback to the audience. Three different systems have been developed and tested within our framework so far, and trialled in several public performances with different ensembles. Qualitative and quantitative evaluations demonstrated that musicians and audiences were highly engaged with the system, and provided new insights enabling future improvements of the framework.
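
    The aggregation and clustering step could, for instance, look like the following sketch (an illustrative Python example using k-means, not the Mood Conductor source code; the window size, number of clusters, and vote ranges are assumptions):

```python
# Illustrative aggregation of audience (valence, arousal) votes:
# cluster the votes in a time window and return the centroids,
# largest cluster first, as candidate directions for the projection.
# Not the Mood Conductor source; parameters are assumptions.
import numpy as np
from sklearn.cluster import KMeans

def dominant_emotions(votes: np.ndarray, k: int = 3) -> np.ndarray:
    """votes: (n, 2) array of (valence, arousal) in [-1, 1]."""
    k = min(k, len(votes))
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(votes)
    sizes = np.bincount(km.labels_, minlength=k)
    return km.cluster_centers_[np.argsort(sizes)[::-1]]

rng = np.random.default_rng(0)
window = rng.uniform(-1, 1, size=(50, 2))  # simulated votes in one window
print(dominant_emotions(window))
```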

    Moodplay: an interactive mood-based musical experience

    Examining Emotion Perception Agreement in Live Music Performance

    Current music emotion recognition (MER) systems rely on emotion data averaged across listeners and over time to infer the emotion expressed by a musical piece, often neglecting time- and listener-dependent factors. These limitations can restrict the efficacy of MER systems and cause misjudgements. In a live music concert setting, fifteen audience members annotated perceived emotion in the valence-arousal space over time using a mobile application. Analyses of inter-rater reliability yielded widely varying levels of agreement in the perceived emotions. To uncover the reasons for such variability, we conducted a follow-up lab study in which twenty-one listeners annotated their perceived emotions while listening to a recording of the original performance and offered open-ended explanations. Thematic analysis reveals many salient features and interpretations that can describe the underlying cognitive processes. Some of the results confirm known findings of music perception and MER studies. Novel findings highlight the importance of less frequently discussed musical attributes, such as musical structure, performer expression, and stage setting, as perceived across different modalities. Musicians were found to attribute emotion change to musical harmony, structure, and performance technique more than non-musicians. We suggest that listener-informed musical features can benefit MER in addressing emotional perception variability by providing reasons for listener similarities and idiosyncrasies.
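
    One simple way to quantify the inter-rater agreement discussed above is the mean pairwise correlation between raters' annotation time series, sketched below (an illustrative Python example, not the study's analysis code; the simulated data are invented):

```python
# Mean pairwise Pearson correlation as a basic inter-rater agreement
# measure for time-series emotion annotations. Illustrative only;
# the simulated "shared trend plus noise" data are invented.
import numpy as np

def mean_pairwise_correlation(annotations: np.ndarray) -> float:
    """annotations: (raters, timesteps) array, e.g. arousal over time."""
    corr = np.corrcoef(annotations)                # raters x raters
    upper = corr[np.triu_indices_from(corr, k=1)]  # unique rater pairs
    return float(upper.mean())

rng = np.random.default_rng(1)
trend = np.sin(np.linspace(0, 4 * np.pi, 200))       # shared stimulus trend
raters = trend + rng.normal(0, 0.8, size=(15, 200))  # 15 noisy raters
print(round(mean_pairwise_correlation(raters), 3))
```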

    Crossroads: Interactive Music Systems Transforming Performance, Production and Listening

    We discuss several state-of-the-art systems that propose new paradigms and user workflows for music composition, production, performance, and listening. We focus on a selection of systems that exploit recent advances in semantic and affective computing, music information retrieval (MIR) and the semantic web, as well as insights from fields such as mobile computing and information visualisation. These systems offer the potential to provide transformative experiences for users, which is manifested in creativity, engagement, efficiency, discovery and affect.