
    Designing constraints: composing and performing with digital musical systems

    This paper investigates two central terms in Human-Computer Interaction (HCI) – affordances and constraints – and studies their relevance to the design and understanding of digital musical systems. It argues that, in the analysis of complex systems such as new interfaces for musical expression (NIME), constraints are a more productive analytical tool than the common HCI usage of affordances. Constraints are seen as limitations enabling the musician to encapsulate a specific search space of both physical and compositional gestures, proscribing complexity in favor of a relatively simple set of rules that engender creativity. By exploring the design of three different digital musical systems, the paper defines constraints as a core attribute of mapping, whether in instruments or compositional systems. The paper describes the aspiration for designing constraints as twofold: to save time, as musical performance is typically a real-time process, and to minimize the performer's cognitive load. Finally, it discusses skill and virtuosity in the realm of new interfaces for musical expression with regard to constraints.

    A MODELLER-SIMULATOR FOR INSTRUMENTAL PLAYING OF VIRTUAL MUSICAL INSTRUMENTS

    This paper presents a musician-oriented modelling and simulation environment for designing physically modelled virtual instruments and interacting with them via a high-performance haptic device. In particular, our system restores the physical coupling between the user and the manipulated virtual instrument, a key factor for expressive playing of traditional acoustic instruments that is absent in the vast majority of computer-based musical systems. We first analyse the various uses of haptic devices in Computer Music and introduce the various technologies involved in our system. We then present the modeller and simulation environments, and examples of musical virtual instruments created with this new environment.
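    The abstract does not specify the simulation scheme. Purely as an illustration of the kind of mass-interaction physical model such environments simulate, the sketch below advances one point mass coupled to the haptic device position through a visco-elastic link; all names and constants are hypothetical, and the reaction force returned is what would be fed back to the device to restore physical coupling.

```python
def step(x, v, x_dev, k=400.0, z=0.5, m=1.0, dt=1.0 / 44100):
    """One explicit-Euler step of a point mass linked to the haptic
    device position x_dev by a spring (k) and damper (z)."""
    f = -k * (x - x_dev) - z * v   # visco-elastic interaction force
    v = v + (f / m) * dt           # integrate velocity
    x = x + v * dt                 # integrate position
    return x, v, -f                # -f is the force sent to the device

# release the mass slightly away from the device's rest position
x, v = 0.01, 0.0
for _ in range(100):
    x, v, force = step(x, v, x_dev=0.0)
```

    A real system of this kind runs such steps at audio rate over a whole network of masses and links, which is why dedicated simulation environments and high-performance haptic hardware are needed.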

    The Implementation of Augmented Reality and Low Latency Protocols in Musical Instrumental Collaborations

    Past projects involving musical software have been completely virtual; while such software does well in entertainment and education, there is the question of whether it is playable to the same extent as physical musical instruments. The software presented in this paper, AR Jam, utilizes various software and hardware tools to form a networked mixed-reality system for users to play music on. The intention of this project is to seek new ways to explore more playable musical instruments in the digital world. The paper presents the software's implementation, challenges such as optimization problems in the synthesizer, and proposals of new ways to improve various aspects of this system in the future.

    Physical Interactions with Digital Strings - A hybrid approach to a digital keyboard instrument

    A new hybrid approach to digital keyboard playing is presented, where the actual acoustic sounds from a digital keyboard are captured with contact microphones and applied as excitation signals to a digital model of a prepared piano, i.e., an extended waveguide model of strings with the possibility of stopping and muting the strings at arbitrary positions. The parameters of the string model are controlled through TouchKeys multitouch sensors on each key, combined with MIDI data and acoustic signals from the digital keyboard frame, using a novel mapping. The instrument is evaluated from a performing musician's perspective, and emerging playing techniques are discussed. Since the instrument is a hybrid acoustic-digital system with several feedback paths between the domains, it provides for expressive and dynamic playing, with qualities approaching that of an acoustic instrument, yet with new kinds of control. The contributions are twofold. First, the use of acoustic sounds from a physical keyboard for excitations and resonances results in a novel hybrid keyboard instrument in itself. Second, the digital model of "inside piano" playing, using multitouch keyboard data, allows for performance techniques going far beyond conventional keyboard playing.
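    The paper's extended waveguide model is not reproduced here. As a hedged sketch of the underlying idea (a delay-line string driven by an arbitrary excitation signal, such as a contact-microphone recording, rather than by a synthetic pluck), a minimal Karplus-Strong-style loop could look like this; parameter names and values are illustrative only.

```python
import numpy as np

def waveguide_string(excitation, freq, sr=44100, damping=0.996, dur=1.0):
    """Drive a simple single-delay-line waveguide string with an
    arbitrary excitation signal (e.g. a contact-mic recording)."""
    n = int(sr * dur)
    delay = int(sr / freq)          # loop length sets the pitch
    line = np.zeros(delay)          # the delay line models the string
    exc = np.zeros(n)
    exc[:len(excitation)] = excitation[:n]
    out = np.zeros(n)
    idx = 0
    prev = 0.0
    for i in range(n):
        s = line[idx] + exc[i]          # recirculating sample + new energy
        y = damping * 0.5 * (s + prev)  # averaging filter: loss at the bridge
        prev = s
        line[idx] = y
        out[i] = y
        idx = (idx + 1) % delay
    return out

# excite a 220 Hz string with a short noise burst
burst = np.random.default_rng(0).uniform(-1, 1, 400)
tone = waveguide_string(burst, freq=220.0)
```

    Stopping or muting the string at arbitrary positions, as the paper describes, would correspond to splitting this single delay line into coupled segments with additional loss at the stopping point.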

    Music in Virtual Space: Theories and Techniques for Sound Spatialization and Virtual Reality-Based Stage Performance

    This research explores virtual reality as a medium for live concert performance. I have realized compositions in which the individual performing on stage uses a VR head-mounted display complemented by other performance controllers to explore a composed virtual space. Movements and objects within the space are used to influence and control sound spatialization and diffusion, musical form, and sonic content. Audience members observe this in real time, watching the performer's journey through the virtual space on a screen while listening to spatialized audio on loudspeakers variable in number and position. The major artistic challenge I will explore through this activity is the relationship between virtual space and musical form. I will also explore and document the technical challenges of this activity, resulting in a shareable software tool called the Multi-source Ambisonic Spatialization Interface (MASI), which is useful in creating a bridge between VR technologies and associated software, ambisonic spatialization techniques, sound synthesis, and audio playback and effects, and establishes a unique workflow for working with sound in virtual space.
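    The ambisonic encoding at the heart of tools like MASI follows standard B-format equations. The sketch below shows first-order encoding of a mono sample using the classic FuMa weighting; this is a textbook formulation, not necessarily the convention MASI itself uses.

```python
import math

def encode_fuma_bformat(sample, azimuth, elevation):
    """First-order B-format (W, X, Y, Z) encoding of a mono sample
    at a given azimuth/elevation in radians (FuMa convention)."""
    w = sample * (1.0 / math.sqrt(2.0))               # omnidirectional
    x = sample * math.cos(azimuth) * math.cos(elevation)
    y = sample * math.sin(azimuth) * math.cos(elevation)
    z = sample * math.sin(elevation)
    return w, x, y, z

# a source directly ahead on the horizontal plane
w, x, y, z = encode_fuma_bformat(1.0, azimuth=0.0, elevation=0.0)
```

    Because the encoded field is decoded separately for playback, the same composition can target loudspeaker arrays "variable in number and position", as the abstract notes.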

    Contextualizing musical organics: an ad-hoc organological classification approach

    As a research field, NIME is characterised by a plethora of design approaches, hardware, and software technologies. Formed of an interdisciplinary research community with divergent end-goals, the diversity of aims, objectives, methods, and outcomes is striking. Ranging from expressive interfaces, to musicological concerns, novel sensor technologies, and artificial creativity, the research presented is heterogeneous, distinct, and original. The design of digital instruments is very different from the making of acoustic instruments, due to the bespoke traditions and production environments of the disciplines mentioned above, but notably also because of the heightened epistemic dimension inscribed in the materiality of digital systems. These new materialities are often hardware and software technologies manufactured for purposes other than music. Without having to support established traditions and relationships between the instrument maker and the performer or composer, new digital musical instruments often develop at the speed of the computer's technical culture, as opposed to the slower evolution of more culturally engrained acoustic instrument design.

    Multiparametric interfaces for fine-grained control of digital music

    Digital technology provides a very powerful medium for musical creativity, and the way in which we interface and interact with computers has a huge bearing on our ability to realise our artistic aims. The standard input devices available for the control of digital music tools tend to afford a low quality of embodied control; they fail to realise our innate expressiveness and dexterity of motion. This thesis looks at ways of capturing more detailed and subtle motion for the control of computer music tools; it examines how this motion can be used to control music software, and evaluates musicians’ experience of using these systems. Two new musical controllers were created, based on a multiparametric paradigm where multiple, continuous, concurrent motion data streams are mapped to the control of musical parameters. The first controller, Phalanger, is a markerless video tracking system that enables the use of hand and finger motion for musical control. EchoFoam, the second system, is a malleable controller, operated through the manipulation of conductive foam. Both systems use machine learning techniques at the core of their functionality. These controllers are front ends to RECZ, a high-level mapping tool for multiparametric data streams. The development of these systems and the evaluation of musicians’ experience of their use constructs a detailed picture of multiparametric musical control. This work contributes to the developing intersection between the fields of computer music and human-computer interaction. The principal contributions are the two new musical controllers, and a set of guidelines for the design and use of multiparametric interfaces for the control of digital music. This work also acts as a case study of the application of HCI user experience evaluation methodology to musical interfaces. The results highlight important themes concerning multiparametric musical control. 
These include the use of metaphor and imagery, choreography and language creation, individual differences, and uncontrol. They highlight how this style of interface can fit into the creative process, and advocate a pluralistic approach to the control of digital music tools where different input devices fit different creative scenarios.
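    The thesis's RECZ mapping tool is not documented in this abstract. As an illustration only of the multiparametric paradigm it describes (multiple continuous, concurrent data streams mapped to multiple musical parameters), a minimal many-to-many weighted mapping might look like the following, where all stream and parameter names are hypothetical.

```python
import numpy as np

# Hypothetical controller streams and synthesis parameters.
streams = ["hand_x", "hand_y", "finger_spread"]
params = ["cutoff", "resonance", "grain_density"]

# Rows: input streams; columns: synthesis parameters. Each parameter
# is a weighted combination of several concurrent motion streams.
weights = np.array([
    [0.8, 0.0, 0.2],
    [0.1, 0.9, 0.0],
    [0.1, 0.1, 0.8],
])

def map_frame(frame):
    """Map one frame of normalised (0..1) motion data to parameters."""
    vec = np.array([frame[s] for s in streams])
    out = vec @ weights
    return dict(zip(params, np.clip(out, 0.0, 1.0)))

vals = map_frame({"hand_x": 0.5, "hand_y": 1.0, "finger_spread": 0.0})
# cutoff 0.5, resonance 0.9, grain_density 0.1
```

    A linear matrix is only the simplest case; the systems described above use machine learning at the core of their mapping, which replaces the fixed matrix with a learned function.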

    16th Sound and Music Computing Conference SMC 2019 (28–31 May 2019, Malaga, Spain)

    The 16th Sound and Music Computing Conference (SMC 2019) took place in Malaga, Spain, on 28-31 May 2019, and was organized by the Application of Information and Communication Technologies (ATIC) research group of the University of Malaga (UMA). The associated SMC 2019 Summer School took place on 25-28 May 2019, and the First International Day of Women in Inclusive Engineering, Sound and Music Computing Research (WiSMC 2019) took place on 28 May 2019. The SMC 2019 topics of interest included a wide selection of topics related to acoustics, psychoacoustics, music, technology for music, audio analysis, musicology, sonification, music games, machine learning, serious games, immersive audio, sound synthesis, and more.

    On hyperstructure and musical structure
