
    Simulating Vocal Imitation in Infants, using a Growth Articulatory Model and Speech Robotics

    In order to shed light on the cognitive representations likely to underlie early vocal imitation, we simulated Kuhl and Meltzoff's (1996) experiment using Bayesian robotics and a statistical model of the vocal tract fitted to pre-babblers' actual vocalizations. The simulations showed that audition is necessary to account for infants' early vocal imitation performance: purely visual imitation failed to reproduce infants' scores and patterns of imitation. Further, a small number of vocalizations (fewer than 100) proved sufficient for a learning process to reach scores at least as high as those of pre-babblers. Thus, early vocal imitation lies within the reach of a baby robot, with only a few assumptions about learning and imitation.
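    A minimal sketch of the kind of Bayesian inversion such a simulation might rest on is given below; the discretised motor space, the linear vocal-tract stand-in and the Gaussian auditory likelihood are illustrative assumptions, not the authors' model.

        import numpy as np

        # Toy articulatory model: a motor command (e.g. degree of jaw opening, 0..1)
        # maps to an auditory feature (a first-formant-like value in Hz).
        def articulate(motor):
            return 300.0 + 600.0 * motor  # illustrative stand-in for a growth articulatory model

        motor_grid = np.linspace(0.0, 1.0, 101)                  # discretised motor space
        prior = np.full(motor_grid.size, 1.0 / motor_grid.size)  # flat prior over vocal actions

        def imitate(heard_formant, sigma=50.0):
            """Pick the motor command whose predicted sound best explains the heard target."""
            predicted = articulate(motor_grid)
            likelihood = np.exp(-0.5 * ((heard_formant - predicted) / sigma) ** 2)  # P(heard | motor)
            posterior = likelihood * prior
            posterior /= posterior.sum()
            return motor_grid[np.argmax(posterior)]

        print(imitate(600.0))  # motor command inferred for a heard 600 Hz target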

    Going beyond Quietness: Determining the Emotionally Restorative Effect of Acoustic Environments in Urban Open Public Spaces

    The capacity of natural settings to promote psychological restoration has attracted increasing research attention, especially with regard to the visual dimension. However, there is a need to extend these studies to urban settings, such as squares, parks or gardens, given the global trend towards urbanisation, and to integrate the dimension of sound into the landscape. Such was the main aim of this study, in which 53 participants assessed four public spaces in Vitoria-Gasteiz (Spain) as part of the CITI-SENSE project (137 observations were used for analysis). A smartphone application was used to collect objective and subjective data simultaneously. The results show that, at the end of the urban environmental experience, there was a statistically significant reduction in negative emotions and perceived stress, and a slight increase in positive emotions. Emotional restoration was mainly associated with prior emotional states, but also with global environmental comfort and acoustic comfort. The soundscape characteristics that contributed most to emotional restoration and a reduction in perceived stress were pleasantness, calm, fun and naturalness. Therefore, in agreement with previous research, the findings of the present study indicate that, besides contributing to the quietness of the urban environment, the urban soundscape can promote psychological restoration in users of these spaces. This research formed part of the CITI-SENSE project, funded under the European Union Seventh Framework Programme for research, technological development and demonstration, grant agreement no. 308524.
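    The pre/post effect reported here (a reduction in negative emotion over the environmental experience) can be checked with a paired test; the sketch below uses hypothetical file and column names, not the CITI-SENSE data.

        import pandas as pd
        from scipy import stats

        # Hypothetical data: one row per observation, with negative-emotion scores
        # recorded before and after the walk through the public space.
        df = pd.read_csv("observations.csv")  # assumed columns: neg_before, neg_after

        # Paired t-test: did negative emotion drop significantly after the experience?
        t, p = stats.ttest_rel(df["neg_before"], df["neg_after"])
        change = (df["neg_after"] - df["neg_before"]).mean()
        print(f"mean change = {change:.2f}, t = {t:.2f}, p = {p:.4f}")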

    Multiparametric interfaces for fine-grained control of digital music

    Digital technology provides a very powerful medium for musical creativity, and the way in which we interface and interact with computers has a huge bearing on our ability to realise our artistic aims. The standard input devices available for the control of digital music tools tend to afford a low quality of embodied control; they fail to realise our innate expressiveness and dexterity of motion. This thesis looks at ways of capturing more detailed and subtle motion for the control of computer music tools; it examines how this motion can be used to control music software, and evaluates musicians’ experience of using these systems. Two new musical controllers were created, based on a multiparametric paradigm where multiple, continuous, concurrent motion data streams are mapped to the control of musical parameters. The first controller, Phalanger, is a markerless video tracking system that enables the use of hand and finger motion for musical control. EchoFoam, the second system, is a malleable controller, operated through the manipulation of conductive foam. Both systems use machine learning techniques at the core of their functionality. These controllers are front ends to RECZ, a high-level mapping tool for multiparametric data streams. The development of these systems and the evaluation of musicians’ experience of their use construct a detailed picture of multiparametric musical control. This work contributes to the developing intersection between the fields of computer music and human-computer interaction. The principal contributions are the two new musical controllers, and a set of guidelines for the design and use of multiparametric interfaces for the control of digital music. This work also acts as a case study of the application of HCI user experience evaluation methodology to musical interfaces. The results highlight important themes concerning multiparametric musical control. These include the use of metaphor and imagery, choreography and language creation, individual differences and uncontrol. They highlight how this style of interface can fit into the creative process, and advocate a pluralistic approach to the control of digital music tools where different input devices fit different creative scenarios.
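    As a rough illustration of the multiparametric paradigm described above, where several continuous motion streams are mapped concurrently onto musical parameters, the mapping stage can be caricatured as a matrix transform; the stream values, weights and parameter names below are invented and do not reflect RECZ's actual mapping model.

        import numpy as np

        # One frame of concurrent, continuous motion streams (e.g. per-finger positions),
        # assumed normalised to 0..1 by the tracking front end (Phalanger or EchoFoam).
        motion = np.array([0.2, 0.7, 0.4, 0.9])

        # Many-to-many mapping matrix: rows = musical parameters, columns = motion streams.
        # Each musical parameter is a weighted blend of several streams.
        mapping = np.array([
            [0.8, 0.2, 0.0, 0.0],   # filter cutoff
            [0.0, 0.5, 0.5, 0.0],   # grain density
            [0.1, 0.0, 0.3, 0.6],   # amplitude
        ])

        cutoff, density, amplitude = mapping @ motion  # blend streams into control values
        print(dict(cutoff=cutoff, density=density, amplitude=amplitude))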

    Theatre Noise Conference

    Three days of performances, installations, residencies, round-table discussions, presentations and workshops. More than an academic conference, Theatre Noise is a diverse collection of events exploring the sound of theatre, from performance to the spaces in between. Featuring keynote presentations, artists in residence, electroacoustic, percussive and digital performances, industry workshops and installations, Theatre Noise is an immersive journey into sound.

    Idiomatic Patterns and Aesthetic Influence in Computer Music Languages

    It is widely accepted that acoustic and digital musical instruments shape the cognitive processes of the performer on both embodied and conceptual levels, ultimately influencing the structure and aesthetics of the resulting performance. In this article we examine the ways in which computer music languages might similarly influence the aesthetic decisions of the digital music practitioner, even when those languages are designed for generality and are theoretically capable of implementing any sound-producing process. We examine the basis for querying the non-neutrality of tools, with a particular focus on the concept of idiomaticity: patterns of an instrument or language that are particularly easy or natural to execute in comparison to others. We then present correspondence with the developers of several major music programming languages and a survey of digital musical instrument creators, examining the relationship between the idiomatic patterns of the language and the characteristics of the resulting instruments and pieces. In an open-ended creative domain, asserting causal relationships is difficult and potentially inappropriate, but we find a complex interplay between language, instrument, piece and performance that suggests that the creator of the music programming language should be considered one party to a creative conversation that occurs each time a new instrument is designed. Peer reviewed.

    Effect of visual landscape factors on soundscape evaluation in old residential areas

    The visual landscape influences the soundscape experience of urban public spaces. The purpose of this study is to analyze the relationships between visual landscape factors and soundscape evaluation in old residential areas and to determine their main influencing factors. Six typical old residential areas in Tianjin, China, were selected for the collection of sound and video information. Virtual reality (VR) was used to create an evaluation environment, and subjective evaluations of the visual landscape and soundscape were collected through a questionnaire (N = 256). The results show that the evaluation of soundscape and visual landscape satisfaction for the central square of an old residential area is superior to that for the public space along the street, as affected by spatial location, sound characteristics and other factors. Greenery satisfaction, environmental cleanliness and architectural aesthetics were significantly positively correlated with soundscape evaluation. Additionally, three latent variables, namely visual landscape factors, spatial factors and soundscape evaluation factors, were identified through factor analysis, and a structural equation model (SEM) of “visual landscape factors–soundscape evaluation” was built. The visual landscape factors in old residential areas were found to be important factors affecting soundscape evaluation (standardized coefficient = 0.46, P ≤ 0.01). Although the spatial factors make no direct contribution to the soundscape evaluation of the old residential areas, their observed variable, environmental cleanliness, is significantly positively correlated with all the observed variables of the soundscape evaluation factors.
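    A rough sketch of the factor-extraction step that precedes such a structural equation model is shown below, using scikit-learn's factor analysis on the questionnaire items; the item file, column names and three-factor labelling are assumptions for illustration, not the authors' pipeline.

        import pandas as pd
        from sklearn.decomposition import FactorAnalysis

        # Hypothetical questionnaire matrix: one row per response, one column per item
        # (greenery satisfaction, cleanliness, architectural aesthetics, soundscape items, ...).
        items = pd.read_csv("questionnaire_items.csv")

        # Extract three latent variables, mirroring the visual landscape, spatial and
        # soundscape evaluation factors reported in the study.
        fa = FactorAnalysis(n_components=3, random_state=0)
        scores = fa.fit_transform(items)  # factor scores per response
        loadings = pd.DataFrame(fa.components_.T, index=items.columns,
                                columns=["visual", "spatial", "soundscape"])
        print(loadings.round(2))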

    Interactive computer music: a performer's guide to issues surrounding Kyma with live clarinet input

    Musicians are familiar with interaction in the rehearsal and performance of music. Technology has become sophisticated and affordable to the point where interaction with a computer in real-time performance is also possible. The nature of live interactive electronic music has blurred the distinction between the formerly exclusive realm of composition and that of performance. It is quite possible for performers to participate in the genre, but currently little information is available for those wishing to explore it. This document contains a definition of interaction, a discussion of how it occurs in traditional music-making, and a brief history of the emergence of live interaction in computer music. It also discusses the concept of live interaction and its aesthetic value, and highlights the possibilities of live interactive computer music using clarinet and the Kyma system, revealing ways a performer may maximize the interactive experience. Written from a player's perspective, the document describes possible methods of interaction with Kyma and live clarinet input in two areas: the clarinet used as a controller, and the clarinet used as a source of sound. Information is provided on technical issues such as the speaker system, performance-space acoustics and diffusion options, possible interactive inputs, and, specifically, microphone choices for clarinet. There is little existing information for musicians contemplating the use of Kyma; clarinetists in particular will find in this paper a practical guide to many aspects of live electronic interaction and will be better informed to explore the field. This area has the potential not only to expand performing opportunities but also to increase economic development. Interactive music technology can be applied in traditional recitals and in collaborative work with other art forms, installation projects and even music therapy. Knowledge of these programs also opens possibilities for sound design in theatre, film and other commercial applications.

    Modeling the development of pronunciation in infant speech acquisition.

    Pronunciation is an important part of speech acquisition, but little attention has been given to the mechanism or mechanisms by which it develops. Speech sound qualities, for example, have just been assumed to develop by simple imitation, and in most accounts this is then assumed to occur by acoustic matching, with the infant comparing his output to that of his caregiver. There are theoretical and empirical problems with both of these assumptions, and we present a computational model, Elija, that does not learn to pronounce speech sounds this way. Elija starts by exploring the sound-making capabilities of his vocal apparatus. He then uses the natural responses he gets from a caregiver to learn equivalence relations between his vocal actions and his caregiver's speech. We show that Elija progresses from a babbling stage to learning the names of objects, demonstrating the viability of a non-imitative mechanism in learning to pronounce.
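    The non-imitative mechanism described here, learning equivalence relations between the infant's own vocal actions and the caregiver's responses, can be caricatured as a simple associative table; the action labels and caregiver words below are invented for illustration and are not Elija's actual representation.

        from collections import defaultdict

        # Associations from the infant's motor actions to caregiver speech tokens,
        # built from the caregiver's natural responses rather than by acoustic matching.
        associations = defaultdict(list)

        def explore_and_listen(motor_action, caregiver_response):
            """Record that this vocal action elicited this caregiver word."""
            associations[motor_action].append(caregiver_response)

        def name_object(caregiver_word):
            """Reproduce a word by reusing the motor action most often linked to it."""
            counts = {a: r.count(caregiver_word) for a, r in associations.items()}
            return max(counts, key=counts.get) if any(counts.values()) else None

        # Babbling phase followed by caregiver reformulations:
        explore_and_listen("motor_pattern_17", "ball")
        explore_and_listen("motor_pattern_17", "ball")
        explore_and_listen("motor_pattern_03", "dog")

        print(name_object("ball"))  # -> motor_pattern_17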