
    Investigating sound intensity gradients as feedback for embodied learning

    This paper explores an intensity-based approach to sound feedback in systems for embodied learning. We describe a theoretical framework, design guidelines, and the implementation of and results from an informant workshop. The specific context of embodied activity is considered in light of the challenges of designing meaningful sound feedback, and a design approach is shown to be a generative way of uncovering significant sound design patterns. The exploratory workshop offers preliminary directions and design guidelines for using intensity-based ambient sound display in interactive learning environments. The value of this research is in its contribution towards the development of a cohesive and ecologically valid model for using audio feedback in systems that can guide embodied interaction. The approach presented here suggests ways that multi-modal auditory feedback can support interactive, collaborative learning and problem solving.
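
    As a rough illustration of the kind of intensity gradient the paper describes, the sketch below maps a learner's normalized distance from a target state onto the amplitude of an ambient tone, so the display grows louder as the learner approaches the goal. This is a hypothetical minimal example, not the authors' implementation; the linear mapping, the 220 Hz tone, and the function names are assumptions.

        import numpy as np

        SAMPLE_RATE = 44100  # samples per second

        def intensity_for_distance(distance, max_distance=1.0):
            """Map a normalized distance to the goal onto an amplitude in [0, 1].

            Closer to the goal -> louder ambient tone (a simple linear gradient).
            """
            proximity = 1.0 - min(max(distance / max_distance, 0.0), 1.0)
            return proximity

        def ambient_tone(distance, duration=0.5, freq=220.0):
            """Render a short ambient tone whose loudness encodes proximity."""
            t = np.linspace(0.0, duration, int(SAMPLE_RATE * duration), endpoint=False)
            amplitude = intensity_for_distance(distance)
            return amplitude * np.sin(2.0 * np.pi * freq * t)

        # Example: the learner moves closer to the target over successive steps,
        # so each rendered tone is louder than the last.
        for d in (0.9, 0.6, 0.3, 0.05):
            samples = ambient_tone(d)
            print(f"distance={d:.2f} -> peak amplitude={samples.max():.2f}")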

    Exploring ambient sound techniques in the design of responsive environments for children

    This paper describes the theoretical framework, design, implementation and results from an exploratory informant workshop that examines an alternative approach to sound feedback in the design of responsive environments for children. This workshop offers preliminary directions and models for using intensity-based ambient sound display in the design of interactive learning environments for children that offer assistance in task-oriented activities. We see the value of this research in developing a more cohesive and ecological model for the use of audio feedback in the design of embedded interactions for children. The approach presented here takes the design of multi-modal feedback beyond the experiential to one that supports learning and problem solving.

    Spatial sound system to aid interactivity in a Human Centred Design evaluation of an aircraft cabin environment

    There is substantial research on the concept of 3D sound in virtual reality environments. With the growing importance of designing realistic and immersive experiences within a Human Centred Design (HCD) approach, sound perception is believed to add an interactive element that maximizes the human perspective. In this context, the concept of an audio-visual interaction model between a passenger and a crew member in an immersive aircraft cabin environment is studied and presented in this paper. The study focuses on the design and usability of spatial sound sources as an interactive component in a regional aircraft cabin design for Human in the Loop evaluation. Sound sources are placed among the virtual manikins acting as passengers with the aim of building a realistic virtual environment for the user enacting the role of a crew member. The crew member, while walking through the cabin, can orient and identify the position of the sound source inside the immersive cabin environment. We review 3D sound approaches and cues for sound spatialization in a virtual environment and propose that audio-visual interactivity aids immersive Human Centred Design analysis.
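
    The abstract does not detail how the spatial sources are rendered; as a stand-in illustration of the underlying idea, the sketch below computes a crude stereo rendering of a point source relative to a listener using inverse-distance attenuation and constant-power panning. Full HRTF-based spatialization, as typically used in immersive VR audio, is considerably richer; the coordinate conventions and function names here are assumptions.

        import math

        def spatialize(source_pos, listener_pos, listener_yaw):
            """Very rough stereo spatialization for a point source.

            Returns (left_gain, right_gain): inverse-distance attenuation plus
            constant-power panning from the source's azimuth relative to the
            listener's facing direction. A stand-in for full HRTF rendering.
            """
            dx = source_pos[0] - listener_pos[0]
            dz = source_pos[1] - listener_pos[1]
            distance = max(math.hypot(dx, dz), 0.1)     # clamp to avoid blow-up
            gain = 1.0 / distance                       # simple 1/r falloff

            azimuth = math.atan2(dx, dz) - listener_yaw # angle to source, radians
            pan = math.sin(azimuth)                     # -1 (left) .. +1 (right)
            left = gain * math.cos((pan + 1.0) * math.pi / 4.0)
            right = gain * math.sin((pan + 1.0) * math.pi / 4.0)
            return left, right

        # Example: a passenger manikin two metres ahead and one metre to the right
        # of the crew member, who is facing straight down the aisle.
        print(spatialize((1.0, 2.0), (0.0, 0.0), 0.0))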

    Springboard: exploring embodied metaphor in the design of sound feedback for physical responsive environments

    Presented at the 15th International Conference on Auditory Display (ICAD2009), Copenhagen, Denmark, May 18-22, 2009. In this paper we propose a role for using embodied metaphor in the design of sound feedback for interactive physical environments. We describe the application of a balance metaphor in the design of the interaction model for a prototype interactive environment called Springboard. We focus specifically on the auditory feedback, and conclude with a discussion of design choices and future research directions based on our prototype.

    Synaesthetic audio-visual sound toys in virtual reality

    This paper discusses the design of audio-visual sound toys in Cyberdream, a virtual reality music visualization. While an earlier version of this project for Oculus GearVR provided a journey through audio-visual environments related to 1990s rave culture, the most recent iteration for Oculus Quest adds three audio-visual sound toys, the discussion of which is the main focus of this paper. In the latest version, the user flies through synaesthetic environments while using the interactive controllers to manipulate the audio-visual sound toys and 'paint with sound'. These toys allow the user to playfully manipulate sound and image in a way that is complementary to, and interfaces with, the audio-visual backdrop provided by the VR music visualization. Through the discussion of novel approaches to design, the project informs new strategies in the field of VR music visualizations.

    ECHO & NarSYS - An acoustic modeler and sound renderer

    Computer graphics simulations are now widely used in the field of environmental modelling, for example to evaluate the visual impact of an architectural project on its environment and interactively change its design. Realistic sound simulation is equally important for environmental modelling. At iMAGIS, a joint project of INRIA, CNRS, Joseph Fourier University and the Institut National Polytechnique of Grenoble, we are currently developing an integrated interactive acoustic modelling and sound rendering system for virtual environments. The aim of the system is to provide an interactive simulation of global sound propagation in a given environment and an integrated sound/computer graphics rendering to obtain computer-simulated movies of the environment with realistic and coherent soundtracks.
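
    For readers unfamiliar with what a sound propagation simulation computes, the sketch below evaluates only the simplest term, the direct path between source and receiver: an arrival delay from the finite speed of sound and a 1/r spherical-spreading attenuation. The actual system described above targets global propagation with an integrated renderer; this fragment and its names are illustrative assumptions only.

        import math

        SPEED_OF_SOUND = 343.0  # metres per second, in air at ~20 degrees C

        def direct_path(source, receiver):
            """Direct-path term of sound propagation between two 3D points.

            Returns (delay_seconds, gain): the arrival delay due to the finite
            speed of sound and a 1/r spherical-spreading attenuation. A full
            system would add reflections, diffraction, occlusion, and so on.
            """
            distance = math.dist(source, receiver)
            delay = distance / SPEED_OF_SOUND
            gain = 1.0 / max(distance, 1.0)  # clamp so nearby sources don't blow up
            return delay, gain

        # Example: a source 10 m from the listener arrives ~29 ms later at 1/10 gain.
        print(direct_path((0.0, 0.0, 0.0), (10.0, 0.0, 0.0)))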

    How far away is plug 'n' play? Assessing the near-term potential of sonification and auditory display

    The commercial music industry offers a broad range of plug 'n' play hardware and software scaled both to music professionals and to a broad consumer market. The principles of sound synthesis utilized in these products are relevant to application in virtual environments (VE). However, the closed architectures used in commercial music synthesizers preclude low-level control during real-time rendering, and the algorithms and sounds themselves are not standardized from product to product. Bringing sound into VE requires a new generation of open architectures designed for human-controlled performance from interfaces embedded in immersive environments. This presentation addresses the state of the sonic arts in scientific computing and VE, analyzes research challenges facing sound computation, and offers suggestions regarding tools we might expect to become available during the next few years. Classes of audio functionality in VE include sonification -- the use of sound to represent data from numerical models; 3D auditory display (spatialization and localization, also called externalization); navigation cues for positional orientation and for finding items or regions inside large spaces; voice recognition for controlling the computer; external communications between users in different spaces; and feedback to the user concerning his own actions or the state of the application interface. To effectively convey this considerable variety of signals, we apply principles of acoustic design to ensure the messages are neither confusing nor competing. We approach the design of auditory experience through a comprehensive structure for messages and message interplay that we refer to as an Automated Sound Environment. Our research addresses real-time sound synthesis, real-time signal processing and localization, interactive control of high-dimensional systems, and synchronization of sound and graphics.
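
    As a minimal illustration of sonification in the sense defined above (sound representing data from numerical models), the sketch below maps a numeric series onto pitch, one short sine tone per data point. It is not part of the Automated Sound Environment described in the presentation; the frequency range and function names are assumptions.

        import numpy as np

        SAMPLE_RATE = 44100

        def sonify(values, low_hz=220.0, high_hz=880.0, note_seconds=0.25):
            """Map a numeric series onto pitch: min value -> low_hz, max -> high_hz.

            Returns one mono buffer containing a short sine tone per data point.
            """
            values = np.asarray(values, dtype=float)
            span = values.max() - values.min()
            norm = (values - values.min()) / span if span > 0 else np.zeros_like(values)
            freqs = low_hz + norm * (high_hz - low_hz)

            t = np.linspace(0.0, note_seconds, int(SAMPLE_RATE * note_seconds),
                            endpoint=False)
            notes = [0.3 * np.sin(2.0 * np.pi * f * t) for f in freqs]
            return np.concatenate(notes)

        # Example: a rising-then-falling series becomes a rising-then-falling melody.
        audio = sonify([0.0, 0.2, 0.5, 0.9, 0.7, 0.3])
        print(audio.shape, audio.min(), audio.max())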