    Reducing reversal errors in localizing the source of sound in virtual environment without head tracking

    This paper presents a study of the effect of additional audio cueing and Head-Related Transfer Functions (HRTFs) on human performance in a sound source localization task performed without head movement. Existing sound spatialization techniques generate reversal errors. We intend to reduce these errors by introducing sensory cues based on sound effects. We conducted an experimental study to evaluate the impact of additional cues on a sound source localization task. The results showed the benefit of combining the additional cues with the HRTF in terms of localization accuracy and the reduction of reversal errors. This technique allows a significant reduction of reversal errors compared to using the HRTF alone. For instance, it could be used to improve audio spatial alerting, spatial tracking, and target detection in simulation applications where head movement is not available.
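
    As a minimal illustration of the HRTF-based spatialization this study builds on, the Python sketch below renders a mono signal by convolving it with a left/right head-related impulse response (HRIR) pair. The HRIR arrays, the sample rate, and the simple distance-gain cue are placeholder assumptions for illustration, not the paper's actual method.

        import numpy as np
        from scipy.signal import fftconvolve

        def render_binaural(mono, hrir_left, hrir_right, distance_m=1.0):
            """Spatialize a mono signal with an HRIR pair plus a simple distance cue."""
            # Hypothetical additional cue: inverse-distance gain roll-off.
            gain = 1.0 / max(distance_m, 0.1)
            left = fftconvolve(mono, hrir_left) * gain
            right = fftconvolve(mono, hrir_right) * gain
            return np.stack([left, right], axis=0)

        # Usage with dummy data: 0.5 s of noise and 256-tap placeholder HRIRs
        # (a real system would use a measured HRIR set for the desired direction).
        fs = 44100
        mono = np.random.randn(fs // 2)
        hrir_l = np.random.randn(256) * np.hanning(256)
        hrir_r = np.random.randn(256) * np.hanning(256)
        stereo = render_binaural(mono, hrir_l, hrir_r, distance_m=2.0)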

    Using 3D sound for providing 3D interaction in virtual environment

    In this paper we describe a proposal based on the use of 3D sound metaphors for providing precise spatial cueing in virtual environments. A 3D sound metaphor is a combination of audio spatialization and audio cueing techniques, and is intended to improve user performance and perception. The interest of this kind of stimulation mechanism is that it could enable efficient 3D interaction for tasks such as selection, manipulation, and navigation, among others. We describe the main related concepts, the most relevant related work, the current theoretical and technical problems, our approach, our scientific objectives, our methodology, and our research perspectives.
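
    As a concrete, entirely hypothetical example of the spatialization-plus-cueing combination the paper calls a 3D sound metaphor: the sketch below maps a source's elevation to the pitch of a short beep that could be mixed into an HRTF-rendered scene. The frequency mapping is invented for illustration and is not taken from the paper.

        import numpy as np

        def elevation_cue(elevation_deg, fs=44100, dur=0.2):
            """Hypothetical audio cue: a beep whose pitch rises with source elevation."""
            # Map -90..+90 degrees to 300..1200 Hz (an arbitrary illustrative mapping).
            f = 300 + (elevation_deg + 90) / 180 * 900
            t = np.arange(int(fs * dur)) / fs
            return 0.3 * np.sin(2 * np.pi * f * t)

        # A "3D sound metaphor" would mix this cue into the spatialized stream,
        # e.g. stereo += render_binaural(elevation_cue(45), hrir_l, hrir_r).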

    Effect of Head Movement on Sound Localization in Real and Simulated Cochlear Implant Users

    Cochlear implant (CI) users' limited ability to use acoustic cues for sound localization causes left/right confusions and front/back reversals. Head movement helps reduce these errors in acoustically hearing listeners. This study investigated the effect of head movement on localization throughout 360° of azimuth for both real and simulated CI users. Listeners in a bilateral electro-acoustic simulation (CI with ipsilateral hearing aid) derived the greatest head-movement benefit in reducing front/back reversals. Left/right confusions were reduced in simulations with matched bilateral stimulation. For simulated device users, sensitivity to both timing and level cues was correlated with localization performance without head movement, while sensitivity to timing cues was correlated with localization performance with head movement. Simulations of bilateral CI and bimodal (CI with contralateral hearing aid) listening predicted real users' sound localization performance, binaural sensitivity, and head movement patterns.
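
    The timing and level cues referred to above are interaural time and level differences (ITD/ILD). A minimal sketch of how such cues can be estimated from a binaural pair, assuming a plain cross-correlation approach rather than the study's actual analysis:

        import numpy as np

        def itd_ild(left, right, fs):
            """Estimate interaural time and level differences from a binaural pair."""
            # ITD: lag of the cross-correlation peak, positive when the left ear leads.
            xcorr = np.correlate(left, right, mode="full")
            lag = np.argmax(xcorr) - (len(right) - 1)
            itd_s = -lag / fs
            # ILD: RMS level ratio in decibels.
            rms = lambda x: np.sqrt(np.mean(x ** 2))
            ild_db = 20 * np.log10(rms(left) / rms(right))
            return itd_s, ild_db

        # Example: right channel delayed by 0.5 ms and attenuated by 6 dB.
        fs = 44100
        sig = np.random.randn(fs // 10)
        delay = int(0.0005 * fs)
        left = np.concatenate([sig, np.zeros(delay)])
        right = np.concatenate([np.zeros(delay), sig]) * 0.5
        print(itd_ild(left, right, fs))  # ITD ~ +0.5 ms (left leads), ILD ~ +6 dB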

    Auditory Displays and Assistive Technologies: the use of head movements by visually impaired individuals and their implementation in binaural interfaces

    Visually impaired people rely upon audition for a variety of purposes, among them the use of sound to identify the position of objects in the surrounding environment. This is not limited to localising sound-emitting objects: obstacles and environmental boundaries can also be located, thanks to the ability to extract information from reverberation and sound reflections, all of which contributes to effective and safe navigation, as well as serving a function in certain assistive technologies based on binaural auditory virtual reality. It is known that head movements in the presence of sound elicit changes in the acoustic signals arriving at each ear, and these changes can alleviate common localisation problems in headphone-based auditory virtual reality, such as front-to-back reversals. The goal of the work presented here is to investigate whether the visually impaired naturally engage head movement to facilitate auditory perception, and to what extent this is applicable to the design of virtual auditory assistive technology. Three novel experiments are presented: a field study of head movement behaviour during navigation; a questionnaire assessing the self-reported use of head movement in auditory perception by visually impaired individuals (each comparing visually impaired and sighted participants); and an acoustical analysis of interaural differences and cross-correlations as a function of head angle and sound source distance. It is found that visually impaired people self-report using head movement for auditory distance perception. This is supported by the head movements observed during the field study, while the acoustical analysis showed that interaural correlations for sound sources within 5 m of the listener decreased as head angle or source distance increased, and that interaural differences and correlations in reflected sound were generally lower than those of direct sound. Subsequently, relevant guidelines for designers of assistive auditory virtual reality are proposed.
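
    A hedged sketch of the interaural cross-correlation measure mentioned above: the IACC is commonly taken as the peak of the normalized cross-correlation between the two ear signals within a ±1 ms lag window. This is the generic textbook formulation, not necessarily the thesis's exact analysis pipeline.

        import numpy as np

        def iacc(left, right, fs, max_lag_ms=1.0):
            """Interaural cross-correlation: peak normalized correlation within +/-1 ms."""
            max_lag = int(fs * max_lag_ms / 1000)
            norm = np.sqrt(np.sum(left ** 2) * np.sum(right ** 2))
            full = np.correlate(left, right, mode="full")  # lags -(N-1)..(N-1)
            center = len(right) - 1                        # index of lag 0
            window = full[center - max_lag : center + max_lag + 1]
            return np.max(np.abs(window)) / norm

        # Identical signals give IACC = 1; independent noise gives a value near 0.
        fs = 48000
        sig = np.random.randn(fs)
        print(iacc(sig, sig, fs))                 # ~1.0
        print(iacc(sig, np.random.randn(fs), fs)) # ~0.0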

    Psychophysical Evaluation of Three-Dimensional Auditory Displays

    This report describes the progress made during the first year of a three-year Cooperative Research Agreement (CRA NCC2-542). The CRA proposed a program of applied psychophysical research designed to determine the requirements and limitations of three-dimensional (3-D) auditory display systems. These displays present synthesized stimuli to a pilot or virtual workstation operator that evoke auditory images at predetermined positions in space. The images can be either stationary or moving. In previous years, we completed a number of studies that provided data on listeners' abilities to localize stationary sound sources with 3-D displays. The current focus is on the use of 3-D displays in 'natural' listening conditions, which include listeners' head movements, moving sources, multiple sources, and 'echoic' sources. The results of our research on two of these topics, the role of head movements and the role of echoes and reflections, were reported in the most recent Semi-Annual Progress Report (Appendix A). In the period since the last Progress Report we have been studying a third topic, the localizability of moving sources; the results of this research are described. The fidelity of a virtual auditory display is critically dependent on precise measurement of the listener's Head-Related Transfer Functions (HRTFs), which are used to produce the virtual auditory images. We continue to explore methods for improving our HRTF measurement technique. During this reporting period we compared HRTFs measured using our standard open-canal probe-tube technique with HRTFs measured using the closed-canal insert microphones from the Crystal River Engineering Snapshot system.
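
    Whichever microphone placement is used, HRTF measurement reduces to estimating a transfer function from a known excitation and the signal recorded at the ear. Below is a generic regularized frequency-domain deconvolution sketch; it is not the report's specific probe-tube procedure, and the regularization constant is an assumption.

        import numpy as np

        def estimate_hrtf(excitation, recording, eps=1e-8):
            """Estimate a transfer function H = Y / X by regularized spectral division."""
            n = len(excitation) + len(recording) - 1
            X = np.fft.rfft(excitation, n)
            Y = np.fft.rfft(recording, n)
            # Tikhonov-style regularization avoids blow-up where |X| is small.
            H = Y * np.conj(X) / (np.abs(X) ** 2 + eps)
            return np.fft.irfft(H, n)  # head-related impulse response

        # Sanity check with a known 3-tap "head" filter.
        rng = np.random.default_rng(0)
        x = rng.standard_normal(4096)        # excitation (e.g., a sweep or noise)
        h_true = np.array([0.0, 1.0, 0.5])   # toy impulse response
        y = np.convolve(x, h_true)           # simulated ear-canal recording
        h_est = estimate_hrtf(x, y)
        print(np.round(h_est[:3], 3))        # ~ [0. 1. 0.5]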

    Current Use and Future Perspectives of Spatial Audio Technologies in Electronic Travel Aids

    Electronic travel aids (ETAs) have been in focus since technology allowed designing relatively small, light, and mobile devices for assisting the visually impaired. Since visually impaired persons rely on spatial audio cues as their primary sense of orientation, providing an accurate virtual auditory representation of the environment is essential. This paper gives an overview of the current state of spatial audio technologies that can be incorporated into ETAs, with a focus on user requirements. Most currently available ETAs either fail to address user requirements or underestimate the potential of spatial sound itself, which may explain, among other reasons, why no single ETA has gained widespread acceptance in the blind community. We believe there is ample space for applying the technologies presented in this paper, with the aim of progressively bridging the gap between accessibility and accuracy of spatial audio in ETAs. This project has received funding from the European Union's Horizon 2020 Research and Innovation Programme under Grant Agreement no. 643636.

    3-D audio using loudspeakers

    Thesis (Ph.D.), Massachusetts Institute of Technology, Program in Media Arts & Sciences, 1997. Includes bibliographical references (p. 145-153). By William G. Gardner.