5,896 research outputs found

    Virtual Audio - Three-Dimensional Audio in Virtual Environments

    Three-dimensional interactive audio has a variety of potential uses in human-machine interfaces. After lagging seriously behind the visual components, the importance of sound is now becoming increasingly accepted. This paper mainly discusses the background and techniques needed to implement three-dimensional audio in computer interfaces. A case study of a system for three-dimensional audio, implemented by the author, is described in detail. The audio system was also integrated with a virtual reality system, and conclusions from user tests and use of the audio system are presented, along with proposals for future work, at the end of the paper. The thesis begins with a definition of three-dimensional audio and a survey of the human auditory system to give the reader the knowledge needed to understand what three-dimensional audio is and how human auditory perception works

    Training spatial hearing skills in virtual reality through a sound-reaching task

    Sound localization is crucial for interacting with the surrounding world. This ability can be learned over time and improved by multisensory and motor cues. In the last decade, studying the contributions of multisensory and motor cues has been facilitated by the increased adoption of virtual reality (VR). In a recent study, sound localization was trained through a task in which visual stimuli were rendered through a VR headset and auditory stimuli through a loudspeaker moved around by the experimenter. Physically reaching toward sound sources reduced sound localization errors faster, and to a greater extent, than naming the sources' positions. Interestingly, the training's efficacy extended to hearing-impaired people as well. Yet this approach is unfeasible for rehabilitation at home. Fully virtual approaches using headphone-rendered acoustic simulations have been used to study spatial hearing learning processes. In the present study, we investigate whether the effects of our reaching-based training can be observed when taking advantage of such simulations, showing that the improvement is comparable between the full-VR and blended-VR conditions. This validates training paradigms that are based entirely on portable equipment and do not require an external operator, opening new perspectives in the field of remote rehabilitation

    Sensitivity to Angular and Radial Source Movements as a Function of Acoustic Complexity in Normal and Impaired Hearing

    In contrast to static sounds, spatially dynamic sounds have received little attention in psychoacoustic research so far. This holds true especially for acoustically complex (reverberant, multisource) conditions and impaired hearing. The current study therefore investigated the influence of reverberation and the number of concurrent sound sources on source movement detection in young normal-hearing (YNH) and elderly hearing-impaired (EHI) listeners. A listening environment based on natural environmental sounds was simulated using virtual acoustics and rendered over headphones. Both near-far (‘radial’) and left-right (‘angular’) movements of a frontal target source were considered. The acoustic complexity was varied by adding static lateral distractor sound sources as well as reverberation. Acoustic analyses confirmed the expected changes in stimulus features that are thought to underlie radial and angular source movements under anechoic conditions and suggested a special role of monaural spectral changes under reverberant conditions. Analyses of the detection thresholds showed that, with the exception of the single-source scenarios, the EHI group was less sensitive to source movements than the YNH group, despite adequate stimulus audibility. Adding static sound sources clearly impaired the detectability of angular source movements for the EHI (but not the YNH) group. Reverberation, on the other hand, clearly impaired radial source movement detection for the EHI (but not the YNH) listeners. These results illustrate the feasibility of studying factors related to auditory movement perception with the help of the developed test setup
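The radial and angular movements studied above rest on different acoustic cues: a radially moving source mainly changes overall level, while an angularly moving source mainly changes interaural differences. A minimal anechoic sketch of both cues (the specific formulas, the simplified Woodworth ITD approximation, and all numeric values are illustrative assumptions, not taken from the study):

```python
import math

def level_change_db(d_start_m, d_end_m):
    """Free-field level change for a point source moving radially:
    20*log10(d_start/d_end), i.e. +6 dB per halving of distance."""
    return 20.0 * math.log10(d_start_m / d_end_m)

def itd_seconds(azimuth_deg, head_radius_m=0.0875, c=343.0):
    """Woodworth approximation of the interaural time difference for a
    distant source at a given azimuth; angular movement changes this cue."""
    az = math.radians(azimuth_deg)
    return (head_radius_m / c) * (az + math.sin(az))

print(round(level_change_db(2.0, 1.0), 2))  # source approaching from 2 m to 1 m
print(round(itd_seconds(90.0) * 1e6))       # ITD in microseconds at the side
```

Under reverberation the level cue is partly masked by the roughly constant reverberant energy, which is consistent with the special role the abstract attributes to monaural spectral changes in reverberant conditions.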

    Risk-driven behaviour in the African leopard: how is leopard behaviour mediated by lion presence?

    Agricultural expansion is restricting many carnivore species to smaller tracts of land, potentially forcing increased levels of overlap between competitors by constraining spatial partitioning. Understanding encounters between competitors is important because competition can influence species densities, distributions, and reproductive success. Despite this, little is known of the mechanisms that mediate coexistence between the African leopard (Panthera pardus) and its competitors. This project used GPS radiocollar data and playback experiments to understand risk-driven changes in the leopard’s behaviour and movement during actual and perceived encounters with lions (Panthera leo). Targeted playbacks of lion roars were used to elucidate immediate and short-lived behavioural responses in leopards when lions were perceived to be within the immediate area. To investigate the post-encounter spatial dynamics of leopard movements, the project used datasets from high-resolution GPS radiocollars deployed on leopards and lions with overlapping territories in the Okavango Delta, Botswana. Leopards were found to adapt behaviours and movements when lions were perceived to be nearby. Specifically, roar playbacks elicited longer periods of vigilance than controls, and movement directions were influenced by speaker locations. Further, leopard movements were quicker and more directional after encountering lions. However, adjustments in behaviour and movement were short-lived. The results provide insights into mechanisms used by the leopard to coexist with its competitors and are a useful case study of the methods that could be used to investigate encounter dynamics within other systems

    Influence of Auditory Cues on the visually-induced Self-Motion Illusion (Circular Vection) in Virtual Reality

    This study investigated whether the visually induced self-motion illusion (“circular vection”) can be enhanced by adding a matching auditory cue (the sound of a fountain that is also visible in the visual stimulus). Twenty observers viewed rotating photorealistic pictures of a market place projected onto a curved projection screen (FOV: 54° × 45°). Three conditions were randomized in a repeated-measures within-subject design: no sound, mono sound, and spatialized sound using a generic head-related transfer function (HRTF). Adding mono sound increased convincingness ratings marginally but did not affect any of the other measures of vection or presence. Spatializing the fountain sound, however, improved vection (convincingness and vection build-up time) and presence ratings significantly. Notably, facilitation was found even though the visual stimulus was of high quality and realism and known to be a powerful vection-inducing stimulus. Thus, HRTF-based auralization over headphones can be employed to improve visual VR simulations both in terms of self-motion perception and overall presence
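The core of HRTF-based auralization as described above is convolving a dry mono signal with a left- and a right-ear head-related impulse response (HRIR). A minimal sketch; the 3-tap "HRIRs" are toy values standing in for a real measured HRTF, not data from the study:

```python
import numpy as np

def spatialize(mono, hrir_left, hrir_right):
    """Render a mono signal at the position encoded by an HRIR pair.

    Convolving the dry signal with the left- and right-ear head-related
    impulse responses yields a binaural signal for headphone playback.
    """
    left = np.convolve(mono, hrir_left)
    right = np.convolve(mono, hrir_right)
    return np.stack([left, right], axis=1)  # shape: (samples, 2)

# Toy example: a click rendered with dummy 3-tap "HRIRs" that attenuate
# and delay the far (right) ear relative to the near (left) ear.
click = np.array([1.0, 0.0, 0.0])
out = spatialize(click,
                 hrir_left=np.array([0.9, 0.1, 0.0]),
                 hrir_right=np.array([0.0, 0.5, 0.1]))
print(out.shape)  # each channel is len(mono) + len(hrir) - 1 samples long
```

In practice the HRIR pair is selected (or interpolated) from a measured set, such as a generic HRTF database, according to the source's direction relative to the listener's head.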

    Measuring Spatial Hearing Abilities in Listeners with Simulated Unilateral Hearing Loss

    Spatial hearing is the ability to use auditory cues to determine the location, direction, and distance of sounds in space. Listeners with unilateral hearing loss (UHL) typically have difficulty understanding speech in the presence of competing sound, likely due to their lack of access to spatial cues. The assessment of spatial hearing abilities in individuals with UHL is of growing clinical interest, particularly for everyday listening environments. Current approaches used to measure spatial hearing abilities include Spatial Release from Masking (SRM), the Binaural Intelligibility Level Difference (BILD), and the Listening in Spatialized Noise-Sentences (LiSN-S) test. Spatial Release from Masking is the improvement in speech recognition thresholds (SRTs) when the target and masker are spatially separated as opposed to co-located, measured in a sound-field setup. The LiSN-S test also measures the improvement in SRTs when the target and masker are spatially separated. Although similar, the LiSN-S uses a more clinically accessible procedure by simulating a three-dimensional auditory environment under headphones. Akin to the LiSN-S, the BILD also uses headphones, but instead elicits improved SRTs by presenting target speech 180° out of phase to one ear rather than in phase to both ears. The purposes of this study were (a) to determine whether patterns of individual variability were similar across the three measures for 30 adults with normal hearing and 28 adults with simulated UHL and (b) to evaluate the effects of simulated UHL on performance. Results confirmed that all three tests were sensitive measures of binaural hearing deficits in participants with UHL. Although all measures were correlated with each other, only the measures conducted under headphones (BILD and LiSN-S) were influenced by the magnitude of asymmetry. These findings suggest that although the measures produce similar results, they may reflect different aspects of binaural processing
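The SRM measure described above reduces to a difference between two speech reception thresholds. A minimal sketch with hypothetical SRT values (the numbers are illustrative, not from the study):

```python
def spatial_release_from_masking(srt_colocated_db, srt_separated_db):
    """SRM in dB: how much the speech reception threshold improves when
    the masker is moved away from the target.

    Lower (more negative) SRTs mean better performance, so the release is
    co-located minus separated; positive values indicate spatial benefit.
    """
    return srt_colocated_db - srt_separated_db

# Hypothetical listener: SRT of -2 dB with target and masker co-located
# at 0 degrees, improving to -10 dB with the masker moved to the side.
print(spatial_release_from_masking(-2.0, -10.0))  # 8.0 dB of release
```

Reduced or absent SRM in a listener with UHL would reflect the loss of access to the binaural cues that normally separate target from masker.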

    Real virtuality: emerging technology for virtually recreating reality


    Auditory distance perception in humans: a review of cues, development, neuronal bases, and effects of sensory loss.

    Auditory distance perception plays a major role in spatial awareness, enabling location of objects and avoidance of obstacles in the environment. However, it remains under-researched relative to studies of the directional aspect of sound localization. This review focuses on the following four aspects of auditory distance perception: cue processing, development, consequences of visual and auditory loss, and neurological bases. The various auditory distance cues differ in their effective ranges in peripersonal and extrapersonal space. The primary cues are sound level, reverberation, and frequency. Nonperceptual factors, including the importance of the auditory event to the listener, can also affect perceived distance. Basic internal representations of auditory distance emerge at approximately 6 months of age in humans. Although visual information plays an important role in calibrating auditory space, sensorimotor contingencies can be used for calibration when vision is unavailable. Blind individuals often manifest supranormal abilities to judge relative distance but show a deficit in absolute distance judgments. Following hearing loss, the use of sound level as a distance cue remains robust, while the reverberation cue becomes less effective. Previous studies have not found evidence that hearing-aid processing affects perceived auditory distance. Studies investigating the brain areas involved in processing different acoustic distance cues are described. Finally, suggestions are given for further research on auditory distance perception, including broader investigation of how background noise and multiple sound sources affect perceived auditory distance for those with sensory loss. The research was supported by MRC grant G0701870 and the Vision and Eye Research Unit (VERU), Postgraduate Medical Institute at Anglia Ruskin University. This is the final version of the article. It first appeared from Springer via http://dx.doi.org/10.3758/s13414-015-1015-
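One reverberation-related distance cue mentioned above is the direct-to-reverberant ratio (DRR): direct energy falls with distance while diffuse reverberant energy stays roughly constant. A minimal sketch under that simplifying diffuse-field assumption (the critical-distance formulation is a textbook idealization, not the review's own analysis):

```python
import math

def direct_to_reverberant_ratio_db(distance_m, critical_distance_m):
    """DRR under idealized diffuse-field assumptions: it falls about
    6 dB per doubling of source distance, crossing 0 dB at the critical
    distance, where direct and reverberant energy are equal."""
    return 20.0 * math.log10(critical_distance_m / distance_m)

# Hypothetical room with a critical distance of 2 m:
print(direct_to_reverberant_ratio_db(2.0, 2.0))            # 0 dB at 2 m
print(round(direct_to_reverberant_ratio_db(4.0, 2.0), 2))  # twice as far
```

Because the DRR degrades systematically with distance, its reduced usability after hearing loss is consistent with the review's observation that the reverberation cue becomes less effective while the level cue remains robust.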

    Binaural Technique: Basic Methods for Recording, Synthesis, and Reproduction
