
    Effects of Coordinated Bilateral Hearing Aids and Auditory Training on Sound Localization

    This thesis has two main objectives: 1) evaluating the benefits of bilateral coordination of the hearing aids' Digital Signal Processing (DSP) features by measuring and comparing auditory performance with and without this coordination active, and 2) evaluating the benefits of acclimatization and auditory training on that performance, and determining whether training in one aspect of auditory performance (sound localization) generalizes to improvement in another aspect (speech intelligibility in noise), and to what extent. Two studies were performed. The first evaluated speech intelligibility in noise and horizontal sound localization in hearing-impaired (HI) listeners using hearing aids that apply bilateral coordination of Wide Dynamic Range Compression (WDRC). Sound localization improved significantly with bilateral coordination on compared to off, while speech intelligibility in noise did not appear to be affected. The second study extended the first: after a suitable acclimatization period, participants were divided into training and control groups, and only the training group received auditory training. The training group performed significantly better than the control group in some conditions, in both the speech intelligibility and the localization tasks. Bilateral coordination had no significant effect on the results of the second study. This work is among the early literature investigating the impact of bilateral coordination in hearing aids on users' auditory performance, and it is the first to demonstrate the effect of auditory training in sound localization on speech intelligibility performance.

    Follow the Sound : Design of mobile spatial audio applications for pedestrian navigation

    Auditory displays are slower than graphical user interfaces; we believe spatial audio can change that. Human perception localizes sound sources using psychoacoustical cues, and spatial audio reproduces these cues over headphones to create virtual sound source positions. The spatial attribute of sound can therefore be used to produce richer and more effective auditory displays. This work proposes a set of interaction design guidelines for the use of spatial audio displays in a mobile context, inferred from psychoacoustical theory, design theory, and experience with prototype development. The horizontal front arc is identified as the optimal area for sound localization, and head- or body-tracking is found to be highly beneficial. Blind and visually impaired pedestrians may use auditory displays on mobile devices as navigation aids; such aids have the potential to give visually impaired users access to the environment and independence of movement. Custom-made hardware is not always needed, as today's smartphones offer a powerful platform for specialized applications. The Sound Guide prototype application was developed for the Apple iPhone and offered route guidance through the spatial positioning of audio icons. Real-time directional guidance was achieved through the use of GPS, a compass sensor, and a gyroscope sensor. Spatial audio was accomplished through prefiltered audio tracks representing a 360° horizontal circle around the user. The source code of this prototype is made available to the community. Field tests of the prototype were conducted with three visually impaired participants and one visually impaired pilot tester, each navigating one route with the help of the prototype. Interviews were conducted to gather background information on navigation for visually impaired pedestrians, to see how the prototype was received by visually impaired test users, and to learn what could be done to improve the concept in later development.
Even though the prototype suffered from technical instabilities during the field tests, the general responses were positive. The blind participants saw potential in this technology and in how it could be used to provide directional information. A range of improvements to the concept has been proposed.
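The abstract above describes real-time guidance from GPS position, compass heading, and a circle of prefiltered audio tracks. As a rough illustration of how such track selection could work (this is not the thesis's actual source code; the 30° track spacing and function names are assumptions for the sketch), the device would compute the bearing to the next waypoint, subtract the user's heading, and pick the nearest prefiltered track:

```python
import math

def bearing_to_waypoint(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from the user to the waypoint, in degrees [0, 360)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360

def select_track(bearing_deg, heading_deg, step_deg=30):
    """Pick the index of the prefiltered track whose azimuth is closest to the
    waypoint's direction relative to the user's current compass heading."""
    relative = (bearing_deg - heading_deg) % 360  # waypoint direction in head-relative terms
    n_tracks = 360 // step_deg
    return round(relative / step_deg) % n_tracks
```

With 30° spacing there are 12 tracks; a waypoint due east of a north-facing user (relative azimuth 90°) selects track index 3, i.e. the recording prefiltered for 90° to the right.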

    Making Spatial Information Accessible on Touchscreens for Users who are Blind and Visually Impaired

    Touchscreens have become the de facto standard input method for mobile devices, as they make optimal use of the limited input and output space imposed by the devices' form factor. In recent years, people who are blind and visually impaired have been increasing their usage of smartphones and touchscreens. Although basic access is available, many accessibility issues remain to be addressed before this population achieves full inclusion. One important challenge lies in accessing and creating spatial information on touchscreens. The work presented here provides three new techniques, using three different modalities, for accessing spatial information on touchscreens. The first system makes geometry and diagram creation accessible on a touchscreen through text-to-speech and gestural input; this first study is informed by a qualitative study of how people who are blind and visually impaired currently access and create graphs and diagrams. The second system makes map directions accessible using multiple vibration sensors, without any sound or visual output. The third system investigates the use of binaural sound on a touchscreen to make various types of applications accessible, such as physics simulations, astronomy, and video games.

    NAV-VIR: an audio-tactile virtual environment to assist visually impaired people

    This paper introduces the NAV-VIR system, a multimodal virtual environment to assist visually impaired people in virtually discovering and exploring unknown areas from the safety of their home. The originality of NAV-VIR resides in (1) an optimized representation of the surrounding topography, the spatial gist, based on human spatial cognition models and the sensorimotor supplementation framework, and (2) a multimodal orientation-aware immersive virtual environment relying on two synergetic interfaces: an interactive force-feedback tablet, the F2T, and an immersive HRTF-based 3D audio simulation relying on binaural recordings of real environments. This paper presents NAV-VIR's functionalities and its preliminary evaluation through a simple shape and movement perception task.

    Current Use and Future Perspectives of Spatial Audio Technologies in Electronic Travel Aids

    Electronic travel aids (ETAs) have been in focus since technology allowed the design of relatively small, light, and mobile devices for assisting the visually impaired. Since visually impaired persons rely on spatial audio cues as their primary sense of orientation, providing an accurate virtual auditory representation of the environment is essential. This paper gives an overview of the current state of spatial audio technologies that can be incorporated into ETAs, with a focus on user requirements. Most currently available ETAs either fail to address user requirements or underestimate the potential of spatial sound itself, which may explain, among other reasons, why no single ETA has gained widespread acceptance in the blind community. We believe there is ample space for applying the technologies presented in this paper, with the aim of progressively bridging the gap between accessibility and accuracy of spatial audio in ETAs. This project has received funding from the European Union's Horizon 2020 Research and Innovation Programme under Grant Agreement no. 643636. Peer reviewed.

    Spatial Auditory Maps for Blind Travellers

    Empirical research shows that blind persons who have the ability and opportunity to access geographic map information tactually benefit in their mobility. Unfortunately, tangible maps are not available in large numbers; economics is the leading explanation, as tangible maps are expensive to build, duplicate, and distribute. SAM, short for Spatial Auditory Map, is a prototype created to address the unavailability of tangible maps. SAM presents geographic information to a blind person encoded in sound. A blind person receives maps electronically and accesses them using a small, inexpensive digitizing tablet connected to a PC. The interface provides location-dependent sound as the user manipulates a stylus, plus a schematic visual representation for users with residual vision. The assessment of SAM with a group of blind participants suggests that blind users can learn unknown environments as complex as those represented by tactile maps, in the same amount of reading time. This research opens new avenues in visualization techniques, promotes alternative communication methods, and proposes a human-computer interaction framework for conveying map information to a blind person.

    Personalization in object-based audio for accessibility : a review of advancements for hearing impaired listeners

    Hearing loss is widespread and significantly impacts an individual's ability to engage with broadcast media. Access can be improved through new object-based audio personalization methods. Drawing on the literature on hearing loss and intelligibility, this paper develops three dimensions which are evidenced to improve intelligibility: spatial separation, speech-to-noise ratio, and redundancy. These can be personalized, individually or concurrently, using object-based audio. A systematic review of all work in object-based audio personalization is then undertaken. The three dimensions are used to evaluate each project's approach to personalization, identifying successful approaches, commercial challenges, and the next steps required to ensure continuing improvements to broadcast audio for hard-of-hearing individuals.
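    Of the three dimensions named above, speech-to-noise ratio maps most directly onto per-object gain control, since object-based audio delivers dialogue and background as separate objects mixed at the receiver. As a minimal sketch of that idea (the function names, the category labels, and the 6 dB default boost are illustrative assumptions, not values from the review), a receiver could apply a listener-chosen dialogue boost before summing objects:

```python
def db_to_linear(db):
    """Convert a decibel gain offset to a linear amplitude factor."""
    return 10 ** (db / 20)

def personalize_mix(objects, speech_boost_db=6.0):
    """Boost dialogue objects relative to background objects, raising the
    effective speech-to-noise ratio for the listener.

    `objects` maps an object name to (samples, category), where category is
    e.g. "dialogue" or "background"."""
    personalized = {}
    for name, (samples, category) in objects.items():
        gain = db_to_linear(speech_boost_db) if category == "dialogue" else 1.0
        personalized[name] = [s * gain for s in samples]
    return personalized
```

A 6 dB boost roughly doubles the dialogue amplitude relative to the background; in a real object-based system the same mechanism could also reposition objects (spatial separation) or enable a redundant commentary object, covering the other two dimensions.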