
    Echoic Sensory Substitution Information in a Single Obstacle Circumvention Task

    Accurate motor control is required when walking around obstacles in order to avoid collisions. When vision is unavailable, sensory substitution can be used to improve locomotion through the environment. Tactile sensory substitution devices (SSDs) are electronic travel aids, some of which indicate the distance of an obstacle using the rate of vibration of a transducer on the skin. We investigated how accurately such an SSD guided navigation in an obstacle circumvention task. Using an SSD, 12 blindfolded participants navigated around a single flat 0.6 × 2 m obstacle. A three-dimensional Vicon motion capture system was used to quantify various kinematic indices of human movement. Navigation performance under full vision was used as a baseline for comparison. The obstacle position was varied from trial to trial relative to the participant, being placed at two distances, 25 cm to the left or right, or directly ahead. Under SSD guidance, participants navigated without collision in 93% of trials. No collisions occurred under visual guidance. Buffer space (clearance between the obstacle and shoulder) was larger by a factor of 2.1 with SSD guidance than with visual guidance, movement times were longer by a factor of 9.4, and the number of velocity corrections was larger by a factor of 5 (all p < 0.05). Participants passed the obstacle on the side affording the most space in the majority of trials for both SSD and visual guidance conditions. The results are consistent with the idea that SSD information can be used to generate a protective envelope during locomotion in order to avoid collisions when navigating around obstacles, and to pass on the side of the obstacle affording the most space.
    Vision and Eye Research Unit, Postgraduate Medical Institute at Anglia Ruskin University; Medical Research Council (Grant ID: G0701870). This is the final version of the article. It first appeared from the Public Library of Science via http://dx.doi.org/10.1371/journal.pone.016087
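    As a rough illustration of the distance-to-vibration-rate coding described above, the sketch below maps obstacle distance to a vibrotactile pulse rate. The linear mapping, the 3 m sensing range, and the rate bounds are assumptions for demonstration, not the parameters of any particular device.

```python
# Illustrative sketch only: map obstacle distance to a vibrotactile pulse
# rate, with closer obstacles producing faster vibration. The sensing range
# and rate bounds are assumed values, not those of a real SSD.

def vibration_rate_hz(distance_m: float,
                      max_range_m: float = 3.0,
                      min_rate_hz: float = 1.0,
                      max_rate_hz: float = 30.0) -> float:
    """Return a pulse rate that increases as the obstacle gets closer."""
    if distance_m >= max_range_m:
        return 0.0  # beyond sensing range: no vibration
    # Linear interpolation: 0 m -> max rate, max_range_m -> min rate.
    frac = 1.0 - distance_m / max_range_m
    return min_rate_hz + frac * (max_rate_hz - min_rate_hz)

for d in (0.25, 0.5, 1.0, 2.0, 3.5):
    print(f"{d:4.2f} m -> {vibration_rate_hz(d):5.1f} Hz")
```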

    Auditory distance perception in humans: a review of cues, development, neuronal bases, and effects of sensory loss

    Auditory distance perception plays a major role in spatial awareness, enabling the location of objects and avoidance of obstacles in the environment. However, it remains under-researched relative to studies of the directional aspect of sound localization. This review focuses on four aspects of auditory distance perception: cue processing, development, consequences of visual and auditory loss, and neurological bases. The various auditory distance cues differ in their effective ranges in peripersonal and extrapersonal space. The primary cues are sound level, reverberation, and frequency. Nonperceptual factors, including the importance of the auditory event to the listener, can also affect perceived distance. Basic internal representations of auditory distance emerge at approximately 6 months of age in humans. Although visual information plays an important role in calibrating auditory space, sensorimotor contingencies can be used for calibration when vision is unavailable. Blind individuals often manifest supranormal abilities to judge relative distance but show a deficit in absolute distance judgments. Following hearing loss, the use of sound level as a distance cue remains robust, while the reverberation cue becomes less effective. Previous studies have not found evidence that hearing-aid processing affects perceived auditory distance. Studies investigating the brain areas involved in processing different acoustic distance cues are described. Finally, suggestions are given for further research on auditory distance perception, including broader investigation of how background noise and multiple sound sources affect perceived auditory distance for those with sensory loss.
    The research was supported by MRC grant G0701870 and the Vision and Eye Research Unit (VERU), Postgraduate Medical Institute at Anglia Ruskin University. This is the final version of the article. It first appeared from Springer via http://dx.doi.org/10.3758/s13414-015-1015-
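    To make the sound-level cue concrete: in a free field, received level falls by roughly 6 dB for each doubling of source distance, so a level difference between two presentations implies a distance ratio. The sketch below applies this textbook inverse-square relationship; it is an idealization that reverberation and nearby surfaces distort in real rooms.

```python
# Free-field level cue for distance: level falls ~6 dB per doubling of
# source distance (inverse-square law). Reverberation weakens this
# relationship indoors, which is why this is only an idealized sketch.

def distance_ratio(level_drop_db: float) -> float:
    """Distance ratio implied by a given drop in received level (free field)."""
    return 10 ** (level_drop_db / 20.0)

print(distance_ratio(6.0))   # ~2.0: 6 dB quieter -> about twice as far
print(distance_ratio(20.0))  # 10.0: 20 dB quieter -> ten times farther
```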

    A Framework to Account for the Effects of Visual Loss on Human Auditory Abilities

    Until recently, a commonly held view was that blindness resulted in enhanced auditory abilities, underpinned by the beneficial effects of cross-modal neuroplasticity. This viewpoint has been challenged by studies showing that blindness results in poorer performance on some auditory spatial tasks. It is now clear that visual loss does not result in a general increase or decrease in all auditory abilities. Although several hypotheses have been proposed to explain why certain auditory abilities are enhanced while others are degraded, these are often limited to a specific subset of tasks. A comprehensive explanation encompassing auditory abilities assessed in fully blind and partially sighted populations and spanning spatial and non-spatial cognition has not so far been proposed. The current article proposes a framework comprising a set of nine principles that can be used to predict whether auditory abilities are enhanced or degraded. The validity of these principles is assessed by comparing their predictions with a wide range of empirical evidence concerning the effects of visual loss on spatial and non-spatial auditory abilities. Developmental findings and the effects of early- versus late-onset visual loss are discussed. Ways of improving auditory abilities for individuals with visual loss and reducing auditory spatial deficits are summarized. A new Perceptual Restructuring Hypothesis is proposed within the framework, positing that the auditory system is restructured to provide the most accurate information possible given the loss of the visual signal, utilizing the available cortical resources, with different auditory abilities being enhanced or degraded according to the nine principles.

    An assessment of auditory-guided locomotion in an obstacle circumvention task

    This study investigated how effectively audition can be used to guide navigation around an obstacle. Ten blindfolded, normally sighted participants navigated around a 0.6 × 2 m obstacle while producing self-generated mouth-click sounds. Objective movement performance was measured using a Vicon motion capture system. Performance with full vision, without generating sound, was used as a baseline for comparison. The obstacle's location was varied randomly from trial to trial: it was either straight ahead or 25 cm to the left or right relative to the participant. Although audition provided sufficient information to detect the obstacle and guide participants around it without collision in the majority of trials, buffer space (clearance between the shoulder and obstacle), overall movement times, and the number of velocity corrections were significantly (p < 0.05) greater with auditory guidance than with visual guidance. Mean buffer space was 1.8 times larger under auditory than under visual guidance. Collisions sometimes occurred under auditory guidance, suggesting that audition did not always provide an accurate estimate of the space between the participant and the obstacle. Unlike under visual guidance, participants did not always walk around the side that afforded the most space under auditory guidance. Results suggest that sound can be used to generate buffer space when vision is unavailable, allowing navigation around an obstacle without collision in the majority of trials.
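    A minimal sketch of how a buffer-space index of this kind could be computed from motion-capture trajectories is shown below. The trajectory and obstacle coordinates are hypothetical, and this is not the study's actual analysis pipeline.

```python
import numpy as np

# Illustrative sketch (not the study's analysis code): compute buffer space
# as the minimum horizontal clearance between a shoulder-marker trajectory
# and the nearest edge of a flat obstacle.

def buffer_space(shoulder_xy: np.ndarray, obstacle_edge_xy: np.ndarray) -> float:
    """Minimum distance (m) from any shoulder sample to the obstacle edge.

    shoulder_xy: array of shape (n_frames, 2); obstacle_edge_xy: shape (2,).
    """
    dists = np.linalg.norm(shoulder_xy - obstacle_edge_xy, axis=1)
    return float(dists.min())

# Hypothetical straight path passing 0.3 m to the side of an edge at (0, 2).
traj = np.column_stack([np.full(50, 0.3), np.linspace(0.0, 4.0, 50)])
print(f"buffer space: {buffer_space(traj, np.array([0.0, 2.0])):.2f} m")
```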

    The accuracy of auditory spatial judgments in the visually impaired is dependent on sound source distance

    Funder: This research was supported by the Vision and Eye Research Institute, School of Medicine at Anglia Ruskin University.
    Blindness leads to substantial enhancements in many auditory abilities, and deficits in others. It is unknown how severe visual losses need to be before changes in auditory abilities occur, or whether the relationship between the severity of visual loss and changes in auditory abilities is proportional and systematic. Here we show that greater severity of visual loss is associated with larger auditory judgments of distance and room size. On average, participants with severe visual losses perceived sounds to be twice as far away, and rooms to be three times larger, than sighted controls. Distance estimates for sighted controls were most accurate for closer sounds and least accurate for farther sounds. As the severity of visual impairment increased, accuracy decreased for closer sounds and increased for farther sounds. However, it is for closer sounds that accurate judgments are needed to guide rapid motor responses to auditory events, e.g., when planning a safe path through a busy street to avoid collisions with other people and falls. Interestingly, greater severity of visual impairment was associated with more accurate room size estimates. The results support a new hypothesis that crossmodal calibration of audition by vision depends on the severity of visual loss.

    Partial visual loss disrupts the relationship between judged room size and sound source distance

    Funder: Vision and Eye Research Institute, School of Medicine, Faculty of Health, Education, Medicine and Social Care, Anglia Ruskin University.
    Visual spatial information plays an important role in calibrating auditory space. Blindness results in deficits in a number of auditory abilities, which have been explained in terms of the hypothesis that visual information is needed to calibrate audition. When judging the size of a novel room when only auditory cues are available, normally sighted participants may use the location of the farthest sound source to infer the nearest possible distance of the far wall. However, for people with partial visual loss (distinct from blindness in that some vision is present), such a strategy may not be reliable if vision is needed to calibrate auditory cues for distance. In the current study, participants were presented with sounds at different distances (ranging from 1.2 to 13.8 m) in a simulated reverberant (T60 = 700 ms) or anechoic room. Farthest-distance judgments and room size judgments (volume and area) were obtained from blindfolded participants (18 normally sighted, 38 partially sighted) for speech, music, and noise stimuli. For sighted participants, judged room volume and farthest sound source distance estimates were positively correlated (p < 0.05) for all conditions. Participants with visual losses showed no significant correlations for any of the conditions tested. A similar pattern of results was observed for the correlations between farthest-distance and room floor area estimates. The results demonstrate that partial visual loss disrupts the relationship between judged room size and sound source distance that is shown by sighted participants.
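    The correlation analysis described above can be illustrated with a short sketch. The values below are invented for demonstration, and the choice of a Pearson correlation is an assumption; the abstract reports only that the estimates were positively correlated.

```python
import numpy as np
from scipy.stats import pearsonr

# Illustrative sketch: test whether judged room volume scales with the
# judged distance of the farthest sound source. All values are hypothetical.

farthest_dist_m = np.array([3.1, 5.2, 7.8, 9.5, 12.0, 13.4])
room_volume_m3 = np.array([40.0, 85.0, 150.0, 210.0, 320.0, 380.0])

r, p = pearsonr(farthest_dist_m, room_volume_m3)
print(f"r = {r:.2f}, p = {p:.3f}")  # a positive r mirrors the sighted group
```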

    Auditory spatial representations of the world are compressed in blind humans

    Compared to sighted listeners, blind listeners often display enhanced auditory spatial abilities, such as localization in azimuth. However, less is known about whether blind humans can accurately judge distance in extrapersonal space using auditory cues alone. Using virtualization techniques, we show that auditory spatial representations of the world beyond the peripersonal space of blind listeners are compressed compared to those of normally sighted controls. Blind participants overestimated the distance to nearby sources and underestimated the distance to remote sound sources, in both reverberant and anechoic environments, and for speech, music, and noise signals. Functions relating judged to actual virtual distance were well fitted by compressive power functions, indicating that the absence of visual information regarding the distance of sound sources may prevent accurate calibration of the distance information provided by auditory signals.
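    The compressive power-function fit mentioned above can be illustrated as follows. The judged-distance values are invented for demonstration; fitting a straight line in log-log coordinates is one standard way to estimate the exponent, though the authors' exact fitting procedure is not specified here.

```python
import numpy as np

# Illustrative sketch: fit judged = k * actual**a. An exponent a < 1
# indicates compression (near sources overestimated, far underestimated).
# The judgment values below are hypothetical.

actual = np.array([1.0, 2.0, 4.0, 8.0, 16.0])  # virtual source distance (m)
judged = np.array([1.3, 2.1, 3.2, 4.9, 7.4])   # hypothetical judgments (m)

# A power law is linear in log-log coordinates: log j = log k + a * log d.
a, log_k = np.polyfit(np.log(actual), np.log(judged), 1)
print(f"judged ~= {np.exp(log_k):.2f} * actual**{a:.2f}")  # a < 1: compressive
```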

    Comparison of auditory spatial bisection and minimum audible angle in front, lateral, and back space

    Although vision is important for calibrating auditory spatial perception, it only provides information about frontal sound sources. Previous studies of blind and sighted people support the idea that azimuthal spatial bisection in frontal space requires visual calibration, while detection of a change in azimuth (the minimum audible angle, MAA) does not. The influence of vision on the ability to map frontal, lateral, and back space has not been investigated. Performance in spatial bisection and MAA tasks was assessed for normally sighted, blindfolded subjects using bursts of white noise presented frontally, laterally, or from the back relative to the subjects. Thresholds for both tasks were similar in frontal space, lower for the MAA task than for the bisection task in back space, and higher for the MAA task in lateral space. Two interpretations of the results are discussed: one in terms of visual calibration and the use of internal representations of source location, and the other based on comparison of the magnitude or direction of change of the available binaural cues. That bisection thresholds were increased in back space, where visual calibration information is unavailable, relative to front space suggests that an internal representation of source location was used for the bisection task.

    Auditory distance perception in front and rear space

    The distance of sound sources relative to the body can be estimated using acoustic level and direct-to-reverberant ratio cues. However, the ability to do this may differ for sounds that are in front of compared to behind the listener. One reason for this is that vision, which plays an important role in calibrating auditory distance cues early in life, is unavailable for rear space. Furthermore, the filtering of sounds by the pinnae differs depending on whether they originate from the front or the back. We investigated auditory distance discrimination in front and rear space by comparing performance for auditory spatial bisection of distance and minimum audible distance discrimination (MADD) tasks. In the bisection task, participants heard three successive bursts of noise at three different distances and indicated whether the second sound (probe) was closer in space to the first or the third sound (references). In the MADD task, participants reported which of two successive sounds was closer. An analysis of variance with factors task and region of space showed worse performance for rear than for front space, but no significant interaction between task and region of space. For the bisection task, the point of subjective equality (PSE) was slightly biased towards the body, but the absolute magnitude of the PSE did not differ between front and rear space. These results are consistent with the hypothesis that visual information is important in calibrating the auditory representation of front space in distance early in life.
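    As an illustration of how a PSE might be estimated in a bisection task of this kind, the sketch below fits a logistic psychometric function to hypothetical response proportions and reads off its 50% point. The function form and the data are assumptions, not the study's analysis.

```python
import numpy as np
from scipy.optimize import curve_fit

# Illustrative sketch: estimate a point of subjective equality (PSE) for a
# distance-bisection task by fitting a logistic psychometric function to the
# proportion of "probe closer to far reference" responses. Data hypothetical.

def logistic(x, pse, slope):
    return 1.0 / (1.0 + np.exp(-(x - pse) / slope))

probe_offset_m = np.array([-0.4, -0.2, 0.0, 0.2, 0.4])  # from true midpoint
p_far = np.array([0.05, 0.20, 0.60, 0.85, 0.97])        # response proportions

(pse, slope), _ = curve_fit(logistic, probe_offset_m, p_far, p0=(0.0, 0.1))
print(f"PSE = {pse:+.2f} m")  # negative: midpoint judged nearer the body
```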

    Sensory substitution information informs locomotor adjustments when walking through apertures

    The study assessed the ability of the central nervous system (CNS) to use echoic information from sensory substitution devices (SSDs) to rotate the shoulders and safely pass through apertures of different widths. Ten visually normal participants performed this task with full vision, or blindfolded using an SSD to obtain information regarding the width of an aperture created by two parallel panels. Two SSDs were tested. Participants passed through apertures of +0%, +18%, +35%, and +70% of measured body width. Kinematic indices included movement time, shoulder rotation, average walking velocity across the trial, and peak walking velocities before crossing, after crossing, and across the whole trial. Analyses showed that participants used SSD information to regulate shoulder rotation, with greater rotation associated with narrower apertures. Rotations made using an SSD were greater than those made using vision, movement times were longer, average walking velocity was lower, and peak velocities before crossing, after crossing, and across the whole trial were smaller, suggesting greater caution. Collisions sometimes occurred using an SSD but not using vision, indicating that the substituted information did not always result in accurate shoulder rotation judgements. No differences were found between the two SSDs. The data suggest that spatial information provided by sensory substitution allows the relative positions of the aperture panels to be internally represented, enabling the CNS to modify shoulder rotation according to aperture width. The increased buffer space indicated by greater rotations (up to approximately 35% for apertures of +18% of body width) suggests that the spatial representations are not as accurate as those afforded by full vision.
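    The geometric link between aperture width and the shoulder rotation needed to pass can be made explicit: rotating the shoulders by an angle theta reduces the body's effective frontal width to shoulder width × cos(theta). The sketch below computes the minimum rotation for a chosen safety margin; the margin is an assumed value, and participants in studies like this typically rotate more than this minimum.

```python
import math

# Illustrative geometry sketch (an assumption, not the study's model):
# rotating the shoulders by theta shrinks effective frontal width to
# shoulder_width * cos(theta).

def required_rotation_deg(shoulder_width_m: float,
                          aperture_width_m: float,
                          margin_m: float = 0.05) -> float:
    """Smallest shoulder rotation (deg) leaving `margin_m` of clearance."""
    target = aperture_width_m - margin_m  # effective width we must fit into
    if target >= shoulder_width_m:
        return 0.0  # aperture wide enough: no rotation needed
    return math.degrees(math.acos(target / shoulder_width_m))

# A 0.45 m wide person passing a body-width (+0%) aperture with 5 cm margin:
print(f"{required_rotation_deg(0.45, 0.45):.1f} deg")  # ~27.3 deg
```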