
    Systematic biases in human heading estimation.

    Heading estimation is vital to everyday navigation and locomotion. Despite extensive behavioral and physiological research on both visual and vestibular heading estimation over more than two decades, the accuracy of heading estimation has not yet been systematically evaluated. Therefore, human visual and vestibular heading estimation was assessed in the horizontal plane using a motion platform and stereo visual display. Heading angle was overestimated during forward movements and underestimated during backward movements in response to both visual and vestibular stimuli, indicating an overall multimodal bias toward lateral directions. Lateral biases are consistent with the overrepresentation of lateral preferred directions observed in neural populations that carry visual and vestibular heading information, including MSTd and otolith afferent populations. Due to this overrepresentation, population vector decoding yields patterns of bias remarkably similar to those observed behaviorally. Lateral biases are inconsistent with standard Bayesian accounts, which predict that estimates should be biased toward the most common straight-forward heading direction. Nevertheless, lateral biases may be functionally relevant: they effectively constitute a perceptual scale expansion around straight ahead, which could allow for more precise estimation and provide a high-gain feedback signal to facilitate maintenance of straight-forward heading during everyday navigation and locomotion.
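    The population-vector argument can be illustrated with a toy simulation. This is not the authors' model: the tuning shape, population size, and the amount of lateral overrepresentation below are all illustrative assumptions; the point is only that decoding from a population with extra mass at lateral preferred directions pulls near-forward estimates away from straight ahead.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical population of heading-tuned units: preferred directions
    # drawn from a mixture that overrepresents lateral headings (±90 deg),
    # qualitatively like the MSTd / otolith-afferent distributions described.
    n = 2000
    lateral = rng.normal(loc=np.pi / 2, scale=0.4, size=n // 2) * rng.choice([-1, 1], n // 2)
    uniform = rng.uniform(-np.pi, np.pi, n - n // 2)
    prefs = np.concatenate([lateral, uniform])

    def decode(heading):
        """Population-vector estimate of a heading angle (radians)."""
        rates = np.exp(np.cos(heading - prefs))   # von Mises-like tuning
        x = np.sum(rates * np.cos(prefs))
        y = np.sum(rates * np.sin(prefs))
        return np.arctan2(y, x)

    # A forward heading slightly off straight ahead is pulled laterally:
    true = np.deg2rad(10.0)
    est = decode(true)
    print(np.rad2deg(est))   # exceeds 10 deg: overestimation toward lateral
    ```

    The same decoder run on a uniform population would be unbiased, so the bias here comes entirely from the overrepresentation, matching the behavioral pattern described in the abstract.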

    Non-Optimal Perceptual Decision in Human Navigation

    We highlight that optimal cue combination does not represent a general principle of cue interaction during navigation, extending Rahnev & Denison’s (R&D) summary of non-optimal perceptual decisions to the navigation domain. However, we argue that the term ‘suboptimality’ does not capture the way visual and nonvisual cues interact in navigational decisions.

    How You Get There From Here: Interaction of Visual Landmarks and Path Integration in Human Navigation

    How do people combine their sense of direction with their use of visual landmarks during navigation? Cue-integration theory predicts that such cues will be optimally integrated to reduce variability, whereas cue-competition theory predicts that one cue will dominate the response direction. We tested these theories by measuring both accuracy and variability in a homing task while manipulating information about path integration and visual landmarks. We found that the two cues were near-optimally integrated to reduce variability, even when landmarks were shifted up to 90°. Yet the homing direction was dominated by a single cue, which switched from landmarks to path integration when landmark shifts were greater than 90°. These findings suggest that cue integration and cue competition govern different aspects of the homing response: cues are integrated to reduce response variability but compete to determine the response direction. The results are remarkably similar to data on animal navigation, which implies that visual landmarks reset the orientation, but not the precision, of the path-integration system.
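    The optimal-integration benchmark these cue-combination studies test against is standard maximum-likelihood weighting of two independent Gaussian cues: each cue is weighted by its reliability (inverse variance), and the combined estimate is less variable than either cue alone. A minimal sketch, with made-up reliabilities (the numbers are not from the study):

    ```python
    import numpy as np

    def mle_combine(mu_a, sigma_a, mu_b, sigma_b):
        """Maximum-likelihood fusion of two independent Gaussian cues.

        Each cue is weighted by its reliability (1/variance); the combined
        variance is smaller than either single-cue variance.
        """
        w_a = sigma_b**2 / (sigma_a**2 + sigma_b**2)
        w_b = 1.0 - w_a
        mu = w_a * mu_a + w_b * mu_b
        sigma = np.sqrt((sigma_a**2 * sigma_b**2) / (sigma_a**2 + sigma_b**2))
        return mu, sigma

    # Illustrative homing directions (deg): noisy path integration says 0,
    # a shifted visual landmark says 20.
    mu, sigma = mle_combine(mu_a=0.0, sigma_a=15.0, mu_b=20.0, sigma_b=10.0)
    print(mu, sigma)   # estimate lies between the cues; sigma below both 10 and 15
    ```

    The variance-reduction prediction (combined sigma below the better single cue) is the signature of optimal integration that the homing data matched, even while the response *direction* followed a single dominant cue.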

    Perceptual compasses: spatial navigation in multisensory environments

    Moving through space is a crucial activity in daily human life. The main objective of my Ph.D. project consisted of investigating how people exploit the multisensory sources of information available (vestibular, visual, auditory) to efficiently navigate. Specifically, my Ph.D. aimed at i) examining the multisensory integration mechanisms underlying spatial navigation; ii) establishing the crucial role of vestibular signals in spatial encoding and processing, and its interaction with environmental landmarks; iii) providing the neuroscientific basis to develop tailored assessment protocols and rehabilitation procedures to enhance orientation and mobility based on the integration of different sensory modalities, especially addressed to improve the compromised navigational performance of visually impaired (VI) people. To achieve these aims, we conducted behavioral experiments on adult participants, including psychophysical procedures, galvanic stimulation, and modeling. In particular, the experiments involved active spatial navigation tasks with audio-visual landmarks and self-motion discrimination tasks with and without acoustic landmarks using a motion platform (Rotational-Translational Chair) and an acoustic virtual reality tool. Additionally, we applied Galvanic Vestibular Stimulation to directly modulate signals coming from the vestibular system during behavioral tasks that involved interaction with audio-visual landmarks. In addition, when appropriate, we compared the obtained results with predictions coming from the Maximum Likelihood Estimation model, to verify the potential optimal integration between the available multisensory cues. i) Results on multisensory navigation showed a sub-group of integrators and another of non-integrators, revealing inter-individual differences in audio-visual processing while moving through the environment.
Finding these idiosyncrasies in a homogeneous sample of adults emphasizes the role of individual perceptual characteristics in multisensory perception, highlighting how important it is to plan tailored rehabilitation protocols considering each individual’s perceptual preferences and experiences. ii) We also found a robust inherent overestimation bias when estimating passive self-motion stimuli. This finding shed new light on how our brain processes and elaborates the available cues, building a more functional representation of the world. We also demonstrated a novel impact of the vestibular signals on the encoding of visual environmental cues without actual self-motion information. The role that vestibular inputs play in visual cue perception and spatial encoding has multiple consequences on humans’ ability to functionally navigate in space and interact with environmental objects, especially when vestibular signals are impaired due to intrinsic (vestibular disorders) or environmental conditions (altered gravity, e.g. spaceflight missions). Finally, iii) the combination of the Rotational-Translational Chair and the acoustic virtual reality tool revealed a slight improvement in self-motion perception for VI people when exploiting acoustic cues. This approach proves to be a successful technique for evaluating audio-vestibular perception and improving the spatial representation abilities of VI people, providing the basis to develop new rehabilitation procedures focused on multisensory perception. Overall, the findings resulting from my Ph.D. project broaden the scientific knowledge about spatial navigation in multisensory environments, yielding new insights into the exploration of the brain mechanisms associated with mobility, orientation, and locomotion abilities.

    The Speed, Precision and Accuracy of Human Multisensory Perception following Changes to the Visual Sense

    Human adults can combine information from multiple senses to improve their perceptual judgments. Visual and multisensory experience plays an important role in the development of multisensory integration; however, it is unclear to what extent changes in vision impact multisensory processing later in life. In particular, it is not known whether adults account for changes to the relative reliability of their senses following sensory loss, treatment or training. Using psychophysical methods, this thesis studied the multisensory processing of individuals experiencing changes to the visual sense. Chapters 2 and 3 assessed whether patients implanted with a retinal prosthesis (having been blinded by a retinal degenerative disease) could use this new visual signal with non-visual information to improve their speed or precision on multisensory tasks. Due to large differences between the reliabilities of the visual and non-visual cues, patients were not always able to benefit from the new visual signal. Chapter 4 assessed whether patients with degenerative visual loss adjust the weight given to visual and non-visual cues during audio-visual localization as their relative reliabilities change. Although some patients adjusted their reliance on vision across the visual field in line with predictions based on cue relative reliability, others (patients with visual loss limited to their central visual field only) did not. Chapter 5 assessed whether training with either more reliable or less reliable visual feedback could enable normally sighted adults to overcome an auditory localization bias. Findings suggest that visual information, irrespective of reliability, can be used to overcome at least some non-visual biases. In summary, this thesis documents multisensory changes following changes to the visual sense. The results improve our understanding of adult multisensory plasticity and have implications for successful treatments and rehabilitation following sensory loss.

    Combining Path Integration and Remembered Landmarks When Navigating without Vision

    This study investigated the interaction between remembered landmark and path integration strategies for estimating current location when walking in an environment without vision. We asked whether observers navigating without vision only rely on path integration information to judge their location, or whether remembered landmarks also influence judgments. Participants estimated their location in a hallway after viewing a target (remembered landmark cue) and then walking blindfolded to the same or a conflicting location (path integration cue). We found that participants averaged remembered landmark and path integration information when they judged that both sources provided congruent information about location, which resulted in more precise estimates compared to estimates made with only path integration. In conclusion, humans integrate remembered landmarks and path integration in a gated fashion, dependent on the congruency of the information. Humans can flexibly combine information about remembered landmarks with path integration cues while navigating without visual information.
    Funding: National Institutes of Health (U.S.) Grants T32 HD007151, T32 EY07133, F32EY019622, EY02857, EY017835-01, and EY015616-03; United States Department of Education Grant H133A011903.
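    The gated combination described above can be sketched as reliability-weighted averaging behind a congruency gate: average the two cues when they roughly agree, otherwise fall back on path integration alone. The gate threshold and cue noise values below are hypothetical illustrations, not parameters fitted by the study.

    ```python
    def gated_combine(landmark, path, sigma_l, sigma_p, gate=45.0):
        """Gated cue combination for location estimates (same units for all cues).

        If the remembered-landmark and path-integration estimates are judged
        congruent (within `gate`), return their reliability-weighted average;
        otherwise the conflicting landmark is discarded.
        """
        if abs(landmark - path) <= gate:
            w = sigma_p**2 / (sigma_l**2 + sigma_p**2)   # weight on landmark
            return w * landmark + (1.0 - w) * path
        return path   # conflict too large: rely on path integration only

    # Congruent cues are averaged (result lies between them)...
    print(gated_combine(30.0, 0.0, sigma_l=8.0, sigma_p=12.0))
    # ...while an implausible landmark is ignored entirely.
    print(gated_combine(120.0, 0.0, sigma_l=8.0, sigma_p=12.0))
    ```

    The averaging branch reproduces the precision benefit the study reports (the combined estimate is less variable than path integration alone), while the gate captures the congruency-dependent switch.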