
    Systematic biases in human heading estimation.

    Heading estimation is vital to everyday navigation and locomotion. Despite extensive behavioral and physiological research on both visual and vestibular heading estimation over more than two decades, the accuracy of heading estimation has not yet been systematically evaluated. Therefore, human visual and vestibular heading estimation was assessed in the horizontal plane using a motion platform and stereo visual display. Heading angle was overestimated during forward movements and underestimated during backward movements in response to both visual and vestibular stimuli, indicating an overall multimodal bias toward lateral directions. Lateral biases are consistent with the overrepresentation of lateral preferred directions observed in neural populations that carry visual and vestibular heading information, including MSTd and otolith afferent populations. Due to this overrepresentation, population vector decoding yields patterns of bias remarkably similar to those observed behaviorally. Lateral biases are inconsistent with standard Bayesian accounts, which predict that estimates should be biased toward the most common, straight-ahead heading direction. Nevertheless, lateral biases may be functionally relevant. They effectively constitute a perceptual scale expansion around straight ahead, which could allow for more precise estimation and provide a high-gain feedback signal to facilitate maintenance of straight-ahead heading during everyday navigation and locomotion.
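
    As an illustration of the decoding account mentioned above, here is a minimal sketch (not the paper's code; the unit count, tuning width, and bimodal lateral over-representation are illustrative assumptions) of how population vector decoding from a population whose preferred directions over-represent lateral headings produces estimates biased away from straight ahead:

```python
import numpy as np

rng = np.random.default_rng(0)

# Preferred directions drawn more densely around +/-90 deg (lateral) than 0 deg (forward);
# the bimodal distribution is an illustrative stand-in for the reported over-representation.
n_units = 2000
pref = np.deg2rad(rng.normal(loc=np.repeat([90.0, -90.0], n_units // 2), scale=40.0))

def population_vector_estimate(true_heading_deg):
    """Cosine-tuned responses decoded by summing preferred-direction vectors weighted by rate."""
    theta = np.deg2rad(true_heading_deg)
    rates = np.maximum(np.cos(theta - pref), 0.0)   # half-wave-rectified cosine tuning
    return np.rad2deg(np.arctan2(np.sum(rates * np.sin(pref)),
                                 np.sum(rates * np.cos(pref))))

for true_heading in [5.0, 15.0, 30.0, 60.0]:
    decoded = population_vector_estimate(true_heading)
    print(f"true {true_heading:5.1f} deg -> decoded {decoded:6.1f} deg "
          f"(bias {decoded - true_heading:+.1f} deg, i.e. toward lateral)")
```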

    The role of the ventral intraparietal area (VIP/pVIP) in parsing optic flow into visual motion caused by self-motion and visual motion produced by object-motion

    Retinal image motion is a composite signal that contains information about two behaviourally significant factors: self-motion and the movement of environmental objects. It is thought that the brain separates the two relevant signals, and although multiple brain regions have been identified that respond selectively to the composite optic flow signal, which brain region(s) perform the parsing process remains unknown. Here, we present original evidence that the putative human ventral intraparietal area (pVIP), a region known to receive optic flow signals as well as independent self-motion signals from other sensory modalities, plays a critical role in the parsing process and acts to isolate object-motion. We localised pVIP using its multisensory response profile and then tested its relative responses to simulated object-motion and self-motion stimuli; the results indicated that responses in pVIP were much stronger to stimuli that specified object-motion. We report two further observations that will be significant for the future direction of research in this area. First, activation in pVIP was suppressed by distant stationary objects compared to the absence of objects or closer objects. Second, we describe several other brain regions that share with pVIP selectivity for visual object-motion over visual self-motion, as well as a multisensory response.
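
    A minimal sketch, with hypothetical numbers rather than the study's data or analysis pipeline, of how an ROI's relative preference for object-motion over self-motion could be summarized once a region such as pVIP has been localized:

```python
import numpy as np

# Hypothetical per-participant mean ROI responses (percent signal change).
resp_object_motion = np.array([0.62, 0.55, 0.71, 0.48, 0.66])
resp_self_motion   = np.array([0.21, 0.18, 0.30, 0.15, 0.24])

# Index > 0 means the ROI responds more strongly to object-motion than to self-motion.
preference = (resp_object_motion - resp_self_motion) / (resp_object_motion + resp_self_motion)
print("object-motion preference per participant:", np.round(preference, 2))
print("group mean preference:", round(float(preference.mean()), 2))
```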

    A review of human sensory dynamics for application to models of driver steering and speed control.

    In comparison with the high level of knowledge about vehicle dynamics which exists nowadays, the role of the driver in the driver-vehicle system is still relatively poorly understood. A large variety of driver models exist for various applications; however, few of them take account of the driver's sensory dynamics, and those that do are limited in their scope and accuracy. A review of the literature has been carried out to consolidate information from previous studies which may be useful when incorporating human sensory systems into the design of a driver model. This includes information on sensory dynamics, delays, thresholds and integration of multiple sensory stimuli. This review should provide a basis for further study into sensory perception during driving. This work was supported by the UK Engineering and Physical Sciences Research Council (EP/P505445/1) (studentship for Nash). This is the published version. It first appeared from Springer at http://dx.doi.org/10.1007/s00422-016-0682-x
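
    As a sketch of the kind of sensory-dynamics element such a driver model would incorporate, the following simulates a washout-style semicircular-canal transfer function plus a pure perception delay for a step in yaw velocity. The transfer-function form, time constant, and delay are illustrative placeholder values, not figures taken from the review:

```python
import numpy as np
from scipy import signal

T_canal = 6.0    # washout time constant (s), illustrative assumption
delay   = 0.1    # sensory/processing delay (s), illustrative assumption

# H(s) = T*s / (T*s + 1): sensed angular velocity decays back toward zero during
# sustained rotation, which is why prolonged constant-rate turns feel like no rotation.
canal = signal.TransferFunction([T_canal, 0.0], [T_canal, 1.0])

t = np.linspace(0.0, 20.0, 2001)
omega_true = np.ones_like(t)                      # 1 rad/s yaw-velocity step input
_, omega_sensed, _ = signal.lsim(canal, U=omega_true, T=t)

# Apply the pure delay by shifting the sensed signal in time.
shift = int(delay / (t[1] - t[0]))
omega_perceived = np.concatenate([np.zeros(shift), omega_sensed[:-shift]])
print("sensed yaw rate at t=0.5 s:", round(float(omega_perceived[np.searchsorted(t, 0.5)]), 3))
print("sensed yaw rate at t=15 s :", round(float(omega_perceived[np.searchsorted(t, 15.0)]), 3))
```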

    Rotational and Translational Velocity and Acceleration Thresholds for the Onset of Cybersickness in Virtual Reality

    This paper determined rotational and translational velocity and acceleration thresholds for the onset of cybersickness. Cybersickness causes discomfort and discourages the widespread use of virtual reality systems for both recreational and professional use. Visual motion or optic flow is known to be one of the main causes of cybersickness due to the sensory conflict it creates with the vestibular system. The aim of this experiment was to detect rotational and translational velocity and acceleration thresholds that cause the onset of cybersickness. Participants were exposed to a moving particle field in virtual reality for a few seconds per run. The field moved in different directions (longitudinal, lateral, roll, and yaw), with different velocity profiles (steady and accelerating), and different densities. Using a staircase procedure that controlled the speed or acceleration of the field, we detected the threshold at which participants started to feel temporary symptoms of cybersickness. The optic flow was quantified for each motion type, and the number of features was adjusted so that the same amount of optic flow was present in each scene. Having the same optic flow in each scene allowed a direct comparison of the thresholds. The results show that the velocity and acceleration thresholds for rotational optic flow were significantly lower than for translational optic flow. The thresholds also tended to decrease with decreasing particle density of the scene. Finally, it was found that all the rotational and translational thresholds strongly correlated with each other. While the mean values of the thresholds could be used as guidelines to develop virtual reality applications, the high variability between individuals implies that individual tuning of motion controls would be more effective at reducing cybersickness while minimizing the impact on the experience of immersion.
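
    A minimal sketch of a simple adaptive staircase of the general kind described; the exact rules, step size, starting value, and stopping criterion here are assumptions, since the abstract does not specify them:

```python
import random

def run_staircase(reports_symptoms, start_speed=1.0, step=0.2, n_reversals=8):
    """reports_symptoms(speed) -> True if the participant reports cybersickness onset."""
    speed, direction = start_speed, +1
    reversals = []
    while len(reversals) < n_reversals:
        sick = reports_symptoms(speed)
        new_direction = -1 if sick else +1        # step down after a "yes", up after a "no"
        if new_direction != direction:            # a change of direction is a reversal
            reversals.append(speed)
            direction = new_direction
        speed = max(0.0, speed + direction * step)
    return sum(reversals) / len(reversals)        # threshold estimate from reversal points

# Hypothetical observer whose true threshold is 2.0 (arbitrary units) with response noise.
observer = lambda s: s + random.gauss(0.0, 0.2) > 2.0
print("estimated threshold:", round(run_staircase(observer), 2))
```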

    The Neural Correlates of Vection: An fMRI Study

    Vection is an illusion of visually induced self-motion in a stationary observer. I used different types of vection stimuli in a functional magnetic resonance imaging (fMRI) study to determine the interaction between cortical visual regions and cortical vestibular regions during vection. My findings suggest that the cingulate sulcus visual area (CSv) is heavily involved in self-motion processing. The parieto-insular vestibular cortex (PIVC) showed a significant change in blood-oxygenation-level-dependent (BOLD) signal activity during vection, but to a lesser extent than CSv. Behavioural data correlated with the neuroimaging data in CSv and PIVC, as both showed a significant difference when the radially oscillating condition was compared with the radially smooth condition, suggesting a neural correlate of the jitter effect. My results suggest that the brain region of primary importance in the self-motion debate is CSv, a region that has received little attention in the vection literature to date.
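
    A minimal sketch, with hypothetical numbers rather than the thesis data, of how such a brain-behaviour relationship could be quantified: correlating each participant's jitter effect (oscillating minus smooth vection ratings) with the corresponding BOLD difference in a region such as CSv:

```python
import numpy as np
from scipy import stats

# Hypothetical per-participant differences: oscillating-radial minus smooth-radial condition.
vection_rating_diff = np.array([1.2, 0.8, 1.5, 0.4, 1.1, 0.9, 1.6, 0.7])
csv_bold_diff       = np.array([0.30, 0.18, 0.42, 0.10, 0.27, 0.21, 0.45, 0.15])

r, p = stats.pearsonr(vection_rating_diff, csv_bold_diff)
print(f"behaviour-BOLD correlation: r = {r:.2f}, p = {p:.4f}")
```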

    Neural correlates of the processing of visually simulated self-motion

    Successful interaction with our environment requires the perception of our surroundings. For coping with everyday challenges, our own movements in this environment are important. In my thesis, I have investigated the neural correlates of visually simulated self-motion. More specifically, I have analyzed the processing of two key features of visual self-motion, the self-motion direction (heading) and the traveled distance (path integration), by means of electroencephalogram (EEG) measurements and transcranial magnetic stimulation (TMS). I have focused on the role of predictions about the upcoming sensory event in the processing of these self-motion features. To this end, I applied the framework of predictive coding theory, in which prediction errors, induced by the mismatch between predictions and the actual sensory input, are used to update the internal model responsible for creating the predictions. Additionally, I aimed to combine my findings with the results of previous studies on monkeys in order to further probe the role of the macaque monkey as an animal model for human sensorimotor processing.

    In my first study, I investigated the processing of different self-motion directions using a classical oddball EEG paradigm. Frequently presented self-motion stimuli in one direction were interspersed with a rarely presented different self-motion direction. The headings occurred with different probabilities, which modified the prediction about the upcoming event and allowed for the formation of an internal model. Unexpected self-motion directions created a prediction error, which I confirmed in my data by detecting a specific EEG component, the mismatch negativity (MMN). This MMN component not only reveals the influence of predictions on the processing of visually simulated self-motion directions, in accordance with predictive coding theory, but is also known to indicate preattentive processing of the analyzed feature, here the heading. Colleagues from my lab recorded monkey EEG data with identical equipment during presentation of the same stimulus in order to test for similarities between monkey and human processing of visually simulated self-motion. Remarkably, the monkey data showed an MMN component similar to the human data, which led us to suggest that the underlying processes are comparable across human and non-human primates.

    In my second study, the objective was to causally link the human functional equivalent of macaque medial superior temporal area (hMST) to the perception of self-motion direction; previous studies have shown this area to be important for the processing of self-motion. Applying TMS over right-hemisphere hMST resulted in an increase in variance when participants were asked to estimate headings to the left, i.e. in the direction contraversive to the stimulation site. The results of this study were used to test a model developed by colleagues in my lab on the basis of single-cell recordings in macaque monkeys. Simulating the influence of lateralized TMS pulses on hMST in one hemisphere, this model predicted an increase in variance for the estimation of headings contraversive to the stimulated hemisphere, which is exactly what I observed in the data of my TMS experiment. This second study therefore verified the finding of previous studies that hMST is important for the processing of self-motion directions. In addition, I showed that a model based on recordings from macaque monkeys can predict the outcome of an experiment with human participants, which indicates the similarity of the processing of visually simulated self-motion in humans and macaque monkeys.

    The third study focused on the representation of traveled distance, using EEG recordings in human participants. The goal of this study was two-fold: first, I analyzed the influence of prediction on the processing of traveled distance; second, I aimed to find a neural correlate of subjective traveled distance. Participants were asked to passively observe a forward self-motion whose onset and offset they could not predict. In a next step, participants reproduced double the distance of the previously observed self-motion. Since they actively modulated the movement to reach the desired distance, the resulting self-motion onset and offset could be predicted. Comparing the visually evoked potentials (VEPs) after self-motion onset and offset in the predicted and unpredicted self-motion, I found differences supporting predictive coding theory: amplitudes of self-motion onset VEPs were larger in the passive condition, and for self-motion offset the latencies of the VEP components were longer in the passive condition. In addition, I searched for a neural correlate of the subjective estimation of the distance presented in the passive condition. During the active reproduction of double the distance, the single distance was necessarily passed, and I assumed that half of the reproduced double distance would correspond to the subjective estimate of the single distance. When this subjective single distance was passed, an increase in alpha-band activity was detected in half of the participants. At this point in time, the prediction about the upcoming movement changed, since participants started reproducing the single distance again. In the context of predictive coding theory, such prediction changes are considered feedback processes, and previous studies have shown that these kinds of feedback processes are associated with alpha oscillations. With this study, I demonstrated the influence of prediction on self-motion onset and offset VEPs as well as on brain oscillations during a distance reproduction experiment.

    In conclusion, in this thesis I analyzed the neural correlates of the processing of self-motion directions and traveled distance. The underlying neural mechanisms seem to be very similar in humans and macaque monkeys, which supports the macaque monkey as an appropriate animal model for human sensorimotor processing. Lastly, I investigated the influence of prediction on EEG components recorded during the processing of self-motion directions and traveled distances.
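
    A minimal sketch, with hypothetical numbers rather than the thesis data, of a common way an MMN is quantified: the deviant-minus-standard ERP difference wave and its most negative deflection within a post-stimulus search window:

```python
import numpy as np

fs = 500                                   # sampling rate (Hz), illustrative
t = np.arange(-0.1, 0.5, 1.0 / fs)         # epoch from -100 ms to 500 ms

# Hypothetical trial-averaged ERPs at one fronto-central electrode (microvolts).
erp_standard = 0.5 * np.sin(2 * np.pi * 3 * t)
erp_deviant  = erp_standard - 1.5 * np.exp(-((t - 0.17) ** 2) / (2 * 0.03 ** 2))

difference_wave = erp_deviant - erp_standard
window = (t >= 0.1) & (t <= 0.25)          # typical MMN latency range
mmn_amp = difference_wave[window].min()
mmn_lat = t[window][np.argmin(difference_wave[window])]
print(f"MMN amplitude {mmn_amp:.2f} uV at {mmn_lat * 1000:.0f} ms")
```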

    Optic Flow Dominates Visual Scene Polarity in Causing Adaptive Modification of Locomotor Trajectory

    Locomotion and posture are influenced and controlled by vestibular, visual and somatosensory information. Optic flow and scene polarity are two characteristics of a visual scene that have been identified as critical in how they affect perceived body orientation and self-motion. The goal of this study was to determine the role of optic flow and visual scene polarity in adaptive modification of locomotor trajectory. Two computer-generated virtual reality scenes were shown to subjects during 20 minutes of treadmill walking. One scene was highly polarized, while the other was composed of objects displayed in a non-polarized fashion. Both virtual scenes depicted constant-rate self-motion equivalent to walking counterclockwise around the perimeter of a room. Subjects performed stepping tests blindfolded before and after scene exposure to assess adaptive changes in locomotor trajectory. Subjects showed a significant difference in heading direction between pre- and post-adaptation stepping tests after exposure to either scene during treadmill walking. However, there was no significant difference in the subjects' heading direction between the two visual scene polarity conditions. Therefore, it was inferred from these data that optic flow has a greater role than visual scene polarity in influencing adaptive locomotor function.
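
    A minimal sketch, with hypothetical numbers rather than the study data, of the pre/post comparison logic: the adaptive change is the post- minus pre-exposure stepping-test heading per subject, tested within each group and then compared between the two scene-polarity conditions:

```python
import numpy as np
from scipy import stats

# Hypothetical final heading directions (deg) from blindfolded stepping tests.
pre_polarized  = np.array([2.0, -1.0, 3.0, 0.0, 1.0, -2.0])
post_polarized = np.array([9.0, 6.0, 11.0, 7.0, 8.0, 5.0])
pre_nonpolar   = np.array([1.0, -2.0, 2.0, 0.0, -1.0, 3.0])
post_nonpolar  = np.array([8.0, 5.0, 10.0, 6.0, 7.0, 9.0])

# Within-group adaptive shift, then between-scene comparison of that shift.
shift_polarized = post_polarized - pre_polarized
shift_nonpolar  = post_nonpolar - pre_nonpolar
print("polarized scene, pre vs post:", stats.ttest_rel(post_polarized, pre_polarized))
print("polarized vs non-polarized shift:", stats.ttest_ind(shift_polarized, shift_nonpolar))
```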

    Aerospace medicine and biology: A continuing bibliography with indexes, supplement 130, July 1974

    This special bibliography lists 291 reports, articles, and other documents introduced into the NASA scientific and technical information system in June 1974.

    Exposure to a Rotating Virtual Environment During Treadmill Locomotion Causes Adaptation in Heading Direction

    The goal of the present study was to investigate the adaptive effects of variation in the direction of optic flow, experienced during linear treadmill walking, on locomotor trajectory. Subjects (n = 30) walked on a motorized linear treadmill at 4.0 kilometers per hour for 24 minutes while viewing the interior of a 3D virtual scene projected onto a screen 1.5 m in front of them. The virtual scene depicted constant self-motion equivalent to either 1) walking around the perimeter of a room to one's left (Rotating Room group) or 2) walking down the center of a hallway (Infinite Hallway group). The scene was static for the first 4 minutes, and constant-rate self-motion was then simulated for the remaining 20 minutes. Before and after the treadmill locomotion adaptation period, subjects performed five stepping trials in which they marched in place to the beat of a metronome at 90 steps/min while blindfolded in a quiet room. The subjects' final heading direction (deg), final X position (fore-aft, cm) and final Y position (medio-lateral, cm) were measured for each trial. During the treadmill locomotion adaptation period, the subjects' 3D torso position was measured. We found that subjects in the Rotating Room group, as compared to the Infinite Hallway group: 1) showed significantly greater deviation during post-exposure testing in heading direction and Y position, opposite to the direction of optic flow experienced during treadmill walking, and 2) showed a significant, monotonically increasing torso yaw rotation bias in the direction of optic flow during the treadmill adaptation exposure period. Subjects in both groups showed greater forward translation (in the +X direction) during the post-treadmill stepping task that differed significantly from their pre-exposure performance. Subjects in both groups reported no perceived deviation in position during the stepping tasks. We infer that viewing simulated rotary self-motion during treadmill locomotion causes adaptive modification of sensory-motor integration in the control of position and trajectory during locomotion, which functionally reflects adaptive changes in the integration of visual, vestibular, and proprioceptive cues. Such an adaptation in the control of position and heading direction during locomotion, driven by the congruence of sensory information, demonstrates the potential for adaptive transfer between sensorimotor systems and suggests a common neural site for the processing of self-motion perception and concurrent adaptation in motor output. This would also explain the subjects' lack of perceived deviation in position and trajectory during the post-treadmill stepping test while blindfolded.
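
    A minimal sketch, with hypothetical numbers rather than the study data, of how a monotonically increasing torso yaw bias during the adaptation period could be quantified by fitting a linear trend to yaw angle over the 20-minute exposure:

```python
import numpy as np

minutes = np.arange(0, 21)                                    # one sample per minute of exposure
# Hypothetical mean torso yaw (deg) drifting in the direction of the optic flow.
torso_yaw = 0.4 * minutes + np.random.default_rng(1).normal(0.0, 0.5, minutes.size)

slope, intercept = np.polyfit(minutes, torso_yaw, 1)
print(f"yaw drift: {slope:.2f} deg/min (about {slope * 20:.1f} deg over the exposure period)")
```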

    Human self-motion perception
