86 research outputs found

    Alignment to natural and imposed mismatches between the senses

    Does the nervous system continuously realign the senses so that objects are seen and felt in the same place? Conflicting answers to this question have been given. Research imposing a sensory mismatch has provided evidence that the nervous system realigns the senses to reduce the mismatch. Other studies have shown that when subjects point with the unseen hand to visual targets, their end points show visual-proprioceptive biases that do not disappear after episodes of visual feedback. These biases are indicative of intersensory mismatches that the nervous system does not align for. Here, we directly compare how the nervous system deals with natural and imposed mismatches. Subjects moved a hand-held cube to virtual cubes appearing at pseudorandom locations in three-dimensional space. We alternated blocks in which subjects moved without visual feedback of the hand with feedback blocks in which we rendered a cube representing the hand-held cube. In feedback blocks, we rotated the visual feedback by 5° relative to the subject's head, creating an imposed mismatch between vision and proprioception on top of any natural mismatches. Realignment occurred quickly but was incomplete. We found more realignment to imposed mismatches than to natural mismatches. We propose that this difference is related to the way in which the visual information changed when subjects entered the experiment: the imposed mismatches were different from the mismatch in daily life, so alignment started from scratch, whereas the natural mismatches were not imposed by the experimenter, so subjects are likely to have entered the experiment already partly aligned to them. © 2013 The American Physiological Society
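    The pattern reported above (fast but incomplete realignment, and more of it for imposed than for natural mismatches) can be illustrated with a simple trial-by-trial sketch. This is an illustrative assumption rather than the authors' analysis: a single alignment state is pulled toward the mismatch by a learning rate and decays with a retention factor, and a natural mismatch is assumed to be partly aligned before the experiment begins. All parameter values are invented.

        # Minimal sketch (assumed model, illustrative parameters): trial-by-trial
        # realignment toward a visuo-proprioceptive mismatch, with learning and
        # retention (forgetting). An imposed mismatch is learned from scratch;
        # a natural mismatch may start out partly aligned.
        def simulate_realignment(mismatch_deg, n_trials, learning_rate=0.2,
                                 retention=0.95, initial_alignment=0.0):
            """Return the alignment state (deg) after each feedback trial."""
            state = initial_alignment
            history = []
            for _ in range(n_trials):
                error = mismatch_deg - state      # residual mismatch on this trial
                state = retention * state + learning_rate * error
                history.append(state)
            return history

        imposed_start, natural_start = 0.0, 1.0   # natural mismatch assumed partly aligned
        imposed = simulate_realignment(5.0, 30, initial_alignment=imposed_start)
        natural = simulate_realignment(2.0, 30, initial_alignment=natural_start)
        print(f"change during session, imposed mismatch: {imposed[-1] - imposed_start:.2f} deg")
        print(f"change during session, natural mismatch: {natural[-1] - natural_start:.2f} deg")

    With these illustrative settings the imposed mismatch produces a much larger change within the session, and neither mismatch is fully removed, consistent with the qualitative pattern described in the abstract.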

    Stress analysis of LOFT penetrations 1A, 2A, 3F, 5A-F, 7A, 9A, 17A-B, 20A-C, 21-A

    A stress analysis has been completed for the LOFT piping nozzles penetrating the containment vessel in accordance with the 1965 edition of Section III of the ASME Boiler and Pressure Vessel Code. LOFT Specification S-1 requires that the 1965 edition, including addenda through the Summer 1966 issue, be used. Stresses in the containment wall and in the nozzles result from mechanical and thermal loads on the piping that penetrates the nozzles. The mechanical loads were compiled in LTR 1217-7, and the temperature gradients were provided by the Thermal Analysis Branch. This analysis indicates that the nozzles and the containment wall are adequate to sustain the given mechanical and thermal loads. It is therefore recommended that paragraph S1-04, Section M, of LOFT Specification S-1 be revised to list the nozzle loads presented in Table 3, page A-3a.

    The sources of variability in saccadic eye movements

    Our movements are variable, but the origin of this variability is poorly understood. We examined the sources of variability in human saccadic eye movements. In two experiments, we measured the spatiotemporal variability in saccade trajectories as a function of movement direction and amplitude. One of our new observations is that the variability in movement direction is smaller for purely horizontal and vertical saccades than for saccades in oblique directions. We also found that saccade amplitude, duration, and peak velocity are all correlated with one another. To determine the origin of the observed variability, we estimated the noise in the motor commands from the observed spatiotemporal variability, while taking into account the variability resulting from uncertainty in localizing the target. This analysis revealed that uncertainty in target localization is the major source of variability in saccade endpoints, whereas noise in the magnitude of the motor commands explains a slightly smaller fraction. In addition, there is temporal variability such that saccades with a longer than average duration have a smaller than average peak velocity. This noise model generalizes well: it correctly predicts the variability in other data sets, which contain saccades starting from very different initial locations. Because the temporal noise most likely originates in movement planning, and the motor command noise in movement execution, we conclude that uncertainty in sensory signals and noise in movement planning and execution all contribute to the variability in saccade trajectories. These results are important for understanding how the brain controls movement.
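    The variance decomposition described above can be illustrated with a rough simulation. The sketch below is an assumption with made-up numbers, not the authors' analysis: saccade endpoints are generated from target-localization noise plus signal-dependent noise on the motor command, and each source's approximate share of the endpoint variance is reported.

        # Minimal sketch (assumed parameters): decompose 1-D saccade endpoint
        # variance into target-localization noise and motor-command noise.
        import numpy as np

        rng = np.random.default_rng(0)

        amplitude = 10.0        # intended saccade amplitude (deg)
        sigma_target = 0.7      # SD of target-localization noise (deg), assumed
        motor_noise_cv = 0.06   # motor noise SD as a fraction of the command, assumed
        n_saccades = 10_000

        perceived_target = amplitude + rng.normal(0.0, sigma_target, n_saccades)
        motor_noise = rng.normal(0.0, motor_noise_cv * np.abs(perceived_target))
        endpoints = perceived_target + motor_noise

        var_total = endpoints.var()
        var_localization = sigma_target ** 2
        var_motor = (motor_noise_cv * amplitude) ** 2   # approximate motor contribution

        print(f"total endpoint variance:       {var_total:.3f} deg^2")
        print(f"localization share (approx.):  {var_localization / var_total:.2f}")
        print(f"motor-command share (approx.): {var_motor / var_total:.2f}")

    With these assumed values the localization term accounts for the larger share and the motor term for a slightly smaller one, mirroring the qualitative conclusion of the abstract.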

    Integration of visual and proprioceptive position-information


    Visuomotor adaptation: How forgetting keeps us conservative

    Even when provided with feedback after every movement, adaptation levels off before biases are completely removed. Incomplete adaptation has recently been attributed to forgetting: the adaptation is already partially forgotten by the time the next movement is made. Here we test whether this idea is correct. If so, the final level of adaptation is determined by a balance between learning and forgetting. Because we learn from perceived errors, scaling these errors by a magnification factor has the same effect as subjects increasing the amount by which they learn from each error. In contrast, there is no reason to expect scaling the errors to affect forgetting. The magnification factor should therefore influence the balance between learning and forgetting, and thereby the final level of adaptation. We found that adaptation was indeed more complete for larger magnification factors. This supports the idea that incomplete adaptation is caused by part of what has been learnt being quickly forgotten.
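    The learning/forgetting balance described above can be written down as a generic trial-by-trial state-space model. The sketch below uses such a model with illustrative parameter values (not the exact model or values from the study) to show why magnifying the perceived error should raise the asymptotic level of adaptation.

        # Minimal sketch (assumed model, illustrative parameters): asymptote of a
        # state-space adaptation model with retention A, learning rate B, and an
        # error-magnification factor m, x[t+1] = A*x[t] + B*m*(bias - x[t]).
        def adaptation_asymptote(bias, retention=0.97, learning_rate=0.1, magnification=1.0):
            """Asymptotic adaptation x* where learning and forgetting balance."""
            A, B, m = retention, learning_rate, magnification
            return B * m * bias / (1.0 - A + B * m)

        bias = 10.0   # perturbation to be compensated (arbitrary units)
        for m in (1.0, 2.0, 3.0):
            x_star = adaptation_asymptote(bias, magnification=m)
            print(f"magnification {m:.0f}x -> asymptote {x_star:.2f} "
                  f"({100 * x_star / bias:.0f}% of complete adaptation)")

    Because forgetting (1 - A) is unaffected by the magnification while the effective learning term B*m grows with it, the asymptote moves closer to complete adaptation as m increases, which is the pattern the abstract reports.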

    Allocentric and egocentric contribution to manual interception by moving actors

    Previous studies suggest that the brain combines egocentric and allocentric cues to estimate the location of objects in the world. It remains unclear how the brain would combine these cues to immediately act upon objects in dynamic environments. For example, intercepting a moving object while we are moving requires us to predict the object's future location by compensating for our own displacement. In this situation, using allocentric information could improve this estimate as long as it carries reliable cues about the object's location. To test this hypothesis, we designed an interception task in virtual reality. While being moved on a vestibular motion platform, participants had to intercept a virtual ball (target) moving in 3D with a virtual paddle controlled via a linear guide as soon as they received an auditory cue (response signal). The target was presented in isolation ("target only") or surrounded by two other balls (landmarks) moving along a similar trajectory. The target disappeared 250 ms before the landmarks, which were removed at the response signal. We manipulated the landmarks' reliability by varying the spatial variance of their trajectory. Both with and without self-motion, increasing the landmarks' variability increased reaching error and variability relative to the "target only" condition, whereas the presence of "noiseless" landmarks reduced reaching error and variability. Our results show that, while performing an interception task, the brain integrates allocentric information with egocentric information to predict the object's position, even at the cost of a noisier estimate. These results may be accounted for by a Bayesian model that combines predictions about the target location based on its last observation with the actual observation of the landmarks' dynamics. Meeting abstract presented at VSS 2016.
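    As a rough illustration of the kind of cue combination invoked here, the sketch below fuses an egocentric prediction of the interception point with an allocentric, landmark-based estimate using generic reliability (precision) weighting. This is an assumed scheme with invented numbers, not the specific Bayesian model fitted in the study; it only shows that noisier landmarks contribute less and leave the combined estimate noisier than noiseless landmarks do.

        # Minimal sketch (assumed scheme, illustrative numbers): precision-weighted
        # fusion of two Gaussian position estimates, egocentric and allocentric.
        def combine_cues(mu_ego, var_ego, mu_allo, var_allo):
            """Return the precision-weighted mean and variance of the fused estimate."""
            w_ego = (1.0 / var_ego) / (1.0 / var_ego + 1.0 / var_allo)
            mu = w_ego * mu_ego + (1.0 - w_ego) * mu_allo
            var = 1.0 / (1.0 / var_ego + 1.0 / var_allo)
            return mu, var

        # Egocentric prediction of the interception point (cm), with uncertainty
        # that is assumed to grow because the observer is also being moved.
        mu_ego, var_ego = 42.0, 9.0

        for var_allo, label in [(1.0, "noiseless landmarks"), (25.0, "noisy landmarks")]:
            mu, var = combine_cues(mu_ego, var_ego, mu_allo=45.0, var_allo=var_allo)
            print(f"{label}: combined estimate {mu:.1f} cm, variance {var:.2f} cm^2")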