
    A Computational Mechanism for Unified Gain and Timing Control in the Cerebellum

    Precise gain and timing control is the goal of cerebellar motor learning. Because the basic neural circuitry of the cerebellum is homogeneous throughout the cerebellar cortex, a single computational mechanism may serve simultaneous gain and timing control. Although many computational models of the cerebellum have been proposed for either gain or timing control, few have aimed to unify them. In this paper, we hypothesize that gain and timing control can be unified by learning the complete waveform of the desired movement profile, instructed by climbing fiber signals. To test this hypothesis, we applied a large-scale spiking network model of the cerebellum, originally developed to explain cerebellar timing mechanisms in Pavlovian delay eyeblink conditioning, to gain adaptation of the optokinetic response (OKR). In large-scale computer simulations, the model reproduced several features of OKR adaptation, including the learning-related change in simple spike firing of model Purkinje cells and vestibular nuclear neurons, the gain increase, and the frequency dependence of that increase. These results suggest that the cerebellum may use a single computational mechanism to control gain and timing simultaneously.
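    As a reading aid, the following minimal sketch (not the authors' spiking model; the granule-cell basis, learning rate, and target waveform are all assumptions) illustrates how a single climbing-fiber-instructed learning rule over a temporal basis can acquire both the amplitude (gain) and the temporal shape (timing) of a desired movement waveform.

# Minimal sketch: waveform learning from a granule-cell-like temporal basis,
# with a climbing-fiber error signal driving LTD/LTP-like weight changes.
import numpy as np

T = 1000                          # time steps in one trial
t = np.linspace(0.0, 1.0, T)

# Assumed granule-cell temporal basis: Gaussian bumps tiling the trial
centers = np.linspace(0.0, 1.0, 50)
G = np.exp(-((t[:, None] - centers[None, :]) ** 2) / (2 * 0.02 ** 2))  # (T, 50)

# Desired movement profile: its amplitude (gain) and shape (timing) must both
# be learned from the same instructive signal.
desired = 0.8 * np.sin(2 * np.pi * 2.0 * t)

w = np.zeros(G.shape[1])          # parallel-fiber synaptic weights
lr = 2.0                          # learning rate (assumed)

for trial in range(200):
    output = G @ w                      # Purkinje-cell-like simple spike rate
    cf_error = desired - output         # climbing-fiber instructive signal
    w += lr * (G.T @ cf_error) / T      # LTD/LTP-like weight update

# After learning, the output matches the desired waveform in both gain and timing.
print("max residual error:", np.max(np.abs(G @ w - desired)))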

    Incorporating Prediction in Models for Two-Dimensional Smooth Pursuit

    A predictive component can contribute to the command signal for smooth pursuit. This is readily demonstrated by the fact that low-frequency sinusoidal target motion can be tracked with zero time delay or even with a small lead. The objective of this study was to characterize the predictive contributions to pursuit tracking more precisely by developing analytical models for predictive smooth pursuit. Subjects tracked a small target moving in two dimensions. In the simplest case, the periodic target motion was the sum of two sinusoidal motions (SS) along both the horizontal and the vertical axes. Motions following the same or similar paths, but with a richer spectral composition, were produced by having the target follow the same path at a constant speed (CS), and by combining the horizontal SS velocity with the vertical CS velocity and vice versa. Several different quantitative models were evaluated. The predictive contribution to the eye-tracking command signal could be modeled as a low-pass filtered target acceleration signal with a time delay. This predictive signal, when combined with retinal image velocity at the same time delay, as in classical models for the initiation of pursuit, gave a good fit to the data. The weighting of the predictive acceleration component differed across experimental conditions, being largest when target motion was simplest, i.e., following the SS velocity profiles.
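    The class of model described above can be sketched as follows (a hedged illustration, not the paper's fitted model; the delay, filter time constant, gains, and target frequencies are assumed values): the pursuit command combines retinal image velocity with a low-pass filtered target acceleration signal, both taken at the same time delay.

# Sketch of a predictive pursuit model: delayed retinal slip plus a delayed,
# low-pass filtered target-acceleration (predictive) component.
import numpy as np

dt = 0.001                            # s
t = np.arange(0.0, 4.0, dt)
delay = int(0.1 / dt)                 # 100 ms processing delay (assumed)
tau = 0.05                            # low-pass time constant, s (assumed)
k_vel, k_acc = 8.0, 0.5               # gains on slip and predictive terms (assumed)

# Sum-of-sines (SS) target velocity along one axis (deg/s)
target_vel = 10 * np.sin(2 * np.pi * 0.4 * t) + 5 * np.sin(2 * np.pi * 0.9 * t)
target_acc = np.gradient(target_vel, dt)

# First-order low-pass filter of target acceleration (the predictive signal)
acc_lp = np.zeros_like(target_acc)
for i in range(1, len(t)):
    acc_lp[i] = acc_lp[i - 1] + (dt / tau) * (target_acc[i - 1] - acc_lp[i - 1])

eye_vel = np.zeros_like(t)
for i in range(delay, len(t)):
    slip = target_vel[i - delay] - eye_vel[i - delay]        # retinal image velocity
    eye_accel = k_vel * slip + k_acc * acc_lp[i - delay]     # combined pursuit command
    eye_vel[i] = eye_vel[i - 1] + dt * eye_accel

print("peak eye velocity (deg/s):", eye_vel.max())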

    Representation of Neck Velocity and Neck–Vestibular Interactions in Pursuit Neurons in the Simian Frontal Eye Fields

    The smooth pursuit system must interact with the vestibular system to maintain the accuracy of eye movements in space (i.e., gaze movements) during head movement. Normally, the head moves on the stationary trunk. Vestibular signals cannot distinguish whether the head alone or the whole body is moving; neck proprioceptive inputs provide information about head movements relative to the trunk. Previous studies have shown that the majority of pursuit neurons in the frontal eye fields (FEF) carry visual information about target velocity and vestibular information about whole-body movements, and signal eye or gaze velocity. However, it is unknown whether FEF neurons carry neck proprioceptive signals. Using passive trunk-on-head rotation, we tested neck inputs to FEF pursuit neurons in two monkeys. The majority of FEF pursuit neurons tested that had horizontal preferred directions (87%) responded to horizontal trunk-on-head rotation, and the modulation consisted predominantly of velocity components. Discharge modulation during pursuit and during trunk-on-head rotation added linearly. During passive head-on-trunk rotation, modulation to vestibular and neck inputs also added linearly in most neurons, although in half of the gaze-velocity neurons the neck responses were strongly influenced by the context of neck rotation. Our results suggest that neck inputs could contribute to representing eye- and gaze-velocity FEF signals in trunk coordinates.
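    The linear-addition finding can be illustrated with a simple additivity check (a sketch on synthetic firing-rate data; the variable names and numbers are assumptions, not the study's analysis): if signals add linearly, the modulation measured in the combined condition should be well fit by the sum of the modulations measured separately.

# Additivity check on synthetic firing-rate modulation data
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 500)
pursuit_mod = 20 * np.sin(2 * np.pi * 0.5 * t)        # modulation during pursuit alone (spikes/s)
neck_mod = 8 * np.sin(2 * np.pi * 0.5 * t + 0.4)      # modulation during trunk-on-head rotation alone
combined = pursuit_mod + neck_mod + rng.normal(0, 2, t.size)  # measured combined condition (synthetic)

predicted = pursuit_mod + neck_mod                    # linear-summation prediction
# Fit measured = a * predicted + b; a near 1 with high R^2 supports linear addition.
a, b = np.polyfit(predicted, combined, 1)
r2 = np.corrcoef(predicted, combined)[0, 1] ** 2
print(f"slope={a:.2f}, R^2={r2:.2f}")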

    Eye-Pursuit and Reafferent Head Movement Signals Carried by Pursuit Neurons in the Caudal Part of the Frontal Eye Fields during Head-Free Pursuit

    Eye and head movements are coordinated during head-free pursuit. To examine whether pursuit neurons in the frontal eye fields (FEF) carry gaze-pursuit commands that drive both eye-pursuit and head-pursuit, monkeys whose heads were free to rotate about a vertical axis were trained to pursue a juice feeder with their head and a target with their eyes. Initially, the feeder and target moved synchronously with the same visual angle. FEF neurons responding to this gaze-pursuit were tested for eye-pursuit of target motion while the feeder was stationary and for head-pursuit while the target was stationary. The majority of pursuit neurons exhibited modulation during head-pursuit, but their preferred directions during eye-pursuit and head-pursuit differed. Although peak modulation occurred during head movements, the onset of discharge usually was not aligned with head movement onset; the minority of neurons whose discharge onset was so aligned discharged after the head movement began. These results do not support the idea that the head-pursuit-related modulation reflects head-pursuit commands. Furthermore, modulation similar to that during head-pursuit was obtained by passive head rotation on a stationary trunk. Our results suggest that FEF pursuit neurons issue gaze or eye movement commands during gaze-pursuit and that the head-pursuit-related modulation primarily reflects reafferent signals resulting from head movements.

    Is Acceleration Used for Ocular Pursuit and Spatial Estimation during Prediction Motion?

    Here we examined ocular pursuit and spatial estimation in a linear prediction motion task that emphasized extrapolation of occluded, accelerating object motion. Eye position, velocity, and acceleration data from the period up to occlusion showed that participants attempted to pursue the moving object in accord with its veridical motion properties. They then attempted to maintain ocular pursuit of the randomly ordered accelerating object motion during occlusion, but this was imperfect and resulted in undershoot of eye position and velocity at the moment of object reappearance. Spatial estimation showed a general bias: participants were less likely to report the object reappearing behind, rather than ahead of, the expected position. In addition, participants' spatial estimation did not take into account the effects of object acceleration. Logistic regression indicated that, for the majority of participants, spatial estimation was best predicted by the difference between the actual object reappearance position and an extrapolation based on pre-occlusion velocity. In combination, and in light of previous work, we interpret these findings as showing that eye movements are scaled in accord with the effects of object acceleration but do not directly provide information for accurate spatial estimation in prediction motion.
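    A hedged sketch of the kind of logistic-regression analysis described above (synthetic trials and assumed variable names; not the study's data or code): the predictor is the difference between the actual reappearance position and a constant-velocity extrapolation of the pre-occlusion motion.

# Logistic regression: "ahead vs. behind" judgement predicted from the difference
# between actual reappearance position and a velocity-only extrapolation.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_trials = 200
occlusion = 0.6                                      # occlusion duration, s (assumed)
v0 = rng.uniform(5, 15, n_trials)                    # pre-occlusion velocity, deg/s
acc = rng.choice([-4.0, 0.0, 4.0], n_trials)         # object acceleration, deg/s^2

actual_pos = v0 * occlusion + 0.5 * acc * occlusion ** 2   # true reappearance position
extrap_pos = v0 * occlusion                                # velocity-only extrapolation
diff = actual_pos - extrap_pos                             # predictor used in the regression

# Synthetic responses: 1 = judged "ahead of expected", 0 = "behind";
# the intercept builds in a bias toward reporting "ahead" (assumed).
p_ahead = 1.0 / (1.0 + np.exp(-(1.5 * diff + 0.5)))
response = rng.binomial(1, p_ahead)

model = LogisticRegression().fit(diff.reshape(-1, 1), response)
print("slope:", model.coef_[0][0], "intercept:", model.intercept_[0])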

    Visuomotor Cerebellum in Human and Nonhuman Primates

    In this paper, we review the anatomical components of the visuomotor cerebellum in humans and, where possible, in non-human primates, and discuss their function in relation to the extracerebellar visuomotor regions with which they are connected. The floccular lobe, the dorsal paraflocculus, the oculomotor vermis, the uvula–nodulus, and the ansiform lobule are more or less independent components of the visuomotor cerebellum that participate in different corticocerebellar and/or brain stem olivocerebellar loops. The floccular lobe and the oculomotor vermis share different mossy fiber inputs from the brain stem; the dorsal paraflocculus and the ansiform lobule receive corticopontine mossy fibers from postrolandic visual areas and the frontal eye fields, respectively. Of the visuomotor functions of the cerebellum, the vestibulo-ocular reflex is controlled by the floccular lobe; saccadic eye movements are controlled by the oculomotor vermis and ansiform lobule; and control of smooth pursuit involves all of these cerebellar visuomotor regions. Functional imaging studies in humans, which further emphasize cerebellar involvement in visual reflexive eye movements, are also discussed.