
    Neuronal activity in medial superior temporal area (MST) during memory-based smooth pursuit eye movements in monkeys

    We recently examined neuronal substrates for predictive pursuit using a memory-based smooth pursuit task that distinguishes the discharge related to memory of visual motion-direction from that related to movement preparation. We found that the supplementary eye fields (SEF) contain separate signals coding memory and assessment of visual motion-direction, the decision not-to-pursue, and preparation for pursuit. Since the medial superior temporal area (MST) is essential for visual motion processing and projects to the SEF, we examined whether MST carried similar signals. We analyzed the discharge of 108 MSTd neurons responding to visual motion stimuli. The majority (69/108 = 64%) were also modulated during smooth pursuit. However, in nearly all (104/108 = 96%) of the MSTd neurons tested, there was no significant discharge modulation during the delay periods that required memory of visual motion-direction or preparation for smooth pursuit or not-to-pursue. Only 4 of the 108 neurons (4%) exhibited significantly higher discharge rates during the delay periods; however, their responses were non-directional and not instruction specific. Representative signals in the MSTd clearly differed from those in the SEF during memory-based smooth pursuit. MSTd neurons are unlikely to provide signals for memory of visual motion-direction or preparation for smooth pursuit eye movements.

    Spatio-Temporal Interpolation Is Accomplished by Binocular Form and Motion Mechanisms

    Spatio-temporal interpolation describes the ability of the visual system to perceive shapes as whole figures (Gestalts), even if they are moving behind narrow apertures, so that only thin slices of them meet the eye at any given point in time. The interpolation process requires registration of the form slices, as well as perception of the shape's global motion, in order to reassemble the slices in the correct order. The commonly proposed mechanism is a spatio-temporal motion detector with a receptive field for which spatial distance and temporal delays are interchangeable, and which has generally been regarded as monocular. Here we separately investigate the nature of the motion and form detection involved in spatio-temporal interpolation, using dichoptic masking and interocular presentation tasks. The results clearly demonstrate that the associated mechanisms for both motion and form are binocular rather than monocular. Hence, we question the traditional view according to which spatio-temporal interpolation is achieved by monocular first-order motion-energy detectors in favour of models featuring binocular motion and form detection.
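
    The "spatio-temporal motion detector with a receptive field for which spatial distance and temporal delays are interchangeable" can be pictured as a filter oriented in the space-time (x-t) plane. The Python sketch below is only a minimal illustration of that generic idea, not the model tested in this study: the filter parameters, the preferred speed, the slit spacing, and the helper slit_viewed_bar are all assumptions chosen for the demonstration.

```python
import numpy as np

# Minimal sketch (assumed parameters): a receptive field tilted in the x-t
# plane responds to motion, trading spatial offset for temporal delay.
nx, nt = 64, 64                           # space (pixels) and time (frames)
x = np.arange(nx) - nx / 2
t = np.arange(nt) - nt / 2
X, T = np.meshgrid(x, t, indexing="ij")

v_pref = 1.0                              # preferred speed, pixels/frame (assumption)
sigma_x, sigma_t, f = 8.0, 8.0, 0.1       # envelope widths, spatial frequency (assumptions)

# The carrier depends on x - v*t, so spatial distance and temporal delay are
# interchangeable along the preferred velocity.
rf = np.exp(-(X**2 / (2 * sigma_x**2) + T**2 / (2 * sigma_t**2))) \
     * np.cos(2 * np.pi * f * (X - v_pref * T))

def slit_viewed_bar(speed, slit_period=8):
    """A bar moving at `speed`, visible only through periodic narrow slits,
    so that only thin slices of it reach the eye at any moment."""
    stim = np.zeros((nx, nt))
    for ti in range(nt):
        pos = int(nx / 2 + speed * (ti - nt / 2)) % nx
        stim[pos, ti] = 1.0
    mask = np.zeros(nx)
    mask[::slit_period] = 1.0             # visible spatial slices
    return stim * mask[:, None]

for speed in (1.0, -1.0, 0.0):
    resp = np.sum(rf * slit_viewed_bar(speed))
    print(f"bar speed {speed:+.1f} px/frame -> filter response {resp:+.2f}")
```

    In this toy example the response is largest when the slit-viewed bar moves at the filter's preferred speed; the study's question is whether the mechanisms doing this in human vision are monocular or binocular.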

    Integration of Sensory and Reward Information during Perceptual Decision-Making in Lateral Intraparietal Cortex (LIP) of the Macaque Monkey

    Single neurons in cortical area LIP are known to carry information relevant to both sensory and value-based decisions that are reported by eye movements. It is not known, however, how sensory and value information are combined in LIP when individual decisions must be based on a combination of these variables. To investigate this issue, we conducted behavioral and electrophysiological experiments in rhesus monkeys during performance of a two-alternative, forced-choice discrimination of motion direction (sensory component). Monkeys reported each decision by making an eye movement to one of two visual targets associated with the two possible directions of motion. We introduced choice biases into the monkeys' decision process (value component) by randomly interleaving balanced reward conditions (equal reward value for the two choices) with unbalanced conditions (one alternative worth twice as much as the other). The monkeys' behavior, as well as that of most LIP neurons, reflected the influence of all relevant variables: the strength of the sensory information, the value of the target in the neuron's response field, and the value of the target outside the response field. Overall, detailed analysis and computer simulation reveal that our data are consistent with a two-stage drift diffusion model proposed by Diederich and Busemeyer [1] for the effect of payoffs in the context of sensory discrimination tasks. Initial processing of payoff information strongly influences the starting point for the accumulation of sensory evidence, while exerting little if any effect on the rate of accumulation.
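
    The reported effect of payoffs, a shifted starting point with an essentially unchanged drift rate, can be sketched with a simple drift-diffusion simulation in Python. This is a hedged illustration of the two-stage idea rather than the model fitted in the paper: the parameter values (k, bias_gain, noise, bound) and the log-ratio form of the starting-point bias are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def ddm_trial(drift, start, bound=1.0, noise=1.0, dt=0.005, max_t=3.0):
    """One drift-diffusion trial starting at `start`; returns 1 for the upper
    bound (the higher-value alternative here), 0 otherwise."""
    x, t = start, 0.0
    while abs(x) < bound and t < max_t:
        x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
        t += dt
    return 1 if x >= bound else 0

def p_choose_upper(coherence, reward_ratio, n=1000, k=4.0, bias_gain=0.3):
    """Two-stage sketch (illustrative values): reward information sets a biased
    starting point; motion strength sets the drift rate."""
    drift = k * coherence                      # sensory stage: drift ~ motion strength
    start = bias_gain * np.log(reward_ratio)   # value stage: starting-point bias (assumed form)
    return np.mean([ddm_trial(drift, start) for _ in range(n)])

for ratio in (1.0, 2.0):                       # equal reward vs. upper target worth twice as much
    p = p_choose_upper(coherence=0.05, reward_ratio=ratio)
    print(f"reward ratio {ratio:.1f}:1 -> P(choose upper target) = {p:.2f}")
```

    With these made-up settings, the 2:1 reward condition raises the probability of choosing the higher-value target at a fixed motion strength, which is the qualitative behavioral signature described above.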

    Illusions of Visual Motion Elicited by Electrical Stimulation of Human MT Complex

    Human cortical area MT+ (hMT+) is known to respond to visual motion stimuli, but its causal role in the conscious experience of motion remains largely unexplored. Studies in non-human primates demonstrate that altering activity in area MT can influence motion perception judgments, but animal studies are inherently limited in assessing subjective conscious experience. In the current study, we use functional magnetic resonance imaging (fMRI), intracranial electrocorticography (ECoG), and electrical brain stimulation (EBS) in three patients implanted with intracranial electrodes to address the role of area hMT+ in conscious visual motion perception. We show that in conscious human subjects, reproducible illusory motion can be elicited by electrical stimulation of hMT+. These visual motion percepts occurred only when the site of stimulation overlapped directly with the region of the brain that showed increased fMRI and electrophysiological activity during moving compared to static visual stimuli in the same individual subjects. Electrical stimulation in neighboring regions failed to produce illusory motion. Our study provides evidence for a sufficient causal link between the hMT+ network and the human conscious experience of visual motion. It also suggests a clear spatial relationship between fMRI signal and ECoG activity in the human brain.

    Smooth Pursuit Eye Movements Improve Temporal Resolution for Color Perception

    Human observers see a single mixed color (yellow) when different colors (red and green) rapidly alternate. Accumulating evidence suggests that the critical temporal frequency beyond which chromatic fusion occurs does not simply reflect the temporal limit of peripheral encoding. However, it remains poorly understood how central processing controls the fusion frequency. Here we show that the fusion frequency can be elevated by extra-retinal signals during smooth pursuit. This eye movement can keep the image of a moving target in the fovea, but it also introduces a backward retinal sweep of the stationary background pattern. We found that the fusion frequency was higher when retinal color changes were generated by pursuit-induced background motions than when the same retinal color changes were generated by object motions during eye fixation. This temporal improvement cannot be ascribed to a general increase in contrast gain of specific neural mechanisms during pursuit, since the improvement was not observed with a pattern flickering without changing position on the retina or with a pattern moving in the direction opposite to the background motion during pursuit. Our findings indicate that chromatic fusion is controlled by a cortical mechanism that suppresses motion blur. A plausible mechanism is that eye-movement signals change the spatiotemporal trajectories along which color signals are integrated, so as to reduce the chromatic integration at the same retinal locations (i.e., along stationary retinal trajectories) that normally causes retinal blur during fixation.

    An Automated Paradigm for Drosophila Visual Psychophysics

    Background: Mutations that cause learning and memory defects in Drosophila melanogaster have been found to also compromise visual responsiveness and attention. A better understanding of attention-like defects in such Drosophila mutants therefore requires a more detailed characterization of visual responsiveness across a range of visual parameters. Methodology/Principal Findings: We designed an automated behavioral paradigm for efficiently dissecting visual responsiveness in Drosophila. Populations of flies walk through multiplexed serial choice mazes while being exposed to moving visual stimuli displayed on computer monitors, and infra-red fly counters at the end of each maze automatically score the responsiveness of a strain. To test our new design, we performed a detailed comparison between wild-type flies and a learning and memory mutant, dunce. We first confirmed that the learning mutant dunce displays increased responsiveness to a black/green moving grating compared to wild type in this new design. We then extended this result to explore responses to a wide range of psychophysical parameters for moving gratings (e.g., luminosity, contrast, spatial frequency, velocity) as well as to a different stimulus, moving dots. Finally, we combined these stimuli (gratings versus dots) in competition to investigate how dunce and wild-type flies respond to more complex and conflicting motion effects. Conclusions/Significance: We found that dunce responds more strongly than wild type to high contrast and highly structured motion. This effect was found for simple gratings, dots, and combinations of both stimuli presented in competition.
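
    The paradigm's readout is a responsiveness score computed from the infra-red counters at the maze ends. The abstract does not give the exact metric, so the index below is a hypothetical illustration: the function response_index and the made-up counts are assumptions, not the measure used in the study.

```python
def response_index(n_toward, n_away):
    """Hypothetical responsiveness score for one maze run: +1 if every fly exits
    on the side of the moving stimulus, -1 if every fly exits on the opposite
    side, 0 for an even split. (The exact metric is not given in the abstract.)"""
    total = n_toward + n_away
    if total == 0:
        return 0.0
    return (n_toward - n_away) / total

# Illustrative, made-up counts from the infra-red counters at the maze ends.
print(response_index(n_toward=68, n_away=32))   # 0.36 -> net response to the moving grating
print(response_index(n_toward=50, n_away=50))   # 0.00 -> no net response
```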

    A Motion Illusion Reveals Mechanisms of Perceptual Stabilization

    Visual illusions are valuable tools for the scientific examination of the mechanisms underlying perception. In the peripheral drift illusion, special drift patterns appear to move although they are static. During fixation, small involuntary eye movements generate retinal image slips that need to be suppressed for stable perception. Here we show that the peripheral drift illusion reveals the mechanisms of perceptual stabilization associated with these micromovements. In a series of experiments we found that illusory motion was only observed in the peripheral visual field. The strength of illusory motion varied with the degree of micromovements. However, drift patterns presented in the central (but not the peripheral) visual field modulated the strength of illusory peripheral motion. Moreover, although central drift patterns were not perceived as moving, they elicited illusory motion of neutral peripheral patterns. Central drift patterns modulated illusory peripheral motion even when micromovements remained constant. Interestingly, perceptual stabilization was only affected by static drift patterns, but not by real motion signals. Our findings suggest that perceptual instabilities caused by fixational eye movements are corrected by a mechanism that relies on visual rather than extraretinal (proprioceptive or motor) signals, and that drift patterns systematically bias this compensatory mechanism. These mechanisms may be revealed by utilizing static visual patterns that give rise to the peripheral drift illusion, but remain undetected with other patterns. Accordingly, the peripheral drift illusion is of unique value for examining processes of perceptual stabilization.

    Stimulus-Dependent Adjustment of Reward Prediction Error in the Midbrain

    Previous reports have shown that neural activity in midbrain dopamine areas is sensitive to unexpected reward delivery and omission. This activity is correlated with the reward prediction error in reinforcement learning models, i.e., the difference between the predicted reward value and the obtained reward outcome. These findings suggest that the reward prediction error signal in the brain updates reward prediction through stimulus–reward experiences. It remains unknown, however, how sensory processing of reward-predicting stimuli contributes to the computation of reward prediction error. To elucidate this issue, we examined the relation between the stimulus discriminability of reward-predicting stimuli and the reward prediction error signal in the brain using functional magnetic resonance imaging (fMRI). Before the main experiments, subjects learned an association between the orientation of a perceptually salient (high-contrast) Gabor patch and a juice reward. The subjects were then presented with lower-contrast Gabor patch stimuli to predict a reward. We calculated the correlation between fMRI signals and reward prediction error in two reinforcement learning models: a model including the modulation of reward prediction by stimulus discriminability and a model excluding this modulation. Results showed that fMRI signals in the midbrain were more highly correlated with reward prediction error in the model that included stimulus discriminability than in the model that excluded it. No regions showed a higher correlation with the model that excluded stimulus discriminability. Moreover, the difference in correlation between the two models was significant from the first session of the experiment, suggesting that reward computation in the midbrain was modulated by stimulus discriminability before a new contingency between perceptually ambiguous stimuli and reward had been learned. These results suggest that the human reward system can flexibly incorporate the level of stimulus discriminability into reward computations by modulating previously acquired reward values for a typical stimulus.
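
    The two reinforcement-learning models being compared can be written compactly: a standard prediction error, delta = r - V, versus one in which the predicted value is modulated by stimulus discriminability. The multiplicative form delta = r - d*V below, and all of the numbers in the example, are assumptions for illustration; the abstract does not specify how the modulation enters the model.

```python
def prediction_error_standard(reward, value):
    """Standard prediction error: delta = r - V."""
    return reward - value

def prediction_error_modulated(reward, value, discriminability):
    """Prediction error with the learned value scaled by stimulus
    discriminability d in [0, 1] (assumed multiplicative form): delta = r - d*V."""
    return reward - discriminability * value

# Illustrative numbers: value learned from a high-contrast Gabor (V = 1.0),
# then probed with lower-contrast Gabors that are only partly discriminable.
V, r = 1.0, 1.0
for d in (1.0, 0.6, 0.3):
    print(f"d = {d:.1f}: standard delta = {prediction_error_standard(r, V):+.2f}, "
          f"modulated delta = {prediction_error_modulated(r, V, d):+.2f}")
```

    Under this assumed form, a less discriminable (lower-contrast) stimulus yields a smaller effective prediction and hence a larger positive prediction error when the reward is delivered.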

    Incorporating Prediction in Models for Two-Dimensional Smooth Pursuit

    A predictive component can contribute to the command signal for smooth pursuit. This is readily demonstrated by the fact that low-frequency sinusoidal target motion can be tracked with zero time delay or even with a small lead. The objective of this study was to characterize the predictive contributions to pursuit tracking more precisely by developing analytical models for predictive smooth pursuit. Subjects tracked a small target moving in two dimensions. In the simplest case, the periodic target motion was composed of the sums of two sinusoidal motions (SS) along both the horizontal and the vertical axes. Motions following the same or similar paths, but having a richer spectral composition, were produced by having the target follow the same path but at a constant speed (CS), and by combining the horizontal SS velocity with the vertical CS velocity and vice versa. Several different quantitative models were evaluated. The predictive contribution to the eye tracking command signal could be modeled as a low-pass filtered target acceleration signal with a time delay. This predictive signal, when combined with retinal image velocity at the same time delay, as in classical models for the initiation of pursuit, gave a good fit to the data. The weighting of the predictive acceleration component differed across experimental conditions, being largest when target motion was simplest, following the SS velocity profiles.
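
    The model described, delayed retinal image velocity combined with a delayed, low-pass filtered target-acceleration term, can be sketched as a discrete-time simulation. The gains, time constants, delay, and first-order eye dynamics below are placeholder assumptions, not the fitted values from the study; the sketch only shows that adding a weighted predictive acceleration term reduces the steady-state tracking error for sinusoidal target motion.

```python
import numpy as np

# Discrete-time sketch of the model described above (all values are assumptions):
# command = k_ret * retinal_slip(t - delay) + k_pred * low-pass(target acceleration)(t - delay)
dt, T = 0.001, 6.0
t = np.arange(0.0, T, dt)
target_vel = 10.0 * np.sin(2 * np.pi * 0.5 * t)      # deg/s, 0.5 Hz sinusoidal target motion
target_acc = np.gradient(target_vel, dt)

def simulate(k_pred, delay=0.1, tau_lp=0.3, tau_eye=0.05, k_ret=0.9):
    """Eye velocity driven by delayed retinal slip plus a delayed, low-pass
    filtered target-acceleration term weighted by k_pred."""
    d = int(delay / dt)
    eye = np.zeros_like(t)
    acc_lp = 0.0
    for i in range(1, len(t)):
        j = max(i - d, 0)
        acc_lp += dt / tau_lp * (target_acc[j] - acc_lp)   # low-pass filtered acceleration
        slip = target_vel[j] - eye[j]                      # retinal image velocity at the delay
        command = k_ret * slip + k_pred * acc_lp           # visual feedback + predictive term
        eye[i] = eye[i - 1] + dt / tau_eye * (command - eye[i - 1])
    return eye

for k_pred in (0.0, 0.25):                                 # without vs. with the predictive term
    err = np.sqrt(np.mean((simulate(k_pred) - target_vel)[2000:] ** 2))
    print(f"k_pred = {k_pred:.2f}: steady-state RMS velocity error = {err:.2f} deg/s")
```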

    Visual Stability and the Motion Aftereffect: A Psychophysical Study Revealing Spatial Updating

    Eye movements create an ever-changing image of the world on the retina. In particular, frequent saccades call for a compensatory mechanism to transform the changing visual information into a stable percept. To this end, the brain presumably uses internal copies of motor commands. Electrophysiological recordings of visual neurons in the primate lateral intraparietal cortex, the frontal eye fields, and the superior colliculus suggest that the receptive fields (RFs) of special neurons shift towards their post-saccadic positions before the onset of a saccade. However, the perceptual consequences of these shifts remain controversial. We wanted to test in humans whether a remapping of motion adaptation occurs in visual perception.