    The reentry hypothesis: The putative interaction of the frontal eye field, ventrolateral prefrontal cortex, and areas V4, IT for attention and eye movement

    Attention is known to play a key role in perception, including action selection, object recognition and memory. Despite findings revealing competitive interactions among cell populations, attention remains difficult to explain. The central purpose of this paper is to link a large number of findings in a single computational approach. Our simulation results suggest that attention can be well explained on a network level involving many areas of the brain. We argue that attention is an emergent phenomenon that arises from reentry and competitive interactions. We hypothesize that guided visual search requires the use of an object-specific template in prefrontal cortex to sensitize V4 and IT cells whose preferred stimuli match the target template. This induces a feature-specific bias and provides guidance for eye movements. Prior to an eye movement, a spatially organized reentry from oculomotor centers, specifically the movement cells of the frontal eye field, occurs and modulates the gain of V4 and IT cells. The processes involved are elucidated by quantitatively comparing the time course of simulated neural activity with experimental data. Using visual search tasks as an example, we provide clear and empirically testable predictions for the participation of IT, V4 and the frontal eye field in attention. Finally, we explain a possible physiological mechanism that can lead to non-flat search slopes as the result of a slow, parallel discrimination process.
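    The two gain mechanisms described in this abstract (a feature-specific prefrontal bias plus a spatially organized pre-saccadic reentry signal) can be illustrated with a minimal numerical sketch. This is not the paper's actual model; the array sizes, multiplicative form, and gain values (1.5 and 2.0) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Bottom-up responses of simulated V4/IT cells: 4 display locations,
# each with cells preferring one of 3 feature values (e.g. colors).
n_locations, n_features = 4, 3
response = rng.uniform(0.2, 1.0, size=(n_locations, n_features))

# Feature-specific bias: the prefrontal target template multiplicatively
# sensitizes cells whose preferred feature matches the target
# (assumed gain of 1.5 on the target feature).
target_feature = 1
feature_gain = np.ones(n_features)
feature_gain[target_feature] = 1.5
biased = response * feature_gain          # broadcasts over locations

# Spatially organized reentry from FEF movement cells prior to the
# saccade: boost all cells at the planned saccade goal (assumed gain 2.0).
saccade_goal = int(np.argmax(biased[:, target_feature]))
spatial_gain = np.ones((n_locations, 1))
spatial_gain[saccade_goal] = 2.0
modulated = biased * spatial_gain

# The saccade goal now carries the strongest target-feature activity.
assert modulated[saccade_goal, target_feature] == modulated[:, target_feature].max()
```

    Both modulations are multiplicative here, so the feature bias selects *what* to enhance and the reentry signal selects *where*, without either overwriting the bottom-up response pattern.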

    Space representation for eye movements is more contralateral in monkeys than in humans

    Contralateral hemispheric representation of sensory inputs (the right visual hemifield in the left hemisphere and vice versa) is a fundamental feature of primate sensorimotor organization, in particular the visuomotor system. However, many higher-order cognitive functions in humans show an asymmetric hemispheric lateralization—e.g., right brain specialization for spatial processing—necessitating a convergence of information from both hemifields. Electrophysiological studies in monkeys and functional imaging in humans have investigated space and action representations at different stages of visuospatial processing, but the transition from contralateral to unified global spatial encoding and the relationship between these encoding schemes and functional lateralization are not fully understood. Moreover, the integration of data across monkeys and humans and elucidation of interspecies homologies is hindered, because divergent findings may reflect actual species differences or arise from discrepancies in techniques and measured signals (electrophysiology vs. imaging). Here, we directly compared spatial cue and memory representations for action planning in monkeys and humans using event-related functional MRI during a working-memory oculomotor task. In monkeys, cue and memory-delay period activity in the frontal, parietal, and temporal regions was strongly contralateral. In putative human functional homologs, the contralaterality was significantly weaker, and the asymmetry between the hemispheres was stronger. These results suggest an inverse relationship between contralaterality and lateralization and elucidate similarities and differences in human and macaque cortical circuits subserving spatial awareness and oculomotor goal-directed actions.
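    The degree of contralaterality reported here can be quantified with a laterality-style index of the standard (contra − ipsi)/(contra + ipsi) form. This formula is a common convention, not necessarily the exact metric used in the study, and the response values below are made up for illustration.

```python
def contralaterality_index(contra: float, ipsi: float) -> float:
    """(contra - ipsi) / (contra + ipsi):
    +1 = fully contralateral, 0 = bilateral, -1 = fully ipsilateral."""
    return (contra - ipsi) / (contra + ipsi)

# Illustrative (made-up) delay-period responses to contralateral
# vs. ipsilateral targets in a putative homolog region.
monkey = contralaterality_index(contra=10.0, ipsi=2.0)  # strongly contralateral
human = contralaterality_index(contra=6.0, ipsi=4.0)    # weaker contralaterality

assert monkey > human
print(round(monkey, 3), round(human, 3))  # → 0.667 0.2
```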

    Evidence that indirect inhibition of saccade initiation improves saccade accuracy

    Saccadic eye-movements to a visual target are less accurate if there are distracters close to its location (local distracters). The addition of more distracters, remote from the target location (remote distracters), invokes an involuntary increase in the response latency of the saccade and attenuates the effect of local distracters on accuracy. This may be due to the target and distracters directly competing (direct route) or to the remote distracters acting to impair the ability to disengage from fixation (indirect route). To distinguish between these we examined the development of saccade competition by recording saccade latency and accuracy responses made to a target and local distracter compared with those made with an addition of a remote distracter. The direct route would predict that the remote distracter impacts on the developing competition between target and local distracter, while the indirect route would predict no change as the accuracy benefit here derives from accessing the same competitive process but at a later stage. We found that the presence of the remote distracter did not change the pattern of accuracy improvement. This suggests that the remote distracter was acting along an indirect route that inhibits disengagement from fixation, slows saccade initiation, and enables more accurate saccades to be made.
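    The indirect-route account can be caricatured with a toy rise-to-threshold model in which a remote distracter raises fixation activity, delaying disengagement and giving target/local-distracter competition more time to resolve before the saccade launches. All numbers below are illustrative assumptions, not fitted parameters from the study.

```python
def saccade_latency_and_accuracy(remote_distracter: bool):
    """Toy model: fixation activity must decay below a disengagement
    threshold before the saccade fires; meanwhile the saccade landing
    point drifts from the target/distracter midpoint toward the target."""
    # Assumed: a remote distracter boosts fixation activity (indirect route).
    fixation = 1.5 if remote_distracter else 1.0
    decay, disengage_threshold = 0.0625, 0.25
    t = 0
    while fixation > disengage_threshold:
        fixation -= decay
        t += 1  # latency in arbitrary time steps
    # Accuracy grows with resolution time, saturating at 1.0 (assumed rate).
    accuracy = min(1.0, 0.5 + 0.02 * t)
    return t, accuracy

lat_no, acc_no = saccade_latency_and_accuracy(remote_distracter=False)
lat_rd, acc_rd = saccade_latency_and_accuracy(remote_distracter=True)

# Remote distracter: longer latency but more accurate saccade.
assert lat_rd > lat_no and acc_rd > acc_no
```

    The point of the sketch is that nothing about the target/local-distracter competition itself changes; the remote distracter only delays when that competition is read out, which is exactly the indirect-route prediction the abstract describes.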

    The role of the ventrolateral frontal cortex in inhibitory oculomotor control

    It has been proposed that the inferior/ventrolateral frontal cortex plays a critical role in the inhibitory control of action during cognitive tasks. However, the contribution of this region to the control of eye movements has not been clearly established. Here, we describe the performance of a group of 23 frontal lobe damaged patients in an oculomotor rule switching task, for which the association between a centrally presented visual cue and the direction of a saccade could change from trial to trial. A subset of 16 patients also completed the standard antisaccade task. Ventrolateral damage was found to be a significant predictor of errors in both tasks. Analysis of the rate at which patients corrected errors in the rule switching task also revealed an important dissociation between left and right hemisphere damaged patients. Whilst patients with left ventrolateral damage usually corrected response errors with secondary saccades, those with right hemisphere lesions often failed to do so. The results suggest that the inferior frontal cortex forms part of a wider frontal network mediating inhibitory control over stimulus elicited eye movements. The critical role played by the right ventrolateral region in cognitive tasks may arise due to an additional functional specialization for the monitoring and updating of task rules.

    Target Selection by Frontal Cortex During Coordinated Saccadic and Smooth Pursuit Eye Movement

    Oculomotor tracking of moving objects is an important component of visually based cognition and planning. Such tracking is achieved by a combination of saccades and smooth pursuit eye movements. In particular, the saccadic and smooth pursuit systems often interact to choose the same target and to maximize its visibility through time. How do multiple brain regions, including frontal cortical areas, interact to decide the choice of a target among several competing moving stimuli? How is target selection information that is created by a bias (e.g., electrical stimulation) transferred from one movement system to another? These saccade-pursuit interactions are clarified by a new computational neural model, which describes interactions among motion processing areas MT, MST, FPA, DLPN; saccade specification, selection, and planning areas LIP, FEF, SNr, SC; the saccadic generator in the brain stem; and the cerebellum. Model simulations explain a broad range of neuroanatomical and neurophysiological data. These results contrast with the simplest parallel model, in which the only interactions between saccades and pursuit are common-target selection and recruitment of shared motoneurons. Actual tracking episodes in primates reveal multiple systematic deviations from predictions of the simplest parallel model, which are explained by the current model. National Science Foundation (SBE-0354378); Office of Naval Research (N00014-01-1-0624).

    Separate representations of target and timing cue locations in the supplementary eye fields

    When different stimuli indicate where and when to make an eye movement, the brain areas involved in oculomotor control must selectively plan an eye movement to the stimulus that encodes the target position and also encode the information available from the timing cue. This could pose a challenge to the oculomotor system, since the representation of the timing stimulus location in one brain area might be interpreted by downstream neurons as a competing motor plan. Evidence from diverse sources has suggested that the supplementary eye fields (SEF) play an important role in behavioral timing, so we recorded single-unit activity from SEF to characterize how target and timing cues are encoded in this region. Two monkeys performed a variant of the memory-guided saccade task, in which a timing stimulus was presented at a randomly chosen eccentric location. Many spatially tuned SEF neurons encoded only the location of the target and not the timing stimulus, whereas several other SEF neurons encoded the location of the timing stimulus and not the target. The SEF population therefore encoded the location of each stimulus with largely distinct neuronal subpopulations. For comparison, we recorded a small population of lateral intraparietal (LIP) neurons in the same task. We found that most LIP neurons that encoded the location of the target also encoded the location of the timing stimulus after its presentation, but selectively encoded the intended eye movement plan in advance of saccade initiation. These results suggest that SEF, by conditionally encoding the location of instructional stimuli depending on their meaning, can help identify which movement plan represented in other oculomotor structures, such as LIP, should be selected for the next eye movement.

    Neural Dynamics of Saccadic and Smooth Pursuit Eye Movement Coordination during Visual Tracking of Unpredictably Moving Targets

    How does the brain use eye movements to track objects that move in unpredictable directions and speeds? Saccadic eye movements rapidly foveate peripheral visual or auditory targets and smooth pursuit eye movements keep the fovea pointed toward an attended moving target. Analyses of tracking data in monkeys and humans reveal systematic deviations from predictions of the simplest model of saccade-pursuit interactions, which would use no interactions other than common target selection and recruitment of shared motoneurons. Instead, saccadic and smooth pursuit movements cooperate to cancel errors of gaze position and velocity, and thus to maximize target visibility through time. How are these two systems coordinated to promote visual localization and identification of moving targets? How are saccades calibrated to correctly foveate a target despite its continued motion during the saccade? A neural model proposes answers to such questions. The modeled interactions encompass motion processing areas MT, MST, FPA, DLPN and NRTP; saccade planning and execution areas FEF and SC; the saccadic generator in the brain stem; and the cerebellum. Simulations illustrate the model’s ability to functionally explain and quantitatively simulate anatomical, neurophysiological and behavioral data about SAC-SPEM tracking. National Science Foundation (SBE-0354378); Office of Naval Research (N00014-01-1-0624).
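    The division of labor described here (pursuit cancels velocity error, saccades cancel residual position error) can be sketched as a toy feedback loop. The gains, time step, saccade threshold, and constant-velocity target are illustrative assumptions, not the model's actual dynamics.

```python
# Toy gaze controller: the target moves at constant velocity; pursuit
# reduces velocity error each step; a catch-up saccade fires whenever
# the position error exceeds a threshold.
dt = 0.01                # 10 ms time step (assumed)
target_vel = 10.0        # target speed, deg/s (assumed)
saccade_threshold = 0.5  # position error triggering a saccade, deg (assumed)

target_pos, gaze_pos, gaze_vel = 0.0, -2.0, 0.0  # gaze starts 2 deg behind
saccades = 0
for _ in range(300):     # simulate 3 s
    target_pos += target_vel * dt
    gaze_vel += 0.5 * (target_vel - gaze_vel)  # pursuit: cancel velocity error
    gaze_pos += gaze_vel * dt
    if abs(target_pos - gaze_pos) > saccade_threshold:
        gaze_pos = target_pos                  # saccade: cancel position error
        saccades += 1

assert saccades >= 1                                     # catch-up saccade fired
assert abs(target_pos - gaze_pos) <= saccade_threshold   # gaze stays on target
```

    In a purely parallel scheme neither system would see the other's residual error; here the saccade's contribution shrinks as pursuit gain improves, which is the kind of cooperative error cancellation the abstract attributes to SAC-SPEM interactions.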

    Probabilistic modeling of eye movement data during conjunction search via feature-based attention

    Where the eyes fixate during search is not random; rather, gaze reflects the combination of information about the target and the visual input. It is not clear, however, what information about a target is used to bias the underlying neuronal responses. Here we engage subjects in a variety of simple conjunction search tasks while tracking their eye movements. We derive a generative model that reproduces these eye movements and calculate the conditional probabilities that observers fixate, given the target, on or near an item in the display sharing a specific feature with the target. We use these probabilities to infer which features were biased by top-down attention: color seems to be the dominant stimulus dimension for guiding search, followed by object size, and lastly orientation. We use the number of fixations it took to find the target as a measure of task difficulty. We find that only a model that biases multiple feature dimensions in a hierarchical manner can account for the data. Contrary to common assumptions, memory plays almost no role in search performance. Our model can be fit to average data of multiple subjects or to individual subjects. Small variations of a few key parameters account well for the intersubject differences. The model is compatible with neurophysiological findings of V4 and frontal eye field (FEF) neurons and predicts the gain modulation of these cells.
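    The conditional probabilities at the heart of this approach, P(fixated item shares feature f with the target), can be estimated from fixation data by simple counting. The display items and fixation sequence below are made up for illustration; the actual study fit a full generative model rather than raw counts.

```python
from collections import Counter

# Each display item has a color and a size; the target is red and big.
items = [
    {"color": "red",   "size": "small"},
    {"color": "green", "size": "big"},
    {"color": "red",   "size": "big"},    # the target itself
    {"color": "green", "size": "small"},
]
target = {"color": "red", "size": "big"}

# Indices of the items fixated across a (made-up) sequence of fixations.
fixated = [0, 2, 0, 2, 1, 0, 2, 2]

# Count, per feature dimension, how often the fixated item matched the target.
counts = Counter()
for idx in fixated:
    for feature in ("color", "size"):
        if items[idx][feature] == target[feature]:
            counts[feature] += 1
probs = {f: counts[f] / len(fixated) for f in ("color", "size")}

# In this toy data set, color guides fixations more strongly than size,
# mirroring the feature ranking reported in the abstract.
assert probs["color"] > probs["size"]
print(probs)  # → {'color': 0.875, 'size': 0.625}
```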