
    Target Selection by Frontal Cortex During Coordinated Saccadic and Smooth Pursuit Eye Movement

    Oculomotor tracking of moving objects is an important component of visually based cognition and planning. Such tracking is achieved by a combination of saccades and smooth pursuit eye movements. In particular, the saccadic and smooth pursuit systems interact so that they often choose the same target and maximize its visibility through time. How do multiple brain regions, including frontal cortical areas, interact to decide the choice of a target among several competing moving stimuli? How is target selection information that is created by a bias (e.g., electrical stimulation) transferred from one movement system to another? These saccade-pursuit interactions are clarified by a new computational neural model, which describes interactions among motion processing areas MT, MST, FPA, DLPN; saccade specification, selection, and planning areas LIP, FEF, SNr, SC; the saccadic generator in the brain stem; and the cerebellum. Model simulations explain a broad range of neuroanatomical and neurophysiological data. These results contrast with the simplest parallel model, which posits no interactions between saccades and pursuit other than common-target selection and recruitment of shared motoneurons. Actual tracking episodes in primates reveal multiple systematic deviations from the predictions of this simplest parallel model, which are explained by the current model. National Science Foundation (SBE-0354378); Office of Naval Research (N00014-01-1-0624).

    Visuomotor Origins of Covert Spatial Attention

    Covert spatial attention produces biases in perceptual performance and neural processing of behaviorally relevant stimuli in the absence of overt orienting movements. The neural mechanism that gives rise to these effects is poorly understood. This paper surveys past evidence of a relationship between oculomotor control and visual spatial attention and more recent evidence of a causal link between the control of saccadic eye movements by frontal cortex and covert visual selection. Both suggest that the mechanism of covert spatial attention emerges as a consequence of the reciprocal interactions between neural circuits primarily involved in specifying the visual properties of potential targets and those involved in specifying the movements needed to fixate them.

    High-field fMRI reveals brain activation patterns underlying saccade execution in the human superior colliculus

    Background: The superior colliculus (SC) has been shown to play a crucial role in the initiation and coordination of eye and head movements. The knowledge about the function of this structure is mainly based on single-unit recordings in animals, with relatively few neuroimaging studies investigating eye-movement related brain activity in humans.
    Methodology/Principal Findings: The present study employed high-field (7 Tesla) functional magnetic resonance imaging (fMRI) to investigate SC responses during endogenously cued saccades in humans. In response to centrally presented instructional cues, subjects either performed saccades away from (centrifugal) or towards (centripetal) the center of straight gaze or maintained fixation at the center position. Compared to central fixation, the execution of saccades elicited hemodynamic activity within a network of cortical and subcortical areas that included the SC, lateral geniculate nucleus (LGN), occipital cortex, striatum, and the pulvinar.
    Conclusions/Significance: Activity in the SC was enhanced contralateral to the direction of the saccade (i.e., greater activity in the right as compared to left SC during leftward saccades and vice versa) during both centrifugal and centripetal saccades, thereby demonstrating that the contralateral predominance for saccade execution that has been shown to exist in animals is also present in the human SC. In addition, centrifugal saccades elicited greater activity in the SC than did centripetal saccades, while also being accompanied by an enhanced deactivation within the prefrontal default-mode network. This pattern of brain activity might reflect the reduced processing effort required to move the eyes toward as compared to away from the center of straight gaze, a position that might serve as a spatial baseline in which the retinotopic and craniotopic reference frames are aligned.

    Electrical Microstimulation of the Monkey Dorsolateral Prefrontal Cortex Impairs Antisaccade Performance

    The dorsolateral prefrontal cortex (DLPFC) has been implicated in response suppression. This function is frequently investigated with the antisaccade task, which requires suppression of the automatic tendency to look toward a flashed peripheral stimulus (prosaccade) and generation of a voluntary saccade to the mirror location. To test the functional relationship between DLPFC activity and antisaccade performance, we applied electrical microstimulation to the DLPFC of two monkeys while they performed randomly interleaved pro- and anti-saccade trials. Microstimulation increased the number of direction errors and slowed saccadic reaction times (SRTs) on antisaccade trials when the visual stimulus was presented on the side contralateral to the stimulated hemisphere. We also observed shorter SRTs for contralateral prosaccades and longer SRTs for ipsilateral prosaccades on microstimulation trials. These findings do not support a role for the DLPFC in response suppression, but suggest a more general role in attentional selection of the contralateral field.

    Encoding of Intention and Spatial Location in the Posterior Parietal Cortex

    The posterior parietal cortex is functionally situated between sensory cortex and motor cortex. The responses of cells in this area are difficult to classify as strictly sensory or motor, since many have both sensory- and movement-related activities, as well as activities related to higher cognitive functions such as attention and intention. In this review we will provide evidence that the posterior parietal cortex is an interface between sensory and motor structures and performs various functions important for sensory-motor integration. The review will focus on two specific sensory-motor tasks: the formation of motor plans and the abstract representation of space. Cells in the lateral intraparietal area, a subdivision of the parietal cortex, have activity related to eye movements the animal intends to make. This finding represents the lowest stage in the sensory-motor cortical pathway in which activity related to intention has been found, and may represent the cortical stage in which sensory signals go "over the hump" to become intentions and plans to make movements. The second part of the review will discuss the representation of space in the posterior parietal cortex. Encoding spatial locations is an essential step in sensory-motor transformations. Since movements are made to locations in space, these locations should be coded invariant of eye and head position or of the sensory modality signaling the target for a movement. Data will be reviewed demonstrating that there exists in the posterior parietal cortex an abstract representation of space that is constructed from the integration of visual, auditory, vestibular, eye position, and proprioceptive head position signals. This representation is in the form of a population code, and the above signals are not combined in a haphazard fashion. Rather, they are brought together using a specific operation to form "planar gain fields" that are the common foundation of the population code for the neural construct of space.
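    To make the "planar gain field" idea concrete, the following minimal numerical sketch (illustrative only, not taken from the review) models each unit's response as Gaussian retinal tuning multiplied by a linear, i.e. planar, function of eye position; a population of such units then jointly carries head-centered target location. The function names, receptive-field width, and gain slopes below are assumptions chosen for the example.

    import numpy as np

    def gain_field_response(retinal_pos, eye_pos, rf_center, gain_slope,
                            gain_offset=1.0, sigma=10.0):
        """Planar gain field: Gaussian retinal tuning scaled by a linear
        (planar) function of eye position. Quantities are in degrees."""
        visual = np.exp(-np.sum((retinal_pos - rf_center) ** 2) / (2 * sigma ** 2))
        gain = gain_offset + float(gain_slope @ eye_pos)   # planar modulation by eye position
        return visual * max(gain, 0.0)                     # firing rates are non-negative

    # A small population with randomly placed receptive fields and gain planes.
    rng = np.random.default_rng(0)
    n_units = 50
    rf_centers = rng.uniform(-20, 20, size=(n_units, 2))       # retinal RF centers (deg)
    gain_slopes = rng.uniform(-0.02, 0.02, size=(n_units, 2))  # eye-position gain planes

    target_on_retina = np.array([5.0, -3.0])   # retinal location of the target
    eye_in_head = np.array([10.0, 0.0])        # current eye position

    rates = np.array([gain_field_response(target_on_retina, eye_in_head,
                                          rf_centers[i], gain_slopes[i])
                      for i in range(n_units)])
    # Because retinal and eye-position signals combine multiplicatively in every unit,
    # the population activity pattern specifies the target's head-centered location.
    print(rates[:5])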

    Overt responses during covert orienting.

    A distributed network of cortical and subcortical brain areas controls our oculomotor behavior. This network includes the superior colliculus (SC), which coordinates an ancient visual grasp reflex via outputs that ramify widely within the brainstem and spinal cord, accessing saccadic and other premotor and autonomic circuits. In this Review, we discuss recent results correlating subliminal SC activity in the absence of saccades with diverse components of the visual grasp reflex, including neck and limb muscle recruitment, pupil dilation, and microsaccade propensity. Such subtle manifestations of covert orienting are accessible in the motor periphery and may provide the next generation of oculomotor biomarkers in health and disease.

    Temporal Dynamics of Decision-Making during Motion Perception in the Visual Cortex

    How does the brain make decisions? Speed and accuracy of perceptual decisions covary with certainty in the input, and correlate with the rate of evidence accumulation in parietal and frontal cortical "decision neurons." A biophysically realistic model of interactions within and between Retina/LGN and cortical areas V1, MT, MST, and LIP, gated by basal ganglia, simulates dynamic properties of decision-making in response to the ambiguous visual motion stimuli used by Newsome, Shadlen, and colleagues in their neurophysiological experiments. The model clarifies how brain circuits that solve the aperture problem interact with a recurrent competitive network with self-normalizing choice properties to carry out probabilistic decisions in real time. Some scientists claim that perception and decision-making can be described using Bayesian inference or related general statistical ideas, which estimate the optimal interpretation of the stimulus given priors and likelihoods. However, such concepts do not propose the neocortical mechanisms that enable perception and make decisions. The present model explains behavioral and neurophysiological decision-making data without an appeal to Bayesian concepts and, unlike other existing models of these data, generates perceptual representations and choice dynamics in response to the experimental visual stimuli. Quantitative model simulations include the time course of LIP neuronal dynamics, as well as behavioral accuracy and reaction time properties, during both correct and error trials at different levels of input ambiguity in both fixed duration and reaction time tasks. Model MT/MST interactions compute the global direction of random dot motion stimuli, while model LIP computes the stochastic perceptual decision that leads to a saccadic eye movement. National Science Foundation (SBE-0354378, IIS-02-05271); Office of Naval Research (N00014-01-1-0624); National Institutes of Health (R01-DC-02852).
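    The accumulation-to-bound dynamics described above can be illustrated with a leaky competing accumulator, used here as a simplified stand-in for the paper's recurrent competitive, self-normalizing network; all parameter values are illustrative assumptions, not the model's.

    import numpy as np

    def leaky_competing_accumulator(drifts, threshold=1.0, leak=0.2, inhibition=0.3,
                                    noise_sd=0.5, dt=0.005, tau=0.1, max_t=2.0, seed=None):
        """Race between competing accumulators with leak and lateral inhibition.
        Returns (winning alternative, reaction time in seconds), or (None, max_t)
        if no accumulator reaches the decision bound in time."""
        rng = np.random.default_rng(seed)
        x = np.zeros(len(drifts))
        for step in range(1, int(max_t / dt) + 1):
            others = x.sum() - x                 # total activity of the competitors
            dx = (drifts - leak * x - inhibition * others) * (dt / tau) \
                 + noise_sd * np.sqrt(dt / tau) * rng.standard_normal(len(x))
            x = np.maximum(x + dx, 0.0)          # firing rates stay non-negative
            if x.max() >= threshold:
                return int(x.argmax()), step * dt
        return None, max_t

    # Higher motion coherence -> larger drift for the correct direction, which yields
    # faster and more accurate choices, qualitatively as in the random-dot experiments.
    choice, rt = leaky_competing_accumulator(np.array([1.2, 0.6]), seed=2)
    print(choice, rt)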

    Neural Dynamics of Saccadic and Smooth Pursuit Eye Movement Coordination during Visual Tracking of Unpredictably Moving Targets

    How does the brain use eye movements to track objects that move in unpredictable directions and speeds? Saccadic eye movements rapidly foveate peripheral visual or auditory targets, and smooth pursuit eye movements keep the fovea pointed toward an attended moving target. Analyses of tracking data in monkeys and humans reveal systematic deviations from the predictions of the simplest model of saccade-pursuit interactions, which would use no interactions other than common target selection and recruitment of shared motoneurons. Instead, saccadic and smooth pursuit movements cooperate to cancel errors of gaze position and velocity, and thus to maximize target visibility through time. How are these two systems coordinated to promote visual localization and identification of moving targets? How are saccades calibrated to correctly foveate a target despite its continued motion during the saccade? A neural model proposes answers to such questions. The modeled interactions encompass motion processing areas MT, MST, FPA, DLPN and NRTP; saccade planning and execution areas FEF and SC; the saccadic generator in the brain stem; and the cerebellum. Simulations illustrate the model’s ability to functionally explain and quantitatively simulate anatomical, neurophysiological and behavioral data about SAC-SPEM tracking. National Science Foundation (SBE-0354378); Office of Naval Research (N00014-01-1-0624).
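    As a rough illustration of the cooperation described above (a toy kinematic sketch, not the neural model itself), pursuit can be treated as a continuous reduction of gaze velocity error with gain below one, while catch-up saccades cancel the position error that consequently accumulates; all names and parameter values are invented for the example.

    def track_target(target_pos0=5.0, target_vel=12.0, duration=2.0, dt=0.001,
                     pursuit_gain=0.9, pursuit_tau=0.05, saccade_threshold=1.0):
        """Toy sketch of saccade-pursuit cooperation: pursuit drives eye velocity
        toward (gain * target velocity), so a small velocity error remains and the
        position error slowly grows; a catch-up saccade cancels the position error
        whenever it exceeds the threshold. Positions in deg, velocities in deg/s."""
        eye_pos, eye_vel = 0.0, 0.0
        saccade_times = []
        for step in range(int(duration / dt)):
            t = step * dt
            target_pos = target_pos0 + target_vel * t
            # Smooth pursuit: first-order velocity servo on retinal slip.
            eye_vel += (pursuit_gain * target_vel - eye_vel) * dt / pursuit_tau
            eye_pos += eye_vel * dt
            # Catch-up saccade: jump to the target's current (moving) position
            # once the position error is too large for pursuit alone to cancel.
            if abs(target_pos - eye_pos) > saccade_threshold:
                eye_pos = target_pos
                saccade_times.append(round(t, 3))
        return saccade_times

    print(track_target())   # times of catch-up saccades interleaved with pursuit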

    Does the Superior Colliculus Control Perceptual Sensitivity or Choice Bias during Attention? Evidence from a Multialternative Decision Framework

    Distinct networks in the forebrain and the midbrain coordinate to control spatial attention. The superior colliculus (SC) is the central structure in the midbrain network, and its critical involvement in visuospatial attention has been shown by four seminal published studies in monkeys (Macaca mulatta) performing multialternative tasks. However, due to the lack of a mechanistic framework for interpreting behavioral data in such tasks, the nature of the SC's contribution to attention remains unclear. Here we present and validate a novel decision framework for analyzing behavioral data in multialternative attention tasks. We apply this framework to re-examine the behavioral evidence from these published studies. Our model is a multidimensional extension of signal detection theory that distinguishes between two major classes of attentional mechanisms: those that alter the quality of sensory information, or "sensitivity," and those that alter the selective gating of sensory information, or "choice bias." Model-based simulations and model-based analyses of data from these published studies revealed a converging pattern of results indicating that choice-bias changes, rather than sensitivity changes, were the primary outcome of SC manipulation. Our results suggest that the SC contributes to attentional performance predominantly by generating a spatial choice bias for stimuli at a selected location, and that this bias operates downstream of forebrain mechanisms that enhance sensitivity. The findings lead to a testable mechanistic framework of how the midbrain and forebrain networks interact to control spatial attention.
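    For intuition about the sensitivity/bias distinction, here is a one-dimensional signal detection sketch (a simplification of the paper's multidimensional, multialternative framework; trial counts and parameter values are arbitrary): shifting the decision criterion changes hit and false-alarm rates together while leaving estimated d' unchanged, whereas a sensitivity change alters d' itself.

    import numpy as np
    from scipy.stats import norm

    def simulate_detection(d_prime, criterion, n_trials=10_000, seed=0):
        """One-dimensional SDT sketch: signal trials draw from N(d', 1), noise
        trials from N(0, 1); respond "yes" when the sample exceeds a criterion
        measured relative to the midpoint between the two distributions.
        Returns (hit rate, false-alarm rate)."""
        rng = np.random.default_rng(seed)
        boundary = d_prime / 2 + criterion
        hits = rng.normal(d_prime, 1.0, n_trials) > boundary
        false_alarms = rng.normal(0.0, 1.0, n_trials) > boundary
        return hits.mean(), false_alarms.mean()

    def estimate_sensitivity_and_bias(hit_rate, fa_rate):
        """Invert hit/false-alarm rates into d' (sensitivity) and c (choice bias)."""
        z_hit, z_fa = norm.ppf(hit_rate), norm.ppf(fa_rate)
        return z_hit - z_fa, -0.5 * (z_hit + z_fa)

    # Same sensitivity, two different criteria: estimated d' stays near 1.5 while the
    # bias estimate tracks the criterion -- the dissociation the framework tests for.
    for c in (0.5, 0.0):
        h, f = simulate_detection(d_prime=1.5, criterion=c)
        print(estimate_sensitivity_and_bias(h, f))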