
    State Dependence of Stimulus-Induced Variability Tuning in Macaque MT

    Behavioral states marked by varying levels of arousal and attention modulate some properties of cortical responses (e.g. average firing rates or pairwise correlations), yet it is not fully understood what drives these response changes and how they might affect downstream stimulus decoding. Here we show that changes in state modulate the tuning of response variance-to-mean ratios (Fano factors) in a fashion that is predicted neither by a Poisson spiking model nor by changes in the mean firing rate, with a substantial effect on stimulus discriminability. We recorded motion-sensitive neurons in middle temporal cortex (MT) in two states: alert fixation and light, opioid anesthesia. Anesthesia tended to lower average spike counts without decreasing trial-to-trial variability compared to the alert state. Under anesthesia, within-trial fluctuations in excitability were correlated over longer time scales than in the alert state, creating supra-Poisson Fano factors. In contrast, alert-state MT neurons have higher mean firing rates and largely sub-Poisson variability that is stimulus-dependent and cannot be explained by firing rate differences alone. The absence of such stimulus-induced variability tuning in the anesthetized state suggests different sources of variability between states. A simple model explains state-dependent shifts in the distribution of observed Fano factors via a suppression of the variance of gain fluctuations in the alert state. A population model with stimulus-induced variability tuning and behaviorally constrained information-limiting correlations explores the potential enhancement of stimulus discriminability in the alert state. (36 pages, 18 figures)
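    To make the quantities concrete, here is a minimal sketch (Python/NumPy, with made-up rates and gain parameters rather than the recorded MT data) of how Fano factors behave under a gain-modulated Poisson model of the kind invoked above:

    # Sketch: Fano factors under a gain-modulated Poisson model.
    # Hypothetical parameters; not fitted to the MT data described above.
    import numpy as np

    rng = np.random.default_rng(0)

    def simulate_counts(rate_hz, window_s, n_trials, gain_sd):
        """Spike counts from a Poisson process whose rate is scaled on each
        trial by a multiplicative gain with mean 1 and SD `gain_sd`."""
        gain = np.maximum(rng.normal(1.0, gain_sd, n_trials), 0.0)
        return rng.poisson(gain * rate_hz * window_s)

    def fano_factor(counts):
        return counts.var(ddof=1) / counts.mean()

    window = 0.5          # 500 ms counting window
    rates = [5, 20, 60]   # a few mean firing rates (Hz)

    for state, gain_sd in [("anesthetized-like (large gain noise)", 0.4),
                           ("alert-like (suppressed gain noise)", 0.1)]:
        ffs = [fano_factor(simulate_counts(r, window, 2000, gain_sd)) for r in rates]
        print(state, [f"{ff:.2f}" for ff in ffs])
    # For a gain-modulated Poisson process, FF is roughly 1 + gain_sd**2 * mean_count,
    # so shrinking the gain variance pulls Fano factors toward 1 (sub-Poisson
    # variability would need an additional mechanism, e.g. refractoriness).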

    The effect of neural adaptation on population coding accuracy

    Most neurons in the primary visual cortex initially respond vigorously when a preferred stimulus is presented, but adapt as stimulation continues. The functional consequences of adaptation are unclear. Typically, a reduction in firing rate would reduce single-neuron accuracy, as fewer spikes are available for decoding, but it has been suggested that at the population level adaptation increases coding accuracy. This question requires careful analysis, as adaptation changes not only the firing rates of neurons but also the neural variability and the correlations between neurons, which affect coding accuracy as well. We calculate the coding accuracy using a computational model that implements two forms of adaptation: spike frequency adaptation and synaptic adaptation in the form of short-term synaptic plasticity. We find that the net effect of adaptation is subtle and heterogeneous. Depending on the adaptation mechanism and the test stimulus, adaptation can either increase or decrease coding accuracy. We discuss the neurophysiological and psychophysical implications of the findings and relate them to published experimental data. (35 pages, 8 figures)
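    As a hedged illustration of the kind of accuracy calculation involved (not the authors' spiking-network model), the sketch below computes linear Fisher information for a toy population with Gaussian tuning and independent Poisson noise, before and after a pure rate-scaling "adaptation":

    # Sketch: linear Fisher information before and after a simple rate scaling.
    # Assumptions (Gaussian tuning, independent Poisson noise) are illustrative only.
    import numpy as np

    def tuning(theta, prefs, r_max, width):
        """Gaussian tuning curves (rates in spikes/s)."""
        return r_max * np.exp(-0.5 * ((theta - prefs) / width) ** 2)

    def linear_fisher_info(theta, prefs, r_max, width, window_s, eps=1e-3):
        """I(theta) = f'(theta)^T C^{-1} f'(theta), with C = diag(f) for
        independent Poisson spike counts in a window of `window_s` seconds."""
        f = tuning(theta, prefs, r_max, width) * window_s
        df = (tuning(theta + eps, prefs, r_max, width) -
              tuning(theta - eps, prefs, r_max, width)) * window_s / (2 * eps)
        return np.sum(df ** 2 / f)

    prefs = np.linspace(-60, 60, 25)           # preferred orientations (deg)
    theta_test = 10.0                          # test stimulus
    I_before = linear_fisher_info(theta_test, prefs, r_max=40.0, width=20.0, window_s=0.2)
    I_after  = linear_fisher_info(theta_test, prefs, r_max=25.0, width=20.0, window_s=0.2)
    print(f"Fisher info before adaptation: {I_before:.3f} deg^-2")
    print(f"Fisher info after rate scaling: {I_after:.3f} deg^-2")
    # With independent Poisson noise a pure rate scaling lowers information;
    # the abstract's point is that real adaptation also reshapes variability
    # and correlations, so the net effect can go either way.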

    A Neural Model of Motion Processing and Visual Navigation by Cortical Area MST

    Cells in the dorsal medial superior temporal cortex (MSTd) process optic flow generated by self-motion during visually-guided navigation. A neural model shows how interactions between well-known neural mechanisms (log-polar cortical magnification, Gaussian motion-sensitive receptive fields, spatial pooling of motion-sensitive signals, and subtractive extraretinal eye movement signals) lead to emergent properties that quantitatively simulate neurophysiological data about MSTd cell properties and psychophysical data about human navigation. Model cells match MSTd neuron responses to optic flow stimuli placed in different parts of the visual field, including position invariance, tuning curves, preferred spiral directions, direction reversals, average response curves, and preferred locations for stimulus motion centers. The model shows how the preferred motion direction of the most active MSTd cells can explain human judgments of self-motion direction (heading) without using complex heading templates. The model also explains when extraretinal eye movement signals are needed for accurate heading perception, when retinal input is sufficient, and how heading judgments depend on scene layout and rotation rate.
    Defense Advanced Research Projects Agency (N00014-92-J-4015); Office of Naval Research (N00014-92-J-1309, N00014-95-1-0409, N00014-95-1-0657, N00014-91-J-4100, N0014-94-I-0597); Air Force Office of Scientific Research (F49620-92-J-0334)
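    The listed mechanisms can be caricatured in a few lines of code. The sketch below (illustrative only; the grid, parameters, and template form are assumptions, not the published model) pools a synthetic optic-flow field with Gaussian-weighted expansion templates, applies a subtractive extraretinal eye-velocity signal, and reads out heading from the best-matching template center:

    # Toy sketch of MSTd-style heading readout from optic flow (illustrative
    # only; the published model is far more detailed). Hypothetical parameters.
    import numpy as np

    # Image-plane sample points (deg)
    xs, ys = np.meshgrid(np.linspace(-30, 30, 31), np.linspace(-30, 30, 31))
    pts = np.stack([xs.ravel(), ys.ravel()], axis=1)

    def translational_flow(foe):
        """Radial expansion flow centered on the focus of expansion `foe`."""
        return pts - np.asarray(foe)   # magnitude grows with eccentricity; direction is what matters here

    def pursuit_flow(eye_vel):
        """Roughly uniform flow added by a smooth-pursuit eye movement."""
        return np.tile(-np.asarray(eye_vel, dtype=float), (len(pts), 1))

    def mstd_response(flow, center, sigma=15.0):
        """Gaussian-weighted match between the flow direction field and an
        expansion template centered at `center` (a candidate heading)."""
        radial = pts - np.asarray(center, dtype=float)
        radial /= np.linalg.norm(radial, axis=1, keepdims=True) + 1e-9
        flow_dir = flow / (np.linalg.norm(flow, axis=1, keepdims=True) + 1e-9)
        weight = np.exp(-np.sum((pts - center) ** 2, axis=1) / (2 * sigma ** 2))
        return np.sum(weight * np.sum(flow_dir * radial, axis=1)) / np.sum(weight)

    true_heading = np.array([8.0, -4.0])
    eye_vel = np.array([5.0, 0.0])
    retinal = translational_flow(true_heading) + pursuit_flow(eye_vel)

    # Subtractive extraretinal compensation: remove the efference-copy estimate
    # of the pursuit component before template matching.
    compensated = retinal - pursuit_flow(eye_vel)

    centers = [np.array([cx, cy]) for cx in range(-20, 21, 4) for cy in range(-20, 21, 4)]
    best = max(centers, key=lambda c: mstd_response(compensated, c))
    print("estimated heading:", best)   # should land on or near (8, -4)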

    Eye velocity gain fields for visuo-motor coordinate transformations

    ’Gain-field-like’ tuning behavior is characterized by a modulation of the neuronal response by a certain variable, without a change of the actual receptive field characteristics with respect to another variable. Eye position gain fields were first observed in area 7a of the posterior parietal cortex (PPC), where visually responsive neurons are modulated by ocular position. Analysis of artificial neural networks has shown that this type of tuning function might constitute the neuronal substrate for coordinate transformations. In this work, neuronal activity in the dorsal medial superior temporal area (MSTd) has been analyzed with a focus on its involvement in oculomotor control. MSTd is part of the extrastriate visual cortex and located in the PPC. Lesion studies suggested a participation of this cortical area in the control of eye movements: inactivation of MSTd severely impairs the optokinetic response (OKR), a reflex-like eye movement that compensates for motion of the whole visual scene. Using a novel, information-theory-based approach to neuronal data analysis, we were able to identify those visual and eye-movement-related signals that were most correlated with the mean spiking activity of MSTd neurons during optokinetic stimulation. In a majority of neurons, firing rate was non-linearly related to a combination of retinal image velocity and eye velocity. The observed neuronal latency relative to these signals is in line with a system-level model of OKR in which an efference copy of the motor command signal is used to generate an internal estimate of the head-centered stimulus velocity. Tuning functions were obtained using a probabilistic approach. In most MSTd neurons these functions exhibited gain-field-like shapes, with eye velocity modulating the visual response in a multiplicative manner. Population analysis revealed a large diversity of tuning forms, including asymmetric and non-separable functions. The distribution of gain fields was almost identical to the predictions of a neural network model trained to perform the summation of image and eye velocity. These findings therefore strongly support the hypothesis that MSTd participates in the OKR control system by implementing the transformation from retinal image velocity to an estimate of stimulus velocity. In this sense, eye velocity gain fields constitute an intermediate step in transforming the eye-centered into a head-centered visual motion signal.
    Another aspect addressed in this work was the comparison of the irregularity of MSTd spiking activity during the optokinetic response with the behavior during pure visual stimulation. The goal of this study was an evaluation of potential neuronal mechanisms underlying the observed gain field behavior. We found that both inter- and intra-trial variability decreased with increasing retinal image velocity but increased with eye velocity. This observation argues against a symmetrical integration of driving and modulating inputs. Instead, we propose an architecture in which multiplicative gain modulation is achieved by a simultaneous increase of excitatory and inhibitory background synaptic input. A conductance-based single-compartment model neuron was able to reproduce realistic gain modulation and the observed stimulus dependence of neural variability at the same time.
    In summary, this work improves our knowledge of MSTd's role in visuomotor transformation by analyzing functional and mechanistic aspects of eye velocity gain fields at the systems, network, and neuronal levels.
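    For readers unfamiliar with gain-field tuning, the following minimal sketch (an assumed Gaussian-times-linear functional form, not the tuning functions estimated in this work) shows the defining property: eye velocity rescales the visual response without shifting the preferred image velocity.

    # Minimal sketch of an eye-velocity gain field (assumed functional form,
    # not the fitted tuning functions from the study).
    import numpy as np

    def visual_drive(image_vel, pref=20.0, sigma=15.0, r_max=50.0):
        """Gaussian tuning for retinal image velocity (deg/s -> spikes/s)."""
        return r_max * np.exp(-0.5 * ((image_vel - pref) / sigma) ** 2)

    def eye_velocity_gain(eye_vel, slope=0.02):
        """Monotonic multiplicative gain term; eye velocity rescales the
        visual response without shifting the preferred image velocity."""
        return 1.0 + slope * eye_vel

    def mstd_rate(image_vel, eye_vel):
        return visual_drive(image_vel) * eye_velocity_gain(eye_vel)

    for eye_vel in (0.0, 10.0, 20.0):
        rates = [mstd_rate(v, eye_vel) for v in (0.0, 20.0, 40.0)]
        print(f"eye velocity {eye_vel:>4} deg/s:",
              [f"{r:5.1f}" for r in rates], "spikes/s")
    # The peak stays at the preferred image velocity (about 20 deg/s) while
    # its height scales with eye velocity: the signature of gain-field tuning.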

    Correlated Activity and Corticothalamic Cell Function in the Early Mouse Visual System

    Vision has long been the model system for understanding cortical function. Great progress has been made in understanding the transformations that occur within some primary visual cortex (V1) layers, such as the emergence of orientation selectivity in layer 4. Less is known about other V1 circuit elements, such as the shaping of V1 input via corticothalamic projections, or the population structure of the cortico-cortical output in layer 2/3. Here, we use the mouse early visual system to investigate the structure and function of circuit elements in V1, using two approaches: comparative physiology and optogenetics. We measured the structure of pairwise correlations in the output layer 2/3 using extracellular recordings. We find that, despite mouse V1 lacking the functional organization seen in other species, the specificity of connections preserves a correlation structure on multiple timescales. To investigate the role of corticogeniculate projections, we utilize a transgenic mouse line to specifically and reversibly manipulate these projections with millisecond precision. We find that the activity of these cells results in a mix of inhibition and excitation in the thalamus, is not spatiotemporally specific, and can affect correlated activity. Finally, we classify mouse thalamic cells according to stimuli used for cell classification in primates and cats, finding some, but not complete, homology to the processing streams of the primate thalamus and further highlighting fundamentals of mammalian visual system organization.
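    A brief sketch of how correlation structure can be probed "on multiple timescales" (simulated spike trains with a shared slow drive; not the recorded data or the exact analysis used here):

    # Sketch of measuring pairwise correlations on multiple timescales by
    # re-binning spike trains before computing spike-count correlations.
    import numpy as np

    rng = np.random.default_rng(1)

    def binned_counts(spike_times, duration_s, bin_s):
        edges = np.arange(0.0, duration_s + bin_s, bin_s)
        counts, _ = np.histogram(spike_times, bins=edges)
        return counts

    def count_correlation(st_a, st_b, duration_s, bin_s):
        a = binned_counts(st_a, duration_s, bin_s)
        b = binned_counts(st_b, duration_s, bin_s)
        return np.corrcoef(a, b)[0, 1]

    # Two toy neurons sharing a slowly varying common input.
    duration = 200.0
    t = np.arange(0, duration, 0.001)
    common = 5.0 * (1 + np.sin(2 * np.pi * 0.2 * t))     # shared slow drive (Hz)

    def draw_spikes(extra_hz):
        rate = common + extra_hz
        return t[rng.random(t.size) < rate * 0.001]

    st1, st2 = draw_spikes(10.0), draw_spikes(10.0)
    for bin_s in (0.01, 0.1, 1.0):
        print(f"bin {bin_s:>5.2f} s: r_sc = {count_correlation(st1, st2, duration, bin_s):.2f}")
    # Shared slow fluctuations produce correlations that grow with bin width,
    # one way a correlation structure can differ across timescales.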

    Top-Down Control of Lateral Interactions in Visual Cortex

    V1 neurons are capable of integrating information over a large area of the visual field. Their responses to local features depend on the global characteristics of contours and surfaces that extend well beyond their receptive fields. These contextual influences in V1 are subject to the cognitive influences of attention, perceptual task, and expectation. It has previously been shown that the response properties of V1 neurons change to carry more information about behaviorally relevant stimulus features (Li et al. 2004). We hypothesized that top-down modulation of effective connectivity within V1 underlies the behaviorally dependent modulations of contextual interactions in V1. To test this idea, we used a chronically implanted multi-electrode array in awake primates and studied the mechanisms of top-down control of contextual interactions in V1. We used a behavioral paradigm in which the animals performed two different perceptual tasks on the same stimulus and studied task-dependent changes in connectivity between V1 sites that encode the stimulus. We found that V1 interactions, both spiking and LFP interactions, showed significant task-dependent changes. The direction of the task-dependent changes observed in LFP interactions, measured by coherence between LFP signals, depended on the perceptual strategy used by the animal. A bisection task involving perceptual grouping of parallel lines increased LFP coherence, whereas a vernier task involving segregation of collinear lines decreased LFP coherence. Grouping of collinear lines to detect a contour also resulted in increased LFP interactions. Since noise correlations can affect the coding accuracy of a cortical network, we investigated how the top-down processes of attention and perceptual task affect V1 noise correlations. We were able to study the noise correlation dynamics due to attentional shifts separately from the changes due to the perceptual task being performed at the attended location. Top-down influences reduced V1 noise correlations to a greater extent when the animal performed a discrimination task at the recorded locations than when the animal merely shifted its attention to that location. The reduction in noise correlation during the perceptual task was accompanied by a significant increase in the information carried about the stimulus (calculated as Fisher information). Our analysis was also able to determine the degree to which the task-dependent change in information was due to alterations in neuronal tuning versus changes in correlated activity. Interestingly, the largest effects on information were seen between stimuli that were hardest to discriminate.
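    For concreteness, the sketch below computes the two interaction measures named above, spike-count noise correlation and LFP coherence, on simulated data (SciPy's coherence routine; all signals and parameters are hypothetical):

    # Sketch of the two interaction measures mentioned above, computed on
    # hypothetical data: spike-count noise correlation and LFP coherence.
    import numpy as np
    from scipy.signal import coherence

    rng = np.random.default_rng(2)

    # Noise correlation: correlate trial-to-trial count fluctuations for a
    # fixed stimulus (here, counts that share a weak common fluctuation).
    n_trials = 400
    shared = rng.normal(0, 2, n_trials)
    counts_a = rng.poisson(20 + shared.clip(-10, 10))
    counts_b = rng.poisson(25 + shared.clip(-10, 10))
    print(f"noise correlation: {np.corrcoef(counts_a, counts_b)[0, 1]:.2f}")

    # LFP coherence: frequency-resolved correlation between two LFP channels.
    fs = 1000.0
    t = np.arange(0, 10, 1 / fs)
    shared_rhythm = np.sin(2 * np.pi * 20 * t)            # common 20 Hz component
    lfp1 = shared_rhythm + rng.normal(0, 1, t.size)
    lfp2 = 0.8 * shared_rhythm + rng.normal(0, 1, t.size)
    f, cxy = coherence(lfp1, lfp2, fs=fs, nperseg=1024)
    print(f"coherence near 20 Hz: {cxy[np.argmin(np.abs(f - 20))]:.2f}")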

    Perceptual Learning Reduces Interneuronal Correlations in Macaque Visual Cortex

    Responses of neurons in early visual cortex change little with training and appear insufficient to account for perceptual learning. Behavioral performance, however, relies on population activity, and the accuracy of a population code is constrained by correlated noise among neurons. We tested whether training changes interneuronal correlations in the dorsal medial superior temporal area, which is involved in multisensory heading perception. Pairs of single units were recorded simultaneously in two groups of subjects: animals trained extensively in a heading discrimination task, and “naive” animals that performed a passive fixation task. Correlated noise was significantly weaker in trained versus naive animals, which might be expected to improve coding efficiency. However, we show that the observed uniform reduction in noise correlations leads to little change in population coding efficiency when all neurons are decoded. Thus, global changes in correlated noise among sensory neurons may be insufficient to account for perceptual learning.
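    The population-coding argument can be illustrated with a toy linear Fisher information calculation (assumed Gaussian tuning, Poisson-like variances, and a uniform pairwise correlation coefficient; not the decoding analysis performed on the recorded pairs):

    # Sketch: how a uniform change in noise correlations maps onto linear
    # Fisher information for a heterogeneously tuned population (toy numbers).
    import numpy as np

    n = 100
    prefs = np.linspace(-90, 90, n)                # preferred headings (deg)
    s = 0.0                                        # test heading
    width, r_max, T = 40.0, 30.0, 1.0
    f = r_max * np.exp(-0.5 * ((s - prefs) / width) ** 2) * T
    df = f * (prefs - s) / width ** 2              # analytic derivative of the Gaussian tuning

    def fisher_info(rho):
        """f'^T C^{-1} f' with Poisson-like variances and uniform pairwise
        correlation `rho` across the population."""
        sd = np.sqrt(f)
        C = rho * np.outer(sd, sd)
        np.fill_diagonal(C, f)
        return df @ np.linalg.solve(C, df)

    for rho in (0.20, 0.10):
        print(f"rho = {rho:.2f}: I = {fisher_info(rho):.1f} deg^-2")
    # Halving the uniform correlation changes the full-population information
    # only modestly in this toy case, consistent with the point made above.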

    Optogenetic interrogation of primary visual cortex and its impact on neural coding and behavior

    Understanding the mechanism by which the brain transforms simple sensory inputs into rich perceptual experiences is one of the great mysteries of systems neuroscience. Undoubtedly this involves the activity of large populations of interconnected neurons, but while the responses of individual neurons to a variety of sensory stimuli have been well characterized, how populations of such neurons organize their activity to create our sensory perceptions is almost entirely unknown. Investigating this complex circuitry requires the ability to causally manipulate the activity of neural populations and monitor the resultant effects. Here we focus on primary visual cortex (V1), which has been shown to be crucial for visual perception, and utilize optogenetic tools to render the activity of genetically defined neural populations sensitive to light. By simultaneously recording and modulating (either driving or silencing) the activity of excitatory (glutamatergic) neurons, we are able to causally examine their role in visual perception. We report three major findings. First, we show that activating subpopulations of excitatory neurons can improve visual perception under certain conditions and that the information in V1 used for perceptual decisions is integrated across spatially limited populations of neurons. Further, we show that a key signature of this information integration is a reduction in correlated variability between neurons. Correlated variability has been implicated as a major source of choice-related activity in the cortex and theorized to be a major factor limiting the information in cortical populations. Until now, however, there has not been a way to manipulate correlations without altering firing rates or other task-related variables. Here we demonstrate a novel method using optogenetic stimulation to causally manipulate correlated variability between cortical neurons without altering their firing rates. Lastly, with the goal of expanding the currently limited repertoire of optogenetic tools for non-human primates, we establish the viability of a novel optogenetic construct capable of dramatically silencing neural populations using a recently discovered anion-conducting channelrhodopsin.
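    As a sketch of the underlying logic (simulated spike counts, not the optogenetic manipulation or the V1 recordings themselves), adding a zero-mean shared fluctuation changes pairwise correlations while leaving mean rates essentially unchanged:

    # Sketch of the analysis logic: a zero-mean shared fluctuation can change
    # correlated variability while leaving mean firing rates untouched.
    import numpy as np

    rng = np.random.default_rng(4)
    n_trials, n_neurons = 500, 40
    base_rates = rng.uniform(5, 30, n_neurons)    # spikes per counting window

    def simulate(shared_sd):
        """Counts driven by private Poisson noise plus a zero-mean shared
        fluctuation of standard deviation `shared_sd` (in rate units)."""
        shared = rng.normal(0.0, shared_sd, size=(n_trials, 1))
        lam = np.clip(base_rates + shared, 0.1, None)
        return rng.poisson(lam)

    def mean_rate_and_corr(counts):
        c = np.corrcoef(counts.T)
        off_diag = c[np.triu_indices(n_neurons, k=1)]
        return counts.mean(), off_diag.mean()

    for label, sd in [("control", 0.0), ("common-input-like drive", 3.0)]:
        rate, r_sc = mean_rate_and_corr(simulate(sd))
        print(f"{label:>24}: mean count {rate:5.1f}, mean r_sc {r_sc:5.2f}")
    # Mean counts stay (nearly) the same across conditions while the average
    # pairwise correlation rises with the shared, zero-mean fluctuation.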