
    Towards an image of a memory trace

    Learning and memory lead to functional and structural changes in the brain, ultimately providing a basis for adaptive behavior. The honeybee is an elegant model for the study of learning and memory formation as it permits both the visualization of neural activity related to the events occurring in olfactory learning and the behavioral assessment of olfactory learning (Galizia and Menzel, 2000). The formation of odor memories in the honeybee is thought to involve the two primary processing centers of the olfactory system, the antennal lobe (AL) and the mushroom body (MB). The intrinsic neurons of the MB – the Kenyon cells (KCs), located within the lip region of the MB calyx – are the site of convergence of the neural pathways that transmit odor information from the projection neurons (PNs) of the AL and reward information from the VUMmx1 neuron (Hammer, 1997). In recent years, imaging studies performed in the honeybee AL and MB lip have indicated that pairing odor and reward induces changes in neural activity (Faber and Menzel, 2001; Faber et al., 1999), reinforcing the anatomical suggestion that KCs are likely to undergo associative plasticity during learning.

    Branching dendrites with resonant membrane: a “sum-over-trips” approach

    Dendrites form the major components of neurons. They are complex branching structures that receive and process thousands of synaptic inputs from other neurons. It is well known that dendritic morphology plays an important role in the function of dendrites. Another important contribution to the response characteristics of a single neuron comes from the intrinsic resonant properties of dendritic membrane. In this paper we combine the effects of dendritic branching and resonant membrane dynamics by generalising the “sum-over-trips” approach (Abbott et al., Biol. Cybern. 66:49–60, 1991). To illustrate how this formalism can shed light on the role of architecture and resonances in determining neuronal output we consider dual recording and reconstruction data from a rat CA1 hippocampal pyramidal cell. Specifically we explore the way in which an Ih current contributes to a voltage overshoot at the soma.

    Statistical-Mechanical Measure of Stochastic Spiking Coherence in a Population of Inhibitory Subthreshold Neurons

    By varying the noise intensity, we study stochastic spiking coherence (i.e., collective coherence between noise-induced neural spikings) in an inhibitory population of subthreshold neurons (which cannot fire spontaneously without noise). This stochastic spiking coherence may be well visualized in the raster plot of neural spikes. For a coherent case, partially-occupied "stripes" (composed of spikes and indicating collective coherence) are formed in the raster plot. This partial occupation occurs due to "stochastic spike skipping", which is well shown in the multi-peaked interspike interval histogram. The main purpose of our work is to quantitatively measure the degree of stochastic spiking coherence seen in the raster plot. We introduce a new spike-based coherence measure M_s by considering the occupation pattern and the pacing pattern of spikes in the stripes. In particular, the pacing degree between spikes is determined in a statistical-mechanical way by quantifying the average contribution of (microscopic) individual spikes to the (macroscopic) ensemble-averaged global potential. This "statistical-mechanical" measure M_s is in contrast to the conventional measures such as the "thermodynamic" order parameter (which concerns the time-averaged fluctuations of the macroscopic global potential), the "microscopic" correlation-based measure (based on the cross-correlation between the microscopic individual potentials), and the measures of precise spike timing (based on the peri-stimulus time histogram). In terms of M_s, we quantitatively characterize the stochastic spiking coherence, and find that M_s reflects very well the degree of collective spiking coherence seen in the raster plot. Hence, the "statistical-mechanical" spike-based measure M_s may be useful for quantifying the degree of stochastic spiking coherence. Comment: 16 pages, 5 figures, to appear in J. Comput. Neurosci.
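    The bookkeeping behind such a measure can be sketched on a synthetic raster: per stripe, multiply the fraction of neurons that fire (occupation) by how tightly their spikes hug the cycle peak (pacing), and average over stripes. Note the paper defines the pacing degree via the ensemble-averaged global potential; here the phase of an assumed sinusoidal global rhythm is used as a stand-in, and every number below is invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic raster: N neurons, a global rhythm of period T.  In each cycle
# a neuron fires with probability p_occ ("stochastic spike skipping"),
# with Gaussian timing jitter around the cycle peak.
N, n_cycles, T = 100, 20, 1.0
p_occ, jitter = 0.6, 0.05

M_contribs = []
for c in range(n_cycles):
    fires = rng.random(N) < p_occ
    phases = 2 * np.pi * rng.normal(0.0, jitter, fires.sum()) / T
    occupation = fires.mean()          # fraction of neurons in the stripe
    pacing = np.cos(phases).mean()     # 1 for perfect pacing, ~0 for random
    M_contribs.append(occupation * pacing)

M_s = float(np.mean(M_contribs))
print(f"sketch M_s = {M_s:.2f}  (occupation ~{p_occ}, pacing < 1 from jitter)")
```

    With these toy parameters the sketch gives M_s close to 0.6 × 0.95; increasing the jitter or lowering the occupation probability drives it toward zero, mirroring the loss of stripe structure in an incoherent raster.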

    An International Laboratory for Systems and Computational Neuroscience

    The neural basis of decision-making has been elusive and involves the coordinated activity of multiple brain structures. This NeuroView, by the International Brain Laboratory (IBL), discusses their efforts to develop a standardized mouse decision-making behavior, to make coordinated measurements of neural activity across the mouse brain, and to use theory and analyses to uncover the neural computations that support decision-making.

    Intrinsic gain modulation and adaptive neural coding

    In many cases, the computation of a neural system can be reduced to a receptive field, or a set of linear filters, and a thresholding function, or gain curve, which determines the firing probability; this is known as a linear/nonlinear model. In some forms of sensory adaptation, these linear filters and gain curve adjust very rapidly to changes in the variance of a randomly varying driving input. An apparently similar but previously unrelated issue is the observation of gain control by background noise in cortical neurons: the slope of the firing rate vs. current (f-I) curve changes with the variance of background random input. Here, we show a direct correspondence between these two observations by relating variance-dependent changes in the gain of f-I curves to characteristics of the changing empirical linear/nonlinear model obtained by sampling. In the case that the underlying system is fixed, we derive relationships connecting the change of the gain with respect to both mean and variance to the receptive fields derived from reverse correlation on a white-noise stimulus. Using two conductance-based model neurons that display distinct gain modulation properties through a simple change in parameters, we show that coding properties of both these models quantitatively satisfy the predicted relationships. Our results describe how both variance-dependent gain modulation and adaptive neural computation result from intrinsic nonlinearity. Comment: 24 pages, 4 figures, 1 supporting information
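    The "empirical linear/nonlinear model obtained by sampling" can be sketched with standard reverse correlation: drive a toy probabilistic neuron with white noise, recover its filter as the spike-triggered average (STA), and check the recovered filter against the one used to generate the data. The filter shape and sigmoidal gain curve below are invented for illustration, not taken from the paper's conductance-based models.

```python
import numpy as np

rng = np.random.default_rng(1)

# Ground-truth LN neuron: exponential filter + sigmoidal gain curve.
L, T = 20, 200_000
true_filter = np.exp(-np.arange(L) / 5.0)
true_filter /= np.linalg.norm(true_filter)

x = rng.normal(0.0, 1.0, T)                              # white-noise stimulus
drive = np.convolve(x, true_filter[::-1], mode="valid")  # filtered stimulus
p_spike = 1.0 / (1.0 + np.exp(-(drive - 1.0) / 0.3))     # gain curve
spikes = rng.random(drive.size) < p_spike

# Spike-triggered average: mean stimulus window preceding each spike.
idx = np.flatnonzero(spikes)
sta = np.mean([x[i:i + L] for i in idx], axis=0)
sta /= np.linalg.norm(sta)

print(f"filter/STA overlap = {np.dot(sta, true_filter):.3f}")
```

    For Gaussian white noise the STA points along the true filter, so the overlap is close to 1; repeating the experiment at a different stimulus variance and re-estimating both filter and gain curve is the sampling procedure the abstract's relationships are stated in terms of.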

    Consequences of converting graded to action potentials upon neural information coding and energy efficiency

    Information is encoded in neural circuits using both graded and action potentials, converting between them within single neurons and successive processing layers. This conversion is accompanied by information loss and a drop in energy efficiency. We investigate the biophysical causes of this loss of information and efficiency by comparing spiking neuron models, containing stochastic voltage-gated Na+ and K+ channels, with generator potential and graded potential models lacking voltage-gated Na+ channels. We identify three causes of information loss in the generator potential that are the by-product of action potential generation: (1) the voltage-gated Na+ channels necessary for action potential generation increase intrinsic noise and (2) introduce non-linearities, and (3) the finite duration of the action potential creates a ‘footprint’ in the generator potential that obscures incoming signals. These three processes reduce information rates by ~50% in generator potentials, to ~3 times that of spike trains. Both generator potentials and graded potentials consume almost an order of magnitude less energy per second than spike trains. Because of the lower information rates of generator potentials, they are substantially less energy efficient than graded potentials. However, both are an order of magnitude more efficient than spike trains due to the higher energy costs and low information content of spikes, emphasizing that there is a two-fold cost of converting analogue to digital: information loss and cost inflation.
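    The efficiency accounting described above can be illustrated with toy numbers: take an information rate from the Gaussian-channel bound R = B·log2(1 + SNR), divide by an energy budget, and compare the three signalling modes. Every figure below is invented purely to echo the qualitative ordering in the abstract (graded > generator ≫ spikes in bits per unit energy), not measured.

```python
import numpy as np

def efficiency(bandwidth_hz, snr, energy_per_s):
    """Return (bits/s from the Gaussian-channel bound, bits per unit energy)."""
    rate = bandwidth_hz * np.log2(1.0 + snr)
    return rate, rate / energy_per_s

# Hypothetical figures: Na+-channel noise halves the generator-potential
# rate relative to graded signalling; spikes carry ~1/3 of that at ~10x cost.
graded    = efficiency(100, 31.0, 1.0)
generator = efficiency(100, 4.66, 1.0)
spikes    = efficiency(100, 0.78, 10.0)

for name, (rate, eff) in [("graded", graded), ("generator", generator),
                          ("spike train", spikes)]:
    print(f"{name:12s} {rate:6.0f} bits/s   {eff:6.1f} bits per unit energy")
```

    The two factors compound: spikes lose on the numerator (information rate) and the denominator (energy cost) at once, which is the "two-fold cost" of the analogue-to-digital conversion.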

    Dendritic Morphology Predicts Pattern Recognition Performance in Multi-compartmental Model Neurons with and without Active Conductances

    In this paper we examine how a neuron’s dendritic morphology can affect its pattern recognition performance. We use two different algorithms to systematically explore the space of dendritic morphologies: an algorithm that generates all possible dendritic trees with 22 terminal points, and one that creates representative samples of trees with 128 terminal points. Based on these trees, we construct multi-compartmental models. To assess the performance of the resulting neuronal models, we quantify their ability to discriminate learnt and novel input patterns. We find that the dendritic morphology does have a considerable effect on pattern recognition performance and that the neuronal performance is inversely correlated with the mean depth of the dendritic tree. The results also reveal that the asymmetry index of the dendritic tree does not correlate with the performance for the full range of tree morphologies. The performance of neurons with dendritic tapering is best predicted by the mean and variance of the electrotonic distance of their synapses to the soma. All relationships found for passive neuron models also hold, even in more accentuated form, for neurons with active membranes.
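    The two topological statistics named above are cheap to compute on a tree given in nested-tuple form (a leaf is None, a branch point is a pair). The asymmetry index below is the van Pelt partition asymmetry, one common definition; the paper may normalize differently.

```python
# Tree statistics on binary dendritic topologies.
def leaves(t):
    """Number of terminal points in subtree t."""
    return 1 if t is None else leaves(t[0]) + leaves(t[1])

def mean_depth(t, d=0, acc=None):
    """Mean topological depth of the terminal points."""
    if acc is None:
        acc = []
    if t is None:
        acc.append(d)
    else:
        mean_depth(t[0], d + 1, acc)
        mean_depth(t[1], d + 1, acc)
    return sum(acc) / len(acc)

def asymmetry(t, parts=None):
    """Mean partition asymmetry |l - r| / (l + r - 2) over branch points."""
    if parts is None:
        parts = []
    if t is not None:
        l, r = leaves(t[0]), leaves(t[1])
        if l + r > 2:
            parts.append(abs(l - r) / (l + r - 2))
        asymmetry(t[0], parts)
        asymmetry(t[1], parts)
    return sum(parts) / len(parts) if parts else 0.0

chain = (None, (None, (None, None)))      # maximally asymmetric, 4 leaves
balanced = ((None, None), (None, None))   # fully symmetric, 4 leaves
print(mean_depth(chain), asymmetry(chain))        # 2.25 1.0
print(mean_depth(balanced), asymmetry(balanced))  # 2.0 0.0
```

    The chain-like tree is both deeper and maximally asymmetric, the balanced tree shallower and symmetric; the abstract's finding is that mean depth, not asymmetry, is the statistic that tracks pattern recognition performance across the full morphology space.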

    Shaping bursting by electrical coupling and noise

    Gap-junctional coupling is an important way of communication between neurons and other excitable cells. Strong electrical coupling synchronizes activity across cell ensembles. Surprisingly, in the presence of noise, synchronous oscillations generated by an electrically coupled network may differ qualitatively from the oscillations produced by the uncoupled individual cells forming the network. A prominent example of such behavior is the synchronized bursting in islets of Langerhans formed by pancreatic β-cells, which in isolation are known to exhibit irregular spiking. At the heart of this intriguing phenomenon lies denoising, a remarkable ability of electrical coupling to diminish the effects of noise acting on individual cells. In this paper, we derive quantitative estimates characterizing denoising in electrically coupled networks of conductance-based models of square wave bursting cells. Our analysis reveals the interplay of the intrinsic properties of the individual cells and network topology and their respective contributions to this important effect. In particular, we show that networks on graphs with large algebraic connectivity or small total effective resistance are better equipped for implementing denoising. As a by-product of the analysis of denoising, we analytically estimate the rate with which trajectories converge to the synchronization subspace and the stability of the latter to random perturbations. These estimates reveal the role of the network topology in synchronization. The analysis is complemented by numerical simulations of electrically coupled conductance-based networks. Taken together, these results explain the mechanisms underlying synchronization and denoising in an important class of biological models.
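    The two graph quantities named above both come from the spectrum of the coupling Laplacian, so they are easy to compute for any candidate network. The sketch below compares a 6-cell ring with a 6-cell all-to-all network under unit gap-junction conductances; the topologies are invented for illustration.

```python
import numpy as np

def laplacian(adj):
    """Graph Laplacian of a symmetric adjacency/conductance matrix."""
    return np.diag(adj.sum(axis=1)) - adj

def algebraic_connectivity(adj):
    """Second-smallest Laplacian eigenvalue (the Fiedler value)."""
    return np.sort(np.linalg.eigvalsh(laplacian(adj)))[1]

def total_effective_resistance(adj):
    """Kirchhoff index: n times the sum of reciprocal nonzero eigenvalues."""
    n = adj.shape[0]
    eig = np.sort(np.linalg.eigvalsh(laplacian(adj)))[1:]  # drop the zero mode
    return n * np.sum(1.0 / eig)

ring = np.zeros((6, 6))
for i in range(6):
    ring[i, (i + 1) % 6] = ring[(i + 1) % 6, i] = 1.0
complete = np.ones((6, 6)) - np.eye(6)

# The complete graph is better connected: larger Fiedler value, smaller
# total effective resistance -- hence, by the estimates above, better denoising.
print(algebraic_connectivity(ring), total_effective_resistance(ring))
print(algebraic_connectivity(complete), total_effective_resistance(complete))
```

    For these graphs the values are known in closed form (ring: Fiedler value 1, Kirchhoff index 17.5; complete graph: 6 and 5), which makes the sketch a convenient sanity check before applying it to a reconstructed islet topology.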