
    Sequential Sparsening by Successive Adaptation in Neural Populations

    In the principal cells of the insect mushroom body, the Kenyon cells (KCs), olfactory information is represented by a spatially and temporally sparse code. Each odor stimulus activates only a small fraction of the neurons, and each stimulus elicits only a short phasic response following stimulus onset, irrespective of the actual duration of a constant stimulus. The mechanisms responsible for the sparse code in the KCs remain unresolved. Here, we explore the role of the neuron-intrinsic mechanism of spike-frequency adaptation (SFA) in producing temporally sparse responses to sensory stimulation in higher processing stages. Our single-neuron model is a conductance-based integrate-and-fire neuron with spike-frequency adaptation [1]. We study a fully connected feed-forward network architecture in coarse analogy to the insect olfactory pathway. A first layer of ten neurons represents the projection neurons (PNs) of the antennal lobe. All PNs receive a step-like input from the olfactory receptor neurons, realized as independent Poisson processes. The second layer represents 100 KCs, which converge onto ten neurons in the output layer representing the population of mushroom body extrinsic neurons (ENs). Our simulation results match the experimental observations. In particular, intracellular recordings of PNs show a clear phasic-tonic response that outlasts the stimulus [2], while extracellular recordings from KCs in the locust show sharp transient responses [3]. We conclude that this neuron-intrinsic mechanism can explain a progressive temporal response sparsening in the insect olfactory system. Further experimental work is needed to test this hypothesis empirically. [1] Muller et al., Neural Comput, 19(11):2958-3010, 2007. [2] Assisi et al., Nat Neurosci, 10(9):1176-1184, 2007. [3] Krofczik et al., Front Comput Neurosci, 2(9), 2009.
    Comment: 5 pages, 2 figures. This manuscript was submitted for review to the Eighteenth Annual Computational Neuroscience Meeting CNS*2009 in Berlin and accepted for oral presentation at the meeting.
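    The temporal sparsening described above can be reproduced with a much simpler model than the conductance-based one of [1]. The sketch below uses a plain leaky integrate-and-fire neuron with a spike-triggered adaptation current; all parameter values are invented for illustration and are not taken from the paper. A constant step stimulus evokes a burst at stimulus onset that the slowly accumulating adaptation current then silences.

```python
import numpy as np

def simulate_adaptive_lif(i_stim, dt=1e-4, tau_m=0.01, tau_a=0.1,
                          v_th=1.0, v_reset=0.0, delta_a=1.0):
    """Leaky integrate-and-fire neuron with a spike-triggered
    adaptation current a (illustrative parameters, not fitted)."""
    v, a = 0.0, 0.0
    spikes = []
    for k, i_t in enumerate(i_stim):
        # membrane potential driven by input minus adaptation current
        v += dt / tau_m * (-v + i_t - a)
        # adaptation variable decays slowly between spikes
        a += dt / tau_a * (-a)
        if v >= v_th:
            v = v_reset
            a += delta_a            # each spike increments adaptation
            spikes.append(k * dt)
    return np.array(spikes)

# 0.5 s step stimulus of constant amplitude
dt = 1e-4
i_stim = np.full(int(0.5 / dt), 3.0)
spikes = simulate_adaptive_lif(i_stim, dt=dt)
early = np.sum(spikes < 0.1)        # spikes in the first 100 ms
late = np.sum(spikes >= 0.4)        # spikes in the last 100 ms
```

    Comparing the spike counts in the first and last 100 ms of the step shows the onset-dominated, temporally sparse response: firing is concentrated in a short phasic burst after stimulus onset even though the stimulus itself is constant.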

    NEURON and Python

    The NEURON simulation program now allows Python to be used, alone or in combination with NEURON's traditional Hoc interpreter. Adding Python to NEURON has the immediate benefit of making available a very extensive suite of analysis tools written for engineering and science. It also catalyzes NEURON software development by offering users a modern programming tool that is recognized for its flexibility and power to create and maintain complex programs. At the same time, nothing is lost, because all existing models written in Hoc, including graphical user interface tools, continue to work without change and are also available within the Python context. An example of the benefits of Python availability is the use of the xml module in implementing NEURON's Import3D and CellBuild tools to read MorphML and NeuroML model specifications.
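    As a small illustration of the kind of task the xml module handles in that setting, the sketch below parses a toy morphology fragment with the standard-library xml.etree.ElementTree. The element and attribute names here are simplified inventions and do not follow the actual MorphML schema (which is namespaced and richer).

```python
import xml.etree.ElementTree as ET

# Toy morphology fragment, loosely modeled on MorphML; the real
# schema differs -- this fragment is only illustrative.
morphml = """
<morphology>
  <segment id="0" name="soma">
    <proximal x="0" y="0" z="0" diameter="20"/>
    <distal x="20" y="0" z="0" diameter="20"/>
  </segment>
  <segment id="1" name="dend" parent="0">
    <proximal x="20" y="0" z="0" diameter="2"/>
    <distal x="220" y="0" z="0" diameter="2"/>
  </segment>
</morphology>
"""

root = ET.fromstring(morphml)
# collect each segment's name, parent, and proximal diameter
segments = {
    seg.get("id"): {
        "name": seg.get("name"),
        "parent": seg.get("parent"),
        "diam": float(seg.find("proximal").get("diameter")),
    }
    for seg in root.findall("segment")
}
```

    From such a parsed tree, a tool can reconstruct the section hierarchy (here, the dendrite's parent is the soma) and instantiate the corresponding model sections.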

    Trends in Programming Languages for Neuroscience Simulations

    Neuroscience simulators allow scientists to express models in terms of biological concepts, without having to concern themselves with low-level computational details of their implementation. The expressiveness, power and ease-of-use of the simulator interface are critical in efficiently and accurately translating ideas into a working simulation. We review long-term trends in the development of programmable simulator interfaces, and examine the benefits of moving from proprietary, domain-specific languages to modern dynamic general-purpose languages, in particular Python, which provide neuroscientists with an interactive and expressive simulation development environment and easy access to state-of-the-art general-purpose tools for scientific computing.

    Network-timing-dependent plasticity

    Bursts of activity in networks of neurons are thought to convey salient information and drive synaptic plasticity. Here we report that network bursts also exert a profound effect on Spike-Timing-Dependent Plasticity (STDP). In acute slices of juvenile rat somatosensory cortex we paired a network burst, which alone induced long-term depression (LTD), with STDP-induced long-term potentiation (LTP) and LTD. We observed that STDP-induced LTP was either unaffected, blocked or flipped into LTD by the network burst, and that STDP-induced LTD was either saturated or flipped into LTP, depending on the relative timing of the network burst with respect to the spike coincidences of the STDP event. We hypothesized that network bursts flip STDP-induced LTP to LTD by depleting resources needed for LTP, and therefore developed a resource-dependent STDP learning rule. In a model neural network under the influence of the proposed resource-dependent STDP rule, we found that excitatory synaptic coupling was homeostatically regulated to produce power-law-distributed burst amplitudes reflecting self-organized criticality, a state that ensures optimal information coding.
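    The abstract does not spell out the proposed rule, so the sketch below is only one way a resource-dependent STDP update could look: potentiation is scaled by a finite "LTP resource" that potentiation itself consumes, so that depletion (e.g. by a network burst) flips pre-before-post pairings from LTP to LTD. The functional form and every parameter here are assumptions for illustration, not the paper's rule.

```python
import numpy as np

def resource_stdp(dt_spike, resource, a_plus=0.01, a_minus=0.012,
                  tau=0.02, cost=0.5):
    """Hypothetical resource-dependent STDP update (illustrative only).

    dt_spike : t_post - t_pre in seconds
    resource : available 'LTP resource' in [0, 1]
    Returns (weight change, updated resource).
    """
    if dt_spike > 0:
        # pre-before-post: potentiates when resources are plentiful,
        # flips to depression once they are depleted (resource < 0.5)
        dw = a_plus * (2.0 * resource - 1.0) * np.exp(-dt_spike / tau)
        if dw > 0:
            # potentiation consumes part of the resource pool
            resource = max(0.0, resource - cost * dw / a_plus)
    else:
        # post-before-pre: depression, independent of the resource
        dw = -a_minus * np.exp(dt_spike / tau)
    return dw, resource

# with full resources, a +10 ms pairing potentiates...
dw_full, r = resource_stdp(0.01, resource=1.0)
# ...but after depletion (e.g. by a burst) the same pairing depresses
dw_depleted, _ = resource_stdp(0.01, resource=0.0)
```

    Under this toy rule the sign of the timing-dependent change is gated by the resource state, qualitatively matching the reported flip of LTP into LTD after a resource-depleting burst.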

    Establishing a Novel Modeling Tool: A Python-Based Interface for a Neuromorphic Hardware System

    Neuromorphic hardware systems provide new possibilities for the neuroscience modeling community. Due to the intrinsic parallelism of the micro-electronic emulation of neural computation, such models are highly scalable without a loss of speed. However, the communities of software simulator users and of neuromorphic engineering in neuroscience are rather disjoint. We present a software concept that provides the possibility to establish such hardware devices as valuable modeling tools. It is based on the integration of the hardware interface into a simulator-independent language, which allows for unified experiment descriptions that can be run on various simulation platforms without modification, enabling experiment portability and greatly simplifying the quantitative comparison of hardware and simulator results. We introduce an accelerated neuromorphic hardware device and describe the implementation of the proposed concept for this system. An example setup and results acquired by utilizing both the hardware system and a software simulator are demonstrated.
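    The core idea — one experiment description, interchangeable execution backends — can be sketched as below. The class and method names are invented for illustration; the paper's actual interface layer is not reproduced here, and the numerical values are placeholders.

```python
class SoftwareSimulator:
    """Stand-in for a conventional numerical simulator backend."""
    name = "software"

    def run(self, n_neurons, duration):
        # placeholder result instead of an actual simulation
        return {"backend": self.name, "spikes_per_neuron": 12.0}

class HardwareEmulator:
    """Stand-in for the accelerated neuromorphic hardware backend."""
    name = "hardware"

    def run(self, n_neurons, duration):
        # placeholder result instead of an actual hardware run
        return {"backend": self.name, "spikes_per_neuron": 11.5}

def experiment(backend, n_neurons=100, duration=1.0):
    """One experiment description, executable on any backend that
    exposes the same interface."""
    return backend.run(n_neurons, duration)

# the identical description runs unchanged on both platforms,
# making the results directly comparable
results = [experiment(b) for b in (SoftwareSimulator(), HardwareEmulator())]
```

    Because the experiment function never refers to a specific platform, hardware and simulator outputs can be collected and compared with no change to the model description — the portability property the abstract argues for.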

    Adaptation Reduces Variability of the Neuronal Population Code

    Sequences of events in noise-driven excitable systems with slow variables often show serial correlations among their inter-event intervals. Here, we employ a master equation for general non-renewal processes to calculate the interval and count statistics of superimposed processes governed by a slow adaptation variable. For an ensemble of spike-frequency-adapting neurons, this results in a regularization of the population activity and enhanced post-synaptic signal decoding. We confirm our theoretical results in a population of cortical neurons.
    Comment: 4 pages, 2 figures
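    The interval correlations at the heart of this result can be illustrated with a toy model (not the master-equation treatment of the paper): in a noisy perfect integrate-and-fire neuron with slow spike-triggered adaptation, a long interval lets the adaptation variable decay further, which shortens the next interval, producing a negative lag-1 serial correlation of the interspike intervals. All parameters below are illustrative assumptions.

```python
import numpy as np

def adapting_isis(n_steps=400_000, dt=0.01, mu=1.2, tau_a=5.0,
                  delta=0.5, noise=0.3, seed=0):
    """Perfect integrate-and-fire neuron with a slow spike-triggered
    adaptation variable plus white noise (illustrative parameters)."""
    rng = np.random.default_rng(seed)
    xi = rng.standard_normal(n_steps) * noise * np.sqrt(dt)
    v, a = 0.0, 0.0
    last, isis = None, []
    for k in range(n_steps):
        v = max(v + dt * (mu - a) + xi[k], 0.0)  # reflecting barrier at 0
        a -= dt * a / tau_a                      # slow decay of adaptation
        if v >= 1.0:                             # threshold crossing = spike
            v = 0.0
            a += delta                           # spike-triggered increment
            t = k * dt
            if last is not None:
                isis.append(t - last)
            last = t
    return np.asarray(isis)

isis = adapting_isis()
d = isis - isis.mean()
rho1 = np.mean(d[:-1] * d[1:]) / np.mean(d * d)  # lag-1 serial correlation
```

    The negative sign of rho1 is the non-renewal signature induced by the slow adaptation variable; such anticorrelated intervals are what regularizes the spike count, and hence the pooled population activity, on longer time scales.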