
    KInNeSS: A Modular Framework for Computational Neuroscience

    Making use of very detailed neurophysiological, anatomical, and behavioral data to build biologically realistic computational models of animal behavior is often a difficult task. Until recently, many software packages have tried to resolve this mismatched granularity with different approaches. This paper presents KInNeSS, the KDE Integrated NeuroSimulation Software environment, as an alternative solution to bridge the gap between data and model behavior. This open source neural simulation software package provides an expandable framework incorporating features such as ease of use, scalability, an XML based schema, and multiple levels of granularity within a modern object oriented programming design. KInNeSS is best suited to simulate networks of hundreds to thousands of branched multi-compartmental neurons with biophysical properties such as membrane potential, voltage-gated and ligand-gated channels, the presence of gap junctions or ionic diffusion, neuromodulation channel gating, the mechanism for habituative or depressive synapses, axonal delays, and synaptic plasticity. KInNeSS outputs include compartment membrane voltage, spikes, local field potentials, and current source densities, as well as visualization of the behavior of a simulated agent. An explanation of the modeling philosophy and plug-in development is also presented. Further development of KInNeSS is ongoing, with the ultimate goal of creating a modular framework that will help researchers across different disciplines to effectively collaborate using a modern neural simulation platform.
    Center of Excellence for Learning in Education, Science, and Technology (SBE-0354378); Air Force Office of Scientific Research (F49620-01-1-0397); Office of Naval Research (N00014-01-1-0624)

    Simulation of networks of spiking neurons: A review of tools and strategies

    We review different aspects of the simulation of spiking neural networks. We start by reviewing the different types of simulation strategies and algorithms that are currently implemented. We next review the precision of those simulation strategies, in particular in cases where plasticity depends on the exact timing of the spikes. We overview different simulators and simulation environments presently available (restricted to those freely available, open source, and documented). For each simulation tool, its advantages and pitfalls are reviewed, with an aim to allow the reader to identify which simulator is appropriate for a given task. Finally, we provide a series of benchmark simulations of different types of networks of spiking neurons, including Hodgkin-Huxley type and integrate-and-fire models, interacting with current-based or conductance-based synapses, using clock-driven or event-driven integration strategies. The same set of models is implemented on the different simulators, and the codes are made available. The ultimate goal of this review is to provide a resource to facilitate identifying the appropriate integration strategy and simulation tool to use for a given modeling problem related to spiking neural networks.
    Comment: 49 pages, 24 figures, 1 table; review article, Journal of Computational Neuroscience, in press (2007).
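The clock-driven integration strategy benchmarked in such reviews can be sketched compactly: advance every neuron by a fixed time step and test for threshold crossings. Below is an illustrative fixed-step leaky integrate-and-fire simulation in Python; it is not one of the review's benchmark codes, and all parameter values (time constant, thresholds, drive range) are invented for illustration.

```python
import numpy as np

def simulate_lif_clock_driven(n=100, t_max=0.5, dt=1e-4, seed=0):
    """Clock-driven (fixed-step) simulation of n unconnected LIF neurons
    driven by constant suprathreshold currents. Parameters are illustrative,
    not taken from any published benchmark."""
    rng = np.random.default_rng(seed)
    tau, v_rest, v_thresh, v_reset = 0.02, -0.070, -0.050, -0.060  # s, V
    i_drive = rng.uniform(0.021, 0.025, n)  # effective drive, in volts
    v = np.full(n, v_rest)
    spikes = []                              # (time, neuron index) pairs
    for step in range(int(t_max / dt)):
        # Forward-Euler update of dv/dt = (v_rest - v + I) / tau
        v += dt / tau * (v_rest - v + i_drive)
        fired = v >= v_thresh
        for idx in np.flatnonzero(fired):
            spikes.append((step * dt, int(idx)))
        v[fired] = v_reset                   # hard reset after a spike
    return spikes

spikes = simulate_lif_clock_driven()
```

The fixed step makes the cost proportional to simulated time regardless of activity, which is the trade-off against event-driven schemes discussed in the abstract above.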

    Introducing numerical bounds to improve event-based neural network simulation

    Although the spike trains in neural networks are mainly constrained by the neural dynamics itself, global temporal constraints (refractoriness, time precision, propagation delays, etc.) must also be taken into account. These constraints are revisited in this paper in order to use them in event-based simulation paradigms. We first review these constraints and discuss their consequences at the simulation level, showing how event-based simulation of time-constrained networks can be simplified in this context: the underlying data structures are strongly simplified, while event-based and clock-based mechanisms can easily be mixed. These ideas are applied to the simulation of punctual conductance-based generalized integrate-and-fire neural networks, while spike-response model simulations are also revisited within this framework. As an outcome, a fast, minimal alternative complementary to existing event-based simulation methods, capable of simulating a range of interesting neuron models, is implemented and evaluated.
    Comment: submitted.
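The role of a temporal constraint like refractoriness in an event-based scheme can be illustrated with a single perfect integrate-and-fire unit driven by Poisson input: between events nothing is computed, and the refractory period bounds how closely output events can follow each other. This is a toy sketch under invented parameters, not the paper's algorithm; in particular, discarding inputs that arrive during refractoriness is one modeling choice among several.

```python
import heapq
import random

def event_driven_pif(rate_hz=500.0, w=0.3, theta=1.0, refractory=0.002,
                     t_max=1.0, seed=1):
    """Event-driven simulation of one perfect integrate-and-fire unit.
    Inputs arriving within `refractory` of the last output spike are
    discarded (an illustrative choice). All parameters are invented."""
    rng = random.Random(seed)
    events, t = [], 0.0
    while True:                       # pre-generate Poisson input events
        t += rng.expovariate(rate_hz)
        if t >= t_max:
            break
        heapq.heappush(events, t)
    v, last_spike, out_spikes = 0.0, -float("inf"), []
    while events:
        t = heapq.heappop(events)
        if t - last_spike < refractory:
            continue                  # temporal constraint prunes the event
        v += w                        # perfect integrator: no leak between events
        if v >= theta:
            out_spikes.append(t)
            v, last_spike = 0.0, t
    return out_spikes

out_spikes = event_driven_pif()
```

Because accepted events must be at least one refractory period after the last output spike, output inter-spike intervals are bounded below by construction, which is the kind of guarantee the paper exploits to simplify the event queue.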

    State-dependent activity dynamics of hypothalamic stress effector neurons

    The stress response necessitates an immediate boost in vital physiological functions from their homeostatic operation to an elevated emergency response. However, the neural mechanisms underlying this state-dependent change remain largely unknown. Using a combination of in vivo and ex vivo electrophysiology with computational modeling, we report that corticotropin releasing hormone (CRH) neurons in the paraventricular nucleus of the hypothalamus (PVN), the effector neurons of the hormonal stress response, rapidly transition between distinct activity states through recurrent inhibition. Specifically, in vivo optrode recording shows that under non-stress conditions, CRHPVN neurons often fire with rhythmic brief bursts (RB), which, somewhat counterintuitively, constrains firing rate due to long (~2 s) interburst intervals. Stressful stimuli rapidly switch RB to continuous single spiking (SS), permitting a large increase in firing rate. A spiking network model shows that recurrent inhibition can control this activity-state switch, and more broadly the gain of spiking responses to excitatory inputs. In biological CRHPVN neurons ex vivo, the injection of whole-cell currents derived from our computational model recreates the in vivo-like switch between RB and SS, providing direct evidence that physiologically relevant network inputs enable state-dependent computation in single neurons. Together, we present a novel mechanism for state-dependent activity dynamics in CRHPVN neurons.
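The gain-control role of recurrent inhibition described above can be seen even in a toy threshold-linear rate model (far simpler than the paper's spiking network): inhibitory feedback divisively scales the steady-state response to excitatory drive. All values below are invented for illustration.

```python
def steady_rate(i_ext, w_inh, gain=1.0, iters=200):
    """Fixed point of a threshold-linear rate unit with recurrent inhibition,
    r = max(0, gain * (i_ext - w_inh * r)), found by iteration. This is a
    toy illustration of divisive gain control by recurrent inhibition, not
    the paper's CRH-neuron network model. Requires gain * w_inh < 1 for the
    iteration to converge."""
    r = 0.0
    for _ in range(iters):
        r = max(0.0, gain * (i_ext - w_inh * r))
    return r

# With inhibition (w_inh = 0.5) the same drive yields a lower rate, and the
# slope of the input-output curve (the gain) is reduced.
r_inh = steady_rate(2.0, 0.5)
r_free = steady_rate(2.0, 0.0)
```

Analytically the fixed point is r = gain * i_ext / (1 + gain * w_inh), so stronger recurrent inhibition flattens the response without shifting its threshold, consistent with inhibition controlling response gain.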

    DynaSim: a MATLAB toolbox for neural modeling and simulation

    DynaSim is an open-source MATLAB/GNU Octave toolbox for rapid prototyping of neural models and batch simulation management. It is designed to speed up and simplify the process of generating, sharing, and exploring network models of neurons with one or more compartments. Models can be specified by equations directly (similar to XPP or the Brian simulator) or by lists of predefined or custom model components. The higher-level specification supports arbitrarily complex population models and networks of interconnected populations. DynaSim also includes a large set of features that simplify exploring model dynamics over parameter spaces, running simulations in parallel using both multicore processors and high-performance computer clusters, and analyzing and plotting large numbers of simulated data sets in parallel. It also includes a graphical user interface (DynaSim GUI) that supports full functionality without requiring user programming. The software has been implemented in MATLAB to enable advanced neural modeling using MATLAB, given its popularity and a growing interest in modeling neural systems. The design of DynaSim incorporates a novel schema for model specification to facilitate future interoperability with other specifications (e.g., NeuroML, SBML), simulators (e.g., NEURON, Brian, NEST), and web-based applications (e.g., Geppetto) outside MATLAB. DynaSim is freely available at http://dynasimtoolbox.org. This tool promises to reduce barriers for investigating dynamics in large neural models, facilitate collaborative modeling, and complement other tools being developed in the neuroinformatics community.
    This material is based upon research supported by the U.S. Army Research Office under award number ARO W911NF-12-R-0012-02, the U.S. Office of Naval Research under award number ONR MURI N00014-16-1-2832, and the National Science Foundation under award number NSF DMS-1042134 (Cognitive Rhythms Collaborative: A Discovery Network).
    Sherfey, J. S., Soplata, A. E., Ardid-Ramírez, J. S., Roberts, E. A., Stanley, D. A., Pittman-Polletta, B. R., & Kopell, N. J. (2018). DynaSim: a MATLAB toolbox for neural modeling and simulation. Frontiers in Neuroinformatics, 12, 1-15. https://doi.org/10.3389/fninf.2018.00010
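The "models specified by equations directly" idea can be illustrated outside MATLAB with a minimal toy in Python: accept a model as a string of the form `dx/dt = <expr>` and integrate it numerically. This is a sketch of the specification style shared by DynaSim, XPP, and Brian, not DynaSim's actual interface; the function name and parameter handling are invented.

```python
import numpy as np

def run_equation_model(eqn, state0, params, t_max=1.0, dt=1e-3):
    """Toy equation-oriented simulator: the model is the string
    'dx/dt = <expr>', evaluated each step with the state variable and
    parameters in scope. Illustrative only; DynaSim's real schema is richer
    (populations, mechanisms, connections)."""
    lhs, rhs = [s.strip() for s in eqn.split("=", 1)]
    var = lhs[1:lhs.index("/")]          # 'dx/dt' -> state variable name 'x'
    code = compile(rhs, "<model>", "eval")
    x, trace = state0, [state0]
    for _ in range(int(t_max / dt)):
        dxdt = eval(code, {"np": np}, {var: x, **params})
        x = x + dt * dxdt                # forward Euler step
        trace.append(x)
    return np.array(trace)

# Exponential relaxation dx/dt = (x_inf - x)/tau toward x_inf = 1
trace = run_equation_model("dx/dt = (x_inf - x)/tau", 0.0,
                           {"x_inf": 1.0, "tau": 0.1})
```

Separating the model text from the integrator is what makes such specifications portable across backends, which is the interoperability goal (NeuroML, SBML, NEURON, Brian, NEST) stated in the abstract.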

    Xolotl: An Intuitive and Approachable Neuron and Network Simulator for Research and Teaching

    Conductance-based models of neurons are used extensively in computational neuroscience. Working with these models can be challenging due to their high dimensionality and large number of parameters. Here, we present a neuron and network simulator built on a novel automatic type system that binds object-oriented code written in C++ to objects in MATLAB. Our approach builds on the tradition of uniting the speed of languages like C++ with the ease-of-use and feature-set of scientific programming languages like MATLAB. Xolotl allows for the creation and manipulation of hierarchical models with components that are named and searchable, permitting intuitive high-level programmatic control over all parts of the model. The simulator's architecture allows for the interactive manipulation of any parameter in any model, and for visualizing the effects of changing that parameter immediately. Xolotl is fully featured with hundreds of ion channel models from the electrophysiological literature, and can be extended to include arbitrary conductances, synapses, and mechanisms. Several core features like bookmarking of parameters and automatic hashing of source code facilitate reproducible and auditable research. Its ease of use and rich visualization capabilities make it an attractive option in teaching environments. Finally, xolotl is written in a modular fashion, includes detailed tutorials and worked examples, and is freely available at https://github.com/sg-s/xolotl, enabling seamless integration into the workflows of other researchers.
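The "named and searchable hierarchical components" design can be sketched in a few lines of Python. Everything below (the `Component` class, the compartment and channel names, the `gbar` values) is invented for illustration and is not xolotl's actual C++/MATLAB API; it only shows why path-based search gives convenient programmatic control over any parameter.

```python
class Component:
    """Toy hierarchical model tree with named, searchable components,
    in the spirit of xolotl's design (names and API are illustrative)."""
    def __init__(self, name, **params):
        self.name, self.params, self.children = name, dict(params), []

    def add(self, child):
        self.children.append(child)
        return child

    def find(self, query):
        """Return (dotted_path, component) for every node whose path
        contains `query`."""
        hits = []
        def walk(node, prefix):
            path = f"{prefix}.{node.name}" if prefix else node.name
            if query in path:
                hits.append((path, node))
            for c in node.children:
                walk(c, path)
        walk(self, "")
        return hits

model = Component("AB")                      # a single compartment
model.add(Component("NaV", gbar=1000.0))     # illustrative channel values
model.add(Component("Kd", gbar=610.0))

# Interactive-style parameter manipulation via search:
for path, comp in model.find("NaV"):
    comp.params["gbar"] *= 0.5
```

Because every component is addressable by path, a change like halving `gbar` above needs no knowledge of where the channel sits in the hierarchy, which is the kind of high-level control the abstract describes.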

    Sample Path Analysis of Integrate-and-Fire Neurons

    Computational neuroscience is concerned with answering two intertwined questions that are based on the assumption that spatio-temporal patterns of spikes form the universal language of the nervous system. First, what function does a specific neural circuitry perform in the elaboration of a behavior? Second, how do neural circuits process behaviorally-relevant information? Non-linear system analysis has proven instrumental in understanding the coding strategies of early neural processing in various sensory modalities. Yet, at higher levels of integration, it fails to help in deciphering the response of assemblies of neurons to complex naturalistic stimuli. If neural activity can be assumed to be primarily driven by the stimulus at early stages of processing, the intrinsic activity of neural circuits interacts with their high-dimensional input to transform it in a stochastic non-linear fashion at the cortical level. As a consequence, any attempt to fully understand the brain through a system analysis approach becomes illusory. However, it is increasingly advocated that neural noise plays a constructive role in neural processing, facilitating information transmission. This prompts us to gain insight into the neural code by studying the stochasticity of neuronal activity, which is viewed as biologically relevant. Such an endeavor requires the design of guiding theoretical principles to assess the potential benefits of neural noise. In this context, meeting the requirements of biological relevance and computational tractability, while providing a stochastic description of neural activity, prescribes the adoption of the integrate-and-fire model. In this thesis, building on the path-wise description of neuronal activity, we propose to further the stochastic analysis of the integrate-and-fire model through a combination of numerical and theoretical techniques.
    To begin, we expand upon the path-wise construction of linear diffusions, which offers a natural setting to describe leaky integrate-and-fire neurons as inhomogeneous Markov chains. Based on the theoretical analysis of the first-passage problem, we then explore the interplay between the internal neuronal noise and the statistics of injected perturbations at the single-unit level, and examine its implications for neural coding. At the population level, we also develop an exact event-driven implementation of a Markov network of perfect integrate-and-fire neurons with time-delayed instantaneous interactions and arbitrary topology. We hope our approach will provide new paradigms to understand how sensory inputs perturb neural intrinsic activity and accomplish the goal of developing a new technique for identifying relevant patterns of population activity. From a perturbative perspective, our study shows how injecting frozen noise in different flavors can help characterize internal neuronal noise, which is presumably functionally relevant to information processing. From a simulation perspective, our event-driven framework is amenable to scrutinize the stochastic behavior of simple recurrent motifs as well as temporal dynamics of large-scale networks under spike-timing-dependent plasticity.
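An exact event-driven scheme for perfect integrate-and-fire neurons with delayed interactions can be sketched as follows: between events each membrane grows linearly under its constant drive, so the next threshold crossing is computed in closed form and no time-stepping error arises. The code below is a simplified illustration of this class of algorithm under invented parameters, not the thesis's implementation; stale crossing events (invalidated by intervening synaptic arrivals) are simply discarded when popped.

```python
import heapq

def simulate_pif_network(weights, delays, drive, theta=1.0, t_max=1.0):
    """Exact event-driven network of perfect integrate-and-fire neurons
    with delayed instantaneous interactions. weights[i][j] / delays[i][j]
    describe the synapse from i to j; drive[i] > 0 is a constant input.
    Illustrative sketch only."""
    n = len(drive)
    v = [0.0] * n          # membrane value at the last update time
    t_last = [0.0] * n
    q, spikes = [], []     # event queue: (time, kind, target, weight)

    def schedule_crossing(i, t):       # exact next free threshold crossing
        heapq.heappush(q, (t + (theta - v[i]) / drive[i], 0, i, 0.0))

    def advance(i, t):                 # linear growth between events (no leak)
        v[i] += drive[i] * (t - t_last[i])
        t_last[i] = t

    def fire(i, t):
        spikes.append((t, i))
        v[i] = 0.0
        for j in range(n):             # deliver delayed synaptic events
            if weights[i][j]:
                heapq.heappush(q, (t + delays[i][j], 1, j, weights[i][j]))
        schedule_crossing(i, t)

    for i in range(n):
        schedule_crossing(i, 0.0)
    while q and q[0][0] < t_max:
        t, kind, i, w = heapq.heappop(q)
        advance(i, t)
        if kind == 0:                  # predicted threshold crossing
            if v[i] >= theta - 1e-12:
                fire(i, t)
            # else: stale event, a synaptic arrival rescheduled the crossing
        else:                          # delayed synaptic arrival
            v[i] += w
            if v[i] >= theta:
                fire(i, t)
            else:
                schedule_crossing(i, t)  # crossing moved; old event goes stale
    return spikes

# Two mutually exciting neurons with 5 ms delays and constant drive.
spikes = simulate_pif_network(
    weights=[[0.0, 0.2], [0.2, 0.0]],
    delays=[[0.0, 0.005], [0.0, 0.005]],
    drive=[10.0, 8.0])
```

Because spike times are computed analytically rather than by stepping, the scheme remains exact for arbitrary topologies and delays, which is what makes it suitable for studying recurrent motifs and plasticity dynamics as described above.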