
    Flipping Biological Switches: Solving for Optimal Control: A Dissertation

    Switches play an important regulatory role at all levels of biology, from molecular switches triggering signaling cascades to cellular switches regulating cell maturation and apoptosis. Medical therapies are often designed to toggle a system from one state to another, achieving a specified health outcome. For instance, small doses of subpathologic viruses activate the immune system’s production of antibodies, and electrical stimulation reverts cardiac arrhythmias back to normal sinus rhythm. In all of these examples, a major challenge is finding the optimal stimulus waveform necessary to cause the switch to flip. This thesis develops, validates, and applies a novel model-independent stochastic algorithm, the Extrema Distortion Algorithm (EDA), to finding the optimal stimulus. We validate the EDA’s performance on the Hodgkin-Huxley model (an empirically validated ionic model of neuronal excitability), the FitzHugh-Nagumo model (an abstract model applied to a wide range of biological systems that exhibit an oscillatory state and a quiescent state), and the genetic toggle switch (a model of bistable gene expression). We show that the EDA is able not only to find the optimal solution but also, in some cases, to outperform traditional analytic approaches. Finally, we have computed novel optimal stimulus waveforms for aborting epileptic seizures using the EDA in cellular and network models of epilepsy. This work represents a first step in developing a new class of adaptive algorithms and devices that flip biological switches, revealing basic mechanistic insights and therapeutic applications for a broad range of disorders.
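
    To make the idea of flipping an excitable system concrete, the sketch below simulates a FitzHugh-Nagumo neuron (one of the validation models named above) at rest and shows a brief current pulse driving it across threshold into a spike. It uses plain NumPy with textbook parameter values (a = 0.7, b = 0.8, eps = 0.08) and is only the forward model that a stimulus-search algorithm would query, not an implementation of the EDA itself.

```python
import numpy as np

def fitzhugh_nagumo(stimulus, dt=0.01, a=0.7, b=0.8, eps=0.08):
    """Integrate the FitzHugh-Nagumo model with forward Euler.

    stimulus: array of injected current values, one per time step.
    Returns the membrane-like variable v over time.
    """
    v, w = -1.2, -0.6            # near the resting (quiescent) state
    v_trace = np.empty(len(stimulus))
    for i, I in enumerate(stimulus):
        dv = v - v**3 / 3.0 - w + I
        dw = eps * (v + a - b * w)
        v += dt * dv
        w += dt * dw
        v_trace[i] = v
    return v_trace

n_steps = 20000
no_input = np.zeros(n_steps)
pulse = np.zeros(n_steps)
pulse[1000:1500] = 0.8           # brief suprathreshold current pulse

quiet = fitzhugh_nagumo(no_input)
flipped = fitzhugh_nagumo(pulse)
print("max v without pulse: %.2f" % quiet.max())    # stays near rest
print("max v with pulse:    %.2f" % flipped.max())  # fires a spike
```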

    Training deep neural density estimators to identify mechanistic models of neural dynamics

    Mechanistic modeling in neuroscience aims to explain observed phenomena in terms of underlying causes. However, determining which model parameters agree with complex and stochastic neural data presents a significant challenge. We address this challenge with a machine learning tool that uses deep neural density estimators, trained using model simulations, to carry out Bayesian inference and retrieve the full space of parameters compatible with raw data or selected data features. Our method is scalable in parameters and data features, and can rapidly analyze new data after initial training. We demonstrate the power and flexibility of our approach on receptive fields, ion channels, and Hodgkin-Huxley models. We also characterize the space of circuit configurations giving rise to rhythmic activity in the crustacean stomatogastric ganglion, and use these results to derive hypotheses for underlying compensation mechanisms. Our approach will help close the gap between data-driven and theory-driven models of neural dynamics.
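
    The paper's tool trains deep neural density estimators on model simulations; the toy sketch below substitutes a much cruder procedure, rejection sampling on summary statistics of a made-up two-parameter simulator, purely to illustrate the simulate-compare-infer loop. The names (simulate, summarize) and all numbers are illustrative assumptions, not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta):
    """Toy stochastic 'mechanistic model': two parameters -> noisy trace."""
    gain, offset = theta
    t = np.linspace(0, 1, 100)
    return gain * np.sin(2 * np.pi * t) + offset + rng.normal(0, 0.1, t.size)

def summarize(x):
    """Reduce a trace to simple summary features."""
    return np.array([x.mean(), x.std()])

# "Observed" data generated with known ground-truth parameters.
theta_true = np.array([1.5, 0.3])
s_obs = summarize(simulate(theta_true))

# Rejection step: draw parameters from a uniform prior, keep those whose
# simulated summaries land close to the observed summaries.
n_draws, eps = 20000, 0.1
prior = rng.uniform([0.0, -1.0], [3.0, 1.0], size=(n_draws, 2))
kept = [th for th in prior
        if np.linalg.norm(summarize(simulate(th)) - s_obs) < eps]
posterior = np.array(kept)
print("accepted samples:", len(posterior))
print("posterior mean:  ", posterior.mean(axis=0))   # close to theta_true
```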

    Probing the dynamics of identified neurons with a data-driven modeling approach

    In controlling animal behavior, the nervous system has to perform within the operational limits set by the requirements of each specific behavior. The implications for the corresponding range of suitable network, single-neuron, and ion channel properties have remained elusive. In this article we approach the question of how well constrained the properties of neuronal systems may be at the single-neuron level. We used large data sets of the activity of isolated invertebrate identified cells and built an accurate conductance-based model for this cell type using customized automated parameter estimation techniques. By direct inspection of the data we found that the variability of the neurons is larger when they are isolated from the circuit than when they are in the intact system. Furthermore, the responses of the neurons to perturbations appear to be more consistent than their autonomous behavior under stationary conditions. In the developed model, the constraints on different parameters that enforce appropriate model dynamics vary widely, from some very tightly controlled parameters to others that are almost arbitrary. The model also allows us to predict the effect of blocking selected ionic currents and to prove that the origin of irregular dynamics in the neuron model is proper chaoticity, and that this chaoticity is typical in an appropriate sense. Our results indicate that data-driven models are useful tools for the in-depth analysis of neuronal dynamics. The better consistency of responses to perturbations, in the real neurons as well as in the model, suggests a paradigm shift away from measuring autonomous dynamics alone and towards protocols of controlled perturbations. Our predictions for the impact of channel blockers on the neuronal dynamics and the proof of chaoticity underscore the wide scope of our approach.
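
    The study's customized parameter estimation operates on a full conductance-based model; as a deliberately minimal stand-in for the general idea of fitting model parameters to recorded traces, the sketch below recovers only the passive membrane parameters of a single-compartment model from a synthetic current-step response using scipy's curve_fit. The parameter values and noise level are invented for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

# Synthetic "recording": passive membrane charging during a current step,
# with ground-truth R = 100 MOhm, tau = 20 ms, plus measurement noise.
t = np.linspace(0, 100, 500)             # ms
I_step = 0.1                              # nA
R_true, tau_true, E_rest = 100.0, 20.0, -70.0
rng = np.random.default_rng(1)
v_rec = (E_rest + R_true * I_step * (1 - np.exp(-t / tau_true))
         + rng.normal(0, 0.3, t.size))

def step_response(t, R, tau, E):
    """Voltage of a passive RC membrane during a current step."""
    return E + R * I_step * (1 - np.exp(-t / tau))

popt, _ = curve_fit(step_response, t, v_rec, p0=[50.0, 10.0, -60.0])
print("estimated R, tau, E_rest:", np.round(popt, 2))   # ~[100, 20, -70]
```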

    Computational modeling with spiking neural networks

    This chapter reviews recent developments in the area of spiking neural networks (SNN) and summarizes the main contributions to this research field. We give background information about the functioning of biological neurons, and discuss the most important mathematical neural models along with neural encoding techniques, learning algorithms, and applications of spiking neurons. As a specific application, the functioning of the evolving spiking neural network (eSNN) classification method is presented in detail, and the principles of numerous eSNN-based applications are highlighted and discussed.
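
    As a baseline for the spiking neuron models surveyed in the chapter, the sketch below implements a leaky integrate-and-fire neuron driven by a noisy current, using generic parameter values; it is a standard textbook illustration, not the eSNN method itself.

```python
import numpy as np

def lif_spike_times(I, dt=0.1, tau_m=20.0, R=1.0, v_rest=-65.0,
                    v_thresh=-50.0, v_reset=-70.0):
    """Leaky integrate-and-fire neuron (forward Euler).

    I: injected current per time step (arbitrary units, R = 1).
    Returns spike times in ms.
    """
    v = v_rest
    spikes = []
    for i, I_t in enumerate(I):
        dv = (-(v - v_rest) + R * I_t) / tau_m
        v += dt * dv
        if v >= v_thresh:          # threshold crossing: emit spike, reset
            spikes.append(i * dt)
            v = v_reset
    return np.array(spikes)

rng = np.random.default_rng(2)
current = 16.0 + 2.0 * rng.normal(size=5000)   # noisy suprathreshold drive
print("spike times (ms):", lif_spike_times(current)[:5])
```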

    Parameter estimation of neuron models using in-vitro and in-vivo electrophysiological data

    Spiking neuron models can accurately predict the response of neurons to somatically injected currents if the model parameters are carefully tuned. Predicting the response of in-vivo neurons responding to natural stimuli presents a far more challenging modeling problem. In this study, an algorithm is presented for parameter estimation of spiking neuron models. The algorithm is a hybrid evolutionary algorithm that uses a spike train metric as its fitness function. We apply this to parameter discovery in modeling two experimental data sets with spiking neurons: in-vitro current-injection responses from a regular-spiking pyramidal neuron are modeled using spiking neurons, and in-vivo extracellular auditory data are modeled using a two-stage model consisting of a stimulus filter and a spiking neuron model.
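
    The fitness function in the study is a spike train metric; one widely used choice is the van Rossum distance, sketched below (exponential filtering of each spike train followed by an L2 norm). The paper does not necessarily use this particular metric or these time constants; they are shown only to make the notion of spike-train-based fitness concrete.

```python
import numpy as np

def van_rossum_distance(spikes_a, spikes_b, tau=10.0, dt=0.1, t_max=500.0):
    """Distance between two spike trains (spike times in ms).

    Each train is convolved with a causal exponential kernel of time
    constant tau; the distance is the L2 norm of the difference.
    """
    t = np.arange(0.0, t_max, dt)

    def filtered(spike_times):
        trace = np.zeros_like(t)
        for s in spike_times:
            trace += np.where(t >= s, np.exp(-(t - s) / tau), 0.0)
        return trace

    diff = filtered(spikes_a) - filtered(spikes_b)
    return np.sqrt(np.sum(diff**2) * dt / tau)

target = [50.0, 120.0, 200.0, 310.0]
candidate = [55.0, 118.0, 240.0]
print("distance to target:", round(van_rossum_distance(target, candidate), 3))
# An evolutionary search would minimize this value over model parameters.
```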

    Simulation of networks of spiking neurons: A review of tools and strategies

    We review different aspects of the simulation of spiking neural networks. We start by reviewing the different types of simulation strategies and algorithms that are currently implemented. We next review the precision of those simulation strategies, in particular in cases where plasticity depends on the exact timing of the spikes. We then give an overview of the simulators and simulation environments presently available (restricted to those that are freely available, open source, and documented). For each simulation tool, its advantages and pitfalls are reviewed, with the aim of allowing the reader to identify which simulator is appropriate for a given task. Finally, we provide a series of benchmark simulations of different types of networks of spiking neurons, including Hodgkin-Huxley-type and integrate-and-fire models, interacting through current-based or conductance-based synapses, using clock-driven or event-driven integration strategies. The same set of models is implemented on the different simulators, and the codes are made available. The ultimate goal of this review is to provide a resource that facilitates identifying the appropriate integration strategy and simulation tool for a given modeling problem related to spiking neural networks. (49 pages, 24 figures, 1 table; review article, Journal of Computational Neuroscience, in press, 2007.)
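
    To make the clock-driven versus event-driven distinction concrete, the sketch below advances a leaky integrate-and-fire neuron with instantaneous synapses analytically from one incoming spike to the next, so the state is updated only at event times rather than on a fixed time grid. It is a generic illustration with assumed parameter values, not code from any of the reviewed simulators.

```python
import numpy as np

def event_driven_lif(input_spikes, weight=6.0, tau_m=20.0,
                     v_rest=-65.0, v_thresh=-50.0, v_reset=-70.0):
    """Event-driven LIF neuron with delta synapses.

    Between incoming spikes the membrane decays exponentially toward rest,
    which has a closed form, so the state is only updated at event times.
    """
    v, t_last = v_rest, 0.0
    out_spikes = []
    for t in input_spikes:
        # analytic decay from the last event to the current one
        v = v_rest + (v - v_rest) * np.exp(-(t - t_last) / tau_m)
        v += weight                      # instantaneous synaptic kick
        if v >= v_thresh:
            out_spikes.append(t)
            v = v_reset
        t_last = t
    return out_spikes

inputs = [5.0, 7.0, 9.0, 40.0, 42.0, 44.0, 46.0]   # input spike times (ms)
print("output spikes (ms):", event_driven_lif(inputs))
```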

    Neuron models of the generic bifurcation type: network analysis and data modeling

    Minimal nonlinear dynamic neuron models of the generic bifurcation type may provide a middle way between the detailed models favored by experimentalists and the simplified threshold and rate models of computational neuroscientists. This thesis investigates to what extent generic bifurcation type models capture the essential dynamical features that may turn out to play a role in cooperative neural behavior. The thesis considers two neuron models of increasing complexity and one model of synaptic interactions. The FitzHugh-Nagumo model is a simple two-dimensional model capable only of spiking behavior, and the Hindmarsh-Rose model is a three-dimensional model capable of more complex dynamics such as bursting and chaos. The model of synaptic interactions is a memoryless nonlinear function known as fast threshold modulation (FTM). By means of a combination of nonlinear system theory and bifurcation analysis, the dynamical features of the two models are extracted. The most important feature of the FitzHugh-Nagumo model is its dynamic threshold: the spike threshold depends not only on the absolute value of the membrane potential but also on the amplitude of its changes. Part of the very complex, intriguing bifurcation structure of the Hindmarsh-Rose model is revealed. By considering basic networks of FTM-coupled FitzHugh-Nagumo (spiking) or Hindmarsh-Rose (bursting) neurons, two main cooperative phenomena, synchronization and coincidence detection, are addressed. In both cases it is illustrated that pulse coupling, in combination with the intrinsic dynamics of the models, provides robustness. In large-scale networks of FTM-coupled bursting neurons, the stability of complete synchrony is independent of the network topology and depends only on the number of inputs to each neuron. The analytical results are obtained under very restrictive and biologically implausible hypotheses, but simulations show that the theoretical predictions hold in more realistic cases as well. Finally, the realism of the models is put to the test by identifying their parameters from in vitro measurements. The identification problem is addressed by resorting to standard techniques combined with heuristics based on the results of the reported mathematical analysis and on a priori knowledge from neuroscience. The FitzHugh-Nagumo model is only able to model pyramidal neurons, and even then performs worse than simple threshold models; it should be used only when the advantages of its more realistic threshold mechanism are decisive. The Hindmarsh-Rose model can capture much of the diversity of neocortical neurons; it can be used as a model in the study of heterogeneous networks and as a realistic model of a pyramidal neuron.
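
    A minimal version of the building block analyzed in the thesis, two Hindmarsh-Rose bursters coupled by fast threshold modulation, can be simulated as sketched below. The parameter values are common textbook choices rather than those of the thesis, and the coupling strength used here is an arbitrary assumption, not taken from its synchrony analysis.

```python
import numpy as np

def hr_ftm_pair(T=1000.0, dt=0.01, I_ext=3.2, g_syn=0.35,
                V_syn=2.0, theta=-0.25):
    """Two Hindmarsh-Rose neurons mutually coupled by fast threshold modulation.

    Synaptic current onto neuron i: -g_syn * (x_i - V_syn) * H(x_j - theta),
    where H is the Heaviside step function (the FTM gating).
    """
    n = int(T / dt)
    x = np.array([-1.3, -0.9])           # slightly different initial states
    y = np.array([-8.0, -8.0])
    z = np.array([2.0, 2.0])
    x_hist = np.empty((n, 2))
    for k in range(n):
        gate = (x[::-1] > theta).astype(float)    # presynaptic gating
        I_syn = -g_syn * (x - V_syn) * gate
        dx = y + 3.0 * x**2 - x**3 - z + I_ext + I_syn
        dy = 1.0 - 5.0 * x**2 - y
        dz = 0.006 * (4.0 * (x + 1.6) - z)
        x += dt * dx
        y += dt * dy
        z += dt * dz
        x_hist[k] = x
    return x_hist

traces = hr_ftm_pair()
# With sufficiently strong coupling this mismatch shrinks (complete synchrony).
print("late-time mismatch:", np.abs(traces[-5000:, 0] - traces[-5000:, 1]).max())
```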

    Evolutionary robotics and neuroscience

    No description supplied.

    Firing patterns in the adaptive exponential integrate-and-fire model

    For simulations of large spiking neuron networks, an accurate, simple, and versatile single-neuron modeling framework is required. Here we explore the versatility of a simple two-equation model: the adaptive exponential integrate-and-fire neuron. We show that this model generates multiple firing patterns depending on the choice of parameter values, and present a phase diagram describing the transition from one firing type to another. We give an analytical criterion to distinguish between continuous adaptation, initial bursting, regular bursting, and two types of tonic spiking. We also report that the deterministic model is capable of producing irregular spiking when stimulated with constant current, indicating low-dimensional chaos. Lastly, the simple model is fitted to recordings of real cortical neurons under step-current stimulation. The results support the suitability of simple models such as the adaptive exponential integrate-and-fire neuron for large network simulations.
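
    The adaptive exponential integrate-and-fire model referred to above can be written in a few lines; the sketch below uses one widely published parameter set (a Brette-Gerstner-style tonic-spiking regime with adaptation) rather than values fitted to the paper's experiments, and varying a, b, and tau_w moves the model between the firing patterns the article classifies.

```python
import numpy as np

def adex(I, dt=0.1, C=281.0, g_L=30.0, E_L=-70.6, V_T=-50.4, Delta_T=2.0,
         a=4.0, tau_w=144.0, b=80.5, V_reset=-70.6, V_peak=0.0):
    """Adaptive exponential integrate-and-fire neuron (forward Euler).

    I: injected current in pA, one value per time step (dt in ms).
    Units: capacitance pF, conductances nS, voltages mV, adaptation w in pA.
    Parameter values are typical published defaults, not fitted to data.
    Returns spike times in ms.
    """
    V, w = E_L, 0.0
    spikes = []
    for i, I_t in enumerate(I):
        dV = (-g_L * (V - E_L) + g_L * Delta_T * np.exp((V - V_T) / Delta_T)
              - w + I_t) / C
        dw = (a * (V - E_L) - w) / tau_w
        V += dt * dV
        w += dt * dw
        if V >= V_peak:                 # spike: reset V, increment adaptation
            spikes.append(i * dt)
            V = V_reset
            w += b
    return np.array(spikes)

step = np.full(5000, 800.0)             # 800 pA step for 500 ms
spike_times = adex(step)
print("number of spikes:", len(spike_times))
# Growing inter-spike intervals show the spike-frequency adaptation.
print("inter-spike intervals (ms):", np.round(np.diff(spike_times), 1)[:5])
```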