17 research outputs found

    Nonlinear Dynamics of Neural Circuits


    Optimal solid state neurons

    Bioelectronic medicine is driving the need for neuromorphic microcircuits that integrate raw nervous stimuli and respond identically to biological neurons. However, designing such circuits remains a challenge. Here we estimate the parameters of highly nonlinear conductance models and derive the ab initio equations of intracellular currents and membrane voltages embodied in analog solid-state electronics. By configuring individual ion channels of solid-state neurons with parameters estimated from large-scale assimilation of electrophysiological recordings, we successfully transfer the complete dynamics of hippocampal and respiratory neurons in silico. The solid-state neurons are found to respond nearly identically to biological neurons under stimulation by a wide range of current injection protocols. The optimization of nonlinear models demonstrates a powerful method for programming analog electronic circuits. This approach offers a route for repairing diseased biocircuits and emulating their function with biomedical implants that can adapt to biofeedback. ISSN: 2041-172
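
    As a rough illustration of the kind of parameter estimation described above (not the authors' assimilation pipeline), the sketch below fits the maximal conductances of a textbook Hodgkin-Huxley model to a synthetic voltage trace by least squares. The parameter values, the constant-current stimulus, and the use of scipy.optimize.least_squares are assumptions made for this example only.

        # Minimal sketch: recover maximal conductances of a Hodgkin-Huxley model
        # from a voltage trace (twin-experiment style); a simple stand-in for
        # full-scale data assimilation. All values are textbook assumptions.
        import numpy as np
        from scipy.integrate import odeint
        from scipy.optimize import least_squares

        C_m, E_Na, E_K, E_L = 1.0, 50.0, -77.0, -54.4       # uF/cm^2 and mV

        def gates(V):
            a_m = 0.1 * (V + 40.0) / (1.0 - np.exp(-(V + 40.0) / 10.0))
            b_m = 4.0 * np.exp(-(V + 65.0) / 18.0)
            a_h = 0.07 * np.exp(-(V + 65.0) / 20.0)
            b_h = 1.0 / (1.0 + np.exp(-(V + 35.0) / 10.0))
            a_n = 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))
            b_n = 0.125 * np.exp(-(V + 65.0) / 80.0)
            return a_m, b_m, a_h, b_h, a_n, b_n

        def hh_rhs(y, t, g_Na, g_K, g_L, I_inj):
            V, m, h, n = y
            a_m, b_m, a_h, b_h, a_n, b_n = gates(V)
            I_ion = (g_Na * m**3 * h * (V - E_Na)
                     + g_K * n**4 * (V - E_K)
                     + g_L * (V - E_L))
            return [(I_inj - I_ion) / C_m,
                    a_m * (1 - m) - b_m * m,
                    a_h * (1 - h) - b_h * h,
                    a_n * (1 - n) - b_n * n]

        t = np.linspace(0.0, 50.0, 2001)                    # ms
        y0 = [-65.0, 0.05, 0.6, 0.32]
        I_inj = 10.0                                        # current step, uA/cm^2

        def simulate(params):
            g_Na, g_K, g_L = params
            return odeint(hh_rhs, y0, t, args=(g_Na, g_K, g_L, I_inj))[:, 0]

        V_obs = simulate([120.0, 36.0, 0.3])                # synthetic "recording"

        fit = least_squares(lambda p: simulate(p) - V_obs,
                            x0=[80.0, 20.0, 0.5],
                            bounds=([1.0, 1.0, 0.01], [300.0, 100.0, 5.0]))
        print("estimated g_Na, g_K, g_L:", fit.x)

    Assimilating real recordings, as in the paper, involves many more channel parameters and stimulation protocols than this single current step.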

    Statistical Data Assimilation: Formulation and Examples From Neurobiology

    For the Research Topic Data Assimilation and Control: Theory and Applications in Life Sciences, we first review the formulation of statistical data assimilation (SDA) and discuss algorithms for exploring variational approximations to the conditional expected values of biophysical aspects of functional neural circuits. We then report on the application of SDA to (1) the exploration of properties of individual neurons in the HVC nucleus of the avian song system, and (2) the characterization of individual neurons formulated as very large scale integration (VLSI) analog circuits, with the goal of building functional, biophysically realistic VLSI representations of functional nervous systems. Networks of neurons pose a substantially greater challenge, and we comment on formulating experiments to probe the properties, especially the functional connectivity, of song command circuits within HVC.
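
    The variational approximation discussed in the review can be illustrated with a toy version of the SDA cost ("action"): a sum of measurement error against the observed voltage plus model error along the estimated path, minimized over the path and the unknown parameter. The leaky-membrane model, the error weights Rm and Rf, and all numerical values below are illustrative assumptions, not the paper's setup.

        # Minimal weak-constraint variational sketch for a toy membrane model
        # V' = (-(V - E_L) + R*I) / tau with unknown time constant tau.
        import numpy as np
        from scipy.optimize import minimize

        dt, N = 0.1, 200
        E_L, R = -65.0, 10.0
        I = np.where(np.arange(N) * dt > 5.0, 1.5, 0.0)    # step stimulus

        def forward(tau):                                  # synthetic "data"
            V = np.empty(N); V[0] = E_L
            for k in range(N - 1):
                V[k + 1] = V[k] + dt * (-(V[k] - E_L) + R * I[k]) / tau
            return V

        V_obs = forward(5.0) + 0.2 * np.random.default_rng(0).normal(size=N)

        Rm, Rf = 1.0, 100.0                                # error weights (assumed)

        def action(z):                                     # z = [V_0..V_{N-1}, tau]
            V, tau = z[:N], z[N]
            meas = Rm * np.sum((V - V_obs) ** 2)
            pred = V[:-1] + dt * (-(V[:-1] - E_L) + R * I[:-1]) / tau
            model = Rf * np.sum((V[1:] - pred) ** 2)
            return meas + model

        z0 = np.concatenate([V_obs, [2.0]])                # start path at the data
        res = minimize(action, z0, method="L-BFGS-B",
                       bounds=[(None, None)] * N + [(0.5, 50.0)])
        print("estimated tau:", res.x[N])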

    Noise-activated barrier crossing in multi-attractor spiking networks


    Data assimilation for conductance-based neuronal models

    This dissertation illustrates the use of data assimilation algorithms to estimate unobserved variables and unknown parameters of conductance-based neuronal models. Modern data assimilation (DA) techniques are widely used in climate science and weather prediction, but have only recently begun to be applied in neuroscience. The two main classes of DA techniques are sequential methods and variational methods. Throughout this work, twin experiments, in which the data are synthetically generated from the output of the model, are used to validate these techniques for conductance-based models when only the voltage trace is observed. In Chapter 1, these techniques are described in detail and the estimation problem for conductance-based neuron models is derived. In Chapter 2, these techniques are applied to a minimal conductance-based model, the Morris-Lecar model. This model exhibits qualitatively different types of neuronal excitability due to changes in the underlying bifurcation structure, and it is shown that the DA methods can identify parameter sets that produce the correct bifurcation structure even with initial parameter guesses that correspond to a different excitability regime. This demonstrates the ability of DA techniques to perform nonlinear state and parameter estimation, and introduces the geometric structure of inferred models as a novel qualitative measure of estimation success. Chapter 3 extends the ideas of variational data assimilation to include a control term that relaxes the problem further, a process referred to in the geoscience community as nudging. The nudged 4D-Var is applied to twin experiments from a more complex, Hodgkin-Huxley-type two-compartment model for various time-sampling strategies. This controlled 4D-Var with nonuniform time sampling is then applied to voltage traces from current-clamp recordings of suprachiasmatic nucleus neurons in diurnal rodents to improve our understanding of the driving forces behind circadian (~24 h) rhythms of electrical activity. In Chapter 4, the complementary strengths of 4D-Var and the unscented Kalman filter (UKF) are leveraged to create a two-stage algorithm that uses 4D-Var to estimate fast-timescale parameters and the UKF for slow-timescale parameters. This coupled approach is applied to data from a conductance-based model of neuronal bursting with distinct slow and fast timescales in its dynamics. In Chapter 5, the ideas of identifiability and sensitivity are introduced. The Morris-Lecar model and a subset of its parameters are shown to be identifiable through the use of numerical techniques. Chapter 6 frames the selection of stimulus waveforms to inject into neurons during patch-clamp recordings as an optimal experimental design problem. Results on the optimal stimulus waveforms for improving the identifiability of parameters of a Hodgkin-Huxley-type model are presented. Chapter 7 shows a preliminary application of data assimilation to voltage-clamp, rather than current-clamp, data and builds on voltage-clamp principles to formulate a reduced assimilation problem driven by the observed voltage. Concluding thoughts are given in Chapter 8.
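
    The nudging idea from Chapter 3 can be sketched with a small twin experiment on the Morris-Lecar model of Chapter 2: a copy of the model with a wrong conductance and wrong initial state is driven toward the synthetic observed voltage by a control term proportional to the voltage error. The parameter set, the injected current, and the gain below are assumptions chosen for illustration, not values from the dissertation.

        # Minimal nudging sketch on the Morris-Lecar model (twin experiment):
        # the control u = k_gain * (V_obs - V) pulls the mismatched model onto
        # the observed voltage trace. All values are illustrative assumptions.
        import numpy as np

        C, g_L, g_K = 20.0, 2.0, 8.0
        E_L, E_Ca, E_K = -60.0, 120.0, -84.0
        V1, V2, V3, V4, phi, I_app = -1.2, 18.0, 2.0, 30.0, 0.04, 100.0

        def ml_step(V, w, g_Ca, dt, u=0.0):
            m_inf = 0.5 * (1.0 + np.tanh((V - V1) / V2))
            w_inf = 0.5 * (1.0 + np.tanh((V - V3) / V4))
            tau_w = 1.0 / np.cosh((V - V3) / (2.0 * V4))
            dV = (I_app - g_L * (V - E_L) - g_Ca * m_inf * (V - E_Ca)
                  - g_K * w * (V - E_K)) / C + u
            dw = phi * (w_inf - w) / tau_w
            return V + dt * dV, w + dt * dw

        dt, N = 0.05, 20000                       # 1000 ms of Euler integration
        V_obs = np.empty(N); V, w = -40.0, 0.0
        for k in range(N):                        # "data" from the true model
            V_obs[k] = V
            V, w = ml_step(V, w, 4.4, dt)

        k_gain = 5.0                              # nudging gain (assumed)
        V_est = np.empty(N); V, w = -20.0, 0.3    # wrong state and g_Ca = 3.5
        for k in range(N):
            V_est[k] = V
            V, w = ml_step(V, w, 3.5, dt, u=k_gain * (V_obs[k] - V))

        print("final voltage mismatch (mV):", abs(V_est[-1] - V_obs[-1]))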

    Brain-machine interface coupled cognitive sensory fusion with a Kohonen and reservoir computing scheme

    Artificial Intelligence (AI) has been a source of great intrigue and has spawned many questions regarding the human condition and the core of what it means to be a sentient entity. The field has bifurcated into so-called “weak” and “strong” artificial intelligence. In weak artificial intelligence reside the forms of automation and data mining that we interact with on a daily basis. Strong artificial intelligence can best be defined as a “synthetic” being with cognitive abilities and the capacity for presence of mind that we would normally associate with humankind. We feel that this distinction is misguided. We begin with the statement that intelligence lies on a spectrum, even in artificial systems. The fact that our systems can currently be considered weak artificial intelligence does not preclude our ability to develop an understanding that can lead us to more complex behavior. In this research, we utilized neural feedback via electroencephalogram (EEG) data to develop an emotional landscape for linguistic interaction via the android's sensory fields, which we consider part and parcel of embodied cognition. We have also given the iCub child android the instinct to babble the words it has learned, a skill that we leveraged for low-level linguistic acquisition in the latter part of this research, the slightly stronger artificial intelligence goal. This research is motivated by two main questions regarding intelligence: Is intelligence an emergent phenomenon? And, if so, can multi-modal sensory information, together with what we term “co-intelligence” (a shared sensory experience created by coupling EEG input), assist in the development of the representations in the mind that we colloquially refer to as language? Given that it is not reasonable to program all of the activities needed to foster intelligence in artificial systems, our hope is that these types of forays will set the stage for further development of stronger artificial intelligence constructs. We have incorporated self-organizing processes, i.e. Kohonen maps and hidden Markov models for speech, language development, and emotional information derived from neural data, to help lay the substrate for emergence. Next, homage is paid to the central and unique role that language plays in intellectual study. We have also developed a rudimentary associative memory for the iCub that is derived from the aforementioned sensory input that was collected. We formalized this process only as needed, on the assumption that mind, brain, and language can be represented using the mathematics and logic of the day without contradiction. We have some reservations regarding this assumption, but unfortunately a proof is a task beyond the scope of this Ph.D. Finally, the data from coupling the EEG with the other sensory modes of embodied cognition are used to interact with a reservoir computing recurrent neural network in an attempt to produce simple language interaction, e.g. babbling, from the child android.
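
    The reservoir computing component mentioned above can be sketched as a minimal echo state network with a ridge-regression readout. The EEG coupling, Kohonen maps and hidden Markov models of the actual system are not reproduced here, and the reservoir size, spectral radius and toy prediction task are assumptions made for this example.

        # Minimal echo state network (reservoir computing) sketch: a fixed random
        # recurrent reservoir plus a linear readout trained by ridge regression.
        import numpy as np

        rng = np.random.default_rng(1)
        n_in, n_res, T = 1, 300, 2000

        W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
        W = rng.uniform(-0.5, 0.5, (n_res, n_res))
        W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius 0.9

        u = np.sin(0.2 * np.arange(T))[:, None]           # toy input stream
        y_target = np.sin(0.2 * (np.arange(T) + 5))       # predict 5 steps ahead

        X = np.zeros((T, n_res)); x = np.zeros(n_res)
        for t in range(T):
            x = np.tanh(W_in @ u[t] + W @ x)              # reservoir state update
            X[t] = x

        washout, ridge = 200, 1e-6                        # drop transient states
        A = X[washout:]
        W_out = np.linalg.solve(A.T @ A + ridge * np.eye(n_res),
                                A.T @ y_target[washout:]) # ridge readout weights
        print("train MSE:", np.mean((A @ W_out - y_target[washout:]) ** 2))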

    Self Organisation and Hierarchical Concept Representation in Networks of Spiking Neurons

    The aim of this work is to introduce modular processing mechanisms for cortical functions implemented in networks of spiking neurons. Neural maps are a feature of cortical processing found to be generic throughout sensory cortical areas, and self-organisation to the fundamental properties of input spike trains has been shown to be an important property of cortical organisation. Additionally, oscillatory behaviour, temporal coding of information, and learning through spike-timing-dependent plasticity (STDP) are all frequently observed in the cortex. The traditional self-organising map (SOM) algorithm attempts to capture the computational properties of this cortical self-organisation in a neural network. As such, a cognitive module for a spiking SOM using oscillations, phasic coding and STDP has been implemented. This model is capable of mapping to distributions of input data in a manner consistent with the traditional SOM algorithm, and of categorising generic input data sets. Higher-level cortical processing areas appear to feature a hierarchical category structure that is founded on a feature-based object representation. The spiking SOM model is therefore extended to accept input patterns in the form of sets of binary feature-object relations, such as those seen in the field of formal concept analysis. It is demonstrated that this extended model is capable of learning to represent the hierarchical conceptual structure of an input data set using the existing learning scheme. Furthermore, manipulations of network parameters allow the level of hierarchy used for either learning or recall to be adjusted, and the network is capable of learning comparable representations when trained with incomplete input patterns. Together these two modules provide related approaches to the generation of both topographic mapping and hierarchical representation of input spaces, which can potentially be combined and used as the basis for advanced spiking neuron models of the learning of complex representations.
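
    For reference, a minimal sketch of the pair-based STDP rule referred to above. The thesis's spiking SOM combines STDP with oscillations and phasic coding, which are not modelled here; the amplitudes and time constants below are common textbook assumptions.

        # Minimal pair-based STDP sketch: pre-before-post spike pairs potentiate
        # a synapse, post-before-pre pairs depress it, with exponential windows.
        import numpy as np

        A_plus, A_minus = 0.01, 0.012        # potentiation / depression amplitudes
        tau_plus, tau_minus = 20.0, 20.0     # ms

        def stdp_dw(t_pre, t_post):
            """Weight change for a single pre/post spike pair (times in ms)."""
            dt = t_post - t_pre
            if dt > 0:                       # pre before post -> potentiation
                return A_plus * np.exp(-dt / tau_plus)
            return -A_minus * np.exp(dt / tau_minus)   # otherwise -> depression

        # A synapse whose presynaptic spike repeatedly precedes the postsynaptic
        # spike by 5 ms is strengthened over trials (weight clipped to [0, 1]).
        w = 0.5
        for trial in range(100):
            w = np.clip(w + stdp_dw(t_pre=10.0, t_post=15.0), 0.0, 1.0)
        print("weight after 100 causal pairings:", float(w))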

    Engineering derivatives from biological systems for advanced aerospace applications

    The present study consisted of a literature survey, a survey of researchers, and a workshop on bionics. These tasks produced an extensive annotated bibliography of bionics research (282 citations), a directory of bionics researchers, and a workshop report on specific bionics research topics applicable to space technology. These deliverables are included as Appendix A, Appendix B, and Section 5.0, respectively. To provide organization to this highly interdisciplinary field and to serve as a guide for interested researchers, we have also prepared a taxonomy or classification of the various subelements of natural engineering systems. Finally, we have synthesized the results of the various components of this study into a discussion of the most promising opportunities for accelerated research, seeking solutions that apply engineering principles from natural systems to advanced aerospace problems. A discussion of opportunities within the areas of materials, structures, sensors, information processing, robotics, autonomous systems, life support systems, and aeronautics is given. Following the conclusions are six discipline summaries that highlight the potential benefits of research in these areas for NASA's space technology programs.