Consistent Recovery of Sensory Stimuli Encoded with MIMO Neural Circuits
We consider the problem of reconstructing finite-energy stimuli encoded with a population of spiking leaky integrate-and-fire neurons. The reconstructed signal satisfies a consistency condition: when passed through the same neuron, it triggers the same spike train as the original stimulus. The recovered stimulus must also minimize a quadratic smoothness criterion. We formulate the reconstruction as a spline interpolation problem for scalar- as well as vector-valued stimuli and show that the recovery has a unique solution. We provide explicit reconstruction algorithms for stimuli encoded with a single integrate-and-fire neuron as well as with a population of such neurons. We demonstrate how our reconstruction algorithms can be applied to stimuli encoded with ON-OFF neural circuits with feedback. Finally, we extend the formalism to multi-input multi-output (MIMO) neural circuits and demonstrate that vector-valued finite-energy signals can be efficiently encoded by a neural population provided that its size exceeds a threshold value. Examples are given that demonstrate the potential applications of our methodology to systems neuroscience and neuromorphic engineering.
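As a minimal illustration of this style of time encoding and consistent recovery, the sketch below simulates an ideal (non-leaky) integrate-and-fire encoder and recovers the stimulus from its spike times by solving a linear system over bandlimited (sinc) kernels. This is a simplified stand-in for the spline-based consistent recovery described in the abstract: all parameter values, the sinc basis, and the pseudoinverse solve are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def iaf_encode(u, t, b=1.5, kappa=1.0, delta=0.02):
    """Ideal integrate-and-fire encoder: spike whenever the running
    integral of (u + b) reaches kappa*delta, then subtract the threshold."""
    dt = t[1] - t[0]
    y, spikes = 0.0, []
    for i, ui in enumerate(u):
        y += (ui + b) * dt
        if y >= kappa * delta:
            spikes.append(t[i])
            y -= kappa * delta
    return np.array(spikes)

def iaf_decode(spikes, t, b=1.5, kappa=1.0, delta=0.02, omega=2*np.pi*20):
    """Each interspike interval yields one linear measurement of u:
    q_k = kappa*delta - b*(t_{k+1} - t_k). Solve for coefficients of
    sinc kernels (bandwidth omega) centred at interval midpoints."""
    dt = t[1] - t[0]
    tk = spikes
    q = kappa * delta - b * np.diff(tk)      # measurements
    s = 0.5 * (tk[:-1] + tk[1:])             # kernel centres
    Phi = np.sinc(omega / np.pi * (t[None, :] - s[:, None]))
    G = np.zeros((len(q), len(s)))
    for k in range(len(q)):                  # G[k,l] = integral of kernel l
        mask = (t >= tk[k]) & (t < tk[k + 1])
        G[k] = Phi[:, mask].sum(axis=1) * dt
    c = np.linalg.pinv(G) @ q
    return c @ Phi

# bandlimited test stimulus on a fine grid
t = np.arange(0, 1, 1e-4)
u = 0.4 * np.sin(2*np.pi*5*t) + 0.3 * np.cos(2*np.pi*11*t)
spk = iaf_encode(u, t)
u_hat = iaf_decode(spk, t)
```

Because the spike rate here (roughly b/(kappa*delta) spikes per second) well exceeds the stimulus bandwidth, the recovery is accurate away from the boundaries of the observation window.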
Identification of linear and nonlinear sensory processing circuits from spiking neuron data
Inferring mathematical models of sensory processing systems directly from input-output observations, while making the fewest assumptions about the model equations and the types of measurements available, is still a major issue in computational neuroscience. This letter introduces two new approaches for identifying sensory circuit models consisting of linear and nonlinear filters in series with spiking neuron models, based only on the sampled analog input to the filter and the recorded spike train output of the spiking neuron. For an ideal integrate-and-fire neuron model, the first algorithm can identify the spiking neuron parameters as well as the structure and parameters of an arbitrary nonlinear filter connected to it. The second algorithm can identify the parameters of the more general leaky integrate-and-fire spiking neuron model, as well as the parameters of an arbitrary linear filter connected to it. Numerical studies involving simulated and real experimental recordings are used to demonstrate the applicability and evaluate the performance of the proposed algorithms.
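The core idea of identifying spiking-neuron parameters from input and spike times can be caricatured in a few lines for the ideal (non-leaky) case: every interspike interval yields one linear equation in the unknown parameters, which are then estimated by least squares. The encoder convention and all numeric values below are illustrative assumptions, not the letter's algorithms.

```python
import numpy as np

def iaf_encode(u, t, b, kd):
    """Assumed ideal integrate-and-fire convention: spike whenever the
    running integral of (u + b) reaches kd = kappa*delta, then reset."""
    dt = t[1] - t[0]
    y, spikes = 0.0, []
    for i, ui in enumerate(u):
        y += (ui + b) * dt
        if y >= kd:
            spikes.append(t[i])
            y -= kd
    return np.array(spikes)

t = np.arange(0, 2, 1e-4)
u = 0.5 * np.sin(2*np.pi*3*t)            # known analog input
b_true, kd_true = 1.0, 0.015
tk = iaf_encode(u, t, b_true, kd_true)   # observed spike train

# each interspike interval gives one linear equation in (kd, b):
#   kd - b * (t_{k+1} - t_k) = integral of u over the interval
dt = t[1] - t[0]
isi = np.diff(tk)
q = np.array([u[(t >= a) & (t < c)].sum() * dt
              for a, c in zip(tk[:-1], tk[1:])])
A = np.column_stack([np.ones_like(isi), -isi])
kd_est, b_est = np.linalg.lstsq(A, q, rcond=None)[0]
```

The varying input is what makes the two parameters jointly identifiable: with a constant input, all interspike intervals would be equal and the system of equations would be rank-deficient.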
Identification of Dendritic Processing in Spiking Neural Circuits
A large body of experimental evidence points to sophisticated signal processing taking place at the level of dendritic trees and dendritic branches of neurons. This evidence suggests that, in addition to inferring the connectivity between neurons, identifying analog dendritic processing in individual cells is fundamentally important to understanding the underlying principles of neural computation. In this thesis, we develop a novel theoretical framework for the identification of dendritic processing directly from spike times produced by spiking neurons. The problem setting of spiking neurons is necessary since such neurons make up the majority of electrically excitable cells in most nervous systems and it is often hard or even impossible to directly monitor the activity within dendrites. Thus, action potentials produced by neurons often constitute the only causal and observable correlate of dendritic processing. In order to remain true to the underlying biophysics of electrically excitable cells, we employ well-established mechanistic models of action potential generation to describe the nonlinear mapping of the aggregate current produced by the tree into an asynchronous sequence of spikes. Specific models of spike generation considered include conductance-based models such as Hodgkin-Huxley, Morris-Lecar, and FitzHugh-Nagumo, as well as simpler models of the integrate-and-fire and threshold-and-fire type. The aggregate time-varying current driving the spike generator is taken to be produced by a dendritic stimulus processor, which is a nonlinear dynamical system capable of describing arbitrary linear and nonlinear transformations performed on one or more input stimuli. In the case of multiple stimuli, it can also describe the cross-coupling, or interaction, between various stimulus features.
The behavior of the dendritic stimulus processor is fully captured by one or more kernels, which provide a characterization of the signal processing that is consistent with the broader cable theory description of dendritic trees. We prove that the neural identification problem, stated in terms of identifying the kernels of the dendritic stimulus processor, is mathematically dual to the neural population encoding problem. Specifically, we show that the collection of spikes produced by a single neuron in multiple experimental trials can be treated as a single multidimensional spike train of a population of neurons encoding the parameters of the dendritic stimulus processor. Using the theory of sampling in reproducing kernel Hilbert spaces, we then derive precise results demonstrating that, during any experiment, the entire neural circuit is projected onto the space of input stimuli and parameters of this projection are faithfully encoded in the spike train. Spike times are shown to correspond to generalized samples, or measurements, of this projection in a system of coordinates that is not fixed but is both neuron- and stimulus-dependent. We examine the theoretical conditions under which it may be possible to reconstruct the dendritic stimulus processor from these samples and derive corresponding experimental conditions for the minimum number of spikes and stimuli that need to be used. We also provide explicit algorithms for reconstructing the kernel projection and demonstrate that, under natural conditions, this projection converges to the true kernel. The developed methodology is quite general and can be applied to a number of neural circuits. In particular, the methods discussed span all sensory modalities, including vision, audition and olfaction, in which external stimuli are typically continuous functions of time and space. 
The results can also be applied to circuits in higher brain centers that receive multi-dimensional spike trains as input stimuli instead of continuous signals. In addition, the modularity of the approach allows one to extend it to mixed-signal circuits processing both continuous and spiking stimuli, to circuits with extensive lateral connections and feedback, as well as to multisensory circuits concurrently processing multiple stimuli of different dimensions, such as audio and video. Another important extension of the approach can be used to estimate the phase response curves of a neuron. All of the theoretical results are accompanied by detailed examples demonstrating the performance of the proposed identification algorithms. We employ both synthetic and naturalistic stimuli, such as natural video and audio, to highlight the power of the approach. Finally, we consider the implications of our work for problems pertaining to neural encoding and decoding and discuss promising directions for future research.
Neural System Identification with Spike-triggered Non-negative Matrix Factorization
Neuronal circuits formed in the brain are complex, with intricate connection patterns. Such complexity is also observed in the retina as a relatively simple neuronal circuit. A retinal ganglion cell receives excitatory inputs from neurons in previous layers as driving forces to fire spikes. Analytical methods are required that can decipher these components in a systematic manner. Recently, a method termed spike-triggered non-negative matrix factorization (STNMF) has been proposed for this purpose. In this study, we extend the scope of the STNMF method. Using the retinal ganglion cell as a model system, we show that STNMF can detect various computational properties of upstream bipolar cells, including the spatial receptive field, temporal filter, and transfer nonlinearity. In addition, we recover synaptic connection strengths from the weight matrix of STNMF. Furthermore, we show that STNMF can separate the spikes of a ganglion cell into a few subsets, each contributed by one presynaptic bipolar cell. Taken together, these results corroborate that STNMF is a useful method for deciphering the structure of neuronal circuits.
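The STNMF pipeline can be sketched in a few dozen lines: simulate a ganglion cell driven by rectified, non-overlapping subunit filters, collect the spike-triggered stimulus ensemble, and factorize it with plain multiplicative-update NMF. Everything here (the 4x4 stimulus, the subunit layout, the rank, the update rule) is an illustrative assumption rather than the authors' implementation, which operates on real recordings.

```python
import numpy as np

rng = np.random.default_rng(0)

# toy circuit: 3 non-overlapping "bipolar" subunit filters over 16 pixels
n_pix = 16
masks = np.zeros((3, n_pix))
masks[0, 0:4] = 1.0
masks[1, 6:10] = 1.0
masks[2, 12:16] = 1.0

# white-noise stimulus frames in [0, 1]; subunit outputs are rectified,
# and the ganglion cell fires with probability proportional to their sum
n_frames = 20000
X = rng.uniform(0, 1, size=(n_frames, n_pix))
drive = np.maximum(0, (X - 0.5) @ masks.T).sum(axis=1)
p_spike = np.clip(drive / drive.max(), 0, 1)
spiked = rng.uniform(size=n_frames) < p_spike

V = X[spiked]  # spike-triggered stimulus ensemble (non-negative)

# plain multiplicative-update NMF, V ~ W @ H; rows of H are candidate modules
rank, eps = 4, 1e-9
W = rng.uniform(0.1, 1, size=(V.shape[0], rank))
H = rng.uniform(0.1, 1, size=(rank, n_pix))
for _ in range(200):
    H *= (W.T @ V) / (W.T @ W @ H + eps)
    W *= (V @ H.T) / (W @ H @ H.T + eps)
```

With enough spikes, the rows of H tend to localize on the subunit supports, which is the mechanism by which STNMF exposes upstream receptive fields.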
Augmentation of Brain Function: Facts, Fiction and Controversy. Volume III: From Clinical Applications to Ethical Issues and Futuristic Ideas
The final volume in this tripartite series on Brain Augmentation is entitled “From Clinical Applications to Ethical Issues and Futuristic Ideas”. Many of the articles within this volume deal with translational efforts, taking the results of experiments on laboratory animals and applying them to humans. In many cases, these interventions are intended to help people with disabilities in such a way as to either restore or extend brain function. Traditionally, therapies in brain augmentation have included electrical and pharmacological techniques. In contrast, some of the techniques discussed in this volume add specificity by targeting select neural populations. This approach opens the door to determining where and how to deliver the most effective interventions. Along the way, results have empowered the medical profession by expanding its understanding of brain function. Articles in this volume relate novel clinical solutions for a host of neurological and psychiatric conditions such as stroke, Parkinson’s disease, Huntington’s disease, epilepsy, dementia, Alzheimer’s disease, autism spectrum disorders (ASD), traumatic brain injury, and disorders of consciousness. In disease, symptoms and signs denote a departure from normal function. Brain augmentation has now been used to target both the core symptoms that provide specificity in the diagnosis of a disease and other constitutional symptoms that may greatly handicap the individual. The volume provides a report on the use of repetitive transcranial magnetic stimulation (rTMS) in ASD, with reported improvements of core deficits (i.e., executive functions). TMS in this regard departs from the present-day trend towards symptomatic treatment that leaves the root cause of the condition unaltered. In diseases such as schizophrenia, brain augmentation approaches hold promise to avoid lengthy pharmacological interventions that are usually riddled with side effects or offer diminishing returns, as in the case of Parkinson’s disease.
Brain stimulation can also be used to treat auditory verbal hallucination, visuospatial (hemispatial) neglect, and pain in patients suffering from multiple sclerosis. The brain acts as a telecommunication transceiver wherein different frequency bands (brainwave oscillations) transmit information. Their baseline levels correlate with certain behavioral states. The proper integration of brain oscillations provides for the phenomena of binding and central coherence. Brain augmentation may foster the normalization of brain oscillations in nervous system disorders. These techniques hold the promise of being applied remotely (under the supervision of medical personnel), thus overcoming the obstacle of travel to obtain healthcare. At present, traditional thinking would argue for the possibility of synergism among different modalities of brain augmentation as a way of increasing their overall effectiveness and improving therapeutic selectivity. Thinking outside the box would also provide for the implementation of brain-to-brain interfaces, where techniques proper to artificial intelligence could allow us to surpass the limits of natural selection, enable communication between several individual brains sharing memories, or even support a global brain capable of self-organization. Not all brains are created equal. Brain stimulation studies suggest large individual variability in response that may affect overall recovery/treatment or modify the desired effects of a given intervention. A subject’s age, gender, and hormonal levels may affect an individual’s cortical excitability. In addition, this volume discusses the role of social interactions in the operation of augmenting technologies. Augmenting methods could also be applied to modulate consciousness, even though its neural mechanisms are poorly understood. Finally, this volume should be taken as a debate on the social, moral, and ethical issues raised by neurotechnologies.
Brain enhancement may transform the individual into someone or something else. These techniques bypass the usual routes of accommodation to environmental exigencies that have built our personal fortitude: learning, exercise, and diet. They would allow humans to preselect desired characteristics and realize the consequent rewards without having to overcome adversity through more laborious means. The concerns are that humans may be playing God, and that an expanding gap in social equity may leave brain enhancements selectively available to wealthier individuals. These issues are discussed in a number of articles in this volume. Also discussed are the relationship between diminishment and enhancement following the application of brain-augmenting technologies, the problem of “mind control” with BMI technologies, free will, the duty to use cognitive enhancers in high-responsibility professions, determining the population of people in need of brain enhancement, informed public policy, cognitive biases, and the hype caused by the development of brain-augmenting approaches.
Nonlinear System Identification of Neural Systems from Neurophysiological Signals
The human nervous system is one of the most complicated systems in nature. Complex nonlinear behaviours have been observed from the single-neuron level to the system level. For decades, linear connectivity analysis methods, such as correlation, coherence, and Granger causality, have been extensively used to assess neural connectivity and input-output interconnections in neural systems. Recent studies indicate that these linear methods can capture only a small portion of neural activity and functional relationships, and therefore cannot describe neural behaviour in a precise or complete way. In this review, we highlight recent advances in nonlinear system identification of neural systems, the corresponding time- and frequency-domain analyses, and novel neural connectivity measures based on nonlinear system identification techniques. We argue that nonlinear modelling and analysis are necessary to study neuronal processing and signal transfer in neural systems quantitatively. These approaches can hopefully provide new insights to advance our understanding of the neurophysiological mechanisms underlying neural functions. They also have the potential to produce sensitive biomarkers that facilitate the development of precision diagnostic tools for evaluating neurological disorders and the effects of targeted interventions.
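The review's central point, that linear measures can miss genuinely nonlinear coupling, is easy to demonstrate: below, a purely quadratic input-output relationship is invisible to lagged cross-correlation but is recovered by a polynomial NARX-style regression. The toy system and its coefficients are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000
x = rng.normal(size=n)                       # input signal
y = np.empty(n)
y[0] = 0.0
y[1:] = 0.8 * x[:-1]**2 + 0.1 * rng.normal(size=n - 1)  # quadratic coupling

# linear measure: lagged cross-correlation misses the quadratic dependence,
# because cov(x, x^2) = 0 for a symmetric zero-mean input
r = np.corrcoef(x[:-1], y[1:])[0, 1]

# nonlinear (polynomial NARX-style) model: regress y_t on x_{t-1}, x_{t-1}^2
A = np.column_stack([np.ones(n - 1), x[:-1], x[:-1]**2])
coef, *_ = np.linalg.lstsq(A, y[1:], rcond=None)
```

Here `r` is statistically indistinguishable from zero while the regression recovers the quadratic coefficient, which is the qualitative behaviour the review attributes to nonlinear identification methods in general.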
Neuronal oscillations, information dynamics, and behaviour: an evolutionary robotics study
Oscillatory neural activity is closely related to cognition and behaviour, with synchronisation mechanisms playing a key role in the integration and functional organisation of different cortical areas. Nevertheless, its informational content and its relationship with behaviour, and hence cognition, are still to be fully understood.
This thesis is concerned with better understanding the role of neuronal oscillations and information dynamics towards the generation of embodied cognitive behaviours and with investigating the efficacy of such systems as practical robot controllers. To this end, we develop a novel model based on the Kuramoto model of coupled phase oscillators and perform three minimally cognitive evolutionary robotics experiments. The analyses focus both on a behavioural level description, investigating the robot’s trajectories, and on a mechanism level description, exploring the variables’ dynamics and the information transfer properties within and between the agent’s body and the environment.
The first experiment demonstrates that in an active categorical perception task under normal and inverted vision, networks with a definite, but not too strong, propensity for synchronisation are more able to reconfigure, to organise themselves functionally, and to adapt to different behavioural conditions. The second experiment relates assembly constitution and phase reorganisation dynamics to performance in supervised and unsupervised learning tasks. We demonstrate that assembly dynamics facilitate the evolutionary process, can account for varying degrees of stimuli modulation of the sensorimotor interactions, and can contribute to solving different tasks leaving aside other plasticity mechanisms. The third experiment explores an associative learning task considering a more realistic connectivity pattern between neurons. We demonstrate that networks with travelling waves as a default solution perform poorly compared to networks that are normally synchronised in the absence of stimuli.
Overall, this thesis shows that neural synchronisation dynamics, when suitably flexible and reconfigurable, produce an asymmetric flow of information and can generate minimally cognitive embodied behaviours.
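The Kuramoto model underlying these experiments is compact enough to sketch directly. The mean-field form below integrates globally coupled phase oscillators and tracks the order parameter r(t), whose asymptotic value separates weakly and strongly coupled regimes. The parameter values and the Gaussian frequency distribution are illustrative assumptions, not the thesis's evolved networks.

```python
import numpy as np

def kuramoto(n=50, K=2.0, T=20.0, dt=0.01, seed=0):
    """Euler-integrate n globally coupled Kuramoto phase oscillators,
    d(theta_i)/dt = omega_i + K * r * sin(psi - theta_i),
    and return the time course of the order parameter r(t) in [0, 1]."""
    rng = np.random.default_rng(seed)
    omega = rng.normal(0.0, 0.5, n)        # natural frequencies
    theta = rng.uniform(0, 2*np.pi, n)     # initial phases
    r_hist = []
    for _ in range(int(T / dt)):
        z = np.exp(1j * theta).mean()      # complex order parameter r*e^{i psi}
        r_hist.append(abs(z))
        theta += dt * (omega + K * abs(z) * np.sin(np.angle(z) - theta))
    return np.array(r_hist)

r_weak = kuramoto(K=0.1)    # below the critical coupling: incoherent
r_strong = kuramoto(K=4.0)  # well above it: near-full synchronisation
```

The contrast between the two runs mirrors the thesis's observation that a definite but tunable propensity for synchronisation is the functionally interesting regime.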
Recent Advances in Signal Processing
Signal processing is a critical issue in the majority of new technological inventions and poses challenges in a wide variety of applications across science and engineering. Classical signal processing techniques have largely worked with mathematical models that are linear, local, stationary, and Gaussian, and they have always favored closed-form tractability over real-world accuracy. These constraints were imposed by the lack of powerful computing tools. During the last few decades, signal processing theories, developments, and applications have matured rapidly and now include tools from many areas of mathematics, computer science, physics, and engineering. This book is targeted primarily toward students and researchers who want to be exposed to a wide variety of signal processing techniques and algorithms. It includes 27 chapters that can be categorized into five different areas depending on the application at hand; these five categories address, in order, image processing, speech processing, communication systems, time-series analysis, and educational packages. The book has the advantage of providing a collection of applications that are completely independent and self-contained; thus, the interested reader can choose any chapter and skip to another without losing continuity.
Closed-loop approaches for innovative neuroprostheses
The goal of this thesis is to study new ways to interact with the nervous system in the case of damage or pathology. In particular, I focused my efforts on the development of innovative, closed-loop stimulation protocols in various scenarios: in vitro, ex vivo, and in vivo.