1,111 research outputs found
How Gibbs distributions may naturally arise from synaptic adaptation mechanisms. A model-based argumentation
This paper addresses two questions in the context of neuronal network dynamics, using methods from dynamical systems theory and statistical physics: (i) how to characterize the statistical properties of sequences of action potentials ("spike trains") produced by neuronal networks; and (ii) what are the effects of synaptic plasticity on these statistics? We introduce a framework in which spike trains are associated with a coding of membrane potential trajectories and, in important explicit examples (the so-called gIF models), actually constitute a symbolic coding. On this basis, we use the thermodynamic formalism from ergodic theory to show how Gibbs distributions are natural probability measures for describing the statistics of spike trains, given the empirical averages of prescribed quantities. As a second result, we show that Gibbs distributions naturally arise when considering "slow" synaptic plasticity rules, where the characteristic time for synapse adaptation is much longer than the characteristic time for neuron dynamics.
Comment: 39 pages, 3 figures
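As a reader's aid (not text from the paper), the "Gibbs distributions given the empirical averages of prescribed quantities" can be made concrete via the maximum-entropy construction: among all probability measures reproducing the empirical averages of observables $\phi_k$, the entropy-maximizing one has the Gibbs form

```latex
P(\omega) \;=\; \frac{1}{Z(\lambda)}\,\exp\!\Big(\sum_{k} \lambda_k\, \phi_k(\omega)\Big),
\qquad
Z(\lambda) \;=\; \sum_{\omega} \exp\!\Big(\sum_{k} \lambda_k\, \phi_k(\omega)\Big),
```

with the Lagrange multipliers $\lambda_k$ chosen so that $\mathbb{E}_P[\phi_k]$ matches the corresponding empirical average of $\phi_k$.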
A Framework for Studying Synaptic Plasticity with Neural Spike Train Data
Learning and memory in the brain are implemented by complex, time-varying changes in neural circuitry. The computational rules according to which synaptic weights change over time are the subject of much research, and are not precisely understood. Until recently, limitations in experimental methods have made it challenging to test hypotheses about synaptic plasticity on a large scale. However, as such data become available and these barriers are lifted, it becomes necessary to develop analysis techniques to validate plasticity models. Here, we present a highly extensible framework for modeling arbitrary synaptic plasticity rules on spike train data in populations of interconnected neurons. We treat synaptic weights as a (potentially nonlinear) dynamical system embedded in a fully Bayesian generalized linear model (GLM). In addition, we provide an algorithm for inferring synaptic weight trajectories alongside the parameters of the GLM and of the learning rules. Using this method, we perform model comparison of two proposed variants of the well-known spike-timing-dependent plasticity (STDP) rule, where nonlinear effects play a substantial role. On synthetic data generated from the biophysical simulator NEURON, we show that we can recover the weight trajectories, the pattern of connectivity, and the underlying learning rules.
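As an illustrative aside (not the authors' code), the core idea of embedding weight dynamics in a GLM can be sketched in a few lines: the synaptic weight follows a latent random walk (a crude stand-in for a learning rule), and postsynaptic spikes are drawn from a Poisson GLM whose log-rate depends on the weighted presynaptic input. All names and parameter values below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_glm_with_drifting_weight(T=2000, dt=0.001, base_rate=5.0,
                                      w0=1.0, sigma_w=0.02):
    """Simulate a postsynaptic neuron whose input weight follows a
    Gaussian random walk (a stand-in for a slow plasticity rule),
    embedded in a Poisson GLM. Illustrative only."""
    pre = rng.random(T) < 20.0 * dt       # presynaptic Poisson spikes, ~20 Hz
    w = np.empty(T)
    post = np.zeros(T, dtype=bool)
    w_t = w0
    for t in range(T):
        w_t += sigma_w * rng.standard_normal()       # latent weight dynamics
        w[t] = w_t
        rate = np.exp(np.log(base_rate) + w_t * pre[t])  # GLM log-link
        post[t] = rng.random() < min(rate * dt, 1.0)     # Bernoulli thinning
    return pre, post, w

pre, post, w = simulate_glm_with_drifting_weight()
```

Inference in the paper's framework would then recover `w` and the learning-rule parameters from `pre` and `post` alone; the simulation above only shows the generative direction.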
Unsupervised Heart-rate Estimation in Wearables With Liquid States and A Probabilistic Readout
Heart-rate estimation is a fundamental feature of modern wearable devices. In this paper we propose a machine-intelligent approach for heart-rate estimation from electrocardiogram (ECG) data collected using wearable devices. The novelty of our approach lies in (1) encoding spatio-temporal properties of ECG signals directly into spike trains and using these to excite recurrently connected spiking neurons in a Liquid State Machine computation model; (2) a novel learning algorithm; and (3) an intelligently designed unsupervised readout based on Fuzzy c-Means clustering of spike responses from a subset of neurons (Liquid states), selected using particle swarm optimization. Our approach differs from existing works by learning directly from ECG signals (allowing personalization), without requiring costly data annotations. Additionally, our approach can be easily implemented on state-of-the-art spiking-based neuromorphic systems, offering high accuracy yet a significantly lower energy footprint, leading to extended battery life for wearable devices. We validated our approach with CARLsim, a GPU-accelerated spiking neural network simulator modeling Izhikevich spiking neurons with Spike-Timing-Dependent Plasticity (STDP) and homeostatic scaling. A range of subjects is considered, drawn from in-house clinical trials and public ECG databases. Results show high accuracy and a low energy footprint in heart-rate estimation across subjects with and without cardiac irregularities, signifying the strong potential of this approach to be integrated into future wearable devices.
Comment: 51 pages, 12 figures, 6 tables, 95 references. Under submission at Elsevier Neural Network
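As a reader's aid, the unsupervised readout above rests on standard Fuzzy c-Means clustering. A minimal NumPy sketch (not the authors' implementation; the toy feature vectors standing in for Liquid states are hypothetical) is:

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, n_iter=100, seed=0):
    """Plain fuzzy c-means: softly assign rows of X to c clusters.
    Returns (centers, membership matrix U of shape (n, c))."""
    rng = np.random.default_rng(seed)
    n = len(X)
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)        # memberships sum to 1 per sample
    for _ in range(n_iter):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]     # weighted means
        d = np.linalg.norm(X[:, None, :] - centers[None], axis=2) + 1e-12
        U = 1.0 / (d ** (2.0 / (m - 1.0)))   # standard FCM membership update
        U /= U.sum(axis=1, keepdims=True)
    return centers, U

# Hypothetical toy features: spike-count vectors from two response regimes
X = np.vstack([np.random.default_rng(1).normal(0, 1, (20, 3)),
               np.random.default_rng(2).normal(5, 1, (20, 3))])
centers, U = fuzzy_c_means(X, c=2)
labels = U.argmax(axis=1)
```

In the paper's pipeline, `X` would instead hold spike responses from the PSO-selected subset of liquid neurons, and the soft memberships `U` would serve as the probabilistic readout.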
Rewiring Neural Interactions by Micro-Stimulation
Plasticity is a crucial component of normal brain function and a critical mechanism for recovery from injury. In vitro, associative pairing of presynaptic spiking and stimulus-induced postsynaptic depolarization causes changes in the synaptic efficacy of the presynaptic neuron when activated by extrinsic stimulation. In vivo, such paradigms can alter the responses of whole groups of neurons to stimulation. Here, we used in vivo spike-triggered stimulation to drive plastic changes in rat forelimb sensorimotor cortex, which we monitored using a statistical measure of functional connectivity inferred from the spiking statistics of the neurons during normal, spontaneous behavior. These induced plastic changes in inferred functional connectivity depended on the latency between trigger spike and stimulation, and appear to reflect a robust reorganization of the network. Such targeted connectivity changes might provide a tool for rerouting the flow of information through a network, with implications for both rehabilitation and brain–machine interface applications.
Circumstantial evidence and explanatory models for synapses in large-scale spike recordings
Whether, when, and how causal interactions between neurons can be
meaningfully studied from observations of neural activity alone are vital
questions in neural data analysis. Here we aim to better outline the concept of
functional connectivity for the specific situation where systems
neuroscientists aim to study synapses using spike train recordings. In some
cases, cross-correlations between the spikes of two neurons are such that,
although we may not be able to say that a relationship is causal without
experimental manipulations, models based on synaptic connections provide
precise explanations of the data. Additionally, there is often strong
circumstantial evidence that pairs of neurons are monosynaptically connected.
Here we illustrate how circumstantial evidence for or against synapses can be systematically assessed, and show how models of synaptic effects can provide testable predictions for pairwise spike statistics. We use case studies from large-scale multi-electrode spike recordings to illustrate key points and to demonstrate how modeling synaptic effects using large-scale spike recordings opens a wide range of data-analytic questions.
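A minimal sketch of the pairwise statistic at the heart of such analyses, the spike cross-correlogram, is below. The simulated 2 ms synaptic delay and all parameter values are hypothetical, for illustration only:

```python
import numpy as np

def cross_correlogram(spikes_a, spikes_b, window=0.05, bin_size=0.001):
    """Histogram of spike-time lags (t_b - t_a) within +/- window seconds.
    A narrow, short-latency peak is the classic circumstantial signature
    of a putative monosynaptic excitatory connection."""
    edges = np.arange(-window, window + bin_size, bin_size)
    counts = np.zeros(len(edges) - 1)
    for t in spikes_a:
        lags = spikes_b - t
        lags = lags[(lags >= -window) & (lags <= window)]
        counts += np.histogram(lags, bins=edges)[0]
    return edges, counts

# Hypothetical pair: postsynaptic spikes follow presynaptic ones by ~2 ms
rng = np.random.default_rng(3)
pre = np.sort(rng.uniform(0, 10, 500))                    # 10 s of spikes
post = np.sort(pre + 0.002 + rng.normal(0, 0.0003, 500))  # jittered delay
edges, counts = cross_correlogram(pre, post)
```

The paper's point is that such a peak is circumstantial rather than causal evidence; explicit synaptic models fit to the correlogram make the interpretation testable.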
Nonlinear slow-timescale mechanisms in synaptic plasticity
Learning and memory rely on synapses changing their strengths in response to neural activity. However, there is a substantial gap between the timescales of neural electrical dynamics (1-100 ms) and organism behaviour during learning (seconds to minutes). What mechanisms bridge this timescale gap? What are the implications for theories of brain learning? Here I first cover experimental evidence for slow-timescale factors in plasticity induction. Then I review possible underlying cellular and synaptic mechanisms, and insights from recent computational models that incorporate such slow-timescale variables. I conclude that future progress in understanding brain learning across timescales will require both experimental and computational modelling studies that map out the nonlinearities implemented by both fast and slow plasticity mechanisms at synapses and, crucially, their joint interactions.
Neuronal dynamics and connectivity analysis of neuronal cultures on multi electrode arrays
Despite a number of attempts over the past two decades, research into reliable, controlled induction of long-term evoked responses, mimicking low-level learning and memory in dissociated cell cultures, remains challenging. In addition, a full understanding of the stimulus-response relationships that underlie synaptic plasticity has not yet been achieved, and many of the underlying principles remain largely unknown. Plasticity studies have been predominantly limited to low-density Multi/Micro Electrode Arrays (MEAs). With the advent of complementary metal-oxide-semiconductor (CMOS) based High-Density (HD) MEAs, unprecedented spatial and temporal resolution is now possible. In this thesis, an attempt to bridge the gap between studies of neural plasticity and the use of CMOS-based HD-MEAs with thousands of electrodes is reported. Additionally, since such HD-MEAs generate a large volume of data and require advanced analytics to efficiently process and analyse recordings, computational tools and novel algorithms to infer connectivity during plasticity have been developed.
The study showed that the responsiveness, stability and initial firing rate of neuronal cultures are the deciding factors in reliably inducing evoked responses. With multi-site stimulation, sustained long-term potentiation was achieved, which was validated both by evoked-response plots and by overall firing rates measured at five different time points: before and after repeated stimulation, and at a three-day time point. In contrast, while depression responses were observed, it was found that the effects were not sustained over many days. The findings of the study suggest that appropriate selection of neuronal cultures is crucial for inducing desired evoked responses, and criteria for this have been developed. Furthermore, it is concluded that the initial responses to test stimuli can be used to determine whether potentiated or depressed responses are to be expected.
To analyse the recordings, a pipeline of computational tools was developed. Firstly, neuronal synchrony metrics were adapted for the first time to large HD-MEA recordings and shown to correspond effectively to the firing dynamics. To analyse functional connectivity, an information-theoretic approach, Transfer Entropy (TE), was utilised. The method showed accurate estimation of functional connectivity, with mid-80th-percentile accuracy on simulated data. A superimposition method was proposed to enhance confidence in the connectivity estimation. To statistically evaluate connectivity estimation, a new surrogate method, based on an ISI-distribution approach, was proposed and validated with a simulated Izhikevich network. The method achieved improved accuracy compared to the existing ISI-shuffling method. This newly developed method was later utilised to infer connectivity and refine connections during the learning process of real neuronal cultures over many days of stimulation. The connectivity inference corresponded accurately to both the spontaneous and the stimulated networks during evoked responses, and the proposed method permitted observation of the evolution of connections in the potentiated network.
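As context for the TE-based connectivity inference, a minimal transfer-entropy computation for binary spike trains (history length 1; this is not the thesis code, and the driven pair below is a hypothetical illustration) can be sketched as:

```python
import numpy as np

def transfer_entropy_binary(x, y):
    """TE from x to y for binary time series, history length 1 (in bits):
    TE = sum over states of p(y_t+1, y_t, x_t) *
         log2[ p(y_t+1 | y_t, x_t) / p(y_t+1 | y_t) ]."""
    x, y = np.asarray(x, int), np.asarray(y, int)
    yt1, yt, xt = y[1:], y[:-1], x[:-1]
    te = 0.0
    for a in (0, 1):
        for b in (0, 1):
            for c in (0, 1):
                p_abc = np.mean((yt1 == a) & (yt == b) & (xt == c))
                if p_abc == 0.0:
                    continue
                p_bc = np.mean((yt == b) & (xt == c))
                p_ab = np.mean((yt1 == a) & (yt == b))
                p_b = np.mean(yt == b)
                te += p_abc * np.log2((p_abc / p_bc) / (p_ab / p_b))
    return te

# Hypothetical driven pair: y copies x with one-step lag and 5% flip noise
rng = np.random.default_rng(0)
x = (rng.random(5000) < 0.3).astype(int)
y = np.empty(5000, int)
y[0] = 0
flips = rng.random(4999) < 0.05
y[1:] = np.where(flips, 1 - x[:-1], x[:-1])
te_xy = transfer_entropy_binary(x, y)   # large: x drives y
te_yx = transfer_entropy_binary(y, x)   # near zero: x is independent noise
```

On real HD-MEA data, longer histories, binning choices, and surrogate-based significance testing (as in the ISI-distribution surrogates described above) are what make such estimates trustworthy.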