1,919 research outputs found
A Unified Framework of Third Order Time and Frequency Domain Analysis for Neural Spike Trains
Third-order time and frequency analysis has exhibited great potential for correlation analysis of multi-sensor datasets, but is usually presented as separate time domain and frequency domain approaches; a combined framework of the two has rarely been used. This paper proposes a non-parametric third-order time and frequency domain framework which uses two-dimensional Fourier transforms to bridge the gap between the time and frequency domains. A unified framework offers flexibility and efficiency when applied to data. In this paper we study neural spike train data treated as stochastic point processes. In direct time domain analysis, third-order cumulant densities of spike trains are applied; these require all first-, second- and third-order product densities to be calculated before the third-order cumulant density can be constructed, which brings additional challenges. The novelty of this study is a new framework that offers an alternative approach without calculating lower-order quantities and can reveal nonlinear relationships between neural recordings. The results show that the present framework provides a novel non-parametric method to estimate both time and frequency domain measures that is applicable to neural spike trains.
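As a concrete illustration of the frequency-domain side of such a framework, the third-order spectrum (bispectrum) of a binned spike train can be estimated by segment-averaging the triple product of Fourier coefficients. The sketch below is ours, not the paper's implementation; the function name, segment length, and rate are illustrative assumptions:

```python
import numpy as np

def bispectrum(x, seg_len=128):
    """Segment-averaged triple product X(f1) X(f2) X*(f1+f2).

    x       : 1-D binned (0/1) spike train
    seg_len : samples per segment
    Returns a (seg_len, seg_len) complex array B[f1, f2].
    """
    n_seg = len(x) // seg_len
    B = np.zeros((seg_len, seg_len), dtype=complex)
    f = np.arange(seg_len)
    for k in range(n_seg):
        seg = x[k * seg_len:(k + 1) * seg_len]
        X = np.fft.fft(seg - seg.mean())  # demean each segment
        # conjugate term taken at (f1 + f2) mod seg_len
        B += X[:, None] * X[None, :] * np.conj(X[(f[:, None] + f[None, :]) % seg_len])
    return B / n_seg

rng = np.random.default_rng(0)
spikes = (rng.random(4096) < 0.1).astype(float)  # Poisson-like train, 0.1 spikes/bin
B = bispectrum(spikes)
```

For an independent Poisson train the bispectrum is flat up to estimation noise; structure in B would indicate third-order (nonlinear) dependence, which is the quantity the unified framework targets directly, without the lower-order product densities.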
A unified view on weakly correlated recurrent networks
The diversity of neuron models used in contemporary theoretical neuroscience
to investigate specific properties of covariances raises the question of how
these models relate to each other. In particular, it is hard to distinguish
between generic properties and peculiarities of the abstracted model. Here we
present a unified view on pairwise covariances in recurrent networks in the
irregular regime. We consider the binary neuron model, the leaky
integrate-and-fire model, and the Hawkes process. We show that linear
approximation maps each of these models to either of two classes of linear rate
models, including the Ornstein-Uhlenbeck process as a special case. The classes
differ in the location of additive noise in the rate dynamics, which is on the
output side for spiking models and on the input side for the binary model. Both
classes allow closed-form solutions for the covariance. For output noise, the
covariance separates into an echo term and a term due to correlated input. The unified
framework enables us to transfer results between models. For example, we
generalize the binary model and the Hawkes process to the presence of
conduction delays and simplify derivations for established results. Our
approach is applicable to general network structures and suitable for
population averages. The derived averages are exact for fixed out-degree
network architectures and approximate for fixed in-degree. We demonstrate how
taking into account fluctuations in the linearization procedure increases the
accuracy of the effective theory and we explain the class dependent differences
between covariances in the time and the frequency domain. Finally we show that
the oscillatory instability emerging in networks of integrate-and-fire models
with delayed inhibitory feedback is a model-invariant feature: the same
structure of poles in the complex frequency plane determines the population
power spectra.
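The Ornstein-Uhlenbeck special case mentioned above admits a closed-form stationary covariance, c(s) = (sigma^2 tau / 2) exp(-|s|/tau), which a short simulation can check. This is a generic 1-D sketch with parameters of our choosing, not a computation from the paper:

```python
import numpy as np

# Euler-Maruyama simulation of a 1-D Ornstein-Uhlenbeck process,
#   dx = -(x / tau) dt + sigma dW,
# whose stationary covariance has the closed form
#   c(s) = (sigma**2 * tau / 2) * exp(-|s| / tau).
rng = np.random.default_rng(1)
tau, sigma, dt, n = 10.0, 1.0, 0.01, 2_000_000

x = np.empty(n)
x[0] = 0.0
noise = rng.normal(0.0, sigma * np.sqrt(dt), size=n - 1)
for i in range(n - 1):
    x[i + 1] = x[i] - (x[i] / tau) * dt + noise[i]

var_emp = x[n // 10:].var()        # empirical variance, transient discarded
var_theory = sigma**2 * tau / 2    # closed-form stationary variance c(0)
```

The empirical variance converges to the analytic value c(0) = sigma^2 tau / 2; the same closed form is what the linear rate classes inherit, differing only in where the noise enters.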
Third Order Time and Frequency Domain and Mutual Information Analyses For Neuronal Spike Trains
Research on the brain has received considerable attention over the last two decades. Non-randomness of the information flow is widely reported in studies of the brain. Statistical signal processing methods have been applied to analyse the dependencies between neuronal recordings. On one hand, this makes a great contribution towards further understanding of the brain. On the other hand, due to the progress of experimental technology, analysing experimental data becomes an increasingly difficult task and hence requires advanced approaches to be developed. Evidence of higher-order interactions and nonlinear interactions has been reported in recent experimental findings. This project develops two statistical signal processing approaches to analyse Multielectrode Array (MEA) data. The first is a Unified framework of Third Order time and frequency domain analysis (UTO) and the second is a Mutual Information Function (MIF). These two approaches are described and applied to single-unit spike trains to interpret the interactions and dependencies between spiking neurons. The presence of dependencies is successfully estimated by each approach. In simulations of a modelled neuronal network with 100 neurons, UTO is applied to investigate third-order dependencies arising from a centre-surround pattern of connectivity in the network. The correct pattern of excitatory and inhibitory connections is detected using UTO, and significant values of the cumulant estimates are present when third-order interactions are present. MIF analysis is also conducted on the simulations. The proposed method computes the Mutual Information (MI) as a function of time lag, along with a Monte-Carlo based calibration method using 100 trials of Poisson spike trains. Significant departures of the MI value from the baseline are shown when dependence exists.
UTO and MIF are applied to experimental MEA spike train data collected from a study of connectivity in a model of kainic acid (KA) induced epileptiform activity for mesial temporal lobe epilepsy (mTLE) in the rat. Both UTO and MIF successfully highlight the short-latency and long-latency dependencies present in the dataset. UTO and MIF therefore provide complementary tools to capture dependencies between spike train signals.
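A minimal version of the lag-dependent MI idea can be sketched with a plug-in estimator on binned binary spike trains, calibrated against 100 independent Poisson surrogates. All names, rates, and the plug-in estimator itself are our illustrative assumptions, not the thesis's exact method:

```python
import numpy as np

def mi_binary(a, b):
    """Plug-in mutual information (bits) between two binary sequences."""
    p = np.zeros((2, 2))
    for i in (0, 1):
        for j in (0, 1):
            p[i, j] = np.mean((a == i) & (b == j))
    pa, pb = p.sum(axis=1), p.sum(axis=0)
    mask = p > 0
    return float((p[mask] * np.log2(p[mask] / np.outer(pa, pb)[mask])).sum())

def mi_lags(a, b, max_lag):
    """MI between a and b shifted by each lag in [-max_lag, max_lag].

    Rolling b by k aligns b[t - k] with a[t]; edges are trimmed so the
    circular wrap from np.roll never enters the estimate.
    """
    lo, hi = max_lag, len(a) - max_lag
    return {k: mi_binary(a[lo:hi], np.roll(b, k)[lo:hi])
            for k in range(-max_lag, max_lag + 1)}

rng = np.random.default_rng(2)
a = (rng.random(20000) < 0.05).astype(int)
b = np.roll(a, 3) | (rng.random(20000) < 0.02).astype(int)  # b echoes a at lag 3

mi = mi_lags(a, b, max_lag=10)

# Monte-Carlo calibration: MI against independent Poisson surrogates
# sets the significance baseline (mean + 3 SD over 100 trials).
null = [mi_binary(a[10:-10], (rng.random(20000) < 0.05).astype(int)[10:-10])
        for _ in range(100)]
threshold = np.mean(null) + 3.0 * np.std(null)
```

With this rolling convention the echo at lag 3 produces a peak at k = -3 that clearly exceeds the surrogate threshold, while all other lags stay near the baseline, mirroring the "significant departure of MI from baseline" criterion described above.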
The effect of heterogeneity on decorrelation mechanisms in spiking neural networks: a neuromorphic-hardware study
High-level brain function such as memory, classification or reasoning can be
realized by means of recurrent networks of simplified model neurons. Analog
neuromorphic hardware constitutes a fast and energy efficient substrate for the
implementation of such neural computing architectures in technical applications
and neuroscientific research. The functional performance of neural networks is
often critically dependent on the level of correlations in the neural activity.
In finite networks, correlations are typically inevitable due to shared
presynaptic input. Recent theoretical studies have shown that inhibitory
feedback, abundant in biological neural networks, can actively suppress these
shared-input correlations and thereby enable neurons to fire nearly
independently. For networks of spiking neurons, the decorrelating effect of
inhibitory feedback has so far been explicitly demonstrated only for
homogeneous networks of neurons with linear sub-threshold dynamics. Theory,
however, suggests that the effect is a general phenomenon, present in any
system with sufficient inhibitory feedback, irrespective of the details of the
network structure or the neuronal and synaptic properties. Here, we investigate
the effect of network heterogeneity on correlations in sparse, random networks
of inhibitory neurons with non-linear, conductance-based synapses. Emulations
of these networks on the analog neuromorphic hardware system Spikey allow us to
test the efficiency of decorrelation by inhibitory feedback in the presence of
hardware-specific heterogeneities. The configurability of the hardware
substrate enables us to modulate the extent of heterogeneity in a systematic
manner. We selectively study the effects of shared input and recurrent
connections on correlations in membrane potentials and spike trains. Our
results confirm ...

Comment: 20 pages, 10 figures, supplement
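The shared-input correlations this abstract refers to can be reproduced in a toy model: two neurons driven by one common and two private Poisson sources show a positive spike-count correlation even without any recurrence. The sketch below (parameters and setup are ours) illustrates only the shared-input effect, not the hardware emulation or the inhibitory feedback that suppresses it:

```python
import numpy as np

rng = np.random.default_rng(3)
n_bins, p_shared, p_private = 100_000, 0.02, 0.02

# Two neurons, each spiking whenever its private Poisson source or the
# shared presynaptic source is active in a bin.
shared = rng.random(n_bins) < p_shared
t1 = shared | (rng.random(n_bins) < p_private)
t2 = shared | (rng.random(n_bins) < p_private)

# Spike-count correlation coefficient over 50-bin counting windows
c1 = t1.reshape(-1, 50).sum(axis=1)
c2 = t2.reshape(-1, 50).sum(axis=1)
rho = np.corrcoef(c1, c2)[0, 1]
```

With these rates roughly half of each neuron's spikes come from the shared source, yielding a count correlation near 0.5; the decorrelation result discussed above says that sufficient inhibitory feedback actively cancels exactly this kind of shared-input correlation.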