Detecting multineuronal temporal patterns in parallel spike trains
We present a non-parametric and computationally efficient method that detects spatiotemporal firing patterns and pattern sequences in parallel spike trains and tests whether the observed numbers of repeating patterns and sequences on a given timescale are significantly different from those expected by chance. The method is generally applicable and uncovers coordinated activity with arbitrary precision by comparing it to appropriate surrogate data. The analysis of coherent patterns of spatially and temporally distributed spiking activity on various timescales enables the immediate tracking of diverse qualities of coordinated firing related to neuronal state changes and information processing. We apply the method to simulated data and multineuronal recordings from rat visual cortex and show that it reliably discriminates between data sets with random pattern occurrences and with additional exactly repeating spatiotemporal patterns and pattern sequences. Multineuronal cortical spiking activity appears to be precisely coordinated and exhibits a sequential organization beyond the cell assembly concept.
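The surrogate-testing logic can be sketched in a few lines: bin the parallel spike trains, count multi-neuron population patterns that repeat, and compare the observed count against jitter surrogates that destroy fine temporal coordination. Everything here (5 ms bins, a ±20 ms jitter window, the particular repeat statistic, the data format) is an illustrative assumption, not the paper's actual procedure:

```python
import numpy as np

rng = np.random.default_rng(0)

def count_repeats(spike_trains, bin_ms=5.0, t_max=1000.0):
    """Count repeats of multi-neuron population patterns (one pattern per bin)."""
    n_bins = int(t_max / bin_ms)
    raster = np.zeros((len(spike_trains), n_bins), dtype=int)
    for i, ts in enumerate(spike_trains):
        idx = (np.asarray(ts) / bin_ms).astype(int)
        raster[i, idx[idx < n_bins]] = 1
    words, counts = np.unique(raster.T, axis=0, return_counts=True)
    multi = words.sum(axis=1) >= 2          # only patterns spanning >= 2 neurons
    return int(counts[multi][counts[multi] > 1].sum())

def surrogate_pvalue(spike_trains, n_surr=200, jitter_ms=20.0, **kw):
    """One-sided p-value of the observed repeat count against jitter surrogates."""
    observed = count_repeats(spike_trains, **kw)
    exceed = 0
    for _ in range(n_surr):
        # surrogate: independently jitter every spike, breaking fine coordination
        jittered = [np.clip(np.asarray(ts) + rng.uniform(-jitter_ms, jitter_ms, len(ts)),
                            0.0, None) for ts in spike_trains]
        exceed += count_repeats(jittered, **kw) >= observed
    return observed, (1 + exceed) / (1 + n_surr)
```

With two neurons firing exact coincidences at 10, 110, and 210 ms, the observed repeat count survives while jittered surrogates rarely reproduce it, so the p-value is small.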
Fast, scalable, Bayesian spike identification for multi-electrode arrays
We present an algorithm to identify individual neural spikes observed on
high-density multi-electrode arrays (MEAs). Our method can distinguish large
numbers of distinct neural units, even when spikes overlap, and accounts for
intrinsic variability of spikes from each unit. As MEAs grow larger, it is
important to find spike-identification methods that are scalable, that is, the
computational cost of spike fitting should scale well with the number of units
observed. Our algorithm accomplishes this goal, and is fast, because it
exploits the spatial locality of each unit and the basic biophysics of
extracellular signal propagation. Human intervention is minimized and
streamlined via a graphical interface. We illustrate our method on data from a
mammalian retina preparation and document its performance on simulated data
consisting of spikes added to experimentally measured background noise. The
algorithm is highly accurate.
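A toy version of the fitting idea, greedy template matching with subtraction so that overlapping spikes can be peeled off one at a time, might look like the following. This single-channel sketch omits the spatial-locality and biophysical machinery that make the real algorithm scalable, and the templates and threshold are invented for the example:

```python
import numpy as np

def fit_spikes(trace, templates, threshold=4.0):
    """Greedy template matching with subtraction: repeatedly find the largest
    residual peak, assign it to the best-fitting unit template, and subtract
    that template so overlapping spikes can be resolved sequentially."""
    trace = trace.astype(float).copy()
    L = templates.shape[1]
    found = []
    for _ in range(1000):                   # hard cap keeps the sketch safe
        peak = int(np.argmax(np.abs(trace)))
        if np.abs(trace[peak]) < threshold:
            break
        start = max(0, min(peak - L // 2, len(trace) - L))
        seg = trace[start:start + L]
        # assign to the unit whose template minimises the residual energy
        unit = int(np.argmin([np.sum((seg - t) ** 2) for t in templates]))
        found.append((start, unit))
        trace[start:start + L] -= templates[unit]
    return sorted(found)

# two toy units and a trace containing one spike from each
templates = np.array([[0., 5., 10., 5., 0.],
                      [0., -5., -10., -5., 0.]])
trace = np.zeros(100)
trace[20:25] += templates[0]
trace[50:55] += templates[1]
spikes = fit_spikes(trace, templates)   # [(20, 0), (50, 1)]
```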
Entropy-based parametric estimation of spike train statistics
We consider the evolution of a network of neurons, focusing on the asymptotic
behavior of spike dynamics rather than membrane potential dynamics. In this
context the spike response is sought not as a deterministic response but as a
conditional probability: "reading out the code" consists of inferring this
probability. The probability is computed from empirical raster plots, by using
the framework of thermodynamic formalism in ergodic theory. This gives us a
parametric statistical model where the probability has the form of a Gibbs
distribution. In this respect, this approach generalizes the seminal and
profound work of Schneidman and collaborators. A minimal presentation of the
formalism is reviewed here, while a general algorithmic estimation method is
proposed yielding fast convergent implementations. It is also made explicit how
several spike observables (entropy, rate, synchronizations, correlations) are
given in closed-form from the parametric estimation. This paradigm does not
only allow us to estimate the spike statistics, given a design choice, but also
to compare different models, thus answering comparative questions about the
neural code such as: "Are correlations (or time synchrony or a given set of
spike patterns, ...) significant with respect to rate coding only?" A numerical
validation of the method is proposed and the perspectives regarding spike-train
code analysis are also discussed. Comment: 37 pages, 8 figures, submitted.
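To make the Gibbs-distribution idea concrete, here is a minimal pairwise maximum-entropy fit (the Schneidman-style special case of such models) by gradient ascent on the log-likelihood. Exact enumeration over all binary words is used, so this is only feasible for a handful of neurons, and the learning rate and iteration count are arbitrary choices for the sketch:

```python
import itertools
import numpy as np

def fit_gibbs(words, n_iter=2000, lr=0.1):
    """Fit P(x) proportional to exp(sum_i h_i x_i + sum_{i<j} J_ij x_i x_j)
    to observed binary words by matching first and second moments
    (gradient ascent on the log-likelihood, exact enumeration)."""
    n = words.shape[1]
    states = np.array(list(itertools.product([0, 1], repeat=n)), float)
    emp_mean = words.mean(axis=0)
    emp_corr = (words.T @ words) / len(words)
    h = np.zeros(n)
    J = np.zeros((n, n))
    for _ in range(n_iter):
        E = states @ h + np.einsum('si,ij,sj->s', states, J, states)
        p = np.exp(E - E.max())
        p /= p.sum()
        mod_mean = p @ states
        mod_corr = np.einsum('s,si,sj->ij', p, states, states)
        h += lr * (emp_mean - mod_mean)
        J += lr * np.triu(emp_corr - mod_corr, 1)
    return h, J, p, states

# demo on synthetic independent neurons (rates 0.2, 0.5, 0.8)
rng = np.random.default_rng(1)
demo_words = (rng.random((500, 3)) < np.array([0.2, 0.5, 0.8])).astype(float)
h, J, p, states = fit_gibbs(demo_words)
entropy = -(p * np.log(p)).sum()   # observables follow in closed form from p
```

Once the distribution p is fitted, observables such as entropy, rates, and correlations are indeed available in closed form, as the abstract notes.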
Hardware-Amenable Structural Learning for Spike-based Pattern Classification using a Simple Model of Active Dendrites
This paper presents a spike-based model which employs neurons with
functionally distinct dendritic compartments for classifying high dimensional
binary patterns. The synaptic inputs arriving on each dendritic subunit are
nonlinearly processed before being linearly integrated at the soma, giving the
neuron a capacity to perform a large number of input-output mappings. The model
utilizes sparse synaptic connectivity, where each synapse takes a binary value.
The optimal connection pattern of a neuron is learned by using a simple
hardware-friendly, margin enhancing learning algorithm inspired by the
mechanism of structural plasticity in biological neurons. The learning
algorithm groups correlated synaptic inputs on the same dendritic branch. Since
the learning results in modified connection patterns, it can be incorporated
into current event-based neuromorphic systems with little overhead. This work
also presents a branch-specific spike-based version of this structural
plasticity rule. The proposed model is evaluated on benchmark binary
classification problems and its performance is compared against that achieved
using Support Vector Machine (SVM) and Extreme Learning Machine (ELM)
techniques. Our proposed method attains comparable performance while utilizing
10 to 50% less computational resources than the other reported techniques. Comment: Accepted for publication in Neural Computation.
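The two-stage computation the abstract describes can be sketched as below, with a power-law branch nonlinearity standing in for whichever nonlinearity the paper actually uses. The example also shows why grouping correlated inputs on one branch raises the somatic response, which is what the structural-plasticity rule exploits:

```python
import numpy as np

def dendritic_response(x, connections, k=2.0):
    """Two-stage neuron: each dendritic branch sums its binary synaptic inputs,
    applies a superlinear nonlinearity (power law here, an assumption), and the
    soma sums the branch outputs linearly.

    x: binary input pattern, shape (n_inputs,)
    connections: one index array per branch (binary synapses = membership)
    """
    branch_sums = np.array([x[idx].sum() for idx in connections], float)
    return (branch_sums ** k).sum()

# correlated inputs 0 and 1 are both active; clustering them on one branch
# yields a superlinear response compared with splitting them across branches
x = np.array([1, 1, 0, 0])
grouped = dendritic_response(x, [np.array([0, 1]), np.array([2, 3])])
split = dendritic_response(x, [np.array([0, 2]), np.array([1, 3])])
```

With k = 2, the grouped wiring responds with 2² = 4 while the split wiring gives 1² + 1² = 2, so a margin-enhancing learner that moves synapses between branches can separate patterns the linear sum cannot.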
Competition-based model of pheromone component ratio detection in the moth
For some moth species, especially those closely related and sympatric, recognizing a specific pheromone component concentration ratio is essential for males to successfully locate conspecific females. We propose and determine the properties of a minimalist competition-based feed-forward neuronal model capable of detecting a certain ratio of pheromone components independently of overall concentration. This model represents an elementary recognition unit for the ratio of binary mixtures, which we propose is entirely contained in the macroglomerular complex (MGC) of the male moth. A set of such units, along with projection neurons (PNs), can provide the input to higher brain centres. We found that (1) accuracy is mainly achieved by maintaining a certain ratio of connection strengths between olfactory receptor neurons (ORNs) and local neurons (LNs), and much less by the properties of the interconnections between the competing LNs themselves; an exception to this rule is that it is beneficial if connections between generalist LNs (i.e. excited by either pheromone component) and specialist LNs (i.e. excited by one component only) have the same strength as the reciprocal specialist-to-generalist connections. (2) Successful ratio recognition is achieved using latency-to-first-spike in the LN populations, which, in contrast to expectations with a population rate code, leads to a broadening of responses for higher overall concentrations, consistent with experimental observations. (3) Longer durations of the competition between LNs did not lead to higher recognition accuracy.
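The latency-to-first-spike point can be illustrated with a deliberately minimal model: if each LN integrates a constant input toward threshold, its first-spike latency is inversely proportional to the weighted sum of the two component concentrations, so the identity of the earliest-spiking LN depends only on the ratio, not the overall level. The weight matrix below is an invented example, not fitted to the MGC:

```python
import numpy as np

def first_spike_latencies(cA, cB, W, theta=1.0):
    """Each LN integrates a constant input I = w_A*cA + w_B*cB toward threshold
    theta, so its first spike occurs at t = theta / I (minimal latency code)."""
    return theta / (W @ np.array([cA, cB]))

# invented connection strengths: one LN specialised for each component
W = np.array([[1.0, 0.2],
              [0.2, 1.0]])

lat_low = first_spike_latencies(0.6, 0.4, W)    # low overall concentration
lat_high = first_spike_latencies(6.0, 4.0, W)   # 10x higher, same 3:2 ratio
```

Scaling both concentrations by the same factor rescales every latency equally, so the winning LN, and hence the detected ratio, is concentration-invariant in this toy model.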
Shaping bursting by electrical coupling and noise
Gap-junctional coupling is an important way of communication between neurons
and other excitable cells. Strong electrical coupling synchronizes activity
across cell ensembles. Surprisingly, in the presence of noise, synchronous
oscillations generated by an electrically coupled network may differ
qualitatively from the oscillations produced by uncoupled individual cells
forming the network. A prominent example of such behavior is the synchronized
bursting in islets of Langerhans formed by pancreatic β-cells, which in
isolation are known to exhibit irregular spiking. At the heart of this
intriguing phenomenon lies denoising, a remarkable ability of electrical
coupling to diminish the effects of noise acting on individual cells.
In this paper, we derive quantitative estimates characterizing denoising in
electrically coupled networks of conductance-based models of square wave
bursting cells. Our analysis reveals the interplay of the intrinsic properties
of the individual cells and network topology and their respective contributions
to this important effect. In particular, we show that networks on graphs with
large algebraic connectivity or small total effective resistance are better
equipped for implementing denoising. As a by-product of the analysis of
denoising, we analytically estimate the rate with which trajectories converge
to the synchronization subspace and the stability of the latter to random
perturbations. These estimates reveal the role of the network topology in
synchronization. The analysis is complemented by numerical simulations of
electrically coupled conductance-based networks. Taken together, these results
explain the mechanisms underlying synchronization and denoising in an important
class of biological models.
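The two graph quantities the analysis singles out are easy to compute from the coupling graph's Laplacian. The sketch below compares a complete graph with a path on four nodes, where the complete graph's larger algebraic connectivity and smaller total effective resistance predict stronger denoising (the graphs and numbers are for illustration only):

```python
import numpy as np

def denoising_metrics(A):
    """Algebraic connectivity (second-smallest eigenvalue of the graph
    Laplacian) and total effective resistance of a coupling graph with
    adjacency matrix A. Larger connectivity / smaller resistance is the
    regime the paper associates with better denoising."""
    L = np.diag(A.sum(axis=1)) - A
    eig = np.sort(np.linalg.eigvalsh(L))
    lam2 = eig[1]
    # Kirchhoff-index formula: R_tot = n * sum of reciprocal nonzero eigenvalues
    R_tot = len(A) * np.sum(1.0 / eig[1:])
    return lam2, R_tot

# complete graph vs. path on four nodes
K4 = np.ones((4, 4)) - np.eye(4)
P4 = np.zeros((4, 4))
for i in range(3):
    P4[i, i + 1] = P4[i + 1, i] = 1.0

lam2_K, R_K = denoising_metrics(K4)   # 4.0, 3.0
lam2_P, R_P = denoising_metrics(P4)   # ~0.586, 10.0
```

For K4 the Laplacian spectrum is {0, 4, 4, 4}, giving λ₂ = 4 and R_tot = 3, while the path's λ₂ = 2 − √2 ≈ 0.586 and R_tot = 10 mark it as the worse denoiser of the two.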