Bio-Inspired Multi-Layer Spiking Neural Network Extracts Discriminative Features from Speech Signals
Spiking neural networks (SNNs) enable power-efficient implementations due to
their sparse, spike-based coding scheme. This paper develops a bio-inspired SNN
that uses unsupervised learning to extract discriminative features from speech
signals, which can subsequently be used in a classifier. The architecture
consists of a spiking convolutional/pooling layer followed by a fully connected
spiking layer for feature discovery. The convolutional layer of leaky
integrate-and-fire (LIF) neurons represents primary acoustic features. The
fully connected layer is equipped with a probabilistic spike-timing-dependent
plasticity learning rule. This layer represents the discriminative features
through probabilistic LIF neurons. To assess the discriminative power of the
learned features, they are used in a hidden Markov model (HMM) for spoken digit
recognition. The experimental results show performance above 96% that compares
favorably with popular statistical feature extraction methods. Our results
provide a novel demonstration of unsupervised feature acquisition in an SNN.
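The leaky integrate-and-fire dynamics underlying the convolutional layer can be sketched as follows. This is a generic, Euler-integrated LIF model with illustrative parameters, not the paper's implementation:

```python
# Minimal leaky integrate-and-fire (LIF) neuron, Euler-integrated.
# All parameters (tau, thresholds) are illustrative, not the paper's.
def simulate_lif(input_current, dt=1e-3, tau=0.02, v_rest=0.0,
                 v_thresh=1.0, v_reset=0.0):
    """Return spike times (in seconds) for a sampled input-current trace."""
    v = v_rest
    spikes = []
    for i, current in enumerate(input_current):
        # Membrane potential leaks toward rest while integrating input.
        v += dt * (-(v - v_rest) + current) / tau
        if v >= v_thresh:           # threshold crossing: emit a spike
            spikes.append(i * dt)
            v = v_reset             # hard reset after the spike
    return spikes

# One second of constant suprathreshold drive yields regular spiking.
spikes = simulate_lif([1.5] * 1000)
```

Because the neuron only emits discrete events when its membrane crosses threshold, downstream layers receive the sparse, spike-based code that the abstract credits for power efficiency.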
Unsupervised Heart-rate Estimation in Wearables With Liquid States and A Probabilistic Readout
Heart-rate estimation is a fundamental feature of modern wearable devices. In
this paper we propose a machine intelligent approach for heart-rate estimation
from electrocardiogram (ECG) data collected using wearable devices. The novelty
of our approach lies in (1) encoding spatio-temporal properties of ECG signals
directly into spike trains and using these to excite recurrently connected
spiking neurons in a Liquid State Machine computation model; (2) a novel
learning algorithm; and (3) an intelligently designed unsupervised readout
based on Fuzzy c-Means clustering of spike responses from a subset of neurons
(Liquid states), selected using particle swarm optimization. Our approach
differs from existing works by learning directly from ECG signals (allowing
personalization), without requiring costly data annotations. Additionally, our
approach can be easily implemented on state-of-the-art spiking-based
neuromorphic systems, offering high accuracy with a significantly lower energy
footprint, leading to extended battery life for wearable devices. We
validated our approach with CARLsim, a GPU accelerated spiking neural network
simulator modeling Izhikevich spiking neurons with Spike Timing Dependent
Plasticity (STDP) and homeostatic scaling. A range of subjects are considered
from in-house clinical trials and public ECG databases. Results show high
accuracy and low energy footprint in heart-rate estimation across subjects with
and without cardiac irregularities, signifying the strong potential of this
approach to be integrated in future wearable devices.
Comment: 51 pages, 12 figures, 6 tables, 95 references. Under submission at Elsevier Neural Networks.
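For context, the Izhikevich neuron that CARLsim simulates is a standard two-variable model with quadratic membrane dynamics. Below is a minimal Euler-integrated sketch using the textbook regular-spiking parameters (a=0.02, b=0.2, c=-65, d=8), not the study's tuned configuration:

```python
# Standard Izhikevich neuron (the model family CARLsim simulates), using
# the textbook regular-spiking parameters a=0.02, b=0.2, c=-65, d=8.
# This is a generic sketch, not the study's tuned configuration.
def izhikevich_spike_count(current, steps=1000, dt=1.0,
                           a=0.02, b=0.2, c=-65.0, d=8.0):
    """Euler-integrate the model (time in ms, v in mV); return spike count."""
    v, u = c, b * c          # start at the resting state
    n_spikes = 0
    for _ in range(steps):
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + current)
        u += dt * a * (b * v - u)
        if v >= 30.0:        # spike cutoff reached
            v, u = c, u + d  # reset v, bump the recovery variable
            n_spikes += 1
    return n_spikes
```

With a sustained input current the model fires tonically; at rest it stays silent, which is the behavior a liquid-state reservoir of such neurons relies on to encode its input.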
Cerebellar mechanisms underlying adaptive motor responses
In order to understand the function of any brain structure, one must know what input/output transformation it performs. The term input/output transformation includes at least two stages. First, we must understand how inputs are processed. Second, we must know what the output activity encodes. Certain properties of the cerebellum make such an undertaking feasible. In this thesis I present the results of three main projects designed to study the input/output transformations of this major brain system from different angles.
In the first project I investigated the relationship between the spiking activity of cerebellar cortex principal neurons - Purkinje cells (PCs) - and eyelid conditioned response (CR) profiles on a single-trial basis. Systematically exploring a variety of encoding possibilities, I found that PCs do not directly encode a single kinematic variable of a CR. The best prediction was instead achieved via a dynamical-model approach, in which PCs provide a "drive" to the eyelid plant, whose dynamics are described by a differential equation.
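The dynamical-model result above (PCs providing a "drive" to an eyelid plant governed by a differential equation) can be illustrated with a hypothetical first-order linear plant. This is a sketch of the idea only, not the thesis's fitted equation:

```python
# Hypothetical first-order "eyelid plant": tau * dy/dt = -y + gain * drive(t).
# A sketch of the dynamical-model idea only; the thesis's fitted plant
# equation is not reproduced here.
def eyelid_plant(drive, dt=1e-3, tau=0.05, gain=1.0):
    """Integrate the plant ODE over a sampled PC drive; return the trajectory."""
    y = 0.0
    trajectory = []
    for d in drive:
        y += dt * (-y + gain * d) / tau   # Euler step of the linear ODE
        trajectory.append(y)
    return trajectory

# A step in the PC drive produces a smooth, lagged eyelid response
# rather than an instantaneous one.
traj = eyelid_plant([0.0] * 100 + [1.0] * 400)
```

The key point is that the plant filters the drive: no single kinematic variable of the output mirrors the PC signal directly, yet the trajectory is fully determined by it through the differential equation.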
In the second project I addressed how the cerebellum deals with inherent uncertainty about the nature of sensory inputs. I found that under conditions of uncertainty, the cerebellum performed a probabilistic binary choice, scaling the probability of response with the similarity between current and trained stimuli. Importantly, if responses were made, their amplitude was close to the previously trained value, maintaining the adaptive nature of responses. Recordings from eyelid Purkinje cells localized this computation to cerebellar cortex. Results from large-scale computer simulation suggest that the efference copy signal is critical for the expression of target response amplitude.
In the third project I studied cerebellar mechanisms of learning and expression of movement sequences. While the majority of movements we perform are composed of sequences, most of the knowledge about cerebellar learning and computation comes from tasks involving single, unitary movements. Hence, I designed a novel sequence training protocol to explicitly test the ability of the cerebellum to chain together a series of movements through associative learning processes. The results demonstrate a simple yet general framework for how the cerebellum can learn to produce a movement sequence.
Spike burst-pause dynamics of Purkinje cells regulate sensorimotor adaptation
Cerebellar Purkinje cells mediate accurate eye movement coordination. However, it remains
unclear how oculomotor adaptation depends on the interplay between the characteristic
Purkinje cell response patterns, namely tonic, bursting, and spike pauses. Here, a spiking
cerebellar model assesses the role of Purkinje cell firing patterns in vestibular ocular reflex
(VOR) adaptation. The model captures the cerebellar microcircuit properties and incorporates
spike-based synaptic plasticity at multiple cerebellar sites. A detailed Purkinje cell
model reproduces the three spike-firing patterns that are shown to regulate the cerebellar
output. Our results suggest that pauses following Purkinje complex spikes (bursts) encode
transient disinhibition of target medial vestibular nuclei, critically gating the vestibular signals
conveyed by mossy fibres. This gating mechanism accounts for early and coarse VOR
acquisition, prior to the late reflex consolidation. In addition, properly timed and sized Purkinje
cell bursts allow the ratio between long-term depression and potentiation (LTD/LTP) to
be finely shaped at mossy fibre-medial vestibular nuclei synapses, which optimises VOR
consolidation. Tonic Purkinje cell firing maintains the consolidated VOR through time.
Importantly, pauses are crucial to facilitate VOR phase-reversal learning, by reshaping previously
learnt synaptic weight distributions. Altogether, these results predict that Purkinje
spike burst-pause dynamics are instrumental to VOR learning and reversal adaptation.
This work was supported by the European Union (www.europa.eu), Project SpikeControl 658479 (recipient NL); the Spanish Agencia Estatal de Investigación and European Regional Development Fund (www.ciencia.gob.es/portal/site/MICINN/aei), Project CEREBROT TIN2016-81041-R (recipient ER); and the French National Research Agency (www.agence-nationale-recherche.fr) - Essilor International (www.essilor.com) Chair SilverSight ANR-14-CHIN-0001 (recipient AA).
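The LTD/LTP ratio shaping described in this abstract is commonly parameterized with a pairwise exponential STDP window. The following is a generic sketch with illustrative amplitudes and time constants (including a slight LTD bias), not the model's actual plasticity rule:

```python
import math

# Pairwise exponential STDP window: a generic sketch of how an LTD/LTP
# balance can be parameterized. Amplitudes and time constants are
# illustrative (with a slight LTD bias), not the model's actual rule.
def stdp_dw(delta_t_ms, a_plus=0.010, a_minus=0.012,
            tau_plus=20.0, tau_minus=20.0):
    """Weight change for a spike pair with lag delta_t_ms = t_post - t_pre."""
    if delta_t_ms >= 0:   # pre before post: potentiation (LTP)
        return a_plus * math.exp(-delta_t_ms / tau_plus)
    # post before pre: depression (LTD)
    return -a_minus * math.exp(delta_t_ms / tau_minus)
```

In such a scheme, shifting the timing or size of postsynaptic bursts changes which spike pairings dominate, and therefore tilts the net weight change toward depression or potentiation, which is the kind of fine shaping the abstract attributes to Purkinje cell bursts.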
Dynamical principles in neuroscience
Dynamical modeling of neural systems and brain functions has a history of success over the last half century. This includes, for example, the explanation and prediction of some features of neural rhythmic behaviors. Many interesting dynamical models of learning and memory based on physiological experiments have been suggested over the last two decades. Dynamical models even of consciousness now exist. Usually these models and results are based on traditional approaches and paradigms of nonlinear dynamics including dynamical chaos. Neural systems are, however, an unusual subject for nonlinear dynamics for several reasons: (i) Even the simplest neural network, with only a few neurons and synaptic connections, has an enormous number of variables and control parameters. These make neural systems adaptive and flexible, and are critical to their biological function. (ii) In contrast to traditional physical systems described by well-known basic principles, first principles governing the dynamics of neural systems are unknown. (iii) Many different neural systems exhibit similar dynamics despite having different architectures and different levels of complexity. (iv) The network architecture and connection strengths are usually not known in detail and therefore the dynamical analysis must, in some sense, be probabilistic. (v) Since nervous systems are able to organize behavior based on sensory inputs, the dynamical modeling of these systems has to explain the transformation of temporal information into combinatorial or combinatorial-temporal codes, and vice versa, for memory and recognition. In this review these problems are discussed in the context of addressing the stimulating questions: What can neuroscience learn from nonlinear dynamics, and what can nonlinear dynamics learn from neuroscience?
This work was supported by NSF Grant No. NSF/EIA-0130708 and Grant No. PHY 0414174; NIH Grant No. 1 R01 NS50945 and Grant No. NS40110; MEC BFI2003-07276; and Fundación BBVA.
Emulating short-term synaptic dynamics with memristive devices
Neuromorphic architectures offer great promise for achieving computation capacities beyond conventional von Neumann machines. The essential elements for achieving this vision are highly scalable synaptic mimics that do not undermine biological fidelity. Here we demonstrate that single solid-state TiO2 memristors can exhibit non-associative plasticity phenomena observed in biological synapses, supported by their metastable memory state transition properties. We show that, contrary to conventional uses of solid-state memory, the existence of rate-limiting volatility is a key feature for capturing short-term synaptic dynamics. We also show how the temporal dynamics of our prototypes can be exploited to implement spatio-temporal computation, demonstrating the memristors' full potential for building biophysically realistic neural processing systems.
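The role of rate-limiting volatility in short-term synaptic dynamics can be sketched with a toy state variable that facilitates on each input pulse and decays toward baseline in between. All constants here are illustrative, not measured TiO2 device parameters:

```python
import math

# Toy model of rate-limiting volatility: a conductance-like state that
# facilitates on each input pulse and decays toward baseline in between.
# All constants are illustrative, not measured TiO2 device parameters.
def volatile_state(pulse_times, t_end, dt=1e-3, jump=0.2, tau=0.1):
    """Return the state trace sampled every dt seconds."""
    pulse_steps = {round(t / dt) for t in pulse_times}
    x = 0.0
    trace = []
    for i in range(round(t_end / dt)):
        x *= math.exp(-dt / tau)   # volatile decay (the "forgetting")
        if i in pulse_steps:
            x += jump              # pulse-driven facilitation
        trace.append(x)
    return trace

# Closely spaced pulses summate (paired-pulse facilitation); the state
# then relaxes back toward baseline once stimulation stops.
trace = volatile_state([0.01, 0.02, 0.03], t_end=0.5)
```

The decay is what makes the response history-dependent on short timescales: a burst of pulses drives the state well above what any single pulse produces, and the effect fades once stimulation stops, mirroring short-term facilitation in biological synapses.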
- …