Characterization and Compensation of Network-Level Anomalies in Mixed-Signal Neuromorphic Modeling Platforms
Advancing the size and complexity of neural network models leads to an ever
increasing demand for computational resources for their simulation.
Neuromorphic devices offer a number of advantages over conventional computing
architectures, such as high emulation speed or low power consumption, but this
usually comes at the price of reduced configurability and precision. In this
article, we investigate the consequences of several such factors that are
common to neuromorphic devices, more specifically limited hardware resources,
limited parameter configurability and parameter variations. Our final aim is to
provide an array of methods for coping with such inevitable distortion
mechanisms. As a platform for testing our proposed strategies, we use an
executable system specification (ESS) of the BrainScaleS neuromorphic system,
which has been designed as a universal emulation back-end for neuroscientific
modeling. We address the most essential limitations of this device in detail
and study their effects on three prototypical benchmark network models within a
well-defined, systematic workflow. For each network model, we start by defining
quantifiable functionality measures by which we then assess the effects of
typical hardware-specific distortion mechanisms, both in idealized software
simulations and on the ESS. For those effects that cause unacceptable
deviations from the original network dynamics, we suggest generic compensation
mechanisms and demonstrate their effectiveness. Both the suggested workflow and
the investigated compensation mechanisms are largely back-end independent and
do not require additional hardware configurability beyond that required to
emulate the benchmark networks in the first place. We hereby provide a generic
methodological environment for configurable neuromorphic devices that are
targeted at emulating large-scale, functional neural networks.
Dynamical principles in neuroscience
Dynamical modeling of neural systems and brain functions has a history of success over the last half century. This includes, for example, the explanation and prediction of some features of neural rhythmic behaviors. Many interesting dynamical models of learning and memory based on physiological experiments have been suggested over the last two decades. Dynamical models even of consciousness now exist. Usually these models and results are based on traditional approaches and paradigms of nonlinear dynamics, including dynamical chaos. Neural systems are, however, an unusual subject for nonlinear dynamics for several reasons: (i) Even the simplest neural network, with only a few neurons and synaptic connections, has an enormous number of variables and control parameters. These make neural systems adaptive and flexible, and are critical to their biological function. (ii) In contrast to traditional physical systems described by well-known basic principles, the first principles governing the dynamics of neural systems are unknown. (iii) Many different neural systems exhibit similar dynamics despite having different architectures and different levels of complexity. (iv) The network architecture and connection strengths are usually not known in detail, and therefore the dynamical analysis must, in some sense, be probabilistic. (v) Since nervous systems are able to organize behavior based on sensory inputs, the dynamical modeling of these systems has to explain the transformation of temporal information into combinatorial or combinatorial-temporal codes, and vice versa, for memory and recognition. In this review these problems are discussed in the context of addressing the stimulating questions: What can neuroscience learn from nonlinear dynamics, and what can nonlinear dynamics learn from neuroscience? This work was supported by NSF Grants No. NSF/EIA-0130708 and No. PHY 0414174; NIH Grants No. 1 R01 NS50945 and No. NS40110; MEC BFI2003-07276; and Fundación BBVA.
Mechanisms of information encoding and processing in networks based on neural signatures
Unpublished doctoral thesis, defended at the Universidad Autónoma de Madrid, Escuela Politécnica Superior, Departamento de Tecnología Electrónica y de las Comunicaciones. Date of defense: 21-02-202
Scaling of a large-scale simulation of synchronous slow-wave and asynchronous awake-like activity of a cortical model with long-range interconnections
Cortical synapse organization supports a range of dynamic states on multiple
spatial and temporal scales, from synchronous slow wave activity (SWA),
characteristic of deep sleep or anesthesia, to fluctuating, asynchronous
activity during wakefulness (AW). Such dynamic diversity poses a challenge for
producing efficient large-scale simulations that embody realistic metaphors of
short- and long-range synaptic connectivity. In fact, during SWA and AW
different spatial extents of the cortical tissue are active in a given timespan
and at different firing rates, which implies a wide variety of loads of local
computation and communication. A balanced evaluation of simulation performance
and robustness should therefore include tests of a variety of cortical dynamic
states. Here, we demonstrate performance scaling of our proprietary Distributed
and Plastic Spiking Neural Networks (DPSNN) simulation engine in both SWA and
AW for bidimensional grids of neural populations, which reflects the modular
organization of the cortex. We explored networks up to 192x192 modules, each
composed of 1250 integrate-and-fire neurons with spike-frequency adaptation,
and exponentially decaying inter-modular synaptic connectivity with varying
spatial decay constant. For the largest networks the total number of synapses
was over 70 billion. The execution platform included up to 64 dual-socket
nodes, each socket mounting 8 Intel Xeon Haswell processor cores clocked at
2.40 GHz. Network initialization time, memory usage, and execution time
showed good scaling performance from 1 to 1024 processes, implemented using
the standard Message Passing Interface (MPI) protocol. We achieved simulation
speeds of between 2.3x10^9 and 4.1x10^9 synaptic events per second for both
cortical states in the explored range of inter-modular interconnections. Comment: 22 pages, 9 figures, 4 tables
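As a rough consistency check on the figures quoted above (only the quoted numbers come from the abstract; the per-neuron synapse count is a derived back-of-the-envelope estimate, not a number from the paper):

```python
# Rough consistency check of the network sizes quoted in the abstract.
# Quoted figures: a 192x192 grid of modules, 1250 neurons per module,
# "over 70 billion" synapses. Everything derived from them below is our
# own arithmetic, not data from the paper.

modules = 192 * 192                      # bidimensional grid of modules
neurons_per_module = 1250                # integrate-and-fire neurons each
total_neurons = modules * neurons_per_module

synapses = 70e9                          # lower bound for largest networks
synapses_per_neuron = synapses / total_neurons

print(total_neurons)                     # 46,080,000 neurons in total
print(round(synapses_per_neuron))        # roughly 1500 synapses per neuron
```

So the largest runs simulate about 46 million neurons, consistent with the stated synapse total at an in-degree of order 10^3.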
Clique of functional hubs orchestrates population bursts in developmentally regulated neural networks
It has recently been discovered that single neuron stimulation can impact
network dynamics in immature and adult neuronal circuits. Here we report a
novel mechanism that can explain the peculiar role played by a few specific
neurons in promoting or arresting population activity in neuronal circuits
at an early stage of development. For this purpose, we consider a
standard neuronal network model, with short-term synaptic plasticity, whose
population activity is characterized by bursting behavior. The addition of
developmentally inspired constraints and correlations in the distribution of
the neuronal connectivities and excitabilities leads to the emergence of
functional hub neurons, whose stimulation/deletion is critical for the network
activity. Functional hubs form a clique, where a precise sequential activation
of the neurons is essential to ignite collective events without any need for a
specific topological architecture. Unsupervised time-lagged firings of
supra-threshold cells, in connection with coordinated entrainments of
near-threshold neurons, are the key ingredients to orchestrateComment: 39 pages, 15 figures, to appear in PLOS Computational Biolog
Dynamics and precursor signs for phase transitions in neural systems
This thesis investigates neural state transitions associated with sleep, seizure and anaesthesia. The aim is to address the question: How does a brain traverse the critical threshold between distinct cortical states, both healthy and pathological? Specifically we are interested in sub-threshold neural behaviour immediately prior to state transition. We use theoretical neural modelling (single spiking neurons, a network of these, and a mean-field continuum limit) and in vitro experiments to address this question.
Dynamically realistic equations of motion for thalamic relay neurons, reticular nucleus neurons, cortical pyramidal neurons and cortical interneurons in different vigilance states are developed, based on the Izhikevich spiking neuron model. A network of cortical neurons is assembled to examine the behaviour of the gamma-producing cortical network and its transition to lower frequencies due to the effect of anaesthesia. Then a three-neuron model of the thalamocortical loop for sleep spindles is presented. Numerical simulations of these networks confirm spiking consistent with reported in vivo measurements, and provide supporting evidence for precursor indicators of imminent phase transition due to the occurrence of individual spindles.
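The Izhikevich model underlying these networks reduces each neuron to a two-variable system with a quadratic voltage equation and an after-spike reset. A minimal sketch, assuming the standard regular-spiking parameters from Izhikevich's 2003 formulation rather than the thesis's vigilance-state-specific values:

```python
# Minimal Euler integration of the Izhikevich spiking neuron model.
# v: membrane potential (mV), u: recovery variable. The parameters
# a, b, c, d below are the standard regular-spiking values; the
# thesis's state-dependent parameter sets may differ.

def izhikevich(I=10.0, a=0.02, b=0.2, c=-65.0, d=8.0, dt=0.5, T=200.0):
    """Simulate for T ms with constant input I; return spike times."""
    v, u = -65.0, b * -65.0
    spikes, t = [], 0.0
    while t < T:
        if v >= 30.0:                 # spike peak reached: reset
            spikes.append(t)
            v, u = c, u + d
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += dt * a * (b * v - u)
        t += dt
    return spikes

# A constant suprathreshold current produces tonic spiking;
# with no input the neuron stays at rest.
print(len(izhikevich(I=10.0)), len(izhikevich(I=0.0)))
```

Networks such as those in the thesis couple many of these units through synaptic current terms added to I, with parameter sets chosen per cell type (relay, reticular, pyramidal, interneuron).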
To complement the spiking neuron networks, we study the Wilson–Cowan neural mass equations describing homogeneous cortical columns and a 1D spatial cluster of such columns. The abstract representation of cortical tissue by a pair of coupled integro-differential equations permits thorough linear stability, phase plane and bifurcation analyses. This model shows a rich set of spatial and temporal bifurcations marking the boundary to state transitions: saddle-node, Hopf, Turing, and mixed Hopf–Turing. Close to state transition, white-noise-induced subthreshold fluctuations show clear signs of critical slowing down with prolongation and strengthening of autocorrelations, both in time and space, irrespective of bifurcation type.
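For reference, the Wilson–Cowan neural mass equations mentioned above take, in their common simplified (space-clamped) form, the following shape; the symbols here are the standard ones and not necessarily the thesis's notation:

```latex
\tau_e \frac{dE}{dt} = -E + S_e\big(w_{ee}E - w_{ei}I + P(t)\big), \qquad
\tau_i \frac{dI}{dt} = -I + S_i\big(w_{ie}E - w_{ii}I + Q(t)\big)
```

where $E$ and $I$ are the mean activities of the excitatory and inhibitory populations, $S_e, S_i$ are sigmoidal firing-rate functions, the $w$'s are coupling strengths, and $P, Q$ are external inputs. Linearizing about a fixed point of this system, and of its spatial (integro-differential) extension, yields the saddle-node, Hopf, Turing, and mixed Hopf–Turing bifurcation structure analysed in the thesis.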
Attempts at in vitro capture of these predicted leading indicators form the last part of the thesis. We recorded local field potentials (LFPs) from cortical and hippocampal slices of mouse brain. State transition is marked by the emergence and cessation of spontaneous seizure-like events (SLEs) induced by bathing the slices in an artificial cerebrospinal fluid containing no magnesium ions. Phase-plane analysis of the LFP time-series suggests that distinct bifurcation classes can be responsible for the state change to seizure. Increased variance and growth of spectral power at low frequencies (f < 15 Hz) were observed in LFP recordings prior to the initiation of some SLEs. In addition, we demonstrated prolongation of electrically evoked potentials in cortical tissue while driving the slice toward a seizing regime. The results offer the possibility of capturing leading temporal indicators prior to seizure generation, with potential consequences for understanding epileptogenesis.
Guided by dynamical systems theory, this thesis captures evidence for precursor signs of phase transitions in neural systems using mathematical and computer-based modelling as well as in vitro experiments.
Memory and information processing in neuromorphic systems
A striking difference between brain-inspired neuromorphic processors and
current von Neumann processor architectures is the way in which memory and
processing are organized. As Information and Communication Technologies continue
to address the need for increased computational power through the increase of
cores within a digital processor, neuromorphic engineers and scientists can
complement this need by building processor architectures where memory is
distributed with the processing. In this paper we present a survey of
brain-inspired processor architectures that support models of cortical networks
and deep neural networks. These architectures range from serial clocked
implementations of multi-neuron systems to massively parallel asynchronous ones
and from purely digital systems to mixed analog/digital systems which implement
more biologically realistic models of neurons and synapses, together with a suite
of adaptation and learning mechanisms analogous to those found in biological
nervous systems. We describe the advantages of the different approaches being
pursued and present the challenges that need to be addressed for building
artificial neural processing systems that can display the richness of behaviors
seen in biological systems. Comment: Submitted to Proceedings of the IEEE; a review of recently proposed neuromorphic computing platforms and systems
Information processing in a midbrain visual pathway
Visual information is processed in the brain via intricate interactions between neurons. We investigated a midbrain visual pathway (the optic tectum and its isthmic nucleus) that is motion sensitive and is thought to be part of the attentional system. We determined the physiological properties of individual neurons, as well as their synaptic connections, with intracellular recordings. We reproduced the center-surround receptive field structure of tectal neurons in a dynamical recurrent feedback loop. We revealed in a computational model that the anti-topographic inhibitory feedback could mediate competitive stimulus selection in a complex visual scene. We also investigated the dynamics of the competitive selection in a rate model. The isthmotectal feedback loop gates the information transfer from the tectum to the thalamic nucleus rotundus. We discussed the role of a localized feedback projection in contributing to the gating mechanism, with both experimental and numerical approaches. We further discussed the dynamics of the isthmotectal system by considering the propagation delays between its components. We conclude that the isthmotectal system is involved in attention-like competitive stimulus selection and controls the information coding in the motion-sensitive SGC-I neurons by modulating retino-tectal synaptic transmission.
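Competitive stimulus selection through broad ("anti-topographic") inhibitory feedback, as described above, can be caricatured as a winner-take-all rate model in which every unit drives a shared inhibitory pool. A minimal sketch with illustrative parameters (the function name and all values are ours, not the study's):

```python
# Winner-take-all sketch: each unit excites a shared inhibitory pool that
# suppresses all units equally, so the most strongly driven unit survives.
# Form and parameters are illustrative assumptions, not the study's model.

def competitive_selection(inputs, w_inh=2.0, dt=0.01, steps=2000):
    """Relax rate dynamics dr_i/dt = -r_i + [I_i - w_inh * sum(r)]_+."""
    r = [0.0] * len(inputs)              # firing rates of the units
    for _ in range(steps):
        inhibition = w_inh * sum(r)      # pooled, non-topographic feedback
        r = [max(0.0, ri + dt * (-ri + max(0.0, Ii - inhibition)))
             for ri, Ii in zip(r, inputs)]
    return r

# Three stimuli of different strengths: only the strongest survives.
print(competitive_selection([1.0, 0.6, 0.3]))
```

Because the inhibition is pooled rather than topographic, the suppression a unit receives depends on total network activity, not on distance, which is the essence of the anti-topographic selection mechanism.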