Emergent complex neural dynamics
A large repertoire of spatiotemporal activity patterns in the brain is the
basis for adaptive behaviour. Understanding the mechanism by which the brain's
hundred billion neurons and hundred trillion synapses manage to produce such a
range of cortical configurations in a flexible manner remains a fundamental
problem in neuroscience. One plausible solution is the involvement of universal
mechanisms of emergent complex phenomena evident in dynamical systems poised
near a critical point of a second-order phase transition. We review recent
theoretical and empirical results supporting the notion that the brain is
naturally poised near criticality, as well as its implications for a better
understanding of the brain.
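As a concrete toy illustration of the criticality idea (not from the review
itself), consider a branching process, the standard minimal model of neuronal
avalanches: when the mean number of units activated by each active unit, the
branching parameter sigma, equals 1, the system sits at the critical point of
a second-order phase transition and avalanche sizes follow a power law. A
minimal Python sketch:

```python
import random
from collections import Counter

def avalanche_size(sigma=1.0, max_children=2, max_size=10_000):
    """Size of one avalanche in a branching process with mean offspring sigma."""
    p = sigma / max_children   # per-potential-descendant activation probability
    active, size = 1, 1
    while active and size < max_size:
        # each active unit tries to activate up to max_children descendants
        children = sum(1 for _ in range(active * max_children)
                       if random.random() < p)
        size += children
        active = children
    return size

random.seed(0)
sizes = Counter(avalanche_size() for _ in range(20_000))
for s in sorted(sizes)[:10]:
    print(s, sizes[s])   # at sigma = 1 the counts fall off as a power law
```

Setting sigma below or above 1 yields exponentially small or runaway
avalanches, which is the phase transition the abstract refers to.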
Deep Complex Networks
At present, the vast majority of building blocks, techniques, and
architectures for deep learning are based on real-valued operations and
representations. However, recent work on recurrent neural networks and older
fundamental theoretical analysis suggests that complex numbers could have a
richer representational capacity and could also facilitate noise-robust memory
retrieval mechanisms. Despite their attractive properties and potential for
opening up entirely new neural architectures, complex-valued deep neural
networks have been marginalized due to the absence of the building blocks
required to design such models. In this work, we provide the key atomic
components for complex-valued deep neural networks and apply them to
convolutional feed-forward networks and convolutional LSTMs. More precisely, we
rely on complex convolutions and present algorithms for complex
batch-normalization and complex weight initialization strategies for
complex-valued neural nets, and we use them in experiments with end-to-end
training schemes. We demonstrate that such complex-valued models are
competitive with their real-valued counterparts. We test deep complex models on
several computer vision tasks, on music transcription using the MusicNet
dataset and on Speech Spectrum Prediction using the TIMIT dataset. We achieve
state-of-the-art performance on these audio-related tasks.
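The central building block can be sketched compactly: a complex convolution
implemented with four real-valued convolutions. The PyTorch sketch below
illustrates the construction (the paper also contributes complex
batch-normalization and initialization; names and shapes here are
illustrative, not the authors' released code):

```python
import torch
import torch.nn as nn

class ComplexConv2d(nn.Module):
    """Complex convolution via four real convolutions:
    (x_r + i x_i) * (W_r + i W_i) = (x_r*W_r - x_i*W_i) + i(x_r*W_i + x_i*W_r)."""
    def __init__(self, in_ch, out_ch, kernel_size, **kw):
        super().__init__()
        self.conv_r = nn.Conv2d(in_ch, out_ch, kernel_size, **kw)  # real kernels
        self.conv_i = nn.Conv2d(in_ch, out_ch, kernel_size, **kw)  # imag kernels

    def forward(self, x_r, x_i):
        out_r = self.conv_r(x_r) - self.conv_i(x_i)
        out_i = self.conv_i(x_r) + self.conv_r(x_i)
        return out_r, out_i

# Hypothetical usage on a batch of 8 two-channel complex feature maps.
conv = ComplexConv2d(2, 4, kernel_size=3, padding=1)
x_r, x_i = torch.randn(8, 2, 32, 32), torch.randn(8, 2, 32, 32)
y_r, y_i = conv(x_r, x_i)
print(y_r.shape, y_i.shape)  # torch.Size([8, 4, 32, 32]) each
```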
Explosive synchronization transitions in complex neural networks
It has been recently reported that explosive synchronization transitions can
take place in networks of phase oscillators [Gómez-Gardeñes et al., Phys. Rev.
Lett. 106, 128701 (2011)] and chaotic oscillators [Leyva et al., Phys. Rev.
Lett. 108, 168702 (2012)]. Here, we investigate the effect of a microscopic
correlation between the dynamics and the interaction topology of coupled
FitzHugh-Nagumo oscillators on the phase synchronization transition in
Barabási-Albert (BA) scale-free networks and Erdős-Rényi (ER) random networks.
We show that, if the width of the distribution of natural frequencies of the
oscillations exceeds a threshold value, a strong hysteresis loop arises in the
synchronization diagram of BA networks due to the positive correlation between
node degrees and natural frequencies, indicating an explosive transition
towards synchronization in this system of relaxation oscillators. In contrast,
in the more homogeneous ER networks the synchronization transition is always
continuous regardless of the width of the frequency distribution. Moreover, we
consider the effect of degree-mixing patterns on the nature of the
synchronization transition, and find that degree assortativity is unfavorable
for the occurrence of such an explosive transition.
Comment: 5 pages, 5 figures
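The degree-frequency correlation mechanism is easiest to see in the Kuramoto
setting of the cited Gómez-Gardeñes et al. paper, where each node's natural
frequency is set equal to its degree; the paper above studies the analogous
effect for FitzHugh-Nagumo oscillators. A minimal sketch of a forward coupling
sweep on a BA network (sweeping the coupling back down from the synchronized
state would trace the other branch of the hysteresis loop):

```python
import numpy as np
import networkx as nx

# Kuramoto oscillators on a Barabasi-Albert network with the degree-frequency
# correlation of the Gomez-Gardenes et al. setup: omega_i = k_i.
G = nx.barabasi_albert_graph(200, 3, seed=0)
A = nx.to_numpy_array(G)
omega = A.sum(axis=1)                     # natural frequency = node degree
rng = np.random.default_rng(0)

def relax(theta, lam, steps=2000, dt=0.01):
    """Euler-integrate d(theta_i)/dt = omega_i + lam * sum_j A_ij sin(theta_j - theta_i)."""
    for _ in range(steps):
        diff = theta[None, :] - theta[:, None]   # diff[i, j] = theta_j - theta_i
        theta = theta + dt * (omega + lam * (A * np.sin(diff)).sum(axis=1))
    return theta

theta = 2 * np.pi * rng.random(len(omega))
for lam in np.linspace(0.2, 2.0, 10):            # forward sweep, state carried along
    theta = relax(theta, lam)
    r = abs(np.exp(1j * theta).mean())           # Kuramoto order parameter
    print(f"lambda={lam:.2f}  r={r:.2f}")
```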
Stable Irregular Dynamics in Complex Neural Networks
For infinitely large sparse networks of spiking neurons, mean-field theory
shows that a balanced state of highly irregular activity arises under various
conditions. Here we analytically investigate the microscopic irregular dynamics
in finite networks of arbitrary connectivity, keeping track of all individual
spike times. For delayed, purely inhibitory interactions we demonstrate that
the irregular dynamics is not chaotic but rather stable and convergent towards
periodic orbits. Moreover, every generic periodic orbit of these dynamical
systems is stable. These results highlight that chaotic and stable dynamics are
equally capable of generating irregular activity.
Comment: 10 pages, 2 figures
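A minimal numerical illustration of the stability claim (a sketch under
simplifying assumptions: current-based leaky integrate-and-fire neurons, Euler
integration, one fixed delay, not the paper's exact event-based analysis): run
the same purely inhibitory delayed network twice from minutely different
initial conditions and track the distance between the two trajectories.

```python
import numpy as np

rng = np.random.default_rng(1)
N, K = 200, 20                     # neurons, targets per neuron
J, I_ext = -0.8, 1.2               # inhibitory coupling, suprathreshold drive
dt, delay_steps, T = 0.1, 15, 5000
post = [rng.choice(N, K, replace=False) for _ in range(N)]  # random targets

def run(v0):
    v = v0.copy()
    buf = np.zeros((delay_steps, N))      # ring buffer of delayed synaptic input
    trace = np.empty((T, N))
    for t in range(T):
        idx = t % delay_steps
        v += dt * (-v + I_ext) + buf[idx] # leaky integration + delayed input
        buf[idx] = 0.0
        spikes = np.where(v >= 1.0)[0]
        v[spikes] = 0.0                   # reset at threshold
        for s in spikes:                  # schedule inhibition delay_steps ahead
            buf[idx, post[s]] += J / K
        trace[t] = v
    return trace

v0 = rng.random(N)
d = np.linalg.norm(run(v0) - run(v0 + 1e-6), axis=1)
print(d[0], d[T // 2], d[-1])  # with purely inhibitory delayed coupling the
                               # distance typically shrinks: stable, not chaotic
```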
Complexity without chaos: Plasticity within random recurrent networks generates robust timing and motor control
It is widely accepted that the complex dynamics characteristic of recurrent
neural circuits contributes in a fundamental manner to brain function. Progress
has been slow in understanding and exploiting the computational power of
recurrent dynamics for two main reasons: nonlinear recurrent networks often
exhibit chaotic behavior, and most known learning rules do not work robustly
in recurrent networks. Here we address both of these problems by
demonstrating how random recurrent networks (RRN) that initially exhibit
chaotic dynamics can be tuned through a supervised learning rule to generate
locally stable neural patterns of activity that are both complex and robust to
noise. The outcome is a novel neural network regime that exhibits both
transiently stable and chaotic trajectories. We further show that the recurrent
learning rule dramatically increases the ability of RRNs to generate complex
spatiotemporal motor patterns, and accounts for recent experimental data
showing a decrease in neural variability in response to stimulus onset.
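The paper's rule trains the recurrent weights themselves to stabilize an
"innate" chaotic trajectory; a closely related and simpler illustration is
FORCE learning (Sussillo & Abbott), which tames a chaotic rate network by
training only a feedback readout with recursive least squares. A minimal
sketch, with all sizes and constants illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
N, g, dt = 300, 1.5, 0.1
W = g * rng.standard_normal((N, N)) / np.sqrt(N)   # g > 1: chaotic rate network
w_fb = rng.uniform(-1, 1, N)                       # feedback weights for readout
w_out = np.zeros(N)                                # trained readout weights
P = np.eye(N)                                      # RLS inverse-correlation estimate

T, T_train = 2400, 2000
target = np.sin(2 * np.pi * np.arange(T) * dt / 6) # periodic target pattern
x = 0.5 * rng.standard_normal(N)
for t in range(T):
    r = np.tanh(x)
    z = w_out @ r                                  # readout, fed back into network
    x += dt * (-x + W @ r + w_fb * z)
    if t < T_train and t % 2 == 0:                 # FORCE: RLS update of readout
        Pr = P @ r
        k = Pr / (1.0 + r @ Pr)
        P -= np.outer(k, Pr)
        w_out += (target[t] - z) * k
    elif t >= T_train and t % 40 == 0:             # test: readout should track target
        print(f"t={t * dt:6.1f}  z={z:+.3f}  target={target[t]:+.3f}")
```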
Hyperplane Neural Codes and the Polar Complex
Hyperplane codes are a class of convex codes that arise as the output of a
one-layer feed-forward neural network. Here we establish several natural
properties of stable hyperplane codes in terms of the {\it polar complex} of
the code, a simplicial complex associated to any combinatorial code. We prove
that the polar complex of a stable hyperplane code is shellable and show that
most currently known properties of the hyperplane codes follow from the
shellability of the appropriate polar complex.
Comment: 23 pages, 5 figures. To appear in Proceedings of the Abel Symposium
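To fix ideas, a hyperplane code is just the set of firing patterns a one-layer
network produces over a stimulus space: unit i fires on stimulus x exactly
when w_i · x + b_i > 0, so each codeword records which half-spaces contain x.
A minimal sketch enumerating the code realized by random hyperplanes in the
plane (sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_units, n_points = 2, 4, 2000      # plane, 4 hyperplanes, dense sampling

# One-layer feed-forward network: unit i fires on x iff w_i . x + b_i > 0,
# i.e. iff x lies in the open half-space bounded by hyperplane i.
W = rng.standard_normal((n_units, d))
b = rng.standard_normal(n_units)
X = rng.uniform(-3, 3, (n_points, d))  # stimulus points

firing = (X @ W.T + b) > 0             # (n_points, n_units) boolean
code = {tuple(np.flatnonzero(row)) for row in firing}
print(sorted(code))                    # the hyperplane code: one codeword per
                                       # region cut out by the hyperplanes
```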
Convolutional Drift Networks for Video Classification
Analyzing spatio-temporal data like video is a challenging task that requires
processing visual and temporal information effectively. Convolutional Neural
Networks have shown promise as baseline fixed feature extractors through
transfer learning, a technique that helps minimize the training cost on visual
information. Temporal information is often handled using hand-crafted features
or Recurrent Neural Networks, but this can be overly specific or prohibitively
complex. Building a fully trainable system that can efficiently analyze
spatio-temporal data without hand-crafted features or complex training is an
open challenge. We present a new neural network architecture to address this
challenge, the Convolutional Drift Network (CDN). Our CDN architecture combines
the visual feature extraction power of deep Convolutional Neural Networks with
the intrinsically efficient temporal processing provided by Reservoir
Computing. In this introductory paper on the CDN, we provide a very simple
baseline implementation tested on two egocentric (first-person) video activity
datasets. We achieve video-level activity classification results on par with
state-of-the-art methods. Notably, performance on this complex spatio-temporal
task was achieved by training only a single feed-forward layer in the CDN.
Comment: Published in IEEE Rebooting Computing
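The division of labor in the CDN can be sketched in a few lines: a fixed
feature extractor per frame, a fixed random reservoir integrating frames over
time, and a single trained layer on top. The sketch below substitutes random
vectors for the pretrained CNN frame features and uses a ridge-regression
readout; all sizes and the readout choice are illustrative assumptions, not
the paper's exact configuration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_feat, n_res, n_classes = 512, 300, 5     # hypothetical sizes

# Fixed random echo state reservoir, scaled for the echo state property.
W_in = 0.1 * rng.standard_normal((n_res, n_feat))
W = rng.standard_normal((n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # spectral radius below 1

def video_state(frames):
    """Drive the reservoir with per-frame features; return the final state."""
    x = np.zeros(n_res)
    for f in frames:
        x = np.tanh(W @ x + W_in @ f)
    return x

# Stand-in data: 100 'videos' of 30 frames of features (random here; the CDN
# would take these from a pretrained convolutional network).
feats = rng.standard_normal((100, 30, n_feat))
labels = rng.integers(0, n_classes, 100)
S = np.stack([video_state(v) for v in feats])

# As in the CDN, only a single layer is trained: a ridge-regression readout.
Y = np.eye(n_classes)[labels]
W_out = np.linalg.solve(S.T @ S + 1e-2 * np.eye(n_res), S.T @ Y)
pred = (S @ W_out).argmax(axis=1)
print("training accuracy:", (pred == labels).mean())
```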
