Convolutional unitary or orthogonal recurrent neural networks
Recurrent neural networks are extremely powerful yet hard to train. One of
their issues is the vanishing gradient problem, whereby propagation of training
signals may be exponentially attenuated, freezing training. Use of orthogonal
or unitary matrices, whose powers neither explode nor decay, has been proposed
to mitigate this issue, but their computational expense has hindered their use.
Here we show that in the specific case of convolutional RNNs, we can define a
convolutional exponential and that this operation transforms antisymmetric or
anti-Hermitian convolution kernels into orthogonal or unitary convolution
kernels. We explicitly derive FFT-based algorithms to compute the kernels and
their derivatives. The computational complexity of parametrizing this subspace
of orthogonal transformations is thus the same as the networks' iteration
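The linear-algebra fact underlying this construction can be checked directly: the exponential of an antisymmetric matrix is orthogonal (and of an anti-Hermitian matrix, unitary), since exp(A)ᵀ = exp(Aᵀ) = exp(−A) = exp(A)⁻¹. A minimal numerical sketch with a dense matrix (the paper's contribution is the analogous FFT-based construction for convolution kernels, which this does not reproduce):

```python
import numpy as np
from scipy.linalg import expm

# Build an antisymmetric (skew-symmetric) matrix A = B - B^T, so A^T = -A.
rng = np.random.default_rng(0)
B = rng.standard_normal((5, 5))
A = B - B.T

# Its matrix exponential is orthogonal by construction:
# expm(A).T @ expm(A) = expm(-A) @ expm(A) = I.
Q = expm(A)
err = np.abs(Q.T @ Q - np.eye(5)).max()
print(err < 1e-10)  # True
```

Because orthogonality holds exactly for any antisymmetric input, gradient steps taken in the space of antisymmetric kernels never leave the orthogonal set.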
Dynamical and Statistical Criticality in a Model of Neural Tissue
For the nervous system to work at all, a delicate balance of excitation and
inhibition must be achieved. However, when such a balance is sought by global
strategies, only a few modes remain balanced close to instability, while all other
modes are strongly stable. Here we present a simple model of neural tissue in
which this balance is sought locally by neurons following `anti-Hebbian'
behavior: all degrees of freedom achieve a close balance of excitation
and inhibition and become "critical" in the dynamical sense. At long
timescales, the modes of our model oscillate around the instability line, so an
extremely complex "breakout" dynamics ensues in which different modes of the
system oscillate between prominence and extinction. We show the system develops
various anomalous statistical behaviours and hence becomes self-organized
critical in the statistical sense.
Noise-induced memory in extended excitable systems
We describe a form of memory exhibited by extended excitable systems driven
by stochastic fluctuations. Under such conditions, the system self-organizes
into a state characterized by power-law correlations, thus retaining long-term
memory of previous states. The exponents are robust and model-independent. We
discuss novel implications of these results for the functioning of cortical
neurons as well as for networks of neurons.
Noise and Inertia-Induced Inhomogeneity in the Distribution of Small Particles in Fluid Flows
The dynamics of small spherical neutrally buoyant particulate impurities
immersed in a two-dimensional fluid flow are known to lead to particle
accumulation in the regions of the flow in which rotation dominates over shear,
provided that the Stokes number of the particles is sufficiently small. If the
flow is viewed as a Hamiltonian dynamical system, it can be seen that the
accumulations occur in the nonchaotic parts of the phase space: the
Kolmogorov--Arnold--Moser tori. This has suggested a generalization of these
dynamics to Hamiltonian maps, dubbed a bailout embedding. In this paper we use
a bailout embedding of the standard map to mimic the dynamics of impurities
subject not only to drag but also to fluctuating forces modelled as white
noise. We find that the generation of inhomogeneities associated with the
separation of particle from fluid trajectories is enhanced by the presence of
noise, so that they appear in much broader ranges of the Stokes number than
those allowing spontaneous separation.
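A hedged sketch of a noisy bailout embedding of the Chirikov standard map (the detachment rule and all parameter values below are illustrative assumptions, not the paper's exact equations): the embedded trajectory detaches from the underlying map trajectory wherever the local stretching of the Jacobian beats the dissipation factor exp(−γ), and added white noise triggers detachment even where it would not occur spontaneously.

```python
import numpy as np

K = 1.0          # standard-map nonlinearity (assumed)
GAMMA = 0.3      # detachment dissipation, playing the role of drag (assumed)
SIGMA = 1e-3     # white-noise amplitude (assumed)
TWO_PI = 2.0 * np.pi

def standard_map(theta, p):
    """One step of the area-preserving standard map on the torus."""
    p_new = (p + K * np.sin(theta)) % TWO_PI
    theta_new = (theta + p_new) % TWO_PI
    return theta_new, p_new

def jacobian(theta):
    """Jacobian of the standard map with respect to (theta, p)."""
    c = K * np.cos(theta)
    return np.array([[1.0 + c, 1.0],
                     [c, 1.0]])

def run(n_steps, rng):
    theta, p = rng.uniform(0.0, TWO_PI, size=2)
    d = np.zeros(2)               # detachment from the map trajectory
    traj = np.empty((n_steps, 2))
    for n in range(n_steps):
        # Detachment is stretched by the local Jacobian, damped by exp(-gamma),
        # and kicked by white noise.
        d = np.exp(-GAMMA) * (jacobian(theta) @ d)
        d += SIGMA * rng.standard_normal(2)
        d = (d + np.pi) % TWO_PI - np.pi   # keep detachment on the torus
        theta, p = standard_map(theta, p)
        theta = (theta + d[0]) % TWO_PI
        p = (p + d[1]) % TWO_PI
        traj[n] = theta, p
    return traj

traj = run(10_000, np.random.default_rng(1))
print(traj.shape)  # (10000, 2)
```

On KAM tori the stretching is weak and the detachment decays, while in chaotic regions it is amplified; histogramming `traj` exposes the resulting inhomogeneous particle distribution.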