The Power of Linear Recurrent Neural Networks
Recurrent neural networks are a powerful means to cope with time series. We
show how a class of linearly activated recurrent neural networks, which we
call predictive neural networks, can approximate any time-dependent function
f(t) given by a number of function values. The approximation can effectively
be learned by simply solving a system of linear equations; no backpropagation
or similar methods are needed. Furthermore, the network size can be reduced
by retaining only the most relevant components. Thus, in contrast to other
approaches, ours learns not only the network weights but also the network
architecture. The networks have interesting properties: they settle into
elliptical trajectories in the long run and allow both the prediction of
further values and compact representations of functions. We demonstrate this
in several experiments, among them multiple superimposed oscillators (MSO),
robotic soccer, and predicting stock prices. Predictive neural networks
outperform the previous state of the art on the MSO task with a minimal
number of units. Comment: 22 pages, 14 figures and tables, revised
implementation
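To make the "one linear solve" idea above concrete, here is a minimal sketch
in Python: a linear recurrence of hypothetical order p is fitted to a toy
MSO-like series by a single least-squares solve and then rolled forward to
predict further values. The series, the order, and all names below are
illustrative assumptions, not the paper's actual construction.

import numpy as np

# Toy series: two superimposed sine waves (MSO-style); purely illustrative.
t = np.arange(300)
f = np.sin(0.2 * t) + np.sin(0.311 * t)

# Linear recurrence of order p: f[n] ~ w . (f[n-1], ..., f[n-p]).
p = 8
X = np.column_stack([f[p - k - 1 : len(f) - k - 1] for k in range(p)])
y = f[p:]

# "Training" is a single least-squares solve; no backpropagation involved.
w, *_ = np.linalg.lstsq(X, y, rcond=None)

# Roll the learned recurrence forward to predict further values.
hist = list(f[-p:])                 # last p values, oldest first
preds = []
for _ in range(50):
    nxt = w @ np.array(hist[::-1])  # most recent value first, matching X
    preds.append(nxt)
    hist = hist[1:] + [nxt]
print(np.round(preds[:5], 3))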
Correlations between synapses in pairs of neurons slow down dynamics in randomly connected neural networks
Networks of randomly connected neurons are among the most popular models in
theoretical neuroscience. The connectivity between neurons in the cortex,
however, is not fully random; the simplest and most prominent deviation from
randomness found in experimental data is the overrepresentation of
bidirectional connections among pyramidal cells. Using numerical and analytical
methods, we investigated the effects of partially symmetric connectivity on
dynamics in networks of rate units. We considered the two dynamical regimes
exhibited by random neural networks: the weak-coupling regime, where the firing
activity decays to a single fixed point unless the network is stimulated, and
the strong-coupling or chaotic regime, characterized by internally generated
fluctuating firing rates. In the weak-coupling regime, we computed
analytically, for an arbitrary degree of symmetry, the autocorrelation of
network activity in the presence of external noise. In the chaotic regime, we
performed simulations to
determine the timescale of the intrinsic fluctuations. In both cases, symmetry
increases the characteristic asymptotic decay time of the autocorrelation
function and therefore slows down the dynamics in the network. Comment: 17
pages, 7 figures
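A minimal simulation sketch of the setup described above, assuming standard
rate dynamics dx/dt = -x + J tanh(x) and a common parametrization of partial
symmetry; the parameter names (g, eta) and all values are illustrative
assumptions, not the paper's.

import numpy as np

rng = np.random.default_rng(0)
# Illustrative parameters: N units, gain g in the chaotic regime (g > 1),
# symmetry parameter eta in [0, 1].
N, g, eta = 300, 2.0, 0.6

# Connectivity with tunable symmetry: eta = 0 is fully asymmetric,
# eta = 1 fully symmetric; the denominator keeps the coupling variance g^2/N.
A = rng.normal(size=(N, N))
J = g * (A + eta * A.T) / np.sqrt((1.0 + eta**2) * N)

# Rate dynamics dx/dt = -x + J tanh(x), integrated with forward Euler.
dt, steps = 0.05, 6000
x = rng.normal(size=N)
traj = np.empty((steps, N))
for i in range(steps):
    x += dt * (-x + J @ np.tanh(x))
    traj[i] = x

# Population-averaged autocorrelation of the intrinsic fluctuations;
# a larger eta should give a slower decay, i.e., slower network dynamics.
z = traj[steps // 2 :] - traj[steps // 2 :].mean(axis=0)
ac = np.array([(z[: len(z) - k] * z[k:]).sum(axis=1).mean() for k in range(100)])
print(np.round(ac[:10] / ac[0], 3))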
Convolutional unitary or orthogonal recurrent neural networks
Recurrent neural networks are extremely powerful yet hard to train. One of
their issues is the vanishing gradient problem, whereby training signals may
be exponentially attenuated as they propagate, freezing training. The use of
orthogonal
or unitary matrices, whose powers neither explode nor decay, has been proposed
to mitigate this issue, but their computational expense has hindered their use.
Here we show that in the specific case of convolutional RNNs, we can define a
convolutional exponential and that this operation transforms antisymmetric or
anti-Hermitian convolution kernels into orthogonal or unitary convolution
kernels. We explicitly derive FFT-based algorithms to compute the kernels and
their derivatives. The computational complexity of parametrizing this subspace
of orthogonal transformations is thus the same as the networks' iteration
- …
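The convolutional exponential described in the last abstract can be
illustrated in the single-channel circulant case, where circular convolution
is diagonalized by the FFT; this toy sketch (not the paper's multi-channel
FFT-based algorithm) exponentiates an antisymmetric kernel and checks that
the resulting convolution is orthogonal.

import numpy as np

rng = np.random.default_rng(1)
n = 16

# Antisymmetric circular kernel: a[j] = -a[(n - j) % n] and a[0] = 0, so the
# circulant matrix C(a) with C[i, j] = a[(i - j) % n] is antisymmetric.
a = rng.normal(size=n)
a = a - a[(-np.arange(n)) % n]

# Circular convolution is diagonalized by the DFT, so the convolutional
# exponential amounts to exponentiating the kernel's Fourier coefficients.
k = np.fft.ifft(np.exp(np.fft.fft(a))).real

# Verify orthogonality of the resulting convolution: C(k) @ C(k).T == I.
C = np.array([[k[(i - j) % n] for j in range(n)] for i in range(n)])
print(np.allclose(C @ C.T, np.eye(n)))  # True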