Transient Dynamics of Sparsely Connected Hopfield Neural Networks with Arbitrary Degree Distributions
Using a probabilistic approach, the transient dynamics of sparsely connected
Hopfield neural networks are studied for arbitrary degree distributions. A
recursive scheme is developed to determine the time evolution of the overlap
parameters. As illustrative examples, explicit calculations of the dynamics are
performed for networks with binomial, power-law, and uniform degree
distributions. The results are in good agreement with extensive numerical
simulations. They indicate that, for the same average degree, network
performance improves gradually with increasing sharpness of the degree
distribution, and that the most efficient degree distribution for global
storage of patterns is the delta function.
Comment: 11 pages, 5 figures. Any comments are favored.
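As a rough illustration of the setting (not the paper's recursive scheme), the sketch below builds a sparsely connected Hopfield network with a delta-function in-degree distribution (every neuron receives exactly K inputs), runs synchronous sign dynamics from a noisy probe, and tracks the overlap parameter m = (1/N) Σ_i ξ_i s_i. All sizes (N, P, K) and the noise level are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

N, P = 400, 3                          # neurons, stored patterns (arbitrary)
xi = rng.choice([-1, 1], size=(P, N))  # random binary patterns

# Delta-function degree distribution: each neuron receives exactly K inputs.
K = 30
adj = np.zeros((N, N), dtype=bool)
for i in range(N):
    nbrs = rng.choice(np.delete(np.arange(N), i), size=K, replace=False)
    adj[i, nbrs] = True

# Hebbian couplings restricted to the existing connections, scaled by degree.
J = (xi.T @ xi) / K
J *= adj

def overlap(s, mu=0):
    """Overlap m = (1/N) sum_i xi_i^mu s_i with pattern mu."""
    return (xi[mu] @ s) / N

# Start from a noisy version of pattern 0 and iterate synchronous dynamics.
s = xi[0] * rng.choice([1, -1], size=N, p=[0.9, 0.1])
for t in range(10):
    s = np.sign(J @ s)
    s[s == 0] = 1          # break ties deterministically

print(round(overlap(s), 2))
```

Swapping the fixed K for a binomial, power-law, or uniform in-degree draw reproduces the comparison the abstract describes; at equal average degree, the sharper distributions retrieve more reliably.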
Transient dynamics for sequence processing neural networks
An exact solution of the transient dynamics for a sequential associative
memory model is discussed through both the path-integral method and the
statistical neurodynamics. Although the path-integral method has the ability to
give an exact solution of the transient dynamics, only stationary properties
have been discussed for the sequential associative memory. We have succeeded in
deriving an exact macroscopic description of the transient dynamics by
analyzing the correlation of crosstalk noise. Surprisingly, the order parameter
equations of this exact solution are completely equivalent to those of the
statistical neurodynamics, which is an approximation theory that assumes
crosstalk noise to obey the Gaussian distribution. In order to examine our
theoretical findings, we numerically obtain cumulants of the crosstalk noise.
We verify that the third- and fourth-order cumulants are equal to zero, and
that the crosstalk noise is normally distributed even in the non-retrieval
case. We show that the results obtained by our theory agree with those obtained
by computer simulations. We have also found that the macroscopic unstable state
completely coincides with the separatrix.
Comment: 21 pages, 4 figures
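A minimal sketch of the kind of sequential associative memory analyzed here: asymmetric Hebbian couplings J = (1/N) Σ_μ ξ^{μ+1} (ξ^μ)ᵀ map each pattern to its successor, so synchronous dynamics retrieve the stored sequence step by step. The network size and pattern count are arbitrary assumptions; the analytical treatment in the paper is not reproduced, only the model's update rule.

```python
import numpy as np

rng = np.random.default_rng(1)
N, P = 500, 5                          # neurons, sequence length (arbitrary)
xi = rng.choice([-1, 1], size=(P, N))

# Asymmetric Hebbian couplings mapping pattern mu -> mu+1 (cyclic).
J = sum(np.outer(xi[(mu + 1) % P], xi[mu]) for mu in range(P)) / N

s = xi[0].copy()
overlaps = []
for t in range(1, P + 1):
    s = np.sign(J @ s)
    s[s == 0] = 1
    overlaps.append((xi[t % P] @ s) / N)   # overlap with the expected pattern

print([round(m, 2) for m in overlaps])
```

Well below capacity, each step's overlap with the expected pattern stays near 1; measuring the empirical cumulants of the residual field h_i − ξ_i^{t} m(t) across neurons is the numerical check of Gaussian crosstalk noise that the abstract mentions.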
Bifurcation analysis in an associative memory model
We previously reported the chaos induced by frustration of the interactions in
a non-monotonic sequential associative memory model, and showed its chaotic
behavior at absolute zero. We have now analyzed bifurcation in a stochastic
system, namely a finite-temperature model of the non-monotonic sequential
associative memory model. We derived order-parameter equations from the
stochastic microscopic equations. Two-parameter bifurcation diagrams obtained
from those equations show the coexistence of attractors, which do not appear at
absolute zero, and the disappearance of chaos due to the temperature effect.
Comment: 19 pages
Analysis of Bidirectional Associative Memory using SCSNA and Statistical Neurodynamics
Bidirectional associative memory (BAM) is a kind of artificial neural
network used to memorize and retrieve heterogeneous pattern pairs. Many efforts
have been made to improve BAM from the viewpoint of computer applications, but
few theoretical studies have been done. We investigated the theoretical
characteristics of BAM using a framework of statistical-mechanical analysis. To
investigate the equilibrium state of BAM, we applied self-consistent signal to
noise analysis (SCSNA), obtaining macroscopic parameter equations and the
relative capacity. Moreover, to investigate not only the equilibrium state but
also the retrieval process of reaching the equilibrium state, we applied
statistical neurodynamics to the update rule of BAM and obtained evolution
equations for the macroscopic parameters. These evolution equations are
consistent with the results of SCSNA in the equilibrium state.
Comment: 13 pages, 4 figures
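The BAM update rule the abstract refers to can be sketched as follows: two layers of ±1 units are coupled through a correlation-type matrix W = (1/N_x) Σ_μ η^μ (ξ^μ)ᵀ, and the layers update alternately until a stored pair (ξ^μ, η^μ) is recovered. Layer sizes, pattern count, and noise level are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
Nx, Ny, P = 300, 200, 4                    # layer sizes, stored pairs (arbitrary)
xi  = rng.choice([-1, 1], size=(P, Nx))    # x-layer patterns
eta = rng.choice([-1, 1], size=(P, Ny))    # paired y-layer patterns

# Correlation-type couplings between the two layers (Ny x Nx).
W = (eta.T @ xi) / Nx

def sgn(v):
    v = np.sign(v)
    v[v == 0] = 1
    return v

# BAM retrieval: start from a noisy x-layer probe, update layers alternately.
x = xi[0] * rng.choice([1, -1], size=Nx, p=[0.85, 0.15])
for _ in range(5):
    y = sgn(W @ x)       # y-layer driven by x
    x = sgn(W.T @ y)     # x-layer driven back by y

m_x = (xi[0] @ x) / Nx   # overlap with stored x-pattern
m_y = (eta[0] @ y) / Ny  # overlap with paired y-pattern
```

Tracking (m_x, m_y) over these alternating half-steps is exactly the macroscopic retrieval process whose evolution equations the statistical-neurodynamics analysis derives.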
The path-integral analysis of an associative memory model storing an infinite number of finite limit cycles
An exact solution is shown for the transient dynamics of an
associative memory model storing an infinite number of limit cycles with l
finite steps, by means of the path-integral analysis. Assuming the Maxwell
construction ansatz, we have succeeded in deriving the stationary state
equations of the order parameters from the macroscopic recursive equations with
respect to the finite-step sequence processing model which has retarded
self-interactions. We have also derived the stationary state equations by means
of the self-consistent signal-to-noise analysis (SCSNA). The signal-to-noise analysis must
assume that crosstalk noise of an input to spins obeys a Gaussian distribution.
On the other hand, the path-integral method does not require such a Gaussian
approximation of crosstalk noise. We have found that both the signal-to-noise
analysis and the path-integral analysis give exactly the same result with
respect to the stationary state in the case where the dynamics is
deterministic, when we assume the Maxwell construction ansatz.
We have shown the dependence of the storage capacity (alpha_c) on the number
of patterns per limit cycle (l). The storage capacity increases monotonically
with the number of steps and converges to alpha_c = 0.269 at l ~= 10. The
characteristic properties of the finite-step sequence processing model appear
as long as the number of steps of the limit cycle is of order l = O(1).
Comment: 24 pages, 3 figures
Application of two-parameter dynamical replica theory to retrieval dynamics of associative memory with non-monotonic neurons
The two-parameter dynamical replica theory (2-DRT) is applied to investigate
retrieval properties of non-monotonic associative memory, a model which lacks
thermodynamic potential functions. 2-DRT reproduces dynamical properties of the
model quite well, including the capacity and basin of attraction.
The superretrieval state is also discussed in the framework of 2-DRT. The
local stability condition of the superretrieval state is given, which provides
a better estimate of the region where superretrieval is experimentally observed
than the self-consistent signal-to-noise analysis (SCSNA) does.
Comment: 16 pages, 19 postscript figures
Storage Capacity Diverges with Synaptic Efficiency in an Associative Memory Model with Synaptic Delay and Pruning
It is known that the storage capacity per synapse increases with synaptic
pruning in the case of a correlation-type associative memory model. However, the
storage capacity of the entire network then decreases. To overcome this
difficulty, we propose decreasing the connecting rate while keeping the total
number of synapses constant by introducing delayed synapses. In this paper, a
discrete synchronous-type model with both delayed synapses and their prunings
is discussed as a concrete example of the proposal. First, we explain the
Yanai-Kim theory by employing statistical neurodynamics. This theory
involves macrodynamical equations for the dynamics of a network with serial
delay elements. Next, considering the translational symmetry of the explained
equations, we re-derive macroscopic steady state equations of the model by
using the discrete Fourier transformation. The storage capacities are analyzed
quantitatively. Furthermore, two types of synaptic prunings are treated
analytically: random pruning and systematic pruning. As a result, it becomes
clear that in both prunings, the storage capacity increases as the length of
delay increases and the connecting rate of the synapses decreases when the
total number of synapses is constant. Moreover, an interesting fact becomes
clear: under random pruning, the storage capacity asymptotically approaches a
finite value. In contrast, under systematic pruning the storage capacity
diverges in proportion to the logarithm of the length of delay. These results
theoretically support the significance of pruning following an overgrowth of
synapses in the brain, and strongly suggest that the brain prefers to store
dynamic attractors such as sequences and limit cycles rather than equilibrium
states.
Comment: 27 pages, 14 figures
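The two pruning schemes compared in the abstract can be sketched on a plain correlation-type (Hopfield) network, leaving out the delay elements: random pruning deletes a random fraction of synapses, while systematic pruning keeps only the synapses with the largest |J_ij|. The sizes, load, and connecting rate c below are arbitrary assumptions chosen so that both pruned networks still retrieve.

```python
import numpy as np

rng = np.random.default_rng(3)
N, P = 400, 10                          # neurons, patterns (arbitrary)
xi = rng.choice([-1, 1], size=(P, N))
J = (xi.T @ xi) / N                     # correlation-type couplings
np.fill_diagonal(J, 0)

c = 0.3                                 # connecting rate after pruning

# Random pruning: delete a random (1 - c) fraction of synapses.
J_rand = J * (rng.random((N, N)) < c)

# Systematic pruning: keep the c fraction of synapses with the largest |J_ij|.
thresh = np.quantile(np.abs(J), 1 - c)
J_sys = J * (np.abs(J) >= thresh)

def retrieved(Jm, mu=0, flips=0.1, steps=10):
    """Final overlap with pattern mu, starting from a noisy probe."""
    s = xi[mu] * rng.choice([1, -1], size=N, p=[1 - flips, flips])
    for _ in range(steps):
        s = np.sign(Jm @ s)
        s[s == 0] = 1
    return (xi[mu] @ s) / N

m_rand = retrieved(J_rand)
m_sys = retrieved(J_sys)
print(round(m_rand, 2), round(m_sys, 2))
```

Sweeping P upward until retrieval fails gives a numerical capacity estimate for each scheme; the paper's analytical result is that, per synapse, systematic pruning outpaces random pruning as the delay length grows.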