Phase Diagram and Storage Capacity of Sequence Processing Neural Networks
We solve the dynamics of Hopfield-type neural networks which store sequences
of patterns, close to saturation. The asymmetry of the interaction matrix in
such models leads to violation of detailed balance, ruling out an equilibrium
statistical mechanical analysis. Using generating functional methods we derive
exact closed equations for dynamical order parameters, viz. the sequence
overlap and correlation- and response functions, in the thermodynamic limit. We
calculate the time translation invariant solutions of these equations,
describing stationary limit-cycles, which leads to a phase diagram. The
effective retarded self-interaction usually appearing in symmetric models is
here found to vanish, which causes a significantly enlarged storage capacity of
\alpha_c \sim 0.269, compared to \alpha_c \sim 0.139 for Hopfield networks
storing static patterns. Our results are tested against extensive computer
simulations and excellent agreement is found.
Comment: 17 pages LaTeX2e, 2 postscript figures
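A minimal simulation sketch of this kind of sequence-storing network (not code from the paper; the sizes N and p, the cyclic Hebbian rule and the parallel zero-temperature update are illustrative assumptions):

```python
# Minimal sketch: a sequence-processing Hopfield-type network with asymmetric
# couplings J_ij = (1/N) sum_mu xi^{mu+1}_i xi^{mu}_j, updated in parallel at
# zero temperature.  N and p are arbitrary example values (load alpha = p/N).
import numpy as np

rng = np.random.default_rng(0)
N, p = 2000, 50                        # neurons, patterns in the stored sequence
xi = rng.choice([-1, 1], size=(p, N))  # random binary patterns xi^mu_i

# Asymmetric sequence couplings: pattern mu points to pattern mu+1 (cyclically).
J = (xi[(np.arange(p) + 1) % p].T @ xi) / N

# Start from a noisy version of pattern 0 and iterate the parallel dynamics.
s = np.where(rng.random(N) < 0.1, -xi[0], xi[0])   # 10% of spins flipped
for t in range(20):
    s = np.sign(J @ s)
    s[s == 0] = 1
    overlaps = xi @ s / N              # overlap with each stored pattern
    print(f"t={t:2d}  best pattern={overlaps.argmax()}  overlap={overlaps.max():.3f}")
```

In the retrieval phase the recalled pattern index advances by one per time step while the overlap stays close to 1; pushing the load p/N toward the storage capacity makes this retrieval break down.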
Analysis of Oscillator Neural Networks for Sparsely Coded Phase Patterns
We study a simple extended model of oscillator neural networks capable of
storing sparsely coded phase patterns, in which information is encoded both in
the mean firing rate and in the timing of spikes. Applying the methods of
statistical neurodynamics to our model, we theoretically investigate the
model's associative memory capability by evaluating its maximum storage
capacities and deriving its basins of attraction. It is shown that, as in the
Hopfield model, the storage capacity diverges as the activity level decreases.
We consider various practically and theoretically important cases. For example,
it is revealed that a dynamically adjusted threshold mechanism enhances the
retrieval ability of the associative memory. It is also found that, under
suitable conditions, the network can recall patterns even when patterns with
different activity levels are stored simultaneously. In
addition, we examine the robustness with respect to damage of the synaptic
connections. The validity of these theoretical results is confirmed by
reasonable agreement with numerical simulations.
Comment: 23 pages, 11 figures
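A rough sketch of how a phase-coded, sparsely active associative memory of this general kind can be simulated (an assumed toy construction, not the authors' model; the complex Hebb rule, the fixed threshold theta and the coding level f are illustrative choices):

```python
# Minimal sketch: each unit carries a complex state z_i = a_i * exp(i*phi_i);
# patterns are sparsely coded (activity level f) and couplings follow a
# complex Hebbian rule.  The threshold gating which units fire is assumed.
import numpy as np

rng = np.random.default_rng(1)
N, p, f = 1000, 20, 0.2                       # units, patterns, coding level
active = rng.random((p, N)) < f               # which units fire in each pattern
phase  = rng.uniform(0, 2 * np.pi, (p, N))    # spike timing (phase) of active units
xi = active * np.exp(1j * phase)              # amplitude encodes rate, angle encodes timing

C = (xi.T @ xi.conj()) / (f * N)              # complex Hebbian couplings

def step(z, theta=0.5):
    """One synchronous update: the local field sets the phase; its modulus gates activity."""
    h = C @ z
    return (np.abs(h) > theta) * np.exp(1j * np.angle(h))

# Recall pattern 0 from a cue whose phases have been jittered.
z = xi[0] * np.exp(1j * rng.normal(0, 0.5, N))
for t in range(10):
    z = step(z)
    m = np.abs(np.vdot(xi[0], z)) / (f * N)   # rotation-invariant overlap with pattern 0
    print(f"t={t}  |overlap| = {m:.3f}")
```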
The path-integral analysis of an associative memory model storing an infinite number of finite limit cycles
We derive an exact solution of the transient dynamics of an associative memory
model storing an infinite number of limit cycles with l finite steps by means
of the path-integral analysis. Assuming the Maxwell
construction ansatz, we have succeeded in deriving the stationary state
equations of the order parameters from the macroscopic recursive equations with
respect to the finite-step sequence processing model which has retarded
self-interactions. We have also derived the stationary state equations by means
of the self-consistent signal-to-noise analysis (SCSNA). The SCSNA must
assume that the crosstalk noise of the input to the spins obeys a Gaussian distribution.
On the other hand, the path-integral method does not require such a Gaussian
approximation of the crosstalk noise. We have found that the signal-to-noise
analysis and the path-integral analysis give exactly the same result for the
stationary state when the dynamics is deterministic and the Maxwell
construction ansatz is assumed.
We have shown the dependence of the storage capacity alpha_c on the number of
patterns per limit cycle, l. The storage capacity increases monotonically with
the number of steps and converges to alpha_c = 0.269 at l ~= 10. The characteristic
properties of the finite-step sequence processing model appear as long as the
number of steps in a limit cycle is of order l = O(1).
Comment: 24 pages, 3 figures
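An illustrative sketch of storing limit cycles of l finite steps with an asymmetric Hebbian rule (an assumed construction, not the paper's code; N, p and l are arbitrary example values, with the load p*l/N kept well below the quoted capacity):

```python
# Minimal sketch: p independent limit cycles, each of l finite steps, stored
# via asymmetric couplings J_ij = (1/N) sum_mu sum_k xi^{mu,k+1}_i xi^{mu,k}_j
# (the step index k is cyclic mod l).  The load is alpha = p*l/N.
import numpy as np

rng = np.random.default_rng(2)
N, p, l = 1500, 30, 5
xi = rng.choice([-1, 1], size=(p, l, N))      # xi[mu, k, i]

J = np.zeros((N, N))
for mu in range(p):
    for k in range(l):
        J += np.outer(xi[mu, (k + 1) % l], xi[mu, k])
J /= N

# Kick the network into cycle 0 and watch it step through its l patterns.
s = xi[0, 0].copy()
for t in range(2 * l):
    s = np.sign(J @ s)
    s[s == 0] = 1
    m = xi[0] @ s / N                         # overlap with each step of cycle 0
    print(f"t={t:2d}  recalled step={m.argmax()}  overlap={m.max():.3f}")
```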
Storage Capacity Diverges with Synaptic Efficiency in an Associative Memory Model with Synaptic Delay and Pruning
It is known that the storage capacity per synapse increases with synaptic pruning
in a correlation-type associative memory model. However, the
storage capacity of the entire network then decreases. To overcome this
difficulty, we propose decreasing the connecting rate while keeping the total
number of synapses constant by introducing delayed synapses. In this paper, a
discrete synchronous-type model with both delayed synapses and synaptic pruning
is discussed as a concrete example of this proposal. First, we review the
Yanai-Kim theory using statistical neurodynamics. This theory
provides macrodynamical equations for the dynamics of a network with serial
delay elements. Next, exploiting the translational symmetry of these
equations, we re-derive the macroscopic steady-state equations of the model by
using the discrete Fourier transformation. The storage capacities are analyzed
quantitatively. Furthermore, two types of synaptic pruning are treated
analytically: random pruning and systematic pruning. As a result, it becomes
clear that for both types of pruning, the storage capacity increases as the
length of delay increases and the connecting rate of the synapses decreases,
the total number of synapses being held constant. Moreover, an interesting fact
becomes clear: under random pruning the storage capacity asymptotically
approaches a finite value, whereas under systematic pruning it diverges in
proportion to the logarithm of the length of delay. These results theoretically support the significance of
pruning following an overgrowth of synapses in the brain and strongly suggest
that the brain prefers to store dynamic attractors such as sequences and limit
cycles rather than equilibrium states.
Comment: 27 pages, 14 figures
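The sketch below illustrates, under assumed forms for the couplings and the pruning, how a sequence-storing network with serial delay elements and a fixed connecting rate c might be simulated (not the paper's code; the coupling rule and both pruning procedures are simplified illustrations):

```python
# Minimal sketch: L serial delay elements with pruned delayed synapses.
# Assumed delay-k couplings J^k_ij ~ sum_mu xi^{mu}_i xi^{mu-k-1}_j, so every
# delay line pushes the state toward the next pattern of the sequence.
# "random" pruning keeps a random subset of synapses; "systematic" pruning
# keeps those of largest magnitude; both keep the connecting rate c fixed.
import numpy as np

rng = np.random.default_rng(3)
N, p, L, c = 800, 40, 4, 0.5                  # neurons, patterns, delay length, connecting rate
xi = rng.choice([-1, 1], size=(p, N))

def delayed_couplings(pruning):
    Js = []
    for k in range(L):
        Jk = (xi.T @ xi[(np.arange(p) - k - 1) % p]) / N
        if pruning == "random":
            mask = rng.random((N, N)) < c
        else:                                 # systematic: keep the strongest |J|
            cut = np.quantile(np.abs(Jk), 1 - c)
            mask = np.abs(Jk) >= cut
        Js.append(np.where(mask, Jk, 0.0) / c)   # rescale so the mean input stays O(1)
    return Js

Js = delayed_couplings("systematic")
history = [xi[(-k) % p].copy() for k in range(L)]   # s(t), s(t-1), ..., primed on the sequence
for t in range(15):
    h = sum(Jk @ s_past for Jk, s_past in zip(Js, history))
    s = np.where(h >= 0, 1, -1)
    history = [s] + history[:-1]
    m = xi @ s / N
    print(f"t={t:2d}  recalled pattern={m.argmax()}  overlap={m.max():.3f}")
```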
Optimal modularity and memory capacity of neural reservoirs
The neural network is a powerful computing framework that has been exploited
by biological evolution and by humans for solving diverse problems. Although
the computational capabilities of neural networks are determined by their
structure, the current understanding of the relationships between a neural
network's architecture and function is still primitive. Here we reveal that a
neural network's modular architecture plays a vital role in determining the
neural dynamics and memory performance of a network of threshold neurons. In
particular, we demonstrate that there exists an optimal modularity for memory
performance, where a balance between local cohesion and global connectivity is
established, allowing optimally modular networks to remember longer. Our
results suggest that insights from dynamical analysis of neural networks and
information spreading processes can be leveraged to better design neural
networks, and may shed light on the brain's modular organization.
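A sketch of one way such a question can be probed numerically (an assumed setup, not the authors' code: the block-random wiring, the threshold dynamics and the delayed-recall readout are illustrative choices, and the numbers printed are not results from the paper):

```python
# Minimal sketch: a reservoir of binary threshold neurons whose recurrent graph
# has tunable modularity (mu = fraction of links leaving a module), probed with
# a standard delayed-recall memory task via a linear readout.
import numpy as np

rng = np.random.default_rng(4)
N, modules, k_avg = 400, 8, 10

def modular_weights(mu):
    """Random +-1 synapses; expected degree k_avg, a fraction mu between modules."""
    block = np.repeat(np.arange(modules), N // modules)
    same = block[:, None] == block[None, :]
    p_in  = (1 - mu) * k_avg / (N // modules)
    p_out = mu * k_avg / (N - N // modules)
    A = (rng.random((N, N)) < np.where(same, p_in, p_out)).astype(float)
    return A * rng.choice([-1.0, 1.0], size=(N, N))

def memory_capacity(W, T=2000, max_lag=20, theta=0.0):
    u = rng.choice([-1.0, 1.0], size=T)                 # random input sequence
    w_in = rng.choice([-1.0, 1.0], size=N)
    x, X = np.zeros(N), np.zeros((T, N))
    for t in range(T):
        x = (W @ x / np.sqrt(k_avg) + w_in * u[t] > theta).astype(float)
        X[t] = x
    # One linear readout per lag, fitted in a single least-squares solve.
    lags = np.arange(1, max_lag + 1)
    targets = np.stack([u[max_lag - k: T - k] for k in lags], axis=1)
    states = X[max_lag:]
    coefs, *_ = np.linalg.lstsq(states, targets, rcond=None)
    preds = states @ coefs
    r = [np.corrcoef(preds[:, j], targets[:, j])[0, 1] for j in range(len(lags))]
    return float(np.sum(np.square(r)))                  # sum of squared correlations over lags

for mu in [0.05, 0.2, 0.5, 0.9]:                        # from strongly modular to well mixed
    print(f"mu={mu:.2f}  memory capacity ~ {memory_capacity(modular_weights(mu)):.2f}")
```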
Non-Convex Multi-species Hopfield models
In this work we introduce a multi-species generalization of the Hopfield
model for associative memory, where neurons are divided into groups and both
inter-group and intra-group pairwise interactions are considered, with
different intensities. Thus, this system contains two of the main ingredients
of modern Deep neural network architectures: Hebbian interactions to store
patterns of information and multiple layers coding different levels of
correlations. The model is completely solvable in the low-load regime with a
suitable generalization of the Hamilton-Jacobi technique, even though the
Hamiltonian can be a non-definite quadratic form of the magnetizations. The
family of multi-species Hopfield models includes, as special cases, the
three-layer Restricted Boltzmann Machine (RBM) with a Gaussian hidden layer and
the Bidirectional Associative Memory (BAM) model.
Comment: This is a pre-print of an article published in J. Stat. Phys.
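A minimal sketch of such a model in the low-load regime (an assumed form, not the paper's code; the two species sizes, the intensity matrix J and the zero-temperature dynamics are illustrative choices, with J deliberately chosen non-definite):

```python
# Minimal sketch: a two-species Hopfield model at low load.  The energy is a
# quadratic form of the species (Mattis) magnetizations m^mu_a with intra-group
# (diagonal) and inter-group (off-diagonal) intensities J_ab; the example J is
# symmetric but not positive definite, i.e. the non-convex situation.
import numpy as np

rng = np.random.default_rng(5)
sizes = np.array([600, 400])                 # neurons per species
N, p = sizes.sum(), 3                        # low load: finitely many patterns
groups = np.repeat(np.arange(len(sizes)), sizes)
n = sizes / N
xi = rng.choice([-1, 1], size=(p, N))
J = np.array([[1.0, 1.5],
              [1.5, 0.2]])                   # assumed intensities, not a definite form

def magnetizations(s):
    """m[mu, a]: overlap of species a with pattern mu."""
    return np.stack([xi[:, groups == a] @ s[groups == a] / sizes[a]
                     for a in range(len(sizes))], axis=1)

def energy(s):
    m = magnetizations(s)
    return -0.5 * N * np.sum(m @ (J * np.outer(n, n)) * m)

def local_fields(s):
    g = magnetizations(s) @ (J * n).T        # g[mu, a] = sum_b J_ab n_b m[mu, b]
    return np.einsum('mi,mi->i', xi, g[:, groups])

# Retrieval: start from a corrupted pattern 0 and iterate zero-temperature updates.
s = np.where(rng.random(N) < 0.15, -xi[0], xi[0])
for t in range(6):
    s = np.sign(local_fields(s))
    s[s == 0] = 1
    print(f"t={t}  m^0 per species = {np.round(magnetizations(s)[0], 3)}  E/N = {energy(s)/N:.3f}")
```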