It is known that synaptic pruning increases the storage capacity per synapse in a
correlation-type associative memory model. However, the storage capacity of the
entire network then decreases. To overcome this difficulty, we propose introducing
delayed synapses so that the connecting rate can be decreased while the total
number of synapses is kept constant. In this paper, a discrete synchronous-type
model with both delayed synapses and their pruning is discussed as a concrete
example of this proposal. First, we explain the Yanai-Kim theory, which is based
on statistical neurodynamics. This theory
involves macrodynamical equations for the dynamics of a network with serial
delay elements. Next, exploiting the translational symmetry of these equations, we
re-derive the macroscopic steady-state equations of the model using the discrete
Fourier transformation. The storage capacities are analyzed
quantitatively. Furthermore, two types of synaptic pruning are treated
analytically: random pruning and systematic pruning. As a result, it becomes
clear that under both types of pruning, the storage capacity increases as the
length of delay increases and the connecting rate of the synapses decreases,
provided that the total number of synapses is kept constant. Moreover, an
interesting fact becomes clear: under random pruning, the storage capacity
asymptotically approaches 2/π. In contrast, under systematic pruning, the storage
capacity diverges in proportion to the logarithm of the length of delay, with
proportionality constant 4/π. These results theoretically support the significance of
pruning following an overgrowth of synapses in the brain and strongly suggest
that the brain prefers to store dynamic attractors such as sequences and limit
cycles rather than equilibrium states.
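
For reference, the two asymptotic results stated above can be written compactly as
follows; here $\alpha_c$ denotes the storage capacity of the whole network and $L$
the length of delay, notation assumed for this summary rather than fixed by the
abstract:
\[
  % assumed notation: \alpha_c = storage capacity of the network, L = length of delay
  \alpha_c \;\longrightarrow\; \frac{2}{\pi} \quad (L \to \infty,\ \text{random pruning}),
  \qquad
  \alpha_c \;\sim\; \frac{4}{\pi}\,\ln L \quad (\text{systematic pruning}).
\]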