
    Storage Capacity Diverges with Synaptic Efficiency in an Associative Memory Model with Synaptic Delay and Pruning

    It is known that synaptic pruning increases the storage capacity per synapse in a correlation-type associative memory model. However, the storage capacity of the entire network then decreases. To overcome this difficulty, we propose decreasing the connecting rate while keeping the total number of synapses constant by introducing delayed synapses. In this paper, a discrete synchronous-type model with both delayed synapses and their pruning is discussed as a concrete example of the proposal. First, we explain the Yanai-Kim theory by employing statistical neurodynamics. This theory involves macrodynamical equations for the dynamics of a network with serial delay elements. Next, exploiting the translational symmetry of these equations, we re-derive the macroscopic steady-state equations of the model using the discrete Fourier transformation. The storage capacities are analyzed quantitatively. Furthermore, two types of synaptic pruning are treated analytically: random pruning and systematic pruning. As a result, it becomes clear that for both types of pruning, the storage capacity increases as the length of delay increases and the connecting rate of the synapses decreases, provided the total number of synapses is kept constant. Moreover, an interesting fact becomes clear: under random pruning the storage capacity asymptotically approaches $2/\pi$. In contrast, under systematic pruning the storage capacity diverges in proportion to the logarithm of the length of delay, with proportionality constant $4/\pi$. These results theoretically support the significance of pruning following an overgrowth of synapses in the brain and strongly suggest that the brain prefers to store dynamic attractors such as sequences and limit cycles rather than equilibrium states. Comment: 27 pages, 14 figures
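    As a rough illustration of the kind of model being pruned (not the delayed-synapse network analyzed in the paper), the following minimal sketch builds a correlation-type (Hopfield-like) associative memory, randomly prunes its synapses at a fixed connecting rate, and checks retrieval via the overlap with a stored pattern. All sizes and parameters are illustrative assumptions.

```python
import numpy as np

# Sketch: correlation-type associative memory with random synaptic pruning.
rng = np.random.default_rng(0)
N, P, connect_rate = 500, 30, 0.5          # neurons, patterns, surviving fraction of synapses

xi = rng.choice([-1, 1], size=(P, N))      # random +/-1 patterns
J = (xi.T @ xi) / N                        # Hebbian (correlation-type) couplings
np.fill_diagonal(J, 0.0)

mask = rng.random((N, N)) < connect_rate   # random pruning: keep each synapse with prob c
J = J * mask / connect_rate                # rescale so the mean coupling strength is preserved

s = xi[0].copy()
s[: N // 10] *= -1                         # start from a corrupted version of pattern 0
for _ in range(20):                        # synchronous retrieval dynamics
    s = np.sign(J @ s)
    s[s == 0] = 1

overlap = (s @ xi[0]) / N                  # overlap close to 1 means pattern 0 was retrieved
print(f"overlap with stored pattern: {overlap:.3f}")
```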

    Unipolar terminal-attractor-based neural associative memory with adaptive threshold and perfect convergence

    A perfectly convergent unipolar neural associative-memory system based on nonlinear dynamical terminal attractors is presented. With adaptive setting of the threshold values for the dynamic iteration of the unipolar binary neuron states with terminal attractors, perfect convergence is achieved. This achievement and correct retrieval are demonstrated by computer simulation. The simulations are completed (1) by exhaustive tests with all of the possible combinations of stored and test vectors in small-scale networks and (2) by Monte Carlo simulations with randomly generated stored and test vectors in large-scale networks with an M/N ratio of 4 (M is the number of stored vectors; N is the number of neurons, N < 256). An experiment with exclusive-OR logic operations with liquid-crystal-television spatial light modulators is used to show the feasibility of an optoelectronic implementation of the model. The behavior of terminal attractors in basins of energy space is illustrated by examples.
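    For readers unfamiliar with unipolar coding, the sketch below iterates a plain 0/1 associative memory with an adaptive threshold. It is not the terminal-attractor dynamics of the paper; the mean-input threshold rule and all parameters are assumptions chosen only for illustration.

```python
import numpy as np

# Sketch: unipolar (0/1) associative memory with an adaptive threshold.
rng = np.random.default_rng(1)
N, M = 64, 8                                   # neurons, stored vectors
V = rng.integers(0, 2, size=(M, N))            # unipolar stored vectors

# Correlation-type weights for 0/1 codes (mean-subtracted outer products).
p = V.mean()
W = (V - p).T @ (V - p) / N
np.fill_diagonal(W, 0.0)

def retrieve(u, steps=30):
    u = u.astype(float)
    for _ in range(steps):
        h = W @ u
        theta = h.mean()                       # adaptive threshold: track the mean input
        u = (h > theta).astype(float)
    return u

probe = V[0].copy()
probe[:6] ^= 1                                 # flip a few bits of stored vector 0
out = retrieve(probe)
print("bits recovered:", int((out == V[0]).sum()), "of", N)
```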

    Phase Transitions of an Oscillator Neural Network with a Standard Hebb Learning Rule

    Studies have been made on the phase transition phenomena of an oscillator network model based on a standard Hebb learning rule, like the Hopfield model. The relative phase information, namely the in-phase and anti-phase relations, can be embedded in the network. By self-consistent signal-to-noise analysis (SCSNA), it was found that the storage capacity is given by $\alpha_c = 0.042$, which is better than that of Cook's model; however, the retrieval quality is worse. In addition, an investigation was made into an acceleration effect caused by the asymmetry of the phase dynamics. Finally, it was shown numerically that the storage capacity can be improved by modifying the shape of the coupling function. Comment: 10 pages, 6 figures
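    A standard way to realize such an oscillator associative memory is Kuramoto-type phase dynamics with Hebbian couplings built from 0/π phase patterns (in-phase and anti-phase). The sketch below follows that textbook construction; it is not the authors' specific model, and the coupling function and parameters are illustrative assumptions.

```python
import numpy as np

# Sketch: phase-oscillator associative memory with a standard Hebb rule.
# Couplings J_ij = (1/N) * sum_mu cos(xi_i^mu - xi_j^mu); Kuramoto-type dynamics.
rng = np.random.default_rng(2)
N, P = 200, 5
xi = rng.choice([0.0, np.pi], size=(P, N))          # binary phase patterns (in-/anti-phase)

J = np.zeros((N, N))
for mu in range(P):
    J += np.cos(xi[mu][:, None] - xi[mu][None, :])  # Hebbian phase coupling
J /= N
np.fill_diagonal(J, 0.0)

phi = xi[0] + 0.3 * rng.standard_normal(N)          # noisy version of pattern 0
dt = 0.1
for _ in range(300):
    # d(phi_i)/dt = sum_j J_ij * sin(phi_j - phi_i)
    phi += dt * (J * np.sin(phi[None, :] - phi[:, None])).sum(axis=1)

m = abs(np.exp(1j * (phi - xi[0])).mean())          # retrieval overlap (close to 1 means recall)
print(f"overlap: {m:.3f}")
```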

    An associative memory of Hodgkin-Huxley neuron networks with Willshaw-type synaptic couplings

    An associative memory is discussed for neural networks consisting of N (= 100) spiking Hodgkin-Huxley (HH) neurons with time-delayed couplings, which memorize P patterns in their synaptic weights. In addition to excitatory synapses, whose strengths are modified according to the Willshaw-type learning rule with the 0/1 code for quiescent/active states, the network includes uniform inhibitory synapses introduced to reduce cross-talk noise. Our simulations of the HH neuron network in the noise-free state show fairly good performance, with a storage capacity of $\alpha_c = P_{\rm max}/N \sim 0.4$-$2.4$ for the low neuron activity of $f \sim 0.04$-$0.10$. This storage capacity of our temporal-code network is comparable to that of the rate-code model with Willshaw-type synapses. Our HH neuron network turns out not to be vulnerable to the distribution of time delays in the couplings. The variability of the interspike interval (ISI) of output spike trains during retrieval of the stored patterns is also discussed. Comment: 15 pages, 3 figures, changed Title
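    The Willshaw-type learning rule itself is simple to state: a binary synapse is switched on whenever its pre- and postsynaptic neurons are co-active in some stored pattern. The sketch below applies that rule with simple binary threshold units instead of Hodgkin-Huxley neurons; the network size, activity f, and retrieval threshold are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Sketch: Willshaw-type (clipped Hebbian) learning with sparse 0/1 patterns.
rng = np.random.default_rng(3)
N, P, f = 100, 20, 0.1
k = int(f * N)                                   # active neurons per pattern

patterns = np.zeros((P, N), dtype=int)
for mu in range(P):
    patterns[mu, rng.choice(N, size=k, replace=False)] = 1

# Willshaw rule: a synapse is 1 if pre and post were ever co-active, else 0.
W = np.zeros((N, N), dtype=int)
for mu in range(P):
    W |= np.outer(patterns[mu], patterns[mu])

cue = patterns[0].copy()
cue[np.where(cue == 1)[0][: k // 2]] = 0         # present half of pattern 0 as the cue
h = W @ cue
out = (h >= cue.sum()).astype(int)               # fire iff connected to every active cue bit

print("true positives:", int((out & patterns[0]).sum()),
      "false positives:", int((out & (1 - patterns[0])).sum()))
```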