
    Response Functions Improving Performance in Analog Attractor Neural Networks

    In the context of attractor neural networks, we study how the equilibrium analog neural activities, reached by the network dynamics during memory retrieval, may improve storage performance by reducing the interference between the recalled pattern and the other stored ones. We determine a simple dynamics that stabilizes network states which are highly correlated with the retrieved pattern, for a number of stored memories that does not exceed $\alpha_\star N$, where $\alpha_\star \in [0, 0.41]$ depends on the global activity level in the network and $N$ is the number of neurons.
    Comment: 13 pages (with figures), LaTeX (RevTeX), to appear in Phys. Rev. E (RC)
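    The Hebbian storage and fixed-point retrieval that this line of work builds on can be sketched with a minimal Hopfield-type network of binary units; all parameter values below are illustrative assumptions, not taken from the paper (which studies analog activities and modified response functions):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    N = 200   # number of neurons
    P = 10    # stored patterns, well below capacity for this toy setting

    # Random binary (+/-1) patterns and Hebbian couplings, zero self-coupling
    patterns = rng.choice([-1, 1], size=(P, N))
    J = (patterns.T @ patterns) / N
    np.fill_diagonal(J, 0.0)

    def retrieve(state, sweeps=10):
        """Asynchronous sign-function dynamics; symmetric J guarantees convergence."""
        state = state.copy()
        for _ in range(sweeps):
            changed = False
            for i in rng.permutation(N):
                s = 1 if J[i] @ state >= 0 else -1
                if s != state[i]:
                    state[i], changed = s, True
            if not changed:
                break
        return state

    # Corrupt 10% of a stored pattern, then retrieve it
    cue = patterns[0].copy()
    flip = rng.choice(N, size=N // 10, replace=False)
    cue[flip] *= -1
    overlap = retrieve(cue) @ patterns[0] / N
    print(overlap)  # close to 1: the pattern is recovered
    ```

    The overlap between the retrieved state and the stored pattern plays the role of the correlation measure discussed in the abstract; reducing crosstalk from the other stored patterns is exactly what raises the achievable load $\alpha_\star$.
    
    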

    Storage Capacity Diverges with Synaptic Efficiency in an Associative Memory Model with Synaptic Delay and Pruning

    It is known that the storage capacity per synapse increases with synaptic pruning in the case of a correlation-type associative memory model. However, the storage capacity of the entire network then decreases. To overcome this difficulty, we propose decreasing the connecting rate while keeping the total number of synapses constant by introducing delayed synapses. In this paper, a discrete synchronous-type model with both delayed synapses and their pruning is discussed as a concrete example of the proposal. First, we explain the Yanai-Kim theory by employing statistical neurodynamics. This theory involves macrodynamical equations for the dynamics of a network with serial delay elements. Next, considering the translational symmetry of these equations, we re-derive the macroscopic steady-state equations of the model by using the discrete Fourier transformation. The storage capacities are analyzed quantitatively. Furthermore, two types of synaptic pruning are treated analytically: random pruning and systematic pruning. As a result, it becomes clear that in both cases the storage capacity increases as the length of delay increases and the connecting rate of the synapses decreases, when the total number of synapses is constant. Moreover, an interesting fact becomes clear: with random pruning, the storage capacity asymptotically approaches $2/\pi$. In contrast, with systematic pruning the storage capacity diverges in proportion to the logarithm of the length of delay, with proportionality constant $4/\pi$. These results theoretically support the significance of pruning following an overgrowth of synapses in the brain and strongly suggest that the brain prefers to store dynamic attractors such as sequences and limit cycles rather than equilibrium states.
    Comment: 27 pages, 14 figures
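    The random-pruning setting can be illustrated with a toy diluted Hopfield-type network; the symmetric dilution mask, the rescaling by the connecting rate, and all parameter values are assumptions for this sketch, not the paper's delayed-synapse model:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    N, P = 400, 8
    patterns = rng.choice([-1, 1], size=(P, N))

    J = (patterns.T @ patterns) / N
    np.fill_diagonal(J, 0.0)

    # Random pruning: keep each synapse with probability c (the connecting rate),
    # symmetrically, rescaling by 1/c so the mean retrieval signal is preserved.
    c = 0.5
    upper = np.triu(rng.random((N, N)) < c, 1)
    mask = upper | upper.T
    J_pruned = np.where(mask, J / c, 0.0)

    def overlap_after_retrieval(J, pattern, noise_frac=0.1, sweeps=10):
        """Corrupt the pattern, run asynchronous dynamics, return the final overlap."""
        state = pattern.copy()
        flip = rng.choice(N, size=int(noise_frac * N), replace=False)
        state[flip] *= -1
        for _ in range(sweeps):
            changed = False
            for i in rng.permutation(N):
                s = 1 if J[i] @ state >= 0 else -1
                if s != state[i]:
                    state[i], changed = s, True
            if not changed:
                break
        return state @ pattern / N

    m_full = overlap_after_retrieval(J, patterns[0])
    m_pruned = overlap_after_retrieval(J_pruned, patterns[0])
    print(m_full, m_pruned)  # both close to 1 at this low load
    ```

    At this load the diluted network still retrieves, while the load per *surviving* synapse has doubled: that is the "capacity per synapse increases" effect the abstract starts from, and the paper's proposal compensates the lost total synapses with delayed ones.
    
    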

    The path-integral analysis of an associative memory model storing an infinite number of finite limit cycles

    An exact solution of the transient dynamics of an associative memory model storing an infinite number of limit cycles with $l$ finite steps is obtained by means of the path-integral analysis. Assuming the Maxwell construction ansatz, we have succeeded in deriving the stationary state equations of the order parameters from the macroscopic recursive equations for the finite-step sequence processing model, which has retarded self-interactions. We have also derived the stationary state equations by means of the signal-to-noise analysis (SCSNA). The signal-to-noise analysis must assume that the crosstalk noise of an input to the spins obeys a Gaussian distribution. On the other hand, the path-integral method does not require such a Gaussian approximation of the crosstalk noise. We have found that the signal-to-noise analysis and the path-integral analysis give exactly the same result for the stationary state in the case where the dynamics is deterministic, when we assume the Maxwell construction ansatz. We have shown the dependence of the storage capacity $\alpha_c$ on the number of patterns per limit cycle, $l$. The storage capacity increases monotonically with the number of steps and converges to $\alpha_c = 0.269$ at $l \simeq 10$. The characteristic properties of the finite-step sequence processing model appear as long as the number of steps of the limit cycle is of order $l = O(1)$.
    Comment: 24 pages, 3 figures
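    The limit-cycle storage underlying such sequence processing models can be sketched with the standard asymmetric Hebbian rule that maps each pattern to its successor; this is a generic illustration under assumed parameters, not the paper's path-integral machinery:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    N, l = 300, 4   # N neurons, one stored limit cycle of l patterns

    xi = rng.choice([-1, 1], size=(l, N))

    # Asymmetric Hebbian couplings: each pattern drives its successor,
    # with xi[l-1] -> xi[0] closing the cycle.
    J = sum(np.outer(xi[(mu + 1) % l], xi[mu]) for mu in range(l)) / N

    # Synchronous dynamics from a noisy version of xi[0] should step
    # through the whole cycle, one pattern per time step.
    state = xi[0].copy()
    flip = rng.choice(N, size=N // 20, replace=False)
    state[flip] *= -1

    overlaps = []
    for t in range(1, l + 1):
        state = np.sign(J @ state)
        state[state == 0] = 1
        overlaps.append(state @ xi[t % l] / N)
    print(overlaps)  # each entry close to 1: the cycle is traversed
    ```

    At finite load the crosstalk from the other stored cycles is exactly the noise term whose statistics the SCSNA approximates as Gaussian and the path-integral treatment handles exactly.
    
    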

    Bayesian retrieval in associative memories with storage errors


    A habituation account of change detection in same/different judgments

    We investigated the basis of change detection in a short-term priming task. In two experiments, participants were asked to indicate whether or not a target word was the same as a previously presented cue. Data from a magnetoencephalography experiment failed to reveal different patterns for “same” and “different” responses, consistent with the claim that both arise from a common neural source, with response magnitude defining the difference between immediate novelty versus familiarity. In a behavioral experiment, we tested and confirmed the predictions of a habituation account of these judgments by comparing conditions in which the target, the cue, or neither was primed by its presentation in the previous trial. As predicted, cue-primed trials had faster response times, and target-primed trials had slower response times, relative to the neither-primed baseline. These results were obtained irrespective of response repetition and stimulus–response contingencies. The behavioral and brain activity data support the view that detection of change drives performance in these tasks and that the underlying mechanism is neuronal habituation.

    Selective retrieval of memory and concept sequences through neuro-windows

    This letter presents a cross-correlational associative memory model which realizes selective retrieval of pattern sequences. When hierarchically correlated sequences are memorized, the sequences of their correlational centers can be defined as concept sequences. The authors propose a modified neuro-window method which enables selective retrieval of memory sequences and concept sequences. It is also shown that the proposed model realizes a capacity expansion of the memory which stores random sequences.

    Neural Correlates of Learning in the Prefrontal Cortex of the Monkey: A Predictive Model

    The principles underlying the organization and operation of the prefrontal cortex have been addressed by neural network modeling. The involvement of the prefrontal cortex in the temporal organization of behavior can be defined by processing units that switch between two stable states of activity (bistable behavior) in response to synaptic inputs. Long-term representation of programs requiring short-term memory can result from activity-dependent modifications of the synaptic transmission controlling the bistable behavior. After learning, the sustained activity of a given neuron represents the selective memorization of a past event, the selective anticipation of a future event, and the predictability of reinforcement. A simulated neural network illustrates the abilities of the model (1) to learn, via a natural step-by-step training protocol, the paradigmatic task (delayed response) used for testing prefrontal neurons in primates, (2) to display the same categories of neuronal activities, and (3) to predict how they change during learning. In agreement with experimental data, two main types of activity contribute to the adaptive properties of the network. The first is transient activity time-locked to events of the task, whose profile remains constant during successive training stages. The second is sustained activity that undergoes nonmonotonic changes with changes in reward contingency during the transition between stages.
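    The bistable processing units described above can be illustrated by a discrete-time rate unit whose self-excitation is strong enough to sustain activity after a transient input, giving a minimal short-term-memory mechanism; the parameter values are illustrative assumptions, not the paper's model:

    ```python
    import numpy as np

    def sigmoid(u):
        return 1.0 / (1.0 + np.exp(-u))

    def run(pulse_steps, total=30, w_self=8.0, theta=4.0, pulse=4.0):
        """Rate unit x(t+1) = sigmoid(w_self*x(t) - theta + input).

        With w_self this strong the unit has two stable fixed points
        (near 0 and near 1), so a brief input pulse can switch it from
        the low state to the high state, where it then stays.
        """
        x = 0.02  # start in the low stable state
        trace = []
        for t in range(total):
            inp = pulse if t in pulse_steps else 0.0
            x = sigmoid(w_self * x - theta + inp)
            trace.append(x)
        return trace

    trace = run(pulse_steps={5, 6, 7})
    print(round(trace[4], 3), round(trace[-1], 3))  # low before the pulse, high after
    ```

    The persistent high state after the pulse has ended is the "sustained activity" in the abstract; activity-dependent changes to `w_self` or `theta` would play the role of the learning rule that shapes which events get memorized.
    
    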