
    Statics and dynamics of an Ashkin-Teller neural network with low loading

    An Ashkin-Teller neural network, allowing for two types of neurons, is considered in the case of low loading as a function of the strength of the respective couplings between these neurons. The storage and retrieval of embedded patterns built from the two types of neurons, with different degrees of (in)dependence, are studied. In particular, thermodynamic properties, including the existence and stability of Mattis states, are discussed. Furthermore, the dynamic behaviour is examined by deriving flow equations for the macroscopic overlap. It is found that for linked patterns the model shows better retrieval properties than a corresponding Hopfield model. Comment: 20 pages, 6 figures, LaTeX with postscript figures in one tar.gz file
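    The retrieval behaviour described above is usually tracked through the macroscopic overlap between the network state and a stored pattern. Below is a minimal sketch of that quantity for a plain Hopfield network at low loading, which the abstract uses as the benchmark; the two neuron types and the Ashkin-Teller coupling between them are not reproduced here, and all sizes and parameters are illustrative assumptions.

        import numpy as np

        # Low-loading Hopfield retrieval, monitored through the overlap
        # m(t) = (1/N) * sum_i xi_i * S_i(t) with one stored pattern.
        rng = np.random.default_rng(0)
        N, P = 500, 3                                     # low loading: P << N
        xi = rng.choice([-1, 1], size=(P, N))             # random binary patterns
        J = (xi.T @ xi) / N                               # Hebbian couplings
        np.fill_diagonal(J, 0.0)                          # no self-coupling

        S = np.where(rng.random(N) < 0.8, xi[0], -xi[0])  # noisy copy of pattern 0
        for t in range(10):
            m = xi[0] @ S / N                             # overlap with pattern 0
            print(f"t={t}  m={m:+.3f}")
            S = np.sign(J @ S + 1e-12)                    # synchronous zero-T update

        # A Mattis (retrieval) state corresponds to m close to 1 for one pattern
        # and vanishing overlaps with all the others.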

    Multiplicative versus additive noise in multi-state neural networks

    The effects of a variable amount of random dilution of the synaptic couplings in Q-Ising multi-state neural networks with Hebbian learning are examined. A fraction of the couplings is explicitly allowed to be anti-Hebbian. Random dilution represents the dying or pruning of synapses and, hence, a static disruption of the learning process, which can be considered as a form of multiplicative noise in the learning rule. Both parallel and sequential updating of the neurons can be treated. Symmetric dilution in the statics of the network is studied using the mean-field theory approach of statistical mechanics. General dilution, including asymmetric pruning of the couplings, is examined using the generating functional (path integral) approach of disordered systems. It is shown that random dilution acts as additive Gaussian noise in the Hebbian learning rule, with zero mean and a variance depending on the connectivity of the network and on the symmetry. Furthermore, a scaling factor appears that essentially measures the average amount of anti-Hebbian couplings. Comment: 15 pages, 5 figures, to appear in the proceedings of the Conference on Noise in Complex Systems and Stochastic Dynamics II (SPIE International
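    The statement that dilution acts as additive Gaussian noise on the Hebbian rule can be illustrated numerically. The sketch below uses binary instead of Q-Ising patterns and a simple independent dilution mask, with a fraction of the surviving bonds made anti-Hebbian; the parameters and the scaling factor kappa are illustrative assumptions rather than the paper's exact expressions.

        import numpy as np

        # Diluted Hebbian couplings: a fraction (1 - c) of bonds is cut and a
        # fraction f_anti of the surviving bonds is sign-flipped (anti-Hebbian).
        rng = np.random.default_rng(1)
        N, P = 1000, 50
        c, f_anti = 0.6, 0.1                            # connectivity, anti-Hebbian fraction

        xi = rng.choice([-1, 1], size=(P, N))
        J_hebb = (xi.T @ xi) / N                        # dense Hebbian couplings
        mask = (rng.random((N, N)) < c).astype(float)   # multiplicative dilution noise
        sign = np.where(rng.random((N, N)) < f_anti, -1.0, 1.0)
        J = mask * sign * J_hebb

        # Effective description: J ~ kappa * J_hebb + zero-mean additive noise,
        # where kappa reflects the average amount of anti-Hebbian couplings.
        kappa = c * (1 - 2 * f_anti)
        noise = J - kappa * J_hebb
        print("scaling factor kappa:", kappa)
        print("noise mean (close to 0):", noise.mean())
        print("noise std (depends on connectivity):", noise.std())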

    Retrieval Properties of Hopfield and Correlated Attractors in an Associative Memory Model

    We examine a previously introduced attractor neural network model that explains the persistent activities of neurons in the anterior ventral temporal cortex of the brain. In this model, the coexistence of several attractors, including correlated attractors, was reported in the cases of finite and infinite loading. In this paper, by means of a statistical mechanical method, we study the statics and dynamics of the model in both finite and extensive loading, focusing mainly on the retrieval properties of the Hopfield and correlated attractors. In the extensive loading case, we derive the evolution equations by the dynamical replica theory. We find several characteristic temporal behaviours in both the finite and extensive loading cases. The theoretical results are confirmed by numerical simulations. Comment: 12 pages, 7 figures
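    Correlated attractors are commonly generated by adding couplings between consecutive patterns to the Hopfield matrix. The sketch below uses a Griniasty-Tsodyks-Amit-type coupling for that purpose; it is an assumption standing in for the paper's specific model, and the parameter a and all sizes are illustrative.

        import numpy as np

        # Retrieval monitored through the overlaps with all stored patterns:
        # a Hopfield attractor has overlap ~1 with a single pattern, while a
        # correlated attractor also has sizeable overlaps with its neighbours.
        rng = np.random.default_rng(2)
        N, P, a = 2000, 10, 0.7                         # a couples consecutive patterns
        xi = rng.choice([-1, 1], size=(P, N)).astype(float)

        J = xi.T @ xi                                   # Hopfield (Hebbian) part
        for mu in range(P):
            nxt = (mu + 1) % P
            J += a * (np.outer(xi[mu], xi[nxt]) + np.outer(xi[nxt], xi[mu]))
        J /= N
        np.fill_diagonal(J, 0.0)

        S = xi[0].copy()                                # start on pattern 0
        for t in range(15):
            S = np.sign(J @ S + 1e-12)                  # synchronous zero-T dynamics

        m = xi @ S / N                                  # overlap profile over patterns
        print(np.round(m, 2))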

    Statistical Mechanics of Dilute Batch Minority Games with Random External Information

    We study the dynamics and statics of a dilute batch minority game with random external information. We focus on the case in which the number of connections per agent is infinite in the thermodynamic limit. The dynamical scenario of ergodicity breaking in this model is different from the phase transition in the standard minority game and is characterised by the onset of long-term memory at finite integrated response. We demonstrate that finite memory appears at the AT-line obtained from the corresponding replica calculation, and compare the behaviour of the dilute model with the minority game with market impact correction, which is known to exhibit similar features. Comment: 22 pages, 6 figures, text modified, references updated and added, figure added, typos corrected
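    For reference, the sketch below implements a standard fully connected batch minority game with random external information; the dilution studied in the paper is not included, the normalisations follow one common convention, and all parameters are illustrative assumptions.

        import numpy as np

        # Batch minority game: each agent holds two random strategy tables and
        # plays the one favoured by its score difference q_i; the external
        # information mu is drawn at random, and updates average over all mu.
        rng = np.random.default_rng(3)
        N, alpha, steps = 200, 0.5, 300
        P = int(alpha * N)                              # number of information patterns

        R = rng.choice([-1, 1], size=(2, N, P)).astype(float)
        omega = 0.5 * (R[0] + R[1])                     # common part of the strategies
        xi = 0.5 * (R[0] - R[1])                        # differing part
        q = rng.normal(0.0, 1e-2, size=N)               # strategy-score differences

        vol = []
        for t in range(steps):
            s = np.sign(q)                              # strategy actually played
            A = (omega.sum(axis=0) + xi.T @ s) / np.sqrt(N)   # bid for each mu
            q = q - (xi @ A) / P                        # batch (averaged) score update
            vol.append(np.mean(A ** 2))

        # The volatility <A^2> is the standard order parameter of the game.
        print("late-time volatility:", np.mean(vol[-50:]))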

    Synchronous versus sequential updating in the three-state Ising neural network with variable dilution

    The three-state Ising neural network with synchronous updating and variable dilution is discussed starting from the appropriate Hamiltonians. The thermodynamic and retrieval properties are examined using replica mean-field theory. Capacity-temperature phase diagrams are derived for several values of the pattern activity and different gradations of dilution, and the information content is calculated. The results are compared with those for sequential updating. The effect of self-coupling is established. The dynamics is also studied, using the generating function technique, for both synchronous and sequential updating. Typical flow diagrams for the overlap order parameter are presented. The differences with the signal-to-noise approach are outlined. Comment: 21 pages LaTeX, 12 eps figures and 1 ps figure
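    As a companion to the abstract, the sketch below contrasts synchronous and sequential zero-temperature updates in a simple three-state (spins in {-1, 0, +1}) Hebbian network; the gain parameter b, the absence of dilution and self-coupling, and all sizes are illustrative assumptions rather than the paper's setup.

        import numpy as np

        rng = np.random.default_rng(4)
        N, P, act, b = 400, 3, 0.8, 0.4                 # act = pattern activity

        # Three-state patterns with P(+1) = P(-1) = act/2 and P(0) = 1 - act.
        xi = rng.choice([-1, 0, 1], size=(P, N), p=[act / 2, 1 - act, act / 2]).astype(float)
        J = (xi.T @ xi) / (act * N)                     # Hebbian couplings
        np.fill_diagonal(J, 0.0)

        def update(S, sequential):
            S = S.copy()
            if sequential:
                for i in rng.permutation(N):            # one sweep, random order
                    h = J[i] @ S
                    S[i] = np.sign(h) * (abs(h) > b)    # zero-T three-state rule
            else:
                h = J @ S                               # all neurons at once
                S = np.sign(h) * (np.abs(h) > b)
            return S

        for sequential in (False, True):
            S = np.where(rng.random(N) < 0.7, xi[0], 0.0)  # noisy copy of pattern 0
            for t in range(8):
                S = update(S, sequential)
            m = xi[0] @ S / (act * N)                   # retrieval overlap
            print("sequential" if sequential else "synchronous", f"m = {m:+.3f}")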