218 research outputs found

    Synchronous versus sequential updating in the three-state Ising neural network with variable dilution

    The three-state Ising neural network with synchronous updating and variable dilution is discussed starting from the appropriate Hamiltonians. The thermodynamic and retrieval properties are examined using replica mean-field theory. Capacity-temperature phase diagrams are derived for several values of the pattern activity and different gradations of dilution, and the information content is calculated. The results are compared with those for sequential updating. The effect of self-coupling is established. The dynamics is also studied using the generating-function technique for both synchronous and sequential updating. Typical flow diagrams for the overlap order parameter are presented. The differences from the signal-to-noise approach are outlined. Comment: 21 pages LaTeX, 12 eps figures and 1 ps figure
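
    A minimal simulation sketch may help fix the setting: a fully connected network of three-state neurons σ ∈ {−1, 0, +1} with Hebbian couplings, updated either all at once (synchronous) or one neuron at a time (sequential). The zero-temperature gain rule s = argmax_s (s·h − b·s²), the threshold b, and all parameter values are illustrative assumptions, not the paper's exact prescription (the paper works at finite temperature with replica theory).

```python
import numpy as np

rng = np.random.default_rng(0)
N, P, a, b = 500, 10, 0.5, 0.2    # neurons, patterns, pattern activity, gain threshold

# Ternary patterns: +1/-1 with probability a/2 each, 0 with probability 1 - a
xi = rng.choice([-1, 0, 1], size=(P, N), p=[a / 2, 1 - a, a / 2])
J = (xi.T @ xi) / (a * N)         # Hebbian couplings
np.fill_diagonal(J, 0.0)          # drop self-coupling (its effect is studied in the paper)

STATES = np.array([-1, 0, 1])

def update(h):
    """Zero-temperature gain rule: pick the state maximizing s*h - b*s**2."""
    h = np.atleast_1d(h)
    return STATES[np.argmax(np.outer(h, STATES) - b * STATES**2, axis=1)]

def overlap(s):
    return (xi[0] @ s) / (a * N)  # retrieval overlap with pattern 0

# Start both dynamics from the same noisy version of pattern 0
s_sync = np.where(rng.random(N) < 0.8, xi[0], rng.choice(STATES, N))
s_seq = s_sync.copy()

for t in range(10):
    s_sync = update(J @ s_sync)             # synchronous: all neurons at once
    for i in rng.permutation(N):            # sequential: one neuron at a time
        s_seq[i] = update(J[i] @ s_seq)[0]
    print(t, round(overlap(s_sync), 3), round(overlap(s_seq), 3))
```

    With symmetric couplings, synchronous updates can settle into two-cycles while sequential updates relax toward fixed points, which is one reason the two dynamics are compared.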

    A layered neural network with three-state neurons optimizing the mutual information

    The time evolution of an exactly solvable layered feedforward neural network with three-state neurons optimizing the mutual information is studied for arbitrary synaptic noise (temperature). Detailed stationary temperature-capacity and capacity-activity phase diagrams are obtained. The model exhibits pattern retrieval, pattern-fluctuation retrieval and spin-glass phases. An improved performance is found, in the form of both a larger critical capacity and a higher information content, compared with three-state Ising-type layered network models. Flow diagrams reveal that saddle-point solutions associated with the fluctuation overlaps considerably slow down the flow of the network states towards the stable fixed points. Comment: 17 pages LaTeX including 6 eps figures
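
    Since the mutual information between a neuron's state and the stored pattern is the figure of merit here and in several of the papers below, a generic empirical estimator may be a useful reference. This is the textbook definition applied to a joint table, not code from the paper.

```python
import numpy as np

def mutual_information(joint, eps=1e-12):
    """I(sigma; xi) in bits, computed from a joint probability table p(sigma, xi)."""
    joint = np.asarray(joint, dtype=float)
    joint = joint / joint.sum()                # normalize counts to probabilities
    p_s = joint.sum(axis=1, keepdims=True)     # marginal over the pattern
    p_x = joint.sum(axis=0, keepdims=True)     # marginal over the neuron state
    mask = joint > eps
    return float((joint[mask] * np.log2(joint[mask] / (p_s @ p_x)[mask])).sum())

# Example: a noiseless ternary channel (state == pattern) at activity a = 0.4
a = 0.4
joint = np.diag([a / 2, 1 - a, a / 2])         # rows: sigma; columns: xi in {-1, 0, +1}
print(mutual_information(joint))               # equals the pattern entropy H(xi)
```

    For a noiseless channel the mutual information reduces to the pattern entropy; synaptic noise and finite capacity reduce it, which is what the phase diagrams track.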

    The Blume-Emery-Griffiths neural network: dynamics for arbitrary temperature

    The parallel dynamics of the fully connected Blume-Emery-Griffiths neural network model is studied for arbitrary temperature. By employing a probabilistic signal-to-noise approach, a recursive scheme is found that determines the time evolution of the distribution of the local fields and, hence, the evolution of the order parameters. This approach is compared with the generating-functional method, which allows any physically relevant quantity to be calculated as a function of time. Explicit analytic formulas are given for both methods for the first few time steps of the dynamics. Up to the third time step the results are identical. Arguments are presented as to why the results differ beyond the third time step for certain values of the model parameters. Furthermore, fixed-point equations are derived in the stationary limit. Numerical simulations confirm our theoretical findings. Comment: 26 pages in LaTeX, 8 eps figures
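
    For concreteness, one parallel update step at inverse temperature β can be sketched as follows. The Blume-Emery-Griffiths network carries two local fields per neuron, one coupling to σ and one to σ²; the Hebbian forms and normalizations below are assumptions patterned on this model class, not the paper's exact definitions.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P, a, beta = 400, 8, 0.5, 5.0               # network size, patterns, activity, 1/T

xi = rng.choice([-1, 0, 1], size=(P, N), p=[a / 2, 1 - a, a / 2])
eta = xi**2 - a                                # "activity" part of the patterns
J = (xi.T @ xi) / (a * N)                      # couples the sigma_i sigma_j term
K = (eta.T @ eta) / ((a * (1 - a)) ** 2 * N)   # couples the sigma_i^2 sigma_j^2 term
np.fill_diagonal(J, 0.0); np.fill_diagonal(K, 0.0)

STATES = np.array([-1, 0, 1])

def parallel_step(sigma):
    """One synchronous update: sample every neuron from its Boltzmann weights."""
    h = J @ sigma                              # ordinary local field
    theta = K @ sigma**2                       # "activity" local field
    # Boltzmann weights for E(s) = -(s*h + s^2*theta): one row per neuron
    logits = beta * (np.outer(h, STATES) + np.outer(theta, STATES**2))
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    u = rng.random((N, 1))
    return STATES[(p.cumsum(axis=1) < u).sum(axis=1)]   # inverse-CDF sampling

sigma = xi[0].copy()                           # start on pattern 0
for t in range(5):
    sigma = parallel_step(sigma)
    print(t, (xi[0] @ sigma) / (a * N))        # retrieval overlap
```

    Iterating this step and tracking the distribution of (h, θ), including the feedback correlations it builds up, is what the recursive scheme formalizes.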

    Parallel dynamics of the fully connected Blume-Emery-Griffiths neural network

    The parallel dynamics of the fully connected Blume-Emery-Griffiths neural network model is studied at zero temperature using a probabilistic approach. A recursive scheme is found that determines the complete time evolution of the order parameters, taking into account all feedback correlations. It is based upon the evolution of the distribution of the local field, the structure of which is determined in detail. As an illustrative example, explicit analytic formulas are given for the first few time steps of the dynamics. Furthermore, equilibrium fixed-point equations are derived and compared with the thermodynamic approach. The analytic results find excellent confirmation in extensive numerical simulations. Comment: 22 pages, 12 figures

    Mutual information and self-control of a fully-connected low-activity neural network

    A self-control mechanism for the dynamics of a three-state fully connected neural network is studied through the introduction of a time-dependent threshold. The self-adapting threshold is a function of both the neural and the pattern activity in the network. The time evolution of the order parameters is obtained on the basis of a recently developed dynamical recursive scheme. In the limit of low activity, the mutual information is shown to be the relevant parameter for determining the retrieval quality. Due to self-control, an improvement of the mutual information content is found, as well as an increase of the storage capacity and an enlargement of the basins of attraction. These results are compared with numerical simulations. Comment: 8 pages, 8 ps figures
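
    The self-control idea can be illustrated with a toy loop in which the threshold is recomputed at every step from the network's own activity. The specific adaptive rule below, a threshold scaling like √(−2 ln a · α q(t)), is patterned on low-activity arguments in this literature but should be read as a hypothetical stand-in for the paper's prescription; all names and parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P, a = 1000, 50, 0.1                        # low-activity regime
alpha = P / N                                  # storage load

xi = rng.choice([-1, 0, 1], size=(P, N), p=[a / 2, 1 - a, a / 2])
J = (xi.T @ xi) / (a * N)                      # Hebbian couplings
np.fill_diagonal(J, 0.0)

STATES = np.array([-1, 0, 1])

def update(h, b):
    """Gain rule: pick the state maximizing s*h - b*s**2."""
    return STATES[np.argmax(np.outer(h, STATES) - b * STATES**2, axis=1)]

sigma = np.where(rng.random(N) < 0.9, xi[0], 0)    # noisy, low-activity start
for t in range(10):
    q = np.mean(sigma**2)                      # instantaneous neural activity
    b = np.sqrt(-2.0 * np.log(a) * alpha * q)  # hypothetical self-control threshold
    sigma = update(J @ sigma, b)
    print(t, (xi[0] @ sigma) / (a * N), q)     # overlap and activity per step
```

    The point of the mechanism is that no threshold needs to be tuned by hand: the network keeps its own activity matched to the pattern activity, which is what enlarges the basins of attraction.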

    Gardner optimal capacity of the diluted Blume-Emery-Griffiths neural network

    The optimal capacity of a diluted Blume-Emery-Griffiths neural network is studied as a function of the pattern activity and the embedding stability using the Gardner entropy approach. Annealed dilution is considered, cutting some of the couplings referring to the ternary patterns themselves and some of the couplings related to the active patterns, either simultaneously (synchronous dilution) or independently (asynchronous dilution). Through the de Almeida-Thouless criterion it is found that the replica-symmetric solution is locally unstable as soon as there is dilution. The distribution of the couplings shows the typical gap, with a width depending on the amount of dilution; this gap persists even in cases where a particular type of coupling plays no role in the learning process. Comment: 9 pages LaTeX, 2 eps figures
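
    As background for the Gardner entropy approach: for the simplest spherical perceptron with unbiased binary patterns (not the diluted BEG setting of this paper), the replica-symmetric calculation gives the classic optimal capacity as a function of the stability κ.

```latex
% Classic RS Gardner result (background, not this paper's model): the fractional
% volume of coupling space with stability at least kappa,
V = \int \! d\mu(\mathbf{J}) \, \prod_{\mu=1}^{p}
    \Theta\!\left( \frac{\zeta^{\mu}}{\sqrt{N}} \sum_{j} J_j \xi_j^{\mu} - \kappa \right),
% shrinks to zero at the optimal capacity
\qquad
\alpha_c(\kappa) = \left[ \int_{-\kappa}^{\infty} \! Dt \, (t+\kappa)^2 \right]^{-1},
\qquad Dt = \frac{e^{-t^2/2}}{\sqrt{2\pi}}\, dt ,
% so alpha_c(0) = 2. Dilution and ternary patterns modify the measure d\mu and the
% constraint set, which is where the synchronous/asynchronous distinction enters.
```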

    An optimal Q-state neural network using mutual information

    Starting from the mutual information, we present a method to find a Hamiltonian for a fully connected neural network model with an arbitrary, finite number of neuron states, Q. For small initial correlations between the neurons and the patterns it leads to optimal retrieval performance. For binary neurons, Q=2, and biased patterns we recover the Hopfield model. For three-state neurons, Q=3, we recover the recently introduced Blume-Emery-Griffiths network Hamiltonian. We derive its phase diagram and compare it with those of related three-state models, finding that its retrieval region is the largest. Comment: 8 pages, 1 figure
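
    For reference, the Q = 3 Hamiltonian arrived at in this way has the Blume-Emery-Griffiths form; the normalizations below follow common conventions in this series of papers and should be read as schematic rather than quoted.

```latex
% Schematic BEG network Hamiltonian for sigma_i in {-1, 0, +1}
% (normalization conventions are assumptions, not quoted from the paper):
H = -\frac{1}{2} \sum_{i \neq j} \left( J_{ij}\, \sigma_i \sigma_j
      + K_{ij}\, \sigma_i^2 \sigma_j^2 \right),
\qquad
J_{ij} = \frac{1}{a N} \sum_{\mu=1}^{p} \xi_i^{\mu} \xi_j^{\mu},
\qquad
K_{ij} = \frac{1}{\tilde{a}^{2} N} \sum_{\mu=1}^{p}
    \left( (\xi_i^{\mu})^{2} - a \right) \left( (\xi_j^{\mu})^{2} - a \right),
% with a the pattern activity and \tilde{a} = a(1-a).
```

    Consistent with the abstract, for Q = 2 one has σ² ≡ 1, the K-term becomes a constant, and the first term reduces to the Hopfield Hamiltonian.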

    On the equivalence of the Ashkin-Teller and the four-state Potts-glass models of neural networks

    We show that for a particular choice of the coupling parameters the Ashkin-Teller spin-glass neural network model with the Hebb learning rule and one condensed pattern yields the same thermodynamic properties as the four-state anisotropic Potts-glass neural network model. This equivalence is not seen at the level of the Hamiltonians. Comment: 3 pages, RevTeX, additional arguments presented

    Optimal coloured perceptrons

    Ashkin-Teller-type perceptron models are introduced. Their maximal capacity per number of couplings is calculated within a first-step replica-symmetry-breaking Gardner approach. The results are compared with extensive numerical simulations using several algorithms. Comment: 8 pages in LaTeX with 2 eps figures, RSB1 calculations have been added
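
    The kind of numerical check mentioned here can be illustrated on the plain single-output perceptron, whose capacity α_c = 2 is known exactly. The Ashkin-Teller ("coloured") case carries coupled colour degrees of freedom, but the methodology is the same: train at increasing load α = P/N and record the fraction of solvable instances. Everything below is a generic sketch, not the paper's code.

```python
import numpy as np

rng = np.random.default_rng(1)

def trainable(N, P, epochs=200):
    """Try to store P random +-1 patterns with the Rosenblatt perceptron rule."""
    xi = rng.choice([-1.0, 1.0], size=(P, N))
    labels = rng.choice([-1.0, 1.0], size=P)
    w = np.zeros(N)
    for _ in range(epochs):
        errors = 0
        for x, y in zip(xi, labels):
            if y * (w @ x) <= 0:        # misclassified (or on the boundary)
                w += y * x / N           # perceptron update
                errors += 1
        if errors == 0:                  # a full error-free pass: all stored
            return True
    return False

N = 100
for alpha in (1.0, 1.5, 2.0, 2.5):
    ok = sum(trainable(N, int(alpha * N)) for _ in range(10))
    print(f"alpha={alpha}: stored {ok}/10 times")   # success rate drops near alpha ~ 2
```

    Measuring capacity per number of couplings, as the paper does, then amounts to normalizing the critical load by the count of independent coupling vectors in the coloured architecture.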

    Correlated patterns in non-monotonic graded-response perceptrons

    The optimal capacity of graded-response perceptrons storing biased and spatially correlated patterns with non-monotonic input-output relations is studied. It is shown that only the structure of the output patterns is important for the overall performance of the perceptrons. Comment: 4 pages, 4 figures