
    Fixed Points of Hopfield Type Neural Networks

    The set of fixed points of a Hopfield-type network is investigated. The connection matrix of the network is constructed according to the Hebb rule from a set of memorized patterns, which are treated as distorted copies of a standard vector. It is found that the dependence of the set of fixed points on the value of the distortion parameter can be described analytically. The obtained results are interpreted in terms of neural networks and the Ising model.
    Comment: RevTeX, 19 pages, 2 Postscript figures; the full version of the earlier brief report (cond-mat/9901251)
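
    As a toy illustration of the setup (not the paper's analytical results), the sketch below builds a Hebb connection matrix from distorted copies of a single standard vector and checks which of those configurations are fixed points of the sign dynamics; the dimension n, the number of patterns m, and the distortion level p are illustrative choices, assuming Python with NumPy.

        import numpy as np

        rng = np.random.default_rng(0)
        n, m, p = 12, 5, 0.1   # neurons, memorized patterns, distortion level (all illustrative)

        standard = rng.choice([-1, 1], size=n)
        # Memorized patterns: distorted copies of the standard vector,
        # each bit flipped independently with probability p
        flips = rng.random((m, n)) < p
        patterns = standard * np.where(flips, -1, 1)

        # Hebb rule: J = (1/n) * sum_mu xi^mu (xi^mu)^T, with zero self-couplings
        J = patterns.T @ patterns / n
        np.fill_diagonal(J, 0.0)

        def is_fixed_point(s):
            # A configuration is fixed iff every spin agrees with the sign of its local field
            return bool(np.all(s * (J @ s) > 0))

        print("standard vector is a fixed point:", is_fixed_point(standard))
        for mu, pattern in enumerate(patterns):
            print(f"memorized pattern {mu} is a fixed point:", is_fixed_point(pattern))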

    On the center of mass of Ising vectors

    We show that the center of mass of Ising vectors that obey some simple constraints is again an Ising vector.
    Comment: 8 pages, 3 figures, LaTeX; claims in connection with disordered systems have been withdrawn; more detailed description of the simulations; inset added to figure
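
    The flavour of the statement can be poked at numerically. The sketch below is weaker than the paper's claim: it averages random Ising vectors and projects the mean back onto {-1, +1}^n via the sign, which is well defined whenever no component of the sum vanishes (guaranteed here by taking an odd number of vectors). Under the paper's constraints the center of mass is itself Ising, with no projection needed.

        import numpy as np

        rng = np.random.default_rng(1)
        k, n = 7, 10   # odd number of Ising vectors avoids ties (illustrative sizes)

        vectors = rng.choice([-1, 1], size=(k, n))
        com = vectors.mean(axis=0)                 # center of mass of the k vectors
        ising_com = np.sign(com).astype(int)       # sign projection back onto {-1, +1}^n

        assert set(np.unique(ising_com)) <= {-1, 1}
        print("center of mass:  ", com)
        print("Ising projection:", ising_com)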

    Selection of Australian Root Nodule Bacteria for Broad-Scale Inoculation of Native Legumes

    The unique and diverse native Australian perennial legumes are under current investigation for use as pastures in Australian agriculture. Identification of root nodule bacteria (RNB) that can fix nitrogen effectively for the plant is a critical factor in the success of a legume species in agriculture (Howieson et al., 2000). Some legumes under investigation are relatively promiscuous (Lange, 1961). This trait may allow the development of a single, broad-scale inoculant that could serve multiple species of agricultural importance while more effective, specific RNB are developed over time. This experiment aimed to identify strains that can form effective symbioses with several native legume species of potential interest to agriculture; to that end, putative indigenous RNB were screened on five native legumes.

    Statistical Mechanics of Learning in the Presence of Outliers

    Using methods of statistical mechanics, we analyse the effect of outliers on the supervised learning of a classification problem. The learning strategy aims at selecting informative examples and discarding outliers. We compare two algorithms which perform the selection either in a soft or a hard way. When the fraction of outliers grows large, the estimation errors undergo a first-order phase transition.
    Comment: 24 pages, 7 figures (minor extensions added)
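
    A minimal numerical sketch of the soft-versus-hard distinction, assuming a teacher-student perceptron with label-flipping outliers and a Hebb estimate of the teacher; the particular weighting functions below are my own illustrative choices, not the algorithms analysed in the paper.

        import numpy as np

        rng = np.random.default_rng(2)
        n, p, frac_out = 50, 400, 0.2   # dimension, examples, outlier fraction (illustrative)

        teacher = rng.normal(size=n)
        teacher /= np.linalg.norm(teacher)
        X = rng.normal(size=(p, n))
        y = np.sign(X @ teacher)
        y[rng.random(p) < frac_out] *= -1          # outliers: examples with flipped labels

        def hebb(weights):
            # Weighted Hebb estimate of the teacher: w ~ sum_i c_i y_i x_i
            w = (weights[:, None] * y[:, None] * X).sum(axis=0)
            return w / np.linalg.norm(w)

        plain = hebb(np.ones(p))
        stability = y * (X @ plain)                # suspected outliers have small or negative stability
        hard = hebb((stability > 0).astype(float))           # hard selection: discard them
        soft = hebb(1.0 / (1.0 + np.exp(-4.0 * stability)))  # soft selection: down-weight them

        for name, w in [("plain", plain), ("hard", hard), ("soft", soft)]:
            print(name, "teacher overlap:", round(float(w @ teacher), 3))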

    Correlated patterns in non-monotonic graded-response perceptrons

    The optimal capacity of graded-response perceptrons storing biased and spatially correlated patterns with non-monotonic input-output relations is studied. It is shown that only the structure of the output patterns is important for the overall performance of the perceptrons.
    Comment: 4 pages, 4 figures
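
    For readers unfamiliar with the terminology, a short sketch of what a non-monotonic graded-response unit looks like, together with a naive storage criterion; the particular transfer function is an arbitrary illustrative choice, not the one from the paper.

        import numpy as np

        def transfer(h, theta=1.0):
            # A graded, non-monotonic input-output relation: the response rises with the
            # local field h, then decays again once |h| exceeds roughly theta
            return np.tanh(h) * np.exp(-(h / theta) ** 2)

        rng = np.random.default_rng(3)
        n, p, tol = 100, 20, 0.1                   # illustrative sizes and storage tolerance
        xi_in = rng.choice([-1, 1], size=(p, n))   # input patterns
        xi_out = transfer(rng.normal(size=p))      # graded target outputs

        w = rng.normal(size=n)                     # an (untrained) coupling vector
        fields = xi_in @ w / np.sqrt(n)
        stored = np.abs(transfer(fields) - xi_out) < tol
        print("patterns within tolerance:", int(stored.sum()), "of", p)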

    A Hebbian approach to complex network generation

    Through a redefinition of patterns in a Hopfield-like model, we introduce and develop an approach to model discrete systems made up of many interacting components with inner degrees of freedom. Our approach clarifies the intrinsic connection between the kind of interactions among components and the emergent topology describing the system itself; it also allows one to effectively address the statistical mechanics on the resulting networks. Indeed, a wide class of analytically treatable, weighted random graphs with a tunable level of correlation can be recovered and controlled. We especially focus on the case of imitative couplings among components endowed with similar patterns (i.e. attributes), which, as we show, naturally and without any a priori assumption gives rise to small-world effects. We also solve the thermodynamics (at the replica-symmetric level) by extending the double stochastic stability technique: free energy, self-consistency relations, and a fluctuation analysis for a picture of criticality are obtained.
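
    A minimal sketch of the generation mechanism as I read it, assuming Python with NumPy: components carry sparse binary attribute patterns, imitative Hebbian couplings grow with the number of shared attributes, and thresholding the couplings yields a random graph whose clustering can then be inspected. Sizes, sparsity, and the threshold are illustrative.

        import numpy as np

        rng = np.random.default_rng(4)
        N, P = 200, 10                              # components and attributes (illustrative)
        xi = rng.choice([0, 1], size=(N, P), p=[0.7, 0.3])   # sparse attribute patterns

        # Imitative Hebbian couplings: components sharing attributes interact more strongly
        J = xi @ xi.T
        np.fill_diagonal(J, 0)
        A = (J >= 2).astype(int)                    # keep an edge when >= 2 attributes are shared

        deg = A.sum(axis=1)
        triangles = np.diag(A @ A @ A) / 2          # closed triangles through each node
        possible = deg * (deg - 1) / 2
        clustering = np.where(possible > 0, triangles / np.maximum(possible, 1), 0.0)
        print("mean degree:", round(float(deg.mean()), 2),
              " mean clustering:", round(float(clustering.mean()), 3))

    Clustering well above that of an Erdős-Rényi graph of the same density is the small-world signature the abstract alludes to.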

    Slowly evolving geometry in recurrent neural networks I: extreme dilution regime

    We study extremely diluted spin models of neural networks in which the connectivity evolves in time, although adiabatically slowly compared to the neurons, according to stochastic equations which on average aim to reduce frustration. The (fast) neurons and (slow) connectivity variables equilibrate separately, but at different temperatures. Our model is exactly solvable in equilibrium. We obtain phase diagrams upon making the condensed ansatz (i.e. recall of one pattern). These show that, as the connectivity temperature is lowered, the volume of the retrieval phase diverges and the fraction of mis-aligned spins is reduced. Still, one always retains a region in the retrieval phase where recall states other than the one corresponding to the `condensed' pattern are locally stable, so the associative-memory character of our model is preserved.
    Comment: 18 pages, 6 figures
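
    A two-timescale toy in the spirit of the model (not its exact solution), assuming Python with NumPy: fast Glauber updates of the spins at one inverse temperature, and occasional Metropolis moves that add or delete Hebbian bonds at a second, slower inverse temperature so as to reduce frustration on average; all parameters are illustrative.

        import numpy as np

        rng = np.random.default_rng(5)
        N, c = 60, 4                        # neurons and mean connections per neuron (illustrative)
        beta_fast, beta_slow = 2.0, 0.5     # separate inverse temperatures for spins and bonds

        xi = rng.choice([-1, 1], size=N)    # one condensed pattern
        s = rng.choice([-1, 1], size=N)
        mask = np.triu(rng.random((N, N)) < c / N, 1)
        mask = mask | mask.T                # sparse symmetric connectivity
        J = mask * np.outer(xi, xi) / c     # Hebbian couplings on the existing bonds

        for sweep in range(200):
            for i in rng.permutation(N):    # fast: Glauber spin updates
                h = J[i] @ s
                s[i] = 1 if rng.random() < 1.0 / (1.0 + np.exp(-2.0 * beta_fast * h)) else -1
            if sweep % 10 == 0:             # slow: propose adding/removing one Hebbian bond
                i, j = rng.choice(N, size=2, replace=False)
                J_new = 0.0 if J[i, j] != 0 else xi[i] * xi[j] / c
                dE = -(J_new - J[i, j]) * s[i] * s[j]   # change in H = -sum_{i<j} J_ij s_i s_j
                if rng.random() < np.exp(-beta_slow * dE):
                    J[i, j] = J[j, i] = J_new

        print("retrieval overlap m:", round(float(np.mean(xi * s)), 3))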

    Optimally adapted multi-state neural networks trained with noise

    The principle of adaptation in a noisy retrieval environment is extended here to a diluted attractor neural network of Q-state neurons trained with noisy data. The network is adapted to an appropriate noisy training overlap and training activity, which are determined self-consistently by the optimized retrieval attractor overlap and activity. The optimized storage capacity and the corresponding retriever overlap are considerably enhanced by an adequate threshold in the states. Explicit results for improved optimal performance and new retriever phase diagrams are obtained for Q=3 and Q=4, with coexisting phases over a wide range of thresholds. Most of the interesting results are stable to replica-symmetry-breaking fluctuations.
    Comment: 22 pages, 5 figures, accepted for publication in PR
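
    To make the "threshold in the states" concrete, a retrieval sketch for Q=3 neurons taking values {-1, 0, +1}, assuming Python with NumPy: a unit fires +/-1 only when its local field clears a threshold theta, and the overlap with the condensed pattern is tracked. Sizes, load, and theta are illustrative, and the network here is fully connected rather than diluted.

        import numpy as np

        rng = np.random.default_rng(6)
        N, P, theta = 200, 5, 0.5           # neurons, patterns, state threshold (illustrative)

        xi = rng.choice([-1, 0, 1], size=(P, N))   # Q = 3 patterns
        a = float(np.mean(xi ** 2))                # mean pattern activity
        J = xi.T @ xi / (a * N)
        np.fill_diagonal(J, 0.0)

        def update(s):
            # Three-state rule: fire +/-1 only when the local field clears the threshold
            h = J @ s
            return np.sign(h) * (np.abs(h) > theta)

        # Start from a corrupted version of pattern 0 and iterate towards retrieval
        s = np.where(rng.random(N) < 0.2, rng.choice([-1, 0, 1], size=N), xi[0])
        for _ in range(20):
            s = update(s)

        m = float(xi[0] @ s) / (a * N)      # overlap with the condensed pattern
        print("overlap:", round(m, 3), " activity:", round(float(np.mean(s ** 2)), 3))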

    The signal-to-noise analysis of the Little-Hopfield model revisited

    Using the generating functional analysis, an exact recursion relation is derived for the time evolution of the effective local field of the fully connected Little-Hopfield model. It is shown that, by leaving out the feedback correlations arising from earlier times in this effective dynamics, one precisely finds the recursion relations usually employed in the signal-to-noise approach. The consequences of this approximation, as well as the physics behind it, are discussed. In particular, it is pointed out why the effects are hard to notice, especially for model parameters corresponding to retrieval. Numerical simulations confirm these findings. The signal-to-noise analysis is then extended to include all correlations, making it a full theory for dynamics at the level of the generating functional analysis. The results are applied to the frequently employed extremely diluted (a)symmetric architectures and to sequence-processing networks.
    Comment: 26 pages, 3 figures
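
    For orientation, a simulation sketch of the parallel (Little) dynamics the analysis concerns, assuming Python with NumPy: the first time step is compared against the standard signal-to-noise prediction m1 = erf(m0 / sqrt(2*alpha)), which holds for the first parallel update precisely because no feedback correlations have built up yet; N, alpha, and m0 are illustrative.

        import numpy as np
        from math import erf, sqrt

        rng = np.random.default_rng(7)
        N, alpha, m0 = 4000, 0.08, 0.6      # size, storage load P/N, initial overlap (illustrative)
        P = int(alpha * N)

        xi = rng.choice([-1, 1], size=(P, N))
        J = xi.T @ xi / N
        np.fill_diagonal(J, 0.0)

        # Corrupt pattern 0 so the initial overlap is about m0
        s = xi[0] * np.where(rng.random(N) < (1 - m0) / 2, -1, 1)

        overlaps = [float(xi[0] @ s) / N]
        for _ in range(5):                  # parallel (Little) dynamics: all spins update at once
            s = np.sign(J @ s).astype(int)
            overlaps.append(float(xi[0] @ s) / N)

        print("first-step prediction:", round(erf(m0 / sqrt(2 * alpha)), 3),
              " simulated:", round(overlaps[1], 3))
        print("overlap trajectory:", [round(m, 3) for m in overlaps])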

    Generalizing with perceptrons in case of structured phase- and pattern-spaces

    We investigate the influence of different kinds of structure on the learning behaviour of a perceptron performing a classification task defined by a teacher rule. The underlying pattern distribution is permitted to have spatial correlations, and the prior distribution for the teacher coupling vectors is itself assumed to be nonuniform, so classification tasks of quite different difficulty are included. As learning algorithms we discuss Hebbian learning, Gibbs learning, and Bayesian learning with different priors, using methods from statistics and the replica formalism. We find that the Hebb rule is quite sensitive to the structure of the actual learning problem, failing asymptotically in most cases. In contrast, the behaviour of the more sophisticated methods of Gibbs and Bayes learning is influenced by the spatial correlations only in an intermediate regime of α, where α specifies the size of the training set. Concerning the Bayesian case, we show how enhanced prior knowledge improves the performance.
    Comment: LaTeX, 32 pages with eps-figs, accepted by J Phys
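
    As a baseline for the simplest of the three algorithms, a sketch of Hebb learning in an unstructured teacher-student setting, assuming Python with NumPy and isotropic Gaussian inputs (so none of the paper's spatial correlations or nonuniform priors are present); the generalization error follows from the teacher-student overlap R as eps = arccos(R)/pi.

        import numpy as np

        rng = np.random.default_rng(8)
        n = 200                                 # input dimension (illustrative)

        teacher = rng.normal(size=n)
        teacher /= np.linalg.norm(teacher)

        def hebb_student(alpha):
            # Hebb rule on alpha*n teacher-labelled Gaussian examples
            p = int(alpha * n)
            X = rng.normal(size=(p, n))
            y = np.sign(X @ teacher)
            w = (y[:, None] * X).sum(axis=0)
            return w / np.linalg.norm(w)

        for alpha in [0.5, 2.0, 8.0]:
            R = float(hebb_student(alpha) @ teacher)   # teacher-student overlap
            eps = float(np.arccos(R)) / np.pi          # generalization error on isotropic inputs
            print(f"alpha={alpha}: R={R:.3f}, eps={eps:.3f}")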