6,434 research outputs found

    The Relativistic Hopfield network: rigorous results

    The relativistic Hopfield model constitutes a generalization of the standard Hopfield model, derived from the formal analogy between the statistical-mechanical framework embedding neural networks and the Lagrangian mechanics describing a fictitious single-particle motion in the space of the tuneable parameters of the network itself. In this analogy the cost function of the Hopfield model plays the role of the standard kinetic-energy term and the related Mattis overlap (naturally bounded by one) plays the role of the velocity. The Hamiltonian of the relativistic model, once Taylor-expanded, results in a P-spin series with alternating signs: the attractive contributions enhance the information-storage capabilities of the network, while the repulsive contributions allow for easier unlearning of spurious states, conferring more robustness on the system as a whole. Here we do not deepen the information-processing skills of this generalized Hopfield network; rather, we focus on its statistical-mechanical foundation. In particular, relying on Guerra's interpolation techniques, we prove the existence of the infinite-volume limit for the model free energy and we give its explicit expression in terms of the Mattis overlaps. By extremizing the free energy over the latter we obtain the generalized self-consistent equations for these overlaps, as well as a picture of criticality that is further corroborated by a fluctuation analysis. These findings are in full agreement with the available previous results. Comment: 11 pages, 1 figure
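    To make the alternating-sign expansion concrete, here is a minimal single-pattern sketch. It assumes the relativistic cost function takes the form $H = -N\sqrt{1 + m^2}$ (with $m$ the Mattis overlap), a form used in the relativistic-Hopfield literature; the sizes and truncation order below are illustrative choices, not taken from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    N = 500                              # number of Ising neurons
    xi = rng.choice([-1, 1], size=N)     # one stored pattern
    sigma = rng.choice([-1, 1], size=N)  # current network state

    m = xi @ sigma / N                   # Mattis overlap, bounded by 1 like a velocity

    # Classical Hopfield cost function (single pattern, up to constants):
    # quadratic in m, like a Newtonian kinetic energy.
    H_classical = -0.5 * N * m**2

    # Relativistic generalization (assumed form -N * sqrt(1 + m^2)):
    H_relativistic = -N * np.sqrt(1.0 + m**2)

    # Taylor expansion: sqrt(1 + m^2) = 1 + m^2/2 - m^4/8 + m^6/16 - ...
    # i.e. a P-spin series with alternating signs, as described above.
    H_series = -N * (1.0 + m**2 / 2 - m**4 / 8 + m**6 / 16)
    print(H_relativistic, H_series)      # near-identical for |m| < 1
    ```

    The even powers of $m$ correspond to 2-spin, 4-spin, ... interactions; the negatively weighted terms are the repulsive contributions that, per the abstract, ease the unlearning of spurious states.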

    The Hopfield model and its role in the development of synthetic biology

    Neural network models make extensive use of concepts coming from physics and engineering. How do scientists justify the use of these concepts in the representation of biological systems? How is evidence for or against the use of these concepts produced in the application and manipulation of the models? This article shows that neural network models are evaluated differently depending on the scientific context and its modeling practice. In the case of the Hopfield model, the different modeling practices of theoretical physics and neurobiology played a central role in how the model was received and used in the different scientific communities. In theoretical physics, where the Hopfield model has its roots, mathematical modeling is much more common and established than in neurobiology, which is strongly experiment-driven. These differences in modeling practice contributed to the development of the new field of synthetic biology, which introduced a third type of model that combines mathematical modeling with experimenting on biological systems and, by doing so, mediates between the different modeling practices.

    A walk in the statistical mechanical formulation of neural networks

    Neural networks are nowadays both powerful operational tools (e.g., for pattern recognition, data mining, error-correction codes) and complex theoretical models at the focus of scientific investigation. As a research branch, neural networks are handled and studied by psychologists, neurobiologists, engineers, mathematicians and theoretical physicists. In theoretical physics in particular, the key instrument for the quantitative analysis of neural networks is statistical mechanics. From this perspective, here we first review attractor networks: starting from ferromagnets and spin-glass models, we discuss the underlying philosophy and we recover the path paved by Hopfield and Amit-Gutfreund-Sompolinsky. One step forward, we highlight the structural equivalence between Hopfield networks (modeling retrieval) and Boltzmann machines (modeling learning), hence realizing a deep bridge linking two inseparable aspects of biological and robotic spontaneous cognition. As a sideline of this walk we derive two alternative ways (with respect to the original Hebb proposal) to recover the Hebbian paradigm, stemming from ferromagnets and from spin-glasses, respectively. Further, as these notes are intended for an engineering audience, we also highlight the mappings between ferromagnets and operational amplifiers and between antiferromagnets and flip-flops (as neural networks built from op-amps and flip-flops are particular spin-glasses, and the latter are indeed combinations of ferromagnets and antiferromagnets), hoping that such a bridge serves as a concrete prescription for capturing the beauty of robotics from the statistical-mechanical perspective. Comment: Contribution to the proceedings of the NCTA 2014 conference; 12 pages, 7 figures
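    As a minimal illustration of the attractor-network picture reviewed here, the sketch below builds the standard Hebbian coupling matrix and retrieves a corrupted pattern; the sizes, corruption level, and the noiseless synchronous dynamics are illustrative choices, not the review's own code.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    N, P = 400, 10                            # neurons, stored patterns
    xi = rng.choice([-1, 1], size=(P, N))     # random binary patterns

    # Hebbian coupling matrix (the Hebb prescription discussed in the text):
    J = (xi.T @ xi) / N
    np.fill_diagonal(J, 0.0)                  # no self-couplings

    # Start from a corrupted version of pattern 0 and relax at zero noise.
    sigma = xi[0].copy()
    flip = rng.choice(N, size=N // 10, replace=False)
    sigma[flip] *= -1                         # flip 10% of the neurons

    for _ in range(20):                       # synchronous sweeps (asynchronous
        sigma = np.sign(J @ sigma)            # updates would guarantee descent)
        sigma[sigma == 0] = 1

    print((sigma @ xi[0]) / N)                # Mattis overlap ~ 1 on retrieval
    ```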

    Hierarchical neural networks perform both serial and parallel processing

    In this work we study a Hebbian neural network where neurons are arranged according to a hierarchical architecture such that their couplings scale with their reciprocal distance. As a full statistical-mechanics solution is not yet available, after a streamlined introduction to the state of the art via that route, the problem is consistently approached through the signal-to-noise technique and extensive numerical simulations. Focusing on the low-storage regime, where the number of stored patterns grows at most logarithmically with the system size, we prove that these non-mean-field Hopfield-like networks display a richer phase diagram than their classical counterparts. In particular, these networks are able to perform serial processing (i.e. retrieve one pattern at a time through a complete rearrangement of the whole ensemble of neurons) as well as parallel processing (i.e. retrieve several patterns simultaneously, delegating the management of different patterns to the diverse communities that build up the network). The switch between the two regimes is set by the rate of the coupling decay and by the level of noise affecting the system. The price to pay for these remarkable capabilities is a network capacity smaller than that of the mean-field counterpart, thus yielding a new budget principle: the wider the multitasking capabilities, the lower the network load, and vice versa. This may have important implications for our understanding of biological complexity.
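    The following toy sketch is one possible reading of the architecture described above, not the paper's exact model: neurons sit on the leaves of a binary tree, Hebbian couplings are damped exponentially in the hierarchical distance, and the two halves of the network can then hold different patterns in parallel. The depth, decay rate, and pattern count are all illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    k = 8                                  # depth of the binary tree
    N = 2**k                               # neurons sit on the leaves
    P = 3                                  # low-storage regime: P << N
    decay = 1.0                            # rate of the coupling decay
    xi = rng.choice([-1, 1], size=(P, N))

    # Hierarchical distance between leaves i and j: number of tree levels
    # to climb before they share an ancestor (highest differing bit).
    d = np.array([[(i ^ j).bit_length() for j in range(N)] for i in range(N)])

    # Hebbian couplings damped with hierarchical distance: neurons in the
    # same community interact strongly, distant communities only weakly.
    J = (xi.T @ xi / N) * 2.0 ** (-decay * d)
    np.fill_diagonal(J, 0.0)

    # Parallel processing: seed the left community with pattern 0 and the
    # right community with pattern 1, then relax with noiseless updates.
    s = np.concatenate([xi[0][:N // 2], xi[1][N // 2:]])
    for _ in range(20):
        s = np.sign(J @ s)
        s[s == 0] = 1

    # Overlap of each community with its own pattern (near 1 if both are kept).
    print(xi[0][:N // 2] @ s[:N // 2] * 2 / N,
          xi[1][N // 2:] @ s[N // 2:] * 2 / N)
    ```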

    Dreaming neural networks: forgetting spurious memories and reinforcing pure ones

    The standard Hopfield model for associative neural networks accounts for biological Hebbian learning and acts as the harmonic oscillator of pattern recognition; however, its maximal storage capacity is $\alpha \sim 0.14$, far from the theoretical bound for symmetric networks, i.e. $\alpha = 1$. Inspired by sleeping and dreaming mechanisms in mammal brains, we propose an extension of this model displaying the standard on-line (awake) learning mechanism (which allows the storage of external information in terms of patterns) and an off-line (sleep) unlearning & consolidating mechanism (which allows spurious-pattern removal and pure-pattern reinforcement): the resulting daily prescription is able to saturate the theoretical bound $\alpha = 1$, while also remaining extremely robust against thermal noise. Neural and synaptic features are analyzed both analytically and numerically. In particular, beyond obtaining a phase diagram for the neural dynamics, we focus on synaptic plasticity and give explicit prescriptions for the temporal evolution of the synaptic matrix. We analytically prove that our algorithm makes the Hebbian kernel converge with high probability to the projection matrix built over the pure stored patterns. Furthermore, we obtain a sharp and explicit estimate of the "sleep rate" needed to ensure such convergence. Finally, we run extensive numerical simulations (mainly Monte Carlo sampling) to check the approximations underlying the analytical investigation (e.g., the whole theory is developed at the so-called replica-symmetric level, as is standard in the Amit-Gutfreund-Sompolinsky reference framework) and possible finite-size effects, finding overall full agreement with the theory. Comment: 31 pages, 12 figures
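    The limiting kernel the abstract refers to, the projection matrix onto the span of the stored patterns, coincides with the classical pseudo-inverse rule. The sketch below (illustrative sizes, not the paper's code) shows why that limit lifts the capacity: at a load well above $\alpha \sim 0.14$, essentially every pattern is still an exact fixed point under the projection kernel, while the purely Hebbian (awake) kernel fails.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    N, P = 200, 80                            # load alpha = P/N = 0.4 > 0.14
    xi = rng.choice([-1, 1], size=(P, N)).astype(float)

    # Awake (Hebbian) kernel: capacity limited to alpha ~ 0.14.
    J_hebb = xi.T @ xi / N

    # Limit of the sleep dynamics claimed in the abstract: the projection
    # matrix onto the span of the stored patterns (pseudo-inverse rule).
    J_proj = xi.T @ np.linalg.inv(xi @ xi.T) @ xi

    for J, name in [(J_hebb, "hebbian"), (J_proj, "projection")]:
        Jd = J - np.diag(np.diag(J))          # strip self-couplings
        stable = np.all(np.sign(Jd @ xi.T) == np.sign(xi.T), axis=0)
        print(name, stable.mean())            # fraction of patterns that are
                                              # exact fixed points of the dynamics
    ```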