
    The Relativistic Hopfield network: rigorous results

    The relativistic Hopfield model is a generalization of the standard Hopfield model, derived from the formal analogy between the statistical-mechanical framework embedding neural networks and the Lagrangian mechanics describing a fictitious single-particle motion in the space of the tunable parameters of the network itself. In this analogy the cost function of the Hopfield model plays the role of the standard kinetic-energy term, and its related Mattis overlap (naturally bounded by one) plays the role of the velocity. The Hamiltonian of the relativistic model, once Taylor-expanded, results in a P-spin series with alternating signs: the attractive contributions enhance the information-storage capabilities of the network, while the repulsive contributions allow for easier unlearning of spurious states, conferring overall more robustness to the system as a whole. Here we do not investigate the information-processing capabilities of this generalized Hopfield network; rather, we focus on its statistical-mechanical foundation. In particular, relying on Guerra's interpolation techniques, we prove the existence of the infinite-volume limit of the model's free energy and give its explicit expression in terms of the Mattis overlaps. By extremizing the free energy over the latter we obtain the generalized self-consistency equations for these overlaps, as well as a picture of criticality that is further corroborated by a fluctuation analysis. These findings are in full agreement with the available previous results. Comment: 11 pages, 1 figure
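    As a minimal sketch of the construction (our notation, restricted to the single-overlap case; normalizations may differ from the paper's), replacing the classical kinetic-energy-like cost function -N m^2/2 by its relativistic counterpart and Taylor-expanding in the Mattis overlap m yields the alternating-sign P-spin series described above:

        H(m) \;=\; -N\,\sqrt{1+m^{2}}
             \;=\; -N\Bigl(1 + \tfrac{m^{2}}{2} - \tfrac{m^{4}}{8} + \tfrac{m^{6}}{16} - \dots\Bigr)

    The quadratic term reproduces the standard (attractive) Hopfield cost function, the quartic term enters with the opposite (repulsive) sign, and so on, matching the attractive/repulsive interplay described in the abstract.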

    Quantum Pattern Retrieval by Qubit Networks with Hebb Interactions

    Qubit networks with long-range interactions inspired by the Hebb rule can be used as quantum associative memories. Starting from a uniform superposition, the unitary evolution generated by these interactions drives the network through a quantum phase transition at a critical computation time, after which ferromagnetic order guarantees that a measurement retrieves the stored memory. The maximum memory capacity p of these qubit networks is reached at a memory density p/n = 1. Comment: To appear in Physical Review Letters
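    The paper's exact Hamiltonian is not reproduced here; as an illustrative stand-in, the numpy sketch below evolves a uniform superposition under Hebb-coupled ZZ interactions plus a transverse field of our own choosing (so that the initial state actually evolves), and tracks the probability of measuring a stored pattern. All names and parameter values are ours.

        import numpy as np

        rng = np.random.default_rng(1)
        n, p = 6, 1                                   # qubits, stored patterns
        xi = rng.choice([-1, 1], size=(p, n))
        J = (xi.T @ xi).astype(float) / n             # Hebb-rule couplings
        np.fill_diagonal(J, 0.0)

        sx = np.array([[0., 1.], [1., 0.]])
        sz = np.array([[1., 0.], [0., -1.]])
        I2 = np.eye(2)

        def site_op(single, site):
            """Embed a single-qubit operator at `site` (qubit 0 = leftmost bit)."""
            out = np.array([[1.0]])
            for k in range(n):
                out = np.kron(out, single if k == site else I2)
            return out

        Z = [site_op(sz, i) for i in range(n)]
        X = [site_op(sx, i) for i in range(n)]
        H = sum(-J[i, j] * Z[i] @ Z[j] for i in range(n) for j in range(i + 1, n))
        H = H + sum(-0.5 * Xi for Xi in X)            # illustrative transverse field

        evals, evecs = np.linalg.eigh(H)
        psi0 = np.full(2 ** n, 2 ** (-n / 2))         # uniform superposition
        c0 = evecs.T @ psi0
        # computational-basis index of the first stored pattern (spin +1 -> bit 0)
        idx = int(((1 - xi[0]) // 2) @ (1 << np.arange(n - 1, -1, -1)))
        for t in [0.0, 1.0, 2.0, 4.0, 8.0]:
            psi_t = evecs @ (np.exp(-1j * evals * t) * c0)
            print(f"t={t:.1f}  P(stored pattern) = {abs(psi_t[idx]) ** 2:.4f}")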

    Inference and learning in sparse systems with multiple states

    We discuss how inference can be performed when data are sampled from the non-ergodic phase of systems with multiple attractors. We take as a model system the finite-connectivity Hopfield model in the memory phase and suggest a cavity-method approach to reconstruct the couplings when the data are separately sampled from a few attractor states. We also show how the inference results can be converted into a learning protocol for neural networks in which patterns are presented through weak external fields. The protocol is simple and fully local, and is able to store patterns with a finite overlap with the input patterns without ever reaching a spin-glass phase where all memories are lost. Comment: 15 pages, 10 figures, to be published in Phys. Rev.
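    The paper's precise update rule is not given in the abstract; the sketch below shows one plausible fully local realization, assuming a Hebbian update on states relaxed by Glauber dynamics under a weak external field aligned with each input pattern. Field strength, learning rate, and temperature are illustrative choices of ours, not the paper's.

        import numpy as np

        rng = np.random.default_rng(2)
        N, P = 50, 5
        xi = rng.choice([-1, 1], size=(P, N))          # input patterns to be stored

        J = np.zeros((N, N))
        h_ext, eta, beta = 0.3, 0.05, 2.0              # weak field, learning rate, 1/T

        def glauber_sweep(s, J, h):
            """One asynchronous Glauber sweep at inverse temperature beta."""
            for i in rng.permutation(len(s)):
                field = J[i] @ s + h[i]
                s[i] = 1 if rng.random() < 1.0 / (1.0 + np.exp(-2.0 * beta * field)) else -1
            return s

        for epoch in range(200):
            mu = epoch % P                              # present patterns in rotation
            s = rng.choice([-1, 1], size=N)
            for _ in range(10):                         # relax under the weak field
                s = glauber_sweep(s, J, h_ext * xi[mu])
            J += eta * (np.outer(s, s) - np.eye(N)) / N # fully local Hebbian update

        # retrieval check: relax from each input pattern with no external field
        for mu in range(P):
            s = xi[mu].copy()
            for _ in range(20):
                s = glauber_sweep(s, J, np.zeros(N))
            print(f"pattern {mu}: overlap m = {s @ xi[mu] / N:+.2f}")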

    Quantum annealing for the number partitioning problem using a tunable spin glass of ions

    Exploiting quantum properties to outperform classical information processing is an outstanding goal of modern physics. A promising route is quantum simulation, which aims at implementing relevant and computationally hard problems in controllable quantum systems. Here we demonstrate that in a trapped-ion setup, with present-day technology, it is possible to realize a spin model of the Mattis type that exhibits spin-glass phases. Remarkably, our method produces the glassy behavior without the need for any disorder potential, just by controlling the detuning of the spin-phonon coupling. Applying a transverse field, the system can be used to benchmark quantum annealing strategies which aim at reaching the ground state of the spin glass starting from the paramagnetic phase. In the vicinity of a phonon resonance, the problem maps onto number partitioning, and instances which are difficult to address classically can be implemented. Comment: accepted version (11 pages, 7 figures)
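    The mapping itself is standard and not specific to the trapped-ion implementation: assigning a spin s_i = ±1 to each number a_i, the cost E(s) = (sum_i a_i s_i)^2 is a Mattis-type Ising energy with couplings J_ij = -a_i a_j, and E = 0 exactly when a perfect partition exists. Below is a minimal classical simulated-annealing baseline for this energy; all parameter values are illustrative choices of ours.

        import numpy as np

        rng = np.random.default_rng(3)
        a = rng.integers(1, 100, size=20).astype(float)   # numbers to partition
        s = rng.choice([-1.0, 1.0], size=a.size)          # spin = side of the partition

        M = a @ s                                         # signed imbalance, E = M**2
        T = 1e4
        for step in range(20000):
            i = rng.integers(a.size)
            M_new = M - 2 * a[i] * s[i]                   # imbalance if spin i flips
            dE = M_new ** 2 - M ** 2
            if dE <= 0 or rng.random() < np.exp(-dE / T):
                s[i], M = -s[i], M_new                    # accept the flip
            T *= 0.9995                                   # geometric cooling schedule
        print("residual imbalance |sum_i a_i s_i| =", abs(M))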

    Dreaming neural networks: forgetting spurious memories and reinforcing pure ones

    The standard Hopfield model for associative neural networks accounts for biological Hebbian learning and acts as the harmonic oscillator for pattern recognition; however, its maximal storage capacity is α ∼ 0.14, far from the theoretical bound for symmetric networks, i.e. α = 1. Inspired by sleeping and dreaming mechanisms in mammalian brains, we propose an extension of this model displaying the standard on-line (awake) learning mechanism (which allows the storage of external information in terms of patterns) and an off-line (sleep) unlearning-and-consolidating mechanism (which allows spurious-pattern removal and pure-pattern reinforcement): the resulting daily prescription is able to saturate the theoretical bound α = 1, while also remaining extremely robust against thermal noise. Neural and synaptic features are analyzed both analytically and numerically. In particular, beyond obtaining a phase diagram for the neural dynamics, we focus on synaptic plasticity and give explicit prescriptions for the temporal evolution of the synaptic matrix. We analytically prove that our algorithm makes the Hebbian kernel converge with high probability to the projection matrix built over the pure stored patterns. Furthermore, we obtain a sharp and explicit estimate of the "sleep rate" needed to ensure such convergence. Finally, we run extensive numerical simulations (mainly Monte Carlo sampling) to check the approximations underlying the analytical investigations (e.g., the whole theory is developed at the so-called replica-symmetric level, as standard in the Amit-Gutfreund-Sompolinsky reference framework) and possible finite-size effects, finding overall full agreement with the theory. Comment: 31 pages, 12 figures
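    A numerical illustration of the convergence claim, using our transcription of the sleep-regularized kernel employed in this line of work (notation and normalizations may differ from the paper's): as the sleep time t grows, the coupling matrix interpolates from the pure Hebb rule at t = 0 towards the projection (pseudo-inverse) matrix built over the stored patterns.

        import numpy as np

        rng = np.random.default_rng(4)
        N, P = 200, 40
        xi = rng.choice([-1.0, 1.0], size=(P, N))      # stored patterns
        C = xi @ xi.T / N                              # P x P pattern correlation matrix

        def dreaming_kernel(t):
            """Sleep-regularized couplings: t = 0 gives the pure Hebb rule,
            t -> infinity approaches the projection (pseudo-inverse) rule."""
            K = (1.0 + t) * np.linalg.inv(np.eye(P) + t * C)
            J = xi.T @ K @ xi / N
            np.fill_diagonal(J, 0.0)
            return J

        J_proj = xi.T @ np.linalg.inv(C) @ xi / N      # projection matrix over patterns
        np.fill_diagonal(J_proj, 0.0)

        for t in [0.0, 1.0, 10.0, 100.0]:
            d = np.linalg.norm(dreaming_kernel(t) - J_proj) / np.linalg.norm(J_proj)
            print(f"sleep time t = {t:6.1f}: relative distance to projection rule = {d:.3f}")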