
    Adiabatic Quantum Optimization for Associative Memory Recall

    Hopfield networks are a variant of associative memory that recall information stored in the couplings of an Ising model. Stored memories are fixed points of the network dynamics that correspond to energetic minima of the spin state. We formulate the recall of memories stored in a Hopfield network as energy minimization by adiabatic quantum optimization (AQO). Numerical simulations of the quantum dynamics allow us to quantify the AQO recall accuracy with respect to the number of stored memories and the noise in the input key. We also investigate AQO performance with respect to how memories are stored in the Ising model using different learning rules. Our results indicate that AQO performance varies strongly with learning rule due to the changes in the energy landscape. Consequently, learning rules offer indirect methods for investigating changes to the computational complexity of the recall task and the computational efficiency of AQO. (Comment: 22 pages, 11 figures; updated for clarity and figures; to appear in Frontiers of Physics.)
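
    A minimal classical sketch of the encoding described above (not the authors' code; the network size, the Hebbian storage rule, and the key-bias strength are illustrative assumptions). Exhaustive ground-state search over the biased Ising energy stands in for the adiabatic evolution, whose ideal output is exactly this minimum:

```python
# Sketch: Hopfield memories stored in an Ising model via the Hebb rule,
# with recall posed as ground-state search. For small N we can enumerate
# all spin states, which is the ideal outcome AQO would converge to.
import itertools
import numpy as np

rng = np.random.default_rng(0)
N = 8                                            # number of spins (assumed)
memories = rng.choice([-1.0, 1.0], size=(3, N))  # three stored patterns

# Hebbian learning rule: J_ij = (1/N) sum_mu xi_i^mu xi_j^mu, with J_ii = 0
J = memories.T @ memories / N
np.fill_diagonal(J, 0.0)

def energy(s, h):
    # Ising energy with a bias field h encoding the (noisy) input key
    return -0.5 * s @ J @ s - h @ s

key = memories[0].copy()
key[:2] *= -1                                    # corrupt two bits of the key

best = min((np.array(s) for s in itertools.product([-1, 1], repeat=N)),
           key=lambda s: energy(s, 0.1 * key))   # bias strength 0.1 assumed
print("recall succeeded:", np.array_equal(best, memories[0]))
```

    The abstract's point about learning rules can be probed directly here: swapping the Hebb rule for, e.g., a projection rule reshapes the energy landscape that the minimization, quantum or classical, must navigate.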

    Learning in stochastic neural networks for constraint satisfaction problems

    Researchers describe a newly developed artificial neural network algorithm for solving constraint satisfaction problems (CSPs) which includes a learning component that can significantly improve the performance of the network from run to run. The network, referred to as the Guarded Discrete Stochastic (GDS) network, is based on the discrete Hopfield network but differs from it primarily in that auxiliary networks (guards) are asymmetrically coupled to the main network to enforce certain types of constraints. Although the presence of asymmetric connections implies that the network may not converge, it was found that, for certain classes of problems, the network often quickly converges to satisfactory solutions when they exist. The network runs efficiently on serial machines and can find solutions to very large problems (e.g., N-queens for N as large as 1024). One advantage of the architecture is that connection strengths need not be instantiated when the network is established: they are needed only when a participating neural element transitions from off to on. The researchers exploited this feature to devise a learning algorithm, based on consistency techniques for discrete CSPs, that updates the network biases and connection strengths and thus improves the network's performance.
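
    The GDS network itself is not reproduced here, but the flavour of the computation (discrete units stochastically repaired toward constraint satisfaction) can be suggested with a hypothetical min-conflicts loop for N-queens, the benchmark cited above; all parameters are assumptions:

```python
# Sketch in the spirit of discrete stochastic constraint repair (NOT the
# GDS algorithm): min-conflicts search for N-queens, one queen per row.
import random

def conflicts(cols, r, c):
    # number of queens attacking square (r, c), ignoring row r itself
    return sum(1 for r2, c2 in enumerate(cols)
               if r2 != r and (c2 == c or abs(c2 - c) == abs(r2 - r)))

def solve_nqueens(n, max_steps=100_000):
    cols = [random.randrange(n) for _ in range(n)]   # random initial board
    for _ in range(max_steps):
        bad = [r for r in range(n) if conflicts(cols, r, cols[r]) > 0]
        if not bad:
            return cols                              # all constraints hold
        r = random.choice(bad)                       # pick a violated row
        # move its queen to a minimally conflicted column (random tie-break)
        cols[r] = min(range(n),
                      key=lambda c: (conflicts(cols, r, c), random.random()))
    return None

print(solve_nqueens(64))                             # scales far beyond 64
```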

    Transformations in the Scale of Behaviour and the Global Optimisation of Constraints in Adaptive Networks

    The natural energy-minimisation behaviour of a dynamical system can be interpreted as a simple optimisation process, finding a locally optimal resolution of problem constraints. In human problem solving, high-dimensional problems are often made much easier by inferring a low-dimensional model of the system in which search is more effective. But this approach seems to require top-down domain knowledge, and is not one amenable to the spontaneous energy-minimisation behaviour of a natural dynamical system. However, in this paper we investigate the ability of distributed dynamical systems to improve their constraint-resolution ability over time by self-organisation. We use a ‘self-modelling’ Hopfield network with a novel type of associative connection to illustrate how slowly changing relationships between system components can result in a transformation into a new system that is a low-dimensional caricature of the original. The energy-minimisation behaviour of this new system is significantly more effective at globally resolving the original system's constraints. The model uses only very simple, fully distributed positive-feedback mechanisms that are relevant to other ‘active linking’ and adaptive networks. We discuss how this neural network model helps us to understand transformations and emergent collective behaviour in various non-neural adaptive networks, such as social, genetic and ecological networks.
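
    A minimal sketch of the self-modelling idea (parameter values and the weight-blending scheme are assumptions, not the paper's exact model): the network repeatedly relaxes to an attractor, then applies a weak Hebbian update on that attractor, and the energy of the attractors it finds, measured against the original constraints, tends to improve:

```python
# Sketch: a 'self-modelling' Hopfield network. Fixed weights W encode the
# problem constraints; slowly learned weights L reshape the dynamics so
# that relaxation finds better (lower-energy) resolutions of W over time.
import numpy as np

rng = np.random.default_rng(0)
N = 60
W = rng.choice([-1.0, 1.0], size=(N, N))     # random symmetric constraints
W = (W + W.T) / 2
np.fill_diagonal(W, 0.0)
L = np.zeros_like(W)                         # learned associative weights
lam, eps = 0.5, 0.01                         # blend and learning rate (assumed)

def relax(s, steps=30 * N):
    A = W + lam * L                          # dynamics use blended weights
    for _ in range(steps):
        i = rng.integers(N)
        s[i] = 1.0 if A[i] @ s > 0 else -1.0
    return s

for epoch in range(201):
    s = relax(rng.choice([-1.0, 1.0], size=N))  # relax from a random state
    L += eps * np.outer(s, s)                # Hebbian update on the attractor
    np.fill_diagonal(L, 0.0)
    if epoch % 50 == 0:                      # energy w.r.t. ORIGINAL constraints
        print(epoch, -0.5 * s @ W @ s)       # typically decreases over epochs
```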

    Dreaming neural networks: forgetting spurious memories and reinforcing pure ones

    The standard Hopfield model for associative neural networks accounts for biological Hebbian learning and acts as the harmonic oscillator of pattern recognition; however, its maximal storage capacity is $\alpha \sim 0.14$, far from the theoretical bound for symmetric networks, i.e. $\alpha = 1$. Inspired by the sleeping and dreaming mechanisms of mammalian brains, we propose an extension of this model displaying the standard on-line (awake) learning mechanism (which allows the storage of external information in terms of patterns) and an off-line (sleep) unlearning-and-consolidating mechanism (which allows spurious-pattern removal and pure-pattern reinforcement): the resulting daily prescription saturates the theoretical bound $\alpha = 1$ while remaining extremely robust against thermal noise. Neural and synaptic features are analyzed both analytically and numerically. In particular, beyond obtaining a phase diagram for the neural dynamics, we focus on synaptic plasticity and give explicit prescriptions for the temporal evolution of the synaptic matrix. We analytically prove that our algorithm makes the Hebbian kernel converge with high probability to the projection matrix built over the pure stored patterns, and we obtain a sharp, explicit estimate of the "sleep rate" needed to ensure such convergence. Finally, we run extensive numerical simulations (mainly Monte Carlo sampling) to check the approximations underlying the analytical investigations (e.g., the whole theory is developed at the replica-symmetric level, as is standard in the Amit-Gutfreund-Sompolinsky framework) and possible finite-size effects, finding overall full agreement with the theory. (Comment: 31 pages, 12 figures.)
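
    The sleep algorithm itself is not reproduced here, but its proven endpoint is easy to check numerically: a comparison (sizes assumed) of the standard Hebbian kernel against the projection kernel that the paper shows the synaptic matrix converges to, at a load $\alpha = 0.8$ well beyond the Hopfield limit:

```python
# Sketch: pattern stability under the Hebbian kernel vs. the projection
# kernel (the limit the 'dreaming' dynamics converge to). At alpha = 0.8
# the Hebbian network stores nothing; the projection network stores all.
import numpy as np

rng = np.random.default_rng(1)
N, P = 100, 80                               # alpha = P/N = 0.8 (assumed sizes)
xi = rng.choice([-1.0, 1.0], size=(P, N))    # stored patterns

J_hebb = xi.T @ xi / N                       # Hebbian kernel
C = xi @ xi.T / N                            # pattern correlation matrix
J_proj = xi.T @ np.linalg.inv(C) @ xi / N    # projection kernel

for name, J in (("Hebb", J_hebb), ("projection", J_proj)):
    Jz = J - np.diag(np.diag(J))             # zero self-couplings
    stable = sum(np.array_equal(np.sign(Jz @ p), p) for p in xi)
    print(name, "stable patterns:", stable, "/", P)
```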

    Genetic Algorithm for Restricted Maximum k-Satisfiability in the Hopfield Network

    The restricted Maximum k-Satisfiability (MAX-kSAT) problem is an enhanced Boolean satisfiability counterpart that has attracted a substantial amount of research. The genetic algorithm has been a prominent heuristic for solving constraint optimization problems. The core motivation of this paper is to introduce a Hopfield network incorporating a genetic algorithm for solving the MAX-kSAT problem: the genetic algorithm is integrated with the Hopfield network as a single network. The proposed method is compared with the conventional Hopfield network. The results demonstrate that the Hopfield network with a genetic algorithm outperforms the conventional Hopfield network. Furthermore, the outcome provides solid evidence of the robustness of our proposed algorithm for use in other satisfiability problems.
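
    A toy sketch of the optimization target (clause set, problem sizes, and GA settings are all assumptions, and a plain genetic algorithm stands in for the paper's Hopfield/GA integration):

```python
# Sketch: a bare genetic algorithm maximizing the number of satisfied
# clauses in a random MAX-3SAT instance (literal v > 0 means variable v
# is true; v < 0 means it is negated).
import random

n_vars, k, n_clauses = 20, 3, 90
clauses = [[random.choice([1, -1]) * v
            for v in random.sample(range(1, n_vars + 1), k)]
           for _ in range(n_clauses)]

def fitness(assign):
    # number of clauses with at least one satisfied literal
    return sum(any((lit > 0) == assign[abs(lit) - 1] for lit in clause)
               for clause in clauses)

pop = [[random.random() < 0.5 for _ in range(n_vars)] for _ in range(60)]
for gen in range(200):
    pop.sort(key=fitness, reverse=True)
    if fitness(pop[0]) == n_clauses:
        break
    parents, children = pop[:20], []
    while len(children) < 40:
        a, b = random.sample(parents, 2)
        cut = random.randrange(1, n_vars)        # one-point crossover
        child = a[:cut] + b[cut:]
        i = random.randrange(n_vars)             # one-bit mutation
        child[i] = not child[i]
        children.append(child)
    pop = parents + children

best = max(pop, key=fitness)
print("satisfied clauses:", fitness(best), "/", n_clauses)
```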

    A ferrofluid based neural network: design of an analogue associative memory

    We analyse an associative memory based on a ferrofluid, consisting of a system of magnetic nano-particles suspended in a carrier fluid of variable viscosity and subject to patterns of magnetic fields from an array of input and output magnetic pads. The association relies on forming patterns in the ferrofluid during a training phase, in which the magnetic dipoles are free to move and rotate to minimize the total energy of the system. Once equilibrated in energy for a given input-output magnetic field pattern pair, the particles are fully or partially immobilized by cooling the carrier liquid. The particle distributions thus produced control the memory states, which are read out magnetically using spin-valve sensors incorporated in the output pads. The actual memory consists of spin distributions that are dynamic in nature, realized only in response to the input patterns the system has been trained for. Two training algorithms for storing multiple patterns are investigated. Using Monte Carlo simulations of the physical system, we demonstrate that the device is capable of storing and recalling two sets of images, each with an accuracy approaching 100%. (Comment: submitted to Neural Networks.)
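
    A schematic sketch only (geometry, fields, and temperature are assumed; the real system is three-dimensional and includes particle translation as well as rotation): a Metropolis Monte Carlo relaxation of planar dipole orientations toward a pattern of local training fields, loosely mirroring the free training phase before cooling immobilizes the particles:

```python
# Sketch: planar magnetic dipoles relaxing in local training fields via
# Metropolis Monte Carlo, a cartoon of the ferrofluid training phase.
import numpy as np

rng = np.random.default_rng(2)
M = 100                                      # number of dipoles (assumed)
theta = rng.uniform(0, 2 * np.pi, M)         # dipole orientations
h = rng.uniform(0, 2 * np.pi, M)             # local field angles from the pads
beta = 5.0                                   # inverse temperature (warm fluid)

def mc_sweep(theta):
    for i in rng.permutation(M):
        new = theta[i] + rng.normal(0.0, 0.3)              # proposed rotation
        dE = np.cos(theta[i] - h[i]) - np.cos(new - h[i])  # Zeeman energy change
        if dE < 0 or rng.random() < np.exp(-beta * dE):    # Metropolis rule
            theta[i] = new
    return theta

for _ in range(200):
    theta = mc_sweep(theta)
print("mean alignment with training field:", np.mean(np.cos(theta - h)))
```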