Adiabatic Quantum Optimization for Associative Memory Recall
Hopfield networks are a variant of associative memory that recall information
stored in the couplings of an Ising model. Stored memories are fixed points of
the network dynamics that correspond to energy minima over spin states. We
formulate the recall of memories stored in a Hopfield network using energy
minimization by adiabatic quantum optimization (AQO). Numerical simulations of
the quantum dynamics allow us to quantify the AQO recall accuracy with respect
to the number of stored memories and the noise in the input key. We also
investigate AQO performance with respect to how memories are stored in the
Ising model using different learning rules. Our results indicate that AQO
performance varies strongly with the learning rule, owing to changes in the energy
landscape. Consequently, learning rules offer indirect methods for
investigating changes to the computational complexity of the recall task and the
computational efficiency of AQO.
Comment: 22 pages, 11 figures. Updated for clarity and figures; to appear in
Frontiers of Physics
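A minimal classical sketch of the setup described above, assuming nothing beyond the abstract: memories are written into the Ising couplings with the Hebb learning rule, and recall from a noisy key proceeds by descending the energy E(s) = -1/2 s^T J s. The asynchronous spin updates below stand in for the adiabatic quantum optimization step, which would minimize the same energy; the sizes, noise level, and helper names are illustrative choices, not the paper's.

```python
import numpy as np

# Hedged sketch (not the paper's code): Hebb-rule storage of +/-1 patterns in
# Ising couplings J, followed by classical energy descent from a noisy key.
# AQO would minimize the same Ising energy via an adiabatic anneal.

rng = np.random.default_rng(0)
n, p = 64, 3                                   # spins, stored memories
memories = rng.choice([-1, 1], size=(p, n))    # random bipolar patterns

# Hebb learning rule: J_ij = (1/n) * sum_mu xi_i^mu xi_j^mu, zero diagonal
J = memories.T @ memories / n
np.fill_diagonal(J, 0.0)

def energy(s):
    """Ising energy E(s) = -1/2 s^T J s whose minima encode the stored memories."""
    return -0.5 * s @ J @ s

def recall(key, sweeps=20):
    """Asynchronous single-spin updates; each flip never increases the energy."""
    s = key.copy()
    for _ in range(sweeps):
        for i in rng.permutation(n):
            s[i] = 1 if J[i] @ s >= 0 else -1
    return s

# Noisy input key: flip roughly 10% of the spins of the first stored memory
key = memories[0] * np.where(rng.random(n) < 0.1, -1, 1)
out = recall(key)
print("energy:", energy(out), "overlap with stored memory:", out @ memories[0] / n)
```

Swapping the Hebb rule for a different learning rule only changes how J is built, which is exactly the knob the abstract varies when comparing recall performance and energy landscapes.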
A "Cellular Neuronal" Approach to Optimization Problems
The Hopfield-Tank (1985) recurrent neural network architecture for the
Traveling Salesman Problem is generalized to a fully interconnected "cellular"
neural network of regular oscillators. Tours are defined by synchronization
patterns, allowing the simultaneous representation of all cyclic permutations
of a given tour. The network converges to local optima, some of which correspond
to shortest-distance tours, as can be shown analytically in a stationary-phase
approximation. Simulated annealing is required for global optimization, but the
stochastic element might be replaced by chaotic intermittency in a further
generalization of the architecture to a network of chaotic oscillators.
Comment: 2nd revised version submitted to Chaos (original version submitted 6/07)
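For context, here is a hedged sketch of the classical Hopfield-Tank (1985) TSP encoding that this paper generalizes: a soft assignment V[x, i] ("city x occupies tour position i") is scored by constraint penalties plus the tour length. The squared row/column penalties below are a common simplified variant of the original energy terms, and the weights are illustrative assumptions rather than the paper's parameters.

```python
import numpy as np

# Hedged sketch of a Hopfield-Tank-style TSP energy (simplified penalty form,
# illustrative weights). V[x, i] ~ "city x occupies tour position i".

rng = np.random.default_rng(1)
n = 6                                            # number of cities
cities = rng.random((n, 2))
dist = np.linalg.norm(cities[:, None] - cities[None, :], axis=-1)
A = B = 500.0                                    # constraint penalty weights
W = 200.0                                        # tour-length weight

def energy(V):
    rows = ((V.sum(axis=1) - 1) ** 2).sum()      # each city in exactly one slot
    cols = ((V.sum(axis=0) - 1) ** 2).sum()      # each slot holds exactly one city
    Vnext = np.roll(V, -1, axis=1)               # cyclically adjacent tour positions
    tour = np.einsum('xy,xi,yi->', dist, V, Vnext)  # summed leg lengths
    return A * rows + B * cols + W * tour

# A valid tour (permutation matrix) pays only its length; a random soft state
# also pays the constraint penalties.
perm = np.eye(n)[rng.permutation(n)]
soft = rng.random((n, n))
print("valid tour:", energy(perm), " random soft state:", energy(soft))
```

In the paper's generalization, the units V[x, i] are replaced by regular oscillators, so a tour is read out from which units synchronize rather than from which units are switched on.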
Energy-based General Sequential Episodic Memory Networks at the Adiabatic Limit
The General Associative Memory Model (GAMM) has a constant, state-dependent
energy surface that leads the output dynamics to fixed points, retrieving
single memories from a collection of memories that can be asynchronously
preloaded. We introduce a new class of General Sequential Episodic Memory
Models (GSEMM) that, in the adiabatic limit, exhibit a temporally changing energy
surface, leading to a series of meta-stable states that are sequential episodic
memories. The dynamic energy surface is enabled by newly introduced asymmetric
synapses with signal propagation delays in the network's hidden layer. We study
the theoretical and empirical properties of two memory models from the GSEMM
class, differing in their activation functions. LISEM has non-linearities in
the feature layer, whereas DSEM has non-linearities in the hidden layer. In
principle, DSEM has a storage capacity that grows exponentially with the number
of neurons in the network. We introduce a learning rule for the synapses based
on the energy minimization principle and show it can learn single memories and
their sequential relationships online. This rule is similar to the Hebbian
learning algorithm and Spike-Timing Dependent Plasticity (STDP), which describe
conditions under which synapses between neurons change strength. Thus, GSEMM
combines the static and dynamic properties of episodic memory under a single
theoretical framework and bridges neuroscience, machine learning, and
artificial intelligence.
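The sequence-generating mechanism can be illustrated with a much older classical construction (it is not the GSEMM equations): a symmetric Hebbian coupling stabilizes each memory, while an asymmetric coupling acting on a delayed copy of the state pushes the network toward the next memory in the sequence. The delay length, gain lam, and sizes below are assumptions made only for illustration.

```python
import numpy as np

# Hedged illustration of sequential recall via asymmetric, delayed synapses
# (a classical construction, not the GSEMM model itself).

rng = np.random.default_rng(2)
n, p, delay, lam = 200, 4, 8, 2.0
xi = rng.choice([-1, 1], size=(p, n))           # memories xi^1 ... xi^p

J_sym = xi.T @ xi / n                           # Hebbian term: stabilizes each memory
J_asym = xi[1:].T @ xi[:-1] / n                 # asymmetric term: maps xi^mu -> xi^{mu+1}

s = xi[0].copy()                                # start in the first memory
history = [s.copy() for _ in range(delay)]      # buffer holding past states
for t in range(200):
    s_delayed = history[-delay]                 # state from `delay` steps ago
    h = J_sym @ s + lam * (J_asym @ s_delayed)  # local field with delayed drive
    s = np.where(h >= 0, 1, -1)
    history.append(s.copy())
    if t % 40 == 0:
        print(f"t={t:3d} overlaps with memories:", np.round(xi @ s / n, 2))
```

With a sufficiently strong asymmetric gain, the printed overlaps move from one stored pattern to the next, giving a chain of meta-stable states of the kind the abstract attributes to delayed, asymmetric synapses in the hidden layer.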