Inference and learning in sparse systems with multiple states
We discuss how inference can be performed when data are sampled from the
non-ergodic phase of systems with multiple attractors. We take as model system
the finite connectivity Hopfield model in the memory phase and suggest a cavity
method approach to reconstruct the couplings when the data are separately
sampled from a few attractor states. We also show how the inference results can
be converted into a learning protocol for neural networks in which patterns are
presented through weak external fields. The protocol is simple and fully local,
and is able to store patterns with a finite overlap with the input patterns
without ever reaching a spin glass phase where all memories are lost.
Comment: 15 pages, 10 figures, to be published in Phys. Rev.
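The storage setup whose couplings are being inferred can be sketched with a standard Hebbian Hopfield network. This is a minimal illustration, not the paper's cavity-method reconstruction; the network size, number of patterns, and noise level below are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 100, 3  # neurons and stored patterns (illustrative sizes)
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian couplings J_ij = (1/N) sum_mu xi_i^mu xi_j^mu, zero self-coupling
J = patterns.T @ patterns / N
np.fill_diagonal(J, 0.0)

def recall(s, steps=20):
    """Zero-temperature asynchronous dynamics: s_i <- sign(sum_j J_ij s_j)."""
    s = s.copy()
    for _ in range(steps):
        for i in rng.permutation(N):
            s[i] = 1 if J[i] @ s >= 0 else -1
    return s

# Flip ~10% of the spins of a stored pattern; the dynamics should flow back
# to the corresponding attractor, restoring a high overlap.
noisy = patterns[0] * rng.choice([1, -1], size=N, p=[0.9, 0.1])
fixed = recall(noisy)
overlap = fixed @ patterns[0] / N
print(overlap)
```

At this low load (P/N = 0.03) the stored patterns are deep attractors, which is the regime in which sampling configurations from separate attractor states is well defined.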
A "Cellular Neuronal" Approach to Optimization Problems
The Hopfield-Tank (1985) recurrent neural network architecture for the
Traveling Salesman Problem is generalized to a fully interconnected "cellular"
neural network of regular oscillators. Tours are defined by synchronization
patterns, allowing the simultaneous representation of all cyclic permutations
of a given tour. The network converges to local optima, some of which correspond
to shortest-distance tours, as can be shown analytically in a stationary phase
approximation. Simulated annealing is required for global optimization, but the
stochastic element might be replaced by chaotic intermittency in a further
generalization of the architecture to a network of chaotic oscillators.
Comment: 2nd revised version submitted to Chaos (original version submitted
6/07)
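The classical Hopfield-Tank energy that this architecture generalizes can be sketched as follows. The penalty weights A, B, D and the city count are illustrative; the point of the check is that for any valid permutation matrix the constraint terms vanish and the energy reduces to the tour length, which is identical for all cyclic shifts of the tour.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5  # number of cities (illustrative)
xy = rng.random((n, 2))
d = np.linalg.norm(xy[:, None] - xy[None, :], axis=-1)  # distance matrix

def tank_energy(V, A=1.0, B=1.0, D=1.0):
    """Hopfield-Tank-style energy. V[x, i] = 1 if city x is visited at step i.
    The two constraint terms are zero exactly when V is a permutation matrix."""
    rows = ((V.sum(axis=1) - 1) ** 2).sum()  # each city visited once
    cols = ((V.sum(axis=0) - 1) ** 2).sum()  # one city per tour position
    # Tour-length term over consecutive (cyclic) positions
    length = sum(d[x, y] * V[x, i] * V[y, (i + 1) % n]
                 for x in range(n) for y in range(n) for i in range(n))
    return A * rows + B * cols + D * length

perm = rng.permutation(n)
V = np.zeros((n, n))
V[perm, np.arange(n)] = 1.0
E = tank_energy(V)
tour_len = sum(d[perm[i], perm[(i + 1) % n]] for i in range(n))
print(E, tour_len)
```

For a feasible tour, E equals the tour length exactly, so gradient descent on E trades off constraint satisfaction against distance only for infeasible states.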
Psychophysical identity and free energy
An approach to implementing variational Bayesian inference in biological
systems is considered, under which the thermodynamic free energy of a system
directly encodes its variational free energy. In the case of the brain, this
assumption places constraints on the neuronal encoding of generative and
recognition densities, in particular requiring a stochastic population code.
The resulting relationship between thermodynamic and variational free energies
is prefigured in mind-brain identity theses in philosophy and in the Gestalt
hypothesis of psychophysical isomorphism.
Comment: 22 pages; published as a research article on 8/5/2020 in Journal of
the Royal Society Interface
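The variational free energy invoked here can be illustrated on a toy two-state model. The numbers below are invented; the example only demonstrates the generic identity that F[q] = E_q[log q(z) - log p(x, z)] upper-bounds -log p(x), with equality when q is the exact posterior.

```python
import numpy as np

# Two hidden states z in {0, 1} and one observation x (toy numbers)
prior = np.array([0.5, 0.5])
lik = np.array([0.8, 0.3])        # p(x | z)
joint = prior * lik               # p(x, z)
evidence = joint.sum()            # p(x)
posterior = joint / evidence

def free_energy(q):
    """Variational free energy F[q] = E_q[log q(z) - log p(x, z)]."""
    return float(np.sum(q * (np.log(q) - np.log(joint))))

F_post = free_energy(posterior)              # attains the minimum, -log p(x)
F_other = free_energy(np.array([0.5, 0.5]))  # any other q scores higher
print(F_post, -np.log(evidence), F_other)
```

Minimizing F over q therefore performs approximate Bayesian inference, which is the quantity the abstract proposes is directly encoded thermodynamically.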
Adiabatic Quantum Optimization for Associative Memory Recall
Hopfield networks are a variant of associative memory that recall information
stored in the couplings of an Ising model. Stored memories are fixed points for
the network dynamics that correspond to energetic minima of the spin state. We
formulate the recall of memories stored in a Hopfield network using energy
minimization by adiabatic quantum optimization (AQO). Numerical simulations of
the quantum dynamics allow us to quantify the AQO recall accuracy with respect
to the number of stored memories and the noise in the input key. We also
investigate AQO performance with respect to how memories are stored in the
Ising model using different learning rules. Our results indicate that AQO
performance varies strongly with learning rule due to the changes in the energy
landscape. Consequently, learning rules offer indirect methods for
investigating change to the computational complexity of the recall task and the
computational efficiency of AQO.
Comment: 22 pages, 11 figures. Updated for clarity and figures, to appear in
Frontiers of Physics
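The dependence of the energy landscape on the learning rule can be sketched classically by comparing Hebbian and pseudo-inverse (projection) couplings. This is not the paper's quantum simulation, only a check of how each rule shapes the Ising landscape; sizes are illustrative. A stored pattern is a stable minimum when every spin is aligned with its local field there.

```python
import numpy as np

rng = np.random.default_rng(2)
N, P = 40, 5  # spins and stored patterns (illustrative)
xi = rng.choice([-1, 1], size=(P, N)).astype(float)

# Hebbian rule: J = (1/N) xi^T xi
J_hebb = xi.T @ xi / N
np.fill_diagonal(J_hebb, 0.0)

# Pseudo-inverse (projection) rule: J = (1/N) xi^T C^{-1} xi,
# where C is the pattern-overlap matrix
C = xi @ xi.T / N
J_proj = xi.T @ np.linalg.inv(C) @ xi / N
np.fill_diagonal(J_proj, 0.0)

def stable_fraction(J):
    """Fraction of (pattern, spin) pairs aligned with their local field."""
    h = xi @ J.T  # h[mu, i] = sum_j J_ij xi_j^mu (J is symmetric)
    return float(np.mean(xi * h > 0))

print(stable_fraction(J_hebb), stable_fraction(J_proj))
```

The projection rule makes every stored pattern an exact fixed point even at loads where Hebbian crosstalk destabilizes some spins, so the two rules present genuinely different landscapes to any minimizer, quantum or classical.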
Algorithms for identification and categorization
The main features of a family of efficient algorithms for recognition and
classification of complex patterns are briefly reviewed. They are inspired by
the observation that fast synaptic noise is essential for some of the
processing of information in the brain.
Comment: 6 pages, 5 figures
Disappearance of Spurious States in Analog Associative Memories
We show that symmetric n-mixture states, when they exist, are almost never
stable in autoassociative networks with threshold-linear units. Only with a
binary coding scheme could we find a limited region of the parameter space in
which either 2-mixtures or 3-mixtures are stable attractors of the dynamics.
Comment: 5 pages, 3 figures, accepted for publication in Phys. Rev.
State-Dependent Computation Using Coupled Recurrent Networks
Although conditional branching between possible behavioral states is a hallmark of intelligent behavior, very little is known about the neuronal mechanisms that support this processing. In a step toward solving this problem, we demonstrate by theoretical analysis and simulation how
networks of richly interconnected neurons, such as those observed in the superficial layers of the neocortex, can embed reliable, robust finite state machines. We show how a multistable neuronal network containing a number of states can be created very simply by coupling two recurrent
networks whose synaptic weights have been configured for soft winner-take-all (sWTA) performance. These two sWTAs have simple, homogeneous, locally recurrent connectivity except for a small fraction of recurrent cross-connections between them, which are used to embed the required states. This coupling between the maps allows the network to continue to express the current state even after the input that elicited that state is withdrawn. In addition, a small number of transition neurons implement the necessary input-driven transitions between the embedded states. We provide simple rules to systematically design and construct neuronal state machines of this kind. The significance of our finding is that it offers a method whereby the cortex could construct networks supporting a broad range of sophisticated processing by applying only small specializations to the same generic neuronal circuit.
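The state-holding ingredient described above can be sketched with a single rate-based soft-WTA population, a simplification of the two coupled sWTA maps in the paper; the gains and inputs below are illustrative. A transient input selects a winner, the winner persists after the input is withdrawn, and a later strong input drives a transition to a different state.

```python
import numpy as np

n = 3                    # number of competing units / embedded states
alpha, beta = 2.5, 1.0   # self-excitation gain, shared inhibition gain

def run(r, ext, steps=300, dt=0.1):
    """Euler-integrated rate dynamics with saturating activation.
    Each unit receives self-excitation, external input, and inhibition
    proportional to total population activity."""
    for _ in range(steps):
        drive = alpha * r + ext - beta * r.sum()
        r = r + dt * (-r + np.clip(drive, 0.0, 1.0))
    return r

r = np.zeros(n)
r = run(r, np.array([1.5, 0.0, 0.0]))  # transient input selects state 0
r = run(r, np.zeros(n))                # input withdrawn: state is held
state_a = int(np.argmax(r))

r = run(r, np.array([0.0, 1.5, 0.0]))  # transition input drives state 1
r = run(r, np.zeros(n))                # new state persists without input
state_b = int(np.argmax(r))
print(state_a, state_b)
```

Because alpha - beta exceeds the unit gain while the saturating nonlinearity caps activity, the winning unit sustains itself at zero input, giving the bistable memory that the cross-connected transition neurons of the full construction then switch between.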