Dense Associative Memory for Pattern Recognition
A model of associative memory is studied, which stores and reliably retrieves
many more patterns than the number of neurons in the network. We propose a
simple duality between this dense associative memory and neural networks
commonly used in deep learning. On the associative memory side of this duality,
a family of models that smoothly interpolates between two limiting cases can be
constructed. One limit is referred to as the feature-matching mode of pattern
recognition, and the other one as the prototype regime. On the deep learning
side of the duality, this family corresponds to feedforward neural networks
with one hidden layer and various activation functions, which transmit the
activities of the visible neurons to the hidden layer. This family of
activation functions includes the logistic function, rectified linear units, and rectified
polynomials of higher degrees. The proposed duality makes it possible to apply
energy-based intuition from associative memory to analyze computational
properties of neural networks with unusual activation functions - the higher
rectified polynomials which until now have not been used in deep learning. The
utility of the dense memories is illustrated for two test cases: the logical
gate XOR and the recognition of handwritten digits from the MNIST data set.
Comment: Accepted for publication at NIPS 201
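The retrieval dynamics described in this abstract can be illustrated with a minimal sketch (the function name, toy patterns, and parameter choices below are illustrative assumptions, not taken from the paper): greedy asynchronous descent on an energy of the form E(sigma) = -sum_mu F(xi_mu . sigma), where F is a rectified polynomial.

```python
import numpy as np

def dense_memory_retrieve(patterns, state, n=3, max_sweeps=10):
    """Greedy asynchronous descent on E(sigma) = -sum_mu F(xi_mu . sigma),
    with the rectified polynomial F(x) = max(x, 0)**n.  Sharper F (larger n)
    is what allows storing many more patterns than neurons."""
    F = lambda x: np.maximum(x, 0.0) ** n
    energy = lambda s: -F(patterns @ s).sum()
    sigma = state.astype(float).copy()
    for _ in range(max_sweeps):
        changed = False
        for i in range(sigma.size):
            e_before = energy(sigma)
            sigma[i] = -sigma[i]           # tentatively flip spin i
            if energy(sigma) < e_before:   # keep the flip only if energy drops
                changed = True
            else:
                sigma[i] = -sigma[i]       # otherwise revert
        if not changed:                    # fixed point reached
            break
    return sigma.astype(int)

# Toy memory: three mutually orthogonal +/-1 patterns on 8 neurons.
patterns = np.array([[ 1,  1,  1,  1, -1, -1, -1, -1],
                     [ 1, -1,  1, -1,  1, -1,  1, -1],
                     [ 1,  1, -1, -1,  1,  1, -1, -1]], dtype=float)
corrupted = patterns[0].copy()
corrupted[0] *= -1                         # flip one bit of the first pattern
recovered = dense_memory_retrieve(patterns, corrupted)
```

Because each accepted flip strictly lowers the energy, the dynamics converge to a local minimum; here the corrupted state sits in the basin of the first stored pattern and is restored exactly.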
You can go your own way: effectiveness of participant-driven versus experimenter-driven processing strategies in memory training and transfer
Cognitive training programs that instruct specific strategies frequently
show limited transfer. Open-ended approaches can achieve greater transfer, but may fail to benefit many older adults due to age deficits in self-initiated processing. We examined whether a compromise that encourages effort at encoding without an experimenter-prescribed strategy might yield better results. Older adults completed memory training under conditions that either (1) mandated a specific strategy to increase deep, associative encoding, (2) attempted to suppress such encoding by mandating rote rehearsal, or (3) encouraged time and effort toward encoding but allowed for strategy choice. The experimenter-enforced associative encoding strategy succeeded in creating integrated representations of studied items, but training-task progress was related to pre-existing ability. Independent of condition assignment, self-reported deep encoding was associated with positive training and transfer effects, suggesting that the most beneficial outcomes occur when environmental support guiding effort is provided but participants generate their own strategies.
Quantum pattern recognition with liquid-state nuclear magnetic resonance
A novel quantum pattern recognition scheme is presented, which combines the
idea of a classic Hopfield neural network with adiabatic quantum computation.
Both the input and the memorized patterns are represented by means of the
problem Hamiltonian. In contrast to classic neural networks, the algorithm can
return a quantum superposition of multiple recognized patterns. A proof of
principle for the algorithm for two qubits is provided using a liquid state NMR
quantum computer.
Comment: updated version, Journal-ref added
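The scheme can be loosely illustrated with a classical numerical simulation (this is a sketch, not the paper's NMR implementation; the two stored patterns, the Hebbian coupling, and the linear annealing schedule are assumptions): evolve two qubits under H(s) = (1-s)H_B + s*H_P and observe that the final state is a superposition of both memorized patterns.

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# Hebbian problem Hamiltonian for two stored patterns (+1,+1) and (-1,-1):
# the coupling w12 = 1 gives H_P = -Z (x) Z, whose ground space is {|00>, |11>}.
H_problem = -np.kron(Z, Z)
# Transverse-field beginning Hamiltonian; its ground state is |++>.
H_begin = -(np.kron(X, I2) + np.kron(I2, X))

def evolve(H, psi, dt):
    """Apply exp(-i H dt) to psi via exact diagonalization of Hermitian H."""
    w, V = np.linalg.eigh(H)
    return V @ (np.exp(-1j * w * dt) * (V.conj().T @ psi))

psi = np.full(4, 0.5, dtype=complex)   # start in |++>, the ground state of H_begin
T, steps = 20.0, 400                   # total anneal time and discretization
dt = T / steps
for k in range(steps):
    s = (k + 0.5) / steps              # linear schedule s: 0 -> 1 (midpoint rule)
    psi = evolve((1 - s) * H_begin + s * H_problem, psi, dt)

probs = np.abs(psi) ** 2               # basis order: |00>, |01>, |10>, |11>
```

With a slow enough schedule the adiabatic theorem keeps the state in the instantaneous ground space, so nearly all probability ends on |00> and |11>, split equally by symmetry; this is the "quantum superposition of multiple recognized patterns" the abstract refers to.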