Contrastive Hebbian Learning with Random Feedback Weights
Neural networks are commonly trained to make predictions through learning
algorithms. Contrastive Hebbian learning, a powerful rule inspired by
gradient backpropagation, is based on Hebb's rule and the contrastive
divergence algorithm. It operates in two phases: a forward (or free) phase,
where the data are fed to the network, and a backward (or clamped) phase,
where the target signals are clamped to the output layer of the network and
the feedback signals are transformed through the transposed synaptic weight
matrices. This implies symmetry at the synaptic level, for which there is no
evidence in the brain. In this work, we propose a new variant of the algorithm,
called random contrastive Hebbian learning, which does not rely on any
synaptic weight symmetry. Instead, it uses random matrices to transform the
feedback signals during the clamped phase, and the neural dynamics are
described by first-order non-linear differential equations. The algorithm is
experimentally
verified by solving a Boolean logic task, classification tasks (handwritten
digits and letters), and an autoencoding task. This article also shows how
the parameters, especially the random matrices, affect learning. We use
pseudospectral analysis to investigate further how the random matrices impact
the learning process. Finally, we discuss the biological plausibility of the
proposed algorithm and how it can give rise to better computational models of
learning.
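To make the two-phase procedure concrete, here is a minimal NumPy sketch of a random contrastive Hebbian update, assuming a single hidden layer, tanh activations, Euler-integrated first-order dynamics, and feedback acting only in the clamped phase; the names (relax, rchl_step, G, eta) and all hyperparameters are our illustrative assumptions, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 2, 16, 1               # sized for a toy Boolean task

W1 = rng.normal(0.0, 0.5, (n_hid, n_in))    # input -> hidden weights
W2 = rng.normal(0.0, 0.5, (n_out, n_hid))   # hidden -> output weights
G = rng.normal(0.0, 0.5, (n_hid, n_out))    # fixed random feedback matrix,
                                            # used in place of W2.T

def relax(x, clamp=None, dt=0.1, n_steps=100):
    """Euler-integrate the first-order dynamics dh/dt = -h + W1 @ x + fb."""
    h = np.zeros(n_hid)
    for _ in range(n_steps):
        if clamp is not None:
            # Clamped phase: the target is imposed on the output layer and
            # propagated back through the random matrix G, so no symmetry
            # with W2 is ever required.
            fb = G @ clamp
        else:
            fb = np.zeros(n_hid)            # free phase: feedforward only here
        h += dt * (-h + W1 @ x + fb)
    r = np.tanh(h)
    return r, np.tanh(W2 @ r)

def rchl_step(x, t, eta=0.1):
    """One contrastive update: clamped minus free-phase Hebbian products."""
    global W1, W2
    h_free, y_free = relax(x)               # free (forward) phase
    h_clamp, _ = relax(x, clamp=t)          # clamped (backward) phase
    W1 += eta * (np.outer(h_clamp, x) - np.outer(h_free, x))
    W2 += eta * (np.outer(t, h_clamp) - np.outer(y_free, h_free))

# Toy usage: the XOR truth table, standing in for the Boolean logic task
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
T = np.array([[0.], [1.], [1.], [0.]])
for _ in range(500):
    for x, t in zip(X, T):
        rchl_step(x, t)
```

The essential point is that G is drawn once and never updated, so the clamped-phase feedback path needs no weight symmetry with the forward matrix W2.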
Complex Networks and Symmetry I: A Review
In this review we establish various connections between complex networks and
symmetry. While special types of symmetries (e.g., automorphisms) are studied
in detail within discrete mathematics for particular classes of deterministic
graphs, the analysis of more general symmetries in real complex networks is far
less developed. We argue that real networks, like any entity characterized by
imperfections or errors, necessarily require a stochastic notion of invariance.
We therefore propose a definition of stochastic symmetry based on graph
ensembles and use it to review the main results of network theory from an
unusual perspective. The results discussed here and in a companion paper show
that stochastic symmetry highlights the most informative topological properties
of real networks, even in noisy situations inaccessible to exact techniques.
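As a toy illustration of why exact symmetry needs a stochastic relaxation, the sketch below counts the automorphisms of a graph before and after a single random edge deletion, assuming the networkx library is available; the experiment and names are ours, not the review's:

```python
import random

import networkx as nx
from networkx.algorithms.isomorphism import GraphMatcher

def n_automorphisms(G):
    """Count exact automorphisms, i.e., isomorphisms of G onto itself."""
    return sum(1 for _ in GraphMatcher(G, G).isomorphisms_iter())

random.seed(0)
G = nx.cycle_graph(8)          # the 8-cycle has 16 automorphisms (dihedral)
print(n_automorphisms(G))      # -> 16

# A single random "imperfection" collapses the symmetry group: the damaged
# graph is a path, whose only automorphisms are the identity and a reflection.
H = G.copy()
H.remove_edge(*random.choice(sorted(H.edges())))
print(n_automorphisms(H))      # -> 2
```

One missing edge already destroys nearly all exact automorphisms, which is why an invariance notion defined over graph ensembles, rather than over individual graphs, better suits real, noisy networks.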
One-Component Order Parameter in URu₂Si₂ Uncovered by Resonant Ultrasound Spectroscopy and Machine Learning
The unusual correlated state that emerges in URu₂Si₂ below T_HO =
17.5 K is known as "hidden order" because even basic characteristics of the
order parameter, such as its dimensionality (whether it has one component or
two), are "hidden". We use resonant ultrasound spectroscopy to measure the
symmetry-resolved elastic anomalies across T_HO. We observe no anomalies in
the shear elastic moduli, providing strong thermodynamic evidence for a
one-component order parameter. We develop a machine learning framework that
reaches this conclusion directly from the raw data, even in a crystal that is
too small for traditional resonant ultrasound. Our result rules out a broad
class of theories of hidden order based on two-component order parameters, and
constrains the nature of the fluctuations from which unconventional
superconductivity emerges at lower temperature. Our machine learning framework
is a powerful new tool for classifying the ubiquitous competing orders in
correlated electron systems.
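To give a flavor of how such a classifier might operate, here is a minimal scikit-learn sketch that learns to separate one-component from two-component behavior, assuming entirely synthetic resonance-frequency features; the data generator, the labels, and the use of logistic regression are our illustrative assumptions, not the paper's actual framework:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

def synthetic_shifts(two_component, n_modes=30, n_shear=10):
    """Toy stand-in for symmetry-resolved elastic responses near T_HO.

    The first n_shear features play the role of shear channels; only a
    two-component order parameter produces anomalies there. Illustrative only.
    """
    shifts = rng.normal(0.0, 0.3, n_modes)   # measurement noise on all modes
    if two_component:
        shifts[:n_shear] += 1.0              # shear-channel anomaly
    return shifts

X = np.array([synthetic_shifts(two_component=bool(k % 2)) for k in range(400)])
y = np.arange(400) % 2                       # 0 = one component, 1 = two

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```

The sketch mirrors the abstract's logic, that shear-channel anomalies discriminate between the two hypotheses, but compresses the paper's raw-spectrum framework into a toy linear classifier.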