Activation Learning by Local Competitions
Despite its great success, backpropagation has certain limitations that
necessitate the investigation of new learning methods. In this study, we
present a biologically plausible local learning rule that improves upon Hebb's
well-known proposal and discovers unsupervised features through local
competition among neurons. This simple learning rule enables a forward
learning paradigm called activation learning, in which the output activation
of the neural network (the sum of its squared outputs) estimates the
likelihood of the input patterns, or, put simply, "learn more, activate more."
For classification on a few small classical datasets, activation learning with
a fully connected network performs comparably to backpropagation, and it
outperforms backpropagation when training samples are scarce or inputs are
subject to unpredictable disturbances. Moreover, the same trained network can
serve a variety of tasks, including image generation and image completion.
Activation learning also achieves state-of-the-art performance in anomaly
detection on several real-world datasets.
This new learning paradigm, which has the potential to unify supervised,
unsupervised, and semi-supervised learning and is relatively more resistant to
adversarial attacks, deserves in-depth investigation.
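As a rough, non-authoritative sketch of the two ideas named above, the Python (NumPy) snippet below pairs a standard winner-take-all competitive Hebbian update (a stand-in for the paper's actual rule, which is not reproduced here) with the "learn more, activate more" activation measure, i.e., the sum of squared outputs. The label-appending classification scheme, the function names, and all parameter values are illustrative assumptions, not the authors' implementation.

import numpy as np

rng = np.random.default_rng(0)

def competitive_hebbian_step(W, x, lr=0.01):
    # One purely local update: neurons compete, and only the most
    # responsive neuron (the winner) moves its weights toward the
    # input pattern. No global error signal is needed.
    y = W @ x                          # local responses
    winner = np.argmax(y)              # competition among neurons
    W[winner] += lr * (x - W[winner])  # Hebbian move toward the input
    return W

def activation(W, x):
    # "Learn more, activate more": the sum of squared outputs is used
    # as a proxy for how likely (familiar) the input pattern is.
    return float(np.sum((W @ x) ** 2))

def classify(W, x, n_classes):
    # Hypothetical inference scheme: append each candidate one-hot
    # label to the input and return the label whose joint pattern
    # activates the trained network the most.
    scores = [activation(W, np.concatenate([x, np.eye(n_classes)[c]]))
              for c in range(n_classes)]
    return int(np.argmax(scores))

# Toy run: 16 neurons over 8-dim inputs joined with 2-class one-hot labels.
W = rng.normal(scale=0.1, size=(16, 8 + 2))
for _ in range(2000):
    x = rng.normal(size=8)
    label = np.eye(2)[rng.integers(2)]
    W = competitive_hebbian_step(W, np.concatenate([x, label]))
print(classify(W, rng.normal(size=8), n_classes=2))

Under the same activation measure, anomaly detection would amount to flagging inputs whose activation falls below a threshold fitted on normal data.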