
Unsupervised Topology Preserving Networks that Learn Sequentially

By George Palamas, George Papadourakis, Manolis Kavoussanos and Andrew Ware

Abstract

In most artificial neural networks, learning suffers from sudden and total forgetting of all previously learned information when newly arrived information is presented for learning. The same property that gives these networks the ability to generalize also causes this phenomenon, known as catastrophic forgetting. In this article we examine a biologically plausible solution that prevents this undesirable effect while retaining an abstraction of past knowledge that attenuates with time. The mechanism, derived from a recently proposed connectionist model, is known as the reverberating self-refreshing mechanism. It appears capable of overcoming the so-called plasticity–elasticity dilemma.
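
The paper itself is not available here, but the pseudorehearsal idea the abstract alludes to can be illustrated with a small sketch: before learning new items, random "pseudo-inputs" are reverberated through the current network and its outputs recorded, and these pseudo-patterns are then interleaved with the new training items so that old knowledge is rehearsed without storing any original data. The network, task data, and all names below are assumptions for illustration, not the authors' implementation.

    import numpy as np

    rng = np.random.default_rng(0)

    def init_net(n_in, n_hidden, n_out):
        # A tiny two-layer network; stands in for whichever model is refreshed.
        return {
            "W1": rng.normal(0, 0.5, (n_in, n_hidden)),
            "W2": rng.normal(0, 0.5, (n_hidden, n_out)),
        }

    def forward(net, X):
        H = np.tanh(X @ net["W1"])
        return H, H @ net["W2"]

    def train(net, X, Y, epochs=500, lr=0.05):
        # Plain gradient descent on squared error.
        for _ in range(epochs):
            H, out = forward(net, X)
            err = out - Y
            net["W2"] -= lr * H.T @ err / len(X)
            dH = (err @ net["W2"].T) * (1 - H**2)
            net["W1"] -= lr * X.T @ dH / len(X)

    def make_pseudo_items(net, n_items, n_in):
        # Reverberate random inputs through the frozen net to sample
        # its current input-output mapping as pseudo-patterns.
        Xp = rng.uniform(-1, 1, (n_items, n_in))
        _, Yp = forward(net, Xp)
        return Xp, Yp

    # Learn task A, then task B interleaved with pseudo-items from A.
    XA = rng.uniform(-1, 1, (50, 4)); YA = np.sin(XA.sum(axis=1, keepdims=True))
    XB = rng.uniform(-1, 1, (50, 4)); YB = np.cos(XB.sum(axis=1, keepdims=True))

    net = init_net(4, 16, 1)
    train(net, XA, YA)
    Xp, Yp = make_pseudo_items(net, 50, 4)   # abstraction of past knowledge
    train(net, np.vstack([XB, Xp]), np.vstack([YB, Yp]))

    _, predA = forward(net, XA)
    print("task-A error after learning task B:", float(np.mean((predA - YA) ** 2)))

Dropping the `make_pseudo_items` step and training on task B alone reproduces catastrophic forgetting: task-A error rises sharply, whereas the rehearsed version retains an attenuated copy of the old mapping.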

Topics: pseudorehearsal, GNG, SOM, catastrophic forgetting
Year: 2009
OAI identifier: oai:CiteSeerX.psu:10.1.1.135.9048
Provided by: CiteSeerX