Abstract. In most artificial neural networks, learning suffers from sudden and total forgetting of previously learned information when newly arriving information must be learned. The very property that gives these networks their ability to generalize also causes a phenomenon known as catastrophic forgetting. In this article we examine a biologically plausible solution that prevents this undesirable effect while retaining an abstract of past knowledge that attenuates with time. The mechanism, derived from a recently proposed connectionist model, is known as the reverberating self-refreshing mechanism. It appears capable of overcoming the so-called stability-plasticity dilemma.
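The core idea behind self-refreshing is pseudo-rehearsal: before learning new information, the trained network is probed with random inputs, and its own responses are recorded as "pseudo-patterns" that summarize old knowledge; these are then interleaved with the new training data. The following is a minimal sketch of that idea on a toy linear model with synthetic data (all names and data here are illustrative assumptions, not the paper's actual model), contrasting naive sequential learning with pseudo-rehearsal:

```python
import numpy as np

rng = np.random.default_rng(0)

def train(W, X, Y, lr=0.1, epochs=200):
    # Gradient descent on mean-squared error for a linear map: outputs = X @ W
    for _ in range(epochs):
        grad = X.T @ (X @ W - Y) / len(X)
        W = W - lr * grad
    return W

def mse(W, X, Y):
    return float(np.mean((X @ W - Y) ** 2))

# Two toy tasks: random linear input-output mappings (hypothetical data)
d_in, d_out = 8, 4
XA = rng.normal(size=(64, d_in)); YA = XA @ rng.normal(size=(d_in, d_out))
XB = rng.normal(size=(64, d_in)); YB = XB @ rng.normal(size=(d_in, d_out))

W = train(np.zeros((d_in, d_out)), XA, YA)   # learn task A first

# Naive sequential learning: training on task B alone overwrites task A
W_naive = train(W.copy(), XB, YB)

# Pseudo-rehearsal: probe the trained network with random inputs and keep
# its own outputs as pseudo-patterns standing in for the lost task-A data
X_pseudo = rng.normal(size=(64, d_in))
Y_pseudo = X_pseudo @ W                      # network's responses, not real A data
W_rehearsed = train(W.copy(),
                    np.vstack([XB, X_pseudo]),
                    np.vstack([YB, Y_pseudo]))

# Retention of task A: pseudo-rehearsal keeps error on A far lower than
# naive sequential training does
print("task A error, naive:     ", mse(W_naive, XA, YA))
print("task A error, rehearsed: ", mse(W_rehearsed, XA, YA))
```

In this sketch the pseudo-patterns act as an attenuating abstract of past knowledge: they are not the original task-A examples, only the network's current approximation of them, so what is preserved degrades gracefully rather than vanishing outright.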