IGMN: An incremental connectionist approach for concept formation, reinforcement learning and robotics

Abstract

This paper demonstrates the use of a new connectionist approach, called IGMN (Incremental Gaussian Mixture Network), in some state-of-the-art research problems such as incremental concept formation, reinforcement learning and robotic mapping. IGMN is inspired by recent theories about the brain, especially the Memory-Prediction Framework and Constructivist Artificial Intelligence, which endow it with special features that are not present in most neural network models such as MLP, RBF and GRNN. Moreover, IGMN is based on strong statistical principles (Gaussian mixture models) and asymptotically converges to the optimal regression surface as more training data arrive. Through several experiments with the proposed model, it is also demonstrated that IGMN learns incrementally from data flows (each data point can be used immediately and then discarded), is not sensitive to initialization conditions, does not require fine-tuning of its configuration parameters and has good computational performance, thus allowing its use in real-time control applications. Therefore, IGMN is a very useful machine learning tool for concept formation and robotic tasks.

Key words: Artificial neural networks, Bayesian methods, concept formation, incremental learning, Gaussian mixture models, autonomous robots, reinforcement learning
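To illustrate the kind of incremental Gaussian mixture learning the abstract refers to (each sample assimilated once and then discarded, with components created on the fly), the following is a minimal sketch of an online Gaussian mixture update. It is not the authors' exact IGMN algorithm; the novelty threshold `tau`, the initial covariance `sigma_init` and the learning-rate rule are simplifying assumptions for illustration only.

```python
import numpy as np
from scipy.stats import multivariate_normal


class IncrementalGMM:
    """Illustrative online Gaussian mixture: each sample is used once
    to update sufficient statistics and then discarded (no stored dataset)."""

    def __init__(self, dim, tau=0.01, sigma_init=1.0):
        self.dim = dim
        self.tau = tau                # novelty threshold (assumed value)
        self.sigma_init = sigma_init  # initial isotropic covariance (assumed)
        self.means, self.covs, self.counts = [], [], []

    def _create(self, x):
        # Spawn a new component centered on the unexplained sample.
        self.means.append(x.copy())
        self.covs.append(np.eye(self.dim) * self.sigma_init)
        self.counts.append(1.0)

    def update(self, x):
        x = np.asarray(x, dtype=float)
        if not self.means:
            self._create(x)
            return
        likes = np.array([multivariate_normal.pdf(x, m, c)
                          for m, c in zip(self.means, self.covs)])
        if likes.max() < self.tau:    # no component explains x: create one
            self._create(x)
            return
        priors = np.array(self.counts) / sum(self.counts)
        post = priors * likes
        post /= post.sum()            # responsibilities p(component | x)
        for j, w in enumerate(post):
            self.counts[j] += w
            lr = w / self.counts[j]   # per-component learning rate
            d = x - self.means[j]
            self.means[j] += lr * d
            self.covs[j] += lr * (np.outer(d, d) - self.covs[j])


# Usage: feed one sample at a time from a data stream; regression would
# then be obtained by conditioning the joint mixture on the input variables.
gmm = IncrementalGMM(dim=3)
for sample in np.random.randn(200, 3):
    gmm.update(sample)
```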
