16 research outputs found
Input significance analysis: feature ranking through synaptic weights manipulation for ANNs-based classifiers
Due to the ANN architecture, the ISA methods selected that can manipulate synaptic weights are Connection Weights (CW) and Garson's Algorithm (GA). The ANN-based classifiers that can support such manipulation are the Multi-Layer Perceptron (MLP) and Evolving Fuzzy Neural Networks (EFuNNs). The goals of this work are, firstly, to identify which of the two classifiers works best with the filtered/ranked data; secondly, to test the FR method using a selected dataset taken from the UCI Machine Learning Repository and executed in an online environment; and lastly, to attest the FR results using another dataset taken from the same source and executed in the same environment. Three groups of experiments were conducted to accomplish these goals. The results are promising when FR is applied: gains in efficiency and accuracy are noticeable compared to the original data.
Keywords: artificial neural networks; input significance analysis; feature selection; feature ranking; connection weights; Garson's algorithm; multi-layer perceptron; evolving fuzzy neural networks
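Both weight-based ISA methods named above are simple enough to sketch. The following minimal illustration (the weight values are invented, and a single-output MLP is assumed) shows how Garson's Algorithm and the Connection Weights method rank inputs directly from a trained network's weight matrices:

```python
import numpy as np

def garson_importance(w_ih, w_ho):
    """Garson's Algorithm: relative importance of each input, computed from the
    absolute values of the input-hidden (w_ih) and hidden-output (w_ho) weights.
    w_ih: (n_inputs, n_hidden); w_ho: (n_hidden,) for a single output."""
    contrib = np.abs(w_ih) * np.abs(w_ho)     # contribution of each input via each hidden node
    contrib = contrib / contrib.sum(axis=0)   # each input's share per hidden node
    importance = contrib.sum(axis=1)
    return importance / importance.sum()      # normalise so importances sum to 1

def connection_weights(w_ih, w_ho):
    """Connection Weights (CW) method: signed sum of input-hidden times
    hidden-output weight products; the sign indicates direction of influence."""
    return (w_ih * w_ho).sum(axis=1)

# Toy 3-input, 2-hidden, 1-output MLP weights (illustrative values only).
w_ih = np.array([[ 0.8, -0.1],
                 [ 0.2,  0.9],
                 [-0.1,  0.3]])
w_ho = np.array([0.7, -0.5])

rank_ga = np.argsort(-garson_importance(w_ih, w_ho))            # most to least important
rank_cw = np.argsort(-np.abs(connection_weights(w_ih, w_ho)))
```

Note that GA discards weight signs while CW preserves them, so the two methods can produce different rankings for the same network, as here.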
Input significance analysis: feature selection through synaptic weights manipulation for EFuNNs classifier
This work is concerned with ISA methods that can manipulate synaptic weights, namely Connection Weights (CW) and Garson's Algorithm (GA); the classifier selected is Evolving Fuzzy Neural Networks (EFuNNs). Firstly, the FS method is tested on a dataset selected from the UCI Machine Learning Repository and executed in an online environment; the results are recorded and compared with the results obtained with the original and ranked data in the previous work. This is to identify whether FS can contribute to improved results and which of the ISA methods mentioned above works well with FS, i.e. gives the best results. Secondly, the FS results are attested using a different dataset taken from the same source and executed in the same environment. The results are promising when FS is applied: gains in efficiency and accuracy are noticeable compared to the original and ranked data.
Keywords: feature selection; feature ranking; input significance analysis; evolving connectionist systems; evolving fuzzy neural network; connection weights; Garson's algorithm
A neuro-fuzzy network model based on radial basis functions capable of inferring Mamdani-type rules
Dissertation (Master's), Universidade Federal de Santa Catarina, Centro Tecnológico, Programa de Pós-Graduação em Ciência da Computação, Florianópolis, 2015.
This work presents a novel neuro-fuzzy inference system, called RBFuzzy, capable of knowledge extraction and generation of highly interpretable Mamdani-type fuzzy rules. RBFuzzy is a four-layer neuro-fuzzy inference system that takes advantage of the functional behavior of Radial Basis Function (RBF) neurons and their relationship with fuzzy inference systems. Inputs are combined in the RBF neurons to compound the antecedents of fuzzy rules. The fuzzy rule consequents are determined by the third-layer neurons, where each neuron represents a Mamdani-type fuzzy output variable in the form of a linguistic term. The last layer weights each fuzzy rule and generates the crisp output. An extension of the ant colony optimization (ACO) algorithm is used to adjust the weights of each rule in order to generate an accurate and interpretable fuzzy rule set. Given such a rule set, an expert can add new rules to incorporate new knowledge into the generated prediction model, and can also correct rules that were generated from imprecise data. For benchmarking purposes, experiments with classic datasets were carried out to compare the proposal with the EFuNN neuro-fuzzy model. RBFuzzy was also applied to a real-world oil well-log database to model and forecast the Rate of Penetration (ROP) of a drill bit for a given offshore well drilling section. The obtained results show that the model can reach the same level of accuracy with fewer rules when compared to EFuNN, which makes the operation of the system easier for a human expert to understand.
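The layer-wise flow described above can be sketched roughly. In the following toy fragment (all function names and parameter values are invented for illustration and are not taken from RBFuzzy), Gaussian RBF activations serve as the firing strengths of rule antecedents, and a weighted average over the centres of the Mamdani output terms produces the crisp output:

```python
import numpy as np

def rbf_firing(x, centers, widths):
    """Gaussian RBF activations: each neuron's response acts as the firing
    strength of one fuzzy rule's antecedent (hypothetical parametrisation)."""
    d2 = ((x - centers) ** 2).sum(axis=1)     # squared distance to each rule centre
    return np.exp(-d2 / (2.0 * widths ** 2))

def crisp_output(firing, rule_weights, consequent_values):
    """Weight each fuzzy rule and defuzzify via a weighted average over the
    centres of the Mamdani output linguistic terms."""
    w = firing * rule_weights
    return (w * consequent_values).sum() / w.sum()

centers = np.array([[0.0, 0.0], [1.0, 1.0]])  # two rules' antecedent centres
widths = np.array([0.5, 0.5])
rule_weights = np.array([1.0, 1.0])           # per-rule weights (tuned by ACO in the paper)
consequents = np.array([0.0, 10.0])           # centres of the output linguistic terms

# An input near the second rule's centre yields an output near that rule's term.
y = crisp_output(rbf_firing(np.array([0.9, 0.9]), centers, widths),
                 rule_weights, consequents)
```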
Analysis of the macroeconomic development of European and Asia-Pacific countries with the use of connectionist models
The paper applies novel techniques for on-line, adaptive learning of macroeconomic data and a consecutive analysis and prediction. The evolving connectionist system paradigm (ECOS) is used in its two versions: unsupervised (evolving self-organised maps) and supervised (evolving fuzzy neural networks, EFuNN). In addition to these techniques, self-organised maps (SOM) are also employed for finding clusters of countries based on their macroeconomic parameters. EFuNNs allow for modelling, clustering, prediction and rule extraction. The rules that describe future annual values for the consumer price index (CPI), interest rate, unemployment and GDP per capita are extracted from data and reported in the paper both for the global EU-Asia block of countries and for smaller groups: EU, EU-candidate countries, and Asia-Pacific countries. The analysis and prediction models prove to be useful tools for analysing trends in the macroeconomic development of clusters of countries and for their future prediction.
Evolving self-organizing maps for on-line learning, data analysis and modelling
In real-world information systems, data analysis and processing usually need to be done in an on-line, self-adaptive way. In this respect, neural algorithms for incremental learning and constructive network models are of increasing interest. In this paper we present a new algorithm, the evolving self-organizing map (ESOM), which features fast one-pass learning, a dynamic network structure, and good visualisation ability. Simulations have been carried out on some benchmark data sets for classification and prediction tasks, as well as on some macroeconomic data for data analysis. Compared with other methods, ESOM achieved better classification with much shorter learning time. Its performance for time-series modelling is also comparable, requiring more hidden units but only one-pass learning. Our results demonstrate that ESOM is an effective computational model for on-line learning, data analysis and modelling.
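The one-pass, structure-growing behaviour described above can be caricatured in a few lines. This is a deliberate simplification, not the published ESOM algorithm (topology links and neighbourhood updates are omitted, and the threshold value is invented):

```python
import numpy as np

def esom_fit(stream, threshold, lr=0.3):
    """One-pass evolving-map sketch: insert a new node when no existing node
    is within `threshold` of the sample, otherwise move the winning node
    towards the sample. Each sample is seen exactly once."""
    nodes = [np.array(stream[0], dtype=float)]
    for x in stream[1:]:
        x = np.asarray(x, dtype=float)
        dists = [np.linalg.norm(x - n) for n in nodes]
        i = int(np.argmin(dists))
        if dists[i] > threshold:
            nodes.append(x.copy())           # grow: allocate a node at the sample
        else:
            nodes[i] += lr * (x - nodes[i])  # adapt: locally tune the winner
    return nodes

# Two well-separated clusters should yield two nodes with a suitable threshold.
rng = np.random.default_rng(0)
data = np.vstack([rng.normal(0, 0.05, (20, 2)),
                  rng.normal(5, 0.05, (20, 2))])
nodes = esom_fit(data, threshold=1.0)
```

The single pass over the stream is what distinguishes this style of learning from iterative batch training: each sample either refines an existing node or spawns a new one, and is then discarded.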
Evolving fuzzy neural networks for on-line knowledge discovery
Fuzzy neural networks are connectionist systems that facilitate learning from data, reasoning over fuzzy rules, rule insertion, rule extraction, and rule adaptation. The concept of evolving fuzzy neural networks (EFuNNs), with respective algorithms for learning, aggregation, rule insertion and rule extraction, is further developed here and applied to on-line knowledge discovery on both prediction and classification tasks. EFuNNs operate in an on-line mode and learn incrementally through locally tuned elements. They grow as data arrive, and regularly shrink through pruning of nodes or through node aggregation. The aggregation procedure is functionally equivalent to knowledge abstraction. The features of EFuNNs are illustrated on two real-world application problems: one from macroeconomics and another from bioinformatics. EFuNNs are suitable for fast learning of on-line incoming data (e.g. financial and economic time series, biological process control), adaptive learning of speech and video data, incremental learning and knowledge discovery from growing databases (e.g. in bioinformatics), on-line tracing of processes over time, and life-long learning. The paper also includes a short review of the most common types of rules used in knowledge-based neural networks for knowledge discovery and data mining.
Evolving fuzzy neural networks for on-line knowledge discovery
Fuzzy neural networks are connectionist systems that facilitate learning from data, reasoning over fuzzy rules, rule insertion, rule extraction, and rule adaptation. The concept evolving fuzzy neural networks (EFuNNs), with respective algorithms for learning, aggregation, rule insertion, rule extraction, is further developed here and applied for on-line knowledge discovery on both prediction and classification tasks. EFuNNs operate in an on-line mode and learn incrementally through locally tuned elements. They grow as data arrive, and regularly shrink through pruning of nodes, or through node aggregation. The aggregation procedure is functionally equivalent to knowledge abstraction. The features of EFuNNs are illustrated on two real-world application problems---one from macroeconomics and another from Bioinformatics. EFuNNs are suitable for fast learning of on-line incoming data (e.g., financial and economic time series, biological process control), adaptive learning of speech and video data, incremental learning and knowledge discovery from growing databases (e.g. in Bioinformatics), on-line tracing of processes over time, life-long learning. The paper includes also a short review of the most common types of rules used in the knowledge-based neural networks for knowledge discovery and data mining.Unpublished1. Alpaydin, E. âGAL: networks that grow when they learn and shrink when they forgetâ, TR 91-032, Int.Computer Sci. Inst., Berkeley, CA (1991).
2. Amari, S. and Kasabov, N. eds, Brain-like computing and intelligent information systems, Springer Verlag (1997).
3. Andrews, R., J. Diederich, A.B.Tickle, "A Survey and Critique of Techniques for Extracting Rules from Trained Artificial Neural Networks", Knowledge-Based Systems, 8, 373â389 (1995).
4. Arbib, M (ed.) The Handbook of Brain Theory and Neural Networks, The MIT Press (1995)
5. Berenji, H., Khedkar, P. âLearning and tuning fuzzy logic controllers through. IEEE Trans. on Neural Networks, 3, 724â740 (1992)
6. Carpenter, G. and Grossberg, S., Pattern recognition by self-organizing neural networks , The MIT Press, Cambridge, Massachusetts (1991)
7. Carpenter, G.A., Grossberg, S., Markuzon, N., Reynolds, J.H., Rosen, D.B., FuzzyARTMAP: A neural network architecture for incremental supervised learning of analog multi-dimensional maps, IEEE Transactions of Neural Networks , vol.3, No.5, 698â713, (1991)
8. DeGaris, H. , Circuits of Production Rule - GenNets â The genetic programming of nervous systems, in: Albrecht, R., Reeves, C. and N. Steele (eds) Artifical Neural Networks and Genetic Algorithms, Springer Verlag (1993)
9. Duch, W., and Diercksen, G. âFeature Space Mapping as a universal adaptive systemâ, Computer Physics Communication, 87 (1995) 341â371
10. Edelman, G., Neuronal Darwinism: The theory of neuronal group selection, Basic Books (1992).
11. Encarnacao, L.M., and Gross, M.H. âAn adaptive classification scheme to approximate decision boundaries using local Bayes criterias â Melting Octree Networks, Rep.92-047, Int.Computer Sci. Inst., Berkeley, CA (1992).
12. Fahlman, C .,and C. Lebiere, "The Cascade-Correlation Learning Architecture", in: Turetzky, D (ed) Advances in Neural Information Processing Systems, vol.2, Morgan Kaufmann, 524â532 (1990).
13. Freeman, J.A.S., Saad, D., On-line learning in radial basis function networks, Neural Computation, vol. 9, No.7 (1997)
14. Fritzke, B. âVector quantization with growing and splitting elastic netâ, in: ICANNâ93: Proc. of the Intern.Conf. on artificial neural networks, Amsterdam, (1993)
15. Fritzke, B., A growing neural gas network learns topologies, Advances in Neural Information Processing Systems, vol.7 (1995)
16.Golub, T.R., et al. Molecular Classification of Cancer: Class Discovery and Class Prediction by Gene Expression Monitoring, Science 286: 531-7, 1999
17. Goodman, R.M., C.M. Higgins, J.W. Miller, P.Smyth, "Rule-based neural networks for classification and probability estimation", Neural Computation, 14, 781â804 (1992)
18. Hashiyama, T., Furuhashi, T., Uchikawa, Y. âA Decision Making Model Using a Fuzzy Neural Networkâ, in: Proceedings of the 2nd International Conference on Fuzzy Logic & Neural Networks, Iizuka, Japan, 1057â1060, (1992).
19.Hassibi and Stork, âSecond order derivatives for network pruning: Optimal Brain Surgeonâ, in: Advances in Neural Information Processing Systems, 4, 164â171, (1992).
20. Heskes, T.M., Kappen, B., âOn-line learning processes in artificial neural networksâ, in: Math. foundations of neural networks, Elsevier, Amsterdam, 199â233, (1993).
21. Ishikawa, M. "Structural Learning with Forgetting", Neural Networks 9, 501â521, (1996).
22.Jang, R. ANFIS: adaptive network-based fuzzy inference system, IEEE Trans. on Syst.,Man, Cybernetics, 23(3), May/June 1993, 665â685, (1993).
23. Kasabov, N. and M. Watts, âSpatio-temporal evolving fuzzy neural networks and their applications for on-line, adaptive phoneme recognitionâ, TR 99/03, Department of Information Science, University of Otago, New Zealand
24.Kasabov, N. Foundations of Neural Networks, Fuzzy Systems and Knowledge Engineering, The MIT Press, CA, MA(1996).
25. Kasabov, N., "Adaptable connectionist production systemsâ. Neurocomputing, 13 (2-4) 95â117 (1996).
26. Kasabov, N., "Investigating the adaptation and forgetting in fuzzy neural networks by using the method of training and zeroing", Proceedings of the International Conference on Neural Networks ICNN'96, Plenary, Panel and Special Sessions volume, 118â123 (1996).
27. Kasabov, N., "Learning fuzzy rules and approximate reasoning in fuzzy neural networks and hybrid systems", Fuzzy Sets and Systems 82 (2) 2â20 (1996).
28. Kasabov, N., âECOS: A framework for evolving connectionist systems and the eco learning paradigmâ, Proc. of ICONIP'98, Kitakyushu, Oct. 1998, IOS Press, 1222â1235
29.Kasabov, N., âThe ECOS framework and the ECO learning method for evolving connectionist systems, Journal of Advanced Computational Intelligence, 2(6) 195â202 (1998)
30. Kasabov, N. Adaptive learning system and method, Patent Reg.No.503882, New Zealand (2000)
31.Kasabov, N., Evolving Fuzzy Neural NetworksâAlgorithms, Applications and Biological Motivation, in Proc. of Iizuka'98, Iizuka, Japan, Oct.1998, World Sci., 271â 274 (1998)
32. Kasabov, N., Kim J S, Watts, M., Gray, A., âFuNN/2âA Fuzzy Neural Network Architecture for Adaptive Learning and Knowledge Acquisitionâ, Information Sciences â Applications, 101(3â4): 155â175 (1997).
33. Kater, S.B., Mattson, N.P., Cohan, C. and Connor, J., "Calcium regulation of the neuronal cone growth", Trends in Neuroscience, 11, 315–321 (1988).
34. Kim, J. and Kasabov, N., "HyFIS: Adaptive hybrid connectionist fuzzy inference systems", TR 99/05, Department of Information Science, University of Otago, New Zealand.
35. Kohonen, T., "The Self-Organizing Map", Proceedings of the IEEE, 78(9), 1464–1497 (1990).
36. Kohonen, T., Self-Organizing Maps, second edition, Springer Verlag (1997).
37. Kozma, R. and N. Kasabov, "Rules of chaotic behaviour extracted from the fuzzy neural network FuNN", Proc. of the WCCI'98 FUZZ-IEEE International Conference on Fuzzy Systems, Anchorage, May (1998).
38. Krogh, A. and Hertz, J.A., "A simple weight decay can improve generalisation", Advances in Neural Information Processing Systems, 4, 951–957 (1992).
39. Le Cun, Y., J.S. Denker and S.A. Solla, "Optimal Brain Damage", in: D.S. Touretzky, ed., Advances in Neural Information Processing Systems, Morgan Kaufmann, 2, 598–605 (1990).
40. Lin, C.T. and C.S.G. Lee, Neuro Fuzzy Systems, Prentice Hall (1996).
41. Miller, D., J. Zurada and J.H. Lilly, "Pruning via Dynamic Adaptation of the Forgetting Rate in Structural Learning", Proc. IEEE ICNN'96, Vol. 1, p. 448 (1996).
42. Mitchell, T.M., Machine Learning, McGraw-Hill (1997).
43. Mitchell, Melanie, An Introduction to Genetic Algorithms, MIT Press, Cambridge, Massachusetts (1996).
44. Mozer, M. and Smolensky, P., "A technique for trimming the fat from a network via relevance assessment", in: D. Touretzky (ed.), Advances in Neural Information Processing Systems, vol. 2, Morgan Kaufmann, 598–605 (1989).
45. Quartz, S.R. and Sejnowski, T.J., "The neural basis of cognitive development: a constructivist manifesto", Behavioral and Brain Science, to appear.
46. Reed, R., "Pruning algorithms – a survey", IEEE Trans. Neural Networks, 4(5), 740–747 (1993).
47. Sankar, A. and R.J. Mammone, "Growing and Pruning Neural Tree Networks", IEEE Trans. Comput., 42(3), 291–299 (1993).
48. Schiffman, W., Joost, M. and Werner, R., "Application of Genetic Algorithms to the Construction of Topologies for Multilayer Perceptrons", in: Albrecht, R.F., Reeves, C.R., Steele, N.C. (Eds.), Artificial Neural Nets and Genetic Algorithms, Springer-Verlag Wien, New York (1993).
49. Sun, R., "A connectionist model for commonsense reasoning incorporating rules and similarities", in: Knowledge Acquisitions, Academic Press, Cambridge (1992).
50. Towell, G., J. Shavlik and M. Noordewier, "Refinement of approximate domain theories by knowledge-based neural networks", Proc. of the 8th National Conf. on Artificial Intelligence AAAI'90, Morgan Kaufmann, 861–866 (1990).
51. Vapnik, V. and Bottou, L., Neural Computation, 5, 893–909 (1993).
52. Watts, M. and Kasabov, N., "Genetic algorithms for the design of fuzzy neural networks", in Proc. of ICONIP'98, Kitakyushu, Oct. 1998, IOS Press, 793–795 (1998).
53. Wang, L.X., Adaptive Fuzzy Systems and Control, Prentice Hall (1994).
54. Wooldridge, M. and Jennings, N., "Intelligent agents: Theory and practice", The Knowledge Engineering Review, 10 (1995).
55. Yamakawa, T., H. Kusanagi, E. Uchino and T. Miki, "A new Effective Algorithm for Neo Fuzzy Neuron Model", in: Proceedings of Fifth IFSA World Congress, 1017–1020 (1993).
56. Zadeh, L., "Fuzzy Sets", Information and Control, vol. 8, 338–353 (1965).
Spatial-temporal adaptation in evolving fuzzy neural networks for on-line adaptive phoneme recognition
The paper is a study on a new class of spatial-temporal evolving fuzzy neural network systems (EFuNNs) for on-line adaptive learning, and their applications for adaptive phoneme recognition. The systems evolve through incremental, hybrid (supervised/unsupervised) learning. They accommodate new input data, including new features, new classes, etc., through local element tuning. Both feature-based similarities and temporal dependencies that are present in the input data are learned and stored in the connections, and adjusted over time. This is an important requirement for the task of adaptive, speaker-independent spoken language recognition, where new pronunciations and new accents need to be learned in an on-line, adaptive mode. Experiments with EFuNNs, and also with multi-layer perceptrons and fuzzy neural networks (FuNNs), conducted on the whole set of New Zealand English phonemes, show the superiority and the potential of EFuNNs when used for the task. Spatial allocation of nodes and their aggregation in EFuNNs allow for similarity preserving and similarity observation within one phoneme's data and across phonemes, while subtle temporal variations within one phoneme's data can be learned and adjusted through temporal feedback connections.
The experimental results support the claim that spatial-temporal organisation in EFuNNs can lead to a significant improvement in the recognition rate, especially for the diphthong and the vowel phonemes in English, which in many cases are problematic for a system to learn and adjust in an adaptive way.
Unpublished
[1] J.S. Albus. A new approach to manipulator control: The cerebellar model articulation controller (CMAC). Transactions of the ASME: Journal of Dynamic Systems, Measurement, and Control, pages 220–227, September 1975.
[2] E. Alpaydin. GAL: networks that grow when they learn and shrink when they forget. Technical Report TR91-032, International Computer Science Institute, Berkeley, CA, 1991.
[3] G. Carpenter and S. Grossberg. ART3: Hierarchical search using chemical transmitters in self-organising pattern-recognition architectures. Neural Networks, 3(2):129–152, 1990.
[4] G. Carpenter and S. Grossberg. Pattern recognition by self-organizing neural networks. MIT Press, 1991.
[5] G. Carpenter, S. Grossberg, J.H. Markuzon, and D.B. Rosen. Fuzzy ARTMAP: A neural network architecture for incremental supervised learning of analog multi-dimensional maps. IEEE Transactions on Neural Networks, 3(5):698–713, 1991.
[6] H. de Garis. Circuits of production rule GenNets – the genetic programming of artificial nervous systems. In R. Albrecht, C. Reeves, and N. Steele, editors, Artificial Neural Networks and Genetic Algorithms. Springer Verlag, 1993.
[7] J. Elman, E. Bates, M. Johnson, A. Karmiloff-Smith, D. Parisi, and K. Plunkett. Rethinking Innateness: A Connectionist Perspective on Development. MIT Press, 1997.
[8] S. Fahlman and C. Lebiere. The cascade-correlation learning architecture. In D. Touretzky, editor, Advances in Neural Information Processing Systems, volume 2, pages 524–532. Morgan Kaufmann, 1990.
[9] J. Freeman and D. Saad. On-line learning in radial basis function networks. Neural Computation, 9(7), 1997.
[10] R. French. Semi-distributed representations and catastrophic forgetting in connectionist networks. Connection Science, 1:365–377, 1995.
[11] B. Fritzke. Vector quantization with growing and splitting elastic net. In ICANN'93: Proceedings of the Intern. Conf. on Artificial Neural Networks, Amsterdam, 1993.
[12] B. Fritzke. A growing neural gas network learns topologies. In Advances in Neural Information Processing Systems, volume 7. Morgan Kaufmann, 1995.
[13] B. Hassibi and D. Stork. Second order derivatives for network pruning: Optimal brain surgeon. In Advances in Neural Information Processing Systems, volume 4, pages 164–171. Morgan Kaufmann, 1992.
[14] R. Hecht-Nielsen. Counter-propagation networks. In IEEE First International Conference on Neural Networks, San Diego, volume 2, pages 19–31, 1987.
[15] T.M. Heskes and B. Kappen. On-line learning processes in artificial neural networks. In Math. Foundations of Neural Networks. Elsevier, Amsterdam, 1993.
[16] M. Ishikawa. Structural learning with forgetting. Neural Networks, 9:501–521, 1996.
[17] R. Jang. ANFIS: adaptive network-based fuzzy inference system. IEEE Trans. on Syst., Man, Cybernetics, 23(3):665–685, 1993.
[18] S.R.H. Joseph. Theories of adaptive neural growth. PhD thesis, University of Edinburgh, 1998.
[19] N. Kasabov. Adaptable connectionist production systems. Neurocomputing, 13(2-4):95â117, 1996.
[20] N. Kasabov. Foundations of Neural Networks, Fuzzy Systems and Knowledge Engineering. MIT Press, 1996.
[21] N. Kasabov. Investigating the adaptation and forgetting in fuzzy neural networks by using the method of training and zeroing. In Proceedings of the International Conference on Neural Networks ICNN'96, Plenary, Panel and Special Sessions volume, pages 118–123, 1996.
[22] N. Kasabov. Learning fuzzy rules and approximate reasoning in fuzzy neural networks and hybrid systems. Fuzzy Sets and Systems, 82(2):2â20, 1996.
[23] N. Kasabov. ECOS: A framework for evolving connectionist systems and the ECO learning paradigm. In Proc. of ICONIP'98, Kitakyushu, Japan, Oct. 1998, pages 1222–1235. IOS Press, 1998.
[24] N. Kasabov. The ECOS framework and the ECO learning method for evolving connectionist systems. Journal of Advanced Computational Intelligence, 2(6):195–202, 1998.
[25] N. Kasabov. Evolving fuzzy neural networks – algorithms, applications and biological motivation. In Yamakawa and Matsumoto, editors, Methodologies for the Conception, Design and Application of Soft Computing, pages 271–274. World Scientific, 1998.
[26] N. Kasabov. Evolving connectionist systems for on-line, knowledge-based learning: principles and applications. Technical Report TR99/02, Department of Information Science, University of Otago, New Zealand, 1999.
[27] N. Kasabov, J.S. Kim, M. Watts, and A. Gray. FuNN/2 – a fuzzy neural network architecture for adaptive learning and knowledge acquisition. Information Sciences – Applications, 101(3-4):155–175, 1997.
[28] N. Kasabov, R. Kozma, R. Kilgour, M. Laws, J. Taylor, M. Watts, and A. Gray. A methodology for speech data analysis and a framework for adaptive speech recognition using fuzzy neural networks and self organising maps. In N. Kasabov and R. Kozma, editors, Neuro-fuzzy techniques for intelligent information systems. Physica Verlag (Springer Verlag), 1999.
[29] N. Kasabov, E. Postma, and J. Van den Herik. AVIS: A connectionist-based framework for integrated audio and visual information processing. In Proceedings of Iizuka'98, Iizuka, Japan, Oct. 1998, 1998.
[30] N. Kasabov and Q. Song. Dynamic, evolving fuzzy neural networks with 'm-out-of-n' activation nodes for on-line adaptive systems. Technical Report TR99/04, Department of Information Science, University of Otago, New Zealand, 1999.
[31] N. Kasabov and M. Watts. Genetic algorithms for structural optimisation, dynamic adaptation and automated design of fuzzy neural networks. In Proceedings of the International Conference on Neural Networks ICNN'97. IEEE Press, 1997.
[32] N. Kasabov and B. Woodford. Rule insertion and rule extraction from evolving fuzzy neural networks: Algorithms and applications for building adaptive, intelligent expert systems. In Proceedings of International Conference FUZZ-IEEE, Seoul, August 1999, 1999.
[33] S.B. Kater, N.P. Mattson, C. Cohan, and J. Connor. Calcium regulation of the neuronal cone growth. Trends in Neuroscience, 1988.
[34] T. Kohonen. The self-organizing map. Proceedings of the IEEE, 78(9):1464â1497, 1990.
[35] T. Kohonen. Self-Organizing Maps. Springer Verlag, 2 edition, 1997.
[36] A. Krogh and J.A. Hertz. A simple weight decay can improve generalisation. In Advances in Neural Information Processing Systems, volume 4, pages 951–957. 1992.
[37] Y. Le Cun, J.S. Denker, and S.A. Solla. Optimal brain damage. In D.S. Touretzky, editor, Advances in Neural Information Processing Systems, volume 2, pages 598–605. Morgan Kaufmann, 1990.
[38] C.T. Lin and C.S.G. Lee. Neuro-Fuzzy Systems. Prentice Hall, 1996.
[39] M. Maeda, H. Miyajima, and S. Murashima. A self organizing neural network with creating and deleting methods. Nonlinear theory and its applications, 1:397â400, 1996.
[40] J. Mandziuk and L. Shastri. Incremental class learning approach and its application to hand-written digit recognition. In Proceedings of the Fifth International Conference on Neuro-Information Processing, Kitakyushu, Japan, Oct. 21-23, 1998.
[41] J. Massaro and M. Cohen. Integration of visual and auditory information in speech perception. Journal of Experimental Psychology: Human Perception and Performance, 9:753â771, 1983.
[42] D. Miller, J. Zurada, and J.H. Lilly. Pruning via dynamic adaptation of the forgetting rate in structural learning. In Proc. IEEE ICNN'96, volume 1, page 448, 1996.
[43] A. van Ooyen and J. van Pelt. Activity-dependent outgrowth of neurons and overshoot phenomena in developing neural networks. Journal of Theoretical Biology, (167):27–43, 1994.
[44] R. Port and T. van Gelder. Mind as Motion: Explorations in the Dynamics of Cognition. MIT Press, 1995.
[45] R. Reed. Pruning algorithms – a survey. IEEE Trans. Neural Networks, 4(5):740–747, 1993.
[46] A. Robins and M. Frean. Local learning algorithms for sequential learning tasks in neural networks. Journal of Advanced Computational Intelligence, 2(6), 1998.
[47] G.A. Rummery and M. Niranjan. On-line q-learning using connectionist systems. Technical report, Cambridge University Engineering Department, 1994.
[48] D. Saad, editor. On-line learning in neural networks. Cambridge University Press, 1999.
[49] A Sankar and R.J. Mammone. Growing and pruning neural tree networks. IEEE Trans Comput., 42(3):291â299, 1993.
[50] S.J. Segalowitz. Language functions and brain organization. Academic Press, 1993.
[51] A. Seleverston, editor. Model neural networks and behaviour. Plenum Press, 1985.
[52] M. Watts and N. Kasabov. Genetic algorithms for the design of fuzzy neural networks. In Proceedings of ICONIP'98, Kitakyushu, Oct. 1998, pages 793â795. IOS Press, 1998.
[53] D. Whitley and C. Bogart. The evolution of connectivity: Pruning neural networks using genetic algorithms. In Proceedings International Joint Conference Neural Networks, pages 17â22, 1990.
[54] T. Yamakawa, H. Kusanagi, E. Uchino, and T. Miki. A new effective algorithm for neo fuzzy neuron model. In Proceedings of Fifth IFSA World Congress, pages 1017–1020, 1993.