13 research outputs found

    On The Cooperation Of Fuzzy Neural Networks Via A Coevolutionary Approach

    No full text
    This paper addresses the characterization of Cooperative Fuzzy Neural Networks (CFNNs). CFNNs encompass any conceptual or architectural aggregate in which two or more Fuzzy Neural Networks (FNNs) work cooperatively to accomplish high-level objectives. In such a context, the behavior of an FNN is, by some means, influenced by the behavior of its peers, and the performance of the whole group should serve as complementary guidance for the training of each individual network. A coevolutionary approach is presented as an auxiliary mechanism for the design and implementation of CFNNs. Implementation issues are described as a means to attest to the applicability of the proposal.
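    The cooperative-coevolution scheme underlying the proposal can be illustrated with a minimal sketch: one subpopulation per component of the joint solution, with each candidate scored together with the current best partners from the other subpopulations (in the spirit of Potter and De Jong). All names and parameter values below are illustrative, not taken from the paper.

```python
import random

def fitness(x, y):
    # Joint objective shared by both subpopulations (to be maximized).
    return -(x ** 2 + y ** 2)

def coevolve(generations=60, pop_size=20, seed=0):
    rng = random.Random(seed)
    # One subpopulation per component of the joint solution.
    pops = [[rng.uniform(-5.0, 5.0) for _ in range(pop_size)] for _ in range(2)]
    best = [pops[0][0], pops[1][0]]          # current best collaborators
    for _ in range(generations):
        for i in range(2):
            def joint(v, i=i):
                # A candidate is scored together with the best partner.
                pair = list(best)
                pair[i] = v
                return fitness(*pair)
            pops[i].sort(key=joint, reverse=True)
            best[i] = pops[i][0]
            # Truncation selection plus Gaussian mutation of the survivors.
            survivors = pops[i][: pop_size // 2]
            pops[i] = survivors + [v + rng.gauss(0, 0.3) for v in survivors]
    return best, fitness(*best)

best_pair, best_score = coevolve()
```

    Because each component's fitness depends on its partner, the two subpopulations coadapt and the pair drifts toward the joint optimum at (0, 0).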

    Hybrid Genetic Training Of Gated Mixtures Of Experts For Nonlinear Time Series Forecasting

    No full text
    In this paper, we introduce a genetic algorithm-based training mechanism (HGT-GAME) for the automatic structural design and parameter configuration of Gated Mixtures of Experts (MEs). In HGT-GAME, a whole ME instance is encoded into a given chromosome. By employing regulatory genes, our approach enables the automatic pruning and growing of experts so as to properly match the complexity of the task at hand. Moreover, to improve HGT-GAME's effectiveness, a local search refinement of each ME chromosome is performed in every generation via the gradient descent learning algorithm. Forecasting experiments evaluate the performance of Gated MEs trained with HGT-GAME.
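    For reference, the forward pass of a gated mixture of experts, the model family HGT-GAME trains, blends expert outputs through softmax gating coefficients. The sketch below uses linear experts purely for illustration; HGT-GAME's chromosome encoding and GA operators are not reproduced here.

```python
import numpy as np

def me_predict(x, experts, gate_w):
    # Gating network: one softmax score per expert, conditioned on x.
    scores = gate_w @ x
    g = np.exp(scores - scores.max())
    g /= g.sum()
    # Each (linear) expert produces its own prediction for x.
    preds = np.array([W @ x + b for W, b in experts])
    # The ME output is the gate-weighted (convex) combination of experts.
    return g @ preds

rng = np.random.default_rng(0)
x = rng.normal(size=3)
experts = [(rng.normal(size=3), rng.normal()) for _ in range(4)]
gate_w = rng.normal(size=(4, 3))
y = me_predict(x, experts, gate_w)
```

    Since the gating coefficients form a convex combination, the output always lies between the smallest and largest expert prediction.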

    An Evolutionary Framework For Nonlinear Time-series Prediction With Adaptive Gated Mixtures Of Experts

    No full text
    A probabilistic learning technique, known as gated mixture of experts (ME), is made more adaptive by employing a customized genetic algorithm based on the concepts of hierarchical mixed encoding and hybrid training. The objective of such an effort is to promote the automatic design (i.e., structural configuration and parameter calibration) of whole gated ME instances that are better able to cope with the intricacies of some difficult machine learning problems whose statistical properties are time-variant. In this chapter, we outline the main steps behind this novel hybrid intelligent system, focusing on its application to the nontrivial task of nonlinear time-series forecasting. Experimental results are reported for three benchmark time-series problems and confirm our expectation that the new integrated approach is capable of outperforming, both in terms of accuracy and generalization, other conventional approaches, such as single neural networks and non-adaptive, handcrafted gated MEs. © 2007, Idea Group Inc.

    GA-based Selection Of Components For Heterogeneous Ensembles Of Support Vector Machines

    No full text
    Several support vector machine (SVM) instances with distinct kernel functions may be separately created and properly combined into the same learning machine structure. This is the idea underlying heterogeneous ensembles of SVMs (HE-SVMs), an approach conceived to alleviate the performance bottlenecks incurred by the kernel function choice problem inherent in SVM design. In this paper, we assess the effectiveness of applying an evolutionary mechanism (GASel) to the search for the optimal subset of SVM models for automatic HE-SVM construction. GASel has the advantage of merging both the selection and the combination of component SVMs into the same optimization process, and has shown sound performance when compared with two other component selection methods on complicated classification problems. © 2003 IEEE.
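    The core idea of evolutionary component selection can be sketched as a GA over bitmasks, where each bit switches one component model in or out of a majority-vote ensemble scored on validation data. This is a generic illustration, not the exact encoding or fitness used in the paper.

```python
import random

def ga_select(preds, y, pop_size=30, generations=40, seed=1):
    # preds[m][i] is model m's predicted label for validation sample i.
    rng = random.Random(seed)
    n_models, n_samples = len(preds), len(y)

    def accuracy(mask):
        # Fitness: majority-vote accuracy of the selected subset.
        chosen = [m for m in range(n_models) if mask[m]]
        if not chosen:
            return 0.0
        correct = 0
        for i in range(n_samples):
            votes = [preds[m][i] for m in chosen]
            if max(set(votes), key=votes.count) == y[i]:
                correct += 1
        return correct / n_samples

    # Each chromosome is a bitmask over the pool of component models.
    pop = [[rng.randint(0, 1) for _ in range(n_models)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=accuracy, reverse=True)
        elite = pop[: pop_size // 2]
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, n_models)
            child = a[:cut] + b[cut:]        # one-point crossover
            if rng.random() < 0.2:           # occasional bit-flip mutation
                j = rng.randrange(n_models)
                child[j] = 1 - child[j]
            children.append(child)
        pop = elite + children
    return max(pop, key=accuracy)

y_true = [0, 1] * 10
model_preds = [
    y_true,                    # a perfect component model
    [1 - v for v in y_true],   # a model that is always wrong
    [0] * 20,                  # a constant model
]
mask = ga_select(model_preds, y_true)
```

    On this toy pool the search should settle on a mask dominated by the perfect model, since any subset including the bad voters lowers the majority-vote accuracy.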

    Using Fuzzy Petri Nets To Coordinate Collaborative Activities

    No full text
    This paper presents a fuzzy Petri net based approach suitable for modeling flexible coordination mechanisms that deal with temporal interdependencies between collaborative tasks. The approach is based on an extension of the Generalized Fuzzy Petri Net model that includes the notion of time for the execution and synchronization of these tasks. A scenario of study is described, indicating the suitability of the proposal.
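    In a fuzzy Petri net, places hold membership degrees rather than discrete tokens: a transition fires with a degree given by a t-norm (here, min) over its input places, scaled by a certainty factor, and accumulates into output places via an s-norm (here, max). A minimal sketch with an illustrative coordination rule (place names and values are made up):

```python
def fire_fuzzy(marking, pre, post, certainty):
    # Firing degree: t-norm (min) over the input places, scaled by the
    # transition's certainty factor.
    degree = min(marking[p] for p in pre) * certainty
    m = dict(marking)
    for p in post:
        # s-norm (max) accumulates evidence in the output places; input
        # token handling policies vary across fuzzy Petri net variants.
        m[p] = max(m[p], degree)
    return m

# Coordination rule: "task C may start" once tasks A and B are done,
# each known only to a degree.
m = {"A_done": 0.9, "B_done": 0.7, "C_start": 0.0}
m = fire_fuzzy(m, ["A_done", "B_done"], ["C_start"], certainty=0.8)
```

    The degree attached to `C_start` then quantifies how strongly the temporal precondition for the dependent task is satisfied.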

    Ensembles Of Support Vector Machines For Regression Problems

    No full text
    Support vector machines (SVMs) tackle classification and regression problems by nonlinearly mapping input data into high-dimensional feature spaces, wherein a linear decision surface is designed. Even though the high potential of these techniques has been demonstrated, their applicability has been hampered by the need for an a priori choice of the kernel function that realizes the nonlinear mapping, which sometimes turns out to be a complex and ineffective process. In this paper, we advocate that applying neural ensemble theory to SVMs should alleviate such performance bottlenecks, because different networks with distinct kernel functions, such as polynomials or radial basis functions, may be created and properly combined into the same neural structure. Ensembles of SVMs thus promote the automatic configuration and tuning of SVMs, and their generalization capability is assessed here by means of some function regression experiments.
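    The combination step such an ensemble relies on can be sketched with two kernel regressors (RBF and polynomial) whose predictions are merged by least-squares weights, in the spirit of optimal linear combinations of networks. Kernel ridge regression stands in for the SVM components below, and all kernel and regularization parameters are illustrative.

```python
import numpy as np

def kernel_ridge_fit(X, y, kernel, lam=1e-2):
    # Dual-form kernel ridge regression, a stand-in for one SVR component.
    K = kernel(X, X)
    alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)
    return lambda Z: kernel(Z, X) @ alpha

# Two distinct kernels, mirroring a heterogeneous pool of components.
rbf = lambda A, B: np.exp(-np.square(A[:, None] - B[None, :]))
poly = lambda A, B: (1.0 + A[:, None] * B[None, :]) ** 2

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, 80)
y = np.sin(X) + 0.1 * rng.normal(size=80)

members = [kernel_ridge_fit(X, y, k) for k in (rbf, poly)]
P = np.stack([m(X) for m in members], axis=1)  # member predictions
w, *_ = np.linalg.lstsq(P, y, rcond=None)      # least-squares combiner
ensemble = lambda Z: np.stack([m(Z) for m in members], axis=1) @ w
```

    Fitting the combining weights by least squares guarantees the combined training error is no worse than that of the best single member taken alone.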

    Controlling Nonlinear Dynamic Systems With Projection Pursuit Learning

    No full text
    Projection Pursuit Learning (PPL) refers to a well-known constructive learning algorithm characterized by a very efficient and accurate computational procedure oriented to nonparametric regression. It has been employed as a means to counteract some problems related to the design of Artificial Neural Network (ANN) models, namely, the estimation of a (usually large) number of free parameters, the proper definition of the model's dimension, and the choice of the sources of nonlinearities (activation functions). In this work, the potential of PPL is exploited from a different perspective, namely, in designing one-hidden-layer feedforward ANNs for the adaptive control of nonlinear dynamic systems. For this purpose, the proposed methodology is divided into three stages. In the first, the model identification process is undertaken. In the second, the ANN structure is defined according to an offline control setting. In these two stages, the PPL algorithm estimates not only the optimal number of hidden neurons but also the best activation function for each node. The final stage is performed online and promotes a fine-tuning of the parameters of the identification model and the controller. Simulation results indicate that it is possible to design effective PPL-based neural models for the control of nonlinear multivariate systems, with superior performance when compared to benchmarks. © 2006 IEEE.
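    A constructive projection-pursuit stage fits one ridge function at a time to the current residual, which is the mechanism PPL uses to grow hidden units. The sketch below replaces PPL's direction optimization with a crude random-direction search and polynomial ridge functions, so it is only a loose illustration of the constructive loop, not the algorithm used in the paper.

```python
import numpy as np

def fit_ridge_stage(X, r, rng, n_dirs=50, degree=3):
    # Pick the projection direction whose polynomial ridge function best
    # fits the current residual r (crude stand-in for PPL's search).
    best = None
    for _ in range(n_dirs):
        w = rng.normal(size=X.shape[1])
        w /= np.linalg.norm(w)
        z = X @ w
        coef = np.polyfit(z, r, degree)
        sse = float(np.sum((np.polyval(coef, z) - r) ** 2))
        if best is None or sse < best[0]:
            best = (sse, w, coef)
    return best[1], best[2]

def ppl_fit(X, y, n_stages=3, seed=0):
    rng = np.random.default_rng(seed)
    stages, r = [], y.astype(float)
    for _ in range(n_stages):            # grow one hidden unit per stage
        w, coef = fit_ridge_stage(X, r, rng)
        stages.append((w, coef))
        r = r - np.polyval(coef, X @ w)  # next unit fits the residual
    return lambda Z: sum(np.polyval(c, Z @ w) for w, c in stages)

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2
model = ppl_fit(X, y)
```

    Each stage here plays the role of one hidden neuron with its own data-driven "activation function" (the fitted ridge polynomial), which is the design freedom the paper exploits for control.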

    Enhancing perceptrons with contrastive biclusters

    No full text

    Incremental particle swarm-guided local search for continuous optimization

    No full text
    We present an algorithm that is inspired by theoretical and empirical results in social learning and swarm intelligence research. The algorithm is based on a framework that we call incremental social learning. In practical terms, the algorithm is a hybrid between a local search procedure and a particle swarm optimization algorithm with a growing population size. The local search procedure provides rapid convergence to good solutions, while the particle swarm algorithm enables a comprehensive exploration of the search space. We provide experimental evidence showing that the algorithm can find good solutions very rapidly without compromising its global search capabilities. © 2008 Springer Berlin Heidelberg. Presented at the 5th International Workshop on Hybrid Metaheuristics (HM 2008), Malaga, Spain, 8-9 October 2008.
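    The hybrid described above can be sketched as a particle swarm whose population grows one particle at a time, with each newcomer initialized near the incumbent best ("social learning") and the best solution periodically refined by local search. The coordinate-wise hill climber and all parameter values below are stand-ins, not the authors' implementation.

```python
import random

def sphere(x):
    return sum(v * v for v in x)

def local_search(x, f, rng, step=0.5, iters=60):
    # Coordinate-wise hill climber, a stand-in for a stronger local method.
    best, fb = list(x), f(x)
    for _ in range(iters):
        i = rng.randrange(len(best))
        for d in (step, -step):
            cand = list(best)
            cand[i] += d
            fc = f(cand)
            if fc < fb:
                best, fb = cand, fc
        step *= 0.97
    return best, fb

def incremental_pso(f, dim=5, max_pop=10, gens_per_add=10, seed=0):
    rng = random.Random(seed)
    swarm = [[rng.uniform(-5.0, 5.0) for _ in range(dim)]]
    vel = [[0.0] * dim]
    pbest = [list(swarm[0])]
    gbest, fg = local_search(swarm[0], f, rng)
    while len(swarm) < max_pop:
        for _ in range(gens_per_add):    # standard PSO velocity updates
            for p in range(len(swarm)):
                for d in range(dim):
                    vel[p][d] = (0.7 * vel[p][d]
                                 + 1.5 * rng.random() * (pbest[p][d] - swarm[p][d])
                                 + 1.5 * rng.random() * (gbest[d] - swarm[p][d]))
                    swarm[p][d] += vel[p][d]
                if f(swarm[p]) < f(pbest[p]):
                    pbest[p] = list(swarm[p])
                if f(swarm[p]) < fg:
                    gbest, fg = list(swarm[p]), f(swarm[p])
        # Incremental social learning: the newcomer is born near the best.
        new = [g + rng.gauss(0, 0.5) for g in gbest]
        swarm.append(new)
        vel.append([0.0] * dim)
        pbest.append(list(new))
        gbest, fg = local_search(gbest, f, rng)  # periodic local refinement
    return gbest, fg

best, val = incremental_pso(sphere)
```

    Starting small keeps early evaluations cheap and lets local search do the fast initial descent; the growing swarm then supplies the broader exploration described in the abstract.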