17 research outputs found

    ParaDisEO-Based Design of Parallel and Distributed Evolutionary Algorithms

    Get PDF
    The original publication is available at www.springerlink.com. ParaDisEO is a framework dedicated to the design of parallel and distributed metaheuristics, including local search methods and evolutionary algorithms. This paper focuses on the latter. We present the three parallel and distributed models implemented in ParaDisEO and show how they can be exploited in a user-friendly, flexible and transparent way. These models can be deployed on distributed-memory machines as well as on shared-memory multiprocessors, taking advantage of the shared memory in the latter case. In addition, we illustrate the instantiation of the models through two applications demonstrating the efficiency and robustness of the framework.
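    The island (multi-population) model is one of the classic parallel EA models such frameworks expose: sub-populations evolve independently and periodically exchange individuals. The sketch below illustrates that idea only; it is not ParaDisEO code (ParaDisEO itself is a C++ framework), and every name and parameter here is an illustrative assumption.

```python
# Illustrative island-model EA (not ParaDisEO's API): several sub-populations
# evolve independently and periodically exchange their best individuals.
import random

GENOME_LEN, POP, ISLANDS, GENS, MIGRATE_EVERY = 32, 20, 4, 100, 10

def fitness(bits):                      # OneMax: count of 1-bits (toy objective)
    return sum(bits)

def mutate(bits, rate=1.0 / GENOME_LEN):
    return [b ^ (random.random() < rate) for b in bits]

def step(pop):
    """One generation of tournament selection plus mutation on one island."""
    new = []
    for _ in range(len(pop)):
        a, b = random.sample(pop, 2)
        parent = a if fitness(a) >= fitness(b) else b
        new.append(mutate(parent))
    return new

islands = [[[random.randint(0, 1) for _ in range(GENOME_LEN)] for _ in range(POP)]
           for _ in range(ISLANDS)]

for gen in range(1, GENS + 1):
    islands = [step(pop) for pop in islands]      # independent evolution (parallelisable)
    if gen % MIGRATE_EVERY == 0:                  # ring migration of each island's best
        bests = [max(pop, key=fitness) for pop in islands]
        for i, pop in enumerate(islands):
            pop[random.randrange(POP)] = bests[(i - 1) % ISLANDS]

print(max(fitness(ind) for pop in islands for ind in pop))
```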

    Evolutionary Latent Class Clustering of Qualitative Data

    Get PDF
    The latent class model, or multivariate multinomial mixture, is a powerful model for clustering discrete data and is expected to be useful for representing non-homogeneous populations. It relies on a conditional independence assumption given the latent class to which a statistical unit belongs. However, whereas a predictive approach to cluster analysis of qualitative data can easily be derived from a fully Bayesian analysis with Jeffreys non-informative prior distributions, it leads to a criterion (the integrated completed likelihood derived from the latent class model) that proves difficult to optimize with the standard EM-based approach. An evolutionary algorithm is designed to tackle this discrete optimization problem, and an extensive parameter study on a large artificial dataset yields stable parameter settings. A Monte Carlo approach is used to validate those settings on other artificial datasets, as well as on some well-known real data: the evolutionary algorithm appears to perform consistently better than other standard clustering techniques on the same data.
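    The core idea of optimizing a discrete clustering criterion with evolutionary search, rather than EM, can be sketched as follows. This is a minimal illustration, not the authors' algorithm: a simple within-cluster category-mismatch score stands in for the integrated completed likelihood, and the operators and parameters are assumptions.

```python
# Sketch of an EA that searches over hard cluster assignments of categorical data.
# A within-cluster category-mismatch score stands in for the integrated completed
# likelihood used in the paper (the ICL itself is not implemented here).
import random
from collections import Counter

def mismatch_score(data, labels, k):
    """Items not matching their cluster's modal category, summed over clusters and variables."""
    score = 0
    for c in range(k):
        rows = [x for x, lab in zip(data, labels) if lab == c]
        if not rows:
            continue
        for j in range(len(rows[0])):
            counts = Counter(r[j] for r in rows)
            score += len(rows) - counts.most_common(1)[0][1]
    return score                                   # lower is better

def evolve(data, k, pop_size=30, gens=200):
    n = len(data)
    pop = [[random.randrange(k) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(gens):
        ranked = sorted(pop, key=lambda lab: mismatch_score(data, lab, k))
        parents = ranked[: pop_size // 2]          # truncation selection
        children = []
        for p in parents:
            child = p[:]
            child[random.randrange(n)] = random.randrange(k)   # reassign one unit
            children.append(child)
        pop = parents + children
    return min(pop, key=lambda lab: mismatch_score(data, lab, k))

# Toy qualitative data: 3 categorical variables, 2 obvious groups
data = [("a", "x", "p")] * 10 + [("b", "y", "q")] * 10
print(evolve(data, k=2))
```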

    Large-Scale Newscast Computing on the Internet

    Get PDF

    The Seamless Peer and Cloud Evolution Framework

    Get PDF
    Evolutionary algorithms are increasingly being applied to problems that are too computationally expensive to run on a single personal computer due to costly fitness function evaluations and/or large numbers of fitness evaluations. Here, we introduce the Seamless Peer And Cloud Evolution (SPACE) framework, which leverages bleeding-edge web technologies to make the computational resources necessary for running large-scale evolutionary experiments available to amateur and professional researchers alike, in a scalable and cost-effective manner, directly from their web browsers. The SPACE framework accomplishes this by distributing fitness evaluations across a heterogeneous pool of cloud compute nodes and peer computers. As a proof of concept, the framework has been attached to the RoboGen open-source platform for the co-evolution of robot bodies and brains; importantly, it has been built in a modular fashion so that it can easily be coupled with other evolutionary computation systems.
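    The control flow of farming fitness evaluations out to a pool of workers can be sketched as below. This is not the SPACE framework itself, which runs over browsers and cloud nodes via web technologies; a local process pool merely stands in for the heterogeneous worker pool, and the fitness function and EA loop are placeholders.

```python
# Generic master/worker pattern for distributed fitness evaluation. A process pool
# stands in for SPACE's heterogeneous pool of cloud nodes and browser peers,
# to show the control flow only.
import random
from concurrent.futures import ProcessPoolExecutor

def expensive_fitness(genome):
    # Placeholder for a costly evaluation (e.g. a robot simulation in RoboGen).
    return -sum((g - 0.5) ** 2 for g in genome)

def evolve(pop_size=16, genome_len=8, gens=20):
    pop = [[random.random() for _ in range(genome_len)] for _ in range(pop_size)]
    with ProcessPoolExecutor() as workers:                     # the "worker pool"
        for _ in range(gens):
            fits = list(workers.map(expensive_fitness, pop))   # evaluations run in parallel
            ranked = [g for _, g in sorted(zip(fits, pop), reverse=True)]
            survivors = ranked[: pop_size // 2]
            pop = survivors + [
                [g + random.gauss(0, 0.05) for g in random.choice(survivors)]
                for _ in range(pop_size - len(survivors))
            ]
    return max(pop, key=expensive_fitness)

if __name__ == "__main__":     # guard required when worker processes are spawned
    print(evolve())
```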

    Genetic programming and serial processing for time series classification

    Full text link
    This work describes an approach devised by the authors for time series classification. In our approach, genetic programming is used in combination with serial processing of the data, where the last output is the classification result. The use of genetic programming for classification, although still a field where more research is needed, is not new. However, the application of genetic programming to classification tasks is normally done by considering the input data as a feature vector; to the best of our knowledge, there are no examples in the genetic programming literature of approaches where the time series data are processed serially and the last output is taken as the classification result. The serial processing approach presented here fills that gap. The approach was tested on three different problems, two of which are real-world problems whose data were gathered for online or conference competitions. As published results exist for these two problems, we can compare the performance of our approach against top-performing methods. The serial processing of data in combination with genetic programming obtained competitive results in both competitions, showing its potential for solving time series classification problems. The main advantage of our serial processing approach is that it can easily handle very large datasets.
    Alfaro-Cid, E., Sharman, K. C., & Esparcia-Alcázar, A. I. (2014). Genetic programming and serial processing for time series classification. Evolutionary Computation, 22(2), 265-285. doi:10.1162/EVCO_a_00110
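    The serial-processing idea can be sketched as follows: an evolved program is applied to the series one sample at a time while carrying state forward, and only the final output is turned into a class label. The sketch is illustrative only; the hand-written update rule stands in for a GP individual and is not taken from the paper.

```python
# Sketch of serial processing for classification: a program is applied to the time
# series one sample at a time, carrying state forward, and the final output is
# thresholded into a class label. The fixed update rule below is a stand-in for an
# evolved GP individual, not the paper's method.

def classify_serial(series, program, threshold=0.0):
    """Run `program(state, x)` over the series; the last output decides the class."""
    state = 0.0
    for x in series:
        state = program(state, x)      # only the final value of `state` is kept
    return 1 if state > threshold else 0

# Example "individual": an exponentially weighted running mean (illustrative only).
program = lambda state, x: 0.9 * state + 0.1 * x

print(classify_serial([0.1, 0.4, 0.9, 1.2], program))       # -> 1
print(classify_serial([-0.2, -0.5, -0.1, -0.3], program))   # -> 0
```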