
    Genetic Algorithm and its Variants: Theory and Applications

    The Genetic Algorithm (GA) is a popular bio-inspired optimization technique based on the concepts of natural genetics and the theory of natural selection proposed by Charles Darwin. The algorithm operates through three basic genetic operators: selection, crossover and mutation. Depending on how these operators are implemented, GA has many variants, such as real-coded GA, binary-coded GA, sawtooth GA, micro GA, improved GA and differential evolution GA. This paper discusses several of these forms of GA and applies them to the problems of function optimization and system identification. The paper makes a comparative analysis of the advantages and disadvantages of the different types of GA, with computer simulations illustrating the results. It also compares the GA technique with the incremental LMS algorithm for system identification.
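
    As a hedged illustration of the three operators mentioned above, the following minimal real-coded GA sketch applies tournament selection, blend crossover and Gaussian mutation to a toy function-optimization task (minimizing the sphere function); the test function, population size and rates are illustrative choices, not taken from the paper.

        # Minimal real-coded GA sketch: selection, crossover, mutation.
        import random

        def sphere(x):
            # Toy objective standing in for the paper's optimization problems.
            return sum(v * v for v in x)

        def tournament(pop, fitness, k=3):
            # Selection: best of k randomly chosen individuals.
            return min(random.sample(pop, k), key=fitness)

        def crossover(a, b):
            # Arithmetic (blend) crossover for real-coded chromosomes.
            w = random.random()
            return [w * x + (1 - w) * y for x, y in zip(a, b)]

        def mutate(x, rate=0.1, sigma=0.3):
            # Gaussian mutation applied gene-wise with a small probability.
            return [v + random.gauss(0, sigma) if random.random() < rate else v for v in x]

        def run_ga(dim=5, pop_size=40, generations=100):
            pop = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
            for _ in range(generations):
                pop = [mutate(crossover(tournament(pop, sphere), tournament(pop, sphere)))
                       for _ in range(pop_size)]
            return min(pop, key=sphere)

        if __name__ == "__main__":
            best = run_ga()
            print("best solution:", best, "fitness:", sphere(best))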

    A Comprehensive Survey on Particle Swarm Optimization Algorithm and Its Applications

    Particle swarm optimization (PSO) is a heuristic global optimization method, originally proposed by Kennedy and Eberhart in 1995, and is now one of the most commonly used optimization techniques. This survey presents a comprehensive investigation of PSO. On the one hand, it covers advances in PSO, including its modifications (quantum-behaved PSO, bare-bones PSO, chaotic PSO, and fuzzy PSO), population topologies (fully connected, von Neumann, ring, star, random, etc.), hybridizations (with the genetic algorithm, simulated annealing, tabu search, artificial immune systems, the ant colony algorithm, artificial bee colony, differential evolution, harmonic search, and biogeography-based optimization), extensions (to multiobjective, constrained, discrete, and binary optimization), theoretical analysis (parameter selection and tuning, and convergence analysis), and parallel implementations (on multicore, multiprocessor, GPU, and cloud computing platforms). On the other hand, it surveys applications of PSO in the following fields: electrical and electronic engineering, automation control systems, communication theory, operations research, mechanical engineering, fuel and energy, medicine, chemistry, and biology. It is hoped that this survey will be beneficial for researchers studying PSO algorithms.
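
    For orientation, the sketch below shows the canonical global-best PSO velocity and position update on a toy objective; the inertia weight and acceleration coefficients are common textbook values and are not drawn from the surveyed literature.

        # Minimal global-best PSO sketch: v = w*v + c1*r1*(pbest - x) + c2*r2*(gbest - x).
        import random

        def sphere(x):
            return sum(v * v for v in x)

        def pso(dim=5, swarm=30, iters=200, w=0.7, c1=1.5, c2=1.5):
            pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(swarm)]
            vel = [[0.0] * dim for _ in range(swarm)]
            pbest = [p[:] for p in pos]
            gbest = min(pbest, key=sphere)
            for _ in range(iters):
                for i in range(swarm):
                    for d in range(dim):
                        r1, r2 = random.random(), random.random()
                        vel[i][d] = (w * vel[i][d]
                                     + c1 * r1 * (pbest[i][d] - pos[i][d])
                                     + c2 * r2 * (gbest[d] - pos[i][d]))
                        pos[i][d] += vel[i][d]
                    if sphere(pos[i]) < sphere(pbest[i]):
                        pbest[i] = pos[i][:]          # update personal best
                gbest = min(pbest, key=sphere)        # update global best
            return gbest

        if __name__ == "__main__":
            best = pso()
            print("gbest fitness:", sphere(best))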

    Population-based algorithms for improved history matching and uncertainty quantification of Petroleum reservoirs

    In modern field management practices, there are two important steps that shed light on a multimillion-dollar investment. The first step is history matching, where the simulation model is calibrated to reproduce the historical observations from the field. In this inverse problem, different geological and petrophysical properties may provide equally good history matches. Such diverse models are likely to show different production behaviors in the future. This ties history matching to the second step, uncertainty quantification of predictions. Multiple history-matched models are essential for a realistic uncertainty estimate of the future field behavior. These two steps facilitate decision making and have a direct impact on the technical and financial performance of oil and gas companies. Population-based optimization algorithms have recently enjoyed growing popularity for solving engineering problems. Population-based systems work with a group of individuals that cooperate and communicate to accomplish a task that is normally beyond the capabilities of each individual; these individuals are deployed with the aim of solving the problem with maximum efficiency. This thesis introduces the application of two novel population-based algorithms for history matching and uncertainty quantification of petroleum reservoir models. Ant colony optimization and differential evolution algorithms are used to search the space of parameters to find multiple history-matched models and, using a Bayesian framework, the posterior probability of the models is evaluated for prediction of reservoir performance. It is demonstrated that by bringing in recent developments in computer science such as ant colony optimization, differential evolution and multiobjective optimization, we can improve the history matching and uncertainty quantification frameworks. This thesis provides insights into the performance of these algorithms in history matching and prediction and develops an understanding of their tuning parameters. The research also includes a comparative study of these methods with a benchmark technique, the Neighbourhood Algorithm. This comparison reveals the superiority of the proposed methodologies in areas such as computational efficiency and match quality.
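
    As a rough illustration of how one of the above algorithms can search a parameter space, the following differential evolution (DE/rand/1/bin) sketch minimizes a hypothetical misfit function standing in for a history-match mismatch; the target values, bounds and control settings are invented for the example and are not those of the thesis.

        # Illustrative DE/rand/1/bin sketch searching a toy parameter space.
        import random

        def misfit(params):
            # Hypothetical stand-in for a simulated-vs-observed data mismatch.
            target = [1.0, -2.0, 0.5, 3.0]
            return sum((p - t) ** 2 for p, t in zip(params, target))

        def differential_evolution(dim=4, pop_size=20, gens=150, F=0.8, CR=0.9):
            pop = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
            for _ in range(gens):
                for i in range(pop_size):
                    a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
                    j_rand = random.randrange(dim)
                    # Mutation and binomial crossover build the trial vector.
                    trial = [a[d] + F * (b[d] - c[d])
                             if (random.random() < CR or d == j_rand) else pop[i][d]
                             for d in range(dim)]
                    if misfit(trial) <= misfit(pop[i]):
                        pop[i] = trial        # greedy selection keeps the better vector
            return min(pop, key=misfit)

        if __name__ == "__main__":
            best = differential_evolution()
            print("best parameters:", best, "misfit:", misfit(best))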

    Artificial Intelligence Approach for Seismic Control of Structures

    In the first part of this research, the use of tuned mass dampers for the vibration control of tall buildings during earthquake excitations is studied. The main issues, such as optimizing the parameters of the dampers and studying the effects of the frequency content of the target earthquakes, are addressed. The non-dominated sorting genetic algorithm is improved by upgrading its genetic operators and is used to develop a framework for determining the optimum placement and parameters of dampers in tall buildings. A case study is presented in which the optimal placement and properties of dampers are determined for a model of a tall building under different earthquake excitations through computer simulations. In the second part, a novel framework for brain learning-based intelligent seismic control of smart structures is developed. In this approach, a deep neural network learns how to improve structural responses during earthquake excitations using feedback control. A reinforcement learning method is improved and used to develop a framework for training the deep neural network as an intelligent controller. The efficiency of the developed framework is examined through two case studies: a single-degree-of-freedom system and a high-rise building under different earthquake excitation records. The results show that the controller gradually develops an optimum control policy to reduce the vibrations of a structure under an earthquake excitation through a cyclical process of actions and observations. It is shown that the controller efficiently improves the structural responses under new earthquake excitations for which it was not trained. Moreover, it is shown that the controller has a stable performance under uncertainties.
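
    The cyclical action-observation process described above can be sketched, in a greatly simplified form, as a closed control loop: a policy observes the state of a toy single-degree-of-freedom oscillator, applies a control force, and is improved by a simple policy search. This stands in for the paper's deep reinforcement learning controller; the system parameters, excitation and search rule are hypothetical.

        # Simplified action-observation loop: a linear feedback policy is tuned by
        # random search to reduce the response of a toy SDOF oscillator.
        import math, random

        def simulate(policy_gain, steps=2000, dt=0.01):
            # SDOF system: m*x'' + c*x' + k*x = -m*ag(t) + u(t)
            m, c, k = 1.0, 0.1, 10.0
            x, v, cost = 0.0, 0.0, 0.0
            for n in range(steps):
                ag = math.sin(2.0 * math.pi * n * dt)            # synthetic ground motion
                u = -(policy_gain[0] * x + policy_gain[1] * v)   # action from observed state
                a = (-c * v - k * x - m * ag + u) / m
                v += a * dt
                x += v * dt
                cost += x * x * dt                               # accumulated response measure
            return cost

        def train(episodes=200, sigma=0.5):
            gains, best = [0.0, 0.0], simulate([0.0, 0.0])
            for _ in range(episodes):
                trial = [g + random.gauss(0, sigma) for g in gains]
                c = simulate(trial)
                if c < best:                                     # keep policies that reduce response
                    gains, best = trial, c
            return gains, best

        if __name__ == "__main__":
            gains, cost = train()
            print("learned gains:", gains, "response cost:", cost)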

    Coevolutionary particle swarm optimization using AIS and its application in multiparameter estimation of PMSM

    In this paper, a coevolutionary particle swarm optimization (PSO) algorithm incorporating the artificial immune principle is proposed. In the proposed algorithm, the whole population is divided into two kinds of subpopulations: one elite subpopulation and several normal subpopulations. The best individual of each normal subpopulation is memorized into the elite subpopulation during the evolution process. A hybrid method, which creates new individuals using three different operators, is presented to ensure the diversity of all the subpopulations. Furthermore, a simple adaptive wavelet learning operator is used to accelerate the convergence of the pbest particles. An improved immune clonal selection operator is employed for optimizing the elite subpopulation, while a migration scheme is employed for information exchange between the elite subpopulation and the normal subpopulations. The performance of the proposed algorithm is verified on a suite of standard benchmark functions, which shows its fast convergence and global search ability. Its performance is further evaluated through its application to multiparameter estimation of permanent-magnet synchronous machines, where it significantly outperforms existing PSO variants. The proposed algorithm can estimate the machine dq-axis inductances, stator winding resistance, and rotor flux linkage simultaneously. © 2013 IEEE
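
    A structural sketch of the population scheme described above follows: several normal subpopulations evolve with a plain PSO step, the best member of each is memorized into an elite set, and elite members migrate back into the normal subpopulations. The paper's wavelet learning and immune clonal selection operators are omitted, and all settings are illustrative.

        # Elite/normal subpopulation scheme with memorization and migration.
        import random

        def sphere(x):
            return sum(v * v for v in x)

        def pso_step(pos, vel, pbest, w=0.7, c1=1.5, c2=1.5):
            gbest = min(pbest, key=sphere)
            for i, p in enumerate(pos):
                for d in range(len(p)):
                    r1, r2 = random.random(), random.random()
                    vel[i][d] = (w * vel[i][d] + c1 * r1 * (pbest[i][d] - p[d])
                                 + c2 * r2 * (gbest[d] - p[d]))
                    p[d] += vel[i][d]
                if sphere(p) < sphere(pbest[i]):
                    pbest[i] = p[:]

        def coevolutionary_pso(n_subpops=3, size=10, dim=5, gens=100):
            subs = []
            for _ in range(n_subpops):
                pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(size)]
                subs.append((pos, [[0.0] * dim for _ in range(size)], [p[:] for p in pos]))
            elite = []
            for _ in range(gens):
                for pos, vel, pbest in subs:
                    pso_step(pos, vel, pbest)
                    elite.append(min(pbest, key=sphere)[:])    # memorize subpopulation best
                elite = sorted(elite, key=sphere)[:n_subpops]  # keep a small elite set
                for pos, vel, pbest in subs:                   # migration: elite back to subpops
                    pos[random.randrange(size)] = random.choice(elite)[:]
            return min(elite, key=sphere)

        if __name__ == "__main__":
            best = coevolutionary_pso()
            print("best fitness:", sphere(best))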

    Contributions on evolutionary computation for statistical inference

    Evolutionary Computation (EC) techniques were introduced in the 1960s for dealing with complex situations, one example being an optimization problem that has no analytical solution or is computationally intractable; in many such cases these methods, named Evolutionary Algorithms (EAs), have been successfully implemented. In statistics there are many situations where complex problems arise, in particular concerning optimization. A general example is when the statistician needs to select, from a prohibitively large discrete set, just one element, which could be a model, a partition, an experiment, or the like: this is the case in model selection, cluster analysis or design of experiments. In other situations there may be an intractable function of the data, such as a likelihood, which needs to be maximized, as happens in model parameter estimation. These kinds of problems are naturally well suited for EAs, and in the last 20 years a large number of papers have been concerned with applications of EAs to statistical issues. The present dissertation is set in this part of the literature, as it reports several implementations of EAs in statistics, while being mainly focused on statistical inference problems. Original results are proposed, as well as overviews and surveys of several topics. EAs are employed and analyzed from various statistical points of view, showing and confirming their efficiency and flexibility. The first proposal is devoted to parametric estimation problems. When EAs are employed in such analyses, a novel form of variability related to their stochastic elements is introduced. We analyze both the variability due to sampling, associated with the selected estimator, and the variability due to the EA. This analysis is set in the framework of the statistical and computational trade-off question, crucial in present-day problems, by introducing cost functions related to both data acquisition and EA iterations. The proposed method is illustrated by means of model-building examples. The subsequent chapter is concerned with EAs employed in Markov Chain Monte Carlo (MCMC) sampling. When sampling from multimodal or highly correlated distributions, a possible strategy is to run several chains in parallel in order to improve their mixing. If these chains are allowed to interact with each other, many analogies with EC techniques can be observed, and this has led to research in many fields. The chapter reviews various methods found in the literature that combine EC techniques and MCMC sampling, in order to identify specific and common procedures, and unifies them in an EC framework. In the last proposal we present a complex time series model and an identification procedure based on Genetic Algorithms (GAs). The model is capable of dealing with seasonality, through Periodic AutoRegressive (PAR) modelling, and with structural changes in time, leading to a nonstationary structure. Since a very large number of parameters and possible change points are involved, GAs are appropriate for identifying such a model. The effectiveness of the procedure is shown on both simulated data and real examples, the latter referring to river flow data in hydrology. The thesis concludes with some final remarks, also concerning future work.
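
    As a small, hedged illustration of the likelihood-maximization use case mentioned above, the sketch below maximizes a Gaussian log-likelihood over (mu, sigma) with a simple (1+lambda) evolution strategy; the normal likelihood is of course tractable and merely stands in for the intractable objectives the dissertation has in mind.

        # EA-based maximization of a toy log-likelihood.
        import math, random

        data = [random.gauss(2.0, 1.5) for _ in range(200)]   # synthetic sample

        def log_likelihood(theta):
            mu, sigma = theta
            if sigma <= 0:
                return float("-inf")
            return sum(-0.5 * math.log(2 * math.pi * sigma ** 2)
                       - (x - mu) ** 2 / (2 * sigma ** 2) for x in data)

        def evolve(generations=300, offspring=10, step=0.2):
            parent = [0.0, 1.0]                               # initial (mu, sigma)
            for _ in range(generations):
                children = [[p + random.gauss(0, step) for p in parent]
                            for _ in range(offspring)]
                parent = max(children + [parent], key=log_likelihood)
            return parent

        if __name__ == "__main__":
            mu, sigma = evolve()
            print("EA estimates: mu =", mu, "sigma =", sigma)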

    Metaheuristic design of feedforward neural networks: a review of two decades of research

    Over the past two decades, feedforward neural network (FNN) optimization has been a key interest among researchers and practitioners in multiple disciplines. FNN optimization is often viewed from various perspectives: the optimization of weights, network architecture, activation nodes, learning parameters, the learning environment, etc. Researchers have adopted such different viewpoints mainly to improve the FNN's generalization ability. Gradient-descent algorithms such as backpropagation have been widely applied to optimize FNNs, and their success is evident from the FNN's application to numerous real-world problems. However, due to the limitations of gradient-based optimization methods, metaheuristic algorithms, including evolutionary algorithms, swarm intelligence, etc., are still being widely explored by researchers aiming to obtain a well-generalized FNN for a given problem. This article attempts to summarize a broad spectrum of FNN optimization methodologies, including conventional and metaheuristic approaches. It also tries to connect the various research directions that have emerged from FNN optimization practice, such as evolving neural networks (NNs), cooperative coevolution NNs, complex-valued NNs, deep learning, extreme learning machines, quantum NNs, etc. Additionally, it provides interesting research challenges for future research to cope with the present information-processing era.
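
    As a hedged sketch of the core idea, the example below optimizes the weights of a tiny 2-2-1 feedforward network with a simple evolution strategy instead of backpropagation, using XOR as a toy task; the architecture and settings are illustrative only.

        # Metaheuristic (evolution-strategy) training of a small FNN on XOR.
        import math, random

        XOR = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]

        def forward(w, x):
            # 2-2-1 network: w[0:4] hidden weights, w[4:6] hidden biases,
            # w[6:8] output weights, w[8] output bias.
            h = [math.tanh(w[2 * j] * x[0] + w[2 * j + 1] * x[1] + w[4 + j]) for j in range(2)]
            return 1 / (1 + math.exp(-(w[6] * h[0] + w[7] * h[1] + w[8])))

        def loss(w):
            return sum((forward(w, x) - y) ** 2 for x, y in XOR)

        def evolve_weights(generations=500, offspring=20, step=0.3):
            parent = [random.uniform(-1, 1) for _ in range(9)]
            for _ in range(generations):
                children = [[wi + random.gauss(0, step) for wi in parent]
                            for _ in range(offspring)]
                parent = min(children + [parent], key=loss)   # keep the fittest weight vector
            return parent

        if __name__ == "__main__":
            w = evolve_weights()
            for x, y in XOR:
                print(x, "->", round(forward(w, x), 3), "target", y)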