346 research outputs found

    A novel population-based multi-objective CMA-ES and the impact of different constraint handling techniques

    The Covariance Matrix Adaptation Evolution Strategy (CMA-ES) is a well-known, state-of-the-art optimization algorithm for single-objective real-valued problems, especially in black-box settings. Although several extensions of CMA-ES to multi-objective (MO) optimization exist, no extension incorporates a key component of the most robust and general CMA-ES variant: the association of a population with each Gaussian distribution that drives optimization. To achieve this, we use a recently introduced framework for extending population-based algorithms from single- to multi-objective optimization. We compare, using six well-known benchmark problems, the performance of the newly constructed MO-CMA-ES with existing variants and with the estimation of distribution algorithm (EDA) known as iMAMaLGaM, which is also an instance of the framework, extending the single-objective EDA iAMaLGaM to MO. Results underline the advantages of being able to use populations. Because many real-world problems have constraints, we also study the use of four constraint-handling techniques. We find that CMA-ES is typically less robust to these techniques than iAMaLGaM. Moreover, whereas we could verify that a penalty method previously used in the literature leads to fast convergence, we also find that it carries a high risk of finding only nearly, but not entirely, feasible solutions. We therefore propose that other constraint-handling techniques should be preferred in general.
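
    The abstract does not give the penalty formulation it critiques, but the failure mode it describes is easy to illustrate. Below is a minimal, hypothetical sketch of a static penalty method in Python (the function names and penalty weight are illustrative, not taken from the paper): with a modest penalty weight, a slightly infeasible point can outscore a feasible one, which is exactly the "nearly, but not entirely, feasible" risk noted above.

```python
def penalized_fitness(f, constraints, x, weight=1.0):
    """Static penalty: objective value plus weighted constraint violation.

    f           -- objective function to minimize
    constraints -- functions g_i(x), feasible when g_i(x) <= 0
    weight      -- penalty coefficient (hypothetical value)
    """
    violation = sum(max(0.0, g(x)) for g in constraints)
    return f(x) + weight * violation

# Example: minimize x^2 subject to x >= 1 (i.e. 1 - x <= 0).
f = lambda x: x ** 2
g = lambda x: 1.0 - x

# The slightly infeasible point wins under this weight, illustrating
# the risk of converging to nearly-feasible solutions.
print(penalized_fitness(f, [g], 0.999))  # ~0.9990 (infeasible by 0.001)
print(penalized_fitness(f, [g], 1.0))    # 1.0 (feasible)
```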

    A MOPSO Algorithm Based Exclusively on Pareto Dominance Concepts

    Copyright © 2005 Springer-Verlag. The final publication is available at link.springer.com. 3rd International Conference, EMO 2005, Guanajuato, Mexico, March 9-11, 2005, Proceedings. Book title: Evolutionary Multi-Criterion Optimization.

    In extending the Particle Swarm Optimisation methodology to multi-objective problems it is unclear how global guides for particles should be selected. Previous work has relied on metric information in objective space, although this is at variance with the notion of dominance which is used to assess the quality of solutions. Here we propose methods based exclusively on dominance for selecting guides from a non-dominated archive. The methods are evaluated on standard test problems and we find that probabilistic selection favouring archival particles that dominate few particles provides good convergence towards and coverage of the Pareto front. We demonstrate that the scheme is robust to changes in objective scaling. We propose and evaluate methods for confining particles to the feasible region, and find that allowing particles to explore regions close to the constraint boundaries is important to ensure convergence to the Pareto front.
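
    As a concrete illustration of dominance-only guide selection, the following sketch (assuming minimization with objective vectors as tuples; the exact selection probabilities used in the paper may differ) picks an archive member with probability weighted towards those that dominate few swarm particles.

```python
import random

def dominates(a, b):
    """Pareto dominance for minimization: a dominates b."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def select_guide(archive, swarm):
    """Probabilistically pick a global guide from the non-dominated archive,
    favouring archive members that dominate few swarm particles."""
    counts = [sum(dominates(a, p) for p in swarm) for a in archive]
    weights = [1.0 / (1 + c) for c in counts]  # fewer dominated -> higher weight
    return random.choices(archive, weights=weights, k=1)[0]

# Two-objective minimization example; objective vectors as tuples.
archive = [(0.1, 0.9), (0.5, 0.5), (0.9, 0.1)]
swarm   = [(0.2, 1.0), (0.6, 0.7), (1.0, 0.3), (0.8, 0.8)]
print(select_guide(archive, swarm))
```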

    A Study of Simulated Annealing Techniques for Multi-Objective Optimisation

    Many areas in which computational optimisation may be applied are multi-objective optimisation problems: those in which multiple objectives must be minimised or maximised simultaneously. Where (as is usually the case) these are competing objectives, the optimisation involves the discovery of a set of solutions whose quality cannot be distinguished without further preference information regarding the objectives. A large body of literature exists documenting the study and application of evolutionary algorithms to multi-objective optimisation, with particular focus given to evolutionary strategy techniques, which demonstrate the ability to converge rapidly to desired solutions on many problems. Simulated annealing is a single-objective optimisation technique which is provably convergent, making it a tempting technique for extension to multi-objective optimisation. Previous proposals for extending simulated annealing to the multi-objective case have mostly taken the form of a traditional single-objective simulated annealer optimising a composite (often summed) function of the objectives.

    The first part of this thesis introduces an alternative method for multi-objective simulated annealing based on the dominance relation, which operates without assigning preference information to the objectives. Non-generic improvements to this algorithm are presented, providing methods for generating more desirable suggestions for new solutions. This new method is shown to exhibit rapid convergence to the desired set, dependent upon the properties of the problem, with empirical results on a range of popular test problems and comparisons to the popular NSGA-II genetic algorithm and a leading multi-objective simulated annealer from the literature. The new algorithm is applied to the commercial optimisation of CDMA mobile telecommunication networks and is shown to perform well on this problem.

    The second part of this thesis investigates the effects of a range of optimiser properties upon convergence. New algorithms possessing the properties under investigation are proposed. The relationship between evolutionary strategies and the simulated annealing techniques is illustrated, and an explanation of the differing performance of the previously proposed algorithms across a standard test suite is given. The properties of problems on which simulated annealing approaches are desirable are investigated, and new problems are proposed to best provide comparisons between different simulated annealing techniques.
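
    One common way to realise dominance-based annealing, sketched below under assumed definitions (the thesis's exact energy measure may differ), is to define a solution's energy as the fraction of an archive that dominates it and to apply the usual Metropolis acceptance rule to the energy difference.

```python
import math, random

def dominates(a, b):
    """Pareto dominance for minimization."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def energy(point, archive):
    """Energy: fraction of the archive that dominates the point."""
    if not archive:
        return 0.0
    return sum(dominates(a, point) for a in archive) / len(archive)

def accept(current, candidate, archive, temperature):
    """Metropolis acceptance on the dominance-based energy difference."""
    delta = energy(candidate, archive) - energy(current, archive)
    return delta <= 0 or random.random() < math.exp(-delta / temperature)

# A candidate dominated by less of the archive is always accepted.
archive = [(0.1, 0.9), (0.5, 0.5), (0.9, 0.1)]
print(accept((0.8, 0.8), (0.4, 0.6), archive, temperature=0.05))
```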

    Optimization of Low Reynolds Number Airfoils for Martian Rotor Applications Using an Evolutionary Algorithm

    The Mars Helicopter (MH) will be flying on the NASA Mars 2020 rover mission scheduled to launch in July of 2020. Research is being performed at the Jet Propulsion Laboratory (JPL) and NASA Ames Research Center to extend the current capabilities and develop the Mars Science Helicopter (MSH) as the next possible step for Martian rotorcraft. The low atmospheric density and the relatively small-scale rotors result in very low chord-based Reynolds number flows over the rotor airfoils. The low Reynolds number regime results in rapid performance degradation for conventional airfoils due to laminar separation without reattachment. Unconventional airfoil shapes with sharp leading edges are explored and optimized for aerodynamic performance at representative Reynolds-Mach combinations for a concept rotor. Sharp leading edges initiate immediate flow separation, and the occurrence of large-scale vortex shedding is found to contribute to the relative performance increase of the optimized airfoils compared to conventional airfoil shapes. The oscillations are shown to occur independently of laminar-turbulent transition and therefore result in sustained performance at lower Reynolds numbers. Comparisons are presented to conventional airfoil shapes, and peak lift-to-drag ratio increases between 17% and 41% are observed for similar section lift.

    A fuzzy decision variables framework for large-scale multiobjective optimization

    In large-scale multiobjective optimization, too many decision variables hinder the convergence of evolutionary algorithms. Reducing the search range of the decision space can significantly alleviate this problem. With this in mind, this paper proposes a fuzzy decision variables framework for large-scale multiobjective optimization. The framework divides the entire evolutionary process into two main stages: fuzzy evolution and precise evolution. In fuzzy evolution, we blur the decision variables of the original solution to reduce the search range of the evolutionary algorithm in the decision space so that the evolutionary population can quickly converge. The degree of fuzzification gradually decreases as evolution proceeds. Once the population approximately converges, the framework turns to precise evolution. In precise evolution, the actual decision variables of the solution are directly optimized to increase the diversity of the population so as to move closer to the true Pareto optimal front. Finally, this paper embeds some representative algorithms into the proposed framework and verifies the framework's effectiveness through comparative experiments on various large-scale multiobjective problems with 500 to 5000 decision variables. Experimental results show that in large-scale multiobjective optimization, the proposed framework can significantly improve the performance and computational efficiency of multiobjective optimization algorithms.
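
    The paper's fuzzification operator is not specified in the abstract; the sketch below conveys the idea under an assumed grid-snapping scheme (the function and the `levels0` coarseness parameter are hypothetical): decision variables are blurred onto a coarse grid early on, and the grid is refined as evolution proceeds.

```python
import numpy as np

def fuzzify(x, generation, max_gen, lower, upper, levels0=4):
    """Snap decision variables onto a coarse grid whose resolution grows
    with the generation count, so blurring decreases as evolution proceeds."""
    progress = generation / max_gen
    levels = int(levels0 + progress * (1000 - levels0))  # hypothetical schedule
    step = (upper - lower) / levels
    return lower + np.round((x - lower) / step) * step

x = np.random.uniform(0.0, 1.0, size=500)  # 500 decision variables
early = fuzzify(x, generation=1,  max_gen=100, lower=0.0, upper=1.0)
late  = fuzzify(x, generation=90, max_gen=100, lower=0.0, upper=1.0)
print(len(np.unique(early)), len(np.unique(late)))  # coarse early, fine late
```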

    Surrogate-Assisted Optimisation-Based Verification & Validation

    This thesis deals with the application of optimisation-based Validation and Verification (V&V) analysis to aerospace vehicles in order to determine their worst-case performance metrics. To this end, three aerospace models relating to satellite and launcher vehicles provided by the European Space Agency (ESA) on various projects are utilised. As a means to quicken the process of optimisation-based V&V analysis, surrogate models are developed using the polynomial chaos method. Surrogate models provide a quick way to ascertain the worst-case directions, as the computation time required for evaluating them is very small: a single evaluation of a surrogate model takes less than a second. Another contribution of this thesis is the evaluation of the operational safety margin metric with the help of surrogate models. The operational safety margin is a metric defined in the uncertain parameter space and is related to the distance between the nominal parameter value and the first instance of performance criteria violation. This metric can help to gauge the robustness of the controller but requires the evaluation of the model in the constraint function and hence can be computationally intensive. As surrogate models are computationally very cheap, they are utilised to rapidly compute the operational safety margin metric. However, this metric focuses only on finding a safe region around the nominal parameter value, and the possibility of other disjoint safe regions is not explored. In order to find other safe or failure regions in the parameter space, the Bernstein expansion method is applied to the surrogate polynomial models to help characterise the uncertain parameter space into safe and failure regions. Furthermore, binomial failure analysis is used to assign failure probabilities to failure regions, which can help the designer determine whether a re-design of the controller is required. The methodologies of optimisation-based V&V, surrogate modelling, operational safety margin, the Bernstein expansion method, and risk assessment have been combined to form the WCAT-II MATLAB toolbox.
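
    A minimal sketch of the operational safety margin computation described above, assuming a radial line search on a cheap surrogate (the surrogate, names, and search scheme are illustrative; the thesis uses polynomial chaos surrogates within the WCAT-II toolbox):

```python
import numpy as np

def safety_margin(surrogate, nominal, direction, threshold, r_max=1.0, steps=200):
    """Distance from the nominal parameter value to the first predicted
    performance-criterion violation along one direction of the uncertain
    parameter space, evaluated on a cheap surrogate model."""
    direction = direction / np.linalg.norm(direction)
    for r in np.linspace(0.0, r_max, steps):
        if surrogate(nominal + r * direction) > threshold:
            return r      # first violation encountered at radius r
    return r_max          # no violation found within the search range

# Toy quadratic surrogate standing in for a polynomial-chaos model.
surrogate = lambda p: float(p @ p)
nominal = np.zeros(3)
print(safety_margin(surrogate, nominal, np.array([1.0, 0.0, 0.0]), threshold=0.25))
```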

    Advanced VLBI Imaging

    Very Long Baseline Interferometry (VLBI) is an observational technique developed in astronomy for combining multiple radio telescopes into a single virtual instrument with an effective aperture reaching up to many thousands of kilometers, enabling measurements at the highest angular resolutions. Celebrated examples of applying VLBI to astrophysical studies include detailed, high-resolution images of the innermost parts of relativistic outflows (jets) in active galactic nuclei (AGN) and recent pioneering observations of the shadows of supermassive black holes (SMBH) in the center of our Galaxy and in the galaxy M87. Despite these and many other proven successes of VLBI, analysis and imaging of VLBI data remain difficult, owing in part to the fact that VLBI imaging inherently constitutes an ill-posed inverse problem. Historically, this problem has been addressed in radio interferometry by the CLEAN algorithm, a matching-pursuit inverse modeling method developed in the early 1970s and since then established as the de facto standard approach for imaging VLBI data. In recent years, the constantly increasing demand for improved quality and fidelity of interferometric image reconstruction has resulted in several attempts to employ new approaches, such as forward modeling and Bayesian estimation, for VLBI imaging. While the current state-of-the-art forward modeling and Bayesian techniques may outperform CLEAN in terms of accuracy, resolution, robustness, and adaptability, they also tend to require more complex structure and longer computation times, and rely on extensive fine-tuning of a larger number of non-trivial hyperparameters. This leaves ample room for further searches for potentially more effective imaging approaches and provides the main motivation for this dissertation and its particular focus on the need to unify algorithmic frameworks and to study VLBI imaging from the perspective of inverse problems in general. In pursuit of this goal, and based on an extensive qualitative comparison of the existing methods, this dissertation comprises the development, testing, and first implementations of two novel concepts for improved interferometric image reconstruction. The concepts combine the known benefits of current forward modeling techniques, develop more automatic and less supervised algorithms for image reconstruction, and realize them within two different frameworks. The first framework unites multiscale imaging algorithms in the spirit of compressive sensing with a dictionary adapted to the uv-coverage and its defects (DoG-HiT, DoB-CLEAN). We extend this approach to dynamical imaging and polarimetric imaging. The core components of this framework are realized in the multidisciplinary and multipurpose software MrBeam, developed as part of this dissertation. The second framework employs a multiobjective evolutionary algorithm (MOEA/D) for the purpose of achieving fully unsupervised image reconstruction and hyperparameter optimization. These new methods are shown to outperform existing methods in various metrics such as angular resolution, structural sensitivity, and degree of supervision. We demonstrate the great potential of these new techniques with selected applications to frontline VLBI observations of AGN jets and SMBH.

    In addition to improving the quality and robustness of image reconstruction, DoG-HiT, DoB-CLEAN and MOEA/D also provide such novel capabilities as dynamic reconstruction of polarimetric images on minute time-scales, and near-real-time, unsupervised data analysis (useful in particular for application to large imaging surveys). The techniques and software developed in this dissertation are of interest for a wider range of inverse problems as well, including such diverse fields as Ly-alpha tomography (where we improve estimates of the thermal state of the intergalactic medium), the cosmographic search for dark matter (where we improve forecasted bounds on ultralight dilatons), medical imaging, and solar spectroscopy.
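
    The abstract names MOEA/D as the engine for unsupervised hyperparameter optimization. A generic sketch of the weighted Tchebycheff scalarization on which MOEA/D-style decomposition rests is given below (the objective set shown is hypothetical; the dissertation's imaging objectives differ):

```python
import numpy as np

def tchebycheff(objectives, weights, ideal):
    """Weighted Tchebycheff scalarization: the worst weighted deviation
    from the ideal point, one scalar subproblem per weight vector."""
    return float(np.max(weights * np.abs(objectives - ideal)))

# Hypothetical imaging objectives: data misfit, sparsity term, entropy term.
objectives = np.array([0.8, 0.3, 0.5])
ideal = np.zeros(3)  # utopian point
for w in ([0.7, 0.2, 0.1], [0.1, 0.2, 0.7]):
    print(tchebycheff(objectives, np.array(w), ideal))
```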

    Methodology for Comparison of Algorithms for Real-World Multi-objective Optimization Problems: Space Surveillance Network Design

    Space Situational Awareness (SSA) is an activity vital to protecting national and commercial satellites from damage or destruction due to collisions. Recent research has demonstrated a methodology using evolutionary algorithms (EAs) intended to develop near-optimal Space Surveillance Network (SSN) architectures in the sense of low cost, low latency, and high resolution. That research is extended here by (1) developing and applying a methodology to compare the performance of two or more algorithms against this problem, and (2) analyzing the effects of using reduced data sets in those searches. Computational experiments are presented in which the performance of five multi-objective search algorithms is compared using four binary comparison methods, each quantifying the relationship between two solution sets in a different way. The relative rankings reveal strengths and weaknesses of the evaluated algorithms, enabling researchers to select the best algorithm for their specific needs. The use of reduced data sets is shown to be useful for producing relative rankings of algorithms that are representative of the rankings produced using the full set.
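
    The four binary comparison methods used are not named in the abstract. As an example of the kind of method meant, the sketch below implements Zitzler's coverage indicator C(A, B), a widely used binary comparison of two solution sets (whether it is among the paper's four is an open assumption):

```python
def weakly_dominates(a, b):
    """a weakly dominates b (minimization): no objective is worse."""
    return all(x <= y for x, y in zip(a, b))

def coverage(A, B):
    """Zitzler's C-metric: fraction of set B weakly dominated by at
    least one member of set A. Note C(A, B) != C(B, A) in general."""
    return sum(any(weakly_dominates(a, b) for a in A) for b in B) / len(B)

A = [(1.0, 4.0), (2.0, 2.0), (4.0, 1.0)]
B = [(1.5, 4.5), (3.0, 3.0), (5.0, 0.5)]
print(coverage(A, B), coverage(B, A))  # asymmetric: 2/3 vs 0.0 here
```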

    Modelling and quantification of structural uncertainties in petroleum reservoirs assisted by a hybrid Cartesian cut cell/enriched multipoint flux approximation approach

    Efficient and profitable oil production depends on making reliable predictions about reservoir performance. However, restricted knowledge about reservoir distributed properties and reservoir structure calls for History Matching, in which the reservoir model is calibrated to emulate the field's observed history. Such an inverse problem yields multiple history-matched models which might result in different predictions of reservoir performance. Uncertainty Quantification constrains the resulting model uncertainties and boosts the reliability of forecasts of future reservoir behaviour. Conventional approaches to Uncertainty Quantification ignore large-scale uncertainties related to reservoir structure, even though structural uncertainties can influence reservoir forecasts more strongly than petrophysical uncertainty. What makes the quantification of structural uncertainty impracticable is the need for global regridding at each step of the History Matching process. To resolve this obstacle, we develop an efficient methodology based on the Cartesian Cut Cell Method, which decouples the model from its representation on the grid and allows uncertain structures to be varied as part of the History Matching process. Reduced numerical accuracy due to cell degeneracies in the vicinity of geological structures is adequately compensated for by an enhanced scheme from the class of Locally Conservative Flux-Continuous Methods (the Extended Enriched Multipoint Flux Approximation Method, abbreviated to extended EMPFA). The robustness and consistency of the proposed hybrid Cartesian Cut Cell/extended EMPFA approach are demonstrated in terms of true representation of the influence of geological structures on flow behaviour. In this research, the general framework of Uncertainty Quantification is extended and equipped by the proposed approach to tackle uncertainties of different structures such as reservoir horizons, bedding layers, faults and pinchouts. Significant improvements in the quality of reservoir recovery forecasts and reservoir volume estimation are presented for synthetic models of uncertain structures. This thesis also provides a comparative study of the influence of structural uncertainty on reservoir forecasts among various geological structures.
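
    History Matching, as described above, amounts to minimising a misfit between simulated and observed production history over uncertain (here also structural) parameters. A minimal sketch under toy assumptions (the `simulate` forward model stands in for the hybrid cut cell/extended EMPFA simulator and is not from the thesis):

```python
import numpy as np

def misfit(simulate, theta, observed, sigma):
    """Weighted least-squares mismatch between simulated and observed
    production history for parameter vector theta."""
    return float(np.sum(((simulate(theta) - observed) / sigma) ** 2))

# Toy forward model: decline-curve rates q(t) = q0 * exp(-d * t).
t = np.linspace(0.0, 10.0, 20)
simulate = lambda theta: theta[0] * np.exp(-theta[1] * t)
observed = simulate(np.array([100.0, 0.3]))  # synthetic "observed history"
print(misfit(simulate, np.array([90.0, 0.25]), observed, sigma=1.0))
```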