
    Multi-objective optimisation in the presence of uncertainty

    2005 IEEE Congress on Evolutionary Computation, Edinburgh, Scotland, 2-5 September 2005. The codebase for this paper is available at https://github.com/fieldsend/ieee_cec_2005_bayes_uncertain. There has been only limited discussion of the effect of uncertainty and noise in multi-objective optimisation problems and how to deal with it. Here we address this problem by assessing the probability of dominance and maintaining an archive of solutions which are, with some known probability, mutually non-dominating. We examine methods for estimating the probability of dominance; these depend crucially on estimating the effective noise variance, and we introduce a novel method of learning the variance during optimisation. Probabilistic domination contours are presented as a method for conveying the confidence that may be placed in objectives that are optimised in the presence of uncertainty.
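
    A minimal sketch (not the paper's code, which is at the GitHub link above) of one common way to estimate the probability of dominance: assume independent additive Gaussian noise on each objective and multiply the per-objective probabilities. The function name, noise model and example values are illustrative assumptions.

```python
import math

def prob_dominates(mean_a, mean_b, var_a, var_b):
    """Estimate P(a dominates b) for minimisation, assuming each observed
    objective is the true value plus independent Gaussian noise."""
    p = 1.0
    for ma, mb, va, vb in zip(mean_a, mean_b, var_a, var_b):
        # A_i - B_i ~ N(ma - mb, va + vb), so P(A_i < B_i) = Phi((mb - ma) / sd).
        z = (mb - ma) / math.sqrt(va + vb)
        p *= 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return p

# Two objectives: a looks better on both, but noise makes dominance uncertain.
print(prob_dominates([1.0, 2.0], [1.5, 2.5], [0.25, 0.25], [0.25, 0.25]))
```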

    The Rolling Tide Evolutionary Algorithm: A Multi-Objective Optimiser for Noisy Optimisation Problems

    As the methods for evolutionary multiobjective optimization (EMO) mature and are applied to a greater number of real-world problems, there has been gathering interest in the effect of uncertainty and noise on multiobjective optimization, specifically how algorithms are affected by it, how to mitigate its effects, and whether some optimizers are better suited to dealing with it than others. Here we address the problem of uncertain evaluation, in which the uncertainty can be modeled as an additive noise in objective space. We develop a novel algorithm, the rolling tide evolutionary algorithm (RTEA), which progressively improves the accuracy of its estimated Pareto set, while simultaneously driving the front toward the true Pareto front. It can cope with noise whose characteristics change as a function of location (both design and objective), or which alter during the course of an optimization. Four state-of-the-art noise-tolerant EMO algorithms, as well as four widely used standard EMO algorithms, are compared to RTEA on 70 instances of ten continuous space test problems from the CEC'09 multiobjective optimization test suite. Different instances of these problems are generated by modifying them to exhibit different types and intensities of noise. RTEA seems to provide competitive performance across both the range of test problems used and noise types.
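
    A hedged illustration, not the RTEA implementation, of the kind of running-mean bookkeeping an algorithm that progressively refines its archived estimates might use; the class, the toy problem and the refinement rule are assumptions.

```python
import random

class NoisyEstimate:
    """A solution plus the running mean of its repeated noisy evaluations."""

    def __init__(self, x):
        self.x = x            # decision vector
        self.n = 0            # evaluations so far
        self.mean = None      # running mean of each objective

    def add_evaluation(self, objectives):
        self.n += 1
        if self.mean is None:
            self.mean = list(objectives)
        else:
            # Incremental mean: no need to store past evaluations.
            self.mean = [m + (f - m) / self.n for m, f in zip(self.mean, objectives)]

def noisy_eval(x):
    # Toy two-objective problem with additive Gaussian noise in objective space.
    f1 = sum(v * v for v in x)
    f2 = sum((v - 1.0) ** 2 for v in x)
    return [f1 + random.gauss(0.0, 0.1), f2 + random.gauss(0.0, 0.1)]

# Each step re-evaluates the least-sampled archive member, so estimates of the
# archived front keep improving while the search continues elsewhere.
archive = [NoisyEstimate([random.uniform(0, 1) for _ in range(3)]) for _ in range(5)]
for _ in range(100):
    target = min(archive, key=lambda s: s.n)
    target.add_evaluation(noisy_eval(target.x))
```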

    Modified Selection Mechanisms Designed to Help Evolution Strategies Cope with Noisy Response Surfaces

    With the rise in the application of evolution strategies for simulation optimization, a better understanding of how these algorithms are affected by the stochastic output produced by simulation models is needed. At very high levels of stochastic variance in the output, evolution strategies in their standard form experience difficulty locating the optimum. The degradation of the performance of evolution strategies in the presence of very high levels of variation can be attributed to the decrease in the proportion of solutions correctly selected as parents, from which offspring solutions are generated. The proportion of solutions correctly selected as parents can be increased by conducting additional replications for each solution. However, experimental evaluation suggests that a very high proportion of correctly selected parents is not required; a proportion of around 0.75 seems sufficient for evolution strategies to perform adequately. Integrating statistical techniques into the algorithm's selection process does help evolution strategies cope with high levels of noise. There are four categories of techniques: statistical ranking and selection techniques, multiple comparison procedures, clustering techniques, and other techniques. Experimental comparison of the indifference zone selection procedure by Dudewicz and Dalal (1975), the sequential procedure by Kim and Nelson (2001), Tukey's procedure, the clustering procedure by Calinski and Corsten (1985), and Scheffé's procedure (1985) under similar conditions suggests that the sequential ranking and selection procedure by Kim and Nelson (2001) helps evolution strategies cope with noise using the smallest number of replications. However, all of the techniques required a rather large number of replications, which suggests that better methods are needed. Experimental results also indicate that a statistical procedure is especially required during the later generations, when solutions are spaced closely together in the search space (response surface).
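
    A minimal sketch of the baseline idea discussed above: taking extra replications per solution so that parent selection in an evolution strategy ranks sample means rather than single noisy observations. The toy response surface and parameter values are assumptions; adaptive ranking-and-selection procedures such as Kim and Nelson (2001) allocate replications dynamically rather than fixing them as done here.

```python
import random

def noisy_fitness(x, noise_sd=1.0):
    # Toy sphere response surface observed through simulation noise.
    return sum(v * v for v in x) + random.gauss(0.0, noise_sd)

def mean_fitness(x, replications):
    # Averaging r replications reduces the noise variance by a factor of r,
    # which raises the proportion of correctly selected parents.
    return sum(noisy_fitness(x) for _ in range(replications)) / replications

def select_parents(offspring, mu, replications):
    # Truncation selection on sample means instead of single noisy observations.
    scored = sorted(offspring, key=lambda x: mean_fitness(x, replications))
    return scored[:mu]

population = [[random.uniform(-5, 5) for _ in range(10)] for _ in range(20)]
parents = select_parents(population, mu=5, replications=10)
```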

    On the Robustness of Evolutionary Algorithms to Noise: Refined Results and an Example Where Noise Helps

    We present refined results for the expected optimisation time of the (1+1) EA and the (1+λ) EA on LeadingOnes in the prior noise model, where in each fitness evaluation the search point is altered before evaluation with probability p. Previous work showed that the (1+1) EA runs in polynomial time if p = O((log n)/n²) and needs superpolynomial time if p = Ω((log n)/n), leaving a huge gap for which no results were known. We close this gap by showing that the expected optimisation time is Θ(n²) · exp(Θ(pn²)), allowing us to locate for the first time the threshold between polynomial and superpolynomial expected times at p = Θ((log n)/n²). Hence the (1+1) EA on LeadingOnes is much more sensitive to noise than previously thought. We also show that offspring populations of size λ ≥ 3.42 log n can effectively deal with much higher noise than known before. Finally, we present an example of a rugged landscape where prior noise can help to escape from local optima by blurring the landscape and allowing a hill climber to see the underlying gradient.
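
    A small sketch of the analysed setting, under one common convention for the prior noise model: with probability p a uniformly random bit of a copy of the search point is flipped before evaluation, and parent and offspring both receive fresh noisy evaluations each iteration. The parameter values are illustrative only.

```python
import random

def leading_ones(x):
    count = 0
    for bit in x:
        if bit != 1:
            break
        count += 1
    return count

def noisy_eval(x, p):
    # Prior noise: with probability p, a uniformly random bit is flipped in a
    # copy of the search point before evaluation; the stored point is unchanged.
    if random.random() < p:
        y = list(x)
        i = random.randrange(len(y))
        y[i] ^= 1
        return leading_ones(y)
    return leading_ones(x)

def one_plus_one_ea(n, p, max_iters=500000):
    x = [random.randint(0, 1) for _ in range(n)]
    for t in range(max_iters):
        # Standard bit mutation: flip each bit independently with probability 1/n.
        y = [bit ^ 1 if random.random() < 1.0 / n else bit for bit in x]
        # Parent and offspring are both re-evaluated with fresh noise each
        # iteration (conventions differ between analyses; this is one choice).
        if noisy_eval(y, p) >= noisy_eval(x, p):
            x = y
        if leading_ones(x) == n:
            return t + 1
    return None

print(one_plus_one_ea(n=30, p=0.001))
```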

    A survey of techniques for characterising fitness landscapes and some possible ways forward

    Real-world optimisation problems are often very complex. Metaheuristics have been successful in solving many of these problems, but the difficulty in choosing the best approach can be a huge challenge for practitioners. One approach to this dilemma is to use fitness landscape analysis to better understand problems before deciding on approaches to solving the problems. However, despite extensive research on fitness landscape analysis and a large number of developed techniques, very few techniques are used in practice. This could be because fitness landscape analysis in itself can be complex. In an attempt to make fitness landscape analysis techniques accessible, this paper provides an overview of techniques from the 1980s to the present. Attributes that are important for practical implementation are highlighted and ways of adapting techniques to be more feasible or appropriate are suggested. The survey reveals the wide range of factors that can influence problem difficulty, emphasising the need for a shift in focus away from predicting problem hardness towards measuring characteristics. It is hoped that this survey will invoke renewed interest in the field of understanding complex optimisation problems and ultimately lead to better decision making on the use of appropriate metaheuristics.
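
    One of the classic characterisation techniques such surveys cover is random-walk autocorrelation as a measure of ruggedness; a hedged sketch follows, with a toy bit-string landscape and single-bit-flip neighbourhood as illustrative assumptions.

```python
import random

def random_walk_fitnesses(fitness, start, neighbour, steps):
    """Fitness values observed along a random walk over the landscape."""
    x, values = start, []
    for _ in range(steps):
        values.append(fitness(x))
        x = neighbour(x)
    return values

def autocorrelation(values, lag=1):
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / n
    cov = sum((values[i] - mean) * (values[i + lag] - mean)
              for i in range(n - lag)) / (n - lag)
    return cov / var if var > 0 else 0.0

# Toy landscape: OneMax on bit strings, single bit-flip neighbourhood.
def fitness(x):
    return sum(x)

def neighbour(x):
    i = random.randrange(len(x))
    return x[:i] + [x[i] ^ 1] + x[i + 1:]

start = [random.randint(0, 1) for _ in range(64)]
walk = random_walk_fitnesses(fitness, start, neighbour, steps=2000)
print(autocorrelation(walk, lag=1))  # near 1 for smooth landscapes, near 0 for rugged ones
```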

    Decomposition Evolutionary Algorithms for Noisy Multiobjective Optimization

    Multi-objective problems are a category of optimization problem that contain more than one objective function, and these objective functions must be optimized simultaneously. Should the objective functions be conflicting, a set of solutions rather than a single solution is required; this set is known as the Pareto optimal set. Multi-objective optimization problems arise in many real world applications where several competing objectives must be evaluated and optimal solutions found for them, in the presence of trade-offs among conflicting objectives. Maximizing returns while minimizing the risk of stock market investments, or maximizing performance whilst minimizing fuel consumption and hazardous gas emissions when buying a car, are typical examples of real world multi-objective optimization problems. In this case a number of optimal solutions can be found, known as non-dominated or Pareto optimal solutions. Pareto optimal solutions are reached when it is impossible to improve one objective without making the others worse.

    Classical approaches to this problem used direct or gradient-based methods, which proved insufficient or computationally expensive for large scale or combinatorial problems. Other difficulties attend the classical methods, such as required problem knowledge, which may not be available, or sensitivity to some problem features; for example, finding solutions on the entire Pareto optimal set can only be guaranteed for convex problems. Classical methods for generating the Pareto front aggregate the objectives into a single or parametrized function before search, so several runs and parameter settings are needed to achieve a set of solutions that approximates the Pareto optimal set. Subsequently new methods have been developed, based on computer experiments with meta-heuristic algorithms. Most of these meta-heuristics implement some sort of stochastic search method, amongst which the 'Evolutionary Algorithm' is garnering much attention. It possesses several characteristics that make it a desirable method for confronting multi-objective problems, and as a result a number of studies in recent decades have developed or modified the multi-objective evolutionary algorithm (MOEA) for different purposes. This algorithm works with a population of solutions which are capable of searching for multiple Pareto optimal solutions in a single run. At the same time, only the fittest individuals in each generation are offered the chance for reproduction and representation in the next generation. The fitness assignment function is the guiding system of the MOEA; the fitness value represents the strength of an individual.

    Unfortunately, many real world applications bring with them a certain degree of noise due to natural disasters, inefficient models, signal distortion or uncertain information. This noise affects the performance of the algorithm's fitness function and disrupts the optimization process. This thesis explores and targets the effect of this disruptive noise on the performance of the MOEA. We study the noisy multi-objective optimization problem and modify MOEA/D, the decomposition-based MOEA, to improve its performance in noisy environments. To achieve this, we combine the basic MOEA/D with the 'Ordinal Optimization' technique to handle uncertainties. The major contributions of this thesis are as follows. First, MOEA/D is tested in a noisy environment with different levels of noise, to give us a deeper understanding of where the basic algorithm fails to handle the noise. Then, we extend the basic MOEA/D to improve its noise handling by employing the ordinal optimization technique. This creates MOEA/D+OO, which outperforms MOEA/D in terms of diversity and convergence in noisy environments; it is tested against benchmark problems with varying levels of noise. Finally, to test the real world application of MOEA/D+OO, we solve a noisy portfolio optimization problem with the proposed algorithm. The portfolio optimization problem is a classic one in finance, in which investors want to maximize a portfolio's return while minimizing the risk of investment, the latter measured by the standard deviation of the portfolio's rate of return. These two objectives clearly make it a multi-objective problem.
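
    A hedged sketch of two ingredients mentioned above: the weighted Tchebycheff scalarisation MOEA/D uses to decompose the problem, and noisy portfolio-style objectives (expected return versus risk, uncorrelated assets assumed for brevity). The replication-averaging helper is only a baseline stand-in; it does not reproduce the ordinal optimisation component of MOEA/D+OO, and the names and numbers are assumptions.

```python
import random

def tchebycheff(objectives, weights, ideal):
    """Weighted Tchebycheff scalarisation: MOEA/D turns the multi-objective
    problem into one such subproblem per weight vector."""
    return max(w * abs(f - z) for f, w, z in zip(objectives, weights, ideal))

def noisy_portfolio_objectives(alloc, mean_returns, return_vars):
    """Toy portfolio model: maximise expected return (negated, so both
    objectives are minimised) and minimise risk, with Gaussian observation
    noise standing in for estimation error."""
    exp_return = sum(a * r for a, r in zip(alloc, mean_returns))
    risk = sum(a * a * v for a, v in zip(alloc, return_vars)) ** 0.5
    return [-exp_return + random.gauss(0.0, 0.01), risk + random.gauss(0.0, 0.01)]

def averaged_objectives(x, evaluate, replications):
    # Simple replication averaging as a noise-handling baseline; MOEA/D+OO
    # instead uses ordinal optimisation to decide which comparisons to trust,
    # which this sketch does not attempt to reproduce.
    samples = [evaluate(x) for _ in range(replications)]
    return [sum(s[i] for s in samples) / replications for i in range(len(samples[0]))]

alloc = [0.25, 0.25, 0.25, 0.25]
evaluate = lambda a: noisy_portfolio_objectives(a, [0.08, 0.05, 0.12, 0.03],
                                                [0.04, 0.01, 0.09, 0.002])
objs = averaged_objectives(alloc, evaluate, replications=20)
print(tchebycheff(objs, weights=[0.5, 0.5], ideal=[-0.2, 0.0]))
```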

    Human visual evoked potentials: a computer aided investigation into their origin and variability

    This thesis is concerned with the measurement of human visual Averaged Evoked Potentials (AEPs) to tachistoscopically presented pattern stimuli, i.e. the sudden appearance and disappearance of patterns in an otherwise continuously illuminated diffuse field, such that the overall change in luminance is zero or very small. Previous work reviewed includes that on the response of single cells in the cat and monkey visual cortices to contoured stimuli, and also that on the measurement of human visual AEPs to patterned stimuli. The work of D.A. Jeffreys, indicating that AEP scalp distribution measurements showed promise for identifying source locations of the first two (temporally separate) pattern AEP components, is considered in detail. The experimental apparatus and computing system are described, together with a detailed discussion of experimental errors. The computing system was designed to be on-line and interactive, and a general discussion is included on the man-computer interface. Four chapters report and discuss the experimental findings. The first describes the adaptation effect of one stimulus on the AEP to another which follows it after a short time interval. The adaptation is plotted as a function of relative timings and pattern types. Monocular stimulation showed that the effect must be partially central in origin. The second reports on variability of the AEP. The AEP standard deviation is plotted as a function of electrode position, and was found to be almost independent of the stimulus. A 'Running Average' technique is described for measuring longer term AEP variations. The third describes a computerised AEP component separation method, which was developed and used to provide further confirmation that the two AEP components first identified by Jeffreys give scalp distributions compatible with dipole sources in the striate and extrastriate cortices. Four subjects were tested in detail, and the results compared with a simple dipole model. The fourth describes the development and initial trials of an on-line Evoked Potential Stochastic Search Technique. The results are discussed, and some confirmatory and extension experiments suggested.
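
    A hedged sketch of the two signal-averaging ideas mentioned in the abstract: the averaged evoked potential across trials, and a 'Running Average' over a sliding window of trials to follow longer term variations. The synthetic data, array shapes and window size are assumptions; this is not the thesis' original software.

```python
import random

def averaged_evoked_potential(trials):
    """Average time-locked epochs so the stimulus-locked response (the AEP)
    emerges from the ongoing background activity. `trials` is a list of
    equal-length sample lists, one per stimulus presentation."""
    n, length = len(trials), len(trials[0])
    return [sum(trial[t] for trial in trials) / n for t in range(length)]

def running_average(trials, window):
    """One AEP estimate per sliding window of consecutive trials, to expose
    slower changes in the response over the course of a recording session."""
    return [averaged_evoked_potential(trials[i:i + window])
            for i in range(len(trials) - window + 1)]

# Synthetic data: 40 trials of 100 samples, a fixed waveform buried in noise.
waveform = [1.0 if 20 <= t < 30 else 0.0 for t in range(100)]
trials = [[w + random.gauss(0.0, 2.0) for w in waveform] for _ in range(40)]
aep = averaged_evoked_potential(trials)
drift = running_average(trials, window=10)
```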