
    A Fast Incremental Hypervolume Algorithm


    Approximating the least hypervolume contributor: NP-hard in general, but fast in practice

    The hypervolume indicator is an increasingly popular set measure used to compare the quality of two Pareto sets. The basic ingredient of most hypervolume-indicator-based optimization algorithms is the calculation of the hypervolume contribution of single solutions with respect to a Pareto set. We show that exact calculation of the hypervolume contribution is #P-hard, while its approximation is NP-hard. The same holds for the calculation of the minimal contribution. We also prove that it is NP-hard to decide whether a solution has the least hypervolume contribution. Even deciding whether the contribution of a solution is at most $(1+\varepsilon)$ times the minimal contribution is NP-hard. This implies that it is neither possible to efficiently find the least contributing solution (unless $P = NP$) nor to approximate it (unless $NP = BPP$). Nevertheless, in the second part of the paper we present a fast approximation algorithm for this problem. We prove that for arbitrarily given $\varepsilon, \delta > 0$ it calculates a solution with contribution at most $(1+\varepsilon)$ times the minimal contribution with probability at least $(1-\delta)$. Though it cannot run in polynomial time for all instances, it performs extremely fast on various benchmark datasets. The algorithm solves very large problem instances which are intractable for exact algorithms (e.g., 10,000 solutions in 100 dimensions) within a few seconds. Comment: 22 pages, to appear in Theoretical Computer Science
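    The sampling idea behind such an approximation can be illustrated with a short Monte Carlo sketch (a minimal illustration under an assumed minimization convention, not the paper's algorithm, which adaptively allocates samples to achieve the stated approximation and confidence guarantees): sample uniformly in the box spanned by a point and the reference point, and count the fraction of samples that no other point dominates.

```python
import random

def mc_contribution(p, others, ref, n_samples=100_000):
    """Monte Carlo estimate of the exclusive hypervolume contribution of
    point p (minimization; every coordinate of p lies below ref)."""
    box_volume = 1.0
    for pj, rj in zip(p, ref):
        box_volume *= rj - pj                      # volume of the box [p, ref]
    hits = 0
    for _ in range(n_samples):
        x = [random.uniform(pj, rj) for pj, rj in zip(p, ref)]
        # x is in p's exclusive region iff no other point dominates it
        if not any(all(qj <= xj for qj, xj in zip(q, x)) for q in others):
            hits += 1
    return box_volume * hits / n_samples
```

    The standard error of such an estimate shrinks in proportion to the inverse square root of the sample count, which is why the guarantee is probabilistic rather than exact.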

    Fast calculation of multiobjective probability of improvement and expected improvement criteria for Pareto optimization

    The use of surrogate-based optimization (SBO) is widespread in engineering design to reduce the number of computationally expensive simulations. However, "real-world" problems often consist of multiple, conflicting objectives leading to a set of competitive solutions (the Pareto front). The objectives are often aggregated into a single cost function to reduce the computational cost, though a better approach is to use multiobjective optimization methods to directly identify a set of Pareto-optimal solutions, which can be used by the designer to make more efficient design decisions (instead of weighting and aggregating the costs upfront). Most of the work in multiobjective optimization is focused on multiobjective evolutionary algorithms (MOEAs). While MOEAs are well-suited to handle large, intractable design spaces, they typically require thousands of expensive simulations, which is prohibitively expensive for the problems under study. Therefore, the use of surrogate models in multiobjective optimization, denoted as multiobjective surrogate-based optimization, may prove to be even more worthwhile than SBO methods to expedite the optimization of computationally expensive systems. In this paper, the authors propose the efficient multiobjective optimization (EMO) algorithm, which uses Kriging models and multiobjective versions of the probability of improvement and expected improvement criteria to identify the Pareto front with a minimal number of expensive simulations. The EMO algorithm is applied on multiple standard benchmark problems and compared against the well-known NSGA-II, SPEA2 and SMS-EMOA multiobjective optimization methods.
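    The expected improvement criterion the abstract builds on has a standard closed form in the single-objective case, shown below for minimization over a Kriging (Gaussian process) posterior; the paper's contribution is fast multiobjective generalizations of this and the probability-of-improvement criterion, which this sketch does not reproduce.

```python
from math import erf, exp, pi, sqrt

def expected_improvement(mu, sigma, f_best):
    """Standard single-objective EI for minimization: mu and sigma are the
    Kriging posterior mean and standard deviation at a candidate point,
    f_best the best objective value sampled so far."""
    if sigma <= 0.0:
        return max(f_best - mu, 0.0)   # no uncertainty: plain improvement
    z = (f_best - mu) / sigma
    cdf = 0.5 * (1.0 + erf(z / sqrt(2.0)))    # standard normal CDF
    pdf = exp(-0.5 * z * z) / sqrt(2.0 * pi)  # standard normal PDF
    return (f_best - mu) * cdf + sigma * pdf
```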

    Efficient Real-Time Hypervolume Estimation with Monotonically Reducing Error

    This is the author accepted manuscript; the final version is available from ACM via the DOI in this record. The codebase for this paper is available at https://github.com/fieldsend/hypervolume. The hypervolume (or S-metric) is a widely used quality measure employed in the assessment of multi- and many-objective evolutionary algorithms. It is also directly integrated as a component in the selection mechanism of some popular optimisers. Exact hypervolume calculation becomes prohibitively expensive in real-time applications as the number of objectives increases and/or the approximation set grows. As such, Monte Carlo (MC) sampling is often used to estimate its value rather than calculating it exactly. This estimation is inevitably subject to error. As is standard with Monte Carlo approaches, the standard error decreases in proportion to the inverse square root of the number of MC samples. We propose a number of real-time hypervolume estimation methods for unconstrained archives, principally for use in real-time convergence analysis. Furthermore, we show how the number of domination comparisons can be considerably reduced by exploiting incremental properties of the approximated Pareto front. In these methods the estimation error monotonically decreases over time for (i) a capped budget of samples per algorithm generation and (ii) a fixed budget of dedicated computation time per optimiser generation for new MC samples. Results are provided using an illustrative worst-case scenario with rapid archive growth, demonstrating the orders-of-magnitude speed-ups possible. Funding: Engineering and Physical Sciences Research Council (EPSRC); Innovate UK.
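    The sample-reuse principle described above can be sketched as follows (an assumed-minimization illustration with a fixed sampling box; hypothetical names, not the code from the linked repository): persist the Monte Carlo samples across generations, and whenever the archive gains a point, compare it only against samples not yet marked as dominated.

```python
import random

class IncrementalHVEstimator:
    """Monte Carlo hypervolume estimator that reuses samples as an
    unconstrained archive grows (minimization, sampling box [lo, ref])."""

    def __init__(self, lo, ref):
        self.lo, self.ref = lo, ref
        self.archive = []
        self.samples = []        # [sample_point, dominated_flag] pairs
        self.n_dominated = 0

    @staticmethod
    def _dominates(p, x):
        return all(pj <= xj for pj, xj in zip(p, x))

    def add_samples(self, k):
        # each fresh sample is checked once against the current archive
        for _ in range(k):
            x = tuple(random.uniform(l, r) for l, r in zip(self.lo, self.ref))
            dom = any(self._dominates(p, x) for p in self.archive)
            self.samples.append([x, dom])
            if dom:
                self.n_dominated += 1

    def add_point(self, p):
        # incremental step: only undominated samples need re-checking
        self.archive.append(p)
        for s in self.samples:
            if not s[1] and self._dominates(p, s[0]):
                s[1] = True
                self.n_dominated += 1

    def estimate(self):
        vol = 1.0
        for l, r in zip(self.lo, self.ref):
            vol *= r - l
        return vol * self.n_dominated / max(len(self.samples), 1)
```

    Because dominated flags are never cleared for a grow-only archive, adding samples or points can only refine the estimate, mirroring the monotonically reducing error the paper targets.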

    Planning as Optimization: Dynamically Discovering Optimal Configurations for Runtime Situations

    The large number of possible configurations of modern software-based systems, combined with the large number of possible environmental situations of such systems, prohibits enumerating all adaptation options at design time and necessitates planning at runtime to dynamically identify an appropriate configuration for a situation. While numerous planning techniques exist, they typically assume a detailed state-based model of the system and that the situations that warrant adaptations are known. Both of these assumptions can be violated in complex, real-world systems. As a result, adaptation planning must rely on simple models that capture what can be changed (input parameters) and observed in the system and environment (output and context parameters). We therefore propose planning as optimization: the use of optimization strategies to discover optimal system configurations at runtime for each distinct situation, where the situations themselves are also dynamically identified at runtime. We apply our approach to CrowdNav, an open-source traffic routing system with the characteristics of a real-world system. We identify situations via clustering and conduct an empirical study that compares Bayesian optimization and two types of evolutionary optimization (NSGA-II and novelty search) in CrowdNav.
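    The overall loop the abstract describes can be made concrete with a deliberately minimal sketch: situations are cluster labels over observed context parameters, and each situation gets its own optimizer over the input parameters. Random search stands in here for the Bayesian and evolutionary optimizers actually compared in the paper, and every name below is hypothetical.

```python
import random

def random_search(evaluate, bounds, n_iter=50):
    """Simplest possible runtime planner: random search over the
    system's input parameters, minimizing an observed cost."""
    best_cfg, best_cost = None, float("inf")
    for _ in range(n_iter):
        cfg = [random.uniform(lo, hi) for lo, hi in bounds]
        cost = evaluate(cfg)
        if cost < best_cost:
            best_cfg, best_cost = cfg, cost
    return best_cfg, best_cost

# one configuration per dynamically identified situation (cluster label)
situation_configs = {}

def configuration_for(situation, evaluate, bounds):
    if situation not in situation_configs:
        situation_configs[situation], _ = random_search(evaluate, bounds)
    return situation_configs[situation]
```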

    A Faster Algorithm for Calculating Hypervolume

    We present an algorithm for calculating hypervolume exactly, the Hypervolume by Slicing Objectives (HSO) algorithm, that is faster than any previously published. HSO processes objectives instead of points, an idea that has been considered before but never properly evaluated in the literature. We show that both previously studied exact hypervolume algorithms are exponential in at least the number of objectives and that, although HSO is also exponential in the number of objectives in the worst case, it runs in significantly less time, i.e., two to three orders of magnitude less for randomly generated and benchmark data in three to eight objectives. Thus, HSO increases the utility of hypervolume, both as a metric for general optimization algorithms and as a diversity mechanism for evolutionary algorithms.
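    The slicing idea admits a compact recursive statement (a didactic sketch under assumed minimization with a reference point dominated by every point, not the optimized HSO implementation): sort by the first objective, sweep slices along it, and multiply each slice's width by the hypervolume of the projected points active in that slice.

```python
def hypervolume_slicing(points, ref):
    """Exact hypervolume by recursing over objectives (minimization;
    each point must be componentwise below the reference point ref)."""
    if len(ref) == 1:                        # base case: a single objective
        return ref[0] - min(p[0] for p in points)
    pts = sorted(points)                     # ascending in objective 1
    total = 0.0
    for i, p in enumerate(pts):
        upper = pts[i + 1][0] if i + 1 < len(pts) else ref[0]
        width = upper - p[0]                 # thickness of this slice
        if width > 0.0:
            # points active in the slice, projected onto objectives 2..m
            slab = [q[1:] for q in pts[: i + 1]]
            total += width * hypervolume_slicing(slab, ref[1:])
    return total
```

    Like HSO, this recursion is exponential in the number of objectives in the worst case, matching the complexity discussion above.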