137 research outputs found
A test problem for visual investigation of high-dimensional multi-objective search
An inherent problem in multiobjective optimization is that the visual observation of solution vectors with four or more objectives is infeasible, which brings major difficulties for algorithm design, examination, and development. This paper presents a test problem, called the Rectangle problem, to aid the visual investigation of high-dimensional multiobjective search. Key features of the Rectangle problem are that the Pareto optimal solutions 1) lie in a rectangle in the two-variable decision space and 2) are similar (in the sense of Euclidean geometry) to their images in the four-dimensional objective space. In this case, it is easy to examine the behavior of objective vectors in terms of both convergence and diversity, by observing their proximity to the optimal rectangle and their distribution within the rectangle, respectively, in the decision space. Fifteen algorithms are investigated. The underperformance of Pareto-based algorithms as well as most state-of-the-art many-objective algorithms indicates that the proposed problem not only is a good tool for visually understanding the behavior of multiobjective search in a high-dimensional objective space but also can be used as a challenging benchmark function to test algorithms' ability to balance the convergence and diversity of solutions.
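The sketch below illustrates the kind of construction the abstract describes, under the assumption (not necessarily the paper's exact formulation) that each of the four objectives is the Euclidean distance from a two-variable solution to one corner of a given rectangle; minimising distances to a set of points makes their convex hull, here the rectangle itself, the Pareto optimal region.

```python
import numpy as np

# Hypothetical rectangle-like test problem: each objective is the
# Euclidean distance from a 2-D decision vector x to one corner of a
# rectangle. The Pareto set is then the rectangle (the corners' convex
# hull), so convergence and diversity can be read off a 2-D plot.
CORNERS = np.array([[0.0, 0.0], [4.0, 0.0], [4.0, 2.0], [0.0, 2.0]])

def rectangle_objectives(x):
    """Map a 2-D decision vector to a 4-D objective vector."""
    x = np.asarray(x, dtype=float)
    return np.linalg.norm(CORNERS - x, axis=1)

# An interior point is Pareto optimal under this construction; a distant
# point is dominated, which is visible directly in the decision space.
print(rectangle_objectives([2.0, 1.0]))    # inside the optimal rectangle
print(rectangle_objectives([10.0, 10.0]))  # far from the optimal region
```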
Evolutionary many-objective optimisation: pushing the boundaries
This thesis was submitted for the award of Doctor of Philosophy and was awarded by Brunel University London. Many-objective optimisation poses great challenges to evolutionary algorithms. To start with, the ineffectiveness of the Pareto dominance relation, the most important criterion in multi-objective optimisation, results in the underperformance of traditional Pareto-based algorithms. Also, the aggravated conflict between proximity and diversity, along with increasing time or space requirements as well as parameter sensitivity, has become a key barrier to the design of effective many-objective optimisation algorithms. Furthermore, the infeasibility of directly observing solutions can cause serious difficulties in investigating and comparing algorithms' performance. In this thesis, we address these challenges, aiming to make evolutionary algorithms as effective in many-objective optimisation as in two- or three-objective optimisation. First, we significantly enhance Pareto-based algorithms, making them suitable for many-objective optimisation, by placing individuals with poor proximity into crowded regions so that these individuals have a better chance of being eliminated. Second, we propose a grid-based evolutionary algorithm which explores the potential of the grid to deal with many-objective optimisation problems. Third, we present a bi-goal evolution framework that converts the many objectives of a given problem into two objectives regarding proximity and diversity, thus creating an optimisation problem in which the objectives are the goals of the search process itself. Fourth, we propose a comprehensive performance indicator to compare evolutionary algorithms on optimisation problems with various Pareto front shapes and any objective dimensionality. Finally, we construct a test problem to aid the visual investigation of evolutionary search, with its Pareto optimal solutions in a two-dimensional decision space having a distribution similar to that of their images in a higher-dimensional objective space. The work reported in this thesis is the outcome of innovative attempts at addressing some of the most challenging problems in evolutionary many-objective optimisation. This research has not only made some existing approaches, such as Pareto-based or grid-based algorithms that were traditionally regarded as unsuitable, effective for many-objective optimisation, but has also pushed other important boundaries with novel ideas including bi-goal evolution, a comprehensive performance indicator, and a test problem for visual investigation. All the proposed algorithms have been systematically evaluated against the existing state of the art, and some of them have already been taken up by researchers and practitioners in the field.
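As an illustration of the bi-goal idea, the sketch below reduces an M-objective population to two meta-objectives, using the sum of normalised objectives as a proximity estimate and the negated nearest-neighbour distance as a crowding estimate; these particular estimators are assumptions for illustration, and the thesis's actual framework may define them differently.

```python
import numpy as np

def bi_goal_transform(F):
    """Map an (N, M) objective matrix to (N, 2) meta-objectives.

    Goal 1 (proximity): sum of min-max normalised objectives; smaller
    means closer to the Pareto front.
    Goal 2 (crowding):  negated distance to the nearest neighbour in
    normalised objective space; smaller means less crowded. Both goals
    are minimised, so ordinary two-objective selection (e.g.
    nondominated sorting) can then drive the many-objective search.
    """
    F = np.asarray(F, dtype=float)
    fmin, fmax = F.min(axis=0), F.max(axis=0)
    Fn = (F - fmin) / (fmax - fmin + 1e-12)
    proximity = Fn.sum(axis=1)
    dist = np.linalg.norm(Fn[:, None, :] - Fn[None, :, :], axis=2)
    np.fill_diagonal(dist, np.inf)
    crowding = -dist.min(axis=1)  # more isolated -> smaller (better)
    return np.column_stack([proximity, crowding])

# Ten random 8-objective vectors reduced to a 2-D optimisation problem.
meta = bi_goal_transform(np.random.rand(10, 8))
print(meta.shape)  # (10, 2)
```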
Adapting Multi-objectivized Software Configuration Tuning
When tuning software configuration for better performance (e.g., latency or throughput), an important issue that many optimizers face is the presence of local optimum traps, compounded by a highly rugged configuration landscape and expensive measurements. To mitigate these issues, a recent effort has shifted the focus to the level of the optimization model (called meta multi-objectivization, or MMO) instead of designing better optimizers as in traditional methods. This is done by using an auxiliary performance objective, together with the target performance objective, to help the search jump out of local optima. While effective, MMO needs a fixed weight to balance the two objectives, a parameter that has been found to be crucial as there is a large performance deviation between the best and the other settings. However, given the variety of configurable software systems, the “sweet spot” of the weight can vary dramatically across cases, and it is not possible to find the right setting without time-consuming trial and error. In this paper, we seek to overcome this significant shortcoming of MMO by proposing a weight adaptation method, dubbed AdMMO. Our key idea is to adaptively adjust the weight at the right time during tuning, such that a good proportion of the nondominated configurations can be maintained. Moreover, we design a partial duplicate retention mechanism to handle the issue of too many duplicate configurations without losing the rich information provided by the “good” duplicates. Experiments on several real-world systems, objectives, and budgets show that, in 71% of the cases, AdMMO is considerably superior to MMO and a wide range of state-of-the-art optimizers while achieving generally better efficiency, with the best speedup ranging from 2.2x to 20x.
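A highly simplified sketch of the adaptation idea follows, under the assumption (an illustration, not the paper's actual rule) that the weight is nudged whenever the proportion of nondominated configurations in the current population drifts outside a target band; the band, step size, and adjustment direction here are all hypothetical.

```python
# Hypothetical weight-adaptation loop in the spirit of AdMMO: the weight
# balancing the target and auxiliary objectives is adjusted so that the
# share of nondominated configurations stays within a target band.
def adapt_weight(weight, nondominated_ratio,
                 low=0.3, high=0.7, step=0.1,
                 w_min=0.01, w_max=10.0):
    if nondominated_ratio < low:      # too few nondominated points:
        weight *= (1.0 + step)        # strengthen the auxiliary objective
    elif nondominated_ratio > high:   # too many: weaken it
        weight *= (1.0 - step)
    return min(max(weight, w_min), w_max)  # keep the weight bounded

w = 1.0
for ratio in [0.2, 0.25, 0.5, 0.8]:  # ratios observed over tuning steps
    w = adapt_weight(w, ratio)
    print(round(w, 3))
```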
What weights work for you? Adapting weights for any Pareto front shape in decomposition-based evolutionary multiobjective optimisation
The quality of solution sets generated by decomposition-based evolutionary multiobjective optimisation (EMO) algorithms depends heavily on the consistency between a given problem’s Pareto front shape and the specified weights’ distribution. A set of weights distributed uniformly in a simplex often leads to a set of well-distributed solutions on a Pareto front with a simplex-like shape, but may fail on other Pareto front shapes. How to specify a set of appropriate weights without prior information about the problem’s Pareto front remains an open problem. In this paper, we propose an approach to adapt weights during the evolutionary process (called AdaW). AdaW progressively seeks a suitable distribution of weights for the given problem by elaborating several key parts of weight adaptation: weight generation, weight addition, weight deletion, and weight update frequency. Experimental results have shown the effectiveness of the proposed approach. AdaW works well for Pareto fronts with very different shapes: 1) simplex-like, 2) inverted simplex-like, 3) highly nonlinear, 4) disconnected, 5) degenerate, 6) scaled, and 7) high-dimensional.
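The sketch below illustrates just one of the four parts, weight deletion, under a simple assumption: the weight whose attached solution is most crowded (closest to another solution in objective space) is the removal candidate, making room for a newly generated weight. This criterion is a stand-in for illustration; AdaW's actual rules are more elaborate.

```python
import numpy as np

def most_crowded_weight(solutions):
    """Return the index of the weight whose attached solution is closest
    to another solution in objective space, a simplified stand-in for a
    weight-deletion criterion (illustrative assumption, not AdaW's rule).
    """
    S = np.asarray(solutions, dtype=float)
    d = np.linalg.norm(S[:, None, :] - S[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)       # ignore self-distances
    return int(np.argmin(d.min(axis=1)))

# Solutions attached to four weights; the middle two sit in a crowded pair.
sols = np.array([[0.1, 0.9], [0.5, 0.5], [0.51, 0.49], [0.9, 0.1]])
print(most_crowded_weight(sols))  # index of a weight in the crowded pair
```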
A multi-granularity locally optimal prototype-based approach for classification
Prototype-based approaches generally provide better explainability and are widely used for classification. However, the majority of them suffer from system obesity and lack transparency on complex problems. In this paper, a novel classification approach with a multi-layered system structure self-organized from data is proposed. This approach is able to identify local peaks of the multi-modal density derived from static data and filter out the more representative ones at multiple levels of granularity to act as prototypes. These prototypes are then optimized to their locally optimal positions in the data space and arranged in layers, with meaningful dense links in between, to form pyramidal hierarchies according to the respective levels of granularity. After being primed offline, the constructed classification model is capable of continuously self-developing from streaming data to self-expand its knowledge base. The proposed approach offers higher transparency and is convenient for visualization thanks to its hierarchically nested architecture. Its system identification process is objective, data-driven, and free from prior assumptions about the data generation model as well as from user- and problem-specific parameters. Its decision-making process follows the “nearest prototype” principle and is highly explainable and traceable. Numerical examples on a wide range of benchmark problems demonstrate its high performance.
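The decision rule itself is the standard “nearest prototype” assignment; a minimal sketch is given below, with prototype discovery (density peaks, granularity layers, and the hierarchy) deliberately omitted and the prototypes simply supplied as inputs.

```python
import numpy as np

def nearest_prototype_predict(X, prototypes, labels):
    """Assign each sample the label of its nearest prototype, the
    explainable decision rule such approaches are built on. Returns the
    matched prototype indices too, so each decision stays traceable.
    """
    X = np.asarray(X, dtype=float)
    P = np.asarray(prototypes, dtype=float)
    d = np.linalg.norm(X[:, None, :] - P[None, :, :], axis=2)
    idx = d.argmin(axis=1)            # nearest prototype per sample
    return np.asarray(labels)[idx], idx

protos = [[0.0, 0.0], [5.0, 5.0]]
labels = ["class_a", "class_b"]
pred, which = nearest_prototype_predict([[0.5, 0.2], [4.8, 5.1]], protos, labels)
print(pred, which)  # ['class_a' 'class_b'] [0 1]
```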