14 research outputs found

    Regression Error Characteristic Optimisation of Non-Linear Models.

    Get PDF
    Copyright © 2006 Springer-Verlag Berlin Heidelberg. The final publication is available at link.springer.com. Book title: Multi-Objective Machine Learning. In this chapter, recent research in the area of multi-objective optimisation of regression models is presented and combined. Evolutionary multi-objective optimisation techniques are described for training a population of regression models to optimise the recently defined Regression Error Characteristic (REC) curves, a method which meaningfully compares across regressors and against benchmark models (i.e. 'random walk' and maximum a posteriori approaches) for varying error rates. Through bootstrapping the training data, degrees of confident out-performance are also highlighted.
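    The REC curve itself is straightforward to compute: for a sweep of error tolerances, record the fraction of test points whose absolute error falls within each tolerance. A minimal sketch in Python (the function name and the area-over-curve summary are illustrative, not taken from the chapter):

```python
import numpy as np

def rec_curve(y_true, y_pred, n_points=100):
    """Regression Error Characteristic curve: for each error
    tolerance, the fraction of points predicted within it."""
    errors = np.abs(np.asarray(y_true) - np.asarray(y_pred))
    tolerances = np.linspace(0.0, errors.max(), n_points)
    accuracy = np.array([(errors <= t).mean() for t in tolerances])
    return tolerances, accuracy

y_true = np.array([1.0, 2.0, 3.0, 4.0])
y_pred = np.array([1.1, 1.9, 3.5, 4.0])
tol, acc = rec_curve(y_true, y_pred)
# Area over the curve (smaller is better) summarises the regressor,
# much as 1 - AUC does for a ROC curve.
aoc = float(np.sum(np.diff(tol) * (1.0 - 0.5 * (acc[1:] + acc[:-1]))))
```

    The curve is monotone non-decreasing and reaches 1.0 at the largest observed error, which makes curves from different regressors directly comparable on the same axes.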

    Multi-objective optimisation for receiver operating characteristic analysis

    Get PDF
    Copyright © 2006 Springer-Verlag Berlin Heidelberg. The final publication is available at link.springer.com. Book title: Multi-Objective Machine Learning. Summary: Receiver operating characteristic (ROC) analysis is now a standard tool for the comparison of binary classifiers and the selection of operating parameters when the costs of misclassification are unknown. This chapter outlines the use of evolutionary multi-objective optimisation techniques for ROC analysis, in both its traditional binary classification setting and in the novel multi-class ROC situation. Methods for comparing classifier performance in the multi-class case, based on an analogue of the Gini coefficient, are described, which leads to a natural method of selecting the classifier operating point. Illustrations are given on synthetic data and an application to Short Term Conflict Alert.
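    In the binary setting, the Gini coefficient referred to here is a simple transform of the area under the ROC curve (AUC); the chapter's contribution is a multi-class analogue of it. A sketch of the standard binary quantities, assuming higher scores indicate the positive class (names are illustrative):

```python
import numpy as np

def roc_points(labels, scores):
    """ROC (FPR, TPR) points from sweeping the decision threshold."""
    order = np.argsort(-np.asarray(scores, dtype=float))
    labels = np.asarray(labels)[order]
    tpr = np.cumsum(labels == 1) / max((labels == 1).sum(), 1)
    fpr = np.cumsum(labels == 0) / max((labels == 0).sum(), 1)
    return np.concatenate([[0.0], fpr]), np.concatenate([[0.0], tpr])

def auc_gini(labels, scores):
    fpr, tpr = roc_points(labels, scores)
    # Trapezoidal area under the ROC curve.
    auc = float(np.sum(np.diff(fpr) * 0.5 * (tpr[1:] + tpr[:-1])))
    return auc, 2.0 * auc - 1.0  # binary Gini = 2*AUC - 1

auc, gini = auc_gini([0, 0, 1, 1], [0.1, 0.2, 0.8, 0.9])
```

    For a perfectly separating scorer, as in this toy example, AUC is 1 and the Gini coefficient is 1; a random scorer gives AUC 0.5 and Gini 0.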

    Shape optimisation of the sharp-heeled Kaplan draft tube: Performance evaluation using Computational Fluid Dynamics

    Get PDF
    A methodology to assess the performance of an elbow-type draft tube is outlined. This was achieved using Computational Fluid Dynamics (CFD) to evaluate the pressure recovery and mechanical energy losses along a draft tube design, while using open-source and commercial software to parameterise and regenerate the geometry and CFD grid. An initial validation study of the elbow-type draft tube is carried out, focusing on the grid-regeneration methodology, steady-state assumption, and turbulence modelling approach for evaluating the design's efficiency. The Grid Convergence Index (GCI) technique was used to assess the uncertainty of the pressure recovery with respect to the grid resolution. It was found that estimating the pressure recovery through area-weighted averaging significantly reduced the uncertainty due to the grid. Simultaneously, it was found that this uncertainty fluctuated with the local cross-sectional area along the geometry. Subsequently, a study of the influence of the inflow cone and outer-heel designs on the flow field and pressure recovery was carried out. Catmull-Rom splines were used to parameterise these components, so as to recreate a number of proposed designs from the literature. GCI analysis is also applied to these designs, demonstrating the robustness of the grid-regeneration methodology.
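    For reference, the GCI mentioned above follows Roache's formulation: an observed order of convergence is estimated from solutions on three systematically refined grids, and a safety factor inflates the relative error between the two finest grids. A sketch under the usual assumptions (constant refinement ratio, monotone convergence); variable names and the sample values are illustrative:

```python
import math

def grid_convergence_index(f_fine, f_med, f_coarse, r=2.0, fs=1.25):
    """Roache's GCI from three systematically refined grids with
    constant refinement ratio r and safety factor fs."""
    # Observed order of convergence from the three solutions.
    p = math.log((f_coarse - f_med) / (f_med - f_fine)) / math.log(r)
    # Relative error between the two finest grids, inflated by fs.
    e = abs((f_med - f_fine) / f_fine)
    return p, fs * e / (r ** p - 1.0)

# Toy solutions converging at first order under halving of grid spacing:
p, gci = grid_convergence_index(1.0, 1.1, 1.3)  # p = 1, GCI = 0.125
```

    The safety factor of 1.25 is the value commonly recommended when three grids are available; 3.0 is used with only two.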

    Automated configuration of genetic algorithms by tuning for anytime performance: hot-off-the-press track at GECCO 2022

    No full text
    Algorithms and the Foundations of Software Technology

    On generalizing the power function exponent constructions with genetic programming

    No full text

    Using Structural Bias to Analyse the Behaviour of Modular CMA-ES

    Get PDF
    The Covariance Matrix Adaptation Evolution Strategy (CMA-ES) is a commonly used iterative optimisation heuristic for optimising black-box functions. CMA-ES comes in many flavours with different configuration settings. In this work, we investigate whether CMA-ES suffers from structural bias and which modules and parameters affect the strength and type of structural bias. Structural bias occurs when an algorithm, or a component of the algorithm, biases the search towards a specific direction in the search space irrespective of the objective function. In addition to this investigation, we propose a method to assess the relationship between structural bias and the performance of configurations with different types of bias on the BBOB suite of benchmark functions. Surprisingly for such a popular algorithm, 90.3% of the 1,620 CMA-ES configurations were found to have structural bias. Some interesting patterns between module settings and bias types are presented, and further insights are discussed.
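    The core idea of a structural-bias test can be illustrated with a toy heuristic: on an objective that returns fresh noise for every evaluation, no direction is objectively better, so an unbiased algorithm's final best points should be uniformly distributed over the domain. A minimal sketch (not the paper's modular CMA-ES framework); here the bound-clipping step is a deliberate, well-known source of boundary bias:

```python
import numpy as np

rng = np.random.default_rng(0)

def run_search(n_iters=100):
    """Toy (1+1)-style search on [0, 1] with a purely random
    objective; clipping to the bounds drags points to the edges."""
    x = rng.random()
    fx = rng.random()  # f(x) is fresh noise on every call
    for _ in range(n_iters):
        y = min(max(x + rng.normal(0.0, 0.2), 0.0), 1.0)
        fy = rng.random()
        if fy < fx:
            x, fx = y, fy
    return x

finals = np.sort([run_search() for _ in range(300)])
# One-sample Kolmogorov-Smirnov statistic against Uniform(0, 1):
# large values suggest the final points are not uniform, i.e. bias.
ecdf = np.arange(1, len(finals) + 1) / len(finals)
ks = float(np.max(np.abs(ecdf - finals)))
```

    In practice such a statistic is compared against its sampling distribution under uniformity; the clipped toy search above accumulates mass at the boundaries, which is exactly the kind of signature a structural-bias analysis looks for.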

    IOHanalyzer: detailed performance analyses for iterative optimization heuristics: hot-off-the-press track @ GECCO 2022

    No full text

    Quantum-enhanced selection operators for evolutionary algorithms

    No full text
    Genetic algorithms have unique properties which are useful when applied to black-box optimization. Using selection, crossover, and mutation operators, candidate solutions may be obtained without the need to calculate a gradient. In this work, we study results obtained from using quantum-enhanced operators within the selection mechanism of a genetic algorithm. Our approach frames the selection process as a minimization of a binary quadratic model with which we encode fitness and distance between members of a population, and we leverage a quantum annealing system to sample low-energy solutions for the selection mechanism. We benchmark these quantum-enhanced algorithms against classical algorithms over various black-box objective functions, including the OneMax function, and functions from the IOHProfiler library for black-box optimization. We observe a performance gain in the average number of generations to convergence for the quantum-enhanced elitist selection operator in comparison to classical on the OneMax function. We also find that the quantum-enhanced selection operator with non-elitist selection outperforms benchmarks on functions with fitness perturbation from the IOHProfiler library. Additionally, we find that in the case of elitist selection, the quantum-enhanced operators outperform classical benchmarks on functions with varying degrees of dummy variables and neutrality.
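    The selection-as-minimisation idea can be sketched classically: build a binary quadratic model (QUBO) whose linear terms reward fitness, whose quadratic terms reward pairwise distance (diversity), and whose penalty terms enforce that exactly k individuals are chosen, then minimise it, here by brute force rather than on an annealer. All names and weights below are illustrative, not the paper's actual encoding:

```python
import itertools
import numpy as np

def selection_qubo(fitness, dist, k, lam=1.0, mu=5.0):
    """Upper-triangular QUBO: minimising x'Qx over binary x picks
    k individuals trading fitness against pairwise diversity.
    The penalty mu*(sum(x) - k)^2 is expanded into the Q entries;
    its constant term mu*k^2 is dropped (argmin is unchanged)."""
    n = len(fitness)
    Q = np.zeros((n, n))
    for i in range(n):
        Q[i, i] = -fitness[i] + mu * (1 - 2 * k)
        for j in range(i + 1, n):
            Q[i, j] = -lam * dist[i, j] + 2 * mu
    return Q

def brute_force_min(Q):
    """Stand-in for the annealer: exhaustive search over binary x."""
    n = Q.shape[0]
    best, best_e = None, float("inf")
    for bits in itertools.product([0, 1], repeat=n):
        x = np.array(bits)
        e = float(x @ Q @ x)
        if e < best_e:
            best, best_e = x, e
    return best

fitness = np.array([3.0, 1.0, 2.0, 0.5])
dist = 1.0 - np.eye(4)  # unit distance between distinct members
x = brute_force_min(selection_qubo(fitness, dist, k=2))
# With equal distances, the two fittest members (0 and 2) win.
```

    A quantum annealer replaces the exhaustive search with sampling of low-energy states of the same Q, which is where the claimed advantage enters.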

    MCS diversity and classifier confidence: A Bayesian approach

    No full text
    Bayes' rule is introduced as a coherent averaging strategy for multiclassifier system (MCS) output, and as a strategy for eliminating the uncertainty associated with a particular choice of classifier-model parameters. We use a Markov chain Monte Carlo method for efficient selection of classifiers to approximate the computationally intractable elements of the Bayesian approach; the set of classifiers so selected is our MCS. Furthermore, we exploit the massive sampling (thousands of classifiers) within the Bayesian framework to encompass an estimate of the confidence to be placed in any classification result, thus providing a sound basis for rejecting some MCS classification results. We present uncertainty envelopes as one way to derive these confidence estimates from the population of classifiers that constitutes the MCS, and we show that as the diversity among component classifiers increases, so does the accuracy of confident classification estimates; however, diversity is not a panacea. If diversity is increased by elaboration of the data models, then care must be taken to match model sampling to model complexity; otherwise, diversity can have the negative effect of leading to excessive numbers of low-confidence classifications.
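    The averaging-and-rejection scheme can be sketched as follows: class probabilities from the sampled classifiers are averaged, and a test point is rejected when the averaged confidence falls below a threshold. This is a simplified stand-in for the paper's uncertainty envelopes; the threshold, names, and synthetic data are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

def mcs_predict(member_probs, threshold=0.8):
    """Average class probabilities over the sampled classifiers that
    form the MCS; flag points whose averaged confidence is low."""
    mean_probs = member_probs.mean(axis=0)   # (n_points, n_classes)
    labels = mean_probs.argmax(axis=1)
    confidence = mean_probs.max(axis=1)
    return labels, confidence, confidence < threshold

# 50 hypothetical posterior-sampled classifiers, 4 points, 2 classes;
# Dirichlet samples stand in for each member's predicted probabilities.
member_probs = rng.dirichlet([1.0, 1.0], size=(50, 4))
labels, conf, rejected = mcs_predict(member_probs)
```

    With diverse but individually plausible members, averaged confidences concentrate where the members agree; points left in the rejected mask are exactly those the abstract argues should not be classified confidently.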