4 research outputs found

    On Constructing Ensembles for Combinatorial Optimisation

    Although the use of ensemble methods in machine learning is ubiquitous due to their proven ability to outperform their constituent algorithms, ensembles of optimisation algorithms have received relatively little attention. Existing approaches lag behind machine learning in both theory and practice, with no principled design guidelines available. In this paper, we address fundamental questions regarding ensemble composition in optimisation using the domain of bin-packing as an example; in particular, we investigate the trade-off between accuracy and diversity, and whether diversity metrics can be used as a proxy for constructing an ensemble, proposing a number of novel metrics for comparing algorithm diversity. We find that randomly composed ensembles can outperform ensembles of high-performing algorithms under certain conditions and that judicious choice of diversity metric is required to construct good ensembles. The method and findings can be generalised to any meta-heuristic ensemble, and lead to a better understanding of how to undertake principled ensemble design.
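    A minimal sketch of the idea of an ensemble of optimisation heuristics for bin-packing, where the ensemble's answer on an instance is the best packing found by any member. The specific heuristics, the instance generator, and the "random versus best-single" comparison below are illustrative assumptions, not the paper's experimental setup.

# Illustrative sketch: an ensemble of simple bin-packing heuristics whose
# output on an instance is the fewest bins produced by any member.
import random

def first_fit(items, capacity):
    bins = []
    for item in items:
        for b in bins:
            if sum(b) + item <= capacity:
                b.append(item)
                break
        else:
            bins.append([item])
    return len(bins)

def best_fit(items, capacity):
    bins = []
    for item in items:
        candidates = [b for b in bins if sum(b) + item <= capacity]
        if candidates:
            # Place the item in the feasible bin left with the least free space.
            min(candidates, key=lambda b: capacity - sum(b) - item).append(item)
        else:
            bins.append([item])
    return len(bins)

def first_fit_decreasing(items, capacity):
    return first_fit(sorted(items, reverse=True), capacity)

def shuffled_first_fit(items, capacity):
    items = items[:]
    random.shuffle(items)  # a deliberately "diverse" but individually weak member
    return first_fit(items, capacity)

HEURISTICS = [first_fit, best_fit, first_fit_decreasing, shuffled_first_fit]

def ensemble_bins(ensemble, items, capacity):
    """Ensemble output = best (minimum bin count) result over all members."""
    return min(h(items, capacity) for h in ensemble)

if __name__ == "__main__":
    random.seed(0)
    capacity = 100
    instances = [[random.randint(20, 70) for _ in range(60)] for _ in range(30)]

    # Compare the single best heuristic against a randomly composed ensemble.
    solo_totals = [sum(h(inst, capacity) for inst in instances) for h in HEURISTICS]
    random_ensemble = random.sample(HEURISTICS, 2)
    print("best single heuristic, total bins:", min(solo_totals))
    print("random 2-member ensemble, total bins:",
          sum(ensemble_bins(random_ensemble, inst, capacity) for inst in instances))

    In this toy setting, the accuracy/diversity trade-off shows up directly: a randomly composed ensemble can beat the best single heuristic whenever its members fail on different instances, which is the property the paper's diversity metrics aim to capture.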

    Evolving comprehensible and scalable solvers using CGP for solving some real-world inspired problems

    My original contribution to knowledge is the application of Cartesian Genetic Programming to the automatic design of scalable and human-understandable metaheuristics that find suitable solutions to real-world NP-hard, discrete problems. The technique is thought to raise the generality of the problem-solving process, supporting supervised machine learning tasks and the evolution of non-deterministic algorithms.

    Two extensions of Cartesian Genetic Programming are presented. Iterative Cartesian Genetic Programming can encode loops and nested loops together with their termination criteria, making the whole programming construct susceptible to evolutionary modification. This newly developed extension, applied to metaheuristics, is demonstrated to discover effective solvers for NP-hard, discrete problems. The thesis also extends Cartesian Genetic Programming and Iterative Cartesian Genetic Programming so that a hyper-heuristic reproductive operator is adapted while the automatic design space is explored. It is demonstrated that exploration of the automated design space improves when specific types of active and non-active genes are mutated.

    A series of rigorous empirical investigations demonstrates that lowering the comprehension barrier of automatically designed algorithms helps to communicate and identify effective and ineffective patterns of primitives. Evolving loops and nested loops in full, without imposing a hard limit on the number of recursive calls, is shown to broaden the automatic design space. Finally, it is argued that a learning objective function able to assess the scalability of a generated algorithm can benefit a generative hyper-heuristic.
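    For readers unfamiliar with the representation, the sketch below shows a minimal Cartesian Genetic Programming genotype being decoded and point-mutated. The function set, single-row genome layout, and mutation rate are assumptions chosen for brevity; the thesis's Iterative CGP loop encoding and hyper-heuristic operator adaptation are not reproduced here.

# Illustrative sketch: a tiny CGP genome of (function, input-a, input-b)
# triples plus one output gene, with feed-forward connections only.
import random

FUNCS = [lambda a, b: a + b,
         lambda a, b: a - b,
         lambda a, b: a * b,
         lambda a, b: max(a, b)]

N_INPUTS, N_NODES = 2, 6  # genome: N_NODES triples + 1 output gene

def random_genome():
    genome = []
    for node in range(N_NODES):
        upper = N_INPUTS + node  # connect only to inputs or earlier nodes
        genome += [random.randrange(len(FUNCS)),
                   random.randrange(upper),
                   random.randrange(upper)]
    genome.append(random.randrange(N_INPUTS + N_NODES))  # output gene
    return genome

def evaluate(genome, inputs):
    values = list(inputs)
    for node in range(N_NODES):
        f, a, b = genome[3 * node: 3 * node + 3]
        values.append(FUNCS[f](values[a], values[b]))
    return values[genome[-1]]

def point_mutate(genome, rate=0.1):
    child = genome[:]
    for i in range(len(child)):
        if random.random() < rate:
            if i == len(child) - 1:            # output gene
                child[i] = random.randrange(N_INPUTS + N_NODES)
            elif i % 3 == 0:                   # function gene
                child[i] = random.randrange(len(FUNCS))
            else:                              # connection gene
                child[i] = random.randrange(N_INPUTS + i // 3)
    return child

if __name__ == "__main__":
    random.seed(1)
    parent = random_genome()
    print("parent f(3, 4) =", evaluate(parent, (3, 4)))
    print("mutant f(3, 4) =", evaluate(point_mutate(parent), (3, 4)))

    Because only genes reachable from the output gene affect the decoded program, mutations to non-active genes change behaviour only when they later become active, which is why the thesis distinguishes mutation of active from non-active genes when exploring the design space.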