    A Feature-Based Analysis on the Impact of Set of Constraints for ε-Constrained Differential Evolution

    Full text link
    Different types of evolutionary algorithms have been developed for constrained continuous optimization. We carry out a feature-based analysis of evolved constrained continuous optimization instances to understand the characteristics of constraints that make problems hard for evolutionary algorithms. In our study, we examine how various sets of constraints influence the behaviour of ε-Constrained Differential Evolution. Investigating the evolved instances, we obtain knowledge of which types of constraints, and which of their features, make a problem difficult for the examined algorithm.
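    The comparison rule at the heart of ε-Constrained Differential Evolution is simple enough to sketch. Below is a minimal, illustrative Java sketch of the ε-level comparison commonly associated with this family of methods, not the paper's own code; the names Candidate and epsilonBetter are ours. Solutions whose constraint violation is within ε are ranked by objective value, otherwise by violation.

```java
// Minimal sketch of the epsilon-level comparison used in epsilon-constrained
// Differential Evolution. Illustrative only; names are not from the paper's code.
public final class EpsilonComparison {

    /** A candidate solution: objective value f and total constraint violation phi (>= 0). */
    public record Candidate(double f, double phi) {}

    /**
     * Returns true if a is preferred over b under the epsilon-level comparison:
     * if both violations are within epsilon (or equal), compare objectives;
     * otherwise the smaller violation wins.
     */
    public static boolean epsilonBetter(Candidate a, Candidate b, double epsilon) {
        boolean bothFeasibleEnough = a.phi() <= epsilon && b.phi() <= epsilon;
        if (bothFeasibleEnough || a.phi() == b.phi()) {
            return a.f() < b.f();          // minimisation of the objective
        }
        return a.phi() < b.phi();          // otherwise prefer the lower violation
    }

    public static void main(String[] args) {
        Candidate x = new Candidate(1.0, 0.05);
        Candidate y = new Candidate(2.0, 0.00);
        // With a loose epsilon both count as feasible, so the better objective wins.
        System.out.println(epsilonBetter(x, y, 0.1));   // true
        // With epsilon = 0 the feasible candidate y wins despite its worse objective.
        System.out.println(epsilonBetter(x, y, 0.0));   // false
    }
}
```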

    Towards a Theory-Guided Benchmarking Suite for Discrete Black-Box Optimization Heuristics: Profiling (1+λ) EA Variants on OneMax and LeadingOnes

    Full text link
    Theoretical and empirical research on evolutionary computation methods complement each other by providing two fundamentally different approaches towards a better understanding of black-box optimization heuristics. In discrete optimization, both streams developed rather independently of each other, but we observe today an increasing interest in reconciling these two sub-branches. In continuous optimization, the COCO (COmparing Continuous Optimisers) benchmarking suite has established itself as an important platform that theoreticians and practitioners use to exchange research ideas and questions. No widely accepted equivalent exists in the research domain of discrete black-box optimization. Marking an important step towards filling this gap, we adjust the COCO software to pseudo-Boolean optimization problems, and obtain from this a benchmarking environment that allows a fine-grained empirical analysis of discrete black-box heuristics. In this documentation we demonstrate how this test bed can be used to profile the performance of evolutionary algorithms. More concretely, we study the optimization behavior of several (1+λ) EA variants on the two benchmark problems OneMax and LeadingOnes. This comparison motivates a refined analysis for the optimization time of the (1+λ) EA on LeadingOnes.
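    To make the profiled setting concrete, here is a minimal, self-contained Java sketch of a (1+λ) EA with standard bit mutation on OneMax. It illustrates the algorithm family discussed above; it is not tied to the COCO-based test bed described in the paper, and all names are ours.

```java
import java.util.Random;

// Minimal sketch of a (1+lambda) EA on OneMax (the number of ones in a bit string),
// using standard bit mutation with rate 1/n. Illustrative only.
public final class OnePlusLambdaEA {

    public static int oneMax(boolean[] x) {
        int ones = 0;
        for (boolean bit : x) if (bit) ones++;
        return ones;
    }

    /** Runs the (1+lambda) EA until the optimum is found; returns the number of evaluations used. */
    public static long run(int n, int lambda, Random rng) {
        boolean[] parent = new boolean[n];
        for (int i = 0; i < n; i++) parent[i] = rng.nextBoolean();
        int parentFitness = oneMax(parent);
        long evaluations = 1;

        while (parentFitness < n) {
            boolean[] bestChild = null;
            int bestFitness = Integer.MIN_VALUE;
            for (int k = 0; k < lambda; k++) {
                // Standard bit mutation: flip each bit independently with probability 1/n.
                boolean[] child = parent.clone();
                for (int i = 0; i < n; i++) {
                    if (rng.nextDouble() < 1.0 / n) child[i] = !child[i];
                }
                int f = oneMax(child);
                evaluations++;
                if (f > bestFitness) { bestFitness = f; bestChild = child; }
            }
            // Elitist selection: the best of the lambda offspring replaces the parent if not worse.
            if (bestFitness >= parentFitness) {
                parent = bestChild;
                parentFitness = bestFitness;
            }
        }
        return evaluations;
    }

    public static void main(String[] args) {
        System.out.println(run(100, 10, new Random(42)));
    }
}
```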

    The SOS Platform: Designing, Tuning and Statistically Benchmarking Optimisation Algorithms

    Get PDF
    We present Stochastic Optimisation Software (SOS), a Java platform facilitating the algorithmic design process and the evaluation of metaheuristic optimisation algorithms. SOS reduces the burden of coding miscellaneous methods for dealing with several bothersome and time-consuming tasks such as parameter tuning, implementation of comparison algorithms and testbed problems, collecting and processing data to display results, measuring algorithmic overhead, etc. SOS provides numerous off-the-shelf methods including: (1) customised implementations of statistical tests, such as the Wilcoxon rank-sum test and the Holm–Bonferroni procedure, for comparing the performances of optimisation algorithms and automatically generating result tables in PDF and LaTeX formats; (2) the implementation of an original advanced statistical routine for accurately comparing pairs of stochastic optimisation algorithms; (3) the implementation of a novel testbed suite for continuous optimisation, derived from the IEEE CEC 2014 benchmark, allowing for the controlled activation of rotation on each testbed function. Moreover, we briefly comment on the current state of the literature in stochastic optimisation and highlight similarities shared by modern metaheuristics inspired by nature. We argue that the vast majority of these algorithms are simply a reformulation of the same methods and that metaheuristics for optimisation should simply be treated as stochastic processes, with less emphasis on the inspiring metaphor behind them.
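    As an illustration of the kind of statistical routine mentioned in point (1), the following is a minimal Java sketch of the textbook Holm–Bonferroni step-down procedure. It is not the SOS implementation; the class and method names are ours.

```java
import java.util.Arrays;
import java.util.Comparator;

// Minimal sketch of the Holm-Bonferroni step-down procedure for multiple comparisons.
// Illustrative only: not the SOS implementation, just the textbook procedure.
public final class HolmBonferroni {

    /**
     * Given raw p-values from m pairwise comparisons and a significance level alpha,
     * returns a boolean array where reject[i] is true if hypothesis i is rejected.
     */
    public static boolean[] holm(double[] pValues, double alpha) {
        int m = pValues.length;
        Integer[] order = new Integer[m];
        for (int i = 0; i < m; i++) order[i] = i;
        // Sort hypothesis indices by ascending p-value.
        Arrays.sort(order, Comparator.comparingDouble(i -> pValues[i]));

        boolean[] reject = new boolean[m];
        for (int rank = 0; rank < m; rank++) {
            double threshold = alpha / (m - rank);   // step-down adjusted level
            if (pValues[order[rank]] <= threshold) {
                reject[order[rank]] = true;
            } else {
                break;                               // stop at the first non-rejection
            }
        }
        return reject;
    }

    public static void main(String[] args) {
        double[] p = {0.011, 0.04, 0.002, 0.3};
        // With alpha = 0.05: 0.002 <= 0.05/4 and 0.011 <= 0.05/3 are rejected,
        // then 0.04 > 0.05/2 stops the procedure.
        System.out.println(Arrays.toString(holm(p, 0.05)));  // [true, false, true, false]
    }
}
```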

    A Feature-Based Comparison of Evolutionary Computing Techniques for Constrained Continuous Optimisation

    Full text link
    Evolutionary algorithms have been frequently applied to constrained continuous optimisation problems. We carry out feature-based comparisons of different types of evolutionary algorithms, such as evolution strategies, differential evolution and particle swarm optimisation, for constrained continuous optimisation. In our study, we examine how sets of constraints influence the difficulty of obtaining close-to-optimal solutions. Using a multi-objective approach, we evolve constrained continuous problems with sets of linear and/or quadratic constraints on which the different evolutionary approaches show a significant difference in performance. Afterwards, we discuss the features of the constraints that lead to these performance differences among the evolutionary approaches under consideration.
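    For readers unfamiliar with how such constraint sets are typically handled, the following small, hypothetical Java sketch evaluates linear and quadratic inequality constraints of the form g(x) ≤ 0 and aggregates their positive parts into a single violation measure. It is illustrative only and not the instance generator used in the paper.

```java
// Minimal sketch: evaluating a set of linear and quadratic inequality constraints
// g_k(x) <= 0 and summing their violations. Illustrative only; names are ours.
public final class ConstraintViolation {

    /** Linear constraint a.x + b <= 0; returns the left-hand side value. */
    public static double linear(double[] a, double b, double[] x) {
        double v = b;
        for (int i = 0; i < x.length; i++) v += a[i] * x[i];
        return v;
    }

    /** Quadratic constraint sum_i q_i * x_i^2 + a.x + b <= 0 (diagonal quadratic term). */
    public static double quadratic(double[] q, double[] a, double b, double[] x) {
        double v = linear(a, b, x);
        for (int i = 0; i < x.length; i++) v += q[i] * x[i] * x[i];
        return v;
    }

    /** Total violation: sum of the positive parts of all constraint values (0 means feasible). */
    public static double totalViolation(double[] constraintValues) {
        double total = 0.0;
        for (double g : constraintValues) total += Math.max(0.0, g);
        return total;
    }

    public static void main(String[] args) {
        double[] x = {1.0, 2.0};
        double g1 = linear(new double[]{1.0, 1.0}, -2.5, x);                              // 0.5 -> violated
        double g2 = quadratic(new double[]{1.0, 0.0}, new double[]{0.0, 0.0}, -4.0, x);   // -3.0 -> satisfied
        System.out.println(totalViolation(new double[]{g1, g2}));                         // 0.5
    }
}
```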

    State-of-the-art in aerodynamic shape optimisation methods

    Get PDF
    Aerodynamic optimisation has become an indispensable component of aerodynamic design over the past 60 years, with applications to aircraft, cars, trains, bridges, wind turbines, internal pipe flows, and cavities, among others, and is thus relevant in many facets of technology. With advancements in computational power, automated design optimisation procedures have become more capable; however, there is ambiguity and bias throughout the literature with regard to the relative performance of optimisation architectures and the algorithms they employ. This paper provides a well-balanced critical review of the dominant optimisation approaches that have been integrated with aerodynamic theory for the purpose of shape optimisation. A total of 229 papers, published in more than 120 journals and conference proceedings, have been classified into six different optimisation-algorithm approaches. The material cited includes some of the most well-established authors and publications in the field of aerodynamic optimisation. This paper aims to eliminate bias toward certain algorithms by analysing the limitations, drawbacks and benefits of the most widely used optimisation approaches. The review provides comprehensive but straightforward insight for non-specialists and a reference detailing the current state of the field for specialist practitioners.

    Application of Firefly Algorithm and Its Parameter Setting for Job Shop Scheduling

    Get PDF
    The job shop scheduling problem (JSSP) is one of the most famous scheduling problems, most instances of which are categorised as Non-deterministic Polynomial-time (NP) hard. The objectives of this paper are to: (i) present the application of a recently developed metaheuristic called the Firefly Algorithm (FA) to solving the JSSP; (ii) investigate the parameter setting of the proposed algorithm; and (iii) compare FA performance under various parameter settings. The computational experiment was designed and conducted using five benchmark JSSP datasets from the classical OR-Library. The experimental results were analysed to compare FA performance with and without optimised parameter settings. The FA with the appropriate parameter settings obtained from this analysis produced better best-so-far schedules than the FA without tuned parameter settings.
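    For context, the parameters typically tuned in a Firefly Algorithm study are the randomisation weight α, the base attractiveness β0 and the light-absorption coefficient γ. The following minimal Java sketch shows the standard continuous-domain movement rule in which these parameters appear; encoding and decoding job-shop schedules (e.g. via random keys) is outside its scope, and the class name is ours.

```java
import java.util.Random;

// Minimal sketch of the core Firefly Algorithm movement rule and its parameters
// (alpha, beta0, gamma), the quantities a parameter-setting study tunes.
// Continuous-domain form only; mapping positions to job-shop schedules is not shown.
public final class FireflyStep {

    /** Moves firefly xi towards a brighter firefly xj and adds a random perturbation. */
    public static void move(double[] xi, double[] xj,
                            double alpha, double beta0, double gamma, Random rng) {
        // Squared Euclidean distance between the two fireflies.
        double r2 = 0.0;
        for (int d = 0; d < xi.length; d++) {
            double diff = xi[d] - xj[d];
            r2 += diff * diff;
        }
        // Attractiveness decays with distance: beta = beta0 * exp(-gamma * r^2).
        double beta = beta0 * Math.exp(-gamma * r2);
        for (int d = 0; d < xi.length; d++) {
            xi[d] += beta * (xj[d] - xi[d])            // attraction towards the brighter firefly
                   + alpha * (rng.nextDouble() - 0.5); // scaled random walk component
        }
    }

    public static void main(String[] args) {
        double[] firefly = {0.9, 0.1};
        double[] brighter = {0.2, 0.8};
        move(firefly, brighter, 0.2, 1.0, 1.0, new Random(7));
        System.out.println(firefly[0] + ", " + firefly[1]);
    }
}
```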

    Nature-inspired optimization algorithms: challenges and open problems

    Get PDF
    Many problems in science and engineering can be formulated as optimization problems, subject to complex nonlinear constraints. The solutions of highly nonlinear problems usually require sophisticated optimization algorithms, and traditional algorithms may struggle to deal with such problems. A current trend is to use nature-inspired algorithms due to their flexibility and effectiveness. However, there are some key issues concerning nature-inspired computation and swarm intelligence. This paper provides an in-depth review of some recent nature-inspired algorithms, with an emphasis on their search mechanisms and mathematical foundations. Some challenging issues are identified and five open problems are highlighted, concerning the analysis of algorithmic convergence and stability, parameter tuning, the mathematical framework, the role of benchmarking, and scalability. These problems are discussed together with directions for future research.