52 research outputs found

    Evolving test instances of the Hamiltonian completion problem

    Predicting and comparing algorithm performance on graph instances is challenging for multiple reasons. First, there is usually no standard set of instances to benchmark performance. Second, using existing graph generators results in a restricted spectrum of difficulty, and the resulting graphs are usually not diverse enough to draw sound conclusions. That is why recent work proposes a new methodology to generate a diverse set of instances by using an evolutionary algorithm. We can then analyze the resulting graphs and get key insights into which attributes are most related to algorithm performance. We can also fill observed gaps in the instance space in order to generate graphs with previously unseen combinations of features. This methodology is applied to the instance space of the Hamiltonian completion problem using two different solvers, namely the Concorde TSP Solver and a multi-start local search algorithm.
    Comment: 12 pages, 12 figures, minor revisions in section
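    The instance-evolution loop described above can be sketched as a simple evolutionary algorithm: graphs are mutated by flipping edges, and fitness rewards instances on which two solvers disagree. The sketch below is illustrative only; it assumes networkx, and the two cost functions are dummy stand-ins for measured performance of the Concorde TSP Solver and a multi-start local search.

```python
# Minimal sketch of evolving graph instances with an evolutionary algorithm.
# Assumptions: networkx is available; solver_a_cost / solver_b_cost are dummy
# placeholders for real solver performance measurements.
import random
import networkx as nx

def solver_a_cost(g):
    # Placeholder "performance" of solver A on instance g.
    return nx.number_of_edges(g) / max(1, nx.number_of_nodes(g))

def solver_b_cost(g):
    # Placeholder "performance" of solver B on instance g.
    return nx.density(g) * nx.number_of_nodes(g)

def fitness(g):
    # An instance is interesting when the two solvers disagree strongly.
    return abs(solver_a_cost(g) - solver_b_cost(g))

def mutate(g, p=0.05):
    # Flip each potential edge with small probability.
    h = g.copy()
    nodes = list(h.nodes)
    for i, u in enumerate(nodes):
        for v in nodes[i + 1:]:
            if random.random() < p:
                if h.has_edge(u, v):
                    h.remove_edge(u, v)
                else:
                    h.add_edge(u, v)
    return h

def evolve(n_nodes=20, pop_size=10, generations=50):
    population = [nx.gnp_random_graph(n_nodes, 0.3) for _ in range(pop_size)]
    for _ in range(generations):
        offspring = [mutate(random.choice(population)) for _ in range(pop_size)]
        population = sorted(population + offspring, key=fitness, reverse=True)[:pop_size]
    return population

best = evolve()[0]
print(best.number_of_nodes(), best.number_of_edges(), fitness(best))
```

    In the paper itself, fitness would be derived from measured solver performance and instance-space features rather than the dummy functions used here to keep the sketch self-contained.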

    From algorithm selection to generation using deep learning

    Algorithm selection and generation techniques are two methods that can be used to exploit the performance complementarity of different algorithms when applied to large, diverse sets of combinatorial problem instances. As there is no single algorithm that dominates all others on all problem instances, algorithm selection automatically selects the algorithm expected to perform best on each problem instance, while algorithm generation refers to combining different algorithms in a manner that allows the resulting method to improve on the efficacy of a pool of algorithms. This thesis examines algorithm selection and generation within a single streaming problem domain, namely Bin-Packing, where novel approaches are proposed and evaluated on large problem sets. The research starts by presenting a novel feature-free approach that selects the best-performing heuristic by capturing the sequential information implicit in a streaming instance and using it as direct input to one of two Deep Learning (DL) models, a Long Short-Term Memory (LSTM) network or a Gated Recurrent Unit (GRU), which learns a mapping from an instance to an algorithm. Results show that the feature-free selectors significantly outperform both the single best solver and the classical feature-based approach using well-known Machine Learning (ML) classifiers when applied to large sets of diverse problem instances. Next, a more radical approach is proposed: bypass algorithm selection altogether by training an encoder-decoder LSTM on solutions obtained from a set of algorithms to predict a solution directly from the instance data, in effect behaving as an automatically generated algorithm. Experiments conducted on large datasets using problem batches of varying sizes show that the generated algorithm is able to accurately predict solutions, particularly for small batch sizes. Finally, the thesis extends the encoder-decoder approach by introducing a novel neural approach for generating algorithms, in which a neural network acts as an algorithm by generating decisions. Two architectures are evaluated, an encoder-decoder LSTM and a feed-forward Neural Network (NN), trained on the decisions output by existing algorithms on a large set of instances. Experiments show that the newly generated algorithms solve a subset of instances better than well-known bin-packing algorithms and hence can significantly improve overall performance when added to a pool of algorithms.
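    As a rough illustration of the feature-free selection idea, the sketch below feeds the raw item-size sequence of a streaming bin-packing instance into an LSTM and predicts which heuristic to apply. It assumes PyTorch; the dataset, labels, and hyper-parameters are synthetic placeholders, not those used in the thesis.

```python
# Minimal sketch of feature-free heuristic selection with an LSTM.
# Assumptions: PyTorch; synthetic data stands in for real bin-packing instances
# and for the "best heuristic" labels obtained by running candidate heuristics.
import torch
import torch.nn as nn

class HeuristicSelector(nn.Module):
    def __init__(self, hidden=64, n_heuristics=2):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_heuristics)

    def forward(self, item_sizes):            # item_sizes: (batch, seq_len, 1)
        _, (h_n, _) = self.lstm(item_sizes)   # final hidden state summarises the stream
        return self.head(h_n[-1])             # logits over candidate heuristics

# One synthetic training step: random item sizes, random "best heuristic" labels.
model = HeuristicSelector()
optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

items = torch.rand(32, 100, 1)                # 32 instances of 100 items each
labels = torch.randint(0, 2, (32,))           # which heuristic performed best on each instance
loss = loss_fn(model(items), labels)
optimiser.zero_grad()
loss.backward()
optimiser.step()
```

    A recurrent model is a natural fit here because the instance arrives as a stream of items, so the selector can consume the sequence directly without hand-crafted features.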

    Combinatorial optimization problems and metaheuristics: Review, challenges, design, and development

    Peres, F., & Castelli, M. (2021). Combinatorial optimization problems and metaheuristics: Review, challenges, design, and development. Applied Sciences (Switzerland), 11(14), 1-39, Article 6449. https://doi.org/10.3390/app11146449
    In the past few decades, metaheuristics have demonstrated their suitability in addressing complex problems over different domains. This success drives the scientific community towards the definition of new and better-performing heuristics and results in an increased interest in this research field. Nevertheless, new studies have focused on developing new algorithms without consolidating the existing knowledge. Furthermore, the absence of rigor and formalism to classify, design, and develop combinatorial optimization problems and metaheuristics represents a challenge to the field's progress. This study discusses the main concepts and challenges in this area and proposes a formalism to classify, design, and code combinatorial optimization problems and metaheuristics. We believe these contributions may support the progress of the field and increase the maturity of metaheuristics as problem solvers, analogous to other machine learning algorithms.
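    One way such a separation of concerns plays out in code is to keep the problem definition and the metaheuristic behind a minimal interface, so the same search procedure can be reused across problems. The sketch below is a hypothetical illustration of that idea (a toy permutation problem and a plain local search), not the formalism proposed by the authors.

```python
# Hypothetical illustration of separating "problem" from "metaheuristic":
# the search procedure is written once against a tiny interface; the toy
# problem below is invented for this sketch and is not from the paper.
import random
from typing import Protocol

class Problem(Protocol):
    def initial(self): ...
    def cost(self, s) -> float: ...
    def neighbour(self, s): ...

def local_search(problem: Problem, iters: int = 1000):
    # Generic hill-climbing local search usable with any Problem implementation.
    best = problem.initial()
    best_cost = problem.cost(best)
    for _ in range(iters):
        cand = problem.neighbour(best)
        cand_cost = problem.cost(cand)
        if cand_cost < best_cost:
            best, best_cost = cand, cand_cost
    return best, best_cost

class AdjacentDifferenceProblem:
    """Toy combinatorial problem: order numbers so adjacent differences are small."""
    def __init__(self, values):
        self.values = values
    def initial(self):
        s = list(self.values)
        random.shuffle(s)
        return s
    def cost(self, s):
        return sum(abs(a - b) for a, b in zip(s, s[1:]))
    def neighbour(self, s):
        s = list(s)
        i, j = random.sample(range(len(s)), 2)
        s[i], s[j] = s[j], s[i]
        return s

print(local_search(AdjacentDifferenceProblem([5, 1, 9, 3, 7, 2])))
```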

    Nature-inspired Methods for Stochastic, Robust and Dynamic Optimization

    Nature-inspired algorithms enjoy great popularity in the current scientific community, being the focus of many research contributions in the literature year after year. The rationale behind the momentum acquired by this broad family of methods lies in their outstanding performance across hundreds of research fields and problem instances. This book centres on the development of nature-inspired methods and their application to stochastic, dynamic and robust optimization. Topics covered include the design and development of evolutionary algorithms, bio-inspired metaheuristics and memetic methods, with empirical, innovative findings in different subfields of mathematical optimization, such as stochastic, dynamic, multimodal and robust optimization, as well as noisy optimization and dynamic and constraint satisfaction problems.

    Optimality, flexibility and efficiency for cell formation in group technology

