
    Discrete self-organizing migration algorithm and p-location problems

    Mathematical modelling, and integer programming in general, has many practical applications in different areas of human life. Effective and fast solution approaches for various optimization problems play an important role in the decision-making process, and therefore much attention is paid to the development of exact and approximate algorithms. This paper deals with a special class of location problems in which a given number of facilities is to be chosen so as to minimize the objective function value. Since exact methods are unsuitable because of their unpredictable computational time or memory demands, we focus here on the possible use of a special type of particle swarm optimization algorithm transformed, by discretization and the use of memes, into the so-called discrete self-organizing migrating algorithm. The paper confirms that it is possible to design a sophisticated heuristic for this zero-one programming problem which can produce a near-optimal solution in much less time than exact methods require. We introduce a special adaptation of the discrete self-organizing migrating algorithm to the p-location problem that makes use of the path-relinking method. In the theoretical part of the paper, we introduce several strategies for the migration process. To verify their features and effectiveness, a computational study on real-sized benchmarks was performed. The main goal of the experiments was to find the most efficient version of the suggested solving tool.
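
    The abstract does not spell out the migration operators; purely as an illustration, the following sketch shows how a path-relinking-style move between two facility subsets of a p-location (p-median-type) problem might look. The toy distance matrix, the cost and relink helpers, and the greedy acceptance rule are assumptions made for this sketch, not the authors' implementation.

```python
import random

def cost(facilities, dist):
    """Illustrative p-median-style objective: each customer is served by its nearest open facility."""
    return sum(min(row[f] for f in facilities) for row in dist)

def relink(source, target, dist):
    """Move step by step from `source` towards `target`, swapping one facility at a time,
    and return the best intermediate solution found along the path."""
    current = set(source)
    best, best_cost = set(current), cost(current, dist)
    while current != set(target):
        candidates = []
        for f_out in current - set(target):          # facility to close
            for f_in in set(target) - current:       # facility to open
                trial = (current - {f_out}) | {f_in}
                candidates.append((cost(trial, dist), trial))
        trial_cost, current = min(candidates, key=lambda t: t[0])
        if trial_cost < best_cost:
            best, best_cost = set(current), trial_cost
    return best, best_cost

if __name__ == "__main__":
    random.seed(1)
    n, p = 12, 3
    # toy distance matrix (illustrative data only)
    dist = [[abs(i - j) + random.random() for j in range(n)] for i in range(n)]
    a = random.sample(range(n), p)
    b = random.sample(range(n), p)
    print(relink(a, b, dist))
```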

    AN INVESTIGATION OF METAHEURISTICS USING PATH-RELINKING ON THE QUADRATIC ASSIGNMENT PROBLEM

    The Quadratic Assignment Problem (QAP) is a widely researched yet complex combinatorial optimization problem that is applicable to modeling many real-world problems; indeed, many optimization problems are formulated as QAPs. To solve QAPs, the recent trend has been to use metaheuristics rather than exact or heuristic methods, and many researchers have found that hybrid metaheuristics are even more effective. A more recently proposed hybrid approach is path relinking (PR), which generates solutions by combining two or more reference solutions. In this dissertation, we investigated these diversification and intensification mechanisms using the QAP. To satisfy the extensive demands on computational resources, we utilized a High Throughput Computing (HTC) environment and test cases from QAPLIB (a QAP test case repository). This dissertation consists of three integrated studies that build upon each other. The first phase explores the effects of parameter tuning, metaheuristic design, and representation schemes (random-key and permutation solution encodings) for two path-based metaheuristics (Tabu Search and Simulated Annealing) and two population-based metaheuristics (Genetic Algorithms and Artificial Immune Algorithms), using the QAP as a testbed. In the second phase, we examined eight tuned metaheuristics representing the two representation schemes using problem characteristics: problem size, flow and distance dominance measures, sparsity (the number of zero entries in the matrices), and the coefficient of correlation of the matrices were used to build search trajectories. The third phase focuses on intensification and diversification mechanisms using path-relinking (PR) procedures (two variants of position-based path relinking) to enhance the performance of path-based and population-based metaheuristics. Current research in this field has explored the unusual effectiveness of PR algorithms in a variety of applications and has emphasized the significance of future research incorporating more sophisticated strategies and frameworks. In addition to addressing these issues, we also examined the effects of solution representations on PR augmentation. For future research, we propose metaheuristic studies using fitness landscape analysis to investigate particular metaheuristics' fitness landscapes and their evolution through parameter tuning, solution representation, and PR augmentation. The main research contributions of this dissertation are to widen the knowledge domains of metaheuristic design, representation schemes, parameter tuning, PR mechanism viability, and search trajectory analysis of the fitness landscape using QAPs.
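
    As a hedged illustration of one plausible form of position-based path relinking on QAP permutations (not necessarily the dissertation's exact variant), the sketch below repairs one position at a time with a swap, so every intermediate solution remains a valid permutation; the small flow and distance matrices are made up for the example.

```python
def qap_cost(perm, flow, dist):
    """Standard QAP objective: sum of flow[i][j] * dist[perm[i]][perm[j]]."""
    n = len(perm)
    return sum(flow[i][j] * dist[perm[i]][perm[j]] for i in range(n) for j in range(n))

def position_based_pr(initiating, guiding, flow, dist):
    """Walk from the initiating permutation towards the guiding one.
    At each step, fix one position to the guiding value via a swap and
    keep the best intermediate permutation seen on the path."""
    current = list(initiating)
    best, best_cost = current[:], qap_cost(current, flow, dist)
    for pos, target_fac in enumerate(guiding):
        if current[pos] != target_fac:
            j = current.index(target_fac)          # where the guiding facility currently sits
            current[pos], current[j] = current[j], current[pos]
            c = qap_cost(current, flow, dist)
            if c < best_cost:
                best, best_cost = current[:], c
    return best, best_cost

if __name__ == "__main__":
    flow = [[0, 3, 1], [3, 0, 2], [1, 2, 0]]   # illustrative instance data
    dist = [[0, 1, 4], [1, 0, 2], [4, 2, 0]]
    print(position_based_pr([0, 1, 2], [2, 0, 1], flow, dist))
```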

    A Pareto-metaheuristic for a bi-objective winner determination problem in a combinatorial reverse auction

    The bi-objective winner determination problem (2WDP-SC) of a combinatorial procurement auction for transport contracts amounts to a multi-criteria set covering problem. We are given a set B of bundle bids. A bundle bid b in B consists of a bidding carrier c_b, a bid price p_b, and a set tau_b of transport contracts, which is a subset of the set T of tendered transport contracts. Additionally, the transport quality q_t,c_b that is expected to be realized when transport contract t is executed by carrier c_b is given. The task of the auctioneer is to find a set X of winning bids (X a subset of B) such that each transport contract is part of at least one winning bid, the total procurement costs are minimized, and the total transport quality is maximized. This article presents a metaheuristic approach for the 2WDP-SC that integrates a greedy randomized adaptive search procedure, large neighborhood search, and self-adaptive parameter setting in order to find a competitive set of non-dominated solutions. The procedure outperforms existing heuristics. Computational experiments performed on a set of benchmark instances show that, for small instances, the presented procedure is the sole approach that succeeds in finding all Pareto-optimal solutions. For each of the large benchmark instances, it attains new best-known solution sets according to common multi-criteria quality indicators from the literature.
    Keywords: Pareto optimization; multi-criteria winner determination; combinatorial auction; GRASP; LNS
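
    A minimal sketch of the bi-objective bookkeeping such a procedure needs (procurement cost minimized, transport quality maximized) is given below; the dominates and update_archive helpers and the sample (cost, quality) pairs are illustrative assumptions, not the paper's data structures.

```python
def dominates(a, b):
    """a dominates b if it is no worse in both objectives and strictly better in one
    (cost is minimized, quality is maximized)."""
    cost_a, qual_a = a
    cost_b, qual_b = b
    return (cost_a <= cost_b and qual_a >= qual_b) and (cost_a < cost_b or qual_a > qual_b)

def update_archive(archive, candidate):
    """Insert `candidate` into the non-dominated archive if nothing dominates it,
    and drop any archived point that the candidate dominates."""
    if any(dominates(s, candidate) for s in archive):
        return archive
    return [s for s in archive if not dominates(candidate, s)] + [candidate]

if __name__ == "__main__":
    archive = []
    for sol in [(100, 0.8), (90, 0.7), (120, 0.95), (95, 0.7)]:   # hypothetical (cost, quality) pairs
        archive = update_archive(archive, sol)
    print(sorted(archive))   # only mutually non-dominated pairs remain
```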

    Improved Neighbourhood Search-Based Methods for Graph Layout

    Graph drawing, or the automatic layout of graphs, is a challenging problem. There are several search-based methods for graph drawing that are based on optimising a fitness function formed from a weighted sum of multiple criteria. This thesis proposes a new neighbourhood search-based method that uses a tabu search coupled with path relinking in order to optimise such fitness functions for general graph layouts with undirected straight lines. Neither of these methods had previously been used in general multi-criteria graph drawing. Tabu search uses a memory list to speed up searching by avoiding previously tested solutions, while the path relinking method generates new solutions by exploring paths that connect high-quality solutions. We use path relinking periodically within the tabu search procedure to speed up the identification of good solutions. We have evaluated our new method against the commonly used neighbourhood search optimisation techniques: hill climbing and simulated annealing. Our evaluation examines the quality of the graph layout (the fitness function's value) and the speed of layout in terms of the number of evaluated solutions required to draw a graph. We also examine the relative scalability of our method. Our experiments were performed on both random graphs and a real-world dataset. We show that our method outperforms both hill climbing and simulated annealing by producing a better layout in a lower number of evaluated solutions. In addition, we demonstrate that our method has greater scalability, as it can lay out larger graphs than the state-of-the-art neighbourhood search-based methods. Finally, we show that similar results can be produced in a real-world setting by testing our method against a standard public graph dataset.
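
    The thesis' actual criteria and weights are not reproduced here; the sketch below only illustrates the general shape of a weighted-sum fitness function for a straight-line layout, using two commonly used criteria (node-node closeness and edge-length deviation) as assumptions.

```python
import math

def layout_fitness(pos, edges, w_node=1.0, w_edge=1.0, ideal_len=1.0):
    """Weighted sum of two illustrative layout criteria (lower is better):
    closeness between node pairs and deviation of edge lengths from an ideal length."""
    nodes = list(pos)
    node_term = sum(
        1.0 / max(math.dist(pos[u], pos[v]), 1e-9)          # penalise nodes that are too close
        for i, u in enumerate(nodes) for v in nodes[i + 1:]
    )
    edge_term = sum((math.dist(pos[u], pos[v]) - ideal_len) ** 2 for u, v in edges)
    return w_node * node_term + w_edge * edge_term

if __name__ == "__main__":
    pos = {"a": (0.0, 0.0), "b": (1.0, 0.0), "c": (0.5, 0.9)}   # toy straight-line layout
    edges = [("a", "b"), ("b", "c"), ("a", "c")]
    print(layout_fitness(pos, edges))
```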

    Graph drawing using tabu search coupled with path relinking

    Graph drawing, or the automatic layout of graphs, is a challenging problem. There are several search-based methods for graph drawing which are based on optimizing an objective function formed from a weighted sum of multiple criteria. In this paper, we propose a new neighbourhood search method which uses a tabu search coupled with path relinking to optimize such objective functions for general graph layouts with undirected straight lines. To our knowledge, before our work, neither of these methods had previously been used in general multi-criteria graph drawing. Tabu search uses a memory list to speed up searching by avoiding previously tested solutions, while the path relinking method generates new solutions by exploring paths that connect high-quality solutions. We use path relinking periodically within the tabu search procedure to speed up the identification of good solutions. We have evaluated our new method against the commonly used neighbourhood search optimization techniques: hill climbing and simulated annealing. Our evaluation examines the quality of the graph layout (the objective function's value) and the speed of layout in terms of the number of evaluated solutions required to draw a graph. We also examine the relative scalability of each method. Our experiments were performed on both random graphs and a real-world dataset. We show that our method outperforms both hill climbing and simulated annealing by producing a better layout in a lower number of evaluated solutions. In addition, we demonstrate that our method has greater scalability, as it can lay out larger graphs than the state-of-the-art neighbourhood search methods. Finally, we show that similar results can be produced in a real-world setting by testing our method against a standard public graph dataset.
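
    As a rough, generic skeleton of how path relinking can be invoked periodically inside a tabu search loop (a sketch, not the paper's implementation), assuming user-supplied fitness, neighbours, and relink functions:

```python
def tabu_search_with_pr(initial, fitness, neighbours, relink,
                        iterations=1000, tabu_size=50, pr_period=100):
    """Generic tabu search that keeps an elite solution and periodically
    relinks the current solution with it to intensify the search."""
    current = best = initial
    best_val = fitness(best)
    tabu = []                                   # memory of recently visited solutions
    for it in range(1, iterations + 1):
        candidates = [s for s in neighbours(current) if s not in tabu]
        if not candidates:
            break
        current = min(candidates, key=fitness)  # best admissible neighbour
        tabu.append(current)
        if len(tabu) > tabu_size:
            tabu.pop(0)
        if fitness(current) < best_val:
            best, best_val = current, fitness(current)
        if it % pr_period == 0:                 # periodic path relinking with the elite solution
            candidate = relink(current, best)
            if fitness(candidate) < best_val:
                best, best_val = candidate, fitness(candidate)
    return best, best_val

if __name__ == "__main__":
    # toy 1-D example: minimise (x - 7)^2 over the integers
    print(tabu_search_with_pr(
        initial=0,
        fitness=lambda x: (x - 7) ** 2,
        neighbours=lambda x: [x - 1, x + 1],
        relink=lambda a, b: (a + b) // 2,
        iterations=200,
    ))
```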

    Solving the waste collection problem from a multiobjective perspective: New methodologies and case studies

    Doctoral thesis defence date: 19 March 2018. Applied Economics (Mathematics). Thesis abstract: Waste management is a subject of study for local administrations worldwide. Several factors must be taken into account to deliver an efficient service. This work develops a tool to analyse and solve the solid waste collection problem in Málaga. After an exhaustive analysis of the data, the real problem is addressed as a capacitated multiobjective routing problem. For multiobjective problems there is usually no single optimal solution, but rather a set of Pareto-efficient solutions. The characteristics of the problem make exact resolution infeasible, so different metaheuristic strategies are applied to obtain a good approximation. In particular, GRASP, Path Relinking and Variable Neighborhood Search techniques are combined and adapted to the multi-criteria perspective. It is a two-phase approach: a first approximation of the efficient frontier is generated by a multiobjective GRASP. Three methods are proposed for this first approximation, two of them derived from Martí et al. (2015) and the third based on Wierzbicki's achievement scalarizing function (Wierzbicki, 1980) for different weight combinations. This approximation is then improved with a version of Path Relinking or Variable Neighborhood Search using a reference point designed for multiobjective problems. Once the approximation of the efficient frontier has been generated, the solution that best fits the managers' preferences is obtained through an interactive trade-off-free method derived from the NAUTILUS philosophy (Miettinen et al., 2010). To avoid extensive computational costs, this methodology relies on a pre-computation of the elements of the efficient frontier.
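
    For reference, a Wierzbicki-style achievement scalarizing function of the kind used in the first phase can be sketched as below, in minimization form; the augmentation coefficient rho and the sample objective, reference, and weight values are illustrative assumptions.

```python
def achievement_scalarizing(objectives, reference, weights, rho=1e-4):
    """Wierzbicki-style achievement scalarizing function (all objectives minimized):
    the max term pulls the solution towards the reference point,
    the small rho term breaks ties among solutions with the same max."""
    diffs = [w * (f - r) for f, r, w in zip(objectives, reference, weights)]
    return max(diffs) + rho * sum(diffs)

if __name__ == "__main__":
    # two objectives, e.g. total route cost and a negated service-quality measure (toy values)
    print(achievement_scalarizing(objectives=[120.0, -0.70],
                                  reference=[100.0, -0.80],
                                  weights=[0.5, 0.5]))
```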

    Heuristic techniques for large-scale instances of the cable trench problem

    Advisors: Flávio Keidi Miyazawa, Eduardo Candido Xavier. Master's dissertation, Universidade Estadual de Campinas, Instituto de Computação.
    The Cable Trench Problem (CTP) was introduced in 2002 to model wired networks. It can be seen as a combination of the shortest path problem and the minimum spanning tree problem. An instance of the problem consists of a graph G=(V,E) with edge weights representing the distance between the incident vertices. A special vertex represents a facility, and all other vertices represent clients. A solution to the problem is a spanning tree rooted at the facility. The cost of a solution is the cost of the spanning tree multiplied by a trench cost factor, plus the cost of the cables reaching the root from every client. For each client, the cable cost is the length of its path in the spanning tree, from the client to the root, multiplied by a cable cost factor. The CTP models scenarios where each client must be connected to a central facility through a dedicated cable; each cable must be laid in a trench, and a trench may hold an unlimited number of cables. Since the cost of cables and trenches is proportional to their lengths multiplied by a cost factor, the problem is to find a network of minimum cost. Previous works in the literature have used the CTP to model problems in telecommunications, power distribution, rail networks, and even the reconstruction of blood vessel networks from computed tomography exams. This research focuses on large-scale instances of the problem (above 10 thousand vertices), achieving better results than previous works found in the literature. We developed a series of heuristics for the problem. In search of instance simplifications, we present safe reduction rules, i.e., rules that do not compromise any optimal solution, as well as heuristic reduction rules capable of removing edges that are unlikely to be part of "good" solutions of an instance. We present a fast local search algorithm capable of improving solutions even for large-scale instances. We also developed algorithms based on the Greedy Randomized Adaptive Search Procedure (GRASP) and formulated a heuristic that contracts vertices. By contracting vertices, a large-scale CTP instance is represented as a vertex-wise smaller instance of the Cable Trench Problem with Demands (CTPD). Dealing with smaller instances enables a new range of techniques, such as linear-programming-based algorithms, to solve them. We demonstrate how a solution of this reduced version with demands can be used to rebuild a feasible solution for the original CTP instance, and how this vertex contraction technique has the potential to solve even larger instances of the CTP.
    Master's degree in Computer Science. Grants 133323/2018-8 and 131175/2017-3, CNPq.
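
    A small sketch of the cost structure described above (trench cost proportional to the total tree length, cable cost proportional to each client's root-path length) may help; the toy tree, the root label, and the cost factors are assumptions, not data from the dissertation.

```python
def ctp_cost(tree, root, trench_factor, cable_factor):
    """Cost of a spanning tree for the Cable Trench Problem:
    trench cost = total edge length * trench_factor,
    cable cost  = sum over clients of their root-path length * cable_factor."""
    adjacency = {}
    for u, v, w in tree:
        adjacency.setdefault(u, []).append((v, w))
        adjacency.setdefault(v, []).append((u, w))
    # depth-first traversal accumulating each vertex's distance from the root along the tree
    dist_from_root, stack, seen = {root: 0.0}, [root], {root}
    while stack:
        u = stack.pop()
        for v, w in adjacency.get(u, []):
            if v not in seen:
                seen.add(v)
                dist_from_root[v] = dist_from_root[u] + w
                stack.append(v)
    trench = trench_factor * sum(w for _, _, w in tree)
    cables = cable_factor * sum(d for v, d in dist_from_root.items() if v != root)
    return trench + cables

if __name__ == "__main__":
    # toy tree given as (u, v, length) edges, rooted at the facility "f"
    tree = [("f", "a", 2.0), ("a", "b", 1.5), ("f", "c", 3.0)]
    print(ctp_cost(tree, root="f", trench_factor=10.0, cable_factor=1.0))
```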