
    Incremental Genetic Algorithms for Graph Optimization Problems

    Doctoral dissertation, Seoul National University Graduate School, Department of Electrical and Computer Engineering, August 2016. Advisor: Byung-Ro Moon. A combinatorial optimization problem is an optimization problem with a discrete solution space. Many graph problems belong to this category, since graphs are discrete objects. Graphs are widely used across many fields, and numerous real-world combinatorial optimization problems take graphs as their input. For some of these problems, the size of the solution space grows exponentially with the problem size, so efficient search algorithms are required to deal with them. Genetic algorithms are widely used to solve combinatorial optimization problems, and incremental genetic algorithms can solve graph optimization problems efficiently. Instead of tackling a problem directly, we define subproblems and solve them step by step. A subproblem solved by an incremental genetic algorithm deals with a restriction of the original graph structure. The subproblems are solved in intermediate steps, and the size of the subproblem is gradually increased. We apply the same genetic algorithm to each subproblem, initializing it with the evolved population of the previous step. We propose incremental genetic algorithms for two combinatorial optimization problems: the subgraph isomorphism problem and graph cut optimization problems. We devise an optimal substructure on the subproblem sequence and explain how it relates to the optimality of the process, along with other related factors. We present graph expansion methodologies and vertex reordering schemes to define an appropriate sequence of subproblems. We combine the proposed incremental approach with a hybrid genetic algorithm for the subgraph isomorphism problem, and the algorithm was further developed to achieve nearly perfect results. Based on our analysis, we also propose an incremental genetic algorithm for graph cut optimization problems. We tested the implementation on benchmark graph instances for the graph partitioning problem and the maximum cut problem. Through experiments, we investigate and analyze how the sequence of subproblems affects the search-space landscape. The performance of a genetic algorithm improves when the incremental approach is applied with an appropriate sequence of subproblems.
    Table of contents: Chapter I. Introduction 1. Chapter II. Incremental Genetic Algorithm 6 (2.1 Overview and Traditional Applications 6; 2.2 Application on Graph Optimization Problems 9; 2.2.1 Formalization of the Incremental Process 9; 2.2.2 Theoretical Background 12; 2.2.3 Sequence of Subproblems 15). Chapter III. Subgraph Isomorphism Problem 19 (3.1 Introduction 19; 3.2 The Proposed Algorithm 21; 3.2.1 The Structure of the Incremental Genetic Algorithm 21; 3.2.2 Design Issues 25; 3.2.3 Genetic Framework 28; 3.3 Experimental Results 31; 3.3.1 Dataset and Evaluation 31; 3.3.2 Results and Discussions 33; 3.3.3 Overall Results 39; 3.4 Further Improvement 42; 3.4.1 New Operators 43; 3.4.2 Improvements by New Operators 45; 3.4.3 Overall Result 46). Chapter IV. Graph Cut Optimization Problems 50 (4.1 Introduction 50; 4.2 The Proposed Algorithm 51; 4.2.1 Subproblem Structure 51; 4.2.2 Reordering Schemes 54; 4.2.3 Genetic Framework 55; 4.3 Experimental Results 57; 4.3.1 Dataset and Evaluation 57; 4.3.2 Results on Graph Partitioning Problem 58; 4.3.3 Results on Maximum Cut Problem 66; 4.3.4 Results on Problem Variants 70). Chapter V. Related Applications 75 (5.1 Measuring Source Code Similarity with an Incremental Genetic Algorithm 75; 5.1.1 Introduction 75; 5.1.2 The Proposed System 76; 5.1.3 Experimental Results 80; 5.1.4 Discussion 88; 5.2 Linear Ordering Problem and an Approximate Fitness Evaluation 88; 5.2.1 Introduction 88; 5.2.2 The Proposed Method 89; 5.2.3 Experimental Results 91). Chapter VI. Conclusions 94. Bibliography 96. Abstract in Korean 106.
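
    The incremental scheme described above can be summarised in a short sketch: each subproblem restricts the graph to a growing vertex prefix, and the genetic algorithm for a step is seeded with the population evolved in the previous step. The code below is a minimal, self-contained illustration on the maximum cut problem with a deliberately simple GA; the operators, parameters, and vertex ordering are illustrative assumptions, not the dissertation's actual components.

        import random

        def cut_weight(assignment, edges):
            # number of edges crossing the cut defined by a 0/1 vertex assignment
            return sum(1 for u, v in edges if assignment[u] != assignment[v])

        def evolve(population, edges, generations=200):
            # deliberately small GA: elitism, tournament selection, uniform crossover, one bit flip
            fitness = lambda ind: cut_weight(ind, edges)
            n, pop_size = len(population[0]), len(population)
            for _ in range(generations):
                new_pop = [max(population, key=fitness)]
                while len(new_pop) < pop_size:
                    p1 = max(random.sample(population, 3), key=fitness)
                    p2 = max(random.sample(population, 3), key=fitness)
                    child = [random.choice(genes) for genes in zip(p1, p2)]
                    child[random.randrange(n)] ^= 1
                    new_pop.append(child)
                population = new_pop
            return population

        def incremental_max_cut(n, edges, steps, pop_size=30):
            # solve max cut through a sequence of vertex-prefix subproblems; the population
            # evolved on the first `size` vertices seeds the next, larger subproblem
            population = None
            for size in steps:
                sub_edges = [(u, v) for u, v in edges if u < size and v < size]
                if population is None:
                    population = [[random.randint(0, 1) for _ in range(size)]
                                  for _ in range(pop_size)]
                else:
                    population = [ind + [random.randint(0, 1) for _ in range(size - len(ind))]
                                  for ind in population]
                population = evolve(population, sub_edges)
            return max(population, key=lambda ind: cut_weight(ind, edges))

        # toy instance: a random graph on 20 vertices solved in three incremental steps
        random.seed(0)
        edges = [(u, v) for u in range(20) for v in range(u + 1, 20) if random.random() < 0.3]
        best = incremental_max_cut(20, edges, steps=[7, 14, 20])
        print(cut_weight(best, edges), "edges cut")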

    Digital Filter Design Using Improved Artificial Bee Colony Algorithms

    Digital filters are often used in digital signal processing applications. The design objective of a digital filter is to find the optimal set of filter coefficients that satisfies the desired specifications of magnitude and group delay responses. Evolutionary algorithms are population-based meta-heuristic algorithms inspired by the biological behaviors of species. Compared to gradient-based optimization algorithms such as steepest descent and Newton-like methods, these bio-inspired algorithms have the advantages of being less prone to getting stuck at local optima and being largely independent of the starting point in the solution space. The limitations of evolutionary algorithms include the presence of control parameters, problem-specific tuning procedures, premature convergence, and a slower convergence rate. The artificial bee colony (ABC) algorithm is a swarm-based meta-heuristic search algorithm inspired by the foraging behavior of honey bee colonies, with the benefit of relatively few control parameters. In its original form, the ABC algorithm has certain limitations, such as a low convergence rate and an insufficient balance between exploration and exploitation in the search equations. In this dissertation, an ABC-AMR algorithm is proposed by incorporating an adaptive modification rate (AMR) into the original ABC algorithm to increase the convergence rate, adjusting the balance between exploration and exploitation in the search equations through an adaptive determination of the number of parameters to be updated in every iteration. A constrained ABC-AMR algorithm is also developed for solving constrained optimization problems. There are many real-world problems requiring the simultaneous optimization of more than one conflicting objective. Multiobjective (MO) optimization produces a set of feasible solutions called the Pareto front instead of a single optimum solution. For multiobjective optimization, if a decision maker's preferences can be incorporated during the optimization process, the search can be confined to the region of interest instead of covering the entire region. In this dissertation, two algorithms are developed for such incorporation. The first is a reference-point-based MOABC algorithm in which a decision maker's preferences are included in the optimization process as the reference point. The second is a physical-programming-based MOABC algorithm in which physical programming is used for setting the region of interest of a decision maker. In this dissertation, the four developed algorithms are applied to solve digital filter design problems. The ABC-AMR algorithm is used to design Types 3 and 4 linear phase FIR differentiators, and the results are compared to those obtained by the original ABC algorithm, three improved ABC algorithms, and the Parks-McClellan algorithm. The constrained ABC-AMR algorithm is applied to the design of sparse Type 1 linear phase FIR filters of filter orders 60, 70, and 80, and the results are compared to three state-of-the-art design methods. The reference-point-based multiobjective ABC algorithm is used to design asymmetric lowpass, highpass, bandpass, and bandstop FIR filters, and the results are compared to those obtained by the preference-based multiobjective differential evolution algorithm. The physical-programming-based multiobjective ABC algorithm is used to design IIR lowpass, highpass, and bandpass filters, and the results are compared to three state-of-the-art design methods. Based on the obtained design results, the four design algorithms are shown to be competitive with the state-of-the-art design methods.
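
    The key idea of ABC-AMR, adaptively choosing how many coordinates of a food source are updated per iteration, can be illustrated with a short sketch. The code below shows only the employed-bee phase of an ABC variant with a modification rate and an intentionally simple adaptation rule; the onlooker and scout phases, and the dissertation's actual AMR update, are omitted, so every name and constant here is an illustrative assumption.

        import random

        def abc_amr_step(foods, fitness, mr, lower, upper):
            # one employed-bee pass: every coordinate of a food source is perturbed with
            # probability mr, so mr controls how many parameters change per iteration
            dim, improved = len(foods[0]), 0
            for i, x in enumerate(foods):
                k = random.choice([j for j in range(len(foods)) if j != i])   # random partner
                candidate = list(x)
                for d in range(dim):
                    if random.random() < mr:
                        phi = random.uniform(-1.0, 1.0)
                        candidate[d] = min(max(x[d] + phi * (x[d] - foods[k][d]), lower), upper)
                if fitness(candidate) < fitness(x):   # greedy replacement (minimisation)
                    foods[i] = candidate
                    improved += 1
            return improved / len(foods)

        def adapt_mr(mr, success_rate, step=0.05):
            # illustrative rule only: explore more when few sources improved, less otherwise
            return min(0.9, mr + step) if success_rate < 0.2 else max(0.1, mr - step)

        # toy run on the sphere function
        random.seed(1)
        sphere = lambda v: sum(c * c for c in v)
        foods = [[random.uniform(-5, 5) for _ in range(10)] for _ in range(20)]
        mr = 0.3
        for _ in range(300):
            mr = adapt_mr(mr, abc_amr_step(foods, sphere, mr, -5.0, 5.0))
        print(min(sphere(f) for f in foods))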

    Preventing premature convergence and proving the optimality in evolutionary algorithms

    Proceedings: http://ea2013.inria.fr//proceedings.pdf. Evolutionary Algorithms (EA) usually carry out an efficient exploration of the search space, but often get trapped in local minima and do not prove the optimality of the solution. Interval-based techniques, on the other hand, yield a numerical proof of optimality of the solution. However, they may fail to converge within a reasonable time due to their inability to quickly compute a good approximation of the global minimum and their exponential complexity. The contribution of this paper is a hybrid algorithm called Charibde in which a particular EA, Differential Evolution, cooperates with a Branch and Bound algorithm endowed with interval propagation techniques. It prevents premature convergence toward local optima and outperforms both deterministic and stochastic existing approaches. We demonstrate its efficiency on a benchmark of highly multimodal problems, for which we provide previously unknown global minima and certification of optimality.
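
    The cooperation pattern behind such a hybrid can be illustrated on a tiny self-contained example: a stochastic search (here a bare-bones differential evolution) supplies an incumbent upper bound, and an interval branch-and-bound uses a natural interval extension to discard boxes that cannot contain a better minimum, returning a certified enclosure of the global minimum. The 1-D test function, the hand-written interval extension, and the one-way cooperation below are illustrative assumptions; Charibde itself uses interval propagation/contraction and a richer two-way exchange between the two solvers.

        import random

        def f(x):
            # a multimodal 1-D test function with two local minima
            return x**4 - 4.0*x**2 + x

        def ipow_even(lo, hi, p):
            # interval [lo, hi] raised to an even power p
            if lo <= 0.0 <= hi:
                return 0.0, max(abs(lo), abs(hi)) ** p
            a, b = abs(lo) ** p, abs(hi) ** p
            return min(a, b), max(a, b)

        def lower_bound(lo, hi):
            # lower bound of f over [lo, hi] from its natural interval extension
            q_lo, _ = ipow_even(lo, hi, 4)
            _, s_hi = ipow_even(lo, hi, 2)
            return q_lo - 4.0 * s_hi + lo

        def de_best(lo, hi, pop_size=20, gens=60, F=0.7, cr=0.9):
            # bare-bones differential evolution (rand/1 mutation, greedy selection)
            pop = [random.uniform(lo, hi) for _ in range(pop_size)]
            for _ in range(gens):
                for i in range(pop_size):
                    r1, r2, r3 = random.sample([j for j in range(pop_size) if j != i], 3)
                    trial = pop[r1] + F * (pop[r2] - pop[r3]) if random.random() < cr else pop[i]
                    trial = min(max(trial, lo), hi)
                    if f(trial) < f(pop[i]):
                        pop[i] = trial
            return min(pop, key=f)

        def certified_minimum(lo, hi, tol=1e-6):
            # interval branch and bound pruned by the DE incumbent (one-way cooperation)
            incumbent = f(de_best(lo, hi))          # stochastic search supplies an upper bound
            proven, boxes = float("inf"), [(lo, hi)]
            while boxes:
                a, b = boxes.pop()
                lb = lower_bound(a, b)
                if lb > incumbent:                  # box cannot contain a better minimum: prune
                    continue
                mid = 0.5 * (a + b)
                incumbent = min(incumbent, f(mid))  # midpoint evaluation may improve the bound
                if b - a > tol:
                    boxes += [(a, mid), (mid, b)]
                else:
                    proven = min(proven, lb)
            return min(proven, incumbent), incumbent   # the global minimum lies in this interval

        random.seed(2)
        print(certified_minimum(-3.0, 3.0))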

    Large-scale unit commitment under uncertainty: an updated literature survey

    The Unit Commitment problem in energy management aims at finding the optimal production schedule of a set of generation units while meeting various system-wide constraints. It has always been a large-scale, non-convex, difficult problem, especially in view of the fact that, due to operational requirements, it has to be solved in an unreasonably small time for its size. Recently, growing renewable energy shares have strongly increased the level of uncertainty in the system, making the (ideal) Unit Commitment model a large-scale, non-convex and uncertain (stochastic, robust, chance-constrained) program. We provide a survey of the literature on methods for the Uncertain Unit Commitment problem, in all its variants. We start with a review of the main contributions on solution methods for the deterministic versions of the problem, focusing on those based on mathematical programming techniques that are more relevant for the uncertain versions of the problem. We then present and categorize the approaches to the latter, while providing entry points to the relevant literature on optimization under uncertainty. This is an updated version of the paper "Large-scale Unit Commitment under uncertainty: a literature survey" that appeared in 4OR 13(2), 115--171 (2015); this version has over 170 more citations, most of which appeared in the last three years, showing how quickly the literature on uncertain Unit Commitment evolves, and therefore how strong the interest in this subject remains.
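
    As a point of reference for the surveyed methods, a minimal deterministic Unit Commitment model can be written as a mixed-integer program. The sketch below keeps only commitment variables u, start-up indicators v, power outputs p, production costs c_i, start-up costs s_i, demands d_t, and output limits; minimum up/down times, ramping, reserves, and network constraints are omitted, so this is an illustrative skeleton under those assumptions rather than any formulation from the survey.

        \begin{align*}
        \min_{u,v,p}\quad & \sum_{t=1}^{T}\sum_{i=1}^{n} \bigl( c_i(p_{i,t}) + s_i\, v_{i,t} \bigr) \\
        \text{s.t.}\quad & \sum_{i=1}^{n} p_{i,t} = d_t, && t = 1,\dots,T, && \text{(load balance)} \\
        & \underline{p}_i\, u_{i,t} \le p_{i,t} \le \overline{p}_i\, u_{i,t}, && \forall i,\, t, && \text{(output limits when committed)} \\
        & v_{i,t} \ge u_{i,t} - u_{i,t-1}, && \forall i,\, t, && \text{(start-up indicator)} \\
        & u_{i,t},\, v_{i,t} \in \{0,1\},\quad p_{i,t} \ge 0, && \forall i,\, t.
        \end{align*}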

    Optimization of storage and picking systems in warehouses

    The rise of e-commerce demands an increase in the performance of warehousing systems, which are being redesigned to deal with a massive volume of demands to be fulfilled as fast as possible. The manual system and the robotic mobile fulfillment system (RMFS) are among the most commonly used for these activities. The former is a human-centered system that handles complex operations that current robots cannot perform. However, newer generations of autonomous robots are leading to a gradual replacement by the latter to increase productivity. Regardless of the system used, several interdependent problems have to be solved to obtain efficient storage and picking processes. Storage problems concern decisions on where to store products within the warehouse. Picking problems include the batching of orders to be fulfilled together and the routes the pickers and robots should follow to retrieve the products demanded. In the manual system, these problems are traditionally solved using simple policies that pickers can easily follow. Despite using robots, the same solution strategy is being replicated for the equivalent problems found in the RMFS. In this research, we investigate storage and picking problems faced when designing manual and RMFS warehouses. We develop optimization tools to support the decision-making process in setting up these systems and to improve their typical performance measures.
Some classic problems are solved with improved techniques, while others are integrated to be solved together instead of optimizing each subsystem sequentially. We first consider a manual system with a known set of orders and integrate storage and routing decisions. The integrated problem and some variants considering common routing policies are modeled mathematically. A general variable neighborhood search metaheuristic is presented to deal with real-size instances. Computational experiments attest to the effectiveness of the proposed metaheuristic compared to the exact models and common storage policies. When future demands are uncertain, it is common to use a zoning strategy that divides the storage area into zones and assigns the most-demanded products to the best zones. Zone sizes are to be determined. Commonly, arbitrary sizes are chosen, ignoring the characteristics of the warehouse and the demands. We approach the zone sizing problem to determine which factors are relevant to choosing better zone sizes. Data generated from exhaustive simulations are used to train four machine learning regression models - ordinary least squares, regression tree, random forest, and multilayer perceptron - to predict the optimal zone sizes given the set of relevant factors identified. We show that all trained models suggest tailor-made zone sizes with better picking performance than the arbitrary ones commonly used. Another approach to solving storage problems, both in the manual system and in the RMFS, considers the correlations between products. The idea is that products frequently demanded together should be stored close to each other to reduce routing costs. This storage policy can be modeled as a quadratic assignment problem (QAP) variant. The QAP is a classic combinatorial problem and one of the hardest to solve. We survey the most traditional QAP variants and develop a powerful parallel memetic iterated tabu search metaheuristic capable of solving them. The proposed metaheuristic is shown to be among the best-performing ones for the QAP and significantly outperforms the state of the art for its variants. The RMFS allows easy repositioning of inventory pods during operations, which can lead to a more energy-efficient picking process. We integrate pod repositioning decisions with order assignment and pod selection under a wave picking strategy, so that after a pod has been requested it is parked according to when and where it is expected to be requested next. We solve this integrated problem using stochastic programming, accounting for the uncertainty about future demands, and suggest a local search matheuristic to solve real-size instances. We show that our sample average approximation scheme is effective at simulating future demands, since our methods improve on the solutions found when waves are planned without considering the future. This thesis is structured as follows. After an introductory chapter, we present a literature review on the manual system and the RMFS and the common decisions made to set up their storage and picking processes. The next four chapters detail the studies on the integrated storage and routing problem, the zone sizing problem, the QAP, and the pod repositioning problem. Our findings are summarized in the last chapter.
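
    The correlation-based storage policy mentioned above reduces to a quadratic-assignment-style objective: products that are frequently ordered together should sit at locations that are close to each other. The sketch below evaluates that objective and improves a toy assignment with a plain pairwise-swap descent; the data, the single-aisle distance model, and the local search are illustrative assumptions, not the parallel memetic iterated tabu search developed in the thesis.

        import itertools, random

        def storage_cost(assignment, freq, dist):
            # QAP-style cost: freq[p][q] is how often products p and q appear in the same
            # order, dist[l][m] the travel distance between locations, assignment[p] the
            # location holding product p
            return sum(freq[p][q] * dist[assignment[p]][assignment[q]]
                       for p, q in itertools.combinations(range(len(assignment)), 2))

        def swap_descent(assignment, freq, dist):
            # repeatedly swap two products' locations while the cost improves
            best = storage_cost(assignment, freq, dist)
            improved = True
            while improved:
                improved = False
                for p, q in itertools.combinations(range(len(assignment)), 2):
                    assignment[p], assignment[q] = assignment[q], assignment[p]
                    cost = storage_cost(assignment, freq, dist)
                    if cost < best:
                        best, improved = cost, True
                    else:
                        assignment[p], assignment[q] = assignment[q], assignment[p]  # undo
            return assignment, best

        # toy data: 8 products, random co-demand frequencies, locations along a single aisle
        random.seed(3)
        n = 8
        freq = [[0] * n for _ in range(n)]
        for p in range(n):
            for q in range(p + 1, n):
                freq[p][q] = freq[q][p] = random.randint(0, 5)
        dist = [[abs(l - m) for m in range(n)] for l in range(n)]
        assignment, cost = swap_descent(list(range(n)), freq, dist)
        print(assignment, cost)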

    Evolutionary Algorithms for Some Problems in Telecommunications

    Advisors: Flavio Keidi Miyazawa, Mauricio Guilherme de Carvalho Resende. Doctoral thesis, Universidade Estadual de Campinas, Instituto de Computação. Abstract: Cutting and packing problems are common problems that occur in many industrial and business processes. Their optimized resolution leads to great profits in several sectors. A common problem, which occurs in the textile and paper industries, is to cut a strip of some material to obtain several small items using the minimum length of material. This problem, known as the Two-Dimensional Strip Packing Problem (2SP), is a hard combinatorial optimization problem. In this work, we present an exact algorithm for 2SP restricted to two-staged cuts (known as Two-Dimensional Level Strip Packing, 2LSP). The algorithm uses the branch-and-price technique, and heuristics based on approximation algorithms to obtain upper bounds. The algorithm obtained optimal or near-optimal solutions for small and moderately sized instances.
Abstract: In the last twenty years, telecommunication networks have experienced a huge increase in data utilization. From massive on-demand video to an uncountable number of mobile devices exchanging text and video, traffic has reached a scale that exceeds the capacity of existing networks. Therefore, telecommunication companies around the world have been forced to increase the capacity of their networks to serve this growing demand. As the cost of deploying network infrastructure is usually very large, network design relies heavily on optimization tools to keep costs as low as possible. In this thesis, we analyze several aspects of the design and deployment of communication networks. First, we present a new network design problem used to serve wireless demands from mobile devices and route the traffic to the core network. Such access networks are based on modern wireless technologies such as Wi-Fi, LTE, and HSPA. This problem has several real-world constraints and is hard to solve. We study real cases in the vicinity of a large city in the United States. Next, we present a variation of the hub location problem used to model these core networks; this problem is also suitable for modeling transportation networks. We also study the overlapping correlation clustering problem, used to model users' behavior when using their mobile devices. In this problem, one can label an object with multiple labels and analyze the connections between them. Although the problem is very generic, it is well suited to analyzing device mobility, which can be used to estimate traffic in a given region. Finally, we analyze spectrum licensing from a governmental perspective. In these cases, a government agency wants to sell licenses allowing telecommunication companies to operate over a given spectrum range; this process is usually conducted using combinatorial auctions. For all problems we propose biased random-key genetic algorithms and mixed-integer linear programming models (except for the overlapping correlation clustering problem, due to its non-linear nature). Our algorithms were able to outperform the state-of-the-art algorithms for all problems. Doctorate in Computer Science.
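
    The biased random-key genetic algorithm used for these problems keeps the evolutionary machinery problem-independent: chromosomes are vectors of keys in [0, 1], a problem-specific decoder turns keys into a solution, and crossover copies each key from the elite parent with a fixed probability. The sketch below applies the scheme to a toy path-length objective; the decoder, parameters, and objective are placeholders rather than the thesis's decoders for the network design, hub location, or auction problems.

        import math
        import random

        def decode(keys):
            # random-key decoder: sorting the keys yields a visiting order
            return sorted(range(len(keys)), key=lambda i: keys[i])

        def path_length(order, points):
            return sum(math.dist(points[a], points[b]) for a, b in zip(order, order[1:]))

        def brkga(points, pop_size=40, elite=10, mutants=5, rho=0.7, generations=150):
            # minimal biased random-key GA: elites survive, mutants are fresh random vectors,
            # and biased crossover copies each key from the elite parent with probability rho
            n = len(points)
            cost = lambda keys: path_length(decode(keys), points)
            pop = [[random.random() for _ in range(n)] for _ in range(pop_size)]
            for _ in range(generations):
                pop.sort(key=cost)
                nxt = pop[:elite]
                nxt += [[random.random() for _ in range(n)] for _ in range(mutants)]
                while len(nxt) < pop_size:
                    e, o = random.choice(pop[:elite]), random.choice(pop[elite:])
                    nxt.append([e[i] if random.random() < rho else o[i] for i in range(n)])
                pop = nxt
            return decode(min(pop, key=cost))

        # toy instance: visit 12 random points in the unit square in the decoded order
        random.seed(4)
        pts = [(random.random(), random.random()) for _ in range(12)]
        order = brkga(pts)
        print(order, round(path_length(order, pts), 3))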

    Routing and scheduling optimisation under uncertainty for engineering applications

    The thesis aims to develop a viable computational approach suitable for solving large vehicle routing and scheduling optimisation problems affected by uncertainty. The modelling framework is built upon recent advances in Stochastic Optimisation, Robust Optimisation and Distributionally Robust Optimisation. The utility of the methodology is presented on two classes of discrete optimisation problems: scheduling satellite communication, which is a variant of Machine Scheduling, and the Vehicle Routing Problem with Time Windows and Synchronised Visits. For each problem class, a practical engineering application is formulated using data coming from the real world. The significant size of the problem instances reinforced the need to apply a different computational approach to each problem class. Satellite communication is scheduled using a Mixed-Integer Programming solver. In contrast, the vehicle routing problem with synchronised visits is solved using a hybrid method that combines Iterated Local Search, Constraint Programming and the Guided Local Search metaheuristic. The featured application of scheduling satellite communication is Satellite Quantum Key Distribution for a system that consists of one spacecraft placed in Low Earth Orbit and a network of optical ground stations located in the United Kingdom. The satellite generates cryptographic keys and transmits them to individual ground stations. Each ground station should receive a number of keys in proportion to its importance in the network. As clouds containing water attenuate the signal, reliable scheduling needs to account for cloud cover predictions, which are naturally affected by uncertainty. A new uncertainty set tailored to modelling uncertainty in predictions of atmospheric phenomena is the main methodological contribution. The uncertainty set models the evolution of uncertain parameters using a multivariate vector auto-regressive time series, which preserves correlations over time and space. The problem formulation employing the new uncertainty set compares favourably to a suite of alternative models adapted from the literature, considering both the computational time and the cost-effectiveness of the schedule evaluated under the cloud cover conditions observed in the real world. The other contribution of the thesis in the satellite scheduling domain is the formulation of the Satellite Quantum Key Distribution problem. A proof of computational complexity and a thorough performance analysis of an example Satellite Quantum Key Distribution system accompany the formulation. The Home Care Scheduling and Routing Problem, whose instances are solved for the largest provider of such services in Scotland, is the application of the Vehicle Routing Problem with Time Windows and Synchronised Visits. The problem instances contain over 500 visits, around 20% of which require two carers simultaneously. Such problem instances are well beyond the scalability limitations of the exact method and considerably larger than instances of similar problems considered in the literature. The optimisation approach proposed in the thesis found effective solutions in attractive computational time (i.e., less than 30 minutes), and the solutions reduced the total travel time threefold compared to alternative schedules computed by human planners. The Essential Riskiness Index Optimisation was incorporated into the Constraint Programming model to address uncertainty in visit durations.
Besides solving large problem instances from the real world, the solution method reproduced the majority of the best results reported in the literature and strictly improved the solutions for several instances of a well-known benchmark for the Vehicle Routing Problem with Time Windows and Synchronised Visits.
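
    The uncertainty set built from a multivariate vector auto-regressive model can be illustrated with a small simulation: correlated cloud-cover forecast errors at a few hypothetical ground stations evolve as a VAR(1) process, and per-hour quantile envelopes of the sampled trajectories bound the nominal forecast. All coefficients and dimensions below are invented for illustration; the thesis fits its model to real forecast data and embeds the resulting set in a robust scheduling formulation.

        import numpy as np

        rng = np.random.default_rng(5)
        stations, horizon, samples = 3, 24, 500

        A = np.array([[0.6, 0.1, 0.0],          # cross-station persistence of forecast errors
                      [0.1, 0.5, 0.1],
                      [0.0, 0.1, 0.6]])
        sigma = 0.05 * np.eye(stations)          # innovation covariance

        # nominal cloud fraction over the horizon, identical at every station for simplicity
        nominal = 0.4 + 0.2 * np.sin(np.linspace(0, np.pi, horizon))
        nominal = np.tile(nominal, (stations, 1))

        trajectories = np.empty((samples, stations, horizon))
        for s in range(samples):
            err = np.zeros(stations)
            for t in range(horizon):
                # VAR(1) error dynamics preserve correlation across stations and over time
                err = A @ err + rng.multivariate_normal(np.zeros(stations), sigma)
                trajectories[s, :, t] = np.clip(nominal[:, t] + err, 0.0, 1.0)

        # envelope of the sampled trajectories: a simple data-driven uncertainty set
        lower = np.quantile(trajectories, 0.05, axis=0)
        upper = np.quantile(trajectories, 0.95, axis=0)
        print(lower.shape, upper.shape)          # (3, 24): per-station, per-hour bounds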