
    GRASP Heuristics for the stochastic weighted graph fragmentation problem

    Critical nodes play a major role in network connectivity. Identifying them is important for designing efficient strategies to prevent malware or epidemics from spreading through a network. In this context, the Stochastic Weighted Graph Fragmentation Problem (SWGFP) is a combinatorial optimization problem that belongs to the NP-complete class. Its objective is to minimize the impact of a random attack on a singleton by appropriately choosing a set of nodes to immunize given a restricted budget. In the SWGFP, it is assumed that the attack follows a known probability law and that it affects the whole connected component of the attacked node. In this thesis, a GRASP algorithm enriched with Path Relinking is developed to solve the SWGFP. Its performance is studied under three attack scenarios and compared with a GRASP variant previously developed in the literature and with a Random heuristic that picks a set of nodes uniformly at random. Computational experiments show that the algorithm based on Independent Sets developed in this thesis outperforms the other two, with lower expected loss scores and higher robustness.
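
    As a rough illustration of the approach summarized above, the following is a minimal sketch of a GRASP loop with Path Relinking for node immunization. The graph handling uses networkx, and the evaluation function, candidate-list rule and parameter names are illustrative assumptions, not the thesis's actual implementation.

```python
# Hedged sketch of GRASP with Path Relinking for node immunization,
# under assumed names: expected_loss, greedy_randomized, path_relinking.
import random
import networkx as nx

def expected_loss(G, immunized, attack_prob):
    """Expected number of nodes lost when a random attack destroys the
    whole connected component of the attacked (non-immunized) node."""
    H = G.copy()
    H.remove_nodes_from(immunized)
    loss = 0.0
    for comp in nx.connected_components(H):
        p_hit = sum(attack_prob[v] for v in comp)  # P(attack lands in comp)
        loss += p_hit * len(comp)                  # the whole component is lost
    return loss

def greedy_randomized(G, budget, alpha=0.3):
    """Build a solution by sampling from a restricted candidate list of
    high-degree nodes (an illustrative greedy criterion)."""
    chosen = set()
    while len(chosen) < budget:
        cand = sorted((v for v in G if v not in chosen),
                      key=G.degree, reverse=True)
        rcl = cand[:max(1, int(alpha * len(cand)))]
        chosen.add(random.choice(rcl))
    return chosen

def path_relinking(G, src, dst, attack_prob):
    """Walk from src toward dst one swap at a time, keeping the best set."""
    cur, best = set(src), min((set(src), set(dst)),
                              key=lambda s: expected_loss(G, s, attack_prob))
    while cur != dst:
        cur = (cur - {next(iter(cur - dst))}) | {next(iter(dst - cur))}
        if expected_loss(G, cur, attack_prob) < expected_loss(G, best, attack_prob):
            best = set(cur)
    return best

def grasp(G, budget, attack_prob, iters=30):
    best = None
    for _ in range(iters):
        sol = greedy_randomized(G, budget)
        if best is not None:
            sol = path_relinking(G, sol, best, attack_prob)
        if best is None or expected_loss(G, sol, attack_prob) < expected_loss(G, best, attack_prob):
            best = sol
    return best
```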

    Analysis and optimization of highly reliable systems

    In the field of network design, the survivability property enables the network to maintain a certain level of connectivity and quality of service under failure conditions. In this thesis, survivability aspects of communication systems are studied, and aspects of reliability and vulnerability of network design are also addressed. The contributions are three-fold. First, a Hop Constrained node Survivable Network Design Problem (HCSNDP) with optional (Steiner) nodes is modelled. This kind of problem is NP-hard. An exact integer linear model is built, focused on networks represented by graphs without rooted demands, considering costs on arcs and on Steiner nodes. In addition to the exact model, the calculation of lower and upper bounds on the optimal solution is included. The models were tested over several graphs and instances in order to validate them in cases with known solutions. An approximation algorithm is also developed to address a particular case of the SNDP: the Two Node Survivable Star Problem (2NCSP) with optional nodes. This problem also belongs to the class of NP-hard computational problems. Second, the research focuses on cascading failures and targeted/random attacks. The Graph Fragmentation Problem (GFP) is the result of a worst-case analysis of a random attack. A fixed number of individuals can be chosen for protection, and a non-protected target node immediately destroys all reachable nodes. The goal is to minimize the expected number of destroyed nodes in the network. This problem belongs to the NP-hard class. A mathematical programming formulation is introduced, together with exact resolution for small instances and lower and upper bounds on the optimal solution. In addition to exact methods, we address the GFP by several approaches: metaheuristics, approximation algorithms, polynomial-time methods for specific instances and exact methods in exponential time. Finally, the concept of separability in stochastic binary systems is introduced. Stochastic Binary Systems (SBS) represent a mathematical model of a multi-component on-off system subject to independent failures. The reliability evaluation of an SBS belongs to the NP-hard class. We fully characterize separable systems using the Hahn-Banach separation theorem for convex sets, and, using this new concept of separability together with Markov's inequality, reliability bounds are provided for arbitrary SBS.
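
    To make the SBS model concrete, here is a minimal sketch of crude Monte Carlo reliability estimation for a stochastic binary system. The structure function and component probabilities are hypothetical placeholders, not the systems studied in the thesis.

```python
# Hedged sketch: crude Monte Carlo estimation of SBS reliability. An SBS
# is given by independent component operation probabilities p[i] and a
# structure function phi mapping a binary state vector to {0, 1}.
import random

def reliability_mc(p, phi, samples=100_000):
    """Estimate P(phi(X) = 1), where X[i] ~ Bernoulli(p[i]) independently."""
    hits = 0
    for _ in range(samples):
        state = [1 if random.random() < pi else 0 for pi in p]
        hits += phi(state)
    return hits / samples

# Example: a 2-out-of-3 system (operational if at least two components work).
p = [0.9, 0.8, 0.95]
phi = lambda s: 1 if sum(s) >= 2 else 0
print(reliability_mc(p, phi))
```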

    Static reliability and resilience in dynamic systems

    Two systems are modeled in this thesis. First, we consider a multi-component stochastic monotone binary system, or SMBS for short. The reliability of an SMBS is the probability of correct operation. A statistical approximation of the system reliability, inspired by Monte Carlo methods, is provided for these systems. Then we focus on the diameter-constrained reliability model (DCR), which was originally developed for delay-sensitive applications over the Internet infrastructure. The computational complexity of the DCR is analyzed. Networks with an efficient (i.e., polynomial-time) DCR computation are offered, termed Weak graphs. Second, we model the effect of a dynamic epidemic propagation. Our first approach is to develop a SIR-based simulation, where unrealistic assumptions of the SIR model (infinite, homogeneous, fully-mixed population) are discarded. Finally, we formalize a stochastic process that counts infected individuals, and further investigate node-immunization strategies subject to a budget constraint. A combinatorial optimization problem, called the Graph Fragmentation Problem, is introduced here. The impact of a highly virulent epidemic propagation is analyzed, and we mathematically prove that the Greedy heuristic is suboptimal.
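
    As an illustration of the node-immunization setting above, here is a minimal sketch of a greedy heuristic that repeatedly protects the node whose removal most shrinks the largest connected component. The criterion is a generic one written for this summary, not necessarily the exact Greedy heuristic analyzed in the thesis.

```python
# Hedged sketch of a greedy node-immunization heuristic: repeatedly
# protect (remove) the node that minimizes the size of the largest
# remaining connected component.
import networkx as nx

def greedy_immunize(G, budget):
    H = G.copy()
    protected = []
    for _ in range(budget):
        def largest_after(v):
            R = H.copy()
            R.remove_node(v)
            return max((len(c) for c in nx.connected_components(R)), default=0)
        best = min(H.nodes, key=largest_after)
        protected.append(best)
        H.remove_node(best)
    return protected
```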

    Spatial optimization for land use allocation: accounting for sustainability concerns

    Land-use allocation has long been an important area of research in regional science. Land-use patterns are fundamental to the functions of the biosphere, creating interactions that have substantial impacts on the environment. The spatial arrangement of land uses therefore has implications for activity and travel within a region. Balancing development, economic growth, social interaction, and the protection of the natural environment is at the heart of long-term sustainability. Since land-use patterns are spatially explicit in nature, planning and management must integrate geographical information systems and spatial optimization in meaningful ways if efficiency goals and objectives are to be achieved. This article reviews spatial optimization approaches that have been relied upon to support land-use planning. Characteristics of sustainable land use, particularly compactness, contiguity, and compatibility, are discussed, and the ways in which spatial optimization techniques have addressed these characteristics are detailed. In particular, objectives and constraints in spatial optimization approaches are examined.
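
    As a small worked example of the kind of objectives and constraints reviewed above, the following sketches a tiny land-use allocation ILP with a compatibility restriction, using PuLP. The grid, suitability scores and incompatible pair are made-up illustrative data, not drawn from the article.

```python
# Hedged sketch: a tiny land-use allocation ILP with a compatibility
# constraint, in PuLP. Grid, scores and the incompatible pair are fake.
import pulp

cells = [(r, c) for r in range(3) for c in range(3)]            # 3x3 grid
uses = ["residential", "industrial", "green"]
suit = {(i, k): (3 * i[0] + 2 * i[1] + len(k)) % 7              # fake suitability
        for i in cells for k in uses}
incompatible = {("residential", "industrial")}                  # forbidden adjacency

def neighbors(i):
    r, c = i
    return [j for j in [(r + 1, c), (r, c + 1)] if j in cells]  # right/down only

prob = pulp.LpProblem("land_use_allocation", pulp.LpMaximize)
x = pulp.LpVariable.dicts("x", (cells, uses), cat="Binary")
prob += pulp.lpSum(suit[i, k] * x[i][k] for i in cells for k in uses)
for i in cells:                              # exactly one use per cell
    prob += pulp.lpSum(x[i][k] for k in uses) == 1
for i in cells:                              # compatibility on shared edges
    for j in neighbors(i):
        for a, b in incompatible:
            prob += x[i][a] + x[j][b] <= 1
            prob += x[i][b] + x[j][a] <= 1
prob.solve(pulp.PULP_CBC_CMD(msg=False))
```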

    The distance-based critical node detection problem: models and algorithms

    In the wake of terrorism and natural disasters, assessing networked systems for vulnerability to failures that arise from these events is essential to maintaining the operations of the systems. This is crucial given the heavy dependence of daily social and economic activities on networked systems such as transport, telecommunication and energy networks, as well as the interdependence of these networks. In this thesis, we explore methods to assess the vulnerability of networked systems to element failures which employ connectivity as the performance measure for vulnerability. The associated optimisation problem, termed the critical node (edge) detection problem, seeks to identify a subset of nodes (edges) of a network whose deletion (failure) optimises a network connectivity objective. Traditional connectivity measures employed in most studies of the critical node detection problem overlook the internal cohesiveness of networks and the extent of connectivity in the network. This limits the effectiveness of the developed methods in uncovering vulnerability with regard to network connectivity. Our work therefore focuses on distance-based connectivity, a fairly new class of connectivity introduced for studying the critical node detection problem to overcome the limitations of traditional connectivity measures. In Chapter 1, we provide an introduction outlining the motivations and the methods related to our study. In Chapter 2, we review the literature on the critical node detection problem as well as its application areas and related problems. Following this, we formally introduce the distance-based critical node detection problem in Chapter 3, where we propose new integer programming models for the case of hop-based distances and an efficient algorithm for the separation problems associated with the models. We also propose two families of valid inequalities. In Chapter 4, we consider the distance-based critical node detection problem using a heuristic approach in which we propose a centrality-based heuristic that employs a backbone crossover and a centrality-based neighbourhood search. In Chapter 5, we present generalisations of the methods proposed in Chapter 3 to edge-weighted graphs. We also introduce the edge-deletion version of the problem, which we term the distance-based critical edge detection problem. Throughout Chapters 3, 4 and 5, we provide computational experiments. Finally, in Chapter 6 we present conclusions as well as future research directions. Keywords: Network Vulnerability, Critical Node Detection Problem, Distance-based Connectivity, Integer Programming, Lazy Constraints, Branch-and-cut, Heuristics.
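
    As a concrete reading of the distance-based objective, the sketch below evaluates a candidate critical-node set by counting node pairs that remain connected within a hop limit after the set is deleted. The BFS-based evaluation is a generic illustration written for this summary, not one of the thesis's models.

```python
# Hedged sketch: score a candidate critical-node set S by the number of
# node pairs still connected within h hops after deleting S.
from collections import deque

def pairs_within_h(adj, S, h):
    """adj: dict node -> set of neighbours; S: nodes to delete; h: hop limit."""
    alive = [v for v in adj if v not in S]
    count = 0
    for s in alive:
        dist = {s: 0}
        q = deque([s])
        while q:                          # BFS truncated at depth h
            u = q.popleft()
            if dist[u] == h:
                continue
            for w in adj[u]:
                if w not in S and w not in dist:
                    dist[w] = dist[u] + 1
                    q.append(w)
        count += len(dist) - 1            # reachable partners of s within h
    return count // 2                     # each unordered pair counted twice
```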

    A Polyhedral Study of Mixed 0-1 Set

    We consider a variant of the well-known single node fixed charge network flow set with constant capacities. This set arises from the relaxation of more general mixed integer sets, such as lot-sizing problems with multiple suppliers. We provide a complete polyhedral characterization of the convex hull of the given set.
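
    For reference, a common textbook form of the constant-capacity single node fixed charge network flow set, of which the studied set is a variant, is shown below; the exact variant treated in the paper may differ.

```latex
% A common form of the single node fixed charge network flow set with
% constant capacity C (an assumed textbook form, not necessarily the
% paper's exact variant):
\[
  X = \Bigl\{ (x, y) \in \mathbb{R}_{+}^{n} \times \{0,1\}^{n} :
      \sum_{j=1}^{n} x_j \le b, \quad
      x_j \le C\, y_j \;\; (j = 1, \dots, n) \Bigr\}
\]
```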

    Revisiting the Evolution and Application of Assignment Problem: A Brief Overview

    The assignment problem (AP) is an incredibly challenging problem that can model many real-life situations. This paper provides a limited review of recent developments that have appeared in the literature, covering the meaning of the assignment problem as well as its solution techniques, and reviews a range of research studies on the different types of assignment problem arising in present-day real-life situations, in order to capture the variations among assignment techniques. Keywords: Assignment Problem, Quadratic Assignment, Vehicle Routing, Exact Algorithm, Bound, Heuristic.
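
    As a minimal worked example of the classical linear assignment problem discussed above, one standard route is the Hungarian-style solver in SciPy; the cost matrix here is made up.

```python
# Hedged sketch: solving a small linear assignment problem with SciPy's
# linear_sum_assignment (a Hungarian-style algorithm). Costs are made up.
import numpy as np
from scipy.optimize import linear_sum_assignment

cost = np.array([[4, 1, 3],
                 [2, 0, 5],
                 [3, 2, 2]])              # cost[i][j]: worker i on task j
rows, cols = linear_sum_assignment(cost)
print(list(zip(rows, cols)), cost[rows, cols].sum())  # optimal pairs, total cost
```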

    Traveling Salesman Problem

    This book is a collection of current research on the application of evolutionary algorithms and other optimization algorithms to solving the Traveling Salesman Problem (TSP). It brings together researchers with applications in Artificial Immune Systems, Genetic Algorithms, Neural Networks and the Differential Evolution Algorithm. Hybrid systems, such as Fuzzy Maps, Chaotic Maps and parallelized TSP, are also presented. Most importantly, this book presents both theoretical and practical applications of the TSP, and will be a vital tool for researchers and graduate students in applied Mathematics, Computing Science and Engineering.
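
    To ground the discussion, here is a minimal 2-opt local search for the TSP, one of the simplest improvement heuristics in the family the book covers; the coordinates are illustrative.

```python
# Hedged sketch: 2-opt local search for the TSP. Reversing the segment
# between positions i and k is accepted whenever it shortens the tour.
import math

def tour_length(tour, pts):
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def two_opt(pts):
    tour = list(range(len(pts)))
    improved = True
    while improved:
        improved = False
        for i in range(1, len(tour) - 1):
            for k in range(i + 1, len(tour)):
                cand = tour[:i] + tour[i:k + 1][::-1] + tour[k + 1:]
                if tour_length(cand, pts) < tour_length(tour, pts):
                    tour, improved = cand, True
    return tour

pts = [(0, 0), (3, 1), (1, 4), (5, 2), (2, 2)]   # made-up coordinates
print(two_opt(pts))
```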

    Models and heuristics for forest management with environmental restrictions

    Doctoral thesis, Statistics and Operational Research (Optimization), Universidade de Lisboa, Faculdade de Ciências, 2018. The main focus of this thesis was to develop mathematical models and methods in integer programming for solving harvest scheduling problems with environmental restrictions. Constraints on maximum clearcut area, minimum total habitat area, minimum total core area and inter-habitat connectivity were addressed for this purpose. The research is structured as a collection of three papers, each describing the study of a different forest harvest scheduling problem with respect to the environmental constraints. The problems of papers 1 and 2 aim at maximizing the net present value. A bi-objective problem is considered in paper 3, whose objectives are the maximization of the net present value and the maximization of inter-habitat connectivity. Two tree search methods, branch-and-bound and multiobjective Monte Carlo tree search, were designed specifically to solve the problems; since a time limit of 2 hours was imposed, the methods can also be used as heuristics. All harvest scheduling problems were based on the so-called cluster formulation. The proposed models and methods were tested with sixteen real and hypothetical instances ranging from small to large. The results obtained for branch-and-bound and Monte Carlo tree search show that these methods were able to find solutions for all instances. The results suggest that it is possible to address the environmental restrictions with small reductions of the net present value. With respect to the forest fragmentation caused by harvesting, the results suggest that, although clearcut size constraints tend to disperse clearcuts across the forest, compromising the development of large habitats close to each other, the proposed models, through the other environmental constraints, attempt to mitigate this effect.
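
    As a rough illustration of the cluster formulation mentioned above, the following sketches a tiny harvest scheduling ILP in PuLP, with one binary variable per (cluster, period) and a constraint that each cell is cut at most once. The stand data and net present values are made up, and the model omits the habitat and connectivity constraints of the thesis.

```python
# Hedged sketch of a cluster-formulation harvest scheduling ILP: maximize
# net present value; overlapping clusters cannot both be harvested since
# each cell may be cut at most once. Illustrative data only.
import pulp

clusters = {"A": {1, 2}, "B": {3}, "C": {2, 3}}   # cells in each cluster
periods = [1, 2]
npv = {("A", 1): 8, ("A", 2): 6, ("B", 1): 3,
       ("B", 2): 4, ("C", 1): 7, ("C", 2): 5}     # made-up NPV per period

prob = pulp.LpProblem("harvest", pulp.LpMaximize)
x = pulp.LpVariable.dicts("x", (clusters, periods), cat="Binary")
prob += pulp.lpSum(npv[c, t] * x[c][t] for c in clusters for t in periods)

cells = set().union(*clusters.values())
for cell in cells:                                 # each cell cut at most once
    prob += pulp.lpSum(x[c][t] for c in clusters if cell in clusters[c]
                       for t in periods) <= 1
prob.solve(pulp.PULP_CBC_CMD(msg=False))
```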

    Occlusion reasoning for multiple object visual tracking

    Thesis (Ph.D.)--Boston University. Occlusion reasoning for visual object tracking in uncontrolled environments is a challenging problem. It becomes significantly more difficult when dense groups of indistinguishable objects are present in the scene, causing frequent inter-object interactions and occlusions. We present several practical solutions that tackle inter-object occlusions for video surveillance applications. In particular, this thesis proposes three methods. First, we propose "reconstruction-tracking," an online multi-camera spatial-temporal data association method for tracking large groups of objects imaged at low resolution. As a variant of the well-known Multiple Hypothesis Tracker, our approach localizes the positions of objects in 3D space from possibly occluded observations in multiple camera views and performs temporal data association in 3D. Second, we develop "track linking," a class of offline batch processing algorithms for long-term occlusions, where the decision has to be made based on the observations from the entire tracking sequence. We construct a graph representation to characterize occlusion events and propose an efficient graph-based combinatorial algorithm to resolve occlusions. Third, we propose a novel Bayesian framework where detection and data association are combined into a single module and solved jointly. Almost all traditional tracking systems address the detection and data association tasks separately, in sequential order. Such a design implies that the output of the detector has to be reliable in order for the data association to work. Our framework takes advantage of the often complementary nature of the two subproblems, which not only avoids the error propagation issue from which traditional "detection-tracking" approaches suffer but also eschews common heuristics such as "non-maximum suppression" of hypotheses, by modeling the likelihood of the entire image. The thesis describes a substantial number of experiments involving challenging, notably distinct simulated and real data, including infrared and visible-light data sets that we recorded ourselves or took from publicly available sources. In these videos, the number of objects ranges from a dozen to a hundred per frame, in both monocular and multiple views. The experiments demonstrate that our approaches achieve results comparable to those of state-of-the-art approaches.
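
    As a simple point of reference for the data-association step described above, here is a gated greedy matcher between track predictions and detections. This generic baseline is written for this summary and is not the thesis's joint Bayesian method.

```python
# Hedged sketch: gated greedy data association between track predictions
# and detections, a common baseline (not the thesis's joint method).
import math

def associate(tracks, detections, gate=30.0):
    """tracks/detections: lists of (x, y); returns {track_idx: det_idx}."""
    pairs = sorted(((math.dist(t, d), i, j)
                    for i, t in enumerate(tracks)
                    for j, d in enumerate(detections)),
                   key=lambda p: p[0])
    used_t, used_d, match = set(), set(), {}
    for dist, i, j in pairs:
        if dist > gate:                   # outside the validation gate
            break
        if i not in used_t and j not in used_d:
            match[i] = j
            used_t.add(i); used_d.add(j)
    return match

print(associate([(0, 0), (50, 50)], [(2, 1), (48, 53), (100, 100)]))
```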