Breaching the 2-Approximation Barrier for Connectivity Augmentation: a Reduction to Steiner Tree
The basic goal of survivable network design is to build a cheap network that
maintains the connectivity between given sets of nodes despite the failure of a
few edges/nodes. The Connectivity Augmentation Problem (CAP) is arguably one of
the most basic problems in this area: given a k-edge-connected graph G
and a set of extra edges (links), select a minimum cardinality subset A of
links such that adding A to G increases its edge connectivity to k+1.
Intuitively, one wants to make an existing network more reliable by augmenting
it with extra edges. The best known approximation factor for this NP-hard
problem is 2, and this can be achieved with multiple approaches (the first
such result is in [Frederickson and J\'aj\'a'81]).
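As a concrete illustration (not part of the paper), the CAP feasibility condition — adding the selected links must raise the edge connectivity of G from k to k+1 — can be checked directly on tiny unit-capacity instances: the global edge connectivity equals the minimum over s-t max-flows from one fixed source s. The naive sketch below assumes undirected unit-capacity edges and uses Edmonds-Karp augmenting paths.

```python
from collections import defaultdict, deque

def max_flow(cap, s, t):
    # Edmonds-Karp on a residual-capacity dict; cap[u][v] is residual capacity.
    flow = 0
    while True:
        parent = {s: None}
        q = deque([s])
        while q and t not in parent:
            u = q.popleft()
            for v, c in cap[u].items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    q.append(v)
        if t not in parent:
            return flow
        # Unit capacities: each augmenting path carries exactly 1 unit.
        v = t
        while parent[v] is not None:
            u = parent[v]
            cap[u][v] -= 1
            cap[v][u] += 1
            v = u
        flow += 1

def edge_connectivity(nodes, edges):
    # Global edge connectivity = min over t != s of max-flow(s, t)
    # for an arbitrary fixed source s (undirected, unit capacities).
    nodes = list(nodes)
    s = nodes[0]
    best = float("inf")
    for t in nodes[1:]:
        cap = defaultdict(lambda: defaultdict(int))
        for u, v in edges:
            cap[u][v] += 1  # undirected unit edge -> capacity 1 each way
            cap[v][u] += 1
        best = min(best, max_flow(cap, s, t))
    return best

# A 4-cycle is 2-edge-connected; adding both chords yields K4 (3-edge-connected).
cycle = [(0, 1), (1, 2), (2, 3), (3, 0)]
links = [(0, 2), (1, 3)]
assert edge_connectivity(range(4), cycle) == 2
assert edge_connectivity(range(4), cycle + links) == 3
```

This only illustrates the feasibility test; finding a minimum link set is the NP-hard optimization problem the abstract discusses.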
It is known [Dinitz et al.'76] that CAP can be reduced to the case k=1,
a.k.a. the Tree Augmentation Problem (TAP), for odd k, and to the case k=2,
a.k.a. the Cactus Augmentation Problem (CacAP), for even k. Several better
than 2 approximation algorithms are known for TAP, culminating with a recent
1.458 approximation [Grandoni et al.'18]. However, for CacAP the best known
approximation is 2.
In this paper we breach the 2-approximation barrier for CacAP, hence for
CAP, by presenting a polynomial-time 1.91-approximation. Previous approaches
exploit properties of TAP that do not seem to generalize to CacAP. We instead
use a reduction to the Steiner tree problem which was previously used in
parameterized algorithms [Basavaraju et al.'14]. This reduction is not
approximation preserving, and using the current best approximation factor for
Steiner tree [Byrka et al.'13] as a black-box would not be good enough to
improve on 2. To achieve the latter goal, we ``open the box'' and exploit the
specific properties of the instances of Steiner tree arising from CacAP.
Improved Approximation for Tree Augmentation: Saving by Rewiring
The Tree Augmentation Problem (TAP) is a fundamental network design problem
in which we are given a tree and a set of additional edges, also called
\emph{links}. The task is to find a set of links, of minimum size, whose
addition to the tree leads to a 2-edge-connected graph. A long line of
results on TAP culminated in the previously best known approximation guarantee
of 1.5 achieved by a combinatorial approach due to Kortsarz and Nutov [ACM
Transactions on Algorithms 2016], and also by an SDP-based approach by Cheriyan
and Gao [Algorithmica 2017]. Moreover, an elegant LP-based
(3/2 + Δ)-approximation has also been found very recently by Fiorini,
Gro\ss, K\"onemann, and Sanit\'a [SODA 2018]. In this paper, we show that an
approximation factor below 1.5 can be achieved, by presenting a
1.458-approximation that is based on several new techniques.
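To make the TAP objective concrete (a hypothetical illustration, not from the paper): a graph is 2-edge-connected iff it is connected and bridge-free, so a minimum-cardinality feasible link set can be found on tiny instances by brute-force enumeration. The approximation algorithms in the abstract exist precisely because this enumeration is exponential.

```python
from itertools import combinations

def not_two_edge_connected(nodes, edges):
    # A graph is 2-edge-connected iff it is connected and has no bridge.
    # Naive bridge test: remove each edge in turn and re-check connectivity.
    nodes = list(nodes)
    adj = {v: [] for v in nodes}
    for i, (u, v) in enumerate(edges):
        adj[u].append((v, i))
        adj[v].append((u, i))
    def connected(skip):
        seen, stack = {nodes[0]}, [nodes[0]]
        while stack:
            u = stack.pop()
            for v, i in adj[u]:
                if i != skip and v not in seen:
                    seen.add(v)
                    stack.append(v)
        return len(seen) == len(nodes)
    if not connected(-1):
        return True
    return any(not connected(i) for i in range(len(edges)))

def min_tap(nodes, tree, links):
    # Smallest link set whose addition makes the tree 2-edge-connected
    # (exponential enumeration; only sensible for tiny instances).
    for size in range(len(links) + 1):
        for F in combinations(links, size):
            if not not_two_edge_connected(nodes, tree + list(F)):
                return list(F)
    return None

# Path 0-1-2-3 plus links; the single link closing the path suffices.
tree = [(0, 1), (1, 2), (2, 3)]
links = [(0, 3), (0, 2), (1, 3)]
assert min_tap(range(4), tree, links) == [(0, 3)]
```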
Bridging the Gap Between Tree and Connectivity Augmentation: Unified and Stronger Approaches
We consider the Connectivity Augmentation Problem (CAP), a classical problem
in the area of Survivable Network Design. It is about increasing the
edge-connectivity of a graph by one unit in the cheapest possible way. More
precisely, given a k-edge-connected graph G and a set of extra edges,
the task is to find a minimum cardinality subset of extra edges whose addition
to G makes the graph (k+1)-edge-connected. If k is odd, the problem is
known to reduce to the Tree Augmentation Problem (TAP) -- i.e., G is a
spanning tree -- for which significant progress has been achieved recently,
leading to approximation factors below 1.5 (the currently best factor is
1.458). However, advances on TAP did not carry over to CAP so far. Indeed,
only very recently, Byrka, Grandoni, and Ameli (STOC 2020) managed to obtain
the first approximation factor below 2 for CAP by presenting a
1.91-approximation algorithm based on a method that is disjoint from recent
advances for TAP.
We first bridge the gap between TAP and CAP, by presenting techniques that
allow for leveraging insights and methods from TAP to approach CAP. We then
introduce a new way to get approximation factors below 1.5, based on a new
analysis technique. Through these ingredients, we obtain a
1.393-approximation algorithm for CAP, and therefore also TAP. This leads to
the currently best approximation result for both problems in a unified way, by
significantly improving on the above-mentioned 1.91-approximation for CAP and
also the previously best approximation factor of 1.458 for TAP by Grandoni,
Kalaitzis, and Zenklusen (STOC 2018). Additionally, a feature we inherit from
recent TAP advances is that our approach can deal with the weighted setting
when the ratio between the largest and smallest cost of the extra links is
bounded, in which case we obtain approximation factors below 1.5.
Augmenting Trees to Achieve 2-Node-Connectivity
This thesis focuses on the Node-Connectivity Tree Augmentation Problem (NC-TAP), formally defined as follows. The first input of the problem is a graph G which has vertex set V and edge set E. We require |V| >= 3 to avoid degenerate cases. The edge set E is a disjoint union of two sets T and L where the subgraph (V,T) is connected and acyclic. We call the edges in T the tree edges and the edges in L are called links. The second input is a vector c in R^L, c >= 0 (a vector of nonnegative real numbers indexed by the links), which is called the cost of the links. We often refer to this graph G and cost vector c as an instance of NC-TAP. Given an instance G = (V, T U L) and c of NC-TAP, a feasible solution to that instance is a set of links F such that the graph (V, T U F) is 2-connected. The cost of a set of links F is the sum of the costs of the links in F. The goal of NC-TAP is to find a feasible solution F^* to the given instance such that the cost of F^* is minimum among all feasible solutions to the instance.
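The feasibility condition in the definition above — (V, T U F) must be 2-connected — can be checked naively on small instances: a graph on at least three vertices is 2-node-connected iff it is connected and stays connected after deleting any single vertex. The sketch below is an illustrative check under those definitions, not an algorithm from the thesis.

```python
def is_two_connected(nodes, edges):
    # (V, E) is 2-node-connected iff |V| >= 3, it is connected, and it
    # remains connected after deleting any single vertex (naive check).
    nodes = list(nodes)
    if len(nodes) < 3:
        return False
    def connected_without(removed):
        keep = [v for v in nodes if v != removed]
        adj = {v: set() for v in keep}
        for u, v in edges:
            if u != removed and v != removed:
                adj[u].add(v)
                adj[v].add(u)
        seen, stack = {keep[0]}, [keep[0]]
        while stack:
            u = stack.pop()
            for w in adj[u]:
                if w not in seen:
                    seen.add(w)
                    stack.append(w)
        return len(seen) == len(keep)
    return connected_without(None) and all(connected_without(v) for v in nodes)

# Spanning tree: path 0-1-2-3 (tree edges T). Without any link, vertex 1 is
# a cut vertex; adding the link {0, 3} closes a cycle, which is 2-connected.
T = [(0, 1), (1, 2), (2, 3)]
assert not is_two_connected(range(4), T)
assert is_two_connected(range(4), T + [(0, 3)])
```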
This thesis is mainly expository and it has two goals. First, we present the current best-known algorithms for NC-TAP. The second goal of this thesis is to explore new directions in the study of NC-TAP in the last chapter. This is an exploratory chapter where the goal is to use state-of-the-art techniques for TAP to develop an algorithm for NC-TAP with an approximation guarantee better than factor 2.
Decomposition-based methods for Connectivity Augmentation Problems
In this thesis, we study approximation algorithms for Connectivity Augmentation and related problems.
In the Connectivity Augmentation problem, one is given a base graph G=(V,E) that is k-edge-connected, and an additional set of edges that we refer to as links.
The task is to find a minimum cost subset of links F such that adding F to G makes the graph (k+1)-edge-connected.
We first study a special case when k=1, which is equivalent to the Tree Augmentation problem.
We present a breakthrough result by Adjiashvili that gives an approximation algorithm for Tree Augmentation with approximation guarantee below 2, under the assumption that the cost of every link is bounded by a constant.
The algorithm is based on an elegant decomposition-based method and uses a novel linear programming relaxation built on so-called bundle constraints.
We then present a subsequent result by Fiorini, Groß, Könemann, and Sanità, who give a (3/2 + Δ)-approximation algorithm for the same problem.
This result uses what are known as Chvátal-Gomory cuts to strengthen the linear programming relaxation used by Adjiashvili, and uses results from the theory of binet matrices to give an improved algorithm that attains a significantly better approximation ratio.
Next, we look at the special case when k=2. This case is equivalent to what is known as the Cactus Augmentation problem.
A recent result by Cecchetto, Traub and Zenklusen gives a 1.393-approximation algorithm for this problem using the same decomposition-based algorithmic framework given by Adjiashvili.
We present a slightly weaker result that uses the same ideas and obtains a somewhat larger approximation ratio for the Cactus Augmentation problem.
Next, we take a look at the integrality gap of the natural linear programming relaxation for Tree Augmentation, and present a result by Nutov that bounds this gap by 28/15.
Finally, we study the related Forest Augmentation problem that is a generalization of Tree Augmentation.
No approximation algorithm for Forest Augmentation is known that obtains an approximation ratio below 2.
We show that we can obtain a 29/15-approximation algorithm for Forest Augmentation under the assumption that the LP solution is half-integral via a reduction to Tree Augmentation.
We also study the structure of extreme points of the natural linear programming relaxation for Forest Augmentation and prove several properties that these extreme points satisfy.
Bulk-robust assignment problems: hardness, approximability and algorithms
This thesis studies robust assignment problems with focus on computational complexity. Assignment problems are well-studied combinatorial optimization problems with numerous practical applications, for instance in production planning.
Classical approaches to optimization expect the input data for a problem to be given precisely.
In contrast, real-life optimization problems are modeled using forecasts resulting in uncertain problem parameters. This fact can be taken into account using the framework of robust optimization.
An instance of the classical assignment problem is represented using a bipartite graph accompanied by a cost function. The goal is to find a minimum-cost assignment, i.e., a set of resources (edges or nodes in the graph) defining a maximum matching. Most models for robust assignment problems suggested in the literature capture only uncertainty in the costs, i.e., the task is to find an assignment minimizing the cost in a worst-case scenario. The contribution of this thesis is the introduction and investigation of the Robust Assignment Problem (RAP) which models edge and node failures while the costs are deterministic. A scenario is defined by a set of resources that may fail simultaneously.
If a scenario emerges, the corresponding resources are deleted from the graph. RAP seeks to find a set of resources of minimal cost which is robust against all possible incidents, i.e., a set of resources containing an assignment for all scenarios. In production planning for example, lack of materials needed to complete an order can be encoded as an edge failure and production line maintenance corresponds to a node failure.
The main findings of this thesis are hardness of approximation and NP-hardness results for both versions of RAP, even in the case of single edge (or node) failures. These results are complemented by approximation algorithms matching the theoretical lower bounds asymptotically. Additionally, we study a new related problem concerning k-robust matchings. A perfect matching in a graph is k-robust if the graph remains perfectly matchable after the deletion of any k matching edges from the graph. We address the following question: How many edges have to be added to a graph to make a fixed perfect matching k-robust? We show that, in general, this problem is as hard as both aforementioned variants of RAP.
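The k-robustness condition above can be made concrete with a small sketch (a hypothetical illustration, not from the thesis): in a bipartite graph, delete every size-k subset of the matching's edges and test whether a perfect matching survives, using a standard augmenting-path matching routine.

```python
from itertools import combinations

def bipartite_max_matching(left, edges):
    # Augmenting-path maximum matching; returns {left_node: right_node}.
    adj = {u: [v for (x, v) in edges if x == u] for u in left}
    match = {}  # right -> left
    def augment(u, seen):
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                if v not in match or augment(match[v], seen):
                    match[v] = u
                    return True
        return False
    for u in left:
        augment(u, set())
    return {u: v for v, u in match.items()}

def is_k_robust(left, edges, matching, k):
    # A perfect matching is k-robust if, after deleting any k of its edges
    # from the graph, the graph still has a perfect matching.
    for removed in combinations(matching.items(), k):
        rest = [e for e in edges if e not in removed]
        if len(bipartite_max_matching(left, rest)) < len(left):
            return False
    return True

# 4-cycle a-1-b-2-a: the matching {a:1, b:2} is 1-robust because the
# complementary matching {a:2, b:1} survives any single deletion.
E = [("a", 1), ("a", 2), ("b", 1), ("b", 2)]
assert is_k_robust(["a", "b"], E, {"a": 1, "b": 2}, 1)
# Without the extra edges, deleting either matching edge is fatal.
assert not is_k_robust(["a", "b"], [("a", 1), ("b", 2)], {"a": 1, "b": 2}, 1)
```

The thesis question — how many edges must be added to make a fixed matching k-robust — is the optimization version of this yes/no test.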
From an application point of view, this result implies that robustification of an existent infrastructure is not easier than designing a new one from scratch.