Relaxing the Irrevocability Requirement for Online Graph Algorithms
Online graph problems are considered in models where the irrevocability
requirement is relaxed. Motivated by practical examples where, for example,
there is a cost associated with building a facility and no extra cost
associated with doing it later, we consider the Late Accept model, where a
request can be accepted at a later point, but any acceptance is irrevocable.
Similarly, we also consider a Late Reject model, where an accepted request can
later be rejected, but any rejection is irrevocable (this is sometimes called
preemption). Finally, we consider the Late Accept/Reject model, where late
accepts and rejects are both allowed, but any late reject is irrevocable. For
Independent Set, the Late Accept/Reject model is necessary to obtain a constant
competitive ratio, but for Vertex Cover the Late Accept model is sufficient and
for Minimum Spanning Forest the Late Reject model is sufficient. The Matching
problem has a competitive ratio of 2, but in the Late Accept/Reject model, its
competitive ratio is 3/2.
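As a concrete point of reference for the ratio-2 baseline mentioned above (not the paper's Late Accept/Reject algorithm, whose details are not given here), a minimal sketch of greedy online matching with immediate, irrevocable accepts: accept an arriving edge exactly when both endpoints are still unmatched. The output is a maximal matching, hence within a factor 2 of the maximum matching.

```python
def greedy_online_matching(edges):
    """Accept each arriving edge iff both of its endpoints are still free.

    Accepts are immediate and irrevocable.  The result is a maximal
    matching, which is always within a factor 2 of a maximum matching.
    """
    matched = set()   # vertices already covered by an accepted edge
    matching = []     # accepted edges, in arrival order
    for u, v in edges:
        if u not in matched and v not in matched:
            matching.append((u, v))
            matched.update((u, v))
    return matching
```

On the arrival order (2, 3), (1, 2), (3, 4), this rule accepts only (2, 3) while the optimum has two edges, so the factor 2 is tight for it.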
Optimal competitiveness for Symmetric Rectilinear Steiner Arborescence and related problems
We present optimal competitive algorithms for two interrelated known problems
involving Steiner Arborescence. One is the continuous problem of the Symmetric
Rectilinear Steiner Arborescence (SRSA), studied by Berman and Coulston.
A closely related, but discrete, problem (studied separately in the past) is the
online Multimedia Content Delivery (MCD) problem on line networks, presented
originally by Papadimitriou, Ramanathan, and Rangan. Efficient content
delivery was modeled as a low-cost Steiner arborescence in the
network × time grid they defined. We study here the version studied by Charikar,
Halperin, and Motwani (who used the same problem definitions, but removed some
constraints on the inputs).
The bounds on the competitive ratios introduced separately in the above
papers are similar for the two problems: O(log N) for the continuous problem
and O(log n) for the network problem, where N is the number of terminals to
serve and n is the size of the network. The lower bounds were Omega(sqrt{log
N}) and Omega(sqrt{log n}), respectively. Berman and Coulston conjectured
that both the upper bound and the lower bound could be improved.
We disprove this conjecture and close these quadratic gaps for both problems.
We first present an O(sqrt{log n})-competitive deterministic algorithm for MCD
on the line, matching the lower bound. We then translate this algorithm into an
optimally competitive O(sqrt{log N}) algorithm for SRSA. Finally, we translate
the latter back to solve the MCD problem, this time with optimal competitiveness
even when the number of requests is small (that is, O(min{sqrt{log n},
sqrt{log N}})). We also present an Omega(sqrt[3]{log n}) lower bound on the
competitiveness of any randomized algorithm. Some of the techniques may be
useful in other contexts.
Thresholded Covering Algorithms for Robust and Max-Min Optimization
The general problem of robust optimization is this: one of several possible
scenarios will appear tomorrow, but things are more expensive tomorrow than
they are today. What should you anticipatorily buy today, so that the
worst-case cost (summed over both days) is minimized? Feige et al. and
Khandekar et al. considered the k-robust model where the possible outcomes
tomorrow are given by all demand-subsets of size k, and gave algorithms for the
set cover problem, and the Steiner tree and facility location problems in this
model, respectively.
In this paper, we give the following simple and intuitive template for
k-robust problems: "having built some anticipatory solution, if there exists a
single demand whose augmentation cost is larger than some threshold, augment
the anticipatory solution to cover this demand as well, and repeat". In this
paper we show that this template gives us improved approximation algorithms for
k-robust Steiner tree and set cover, and the first approximation algorithms for
k-robust Steiner forest, minimum-cut and multicut. All our approximation ratios
(except for multicut) are almost best possible.
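The quoted template is short enough to sketch directly. The following is a hypothetical instantiation for set cover only, with a simplification of my own: each demand is a single element, and its augmentation is the cheapest set containing it (assumed to exist). Function and variable names are illustrative, not from the paper.

```python
def thresholded_anticipatory_cover(elements, sets, threshold):
    """Sketch of the thresholded template for (k-robust) set cover.

    `sets` maps a set name to (cost, frozenset_of_elements).  While some
    single demand's cheapest augmentation costs more than `threshold`,
    buy that augmentation into the anticipatory (day-1) solution, then
    repeat; remaining demands are left to be fixed cheaply tomorrow.
    """
    bought = set()    # names of sets bought in advance
    covered = set()   # elements covered by the anticipatory solution
    while True:
        expensive = None
        for e in elements:
            if e in covered:
                continue
            # cheapest single set covering demand e tomorrow
            cost, best = min(
                (c, name) for name, (c, elems) in sets.items() if e in elems
            )
            if cost > threshold:
                expensive = best
                break
        if expensive is None:
            return bought   # every remaining demand is cheap to fix later
        bought.add(expensive)
        covered |= sets[expensive][1]
```

For example, with sets {"A": cost 10 covering {1, 2}, "B": cost 1 covering {3}} and threshold 5, only "A" is bought in advance, since element 3 can be covered for cost 1 tomorrow.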
As a by-product of our techniques, we also get algorithms for max-min
problems of the form: "given a covering problem instance, which k of the
elements are costliest to cover?"
Parameterized Analysis of Online Steiner Tree Problems
Steiner tree problems occupy a central place in the areas of both approximation and online algorithms. Many variants have been studied from the point of view of competitive analysis, and for several of these variants tight bounds are known. However, in several cases, worst-case analysis is overly pessimistic and fails to explain the relative performance of algorithms. We show how adaptive analysis can help resolve this problem. As case studies, we consider the Steiner tree problem in directed graphs and the Priority Steiner tree problem.
Approximation Algorithms for NP-Hard Problems
The workshop was concerned with the most important recent developments in the area of efficient approximation algorithms for NP-hard optimization problems, as well as with new techniques for proving intrinsic lower bounds for efficient approximation.
Improved and Deterministic Online Service with Deadlines or Delay
We consider the problem of online service with delay on a general metric
space, first presented by Azar, Ganesh, Ge and Panigrahi (STOC 2017). The best
known randomized algorithm for this problem, by Azar and Touitou (FOCS 2019),
is O(log^2 n)-competitive, where n is the number of points in the metric
space. This is also the best known result for the special case of online
service with deadlines, which is of independent interest.
In this paper, we present O(log n)-competitive deterministic algorithms
for online service with deadlines or delay, improving upon the results from
FOCS 2019. Furthermore, our algorithms are the first deterministic algorithms
for online service with deadlines or delay which apply to general metric spaces
and have sub-polynomial competitiveness. Comment: Appears in STOC 202
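For readers new to the model, a minimal sketch of the objective being competed on, movement cost plus delay penalty, may help. The names and the discrete schedule representation below are my own simplifications; the actual model is richer (the algorithm chooses service times online while pending requests accumulate delay).

```python
def schedule_cost(dist, start, services, delay):
    """Total cost of a single-server schedule: movement plus delay.

    `services` lists (point, request_arrival_time, service_time) in the
    order the server visits them; `dist` is the metric on points; `delay`
    maps a waiting time to its penalty (the identity for linear delay).
    In the deadline variant, the delay term is replaced by the hard
    constraint service_time <= deadline.
    """
    pos, total = start, 0
    for point, arrival, served_at in services:
        total += dist(pos, point)            # cost of moving the server
        total += delay(served_at - arrival)  # penalty for the wait
        pos = point
    return total
```

On the line with dist(a, b) = |a - b|, starting at 0 and serving a request at point 3 (arrived at time 0, served at time 2) and then one at point 5 (arrived at 1, served at 4) with linear delay gives cost 3 + 2 + 2 + 3 = 10.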
Greedy Algorithms for Online Survivable Network Design
In an instance of the network design problem, we are given a graph G=(V,E), an edge-cost function c:E -> R^{>= 0}, and a connectivity criterion. The goal is to find a minimum-cost subgraph H of G that meets the connectivity requirements. An important family of this class is the survivable network design problem (SNDP): given non-negative integers r_{u v} for each pair u,v in V, the solution subgraph H should contain r_{u v} edge-disjoint paths for each pair u and v.
While this problem is known to admit good approximation algorithms in the offline case, the problem is much harder in the online setting. Gupta, Krishnaswamy, and Ravi [Gupta et al., 2012] (STOC '09) were the first to consider the online survivable network design problem. They demonstrate an algorithm with competitive ratio of O(k log^3 n), where k = max_{u,v} r_{u v}. Note that the competitive ratio of the algorithm by Gupta et al. grows linearly in k. Since then, an important open problem in the online community [Naor et al., 2011; Gupta et al., 2012] has been whether the linear dependence on k can be reduced to a logarithmic dependency.
Consider an online greedy algorithm that connects every demand by adding a minimum-cost set of edges to H. Surprisingly, we show that this greedy algorithm significantly improves the competitive ratio when a congestion of 2 is allowed on the edges or when the model is stochastic. While our algorithm is fairly simple, our analysis requires a deep understanding of k-connected graphs. In particular, we prove that the greedy algorithm is O(log^2 n log k)-competitive if one satisfies every demand between u and v by r_{uv}/2 edge-disjoint paths. The spirit of our result is similar to the work of Chuzhoy and Li [Chuzhoy and Li, 2012] (FOCS '12), in which the authors give a polylogarithmic approximation algorithm for edge-disjoint paths with congestion 2.
Moreover, we study the greedy algorithm in the online stochastic setting. We consider the i.i.d. model, where each online demand is drawn from a single probability distribution, the unknown i.i.d. model, where every demand is drawn from a single but unknown probability distribution, and the prophet model, in which online demands are drawn from (possibly) different probability distributions. Through a different analysis, we prove that a similar greedy algorithm is constant competitive for the i.i.d. and the prophet models. Also, the greedy algorithm is O(log n)-competitive for the unknown i.i.d. model, which is almost tight due to the lower bound of [Garg et al., 2008] for single connectivity.
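The greedy rule analyzed above, connecting each arriving demand by a cheapest set of new edges with already-bought edges priced at zero, can be sketched for the single-connectivity case r_{uv} = 1, where the augmentation step reduces to a shortest-path computation. This sketch (names are mine) does not capture the paper's higher-connectivity analysis.

```python
import heapq

def greedy_online_steiner(n, edges, demands):
    """Greedy online network design, sketched for connectivity r_uv = 1.

    For each arriving demand (s, t), buy a cheapest s-t path in which
    already-bought edges cost 0 (Dijkstra on residual prices).  This is
    the classic greedy for online Steiner-type connectivity; the paper
    extends the idea to demands requiring many edge-disjoint paths.
    """
    adj = {i: [] for i in range(n)}
    for u, v, c in edges:
        adj[u].append((v, (u, v, c)))
        adj[v].append((u, (u, v, c)))
    bought = set()
    for s, t in demands:
        # Dijkstra from s, with bought edges free
        dist, prev, pq = {s: 0}, {}, [(0, s)]
        while pq:
            d, x = heapq.heappop(pq)
            if d > dist.get(x, float("inf")):
                continue
            for y, e in adj[x]:
                w = 0 if e in bought else e[2]
                if d + w < dist.get(y, float("inf")):
                    dist[y] = d + w
                    prev[y] = (x, e)
                    heapq.heappush(pq, (dist[y], y))
        # buy the edges along the cheapest path back from t to s
        x = t
        while x != s:
            x, e = prev[x]
            bought.add(e)
    return bought
```

On a triangle with edges (0,1) and (1,2) of cost 1 and edge (0,2) of cost 5, serving demands (0,1) and then (0,2) buys only the two cheap edges: the second demand reuses the already-bought edge (0,1) for free.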