15 research outputs found

    On the Approximability and Hardness of the Minimum Connected Dominating Set with Routing Cost Constraint

    In the problem of minimum connected dominating set with routing cost constraint, we are given a graph $G=(V,E)$, and the goal is to find the smallest connected dominating set $D$ of $G$ such that, for any two non-adjacent vertices $u$ and $v$ in $G$, the number of internal nodes on the shortest path between $u$ and $v$ in the subgraph of $G$ induced by $D \cup \{u,v\}$ is at most $\alpha$ times that in $G$. For general graphs, the only known previous approximability result is an $O(\log n)$-approximation algorithm ($n=|V|$) for $\alpha = 1$ by Ding et al. For any constant $\alpha > 1$, we give an $O(n^{1-\frac{1}{\alpha}}(\log n)^{\frac{1}{\alpha}})$-approximation algorithm. When $\alpha \geq 5$, we give an $O(\sqrt{n}\log n)$-approximation algorithm. Finally, we prove that, when $\alpha = 2$, unless $NP \subseteq DTIME(n^{\mathrm{poly}\log n})$, for any constant $\epsilon > 0$, the problem admits no polynomial-time $2^{\log^{1-\epsilon} n}$-approximation algorithm, improving upon the $\Omega(\log n)$ bound by Du et al. (albeit under a stronger hardness assumption).
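
    To make the routing cost constraint concrete, here is a minimal, purely illustrative check (not from the paper; the graph encoding, helper names, and BFS routine are assumptions) that verifies whether a candidate connected dominating set D satisfies the $\alpha$ condition by comparing shortest paths in $G$ and in $G[D \cup \{u,v\}]$:

```python
from collections import deque
from itertools import combinations

def bfs_dist(adj, source, target):
    """Length in edges of a shortest source-target path, or None if unreachable."""
    seen, queue = {source: 0}, deque([source])
    while queue:
        x = queue.popleft()
        if x == target:
            return seen[x]
        for y in adj[x]:
            if y not in seen:
                seen[y] = seen[x] + 1
                queue.append(y)
    return None

def satisfies_routing_constraint(adj, D, alpha):
    """Check the alpha routing-cost condition for a candidate dominating set D.

    adj: dict mapping each vertex to a set of neighbours (the graph G).
    For every non-adjacent pair u, v, the number of internal nodes on a
    shortest u-v path in G[D ∪ {u, v}] must be at most alpha times the
    number of internal nodes on a shortest u-v path in G (a path with
    L edges has L - 1 internal nodes).
    """
    for u, v in combinations(adj, 2):
        if v in adj[u]:
            continue                      # adjacent pairs are unconstrained
        d_G = bfs_dist(adj, u, v)
        if d_G is None:
            continue                      # u and v are disconnected in G
        allowed = set(D) | {u, v}
        sub = {x: adj[x] & allowed for x in allowed}
        d_D = bfs_dist(sub, u, v)
        if d_D is None or (d_D - 1) > alpha * (d_G - 1):
            return False
    return True
```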

    Almost Stable Matchings by Truncating the Gale–Shapley Algorithm

    We show that the ratio of matched individuals to blocking pairs grows linearly with the number of propose–accept rounds executed by the Gale–Shapley algorithm for the stable marriage problem. Consequently, the participants can arrive at an almost stable matching even without full information about the problem instance; for each participant, knowing only its local neighbourhood is enough. In distributed-systems parlance, this means that if each person has only a constant number of acceptable partners, an almost stable matching emerges after a constant number of synchronous communication rounds. We apply our results to give a distributed (2 + ε)-approximation algorithm for maximum-weight matching in bicoloured graphs and a centralised randomised constant-time approximation scheme for estimating the size of a stable matching.
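
    As a rough illustration of the truncation idea (a sketch under assumed dict-based preference lists and helper names, not the authors' implementation), one can run only a fixed number of synchronous propose–accept rounds and return the resulting partial matching; blocking pairs can then be counted against it:

```python
def truncated_gale_shapley(men_prefs, women_prefs, rounds):
    """Run at most `rounds` synchronous propose-accept rounds of Gale-Shapley.

    men_prefs / women_prefs: dicts mapping each person to a list of
    acceptable partners, most preferred first. Returns a partial matching
    as a dict {man: woman}; the matching need not be stable.
    """
    rank = {w: {m: i for i, m in enumerate(p)} for w, p in women_prefs.items()}
    next_idx = {m: 0 for m in men_prefs}      # next preference each man will try
    engaged = {}                              # woman -> man
    free_men = set(men_prefs)
    for _ in range(rounds):
        proposals = {}                        # woman -> list of proposing men
        for m in list(free_men):
            if next_idx[m] < len(men_prefs[m]):
                w = men_prefs[m][next_idx[m]]
                next_idx[m] += 1
                proposals.setdefault(w, []).append(m)
        if not proposals:
            break
        for w, suitors in proposals.items():
            candidates = [m for m in suitors if m in rank[w]]
            if w in engaged:
                candidates.append(engaged[w])
            if not candidates:
                continue
            best = min(candidates, key=lambda m: rank[w][m])
            if engaged.get(w) not in (None, best):
                free_men.add(engaged[w])      # previous partner becomes free again
            engaged[w] = best
            free_men.discard(best)
    return {m: w for w, m in engaged.items()}
```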

    Approximating set multi-covers

    Johnson, Lovász, and Stein proved independently that any hypergraph satisfies $\tau \leq (1+\ln \Delta)\tau^{\ast}$, where $\tau$ is the transversal number, $\tau^{\ast}$ is its fractional version, and $\Delta$ denotes the maximum degree. We prove $\tau_f \leq c\,\tau^{\ast}\max\{\ln \Delta, f\}$ for the $f$-fold transversal number $\tau_f$. Similarly to Johnson, Lovász, and Stein, we also show that this bound can be achieved non-probabilistically, using a greedy algorithm. As a combinatorial application, we prove an estimate on how fast $\tau_f/f$ converges to $\tau^{\ast}$. As a geometric application, we obtain an upper bound on the minimal density of an $f$-fold covering of the $d$-dimensional Euclidean space by translates of any convex body.
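
    The greedy step referred to above is the standard one for set multi-cover; the following minimal sketch (illustrative encoding and names, not the paper's construction) repeatedly picks the vertex hitting the most hyperedges whose demand of $f$ hits is not yet met, allowing vertices to be picked with multiplicity:

```python
def greedy_multi_transversal(edges, f):
    """Greedy multiset of vertices hitting every hyperedge at least f times.

    edges: list of non-empty sets of vertices (the hyperedges).
    Vertices may be chosen with multiplicity, as in the f-fold transversal.
    Returns a dict {vertex: multiplicity}.
    """
    demand = [f] * len(edges)                 # remaining hits each edge still needs
    vertices = set().union(*edges) if edges else set()
    picked = {}
    while any(demand):
        # vertex contained in the largest number of edges that still need hits
        best = max(vertices,
                   key=lambda v: sum(1 for i, e in enumerate(edges)
                                     if v in e and demand[i] > 0))
        picked[best] = picked.get(best, 0) + 1
        for i, e in enumerate(edges):
            if best in e and demand[i] > 0:
                demand[i] -= 1
    return picked
```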

    Approximate Clustering via Metric Partitioning

    In this paper we consider two metric covering/clustering problems: the Minimum Cost Covering Problem (MCC) and $k$-clustering. In the MCC problem, we are given two point sets $X$ (clients) and $Y$ (servers), and a metric on $X \cup Y$. We would like to cover the clients by balls centered at the servers. The objective function to minimize is the sum of the $\alpha$-th powers of the radii of the balls. Here $\alpha \geq 1$ is a parameter of the problem (but not of a problem instance). MCC is closely related to the $k$-clustering problem. The main difference between $k$-clustering and MCC is that in $k$-clustering one needs to select $k$ balls to cover the clients. For any $\epsilon > 0$, we describe quasi-polynomial time $(1+\epsilon)$-approximation algorithms for both of the problems. However, in the case of $k$-clustering the algorithm uses $(1+\epsilon)k$ balls. Prior to our work, a $3^{\alpha}$- and a $c^{\alpha}$-approximation were achieved by polynomial-time algorithms for MCC and $k$-clustering, respectively, where $c > 1$ is an absolute constant. These two problems are thus interesting examples of metric covering/clustering problems that admit a $(1+\epsilon)$-approximation (using $(1+\epsilon)k$ balls in the case of $k$-clustering), if one is willing to settle for quasi-polynomial time. In contrast, for the variant of MCC where $\alpha$ is part of the input, we show under standard assumptions that no polynomial-time algorithm can achieve an approximation factor better than $O(\log |X|)$ for $\alpha \geq \log |X|$.
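
    The MCC objective can be made concrete with a short sketch (illustrative encoding, not from the paper): once every client is assigned to a server, each server's ball must reach its farthest assigned client, and the cost is the sum of the radii raised to the $\alpha$-th power.

```python
def mcc_cost(assignment, dist, alpha):
    """Cost of a covering: sum over used servers of (ball radius) ** alpha.

    assignment: dict {client: server} saying which server's ball covers the client.
    dist: function dist(p, q) returning the metric distance between two points.
    The ball centered at a server must reach its farthest assigned client,
    so its radius is the maximum distance to the clients assigned to it.
    """
    radius = {}
    for client, server in assignment.items():
        radius[server] = max(radius.get(server, 0.0), dist(client, server))
    return sum(r ** alpha for r in radius.values())
```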

    Fault Tolerant Max-Cut

    In this work, we initiate the study of fault tolerant Max-Cut, where given an edge-weighted undirected graph $G = (V,E)$, the goal is to find a cut $S \subseteq V$ that maximizes the total weight of edges that cross $S$ even after an adversary removes $k$ vertices from $G$. We consider two types of adversaries: an adaptive adversary that sees the outcome of the random coin tosses used by the algorithm, and an oblivious adversary that does not. For any constant number of failures $k$ we present an approximation of $(0.878-\epsilon)$ against an adaptive adversary and of $\alpha_{GW} \approx 0.8786$ against an oblivious adversary (here $\alpha_{GW}$ is the approximation achieved by the random hyperplane algorithm of [Goemans-Williamson, J. ACM '95]). Additionally, we present a hardness of approximation of $\alpha_{GW}$ against both types of adversaries, rendering our results (virtually) tight. The non-linear nature of the fault tolerant objective makes the design and analysis of algorithms harder when compared to the classic Max-Cut. Hence, we employ approaches ranging from multi-objective optimization to LP duality and the ellipsoid algorithm to obtain our results.
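
    To spell out the fault tolerant objective, here is a brute-force sketch (exponential in k, purely illustrative and not the paper's method): the value of a cut S is the minimum, over all k-subsets of vertices the adversary may delete, of the total weight of edges that still cross S.

```python
from itertools import combinations

def fault_tolerant_cut_value(vertices, weighted_edges, S, k):
    """Worst-case weight of cut S after an adversary deletes k vertices.

    weighted_edges: iterable of (u, v, w) triples with nonnegative weight w.
    S: set of vertices on one side of the cut.
    Enumerates all k-subsets of vertices, so this is only feasible for tiny k.
    """
    S = set(S)
    worst = float("inf")
    for removed in combinations(vertices, k):
        gone = set(removed)
        value = sum(w for u, v, w in weighted_edges
                    if u not in gone and v not in gone and ((u in S) != (v in S)))
        worst = min(worst, value)
    return worst
```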

    Publications and Talks 2006 by the Members of the Faculty of Informatics (Fakultät für Informatik)
