    Partial resampling to approximate covering integer programs

    We consider column-sparse covering integer programs, a generalization of set cover, which have a long line of research on (randomized) approximation algorithms. We develop a new rounding scheme based on the Partial Resampling variant of the Lov\'{a}sz Local Lemma developed by Harris & Srinivasan (2019). This achieves an approximation ratio of $1 + \frac{\ln(\Delta_1+1)}{a_{\min}} + O\Big( \log\Big(1 + \sqrt{ \frac{\log(\Delta_1+1)}{a_{\min}} } \Big) \Big)$, where $a_{\min}$ is the minimum covering constraint and $\Delta_1$ is the maximum $\ell_1$-norm of any column of the covering matrix (whose entries are scaled to lie in $[0,1]$). When there are additional constraints on the variable sizes, we show an approximation ratio of $\ln \Delta_0 + O(\log \log \Delta_0)$ (where $\Delta_0$ is the maximum number of non-zero entries in any column of the covering matrix). These results improve asymptotically, in several different ways, over results of Srinivasan (2006) and Kolliopoulos & Young (2005). We show nearly-matching inapproximability and integrality-gap lower bounds. We also show that the rounding process leads to negative correlation among the variables, which allows us to handle multi-criteria programs.
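    As a quick numerical illustration (not from the paper), the Python sketch below evaluates the stated approximation-ratio bound for a few values of $\Delta_1$ and $a_{\min}$; the constant hidden in the $O(\cdot)$ term is not specified by the abstract, so it is exposed as a parameter c, and all logarithms are taken as natural.

```python
import math

def cip_ratio_bound(delta1: float, a_min: float, c: float = 1.0) -> float:
    """Evaluate 1 + ln(Delta_1 + 1)/a_min + c*log(1 + sqrt(log(Delta_1 + 1)/a_min)).

    c stands in for the unspecified constant hidden in the O(.) term;
    logarithms are natural, matching the ln in the leading term.
    """
    main_term = math.log(delta1 + 1) / a_min
    lower_order = c * math.log(1 + math.sqrt(main_term))
    return 1 + main_term + lower_order

if __name__ == "__main__":
    # The bound tends to 1 when a_min is large relative to log(Delta_1 + 1),
    # and is dominated by ln(Delta_1 + 1)/a_min otherwise.
    for delta1, a_min in [(10, 1), (10, 10), (1000, 1), (1000, 100)]:
        print(f"Delta_1={delta1}, a_min={a_min}: {cip_ratio_bound(delta1, a_min):.3f}")
```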

    The Moser-Tardos Framework with Partial Resampling

    The resampling algorithm of Moser & Tardos is a powerful approach to develop constructive versions of the Lov\'{a}sz Local Lemma (LLL). We generalize this to partial resampling: when a bad event holds, we resample an appropriately random subset of the variables that define this event, rather than the entire set as in Moser & Tardos. This is particularly useful when the bad events are determined by sums of random variables. This leads to several improved algorithmic applications in scheduling, graph transversals, packet routing, etc. For instance, we settle a conjecture of Szab\'{o} & Tardos (2006) on graph transversals asymptotically, and obtain improved approximation ratios for a packet routing problem of Leighton, Maggs, & Rao (1994).
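    For readers unfamiliar with the framework, here is a minimal Python sketch of the Moser-Tardos resampling loop with a naive stand-in for partial resampling (a uniformly random nonempty subset of the violated event's variables). The actual partial-resampling distribution of the paper is chosen more carefully, so this only illustrates the control flow; all function and argument names are illustrative.

```python
import random

def resampling_loop(num_vars, sample_var, bad_events, partial=False, max_rounds=10**6):
    """Generic resampling loop in the spirit of Moser & Tardos.

    num_vars   -- number of underlying independent variables
    sample_var -- sample_var(i) draws a fresh value for variable i
    bad_events -- list of (vbls, holds) pairs: vbls is the set of variable
                  indices defining the event, holds(assignment) tests it
    partial    -- if True, resample only a random nonempty subset of the
                  event's variables (a simplistic stand-in for the paper's
                  partial-resampling distribution)
    """
    assignment = [sample_var(i) for i in range(num_vars)]
    for _ in range(max_rounds):
        violated = [vbls for vbls, holds in bad_events if holds(assignment)]
        if not violated:
            return assignment  # every bad event is avoided
        vbls = sorted(random.choice(violated))
        to_resample = random.sample(vbls, random.randint(1, len(vbls))) if partial else vbls
        for i in to_resample:
            assignment[i] = sample_var(i)
    raise RuntimeError("did not converge within max_rounds")
```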

    Algorithms for covering multiple submodular constraints and applications

    We consider the problem of covering multiple submodular constraints. Given a finite ground set $N$, a weight function $w: N \rightarrow \mathbb{R}_+$, $r$ monotone submodular functions $f_1, f_2, \ldots, f_r$ over $N$, and requirements $k_1, k_2, \ldots, k_r$, the goal is to find a minimum-weight subset $S \subseteq N$ such that $f_i(S) \ge k_i$ for $1 \le i \le r$. We refer to this problem as Multi-Submod-Cover; it was recently considered by Har-Peled and Jones (Few cuts meet many point sets, CoRR abs/1808.03260, 2018), who were motivated by an application in geometry. Even with $r = 1$, Multi-Submod-Cover generalizes the well-known Submodular Set Cover problem (Submod-SC), and it can also be easily reduced to Submod-SC. A simple greedy algorithm gives an $O(\log(kr))$ approximation, where $k = \sum_i k_i$, and this ratio cannot be improved in the general case. In this paper, motivated by several concrete applications, we consider two ways to improve upon the approximation given by the greedy algorithm. First, we give a bicriteria approximation algorithm for Multi-Submod-Cover that covers each constraint to within a factor of $(1 - 1/e - \varepsilon)$ while incurring an approximation of $O(\frac{1}{\varepsilon}\log r)$ in the cost. Second, we consider the special case when each $f_i$ is obtained from a truncated coverage function and obtain an algorithm that generalizes previous work on partial set cover (Partial-SC), covering integer programs (CIPs), and multiple vertex cover constraints (Bera et al., Theoret Comput Sci 555:2–8, 2014). Both these algorithms are based on mathematical programming relaxations that avoid the limitations of the greedy algorithm. We demonstrate the implications of our algorithms and related ideas to several applications ranging from geometric covering problems to clustering with outliers. Our work highlights the utility of the high-level model and the lens of submodularity in addressing this class of covering problems.
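    The simple greedy algorithm mentioned above can be sketched via the standard reduction to Submodular Set Cover using the truncated sum $f(S) = \sum_i \min(f_i(S), k_i)$, which is monotone submodular whenever each $f_i$ is. The Python below is an illustrative sketch of that greedy (names are hypothetical, not from the paper); at each step it picks the element with the best marginal coverage per unit weight.

```python
def greedy_multi_submod_cover(ground_set, weight, functions, requirements):
    """Greedy for Multi-Submod-Cover via the reduction f(S) = sum_i min(f_i(S), k_i).

    ground_set   -- iterable of elements (the set N)
    weight       -- dict mapping each element to a positive weight w(e)
    functions    -- list of monotone submodular functions, each taking a set
    requirements -- list of thresholds k_i, aligned with functions
    Returns a feasible set S when the instance is feasible, matching the
    O(log(kr))-approximate greedy discussed in the abstract.
    """
    def covered(S):
        return sum(min(f(S), k) for f, k in zip(functions, requirements))

    target = sum(requirements)
    S, remaining = set(), set(ground_set)
    while covered(S) < target and remaining:
        base = covered(S)
        # Element with the largest marginal coverage per unit of weight.
        best = max(remaining, key=lambda e: (covered(S | {e}) - base) / weight[e])
        if covered(S | {best}) - base <= 0:
            break  # no element makes progress; remaining demand is infeasible
        S.add(best)
        remaining.remove(best)
    return S

if __name__ == "__main__":
    # Toy instance: elements 0..3 each cover some points; f_1 counts coverage of
    # {"a", "b"} (requirement 2) and f_2 counts coverage of {"c"} (requirement 1).
    sets = {0: {"a"}, 1: {"a", "b"}, 2: {"c"}, 3: {"b", "c"}}
    cov = lambda S, T: len(set().union(*(sets[e] for e in S)) & T) if S else 0
    f1, f2 = (lambda S: cov(S, {"a", "b"})), (lambda S: cov(S, {"c"}))
    print(greedy_multi_submod_cover(sets, {e: 1.0 for e in sets}, [f1, f2], [2, 1]))
```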

    A new notion of commutativity for the algorithmic Lov\'{a}sz Local Lemma

    The Lov\'{a}sz Local Lemma (LLL) is a powerful tool in probabilistic combinatorics which can be used to establish the existence of objects that satisfy certain properties. The breakthrough paper of Moser and Tardos and follow-up works revealed that the LLL has intimate connections with a class of stochastic local search algorithms for finding such desirable objects. In particular, it can be seen as a sufficient condition for this type of algorithms to converge fast. Besides conditions for existence of and fast convergence to desirable objects, one may naturally ask further questions regarding properties of these algorithms. For instance, "are they parallelizable?", "how many solutions can they output?", "what is the expected 'weight' of a solution?", etc. These questions and more have been answered for a class of LLL-inspired algorithms called commutative. In this paper we introduce a new, very natural and more general notion of commutativity (essentially matrix commutativity) which allows us to show a number of new refined properties of LLL-inspired local search algorithms with significantly simpler proofs.

    Algorithms and Generalizations for the Lovasz Local Lemma

    The Lovasz Local Lemma (LLL) is a cornerstone principle of the probabilistic method for combinatorics. It shows that one can avoid a large set of "bad events" (forbidden configurations of variables), provided certain local conditions are satisfied. The original probabilistic formulation of this principle did not give efficient algorithms. A breakthrough result of Moser & Tardos led to a framework based on resampling variables which turns nearly all applications of the LLL into efficient algorithms. We extend and generalize the algorithm of Moser & Tardos in a variety of ways. We show tighter bounds on the complexity of the Moser-Tardos algorithm, particularly its parallel form. We also give a new, faster parallel algorithm for the LLL. We show that in some cases, the Moser-Tardos algorithm can converge even though the LLL itself does not apply; we give a new criterion (comparable to the LLL) for determining when this occurs. This leads to improved bounds for k-SAT and hypergraph coloring, among other applications. We describe an extension of the Moser-Tardos algorithm based on partial resampling, and use this to obtain better bounds for problems involving sums of independent random variables, such as column-sparse packing and packet routing. We describe a variant of the partial resampling algorithm specialized to approximating column-sparse covering integer programs, a generalization of set cover. We also give hardness reductions and integrality gaps, showing that our partial-resampling-based algorithm obtains nearly optimal approximation factors. We give a variant of the Moser-Tardos algorithm for random permutations, one of the few cases of the LLL not covered by the original algorithm of Moser & Tardos. We use this to develop the first constructive algorithms for Latin transversals and hypergraph packing, including parallel algorithms. We analyze the distribution of variables induced by the Moser-Tardos algorithm. We show it has a random-like structure, which can be used to accelerate the Moser-Tardos algorithm itself as well as to cover problems such as MAX k-SAT in which we only partially avoid bad events.
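    To make the k-SAT application concrete, the sketch below specializes the standard (variable-based) Moser-Tardos algorithm to SAT: each clause is a bad event, and a violated clause triggers a resampling of its variables. This illustrates only the baseline algorithm, not the dissertation's improved convergence criteria or the MAX k-SAT extension.

```python
import random

def moser_tardos_ksat(num_vars, clauses, max_rounds=10**6):
    """Moser-Tardos specialized to k-SAT.

    clauses -- list of clauses; each clause is a list of nonzero ints, where
               literal v > 0 means variable v-1 must be True and v < 0 means
               variable -v-1 must be False.
    """
    assign = [random.random() < 0.5 for _ in range(num_vars)]

    def satisfied(clause):
        return any(assign[abs(l) - 1] == (l > 0) for l in clause)

    for _ in range(max_rounds):
        violated = [c for c in clauses if not satisfied(c)]
        if not violated:
            return assign
        # Resample every variable of one violated clause (one "bad event").
        for l in random.choice(violated):
            assign[abs(l) - 1] = random.random() < 0.5
    raise RuntimeError("did not converge within max_rounds")

if __name__ == "__main__":
    # (x1 or x2) and (not x1 or x3) and (not x2 or not x3)
    print(moser_tardos_ksat(3, [[1, 2], [-1, 3], [-2, -3]]))
```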