14 research outputs found

    Adversarially Robust Submodular Maximization under Knapsack Constraints

We propose the first adversarially robust algorithm for monotone submodular maximization under single and multiple knapsack constraints with scalable implementations in distributed and streaming settings. For a single knapsack constraint, our algorithm outputs a robust summary of almost optimal (up to polylogarithmic factors) size, from which a constant-factor approximation to the optimal solution can be constructed. For multiple knapsack constraints, our approximation is within a constant factor of the best known non-robust solution. We evaluate the performance of our algorithms by comparison to natural robustifications of existing non-robust algorithms under two objectives: 1) dominating set for large social network graphs from Facebook and Twitter collected by the Stanford Network Analysis Project (SNAP), and 2) movie recommendations on a dataset from MovieLens. Experimental results show that our algorithms give the best objective for a majority of the inputs and show strong performance even compared to offline algorithms that are given the set of removals in advance.
    Comment: To appear in KDD 201
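For orientation, here is a minimal sketch of the kind of objective and constraint this paper works with: a dominating-set coverage function maximized by the standard cost-benefit greedy under a knapsack budget. This is an illustration of the problem setup only, not the paper's robust or distributed algorithm; the toy graph, costs, and function names are hypothetical.

def dominating_set_coverage(graph, selected):
    """Number of nodes dominated: the selected nodes plus all of their neighbors."""
    covered = set()
    for v in selected:
        covered.add(v)
        covered.update(graph[v])
    return len(covered)

def cost_benefit_greedy(graph, costs, budget):
    """Repeatedly pick the element with the best marginal-gain/cost ratio
    that still fits in the knapsack budget (a standard non-robust baseline)."""
    selected, spent = [], 0.0
    remaining = set(graph)
    while True:
        current = dominating_set_coverage(graph, selected)
        best, best_ratio = None, 0.0
        for v in remaining:
            if spent + costs[v] > budget:
                continue
            gain = dominating_set_coverage(graph, selected + [v]) - current
            if gain / costs[v] > best_ratio:
                best, best_ratio = v, gain / costs[v]
        if best is None:
            return selected
        selected.append(best)
        spent += costs[best]
        remaining.remove(best)

# Toy instance: a star joined to a short path, unit costs, budget of 2.
graph = {0: [1, 2, 3], 1: [0], 2: [0], 3: [0, 4], 4: [3, 5], 5: [4]}
costs = {v: 1.0 for v in graph}
print(cost_benefit_greedy(graph, costs, budget=2.0))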

    Submodular Maximization with Nearly Optimal Approximation, Adaptivity and Query Complexity

Submodular optimization generalizes many classic problems in combinatorial optimization and has recently found a wide range of applications in machine learning (e.g., feature engineering and active learning). For many large-scale optimization problems, we are often concerned with the adaptivity complexity of an algorithm, which quantifies the number of sequential rounds where polynomially-many independent function evaluations can be executed in parallel. While low adaptivity is ideal, it is not sufficient for a distributed algorithm to be efficient, since in many practical applications of submodular optimization the number of function evaluations becomes prohibitively expensive. Motivated by these applications, we study the adaptivity and query complexity of adaptive submodular optimization. Our main result is a distributed algorithm for maximizing a monotone submodular function with a cardinality constraint $k$ that achieves a $(1-1/e-\varepsilon)$-approximation in expectation. This algorithm runs in $O(\log n)$ adaptive rounds and makes $O(n)$ calls to the function evaluation oracle in expectation. The approximation guarantee and query complexity are optimal, and the adaptivity is nearly optimal. Moreover, the number of queries is substantially less than in previous works. Last, we extend our results to the submodular cover problem to demonstrate the generality of our algorithm and techniques.
    Comment: 30 pages, Proceedings of the Thirtieth Annual ACM-SIAM Symposium on Discrete Algorithms (SODA 2019)
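To make the adaptivity measure concrete, the sketch below runs the classic sequential greedy for a cardinality constraint and counts its adaptive rounds: within one round, all marginal-gain queries depend only on the current solution and could be issued in parallel, so this baseline needs $k$ rounds, whereas the paper's algorithm needs only $O(\log n)$. This is a baseline illustration under assumed toy data, not the paper's distributed algorithm.

def greedy_cardinality(f, ground_set, k):
    """Classic sequential greedy: a (1 - 1/e)-approximation for monotone submodular f.
    Each iteration is one adaptive round, so the baseline uses k rounds in total."""
    S, rounds = [], 0
    for _ in range(k):
        rounds += 1  # the queries below depend only on the current S and can run in parallel
        gains = {e: f(S + [e]) - f(S) for e in ground_set if e not in S}
        best = max(gains, key=gains.get)
        if gains[best] <= 0:
            break
        S.append(best)
    return S, rounds

# Toy monotone submodular function: coverage of item sets (hypothetical data).
sets = {"a": {1, 2}, "b": {2, 3}, "c": {4}, "d": {1, 4, 5}}
f = lambda S: len(set().union(*(sets[e] for e in S))) if S else 0
print(greedy_cardinality(f, list(sets), k=2))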

    Submodular Maximization with Matroid and Packing Constraints in Parallel

We consider the problem of maximizing the multilinear extension of a submodular function subject to a single matroid constraint or multiple packing constraints with a small number of adaptive rounds of evaluation queries. We obtain the first algorithms with low adaptivity for submodular maximization with a matroid constraint. Our algorithms achieve a $1-1/e-\epsilon$ approximation for monotone functions and a $1/e-\epsilon$ approximation for non-monotone functions, which nearly matches the best guarantees known in the fully adaptive setting. The number of rounds of adaptivity is $O(\log^2 n/\epsilon^3)$, which is an exponential speedup over the existing algorithms. We obtain the first parallel algorithm for non-monotone submodular maximization subject to packing constraints. Our algorithm achieves a $1/e-\epsilon$ approximation using $O(\log(n/\epsilon)\log(1/\epsilon)\log(n+m)/\epsilon^2)$ parallel rounds, which is again an exponential speedup in parallel time over the existing algorithms. For monotone functions, we obtain a $1-1/e-\epsilon$ approximation in $O(\log(n/\epsilon)\log(m)/\epsilon^2)$ parallel rounds. The number of parallel rounds of our algorithm matches that of the state-of-the-art algorithm for solving packing LPs with a linear objective. Our results apply more generally to the problem of maximizing a diminishing-returns submodular (DR-submodular) function.
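Since the paper works with the multilinear extension, the sketch below shows the standard Monte Carlo estimator for it: $F(x) = \mathbb{E}[f(R(x))]$, where $R(x)$ contains element $i$ independently with probability $x_i$. Continuous algorithms for matroid and packing constraints typically access $F$ this way and then round the fractional solution. This is background only, not the paper's parallel method; the toy coverage function is an assumption.

import random

def multilinear_extension(f, x, samples=1000, rng=random):
    """Monte Carlo estimate of F(x) = E[f(R(x))], where R(x) includes each element e
    independently with probability x[e]."""
    total = 0.0
    elements = list(x)
    for _ in range(samples):
        R = [e for e in elements if rng.random() < x[e]]
        total += f(R)
    return total / samples

# Toy coverage function evaluated at a fractional point (hypothetical data).
sets = {"a": {1, 2}, "b": {2, 3}, "c": {3, 4}}
f = lambda S: len(set().union(*(sets[e] for e in S))) if S else 0
print(multilinear_extension(f, {"a": 0.5, "b": 0.5, "c": 0.5}, samples=2000))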

    Approximability of Monotone Submodular Function Maximization under Cardinality and Matroid Constraints in the Streaming Model

Maximizing a monotone submodular function under various constraints is a classical and intensively studied problem. However, in the single-pass streaming model, where the elements arrive one by one and an algorithm can store only a small fraction of the input elements, there is a large gap in our knowledge, even though several approximation algorithms have been proposed in the literature. In this work, we present the first lower bound on the approximation ratios for cardinality and matroid constraints that beat $1-\frac{1}{e}$ in the single-pass streaming model. Let $n$ be the number of elements in the stream. Then, we prove that any (randomized) streaming algorithm for a cardinality constraint with approximation ratio $\frac{2}{2+\sqrt{2}}+\varepsilon$ requires $\Omega\left(\frac{n}{K^2}\right)$ space for any $\varepsilon>0$, where $K$ is the size limit of the output set. We also prove that any (randomized) streaming algorithm for a (partition) matroid constraint with approximation ratio $\frac{K}{2K-1}+\varepsilon$ requires $\Omega\left(\frac{n}{K}\right)$ space for any $\varepsilon>0$, where $K$ is the rank of the given matroid. In addition, we give streaming algorithms for the setting where we only have a weak oracle, with which we can evaluate function values only on feasible sets. Specifically, we show weak-oracle streaming algorithms for cardinality and matroid constraints with approximation ratios $\frac{K}{2K-1}$ and $\frac{1}{2}$, respectively, whose space complexity is exponential in $K$ but independent of $n$. The former exactly matches the known inapproximability result for a cardinality constraint in the weak oracle model. The latter almost matches our lower bound of $\frac{K}{2K-1}$ for a matroid constraint, which almost settles the approximation ratio achievable by a streaming algorithm for a matroid constraint whose space complexity is independent of $n$.
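To make the single-pass model concrete, the sketch below implements the generic threshold rule that underlies many streaming algorithms in the roughly-1/2 regime these lower bounds address: an element is kept if its marginal gain clears a fixed threshold and the size budget is not exhausted. It is a standard illustration, not the paper's weak-oracle algorithm; the threshold guess and toy data are assumptions.

def threshold_stream(f, stream, K, tau):
    """Single-pass thresholding for a cardinality constraint: keep an element if its
    marginal gain is at least tau / (2K) and fewer than K elements are stored.
    When tau is close to OPT this gives roughly a 1/2-approximation; running the rule
    for geometrically spaced guesses of tau removes the need to know OPT in advance."""
    S = []
    for e in stream:
        if len(S) >= K:
            break
        if f(S + [e]) - f(S) >= tau / (2 * K):
            S.append(e)
    return S

# Toy coverage objective over a short stream (hypothetical data).
sets = {"a": {1, 2}, "b": {2, 3}, "c": {4, 5}, "d": {1}}
f = lambda S: len(set().union(*(sets[e] for e in S))) if S else 0
print(threshold_stream(f, ["a", "b", "c", "d"], K=2, tau=4.0))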