
    Symmetric Submodular Function Minimization Under Hereditary Family Constraints

    We present an efficient algorithm to find non-empty minimizers of a symmetric submodular function over any family of sets closed under inclusion. This includes, for example, families defined by a cardinality constraint, a knapsack constraint, a matroid independence constraint, or any combination of such constraints. Our algorithm makes $O(n^3)$ oracle calls to the submodular function, where $n$ is the cardinality of the ground set. In contrast, the problem of minimizing a general submodular function under a cardinality constraint is known to be inapproximable within $o(\sqrt{n/\log n})$ (Svitkina and Fleischer [2008]). The algorithm is similar to an algorithm of Nagamochi and Ibaraki [1998] that finds all nontrivial inclusionwise minimal minimizers of a symmetric submodular function over a ground set of cardinality $n$ using $O(n^3)$ oracle calls. Their procedure is in turn based on Queyranne's algorithm [1998] to minimize a symmetric submodular function.
    Comment: 13 pages, Submitted to SODA 201
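    Since the abstract builds on Queyranne's pendant-pair procedure, a compact sketch may help make the $O(n^3)$ oracle-call structure concrete. The code below is our own illustration, not the paper's constrained algorithm: it implements plain Queyranne minimization over nonempty proper subsets, using a graph cut function as the symmetric submodular oracle; the names `queyranne` and `pendant_pair` are ours.

```python
# Sketch of Queyranne's algorithm (1998) for minimizing a symmetric
# submodular function f over nonempty proper subsets of the ground set.
# Each "supernode" is a frozenset of original elements.

def pendant_pair(f, units):
    """Order the current supernodes so the last two form a pendant pair."""
    order = [units[0]]
    remaining = list(units[1:])
    while remaining:
        W = frozenset().union(*order)
        # Queyranne's key: pick u minimizing f(W | u) - f(u).
        u = min(remaining, key=lambda s: f(W | s) - f(s))
        order.append(u)
        remaining.remove(u)
    return order[-2], order[-1]

def queyranne(f, ground_set):
    """Return (S, f(S)) minimizing f over nonempty proper subsets."""
    units = [frozenset([v]) for v in ground_set]
    best_set, best_val = None, float("inf")
    while len(units) > 1:
        t, u = pendant_pair(f, units)
        if f(u) < best_val:      # {u} is an optimal set separating t from u
            best_set, best_val = set(u), f(u)
        units.remove(t)
        units.remove(u)
        units.append(t | u)      # merge the pendant pair and recurse
    return best_set, best_val

# Example oracle: the cut function of a graph is symmetric submodular,
# so the minimizer over nonempty proper subsets is a global minimum cut.
edges = [(0, 1), (1, 2), (2, 0), (2, 3), (3, 4), (4, 5), (5, 3)]
def cut(S):
    return sum(1 for a, b in edges if (a in S) != (b in S))

print(queyranne(cut, range(6)))  # one side of the min cut, value 1
```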

    Convex Analysis and Optimization with Submodular Functions: a Tutorial

    Set-functions appear in many areas of computer science and applied mathematics, such as machine learning, computer vision, operations research, or electrical networks. Among these set-functions, submodular functions play an important role, similar to convex functions on vector spaces. In this tutorial, the theory of submodular functions is presented, in a self-contained way, with all results shown from first principles. A good knowledge of convex analysis is assumed.
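    The analogy between submodular set-functions and convex functions that this tutorial develops runs through the Lovász extension: a set-function is submodular exactly when its Lovász extension is convex. As a small illustration (our own sketch, not code from the tutorial), the extension can be evaluated greedily by sorting the coordinates:

```python
import numpy as np

def lovasz_extension(f, w):
    """Lovász extension of a set-function f (with f(empty set) = 0) at w.

    Sort the coordinates of w in decreasing order and telescope the gains
    of f along the resulting chain of sets; the result is convex in w iff
    f is submodular. Illustrative sketch; the function name is ours.
    """
    order = np.argsort(-np.asarray(w, dtype=float))  # decreasing coordinates
    S, prev, value = frozenset(), 0.0, 0.0
    for i in order:
        S = S | {int(i)}
        cur = f(S)
        value += w[int(i)] * (cur - prev)
        prev = cur
    return value

# Example: f(S) = min(|S|, 1) is submodular; on nonnegative w its
# Lovász extension is max(w).
f = lambda S: min(len(S), 1)
print(lovasz_extension(f, [0.2, 0.7, 0.5]))  # 0.7
```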

    Differentially Private Empirical Risk Minimization with Sparsity-Inducing Norms

    Differentially private learning is concerned with preserving prediction quality while limiting the privacy impact on individuals whose information is contained in the data. We consider differentially private risk minimization problems with regularizers that induce structured sparsity. These regularizers are known to be convex, but they are often non-differentiable. We analyze the standard differentially private algorithms, such as output perturbation, Frank-Wolfe, and objective perturbation. Output perturbation is a differentially private algorithm that is known to perform well for minimizing risks that are strongly convex; previous works have derived excess risk bounds for it that are independent of the dimensionality. In this paper, we assume a particular class of convex but non-smooth regularizers that induce structured sparsity, and loss functions for generalized linear models. We also consider differentially private Frank-Wolfe algorithms to optimize the dual of the risk minimization problem. We derive excess risk bounds for both these algorithms; both bounds depend on the Gaussian width of the unit ball of the dual norm. We also show that objective perturbation of the risk minimization problem is equivalent to output perturbation of a dual optimization problem. This is the first work that analyzes the dual optimization problems of risk minimization problems in the context of differential privacy.
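    As a concrete point of reference for output perturbation in the strongly convex regime the abstract mentions (a generic sketch under the usual assumptions, not the paper's algorithm for sparsity-inducing regularizers): for an L-Lipschitz loss with a lambda-strongly-convex objective, the ERM minimizer has l2-sensitivity at most 2L/(n*lambda) (Chaudhuri, Monteleoni, and Sarwate, 2011), so Gaussian noise calibrated to that sensitivity yields (eps, delta)-differential privacy.

```python
import numpy as np

def output_perturbation(theta_hat, n, lam, L, eps, delta, rng=None):
    """Release a private version of the ERM minimizer theta_hat.

    Assumes an L-Lipschitz loss and a lam-strongly-convex objective over n
    samples, under which the l2-sensitivity of the minimizer is at most
    2*L/(n*lam). Noise scale follows the standard (eps, delta) Gaussian
    mechanism. Sketch only; parameter names are ours.
    """
    rng = rng or np.random.default_rng()
    sensitivity = 2.0 * L / (n * lam)
    sigma = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / eps
    return theta_hat + rng.normal(0.0, sigma, size=theta_hat.shape)

# Usage: privatize a trained weight vector learned from 1000 samples.
theta = np.array([0.3, -1.2, 0.8])
private_theta = output_perturbation(theta, n=1000, lam=0.1, L=1.0,
                                    eps=1.0, delta=1e-5)
```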