    A polynomial oracle-time algorithm for convex integer minimization

    In this paper we consider the solution of certain convex integer minimization problems via greedy augmentation procedures. We show that a greedy augmentation procedure that employs only directions from certain Graver bases needs only polynomially many augmentation steps to solve the given problem. We extend these results to convex N-fold integer minimization problems and to convex 2-stage stochastic integer minimization problems. Finally, we present some applications of convex N-fold integer minimization problems for which our approach provides polynomial time solution algorithms. Comment: 19 pages, 1 figure
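
    A minimal sketch of the greedy augmentation idea, under assumptions not taken from the paper: the Graver basis is supplied explicitly (computing it is itself hard in general), feasibility is a simple box, and the step search only uses the fact that a convex function stays convex along any fixed direction. Names and the tiny example are illustrative, not the authors' implementation.

```python
# Sketch of greedy augmentation over a given Graver basis. f is convex,
# and feasibility is the box lower <= x <= upper componentwise.

def best_step(f, x, g, lower, upper):
    """a -> f(x + a*g) is convex, so scan a = 1, 2, ... and stop as soon
    as the value stops strictly decreasing."""
    prev, best, a = f(x), None, 1
    while True:
        y = tuple(xi + a * gi for xi, gi in zip(x, g))
        if not all(l <= yi <= u for yi, l, u in zip(y, lower, upper)):
            break
        val = f(y)
        if val >= prev:
            break
        best, prev, a = y, val, a + 1
    return best  # None if no improving feasible step along g

def greedy_augmentation(f, x0, graver_basis, lower, upper):
    """Augment until no Graver direction improves f; the paper's claim
    is that polynomially many such steps suffice."""
    x = tuple(x0)
    while True:
        candidates = []
        for g in graver_basis:
            for d in (g, tuple(-gi for gi in g)):  # try both orientations
                y = best_step(f, x, d, lower, upper)
                if y is not None:
                    candidates.append(y)
        if not candidates:
            return x
        x = min(candidates, key=f)  # greedy: best available improvement

# Toy example: minimize (x1-3)^2 + (x2+1)^2 over the line x1 = x2,
# whose Graver basis (up to sign) is {(1, 1)}.
f = lambda z: (z[0] - 3) ** 2 + (z[1] + 1) ** 2
print(greedy_augmentation(f, (0, 0), [(1, 1)], (-5, -5), (5, 5)))  # (1, 1)
```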

    Private Multiplicative Weights Beyond Linear Queries

    A wide variety of fundamental data analyses in machine learning, such as linear and logistic regression, require minimizing a convex function defined by the data. Since the data may contain sensitive information about individuals, and these analyses can leak that sensitive information, it is important to be able to solve convex minimization in a privacy-preserving way. A series of recent results show how to accurately solve a single convex minimization problem in a differentially private manner. However, the same data is often analyzed repeatedly, and little is known about solving multiple convex minimization problems with differential privacy. For simpler data analyses, such as linear queries, there are remarkable differentially private algorithms such as the private multiplicative weights mechanism (Hardt and Rothblum, FOCS 2010) that accurately answer exponentially many distinct queries. In this work, we extend these results to the case of convex minimization and show how to give accurate and differentially private solutions to *exponentially many* convex minimization problems on a sensitive dataset.
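
    The baseline mechanism referenced above (Hardt and Rothblum's private multiplicative weights for linear queries) can be sketched in a few lines. The Python below is an illustrative toy, not the paper's extension to convex minimization: the update rate eta, the threshold, and the noise scale are placeholder choices, and the careful privacy accounting is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def pmw(hist, queries, eps, eta=0.05, threshold=0.1):
    """Toy private multiplicative weights for linear queries.
    hist: true data as a normalized histogram over a finite universe;
    queries: 0/1 numpy vectors. eta/threshold/noise are placeholder
    choices; the real mechanism calibrates them and tracks privacy."""
    synth = np.full(len(hist), 1.0 / len(hist))  # uniform synthetic data
    answers = []
    for q in queries:
        noisy = q @ hist + rng.laplace(scale=1.0 / eps)
        guess = q @ synth
        if abs(noisy - guess) <= threshold:
            answers.append(guess)            # "lazy" round: answer from
        else:                                # the synthetic histogram
            answers.append(noisy)            # "update" round
            sign = 1.0 if noisy > guess else -1.0
            synth *= np.exp(eta * sign * q)  # multiplicative update
            synth /= synth.sum()             # renormalize
    return answers

# Toy usage: universe of size 4, data concentrated on the first item.
hist = np.array([0.7, 0.1, 0.1, 0.1])
queries = [np.array(q, float) for q in
           ([1, 0, 0, 0], [1, 1, 0, 0], [0, 0, 1, 1])]
print(pmw(hist, queries, eps=1.0))
```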

    A convex extension of lower semicontinuous functions defined on normal Hausdorff space

    We prove that any problem of minimization of a proper lower semicontinuous function defined on a normal Hausdorff space is canonically equivalent to a problem of minimization of a proper weak* lower semicontinuous convex function defined on a weak* compact convex subset of some dual Banach space. We establish the existence of a bijective operator between the two classes of functions which preserves the problems of minimization.
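
    The abstract does not spell out the operator. The following is a sketch of one standard construction of this kind, embedding X into the dual of C_b(X) via Dirac measures, under the simplifying assumption that f is bounded below; it is not necessarily the paper's construction.

```latex
% Sketch of one standard construction of this kind (assuming f proper,
% lsc, and bounded below; not necessarily the paper's operator).
% Embed X into the dual of C_b(X) via Dirac measures:
\[
  \delta \colon X \to C_b(X)^{*}, \quad x \mapsto \delta_x, \qquad
  K := \overline{\mathrm{co}}^{\,w^{*}} \{\delta_x : x \in X\},
\]
% K is weak* compact and convex: it lies in the unit ball of C_b(X)^*,
% which is weak* compact by Banach--Alaoglu. Extend f via its
% continuous minorants:
\[
  \hat f(\mu) := \sup\bigl\{ \langle \mu, g \rangle :
      g \in C_b(X),\ g \le f \bigr\}, \qquad \mu \in K.
\]
% As a supremum of weak*-continuous affine functionals, \hat f is convex
% and weak* lower semicontinuous; since points are closed in a normal
% Hausdorff space, an lsc f is the sup of its continuous minorants, so
% \hat f(\delta_x) = f(x), and taking the constant g = inf f gives
\[
  \inf_{x \in X} f(x) = \inf_{\mu \in K} \hat f(\mu).
\]
```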

    Precise Phase Transition of Total Variation Minimization

    Characterizing the phase transitions of convex optimizations in recovering structured signals or data is of central importance in compressed sensing, machine learning and statistics. The phase transitions of many convex optimization signal recovery methods, such as ℓ_1 minimization and nuclear norm minimization, have become well understood through research in recent years. However, rigorously characterizing the phase transition of total variation (TV) minimization in recovering sparse-gradient signals is still open. In this paper, we fully characterize the phase transition curve of TV minimization. Our proof builds on Donoho, Johnstone and Montanari's conjectured phase transition curve for the TV approximate message passing algorithm (AMP), together with the linkage between the minimax Mean Square Error of a denoising problem and the high-dimensional convex geometry of TV minimization. Comment: 6 pages
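
    The transition the abstract characterizes can also be probed empirically. A minimal sketch, not the paper's AMP-based analysis: it uses cvxpy (an assumed solver choice) to run the TV recovery program on a synthetic piecewise-constant signal and watches success flip as the number of Gaussian measurements grows; all constants are illustrative.

```python
# Empirical probe of the TV phase transition (illustrative only).
# Solves min ||Dx||_1 s.t. Ax = y for a growing number of Gaussian
# measurements m and checks whether the signal is recovered.
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(1)
n, k = 200, 10                          # signal length, ~number of jumps

# Piecewise-constant test signal with about k random jumps.
x_true = np.cumsum(rng.normal(size=n) * (rng.random(n) < k / n))

def tv_recover(m):
    A = rng.normal(size=(m, n)) / np.sqrt(m)
    y = A @ x_true
    x = cp.Variable(n)
    cp.Problem(cp.Minimize(cp.norm1(cp.diff(x))), [A @ x == y]).solve()
    rel_err = np.linalg.norm(x.value - x_true) / np.linalg.norm(x_true)
    return rel_err < 1e-3

for m in (30, 60, 90, 120):
    # Success typically flips from False to True as m crosses the
    # phase-transition point for this (n, k).
    print(m, tv_recover(m))
```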