    Lagrangian Relaxation and Partial Cover

    Lagrangian relaxation has been used extensively in the design of approximation algorithms. This paper studies its strengths and limitations when applied to Partial Cover. Comment: 20 pages, extended abstract appeared in STACS 200
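    A minimal sketch, not taken from the paper, of how a Lagrangian relaxation of Partial Cover can be set up: the requirement that at least k elements be covered is moved into the objective with a multiplier lambda >= 0, and for every multiplier the relaxed value lower-bounds the true optimum. The tiny instance and helper names below are purely illustrative.

        from itertools import combinations

        # Illustrative Partial Cover instance: cover at least k elements
        # using sets with costs, at minimum total cost.
        sets = {"A": ({1, 2}, 1.0), "B": ({2, 3, 4}, 2.0), "C": ({4, 5}, 1.5)}
        k = 4

        def coverage(chosen):
            covered = set()
            for name in chosen:
                covered |= sets[name][0]
            return len(covered)

        def cost(chosen):
            return sum(sets[name][1] for name in chosen)

        def all_subsets():
            names = list(sets)
            for r in range(len(names) + 1):
                yield from combinations(names, r)

        # Exact optimum of Partial Cover (brute force, fine for tiny instances).
        opt = min(cost(S) for S in all_subsets() if coverage(S) >= k)

        # Lagrangian relaxation: for lam >= 0,
        #   L(lam) = min_S [ cost(S) + lam * (k - coverage(S)) ]
        # is a lower bound on opt; the Lagrangian dual maximizes L over lam.
        def lagrangian(lam):
            return min(cost(S) + lam * (k - coverage(S)) for S in all_subsets())

        dual_bound = max(lagrangian(x / 10) for x in range(51))
        print("optimum:", opt, " Lagrangian dual bound:", dual_bound)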

    Approximating Hereditary Discrepancy via Small Width Ellipsoids

    The Discrepancy of a hypergraph is the minimum attainable value, over two-colorings of its vertices, of the maximum absolute imbalance of any hyperedge. The Hereditary Discrepancy of a hypergraph, defined as the maximum discrepancy of a restriction of the hypergraph to a subset of its vertices, is a measure of its complexity. Lovasz, Spencer and Vesztergombi (1986) related the natural extension of this quantity to matrices to rounding algorithms for linear programs, and gave a determinant-based lower bound on the hereditary discrepancy. Matousek (2011) showed that this bound is tight up to a polylogarithmic factor, leaving open the question of actually computing this bound. Recent work by Nikolov, Talwar and Zhang (2013) showed a polynomial-time $\tilde{O}(\log^3 n)$-approximation to hereditary discrepancy, as a by-product of their work in differential privacy. In this paper, we give a direct, simple $O(\log^{3/2} n)$-approximation algorithm for this problem. We show that up to this approximation factor, the hereditary discrepancy of a matrix $A$ is characterized by the optimal value of a simple geometric convex program that seeks to minimize the largest $\ell_\infty$ norm of any point in an ellipsoid containing the columns of $A$. This characterization promises to be a useful tool in discrepancy theory.
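    As an illustration of the quantities being approximated (not of the paper's algorithm), the discrepancy of a matrix $A$ is the minimum of $\|Ax\|_\infty$ over colorings $x \in \{-1,+1\}^n$, and the hereditary discrepancy is the maximum of this value over restrictions to column subsets. A brute-force sketch for tiny instances, with an illustrative incidence matrix:

        from itertools import combinations, product
        import numpy as np

        def discrepancy(A):
            """disc(A) = min over x in {-1,+1}^n of the max absolute row imbalance."""
            n = A.shape[1]
            return min(np.max(np.abs(A @ np.array(x)))
                       for x in product([-1, 1], repeat=n))

        def hereditary_discrepancy(A):
            """herdisc(A) = max over nonempty column subsets S of disc(A[:, S])."""
            n = A.shape[1]
            return max(discrepancy(A[:, list(cols)])
                       for r in range(1, n + 1)
                       for cols in combinations(range(n), r))

        # Incidence matrix of a 4-cycle: rows are hyperedges, columns are vertices.
        A = np.array([[1, 1, 0, 0],
                      [0, 1, 1, 0],
                      [0, 0, 1, 1],
                      [1, 0, 0, 1]])
        print(discrepancy(A), hereditary_discrepancy(A))  # 0 and 1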

    Guaranteed bounds on the Kullback-Leibler divergence of univariate mixtures using piecewise log-sum-exp inequalities

    Information-theoretic measures such as the entropy, the cross-entropy and the Kullback-Leibler divergence between two mixture models are core primitives in many signal processing tasks. Since the Kullback-Leibler divergence of mixtures provably does not admit a closed-form formula, in practice it is either estimated using costly Monte-Carlo stochastic integration, approximated, or bounded using various techniques. We present a fast and generic method that algorithmically builds closed-form lower and upper bounds on the entropy, the cross-entropy and the Kullback-Leibler divergence of mixtures. We illustrate the versatility of the method by reporting on our experiments approximating the Kullback-Leibler divergence between univariate exponential mixtures, Gaussian mixtures, Rayleigh mixtures, and Gamma mixtures. Comment: 20 pages, 3 figures
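    For context, this is the costly Monte Carlo baseline that such closed-form bounds are meant to avoid, not the paper's method: $D_{\mathrm{KL}}(p\,\|\,q) = \mathbb{E}_{x\sim p}[\log p(x) - \log q(x)]$, estimated by sampling from $p$. The mixture parameters below are illustrative.

        import numpy as np
        from scipy.special import logsumexp

        rng = np.random.default_rng(0)

        def log_mixture_pdf(x, weights, means, stds):
            """Pointwise log-density of a univariate Gaussian mixture."""
            x = np.asarray(x)[:, None]
            comp = (np.log(weights)
                    - 0.5 * np.log(2 * np.pi * stds ** 2)
                    - 0.5 * ((x - means) / stds) ** 2)
            return logsumexp(comp, axis=1)

        def kl_monte_carlo(p, q, n_samples=100_000):
            """Estimate D_KL(p || q) by averaging log p(x) - log q(x) over x ~ p."""
            weights, means, stds = p
            comps = rng.choice(len(weights), size=n_samples, p=weights)
            x = rng.normal(means[comps], stds[comps])
            return np.mean(log_mixture_pdf(x, *p) - log_mixture_pdf(x, *q))

        p = (np.array([0.5, 0.5]), np.array([-1.0, 2.0]), np.array([0.5, 1.0]))
        q = (np.array([0.3, 0.7]), np.array([0.0, 1.5]), np.array([1.0, 1.0]))
        print("Monte Carlo estimate of KL(p || q):", kl_monte_carlo(p, q))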

    Improved Bounds on the Phase Transition for the Hard-Core Model in 2-Dimensions

    For the hard-core lattice gas model defined on independent sets weighted by an activity $\lambda$, we study the critical activity $\lambda_c(\mathbb{Z}^2)$ for the uniqueness/non-uniqueness threshold on the 2-dimensional integer lattice $\mathbb{Z}^2$. The conjectured value of the critical activity is approximately $3.796$. Until recently, the best lower bound followed from algorithmic results of Weitz (2006). Weitz presented an FPTAS for approximating the partition function for graphs of constant maximum degree $\Delta$ when $\lambda<\lambda_c(\mathbb{T}_\Delta)$, where $\mathbb{T}_\Delta$ is the infinite regular tree of degree $\Delta$. His result established a certain decay of correlations property called strong spatial mixing (SSM) on $\mathbb{Z}^2$ by proving that SSM holds on its self-avoiding walk tree $T_{\mathrm{saw}}^\sigma(\mathbb{Z}^2)$, where $\sigma=(\sigma_v)_{v\in \mathbb{Z}^2}$ and $\sigma_v$ is an ordering on the neighbors of vertex $v$. As a consequence he obtained that $\lambda_c(\mathbb{Z}^2)\geq\lambda_c(\mathbb{T}_4) = 1.675$. Restrepo et al. (2011) improved Weitz's approach for the particular case of $\mathbb{Z}^2$ and obtained that $\lambda_c(\mathbb{Z}^2)>2.388$. In this paper, we establish an upper bound for this approach, by showing that, for all $\sigma$, SSM does not hold on $T_{\mathrm{saw}}^\sigma(\mathbb{Z}^2)$ when $\lambda>3.4$. We also present a refinement of the approach of Restrepo et al. which improves the lower bound to $\lambda_c(\mathbb{Z}^2)>2.48$. Comment: 19 pages, 1 figure. Polished proofs and examples compared to earlier version
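    As a concrete illustration of the model (not of the paper's bounds): on a finite graph, the hard-core model assigns each independent set $I$ weight $\lambda^{|I|}$, with partition function $Z(\lambda)=\sum_{I}\lambda^{|I|}$. A brute-force sketch on a small grid; the grid size and activities below are chosen only for illustration.

        from itertools import product

        def grid_edges(w, h):
            """Edges of the w-by-h integer grid, vertices indexed 0..w*h-1."""
            edges = []
            for i in range(w):
                for j in range(h):
                    v = i * h + j
                    if i + 1 < w:
                        edges.append((v, (i + 1) * h + j))
                    if j + 1 < h:
                        edges.append((v, v + 1))
            return edges

        def hard_core_partition_function(w, h, lam):
            """Z(lam) = sum over independent sets I of lam^|I| (brute force)."""
            n = w * h
            edges = grid_edges(w, h)
            Z = 0.0
            for occ in product([0, 1], repeat=n):
                if all(not (occ[u] and occ[v]) for u, v in edges):
                    Z += lam ** sum(occ)
            return Z

        # Evaluate Z on a 4x4 grid at a few activities, including values near
        # the bounds discussed above.
        for lam in (1.0, 2.48, 3.4, 3.796):
            print(lam, hard_core_partition_function(4, 4, lam))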