
    Optimal vector quantization in terms of Wasserstein distance

    The optimal quantizer in memory-size constrained vector quantization induces a quantization error which is equal to a Wasserstein distortion. However, for the optimal (Shannon-)entropy constrained quantization error, a proof of a similar identity is still missing. Relying on principal results of optimal mass transportation theory, we prove that this optimal quantization error is also equal to a Wasserstein distance. Since we state the quantization problem in a very general setting, our approach covers the Rényi α-entropy as a complexity constraint, which includes the special cases of (Shannon-)entropy constrained (α = 1) and memory-size constrained (α = 0) quantization. Additionally, for certain distance functions we derive codecell convexity for quantizers with a finite codebook. Using other methods, this regularity in codecell geometry was proved earlier by György and Linder.
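
    A minimal numerical illustration of the identity (a sketch, assuming the POT library, `pip install pot`): for a memory-size constrained quantizer with a finite codebook, the quantization error of a sample coincides with the squared 2-Wasserstein distance between the empirical measure and the discrete measure the quantizer induces on the codebook.

```python
import numpy as np
import ot  # Python Optimal Transport (assumed installed: pip install pot)

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2))   # sample from the source measure
C = rng.normal(size=(8, 2))     # an arbitrary 8-point codebook

# Quantization error: mean squared distance to the nearest codeword.
d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(axis=2)
labels = d2.argmin(axis=1)
quant_err = d2[np.arange(len(X)), labels].mean()

# Squared W2 distance between the empirical measure and the measure the
# quantizer pushes forward onto the codebook (cluster masses as weights).
a = np.full(len(X), 1.0 / len(X))
b = np.bincount(labels, minlength=len(C)).astype(float) / len(X)
W2_sq = ot.emd2(a, b, ot.dist(X, C))  # cost is squared Euclidean by default

print(quant_err, W2_sq)  # agree up to solver tolerance
```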

    Constructive quantization: approximation by empirical measures

    In this article, we study the approximation of a probability measure μ on ℝ^d by its empirical measure μ̂_N, interpreted as a random quantization. As error criterion we consider an averaged p-th moment Wasserstein metric. In the case where 2p < d, we establish refined upper and lower bounds for the error: a high-resolution formula. Moreover, we provide a universal estimate based on moments, a so-called Pierce-type estimate. In particular, we show that quantization by empirical measures is of optimal order under weak assumptions. Comment: 22 pages
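
    The decay of the empirical approximation error is easy to observe numerically (a sketch in d = 1 with p = 1, where SciPy's wasserstein_distance applies; the paper's high-resolution regime 2p < d requires higher dimension and is not visible here).

```python
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(1)
ref = rng.normal(size=200_000)  # large stand-in for the target measure mu

for N in (100, 1_000, 10_000):
    # Average W1 distance between an N-point empirical measure and mu.
    errs = [wasserstein_distance(rng.normal(size=N), ref) for _ in range(20)]
    print(N, np.mean(errs))     # decays roughly like N**-0.5 in d = 1
```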

    Learning Probability Measures with respect to Optimal Transport Metrics

    We study the problem of estimating, in the sense of optimal transport metrics, a measure that is assumed to be supported on a manifold embedded in a Hilbert space. By establishing a precise connection between optimal transport metrics, optimal quantization, and learning theory, we derive new probabilistic bounds on the performance of a classic algorithm in unsupervised learning (k-means) when it is used to produce a probability measure derived from the data. In the course of the analysis, we arrive at new lower bounds, as well as probabilistic upper bounds on the convergence rate of the empirical law of large numbers, which, unlike existing bounds, are applicable to a wide class of measures. Comment: 13 pages, 2 figures. Advances in Neural Information Processing Systems, NIPS 2012
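
    The k-means construction analyzed here is straightforward to reproduce (a sketch assuming scikit-learn): the estimate of the underlying measure is the discrete measure that places the empirical cluster masses on the centroids.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
X = rng.normal(size=(2_000, 3))  # sample from the unknown measure

km = KMeans(n_clusters=16, n_init=10, random_state=0).fit(X)
centers = km.cluster_centers_
weights = np.bincount(km.labels_, minlength=16) / len(X)
# (centers, weights) is the k-means estimate of the underlying measure,
# whose Wasserstein error the paper bounds probabilistically.
```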

    Minimal geodesics along volume preserving maps, through semi-discrete optimal transport

    We introduce a numerical method for extracting minimal geodesics along the group of volume-preserving maps, equipped with the L2 metric, which, as observed by Arnold, solve Euler's equations of inviscid incompressible fluids. The method relies on the generalized polar decomposition of Brenier, numerically implemented through semi-discrete optimal transport. It is robust enough to extract non-classical, multi-valued solutions of Euler's equations, for which the flow dimension is higher than the domain dimension, a striking and unavoidable consequence of this model. Our convergence results encompass this generalized model, and our numerical experiments illustrate it for the first time in two space dimensions. Comment: 21 pages, 9 figures
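
    A crude stand-in for the semi-discrete building block (hedged: the paper's method computes Laguerre cells exactly; this sketch instead discretizes the continuous side on a grid and uses the POT library's discrete solver to approximate the optimal map from a uniform measure onto a set of particles).

```python
import numpy as np
import ot  # Python Optimal Transport (assumed installed: pip install pot)

# Uniform measure on a grid over [0,1]^2 (the "continuous" side).
n = 20
g = (np.arange(n) + 0.5) / n
grid = np.stack(np.meshgrid(g, g), axis=-1).reshape(-1, 2)
a = np.full(len(grid), 1.0 / len(grid))

# Dirac masses (the discrete side), e.g. fluid particle positions.
rng = np.random.default_rng(3)
Y = rng.uniform(size=(50, 2))
b = np.full(len(Y), 1.0 / len(Y))

plan = ot.emd(a, b, ot.dist(grid, Y))  # optimal transport plan
# Barycentric projection of the plan approximates the optimal (Brenier)
# map from the uniform measure onto the particles.
T = (plan @ Y) / plan.sum(axis=1, keepdims=True)
```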