14 research outputs found

    Optimal vector quantization in terms of Wasserstein distance

    The optimal quantizer in memory-size constrained vector quantization induces a quantization error which is equal to a Wasserstein distortion. However, for the optimal (Shannon-)entropy constrained quantization error a proof of a similar identity is still missing. Relying on principal results of optimal mass transportation theory, we prove that the optimal quantization error is equal to a Wasserstein distance. Since we state the quantization problem in a very general setting, our approach includes the Rényi-α-entropy as a complexity constraint, which covers the special cases of (Shannon-)entropy constrained (α = 1) and memory-size constrained (α = 0) quantization. Additionally, we derive codecell convexity for quantizers with a finite codebook under certain distance functions. Using other methods, this regularity in codecell geometry has already been proved earlier by György and Linder.
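    For reference, the complexity constraint above can be written with the standard Rényi entropy of the quantizer's output distribution p = (p_1, ..., p_n); the notation here is assumed for illustration and is not taken from the paper:

        H_\alpha(p) = \frac{1}{1-\alpha}\,\log\sum_{i=1}^{n} p_i^{\alpha}, \qquad \alpha \notin \{0, 1\},

        H_1(p) = -\sum_{i=1}^{n} p_i \log p_i \quad (\text{Shannon entropy, } \alpha \to 1), \qquad
        H_0(p) = \log\bigl|\{\, i : p_i > 0 \,\}\bigr| \quad (\text{log of the codebook size}),

    so a bound on H_1 is the usual entropy constraint and a bound on H_0 is a memory-size constraint.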

    Quantization as Histogram Segmentation: Optimal Scalar Quantizer Design in Network Systems

    An algorithm for scalar quantizer design on discrete-alphabet sources is proposed. The proposed algorithm can be used to design fixed-rate and entropy-constrained conventional scalar quantizers, multiresolution scalar quantizers, multiple description scalar quantizers, and Wyner–Ziv scalar quantizers. The algorithm guarantees globally optimal solutions for conventional fixed-rate scalar quantizers and entropy-constrained scalar quantizers. For the other coding scenarios, the algorithm yields the best code among all codes that meet a given convexity constraint. In all cases, the algorithm run-time is polynomial in the size of the source alphabet. The algorithm derivation arises from a demonstration of the connection between scalar quantization, histogram segmentation, and the shortest path problem in a certain directed acyclic graph.
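    A minimal sketch of this idea for the simplest case covered above, conventional fixed-rate design under squared error on a sorted discrete alphabet; the function names and cost are illustrative assumptions, not the paper's implementation. Each contiguous histogram segment becomes one codecell, and the recursion below is equivalent to a shortest-path computation in a DAG whose edges are candidate cells.

        # Fixed-rate scalar quantizer design as histogram segmentation
        # (dynamic programming over contiguous cells; illustrative sketch).
        import numpy as np

        def cell_cost(x, p, i, j):
            """Squared-error distortion of representing x[i:j] (pmf p[i:j]) by its centroid."""
            w, v = p[i:j], x[i:j]
            mass = w.sum()
            if mass == 0.0:
                return 0.0
            centroid = np.dot(w, v) / mass
            return np.dot(w, (v - centroid) ** 2)

        def fixed_rate_quantizer(x, p, K):
            """Optimal K-cell quantizer for a sorted discrete source x with pmf p.
            Returns (distortion, list of cells as index pairs (i, j) meaning x[i:j])."""
            n = len(x)
            INF = float("inf")
            dp = [[INF] * (n + 1) for _ in range(K + 1)]   # dp[k][j]: best cost of covering x[0:j] with k cells
            arg = [[-1] * (n + 1) for _ in range(K + 1)]
            dp[0][0] = 0.0
            for k in range(1, K + 1):
                for j in range(1, n + 1):
                    for i in range(k - 1, j):              # last cell is x[i:j]
                        c = dp[k - 1][i] + cell_cost(x, p, i, j)
                        if c < dp[k][j]:
                            dp[k][j], arg[k][j] = c, i
            cells, j = [], n                               # trace back the optimal segmentation
            for k in range(K, 0, -1):
                i = arg[k][j]
                cells.append((i, j))
                j = i
            return dp[K][n], cells[::-1]

        # Example: 3-level quantizer for a small discrete source.
        x = np.array([0.0, 0.1, 0.2, 1.0, 1.1, 2.0, 2.2, 2.4])
        p = np.full(len(x), 1.0 / len(x))
        print(fixed_rate_quantizer(x, p, 3))

    The triple loop is polynomial in the alphabet size (with prefix sums each cell cost can be evaluated in constant time); a Lagrangian rate term added to the per-cell cost gives an entropy-constrained variant within the same segmentation framework.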

    Codecell convexity in optimal entropy-constrained vector quantization


    Optimal quantization for the one-dimensional uniform distribution with Rényi-α-entropy constraints

    We establish the optimal quantization problem for probabilities under constrained Rényi-α-entropy of the quantizers. We determine the optimal quantizers and the optimal quantization error of one-dimensional uniform distributions, including the known special cases α = 0 (restricted codebook size) and α = 1 (restricted Shannon entropy).
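    As a worked instance of the memory-size constrained special case (α = 0) under squared error, standard and not specific to the paper's general result: for X ~ U(0, 1) and n codepoints,

        D_n = \min_{c_1,\dots,c_n} \mathbb{E}\,\min_{i}\,(X - c_i)^2
            = n \int_0^{1/n} \Bigl(x - \tfrac{1}{2n}\Bigr)^{2}\, dx
            = \frac{1}{12\,n^{2}},

    attained by the n equal cells [(i-1)/n, i/n) with codepoints c_i = (2i-1)/(2n).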

    Multiresolution vector quantization

    Multiresolution source codes are data compression algorithms yielding embedded source descriptions. The decoder of a multiresolution code can build a source reproduction by decoding the embedded bit stream in part or in whole. All decoding procedures start at the beginning of the binary source description and decode some fraction of that string. Decoding a small portion of the binary string gives a low-resolution reproduction; decoding more yields a higher-resolution reproduction; and so on. Multiresolution vector quantizers are block multiresolution source codes. This paper introduces algorithms for designing fixed- and variable-rate multiresolution vector quantizers. Experiments on synthetic data demonstrate performance close to the theoretical performance limit. Experiments on natural images demonstrate performance improvements of up to 8 dB over tree-structured vector quantizers. Some of the lessons learned through multiresolution vector quantizer design lend insight into the design of more sophisticated multiresolution codes.
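    A toy illustration of the embedded-description idea, not the fixed- or variable-rate design algorithms of the paper: a coarse scalar quantizer whose output can be refined by a second stage acting on the residual. The step sizes and names are assumptions chosen for the sketch.

        # Two-stage successive-refinement (embedded) scalar quantizer sketch.
        import numpy as np

        def encode(x, coarse_step=1.0, fine_step=0.25):
            """Return (coarse, fine) indices; decoding only `coarse` gives a
            low-resolution reproduction, decoding both gives a refined one."""
            coarse = np.round(x / coarse_step).astype(int)
            residual = x - coarse * coarse_step
            fine = np.round(residual / fine_step).astype(int)
            return coarse, fine

        def decode(coarse, fine=None, coarse_step=1.0, fine_step=0.25):
            xhat = coarse * coarse_step
            if fine is not None:               # refine only if the extra bits were decoded
                xhat = xhat + fine * fine_step
            return xhat

        x = np.random.default_rng(0).normal(size=5)
        c, f = encode(x)
        print("low-resolution:", decode(c))
        print("refined       :", decode(c, f))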

    High-Resolution Scalar Quantization with Rényi Entropy Constraint

    We consider optimal scalar quantization with rth power distortion and constrained Rényi entropy of order α. For sources with absolutely continuous distributions the high-rate asymptotics of the quantizer distortion has long been known for α = 0 (fixed-rate quantization) and α = 1 (entropy-constrained quantization). These results have recently been extended to quantization with Rényi entropy constraint of order α ≥ r + 1. Here we consider the more challenging case α ∈ [−∞, 0) ∪ (0, 1), and for a large class of absolutely continuous source distributions we determine the sharp asymptotics of the optimal quantization distortion. The achievability proof is based on finding (asymptotically) optimal quantizers via the companding approach, and is thus constructive.
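    A sketch of the companding construction used in the achievability argument, in standard notation; the compressor G, the number of levels N, and the squared-error (r = 2) form of Bennett's high-resolution approximation are spelled out here for illustration, not taken from the paper's general rth power statement:

        Q_N(x) = G^{-1}\bigl(U_N(G(x))\bigr), \qquad U_N = \text{uniform $N$-level quantizer on } [0,1],

        D(Q_N) \approx \frac{1}{12\,N^{2}} \int \frac{f(x)}{G'(x)^{2}}\, dx \qquad (N \to \infty),

    so the high-rate distortion is governed by the choice of the compressor G (equivalently, the point density G'), which can then be chosen to meet the Rényi entropy constraint.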