
    Prefix Codes for Power Laws with Countable Support

    In prefix coding over an infinite alphabet, methods tailored to specific distributions generally target those that decline more quickly than a power law (e.g., Golomb coding). Particular power-law distributions, however, model many random variables encountered in practice. For such random variables, compression performance is judged via estimates of expected bits per input symbol. This correspondence introduces a family of prefix codes with an eye towards near-optimal coding of known distributions. Compression performance is precisely estimated for well-known probability distributions using these codes and using previously known prefix codes. One application of these near-optimal codes is an improved representation of rational numbers. Comment: 5 pages, 2 tables, submitted to Transactions on Information Theory
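    The specific code family introduced in the paper is not reproduced in the abstract. As an illustration of the general idea, the classic Elias gamma code is a prefix code over the positive integers whose codeword length, 2⌊log₂ n⌋ + 1 bits, makes it near-optimal for power-law distributions of the form P(n) ∝ 1/n². A minimal sketch (function names are our own, not the paper's):

```python
def elias_gamma_encode(n: int) -> str:
    """Elias gamma codeword for a positive integer, as a bit string."""
    if n < 1:
        raise ValueError("Elias gamma codes positive integers only")
    binary = bin(n)[2:]                      # binary form, no leading zeros
    return "0" * (len(binary) - 1) + binary  # unary length prefix + binary


def elias_gamma_decode(bits: str) -> tuple[int, str]:
    """Decode one codeword from the front of a bit string.

    Returns the decoded integer and the remaining bits, so codewords
    can be concatenated freely (the prefix property guarantees this).
    """
    zeros = 0
    while bits[zeros] == "0":
        zeros += 1
    value = int(bits[zeros:2 * zeros + 1], 2)
    return value, bits[2 * zeros + 1:]
```

    For example, 5 (binary 101) encodes as 00101: two zeros announce that three significant bits follow. No codeword is a prefix of another, so a concatenated stream decodes unambiguously.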

    Structured prefix codes for quantized low-shape-parameter generalized Gaussian sources

    The highly peaked, wide-tailed pdfs encountered in many image coding algorithms are often modeled using the family of generalized Gaussian (GG) pdfs. We study entropy coding of quantized GG sources using prefix codes that are highly structured, and which are therefore inexpensive to use. We provide bounds on the redundancy incurred by applying these codes to quantized GG sources. We also explore code efficiency and code choice over a wide range of GG source and quantizer parameters.
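    The abstract does not specify which structured codes are studied. A standard example of a structured prefix code used on quantized image residuals is the Golomb-Rice code with parameter M = 2^k: a quotient coded in unary followed by k remainder bits, so encoding and decoding need only shifts and masks. A minimal sketch, with our own (hypothetical) function names:

```python
def rice_encode(n: int, k: int) -> str:
    """Rice codeword (Golomb code with M = 2**k) for a nonnegative integer."""
    q = n >> k                # quotient: coded in unary as q ones and a zero
    r = n & ((1 << k) - 1)    # remainder: coded in exactly k binary digits
    remainder_bits = format(r, f"0{k}b") if k else ""
    return "1" * q + "0" + remainder_bits


def rice_decode(bits: str, k: int) -> tuple[int, str]:
    """Decode one Rice codeword; return the value and the remaining bits."""
    q = 0
    while bits[q] == "1":
        q += 1
    r = int(bits[q + 1:q + 1 + k], 2) if k else 0
    return (q << k) | r, bits[q + 1 + k:]
```

    For instance, with k = 2 the value 9 has quotient 2 and remainder 1, giving the codeword 11001. The parameter k is typically matched to the source scale, which is why code choice across quantizer and shape parameters matters.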