
    Lossy Source Coding with Gaussian or Erased Side-Information

    In this paper we find properties that are shared between two seemingly unrelated lossy source coding setups with side information. The first setup is when the source and side information are jointly Gaussian and the distortion measure is quadratic. The second setup is when the side information is an erased version of the source. We begin with the observation that in both these cases the Wyner-Ziv and conditional rate-distortion functions are equal. We further find that there is a continuum of optimal strategies for the conditional rate-distortion problem in both these setups. Next, we consider the case when there are two decoders with access to different side-information sources. For the case when the encoder has access to the side information, we establish bounds on the rate-distortion function and a sufficient condition for tightness. Under this condition, we find a characterization of the rate-distortion function for physically degraded side information. This characterization holds for both the Gaussian and erasure setups.
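    To make the first observation concrete, the jointly Gaussian case with quadratic distortion admits a standard closed form; the expression below is a well-known identity from the Wyner-Ziv literature, stated here for illustration rather than quoted from the paper:

        R_{\mathrm{WZ}}(D) \;=\; R_{X|Y}(D) \;=\; \tfrac{1}{2}\log^{+}\!\frac{\sigma^{2}_{X|Y}}{D},

    where \sigma^{2}_{X|Y} is the conditional variance of the source X given the side information Y and \log^{+}(t) = \max\{\log t,\, 0\}. The erased-side-information setup exhibits the same coincidence of the two rate-distortion functions, which is the paper's starting point.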

    Nonasymptotic noisy lossy source coding

    This paper shows new general nonasymptotic achievability and converse bounds and performs their dispersion analysis for the lossy compression problem in which the compressor observes the source through a noisy channel. While this problem is asymptotically equivalent to a noiseless lossy source coding problem with a modified distortion function, nonasymptotically there is a noticeable gap in how fast their minimum achievable coding rates approach the common rate-distortion function, as evidenced both by the refined asymptotic analysis (dispersion) and the numerical results. The size of the gap between the dispersions of the noisy problem and the asymptotically equivalent noiseless problem depends on the stochastic variability of the channel through which the compressor observes the source. Comment: IEEE Transactions on Information Theory, 201
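    The asymptotic equivalence mentioned above relies on a modified (surrogate) distortion measure; a standard way to write it, included here as background rather than as the paper's notation, is

        \bar{d}(z, \hat{x}) \;=\; \mathbb{E}\!\left[ d(X, \hat{x}) \,\middle|\, Z = z \right],

    where X is the source, Z is the noisy observation available to the compressor, and d is the original distortion measure; compressing Z under \bar{d} matches the noisy problem asymptotically, while the nonasymptotic gap quantified in the paper separates the two.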

    Source Coding with Fixed Lag Side Information

    We consider source coding with fixed-lag side information at the decoder. We focus on the special case of perfect side information with unit lag, corresponding to source coding with feedforward (the dual of channel coding with feedback) introduced by Pradhan. We use this duality to develop a linear-complexity algorithm which achieves the rate-distortion bound for any memoryless finite-alphabet source and distortion measure. Comment: 10 pages, 3 figures
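    As a point of reference, the unit-lag feedforward setting can be formalized as follows (a standard formulation, used here as an assumed definition rather than taken verbatim from the paper): the encoder sends an index M = f(X^n), and the decoder produces each reconstruction from the index together with all strictly earlier source symbols,

        \hat{X}_i \;=\; g_i\!\left(M,\, X^{i-1}\right), \qquad i = 1, \dots, n,

    so past source symbols act as perfect decoder side information with one step of delay.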

    Fixed-length lossy compression in the finite blocklength regime

    This paper studies the minimum achievable source coding rate as a function of blocklength $n$ and probability $\epsilon$ that the distortion exceeds a given level $d$. Tight general achievability and converse bounds are derived that hold at arbitrary fixed blocklength. For stationary memoryless sources with separable distortion, the minimum rate achievable is shown to be closely approximated by $R(d) + \sqrt{\frac{V(d)}{n}}\, Q^{-1}(\epsilon)$, where $R(d)$ is the rate-distortion function, $V(d)$ is the rate dispersion, a characteristic of the source which measures its stochastic variability, and $Q^{-1}(\epsilon)$ is the inverse of the standard Gaussian complementary cdf.
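    As a quick numerical illustration of this approximation, the sketch below evaluates $R(d) + \sqrt{V(d)/n}\, Q^{-1}(\epsilon)$ for a unit-variance Gaussian memoryless source under mean-square-error distortion; the closed forms used for R(d) and V(d) are standard values from the dispersion literature and are assumptions here, not quoted from the text above.

        # Minimal sketch, assuming a unit-variance Gaussian source with MSE distortion.
        from math import e, log2, sqrt
        from statistics import NormalDist

        def approx_rate(n, eps, d, sigma2=1.0):
            """Gaussian approximation R(d) + sqrt(V(d)/n) * Q^{-1}(eps), in bits per sample."""
            R = 0.5 * log2(sigma2 / d)             # rate-distortion function (assumed closed form)
            V = 0.5 * log2(e) ** 2                 # rate dispersion in bits^2 (assumed value)
            q_inv = NormalDist().inv_cdf(1 - eps)  # Q^{-1}(eps) via the standard normal cdf
            return R + sqrt(V / n) * q_inv

        # Example: blocklength 1000, excess-distortion probability 0.01, distortion level 0.25
        print(round(approx_rate(1000, 1e-2, 0.25), 3))  # about 1.075, vs. R(d) = 1.0 asymptotically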

    Heegard-Berger and Cascade Source Coding Problems with Common Reconstruction Constraints

    For the Heegard-Berger (HB) problem with the common reconstruction (CR) constraint, the rate-distortion function is derived under the assumption that the side information sequences are (stochastically) degraded. The rate-distortion function is also calculated explicitly for three examples, namely a Gaussian source and side information with quadratic distortion metric, and a binary source and side information with erasure and Hamming distortion metrics. The rate-distortion function is then characterized for the HB problem with cooperating decoders and (physically) degraded side information. For the cascade problem with the CR constraint, the rate-distortion region is obtained under the assumption that the side information at the final node is physically degraded with respect to that at the intermediate node. For the latter two cases, it is worth emphasizing that the corresponding problem without the CR constraint is still open. Outer and inner bounds on the rate-distortion region are also obtained for the cascade problem under the assumption that the side information at the intermediate node is physically degraded with respect to that at the final node. For the three examples mentioned above, the bounds are shown to coincide. Finally, for the HB problem, the rate-distortion function is obtained under the more general requirement of constrained reconstruction, whereby the decoder's estimate must be recovered at the encoder only within some distortion. Comment: to appear in IEEE Trans. Inform. Theory
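    For context, the common reconstruction (CR) constraint requires that the decoder's estimate be exactly reproducible at the encoder; one standard way to state it (given here as an assumed formalization, not the paper's exact notation) is that there must exist an encoder-side map \psi with

        \Pr\!\left[ \psi(X^n) \neq \hat{X}^n \right] \;\le\; \epsilon,

    i.e., the encoder can compute, up to a vanishing error probability, the very sequence the decoder reconstructs. The constrained-reconstruction requirement mentioned at the end of the abstract relaxes this exact agreement to agreement within a prescribed distortion.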

    Network coding meets TCP

    We propose a mechanism that incorporates network coding into TCP with only minor changes to the protocol stack, thereby allowing incremental deployment. In our scheme, the source transmits random linear combinations of packets currently in the congestion window. At the heart of our scheme is a new interpretation of ACKs: the sink acknowledges every degree of freedom (i.e., a linear combination that reveals one unit of new information) even if it does not reveal an original packet immediately. Such ACKs enable a TCP-like sliding-window approach to network coding. Our scheme has the nice property that packet losses are essentially masked from the congestion control algorithm. Our algorithm therefore reacts to packet drops in a smooth manner, resulting in a novel and effective approach for congestion control over networks involving lossy links such as wireless links. Our experiments show that our algorithm achieves higher throughput compared to TCP in the presence of lossy wireless links. We also establish the soundness and fairness properties of our algorithm. Comment: 9 pages, 9 figures, submitted to IEEE INFOCOM 200
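    The ACK-per-degree-of-freedom idea can be illustrated with a small rank-tracking sketch at the sink; this is an illustrative simplification (coefficients over GF(2) rather than the larger field a practical implementation would use) and not the authors' code.

        # Minimal sketch: ACK a coded packet iff its coefficient vector is linearly
        # independent of everything received so far (i.e., it delivers a new degree of freedom).
        class DofReceiver:
            def __init__(self):
                self.basis = {}  # leading-bit position -> reduced coefficient bitmask

            def receive(self, coeff_mask):
                v = coeff_mask
                while v:
                    lead = v.bit_length() - 1   # highest packet index with a nonzero coefficient
                    if lead not in self.basis:
                        self.basis[lead] = v    # rank grows: a new degree of freedom
                        return True             # -> acknowledge it
                    v ^= self.basis[lead]       # eliminate that coefficient and keep reducing
                return False                    # already spanned: no new information, no ACK

        rx = DofReceiver()
        print(rx.receive(0b011))  # combination of p1, p2 -> True  (1st degree of freedom)
        print(rx.receive(0b110))  # combination of p2, p3 -> True  (2nd degree of freedom)
        print(rx.receive(0b101))  # combination of p1, p3 -> False (dependent on the first two)

    Because every independent combination is acknowledged, losses of individual coded packets do not stall the sender, which is what allows the congestion control loop to see a smooth stream of ACKs, as the abstract describes.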