
    Fixed-length lossy compression in the finite blocklength regime

    This paper studies the minimum achievable source coding rate as a function of blocklength $n$ and probability $\epsilon$ that the distortion exceeds a given level $d$. Tight general achievability and converse bounds are derived that hold at arbitrary fixed blocklength. For stationary memoryless sources with separable distortion, the minimum achievable rate is shown to be closely approximated by $R(d) + \sqrt{\frac{V(d)}{n}}\, Q^{-1}(\epsilon)$, where $R(d)$ is the rate-distortion function, $V(d)$ is the rate dispersion, a characteristic of the source which measures its stochastic variability, and $Q^{-1}(\epsilon)$ is the inverse of the standard Gaussian complementary cdf.
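    As a quick numerical reading of this approximation, the sketch below evaluates the second-order rate $R(d) + \sqrt{V(d)/n}\, Q^{-1}(\epsilon)$ at several blocklengths. The values of $R(d)$ and $V(d)$ are made-up inputs for illustration, not numbers from the paper:

    ```python
    # Minimal sketch (illustrative values, not the paper's): evaluate the
    # Gaussian approximation R(d) + sqrt(V(d)/n) * Q^{-1}(eps) to the
    # minimum rate at blocklength n and excess-distortion probability eps.
    from scipy.stats import norm

    def rate_approximation(R_d: float, V_d: float, n: int, eps: float) -> float:
        """Second-order approximation to the minimum coding rate.

        R_d : rate-distortion function R(d), bits/symbol (assumed given)
        V_d : rate dispersion V(d), bits^2/symbol (assumed given)
        """
        q_inv = norm.isf(eps)  # Q^{-1}(eps): inverse complementary Gaussian cdf
        return R_d + (V_d / n) ** 0.5 * q_inv

    # Hypothetical source: R(d) = 0.5 bit/symbol, V(d) = 0.6 bit^2/symbol.
    for n in (100, 1_000, 10_000):
        print(n, rate_approximation(0.5, 0.6, n, eps=1e-3))
    ```

    The backoff from $R(d)$ shrinks as $1/\sqrt{n}$, which is the point of the dispersion analysis: at short blocklengths the penalty over the asymptotic limit is substantial.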

    Secure Multiterminal Source Coding with Side Information at the Eavesdropper

    The problem of secure multiterminal source coding with side information at the eavesdropper is investigated. This scenario consists of a main encoder (referred to as Alice) that wishes to compress a single source while simultaneously satisfying the desired requirements on the distortion level at a legitimate receiver (referred to as Bob) and on the equivocation rate (average uncertainty) at an eavesdropper (referred to as Eve). The presence of a (public) rate-limited link between Alice and Bob is further assumed. In this setting, Eve perfectly observes the information bits sent by Alice to Bob and also has access to a correlated source which can be used as side information. A second encoder (referred to as Charlie) helps Bob estimate Alice's source by sending a compressed version of its own correlated observation via a (private) rate-limited link, which is observed only by Bob. The problem at hand can thus be seen as the unification of the Berger-Tung and secure source coding setups. Inner and outer bounds on the so-called rates-distortion-equivocation region are derived. The inner region turns out to be tight in two cases: (i) uncoded side information at Bob and (ii) lossless reconstruction of both sources at Bob (secure distributed lossless compression). Application examples to secure lossy source coding of Gaussian and binary sources in the presence of Gaussian and binary/ternary (respectively) side information are also considered. Optimal coding schemes are characterized for some cases of interest where the statistical differences between the side information at the decoders and the presence of a non-zero distortion at Bob can be fully exploited to guarantee secrecy. (Comment: 26 pages, 16 figures, 2 tables)
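    The equivocation rate above is the per-symbol conditional entropy of the source given everything Eve observes. As a toy illustration of the notion (my example, not the paper's coding scheme), the sketch below computes $H(X\mid Z)$ when Eve's side information $Z$ is a uniform binary source $X$ seen through a binary symmetric channel with crossover probability $\delta$:

    ```python
    # Toy illustration (not the paper's scheme): equivocation H(X|Z) in bits
    # when X ~ Bernoulli(1/2) and Eve's side information Z flips X with
    # probability delta (a binary symmetric channel).
    from math import log2

    def h2(p: float) -> float:
        """Binary entropy function, in bits."""
        if p in (0.0, 1.0):
            return 0.0
        return -p * log2(p) - (1 - p) * log2(1 - p)

    # For X uniform and Z = X xor N with N ~ Bernoulli(delta),
    # H(X|Z) = h2(delta): the noisier Eve's side information, the more
    # uncertainty (equivocation) remains about the source.
    for delta in (0.0, 0.1, 0.25, 0.5):
        print(delta, h2(delta))
    ```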

    Asymptotic Estimates in Information Theory with Non-Vanishing Error Probabilities

    This monograph presents a unified treatment of single- and multi-user problems in Shannon's information theory in which we depart from the requirement that the error probability decays asymptotically in the blocklength. Instead, the error probabilities for various problems are bounded above by a non-vanishing constant, and the spotlight is shone on achievable coding rates as functions of the growing blocklength. This represents the study of asymptotic estimates with non-vanishing error probabilities. In Part I, after reviewing the fundamentals of information theory, we discuss Strassen's seminal result for binary hypothesis testing, in which the type-I error probability is non-vanishing and the rate of decay of the type-II error probability with a growing number of independent observations is characterized. In Part II, we use this basic hypothesis testing result to develop second- and, sometimes, even third-order asymptotic expansions for point-to-point communication. In Part III, we consider network information theory problems for which the second-order asymptotics are known. These problems include some classes of channels with random state, the multiple-encoder distributed lossless source coding (Slepian-Wolf) problem, and special cases of the Gaussian interference and multiple-access channels. Finally, we discuss avenues for further research. (Comment: Further comments welcome)
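    The Strassen estimate referenced in Part I says that, up to logarithmic terms, the optimal type-II error $\beta_n$ under a type-I error constraint $\epsilon$ satisfies $\log \beta_n \approx -n D(P\|Q) + \sqrt{n V(P\|Q)}\, Q^{-1}(\epsilon)$, where $V(P\|Q)$ is the variance of the log-likelihood ratio under $P$. A minimal sketch of this standard estimate, with hypothetical distributions:

    ```python
    # Sketch of Strassen's second-order estimate for binary hypothesis
    # testing:  log beta_n ~= -n*D(P||Q) + sqrt(n*V(P||Q)) * Q^{-1}(eps).
    import numpy as np
    from scipy.stats import norm

    def second_order_log_beta(P, Q, n: int, eps: float) -> float:
        P, Q = np.asarray(P, float), np.asarray(Q, float)
        llr = np.log(P / Q)                    # log-likelihood ratio, nats
        D = float(np.sum(P * llr))             # relative entropy D(P||Q)
        V = float(np.sum(P * llr**2) - D**2)   # relative entropy variance
        return -n * D + np.sqrt(n * V) * norm.isf(eps)

    # Two hypothetical Bernoulli distributions, given as pmfs on {0, 1}.
    print(second_order_log_beta([0.6, 0.4], [0.4, 0.6], n=1000, eps=0.01))
    ```

    A stricter type-I constraint (smaller $\epsilon$) makes $Q^{-1}(\epsilon)$ larger, so $\beta_n$ decays slightly slower, which is exactly the second-order penalty the monograph tracks across problems.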

    Lossy joint source-channel coding in the finite blocklength regime

    This paper finds new tight finite-blocklength bounds for the best achievable lossy joint source-channel code rate, and demonstrates that joint source-channel code design brings considerable performance advantage over a separate one in the non-asymptotic regime. A joint source-channel code maps a block of $k$ source symbols onto a length-$n$ channel codeword, and the fidelity of reproduction at the receiver end is measured by the probability $\epsilon$ that the distortion exceeds a given threshold $d$. For memoryless sources and channels, it is demonstrated that the parameters of the best joint source-channel code must satisfy $nC - kR(d) \approx \sqrt{nV + k\mathcal{V}(d)}\, Q^{-1}(\epsilon)$, where $C$ and $V$ are the channel capacity and channel dispersion, respectively; $R(d)$ and $\mathcal{V}(d)$ are the source rate-distortion and rate-dispersion functions; and $Q^{-1}$ is the inverse of the standard Gaussian complementary cdf. Symbol-by-symbol (uncoded) transmission is known to achieve the Shannon limit when the source and channel satisfy a certain probabilistic matching condition. In this paper we show that, even when this condition is not satisfied, symbol-by-symbol transmission is, in some cases, the best known strategy in the non-asymptotic regime.
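    One way to read the approximate equality above is to solve for the largest $k$ a length-$n$ code can carry. The sketch below does this by a simple scan, with all parameter values hypothetical rather than taken from the paper:

    ```python
    # Illustrative sketch (assumed parameter values): find the largest k with
    #   n*C - k*R(d) >= sqrt(n*V + k*Vs) * Q^{-1}(eps),
    # i.e. the most source symbols a length-n joint source-channel code can
    # carry under the Gaussian approximation.
    from math import sqrt
    from scipy.stats import norm

    def max_source_symbols(n, C, V, R_d, Vs_d, eps):
        q = norm.isf(eps)  # Q^{-1}(eps)
        k = 0
        while n * C - (k + 1) * R_d >= q * sqrt(n * V + (k + 1) * Vs_d):
            k += 1
        return k

    # Hypothetical numbers: C = 0.5 bit/use, V = 0.25 bit^2, R(d) = 0.3 bit,
    # source dispersion 0.4 bit^2, blocklength 1000, eps = 1e-3.
    print(max_source_symbols(1000, 0.5, 0.25, 0.3, 0.4, 1e-3))
    ```

    The asymptotic answer would be $k \approx nC/R(d)$; the dispersion term is what the finite-blocklength analysis subtracts from it.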

    Second-Order Coding Rates for Conditional Rate-Distortion

    This paper characterizes the second-order coding rates for lossy source coding with side information available at both the encoder and the decoder. We first provide non-asymptotic bounds for this problem and then specialize them to three different scenarios: discrete memoryless sources, Gaussian sources, and Markov sources. We obtain the second-order coding rates for these settings. It is interesting to observe that the second-order coding rate for Gaussian source coding with Gaussian side information available at both the encoder and the decoder is the same as that for Gaussian source coding without side information. Furthermore, regardless of the variance of the side information, the dispersion is $1/2$ nats squared per source symbol. (Comment: 20 pages, 2 figures; keywords: second-order coding rates, finite blocklength, network information theory)
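    To make the Gaussian case concrete: a Gaussian source with variance $\sigma^2$ has $R(d) = \frac{1}{2}\ln(\sigma^2/d)$ nats under mean-square distortion, so with dispersion $1/2$ nats$^2$ the second-order rate is approximately $R(d) + \sqrt{1/(2n)}\, Q^{-1}(\epsilon)$. A small worked sketch (standard formulas, my example numbers):

    ```python
    # Worked illustration: second-order rate for a Gaussian source with
    # variance sigma2 and mean-square distortion d, using R(d) =
    # 0.5*ln(sigma2/d) nats and dispersion 1/2 nats^2 (as in the abstract).
    from math import log, sqrt
    from scipy.stats import norm

    def gaussian_second_order_rate(sigma2, d, n, eps):
        R_d = 0.5 * log(sigma2 / d)                 # rate-distortion fn, nats
        return R_d + sqrt(0.5 / n) * norm.isf(eps)  # dispersion = 1/2 nats^2

    for n in (100, 1_000, 10_000):
        print(n, gaussian_second_order_rate(1.0, 0.25, n, eps=0.01))
    ```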