4 research outputs found

    On the Smooth Rényi Entropy and Variable-Length Source Coding Allowing Errors

    In this paper, we consider the problem of variable-length source coding allowing errors. The exponential moment of the codeword length is analyzed in both the non-asymptotic and the asymptotic regimes. Our results show that the smooth Rényi entropy characterizes the optimal exponential moment of the codeword length. Comment: 19 pages
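The smoothing idea can be illustrated with a toy sketch at order α = 0 (the ε-smooth max-entropy in the style of Renner and Wolf; the paper's own definition may differ): it is the log of the smallest support that captures 1 − ε of the probability mass.

```python
import math

def smooth_max_entropy(pmf, eps):
    """epsilon-smooth Renyi entropy of order 0 (max-entropy), in bits:
    log2 of the smallest number of outcomes whose total probability
    is at least 1 - eps."""
    probs = sorted(pmf, reverse=True)  # greedily keep the most likely outcomes
    mass, support = 0.0, 0
    for p in probs:
        support += 1
        mass += p
        if mass >= 1.0 - eps:
            break
    return math.log2(support)

# A skewed source: a little smoothing shrinks the effective support.
pmf = [0.5, 0.25, 0.125, 0.125]
print(smooth_max_entropy(pmf, 0.0))  # full support: log2(4) = 2.0
print(smooth_max_entropy(pmf, 0.3))  # two symbols cover 0.75 mass: 1.0
```

Smoothing is what makes such quantities operationally meaningful for coding with errors: discarding an ε-fraction of the mass corresponds to tolerating error probability ε.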

    Strong converses for group testing in the finite blocklength regime

    We prove new strong converse results in a variety of group testing settings, generalizing a result of Baldassini, Johnson and Aldridge. These results are proved by two distinct approaches, corresponding to the non-adaptive and adaptive cases. In the non-adaptive case, we mimic the hypothesis testing argument introduced in the finite blocklength channel coding regime by Polyanskiy, Poor and Verdú. In the adaptive case, we combine a formulation based on directed information theory with ideas of Kemperman, Kesten and Wolfowitz from the problem of channel coding with feedback. In both cases, we prove results which are valid for finite-sized problems and imply capacity results in the asymptotic regime. These results are illustrated graphically for a range of models.
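A converse of this general flavor can be sanity-checked numerically with the elementary counting bound (a standard argument, not the paper's hypothesis-testing proof): T binary test outcomes can distinguish at most 2^T of the C(n, k) possible defective sets, so no scheme's success probability can exceed 2^T / C(n, k).

```python
from math import comb

def counting_bound(n, k, t):
    """Upper bound on the success probability of any group testing
    scheme with t binary tests, n items, k defectives: each of the
    2**t test-outcome patterns can decode to at most one defective set."""
    return min(1.0, 2**t / comb(n, k))

# With n = 500 items and k = 10 defectives, roughly
# log2(C(500, 10)) ~ 68 tests are needed before the bound allows success.
print(counting_bound(500, 10, 50))  # far below 1: reliable recovery impossible
print(counting_bound(500, 10, 80))  # bound is vacuous (1.0)
```

A strong converse sharpens this picture: below the critical number of tests, the success probability does not merely stay below 1 but tends to 0.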

    On the Conditional Smooth Rényi Entropy and its Applications in Guessing and Source Coding

    A novel definition of the conditional smooth Rényi entropy, which differs from that of Renner and Wolf, is introduced. It is shown that our definition of the conditional smooth Rényi entropy is appropriate for giving lower and upper bounds on the optimal guessing moment in a guessing problem where the guesser is allowed to stop guessing and declare an error. Further, a general formula for the optimal guessing exponent is given; in particular, a single-letterized formula for mixtures of i.i.d. sources is obtained. Another application, to the problem of source coding with common side-information available at the encoder and decoder, is also demonstrated. Comment: 31 pages
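The guessing setup can be made concrete with a small sketch (hypothetical code, using the classical decreasing-probability guessing order of Massey and Arikan rather than anything paper-specific): the guesser asks "is it x?" in order of decreasing probability and, when allowed to declare an error, stops once the remaining unguessed mass is at most ε.

```python
def guessing_moment(pmf, rho, eps=0.0):
    """rho-th moment of the number of guesses under the optimal
    (decreasing-probability) order, where the guesser may stop and
    declare an error once the unguessed mass drops to at most eps.
    Returns E[G**rho * 1{correct}]."""
    probs = sorted(pmf, reverse=True)
    target = 1.0 - eps
    moment, mass = 0.0, 0.0
    for i, p in enumerate(probs, start=1):
        if mass >= target:        # enough mass covered: stop, declare an error
            break
        moment += p * i**rho
        mass += p
    return moment

uniform = [0.25] * 4
print(guessing_moment(uniform, 1.0))        # E[G] = (1+2+3+4)/4 = 2.5
print(guessing_moment(uniform, 1.0, 0.25))  # stop after 3 guesses: 1.5
```

Allowing the guesser to give up on an ε-tail is exactly the operational role that the smoothing parameter plays in the entropy bounds.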

    An Information-Spectrum Approach to Weak Variable-Length Source Coding with Side-Information

    This paper studies variable-length (VL) source coding of general sources with side-information. Novel one-shot coding theorems are given for coding with common side-information available at the encoder and the decoder, and for Slepian-Wolf (SW) coding (i.e., with side-information only at the decoder); these are then applied to asymptotic analyses of the two coding problems. In particular, a general formula for the infimum of the coding rate asymptotically achievable by weak VL-SW coding (i.e., VL-SW coding with vanishing error probability) is derived. Further, the general formula is applied to weak VL-SW coding of mixed sources. Our results recover and extend several known results on SW coding and weak VL coding; e.g., the optimal achievable rate of VL-SW coding for mixtures of i.i.d. sources is given for the countably infinite alphabet case under a mild condition. In addition, the usefulness of the encoder side-information is investigated: our result shows that if the encoder side-information is useless for weak VL coding, then it is also useless even when the error probability may remain positive asymptotically. Comment: 54 pages, 2 figures
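For the special case of an i.i.d. source pair, the classical benchmark is that the optimal SW rate with vanishing error is the conditional entropy H(X|Y); a minimal sketch (assuming a joint pmf given as a dict, not the paper's information-spectrum machinery) computes it directly.

```python
from math import log2

def conditional_entropy(joint):
    """H(X|Y) in bits from a joint pmf {(x, y): p} -- for i.i.d.
    source pairs this is the classical Slepian-Wolf coding rate."""
    p_y = {}
    for (x, y), p in joint.items():
        p_y[y] = p_y.get(y, 0.0) + p
    return sum(p * log2(p_y[y] / p) for (x, y), p in joint.items() if p > 0)

# X = Y with probability 1: side-information determines X, rate 0.
print(conditional_entropy({(0, 0): 0.5, (1, 1): 0.5}))  # 0.0
# X uniform on {0, 1} and independent of Y: rate H(X) = 1 bit.
print(conditional_entropy({(x, y): 0.25 for x in (0, 1) for y in (0, 1)}))  # 1.0
```

The general formula in the paper replaces H(X|Y) with an information-spectrum quantity valid for arbitrary (non-ergodic, non-stationary) general sources; the i.i.d. expression above is what it single-letterizes to in the simplest case.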