
    Lecture Notes on Network Information Theory

    These lecture notes have been converted to a book titled Network Information Theory, recently published by Cambridge University Press. This book provides a significantly expanded exposition of the material in the lecture notes as well as problems and bibliographic notes at the end of each chapter. The authors are currently preparing a set of slides based on the book that will be posted in the second half of 2012. More information about the book can be found at http://www.cambridge.org/9781107008731/. The previous (and obsolete) version of the lecture notes can be found at http://arxiv.org/abs/1001.3404v4/.

    On the Reliability Function of Distributed Hypothesis Testing Under Optimal Detection

    The distributed hypothesis testing problem with full side-information is studied. The trade-off (reliability function) between the type-I and type-II error exponents under a communication-rate constraint is studied in two steps. First, the problem is reduced to that of determining the reliability function of channel codes designed for detection (in analogy to a similar result connecting the reliability function of distributed lossless compression to that of ordinary channel codes). Second, a single-letter random-coding bound based on a hierarchical ensemble, as well as a single-letter expurgated bound, are derived for the reliability of channel-detection codes. Both bounds are derived for a system that employs the optimal detection rule. We conjecture that the resulting random-coding bound is ensemble-tight and, consequently, optimal within the class of quantization-and-binning schemes.
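    For orientation, here is a minimal formalization of the quantity being traded off, under notation assumed by this summary (the symbols \alpha_n, \beta_n, E_1 are not necessarily the paper's): with n i.i.d. observation pairs and a rate-R link, the reliability function is, informally, the best type-II exponent compatible with a prescribed type-I exponent,

    \[
    E_2(R, E_1) \;=\; \sup\Big\{ \liminf_{n\to\infty} -\tfrac{1}{n}\log\beta_n \;:\; \liminf_{n\to\infty} -\tfrac{1}{n}\log\alpha_n \ge E_1 \Big\},
    \]

    where \alpha_n and \beta_n denote the type-I and type-II error probabilities of a rate-R scheme, and the supremum runs over all such schemes.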

    Distributed Binary Detection with Lossy Data Compression

    Consider the problem in which a statistician in a two-node system receives rate-limited information from a transmitter about marginal observations of a memoryless process generated from one of two possible distributions. Using its own observations, this receiver is required first to identify the legitimacy of its sender by declaring the joint distribution of the process, and then, depending on the outcome of this test, to generate an adequate reconstruction of the observations satisfying an average per-letter distortion constraint. The performance of this setup is investigated through the corresponding rate-error-distortion region, which describes the trade-off between the communication rate, the error exponent induced by the detection, and the distortion incurred by the source reconstruction. In the special case of testing against independence, where the alternative hypothesis implies that the sources are independent, the optimal rate-error-distortion region is characterized. An application to binary symmetric sources is then given, and the explicit expression for the rate-error-distortion region is provided. The case of general hypotheses is also investigated: a new achievable rate-error-distortion region is derived based on the use of non-asymptotic binning, which improves the quality of the communicated descriptions. Further improvement of performance in the general case is shown to be possible when the requirement of source reconstruction is relaxed, in contrast to the case of testing against independence. (To appear in IEEE Transactions on Information Theory.)
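    As a hedged sketch (the symbols below are this summary's assumptions, not necessarily the paper's notation), a triple (R, E, D) lies in the rate-error-distortion region if some sequence of block-length-n schemes satisfies

    \[
    \tfrac{1}{n}\log \|f_n\| \le R, \qquad \liminf_{n\to\infty} -\tfrac{1}{n}\log\beta_n \ge E, \qquad \limsup_{n\to\infty} \mathbb{E}\Big[\tfrac{1}{n}\textstyle\sum_{i=1}^{n} d(X_i,\hat X_i)\Big] \le D,
    \]

    where \|f_n\| is the codebook size of the encoder, \beta_n is the type-II error probability (under a vanishing type-I error constraint), and d is the per-letter distortion measure.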

    Distributed Hypothesis Testing with Privacy Constraints

    We revisit the distributed hypothesis testing (or hypothesis testing with communication constraints) problem from the viewpoint of privacy. Instead of observing the raw data directly, the transmitter observes a sanitized or randomized version of it. We impose an upper bound on the mutual information between the raw and randomized data. Under this scenario, the receiver, which is also provided with side information, is required to decide whether the null or the alternative hypothesis is in effect. We first provide a general lower bound on the type-II exponent for an arbitrary pair of hypotheses. Next, we show that if the distribution under the alternative hypothesis is the product of the marginals of the distribution under the null (i.e., testing against independence), then the exponent is known exactly. Moreover, we show that the strong converse property holds. Using ideas from Euclidean information theory, we also provide an approximate expression for the exponent when the communication rate is low and the privacy level is high. Finally, we illustrate our results with binary and Gaussian examples.
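    To fix ideas, one natural formalization of the privacy constraint described above (notation assumed here, not taken from the paper): if X^n is the raw observation sequence and Z^n its sanitized version, the randomization mechanism must satisfy the mutual-information bound

    \[
    \tfrac{1}{n}\, I(X^n; Z^n) \le L,
    \]

    while for testing against independence the hypotheses take the form H_0: P_{XY} versus H_1: P_X \times P_Y, with the receiver observing the side information Y^n.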

    Interactive hypothesis testing with communication constraints

    This paper studies the problem of interactive hypothesis testing with communication constraints, in which two communication nodes separately observe one of two correlated sources and interact with each other to decide between two hypotheses on the joint distribution of the sources. When testing against independence, that is, when the joint distribution of the sources under the alternative hypothesis is the product of the marginal distributions under the null hypothesis, a computable characterization is provided for the optimal trade-off between the communication rates in two-round interaction and the testing performance, measured by the type-II error exponent under the constraint that the type-I error probability vanishes asymptotically. An example is provided to show that interaction is strictly helpful.
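    For context, the classical one-way (single-message) benchmark for testing against independence is the Ahlswede–Csiszár characterization; under the usual notation (assumed here), the optimal type-II error exponent at rate R is

    \[
    \theta(R) \;=\; \max_{P_{U|X}:\, I(U;X)\le R} I(U;Y),
    \]

    so showing that interaction is strictly helpful amounts to exhibiting sources for which the two-round exponent exceeds this one-way value.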