
    Hypothesis testing via a comparator

    This paper investigates the best achievable performance by a hypothesis test satisfying a structural constraint: two functions are computed at two different terminals, and the detector consists of a simple comparator verifying whether the functions agree. Such tests arise in the study of the fundamental limits of channel coding, but are also useful in other contexts. A simple expression for the Stein exponent is found and applied to prove a strong converse in the problem of multi-terminal hypothesis testing with rate constraints. Connections to the Gács-Körner common information and to spectral properties of the conditional expectation operator are identified. Further tightening of the results hinges on finding λ-blocks of minimal weight; an application of Delsarte's linear programming method to this problem is described. (Center for Science of Information, Grant Agreement CCF-09-39370.)
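    To make the comparator structure concrete, the following toy Monte Carlo sketch (written for this summary; the source model, the function f, and the parameters n and k are illustrative assumptions, not the paper's construction) accepts H0 exactly when two locally computed functions agree:

```python
# Toy sketch of a test with the comparator structure (illustrative assumptions only).
# Assumed H0: the two terminals observe identical blocks (V = U); assumed H1: the
# blocks are independent. Each terminal computes the same hypothetical function f
# (here: the first k bits) and the detector declares H0 iff the outputs agree.
import numpy as np

rng = np.random.default_rng(0)
n, k, trials = 64, 8, 20000

def f(x):
    # Function computed at each terminal: keep the first k bits of the block.
    return tuple(x[:k])

type2 = 0
for _ in range(trials):
    u = rng.integers(0, 2, n)          # H1: U and V are independent uniform blocks
    v = rng.integers(0, 2, n)
    if f(u) == f(v):                   # the comparator erroneously declares H0
        type2 += 1

# Under the assumed H0 (V = U) the comparator never errs; under the assumed H1 it
# agrees with probability about 2**-k, so the type-II error decays exponentially in k.
print("empirical type-II error:", type2 / trials, "  2**-k =", 2**-k)
```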

    On the Reliability Function of Distributed Hypothesis Testing Under Optimal Detection

    The distributed hypothesis testing problem with full side information is studied. The trade-off (reliability function) between the two types of error exponents under a limited communication rate is studied in two steps. First, the problem is reduced to that of determining the reliability function of channel codes designed for detection (in analogy to a similar result connecting the reliability function of distributed lossless compression to that of ordinary channel codes). Second, a single-letter random-coding bound based on a hierarchical ensemble, as well as a single-letter expurgated bound, are derived for the reliability of channel-detection codes. Both bounds are derived for a system that employs the optimal detection rule. We conjecture that the resulting random-coding bound is ensemble-tight and, consequently, optimal within the class of quantization-and-binning schemes.
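    For context, a hedged sketch of the exponent trade-off being bounded (the notation below is assumed for this summary and need not match the paper's):

```latex
% Assumed textbook-style definitions, not necessarily the paper's notation:
% a rate-R scheme with type-I error \alpha_n and type-II error \beta_n
% achieves the exponent pair (E_0, E_1) if
\[
  E_0 \le \liminf_{n\to\infty} -\tfrac{1}{n}\log \alpha_n ,
  \qquad
  E_1 \le \liminf_{n\to\infty} -\tfrac{1}{n}\log \beta_n ,
\]
% and the reliability function is the boundary of the achievable region,
\[
  E_1^{*}(R, E_0) = \sup \{\, E_1 : (E_0, E_1) \text{ is achievable at rate } R \,\} .
\]
```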

    Distributed Hypothesis Testing over a Noisy Channel: Error-exponents Trade-off

    A two-terminal distributed binary hypothesis testing (HT) problem over a noisy channel is studied. The two terminals, called the observer and the decision maker, each have access to $n$ independent and identically distributed samples, denoted by $\mathbf{U}$ and $\mathbf{V}$, respectively. The observer communicates to the decision maker over a discrete memoryless channel (DMC), and the decision maker performs a binary hypothesis test on the joint probability distribution of $(\mathbf{U},\mathbf{V})$ based on $\mathbf{V}$ and the noisy information received from the observer. The trade-off between the exponents of the type I and type II error probabilities in HT is investigated. Two inner bounds are obtained, one using a separation-based scheme that involves type-based compression and unequal error-protection channel coding, and the other using a joint scheme that incorporates type-based hybrid coding. The separation-based scheme is shown to recover the inner bound obtained by Han and Kobayashi for the special case of a rate-limited noiseless channel, and also the one obtained by the authors previously for a corner point of the trade-off. An exact single-letter characterization of the optimal trade-off is established for the special case of testing for the marginal distribution of $\mathbf{U}$ when $\mathbf{V}$ is unavailable. Our results imply that separation holds in this case, in the sense that the optimal trade-off is achieved by a scheme that performs independent HT and channel coding. Finally, we show via an example that the joint scheme achieves a strictly tighter bound than the separation-based scheme for some points of the error-exponent trade-off.
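    To illustrate the separation statement, here is a toy simulation of an "independent HT plus channel coding" architecture for testing the marginal of U. All ingredients are assumptions made for this sketch (Bernoulli marginals p0 and p1, a binary symmetric channel, a repetition code, and a threshold test at the observer); it is not the paper's scheme and says nothing about the optimal trade-off:

```python
# Toy separation architecture: a local test on the marginal of U at the observer,
# followed by ordinary channel coding of the one-bit decision over a noisy channel.
import numpy as np

rng = np.random.default_rng(1)
n, reps, trials = 200, 15, 5000
p0, p1, crossover = 0.5, 0.7, 0.1      # assumed H0: U_i ~ Bern(p0); H1: U_i ~ Bern(p1)

def observer_decision(u):
    # Independent local hypothesis test on the marginal of U (simple threshold test).
    return int(u.mean() > (p0 + p1) / 2)    # 0 -> decide H0, 1 -> decide H1

def channel(bits):
    # Discrete memoryless channel: binary symmetric channel with the assumed crossover.
    flips = rng.random(bits.shape) < crossover
    return bits ^ flips.astype(bits.dtype)

type1 = type2 = 0
for _ in range(trials):
    for hyp, p in ((0, p0), (1, p1)):
        u = (rng.random(n) < p).astype(np.int64)
        bit = observer_decision(u)
        codeword = np.full(reps, bit, dtype=np.int64)       # repetition "channel code"
        decoded = int(channel(codeword).sum() > reps / 2)   # majority decoding
        if hyp == 0 and decoded == 1:
            type1 += 1
        elif hyp == 1 and decoded == 0:
            type2 += 1

print("type-I error:", type1 / trials, "  type-II error:", type2 / trials)
```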