Distributed Hypothesis Testing with Privacy Constraints
We revisit the distributed hypothesis testing (or hypothesis testing with
communication constraints) problem from the viewpoint of privacy. Instead of
observing the raw data directly, the transmitter observes a sanitized or
randomized version of it. We impose an upper bound on the mutual information
between the raw and randomized data. Under this scenario, the receiver, which
is also provided with side information, is required to make a decision on
whether the null or alternative hypothesis is in effect. We first provide a
general lower bound on the type-II exponent for an arbitrary pair of
hypotheses. Next, we show that if the distribution under the alternative
hypothesis is the product of the marginals of the distribution under the null
(i.e., testing against independence), then the exponent is known exactly.
Moreover, we show that the strong converse property holds. Using ideas from
Euclidean information theory, we also provide an approximate expression for the
exponent when the communication rate is low and the privacy level is high.
Finally, we illustrate our results with a binary and a Gaussian example.
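For background on why testing against independence admits an exact exponent: in the classical centralized setting (no communication or privacy constraints), Stein's lemma gives the optimal type-II exponent as the relative entropy between the hypotheses, which for a joint distribution tested against the product of its marginals is exactly the mutual information. A minimal numerical sketch (the doubly symmetric binary source here is an illustrative choice, not the paper's example):

```python
import numpy as np

def mutual_information(p_xy):
    """Mutual information I(X;Y) in nats for a joint pmf given as a 2-D array."""
    p_xy = np.asarray(p_xy, dtype=float)
    p_x = p_xy.sum(axis=1, keepdims=True)
    p_y = p_xy.sum(axis=0, keepdims=True)
    mask = p_xy > 0
    return float((p_xy[mask] * np.log(p_xy[mask] / (p_x @ p_y)[mask])).sum())

# Doubly symmetric binary source with crossover 0.1 under the null;
# under the alternative, the product of its (uniform) marginals.
eps = 0.1
p_null = np.array([[(1 - eps) / 2, eps / 2],
                   [eps / 2, (1 - eps) / 2]])

# Stein exponent for centralized testing against independence:
# D(P_XY || P_X * P_Y) = I(X;Y).
exponent = mutual_information(p_null)
```

This is only the unconstrained baseline; the paper characterizes how communication and privacy constraints reduce this exponent.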
Near-Optimal Noisy Group Testing via Separate Decoding of Items
The group testing problem consists of determining a small set of defective
items from a larger set of items based on a number of tests, and is relevant in
applications such as medical testing, communication protocols, pattern
matching, and more. In this paper, we revisit an efficient algorithm for noisy
group testing in which each item is decoded separately (Malyutov and Mateev,
1980), and develop novel performance guarantees via an information-theoretic
framework for general noise models. For the special cases of no noise and
symmetric noise, we find that the asymptotic number of tests required for
vanishing error probability is within a constant factor of the
information-theoretic optimum at low sparsity levels, and that with a small
fraction of allowed incorrectly decoded items, this guarantee extends to all
sublinear sparsity levels. In addition, we provide a converse bound showing
that if one tries to move slightly beyond our low-sparsity achievability
threshold using separate decoding of items and i.i.d. randomized testing, the
average number of items decoded incorrectly approaches that of a trivial
decoder.
Comment: Submitted to IEEE Journal of Selected Topics in Signal Processing.
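To make "separate decoding of items" concrete in the noiseless special case: each item is judged only on the outcomes of the tests it participates in, and with no noise this reduces to declaring an item defective iff none of its tests came back negative. The sketch below uses an i.i.d. Bernoulli test design; the parameters and the density ln(2)/k are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n_items, n_defective, n_tests = 200, 5, 120
defective = rng.choice(n_items, size=n_defective, replace=False)

# i.i.d. randomized design: each item joins each test independently.
p = np.log(2) / n_defective          # illustrative density, ~ln(2)/k
A = rng.random((n_tests, n_items)) < p
truth = np.zeros(n_items, dtype=bool)
truth[defective] = True
outcomes = A[:, truth].any(axis=1)   # noiseless OR channel

# Separate decoding, noiseless case: an item is declared defective iff
# every test it participates in is positive (each item decoded on its own,
# ignoring all other items).
estimate = np.array([not np.any(A[:, i] & ~outcomes) for i in range(n_items)])
```

In the noiseless case this rule never misses a true defective, since a defective item's tests are all positive; errors can only be false positives, which become rare once the number of tests is large enough.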
Finite-Block-Length Analysis in Classical and Quantum Information Theory
Coding technology is used in several information processing tasks. In
particular, when noise during transmission disturbs communications, coding
technology is employed to protect the information. However, there are two types
of coding technology: coding in classical information theory and coding in
quantum information theory. Although the physical media used to transmit
information ultimately obey quantum mechanics, we need to choose the type of
coding depending on the kind of information device, classical or quantum, that
is being used. In both branches of information theory, there are many elegant
theoretical results under the ideal assumption that an infinitely large system
is available. In a realistic situation, we need to account for finite size
effects. The present paper reviews finite size effects in classical and quantum
information theory with respect to various topics, including applied aspects.
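One widely used quantitative handle on finite-size effects in classical channel coding is the normal approximation of Polyanskiy, Poor, and Verdú. The sketch below applies it to the binary symmetric channel; the (1/2) log2 n third-order term is the refinement commonly stated for the BSC, and the specific numbers are purely illustrative.

```python
import math
from statistics import NormalDist

def bsc_normal_approx(n, p, eps):
    """Normal-approximation estimate of log2 M*(n, eps), the largest number
    of information bits transmissible over a BSC(p) in n channel uses with
    error probability eps:
        n*C - sqrt(n*V)*Qinv(eps) + 0.5*log2(n),
    where C = 1 - h(p) is the capacity and V = p(1-p)*log2((1-p)/p)^2
    is the channel dispersion."""
    h = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))   # binary entropy
    capacity = 1 - h
    dispersion = p * (1 - p) * math.log2((1 - p) / p) ** 2
    qinv = NormalDist().inv_cdf(1 - eps)                   # inverse Q-function
    return n * capacity - math.sqrt(n * dispersion) * qinv + 0.5 * math.log2(n)

# At blocklength 1000 the backoff from the capacity term n*C is substantial.
bits = bsc_normal_approx(1000, 0.11, 1e-3)
```

The gap between `bits` and `n*C` is exactly the kind of finite-size effect the review surveys, in both the classical and quantum settings.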
Distributed Hypothesis Testing over a Noisy Channel: Error-exponents Trade-off
A two-terminal distributed binary hypothesis testing (HT) problem over a
noisy channel is studied. The two terminals, called the observer and the
decision maker, each has access to independent and identically distributed
samples. The observer
communicates to the decision maker over a discrete memoryless channel (DMC),
and the decision maker performs a binary hypothesis test on the joint
probability distribution of the two sources, based on its own samples and
the noisy information received from the observer. The trade-off between the
exponents of the type I and type II error probabilities in HT is investigated.
Two inner bounds are obtained, one using a separation-based scheme that
involves type-based compression and unequal error-protection channel coding,
and the other using a joint scheme that incorporates type-based hybrid coding.
The separation-based scheme is shown to recover the inner bound obtained by Han
and Kobayashi for the special case of a rate-limited noiseless channel, and
also the one obtained by the authors previously for a corner point of the
trade-off. Exact single-letter characterization of the optimal trade-off is
established for the special case of testing for the marginal distribution of
the observer's source, when side information at the decision maker is
unavailable. Our results imply that a
separation holds in this case, in the sense that the optimal trade-off is
achieved by a scheme that performs independent HT and channel coding. Finally,
we show via an example that the joint scheme achieves a strictly tighter bound
than the separation-based scheme for some points of the error-exponent
trade-off.
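For intuition about what such an exponent trade-off looks like in the simplest centralized setting (no observer, no channel): by Hoeffding's classical result, the achievable pairs of type-I and type-II exponents are (D(Q||P0), D(Q||P1)) as Q sweeps the exponentially tilted family between the two hypotheses. A toy numerical sketch with illustrative Bernoulli hypotheses (not from the paper):

```python
import numpy as np

def kl(q, p):
    """Relative entropy D(q || p) in nats for pmfs on a finite alphabet."""
    q, p = np.asarray(q, float), np.asarray(p, float)
    mask = q > 0
    return float((q[mask] * np.log(q[mask] / p[mask])).sum())

# Two illustrative hypotheses on a binary alphabet.
p0, p1 = np.array([0.9, 0.1]), np.array([0.4, 0.6])

# Sweep the tilted family Q_lam proportional to p0^(1-lam) * p1^lam:
# the pairs (D(Q||p0), D(Q||p1)) trace the type-I / type-II
# error-exponent trade-off curve.
pairs = []
for lam in np.linspace(0.0, 1.0, 5):
    tilted = p0 ** (1 - lam) * p1 ** lam
    q = tilted / tilted.sum()
    pairs.append((kl(q, p0), kl(q, p1)))
```

At one endpoint (lam = 0) the type-I exponent is zero and the type-II exponent is D(p0||p1) (Stein's regime); at the other endpoint the roles are reversed. The paper's inner bounds describe the analogous curve when the test statistic must first survive a noisy channel.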
Distributed hypothesis testing over discrete memoryless channels
A distributed binary hypothesis testing (HT) problem involving two parties, one referred to as the observer and the other as the detector, is studied. The observer observes a discrete memoryless source (DMS) and communicates its observations to the detector over a discrete memoryless channel (DMC). The detector observes another DMS correlated with that at the observer, and performs a binary HT on the joint distribution of the two DMSs using its own observed data and the information received from the observer. The trade-off between the type I error probability and the type II error-exponent of the HT is explored. Single-letter lower bounds on the optimal type II error-exponent are obtained by using two different coding schemes: a separate HT and channel coding scheme, and a joint HT and channel coding scheme based on hybrid coding for the matched-bandwidth case. An exact single-letter characterization of the optimal exponent is established for the special case of testing against conditional independence, and it is shown to be achieved by the separate HT and channel coding scheme. An example is provided where the joint scheme achieves strictly better performance than the separation-based scheme.