Distributed Binary Detection with Lossy Data Compression
Consider the problem where a statistician in a two-node system receives
rate-limited information from a transmitter about marginal observations of a
memoryless process generated according to one of two possible distributions.
Using its own observations, the receiver must first verify the legitimacy of
its sender by declaring the joint distribution of the process, and then,
depending on this authentication decision, produce an adequate reconstruction
of the observations satisfying an average per-letter distortion. The performance of
this setup is investigated through the corresponding rate-error-distortion
region, which describes the trade-off between the communication rate, the error
exponent induced by the detection, and the distortion incurred by the source
reconstruction. In the special case of testing against independence, where the
alternative hypothesis implies that the sources are independent, the optimal
rate-error-distortion region is characterized. An application example to binary
symmetric sources is given subsequently and the explicit expression for the
rate-error-distortion region is provided as well. The case of "general
hypotheses" is also investigated. A new achievable rate-error-distortion region
is derived based on the use of non-asymptotic binning, improving the quality of
communicated descriptions. Further improvement of performance in the general
case is shown to be possible when the requirement of source reconstruction is
relaxed, which stands in contrast to the case of testing against independence.
Comment: to appear in IEEE Trans. Information Theory
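For the testing-against-independence special case mentioned above, a classical benchmark (due to Ahlswede and Csiszár) is that with an unconstrained communication rate the optimal type II error exponent equals the mutual information I(X;Y) = D(P_XY || P_X P_Y). A minimal numerical sketch for a doubly symmetric binary source; the crossover probability p = 0.1 is an arbitrary illustrative choice, not a value from the paper:

```python
import numpy as np

def mutual_information(joint):
    """I(X;Y) = D(P_XY || P_X * P_Y) in bits: the Stein exponent for
    testing against independence when the rate is unconstrained."""
    px = joint.sum(axis=1, keepdims=True)   # marginal P_X as a column
    py = joint.sum(axis=0, keepdims=True)   # marginal P_Y as a row
    mask = joint > 0                        # skip zero-probability cells
    return float((joint[mask] * np.log2(joint[mask] / (px @ py)[mask])).sum())

def h2(q):
    """Binary entropy in bits."""
    return -q * np.log2(q) - (1 - q) * np.log2(1 - q)

# Doubly symmetric binary source with crossover probability p:
# X ~ Bernoulli(1/2), Y = X XOR Bernoulli(p).
p = 0.1
joint = np.array([[(1 - p) / 2, p / 2],
                  [p / 2, (1 - p) / 2]])
mi = mutual_information(joint)
# For this source, I(X;Y) = 1 - h2(p).
```

The closed form 1 - h2(p) is what makes binary symmetric sources a convenient example for writing the rate-error-distortion region explicitly.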
Rate-Exponent Region for a Class of Distributed Hypothesis Testing Against Conditional Independence Problems
We study a class of -encoder hypothesis testing against conditional
independence problems. Under the criterion that stipulates minimization of the
Type II error subject to a (constant) upper bound on the Type I
error, we characterize the set of encoding rates and exponent for both discrete
memoryless and memoryless vector Gaussian settings. For the DM setting, we
provide a converse proof and show that the region is achieved using the
Quantize-Bin-Test scheme of Rahman and Wagner. For the memoryless vector
Gaussian setting, we develop a tight outer bound by means of a technique that
relies on the de Bruijn identity and the properties of Fisher information. In
particular, the result shows that for memoryless vector Gaussian sources the
rate-exponent region is exhausted using the Quantize-Bin-Test scheme with
\textit{Gaussian} test channels; and there is \textit{no} loss in performance
from restricting the sensors' encoders to operate without time sharing.
Furthermore, we also study a variant of the problem in which the source, not
necessarily Gaussian, has finite differential entropy, and the sensors'
observation noises under the null hypothesis are Gaussian. For this model, our
main result is an upper bound on the exponent-rate function. The bound is shown
to mirror a corresponding explicit lower bound, except that the lower bound
involves the source power (variance) whereas the upper bound has the source
entropy power. Part of the utility of the established bound is for
investigating asymptotic exponents/rates and the losses incurred by distributed
detection as a function of the number of sensors.
Comment: Submitted for publication to the IEEE Transactions on Information
Theory. arXiv admin note: substantial text overlap with arXiv:1904.03028,
arXiv:1811.0393
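The gap between the upper and lower bounds described above comes from replacing the source variance with its entropy power N(X) = exp(2h(X))/(2πe). By the Gaussian maximum-entropy property, N(X) ≤ Var(X), with equality if and only if the source is Gaussian. A quick check of this inequality; the uniform source and the variance 2.5 are arbitrary illustrative choices:

```python
import math

def entropy_power(h_nats):
    """Entropy power N(X) = exp(2 h(X)) / (2*pi*e), with h in nats."""
    return math.exp(2 * h_nats) / (2 * math.pi * math.e)

# Uniform source on [0, 1]: h(X) = log(1) = 0 nats, Var(X) = 1/12.
n_unif = entropy_power(0.0)
var_unif = 1 / 12
# n_unif < var_unif, strictly: the uniform source is not Gaussian.

# Gaussian source with variance s2: h = 0.5 * log(2*pi*e*s2),
# so the entropy power recovers the variance exactly.
s2 = 2.5
n_gauss = entropy_power(0.5 * math.log(2 * math.pi * math.e * s2))
```

This is why the upper bound (entropy power) mirrors but lies below the lower bound (variance) for non-Gaussian sources.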
Exponent Trade-off for Hypothesis Testing Over Noisy Channels
The distributed hypothesis testing (DHT) problem is considered, in which the joint distribution of a pair of sequences present at separated terminals is governed by one of two possible hypotheses. The decision needs to be made by one of the terminals (the "decoder"). The other terminal (the "encoder") uses a noisy channel in order to help the decoder with the decision. This problem can be seen as a generalization of the side-information variant of the DHT problem, in which the rate-limited link is replaced by a noisy channel. A recent work by Salehkalaibar and Wigger derived an achievable Stein exponent for this problem by employing concepts from the DHT scheme of Shimokawa et al. and from unequal error protection coding for a single special message. In this work we extend the view to a trade-off between the two error exponents, additionally building on multiple codebooks and two special messages with unequal error protection. As a by-product, we also present an achievable exponent trade-off for a rate-limited link, which generalizes Shimokawa et al.
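As background for the exponent trade-off studied here: in the centralized (single-terminal) setting, the achievable pairs of type I / type II error exponents for i.i.d. observations are traced by tilting between the two hypotheses, giving the curve (D(P_λ||P_0), D(P_λ||P_1)) for λ ∈ [0, 1]. A short sketch with two assumed Bernoulli hypotheses, purely for intuition; the distributed, noisy-channel setting of the paper is substantially harder:

```python
import math

def kl(p, q):
    """D(p || q) in nats for finite distributions given as lists."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def tilted(p0, p1, lam):
    """Geometric mixture P_lam(x) proportional to p0(x)^(1-lam) * p1(x)^lam."""
    w = [a ** (1 - lam) * b ** lam for a, b in zip(p0, p1)]
    z = sum(w)
    return [x / z for x in w]

p0 = [0.9, 0.1]   # null hypothesis (illustrative choice)
p1 = [0.5, 0.5]   # alternative hypothesis (illustrative choice)

# Sweeping the tilting parameter traces the type I / type II
# exponent trade-off curve for i.i.d. observations.
curve = [(kl(tilted(p0, p1, t / 10), p0), kl(tilted(p0, p1, t / 10), p1))
         for t in range(11)]
```

At λ = 0 the type I exponent is zero and the type II exponent equals the Stein exponent D(P_0||P_1); at λ = 1 the roles reverse, recovering the two endpoints of the trade-off.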