Distributed Binary Detection with Lossy Data Compression
Consider the problem where a statistician in a two-node system receives
rate-limited information from a transmitter about marginal observations of a
memoryless process generated according to one of two possible distributions. Using its own
observations, this receiver is required to first identify the legitimacy of its
sender by declaring the joint distribution of the process; depending on the
outcome of this authentication, it then generates an adequate reconstruction of
the observations satisfying an average per-letter distortion constraint. The performance of
this setup is investigated through the corresponding rate-error-distortion
region, describing the trade-off between the communication rate, the error
exponent induced by the detection, and the distortion incurred by the source
reconstruction. In the special case of testing against independence, where the
alternative hypothesis implies that the sources are independent, the optimal
rate-error-distortion region is characterized. An application to binary
symmetric sources is then given, together with the explicit expression for the
corresponding rate-error-distortion region. The case of "general
hypotheses" is also investigated. A new achievable rate-error-distortion region
is derived based on the use of non-asymptotic binning, improving the quality of
communicated descriptions. Further improvement of performance in the general
case is shown to be possible when the requirement of source reconstruction is
relaxed, in contrast to the case of testing against independence.

Comment: to appear in IEEE Transactions on Information Theory.
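For the testing-against-independence setting above, the optimal error exponent at rate R takes the classical Ahlswede–Csiszár form: maximize I(U;Y) over test channels P(U|X) with I(U;X) ≤ R. The sketch below is an illustration only, not the paper's computation: it assumes jointly binary symmetric sources with crossover probability p and restricts attention to a BSC(q) test channel (a natural candidate for this symmetric case), then traces the resulting rate–exponent curve.

```python
import math

def h2(p):
    # Binary entropy in bits; h2(0) = h2(1) = 0 by convention.
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def conv(a, b):
    # Binary convolution a * b = a(1-b) + b(1-a): crossover of two cascaded BSCs.
    return a * (1 - b) + b * (1 - a)

def rate_exponent_curve(p, num=5):
    # Sources X, Y = X xor N with N ~ Bern(p).  A BSC(q) test channel U
    # (an assumed parametrization for illustration) gives:
    #   rate      R(q) = I(U;X) = 1 - h2(q)
    #   exponent  E(q) = I(U;Y) = 1 - h2(q * p)   (binary convolution)
    pts = []
    for i in range(num + 1):
        q = 0.5 * i / num          # sweep q over [0, 0.5]
        pts.append((1 - h2(q), 1 - h2(conv(q, p))))
    return pts

for R, E in rate_exponent_curve(0.1):
    print(f"rate {R:.3f} bits -> error exponent {E:.3f}")
```

At q = 0 the description is lossless (rate 1 bit, exponent I(X;Y) = 1 - h2(p)); at q = 0.5 the description is useless and both quantities vanish, which matches the intuition that less communication yields a weaker test.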
Joint Estimation and Detection Against Independence
A receiver in a two-node system is required to decide on the relevance of received information, using side information that may or may not be correlated with the received signal. If the information is judged relevant, the receiver is then required to estimate the source within an average distortion D. Focusing on the case of testing against independence, a single-letter expression for the rate-error-distortion region is proposed and proven. The resulting region bears a surprising resemblance to a seemingly unrelated classification problem known as the information bottleneck. The optimal region is then computed for a binary symmetric example. The results demonstrate an interesting trade-off between the achievable error exponent for the decision and the distortion at the decoder.
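The exponent–distortion trade-off in the binary symmetric example can be illustrated with a back-of-the-envelope scheme (an assumption for illustration, not the paper's exact region): if a single BSC(q)-quantized description U of X serves both as the decision statistic and as the reconstruction, the rate is 1 - h2(q), the Hamming distortion is q, and the testing-against-independence exponent is 1 - h2(q * p). Sweeping q then trades decision reliability against reconstruction quality.

```python
import math

def h2(p):
    # Binary entropy in bits.
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def single_description_tradeoff(p, q):
    """Illustrative single-description scheme (hypothetical, for intuition)
    for jointly binary symmetric sources with crossover p and a BSC(q)
    test channel: returns (rate, error_exponent, hamming_distortion)."""
    star = q * (1 - p) + p * (1 - q)   # binary convolution q * p
    return 1 - h2(q), 1 - h2(star), q

# Coarser quantization (larger q) saves rate but hurts both the
# decision exponent and the reconstruction distortion.
for q in (0.05, 0.15, 0.25):
    R, E, D = single_description_tradeoff(0.1, q)
    print(f"q={q:.2f}: rate={R:.3f}, exponent={E:.3f}, distortion={D:.2f}")
```

This is only an achievable corner of the trade-off: the paper's single-letter region allows the decision and reconstruction tasks to share the description more cleverly than this one-shot reuse does.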