Joint Source-Channel Coding with Time-Varying Channel and Side-Information
Transmission of a Gaussian source over a time-varying Gaussian channel is
studied in the presence of time-varying correlated side information at the
receiver. A block fading model is considered for both the channel and the side
information, whose states are assumed to be known only at the receiver. The
optimality of separate source and channel coding in terms of average end-to-end
distortion is shown when the channel is static while the side information state
follows a discrete or a continuous and quasiconcave distribution. When both the
channel and side information states are time-varying, separate source and
channel coding is suboptimal in general. A partially informed encoder lower
bound is studied by providing the channel state information to the encoder.
Several achievable transmission schemes are proposed based on uncoded
transmission, separate source and channel coding, joint decoding as well as
hybrid digital-analog transmission. Uncoded transmission is shown to be optimal
for a class of continuous and quasiconcave side information state
distributions, while the channel gain may have an arbitrary distribution. To
the best of our knowledge, this is the first example in which uncoded
transmission achieves the optimal performance thanks to the time-varying nature
of the states, while being suboptimal in the static version of the same
problem. Then, the optimal \emph{distortion exponent}, which quantifies the
exponential decay rate of the expected distortion in the high SNR regime, is
characterized for Nakagami distributed channel and side information states, and
it is shown to be achieved by hybrid digital-analog and joint decoding schemes
in certain cases, illustrating the suboptimality of pure digital or analog
transmission in general.

Comment: Submitted to IEEE Transactions on Information Theory
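As a rough numerical illustration of the distortion exponent defined above (a hypothetical sketch, not taken from the paper): for uncoded transmission of a unit-variance Gaussian source over a Rayleigh block-fading channel with receiver channel state information, the per-block MMSE distortion is D(g) = 1/(1 + g·SNR) with channel gain g ~ Exp(1), and the exponential decay rate of the expected distortion can be estimated by Monte Carlo.

```python
import numpy as np

# Hypothetical sketch: the distortion exponent is
#   Delta = -lim_{SNR -> inf} log E[D] / log SNR.
# For uncoded transmission of a unit-variance Gaussian source over a
# Rayleigh block-fading channel with receiver CSI, the per-block MMSE
# distortion is D(g) = 1 / (1 + g * SNR), with gain g ~ Exp(1).

rng = np.random.default_rng(0)
gains = rng.exponential(1.0, size=1_000_000)

def expected_distortion(snr):
    """Monte Carlo estimate of E[D] at a given average SNR."""
    return np.mean(1.0 / (1.0 + gains * snr))

# Estimate the decay slope of E[D] between two high-SNR points.
snr_lo, snr_hi = 1e3, 1e6
slope = -(np.log(expected_distortion(snr_hi))
          - np.log(expected_distortion(snr_lo))) / (np.log(snr_hi) - np.log(snr_lo))
print(slope)  # close to 1: E[D] decays roughly as log(SNR)/SNR under Rayleigh fading
```

The measured slope falls slightly below 1 at finite SNR because of the logarithmic correction in E[D] ≈ ln(SNR)/SNR; the limit as SNR grows is exponent 1 for this scheme.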
Side information aware source and channel coding in wireless networks
Signals in communication networks exhibit significant correlation, which can stem from the physical nature of the underlying sources, or can be created within the system. Current layered network architectures, in which, based on Shannon's separation theorem, data is compressed and transmitted over independent bit-pipes, are in general unable to exploit such correlation efficiently. Moreover, this strictly layered architecture was developed for wired networks and ignores the broadcast and highly dynamic nature of the wireless medium, creating a bottleneck in wireless network design. Technologies that exploit correlated information and go beyond the layered network architecture can become a key feature of future wireless networks, as information theory promises significant gains.

In this thesis, we study, from an information-theoretic perspective, three distinct yet fundamental problems involving the availability of correlated information in wireless networks, and develop novel communication techniques to exploit it efficiently. We first look at two joint source-channel coding problems involving the lossy transmission of Gaussian sources in a multi-terminal and a time-varying setting in which correlated side information is present in the network. In these two problems, the optimality of Shannon's separation breaks down, and separate source and channel coding is shown to perform poorly compared to the proposed joint source-channel coding designs, which are shown to achieve the optimal performance in some setups. Then, we characterize the capacity of a class of orthogonal relay channels in the presence of channel side information at the destination, and show that joint decoding and compression of the received signal at the relay is required to optimally exploit the available side information.
Our results in these three different scenarios emphasize the benefits of exploiting correlated side information at the destination when designing a communication system, even though the nature of the side information and the performance measure in the three scenarios are quite different.
Distributed Binary Detection with Lossy Data Compression
Consider the problem where a statistician in a two-node system receives
rate-limited information from a transmitter about marginal observations of a
memoryless process generated from two possible distributions. Using its own
observations, this receiver is required to first identify the legitimacy of its
sender by declaring the joint distribution of the process, and then, depending
on the outcome of this test, it produces an adequate reconstruction of the
observations satisfying an average per-letter distortion constraint. The performance of
this setup is investigated through the corresponding rate-error-distortion
region describing the trade-off between: the communication rate, the error
exponent induced by the detection and the distortion incurred by the source
reconstruction. In the special case of testing against independence, where the
alternative hypothesis implies that the sources are independent, the optimal
rate-error-distortion region is characterized. An application example to binary
symmetric sources is given subsequently and the explicit expression for the
rate-error-distortion region is provided as well. The case of "general
hypotheses" is also investigated. A new achievable rate-error-distortion region
is derived based on the use of non-asymptotic binning, improving the quality of
communicated descriptions. Further improvement of performance in the general
case is shown to be possible when the requirement of source reconstruction is
relaxed, in contrast to the case of testing against independence.

Comment: To appear in IEEE Transactions on Information Theory
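For context, the testing-against-independence setting studied here builds on a classical baseline: without any reconstruction requirement, Ahlswede and Csiszár (1986) showed that the optimal error exponent under a communication rate constraint R admits the single-letter form

```latex
\theta(R) \;=\; \max_{P_{U\mid X}\,:\; I(U;X)\,\le\, R} \; I(U;Y),
```

where $X$ is the source observed at the transmitter, $Y$ is the receiver's own observation, and $U$ is an auxiliary description of $X$. The rate-error-distortion region characterized in this paper can be read as coupling this error exponent with a Wyner-Ziv-style distortion constraint on the reconstruction.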