Zero-Delay Joint Source-Channel Coding in the Presence of Interference Known at the Encoder
Zero-delay transmission of a Gaussian source over an additive white Gaussian noise (AWGN) channel is considered in the presence of an additive Gaussian interference signal. The mean squared error (MSE) distortion is minimized under an average power constraint assuming that the interference signal is known at the transmitter. Optimality of simple linear transmission does not hold in this setting due to the presence of the known interference signal. While the optimal encoder-decoder pair remains an open problem, various non-linear transmission schemes are proposed in this paper. In particular, interference concentration (ICO) and one-dimensional lattice (1DL) strategies, using both uniform and non-uniform quantization of the interference signal, are studied. It is shown that, in contrast to typical scalar quantization of Gaussian sources, a non-uniform quantizer, whose quantization intervals become smaller as we go further from zero, improves the performance. Given that the optimal decoder is the minimum MSE (MMSE) estimator, a necessary condition for the optimality of the encoder is derived, and the numerically optimized encoder (NOE) satisfying this condition is obtained. Based on the numerical results, it is shown that 1DL with non-uniform quantization performs closest among the proposed schemes to the numerically optimized encoder while requiring significantly lower complexity.
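The non-uniform quantizer described above can be illustrated with a minimal sketch: quantizing uniformly in a warped domain w = sign(x)·x² yields cells in x whose width shrinks as |x| grows. The function name, warping law, and step size below are illustrative assumptions, not the paper's actual design.

```python
import numpy as np

def nonuniform_quantize(x, step=0.5):
    """Quantizer whose intervals in x shrink as |x| grows.

    Uniform quantization in the warped domain w = sign(x) * x**2
    maps a fixed step in w to an interval of width ~ step / (2|x|)
    in x, so cells far from zero are finer -- the behavior the
    abstract describes, opposite to a typical Gaussian quantizer.
    """
    x = np.asarray(x, dtype=float)
    w = np.sign(x) * x**2                        # warp
    w_q = step * np.round(w / step)              # uniform quantize in w
    return np.sign(w_q) * np.sqrt(np.abs(w_q))   # unwarp

# Successive reconstruction levels get denser as magnitude grows:
levels = nonuniform_quantize(np.array([0.8, 1.05, 1.3]))
```

Here the gaps between consecutive reconstruction levels decrease with magnitude, whereas a Lloyd-Max quantizer for a Gaussian source would use coarser cells in the tails.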
Information Masking and Amplification: The Source Coding Setting
The complementary problems of masking and amplifying channel state
information in the Gel'fand-Pinsker channel have recently been solved by Merhav
and Shamai, and Kim et al., respectively. In this paper, we study a related
source coding problem. Specifically, we consider the two-encoder source coding
setting where one source is to be amplified, while the other source is to be
masked. In general, there is a tension between these two objectives which is
characterized by the amplification-masking tradeoff. In this paper, we give a
single-letter description of this tradeoff.
We apply this result, together with a recent theorem by Courtade and Weissman
on multiterminal source coding, to solve a fundamental entropy characterization
problem.
Comment: 6 pages, 1 figure, to appear at the IEEE 2012 International Symposium on Information Theory (ISIT 2012).
Achievable Rates for K-user Gaussian Interference Channels
The aim of this paper is to study the achievable rates for the K-user
Gaussian interference channel for any SNR using a combination of lattice and
algebraic codes. Lattice codes are first used to transform the Gaussian
interference channel (G-IFC) into a discrete input-output noiseless channel,
and subsequently algebraic codes are developed to achieve good rates over this
new alphabet. In this context, a quantity called efficiency is introduced which
reflects the effectiveness of the algebraic coding strategy. The paper first
addresses the problem of finding high efficiency algebraic codes. A combination
of these codes with Construction-A lattices is then used to achieve non-trivial
rates for the original Gaussian interference channel.
Comment: IEEE Transactions on Information Theory, 201
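For reference, Construction A lifts a linear code C over Z_q to the lattice Λ = C + qZⁿ, i.e. the integer vectors that reduce to a codeword modulo q. A minimal sketch follows; the modulus and generator matrix are arbitrary placeholders, not taken from the paper.

```python
import numpy as np

q = 3                                    # assumed prime modulus
G = np.array([[1, 0, 1],                 # assumed generator matrix of a
              [0, 1, 2]])                # linear code C over Z_3

def construction_a_point(msg, shift):
    """Return a point of the Construction-A lattice C + q * Z^n.

    msg   : message vector over Z_q, encoded to a codeword of C
    shift : integer vector selecting a representative of the coset
    """
    c = np.mod(np.asarray(msg) @ G, q)   # codeword of C
    return c + q * np.asarray(shift)

p = construction_a_point([1, 1], [0, -1, 2])
```

Every lattice point reduces mod q to a codeword of C, which is what lets lattice decoding be reduced to decoding the underlying algebraic code.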
Sparse Regression Codes for Multi-terminal Source and Channel Coding
We study a new class of codes for Gaussian multi-terminal source and channel
coding. These codes are designed using the statistical framework of
high-dimensional linear regression and are called Sparse Superposition or
Sparse Regression codes. Codewords are linear combinations of subsets of
columns of a design matrix. These codes were recently introduced by Barron and
Joseph and shown to achieve the channel capacity of AWGN channels with
computationally feasible decoding. They have also recently been shown to
achieve the optimal rate-distortion function for Gaussian sources. In this
paper, we demonstrate how to implement random binning and superposition coding
using sparse regression codes. In particular, with minimum-distance
encoding/decoding it is shown that sparse regression codes attain the optimal
information-theoretic limits for a variety of multi-terminal source and channel
coding problems.
Comment: 9 pages, appeared in the Proceedings of the 50th Annual Allerton Conference on Communication, Control, and Computing - 201
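The codeword construction described in this abstract can be sketched directly: a design matrix with L sections of M columns each, and each codeword the sum of one column chosen per section. The dimensions and scaling below are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, M, L = 64, 8, 4                       # block length, columns per section, sections
A = rng.standard_normal((n, M * L)) / np.sqrt(n)   # i.i.d. Gaussian design matrix

def sparc_codeword(indices):
    """Sparse regression codeword: one column per section, summed.

    indices : length-L sequence; indices[l] picks a column in section l.
    The message is carried by which columns are active, so the
    coefficient vector beta has exactly L non-zeros out of M*L entries.
    """
    beta = np.zeros(M * L)
    for l, j in enumerate(indices):
        beta[l * M + j] = 1.0
    return A @ beta

x = sparc_codeword([3, 0, 7, 2])         # rate = L * log2(M) / n bits per channel use
```

Because a codeword is just A @ beta for a sparse beta, encoding and decoding inherit the machinery of high-dimensional linear regression, which is what makes computationally feasible decoding possible.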