The capacity region of broadcast channels with intersymbol interference and colored Gaussian noise
We derive the capacity region for a broadcast channel with intersymbol interference (ISI) and colored Gaussian noise under an input power constraint. The region is obtained by first defining a similar channel model, the circular broadcast channel, which can be decomposed into a set of parallel degraded broadcast channels. The capacity region for parallel degraded broadcast channels is known. We then show that the capacity region of the original broadcast channel equals that of the circular broadcast channel in the limit of infinite block length, and we obtain an explicit formula for the resulting capacity region. The coding strategy used to achieve each point on the convex hull of the capacity region uses superposition coding on some or all of the parallel channels and dedicated transmission on the others. The optimal power allocation for any point in the capacity region is obtained via multilevel water-filling. We derive this optimal power allocation and the resulting capacity region for several broadcast channel models.
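The multilevel water-filling mentioned above generalizes the classic single-user water-filling over parallel Gaussian channels. As a point of reference, here is a minimal sketch of that single-user version (the function name `water_filling` and the bisection tolerance are illustrative choices, not from the paper):

```python
import numpy as np

def water_filling(noise_powers, total_power, tol=1e-9):
    """Classic water-filling over parallel Gaussian channels: choose a
    water level mu so that p_k = max(mu - n_k, 0) and sum(p_k) equals
    total_power. The level mu is found by bisection."""
    n = np.asarray(noise_powers, dtype=float)
    lo, hi = n.min(), n.max() + total_power
    while hi - lo > tol:
        mu = 0.5 * (lo + hi)
        if np.maximum(mu - n, 0.0).sum() > total_power:
            hi = mu  # too much water poured: lower the level
        else:
            lo = mu  # budget not exhausted: raise the level
    return np.maximum(0.5 * (lo + hi) - n, 0.0)

# Noisier channels get less power; the noisiest may get none.
powers = water_filling([1.0, 2.0, 4.0], total_power=3.0)  # -> [2.0, 1.0, 0.0]
```

The multilevel variant in the paper additionally splits each parallel channel's power among the superposed users, but the bisection-on-a-level structure is the same basic mechanism.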
Source-Channel Coding for the Multiple-Access Relay Channel
This work considers reliable transmission of general correlated sources over
the multiple-access relay channel (MARC) and the multiple-access broadcast
relay channel (MABRC). In MARCs only the destination is interested in a
reconstruction of the sources, while in MABRCs both the relay and the
destination want to reconstruct the sources. We assume that both the relay and
the destination have correlated side information. We find sufficient conditions
for reliable communication based on operational separation, as well as
necessary conditions on the achievable source-channel rate. For correlated
sources transmitted over fading Gaussian MARCs and MABRCs we find conditions
under which informational separation is optimal.
Comment: Presented at ISWCS 2011, Aachen, Germany.
Lossy Source Transmission over the Relay Channel
Lossy transmission over a relay channel in which the relay has access to
correlated side information is considered. First, a joint source-channel
decode-and-forward scheme is proposed for general discrete memoryless sources
and channels. Then the Gaussian relay channel where the source and the side
information are jointly Gaussian is analyzed. For this Gaussian model, several
new source-channel cooperation schemes are introduced and analyzed in terms of
the squared-error distortion at the destination. A comparison of the proposed
upper bounds with the cut-set lower bound is given, and it is seen that joint
source-channel cooperation improves the reconstruction quality significantly.
Moreover, the performance of the joint code is close to the lower bound on
distortion for a wide range of source and channel parameters.
Comment: Proceedings of the 2008 IEEE International Symposium on Information Theory, Toronto, ON, Canada, July 6-11, 2008.
Source-Channel Coding Theorems for the Multiple-Access Relay Channel
We study reliable transmission of arbitrarily correlated sources over
multiple-access relay channels (MARCs) and multiple-access broadcast relay
channels (MABRCs). In MARCs only the destination is interested in
reconstructing the sources, while in MABRCs both the relay and the destination
want to reconstruct them. In addition to arbitrary correlation among the source
signals at the users, both the relay and the destination have side information
correlated with the source signals. Our objective is to determine whether a
given pair of sources can be losslessly transmitted to the destination for a
given number of channel symbols per source sample, defined as the
source-channel rate. Sufficient conditions for reliable communication based on
operational separation, as well as necessary conditions on the achievable
source-channel rates are characterized. Since operational separation is
generally not optimal for MARCs and MABRCs, sufficient conditions for reliable
communication using joint source-channel coding schemes based on a combination
of the correlation preserving mapping technique with Slepian-Wolf source coding
are also derived. For correlated sources transmitted over fading Gaussian MARCs
and MABRCs, we present conditions under which separation (i.e., separate and
stand-alone source and channel codes) is optimal. This is the first time
optimality of separation is proved for MARCs and MABRCs.
Comment: Accepted to the IEEE Transactions on Information Theory.
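The Slepian-Wolf binning used in the joint source-channel schemes above can be illustrated with a toy instance: if the decoder's side information y differs from the source word x in at most one of seven bits, the encoder need only send the 3-bit syndrome of a (7,4) Hamming code instead of x itself. This is a textbook illustration of the binning idea, not the schemes of the paper:

```python
import numpy as np

# Parity-check matrix of the (7,4) Hamming code; column j is the
# binary expansion of j+1, so a single-bit error is locatable.
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

def sw_encode(x):
    """Slepian-Wolf 'binning': transmit only the 3-bit syndrome of x."""
    return H @ x % 2

def sw_decode(syndrome, y):
    """With side information y differing from x in at most one bit,
    (syndrome + H @ y) mod 2 is the syndrome of the error x ^ y, which
    identifies the flipped position (0 means no error)."""
    diff = (syndrome + H @ y) % 2
    pos = diff[0] + 2 * diff[1] + 4 * diff[2]
    x_hat = y.copy()
    if pos:
        x_hat[pos - 1] ^= 1
    return x_hat
```

Here 3 bits replace 7, matching the conditional entropy intuition behind binning; the correlation-preserving mapping technique cited above goes further by keeping the channel inputs themselves correlated.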
A simple importance sampling technique for orthogonal space-time block codes on Nakagami fading channels
In this contribution, we present a simple importance sampling technique to considerably speed up Monte Carlo simulations for bit error rate estimation of orthogonal space-time block coded systems on spatially correlated Nakagami fading channels.
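The abstract does not spell out the biasing distribution used; as a generic illustration of why importance sampling accelerates rare-event estimation of this kind, the sketch below estimates the Gaussian tail probability P(X > t) by shifting the sampling mean to t and reweighting by the likelihood ratio (all names and parameters are hypothetical, not from the paper):

```python
import numpy as np

def tail_prob_is(t, n_samples=100_000, seed=0):
    """Estimate P(X > t) for X ~ N(0, 1) by mean-translation importance
    sampling: draw from the shifted density N(t, 1), where the rare event
    is common, and correct with the likelihood ratio
    w(x) = phi(x) / phi(x - t) = exp(t**2 / 2 - t * x)."""
    rng = np.random.default_rng(seed)
    x = rng.normal(loc=t, scale=1.0, size=n_samples)  # biased proposal
    w = np.exp(0.5 * t * t - t * x)                   # likelihood ratio
    return np.mean((x > t) * w)

est = tail_prob_is(5.0)  # close to Q(5), about 2.87e-7
```

A naive Monte Carlo run would need on the order of 10^9 samples to see this event a few times; the shifted estimator reaches percent-level relative accuracy with 10^5. Biasing the fading gains plays the analogous role in the BER setting.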
Integer-Forcing Source Coding
Integer-Forcing (IF) is a new framework, based on compute-and-forward, for
decoding multiple integer linear combinations from the output of a Gaussian
multiple-input multiple-output channel. This work applies the IF approach to
arrive at a new low-complexity scheme, IF source coding, for distributed lossy
compression of correlated Gaussian sources under a minimum mean squared error
distortion measure. All encoders use the same nested lattice codebook. Each
encoder quantizes its observation using the fine lattice as a quantizer and
reduces the result modulo the coarse lattice, which plays the role of binning.
Rather than directly recovering the individual quantized signals, the decoder
first recovers a full-rank set of judiciously chosen integer linear
combinations of the quantized signals, and then inverts it. In general, the
linear combinations have smaller average powers than the original signals. This
allows the density of the coarse lattice to be increased, which in turn translates
to smaller compression rates. We also propose and analyze a one-shot version of
IF source coding, which is simple enough to potentially lead to a new design
principle for analog-to-digital converters that can exploit spatial
correlations between the sampled signals.
Comment: Submitted to the IEEE Transactions on Information Theory.
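A scalar toy version of the quantize-then-reduce step may help fix ideas: the fine lattice is ΔZ, the coarse lattice is QΔZ, and only the coset (bin) index is sent. The constants and helper names below are illustrative; actual IF source coding uses high-dimensional nested lattices and recovers integer combinations across encoders rather than a single signal:

```python
import numpy as np

DELTA = 0.25   # fine-lattice step (quantizer resolution); toy choice
Q = 16         # nesting ratio: coarse lattice is Q * DELTA * Z

def encode(x):
    """Quantize to the fine lattice, then reduce modulo the coarse
    lattice (the 'binning' step): only a bin index in 0..Q-1 is sent."""
    fine = int(np.round(x / DELTA))     # fine-lattice point index
    return fine % Q                     # coset of the coarse lattice

def decode(bin_index, side_info):
    """Pick the fine-lattice point nearest to side_info within the
    received coset; correct whenever side_info is close enough to x."""
    base = int(np.round(side_info / DELTA))
    # shift base to the nearest index congruent to bin_index mod Q
    k = base + ((bin_index - base + Q // 2) % Q) - Q // 2
    return k * DELTA
```

Reducing modulo the coarse lattice is what shrinks the rate (log2(Q) bits per sample here), and it is exactly the step whose cost drops when the decoder targets low-power integer combinations instead of the individual signals.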