Noisy Network Coding
A noisy network coding scheme for sending multiple sources over a general
noisy network is presented. For multi-source multicast networks, the scheme
naturally extends both network coding over noiseless networks by Ahlswede, Cai,
Li, and Yeung, and compress-forward coding for the relay channel by Cover and
El Gamal to general discrete memoryless and Gaussian networks. The scheme also
recovers as special cases the results on coding for wireless relay networks and
deterministic networks by Avestimehr, Diggavi, and Tse, and coding for wireless
erasure networks by Dana, Gowaikar, Palanki, Hassibi, and Effros. The scheme
involves message repetition coding, relay signal compression, and simultaneous
decoding. Unlike previous compress-forward schemes, where independent messages
are sent over multiple blocks, the same message is sent multiple times using
independent codebooks as in the network coding scheme for cyclic networks.
Furthermore, the relays do not use Wyner-Ziv binning as in previous
compress-forward schemes, and each decoder performs simultaneous joint
typicality decoding on the received signals from all the blocks without
explicitly decoding the compression indices. A consequence of this new scheme
is that achievability is proved simply and more generally without resorting to
time expansion to extend results for acyclic networks to networks with cycles.
The noisy network coding scheme is then extended to general multi-source
networks by combining it with decoding techniques for interference channels.
For the Gaussian multicast network, noisy network coding improves the
previously established gap to the cutset bound. We also demonstrate through two
popular AWGN network examples that noisy network coding can outperform
conventional compress-forward, amplify-forward, and hash-forward schemes.
Comment: 33 pages, 4 figures, submitted to IEEE Transactions on Information Theory
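For context, the cutset bound mentioned above is the standard outer bound on the multicast capacity of a general N-node network. In its usual form (notation assumed here for illustration, not quoted from the paper),
\[
  C \;\le\; \max_{p(x_1,\dots,x_N)} \;\min_{S:\, s \in S,\ d \in S^c} I\bigl(X(S);\, Y(S^c) \,\big|\, X(S^c)\bigr),
\]
where the minimum runs over cuts S separating the source s from some destination d, and X(S), Y(S^c) collect the channel inputs on one side of the cut and the channel outputs on the other. The Gaussian multicast result above measures how far noisy network coding falls short of this bound.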
An Achievable Rate-Distortion Region for the Multiple Descriptions Problem
A multiple-descriptions (MD) coding strategy is proposed and an inner bound
to the achievable rate-distortion region is derived. The scheme utilizes linear
codes. It is shown in two different MD set-ups that the linear coding scheme
achieves a larger rate-distortion region than previously known random coding
strategies. Furthermore, it is shown via an example that the best known random
coding scheme for the set-up can be improved by including additional randomly
generated codebooks.
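As a point of comparison, the classical random-coding inner bound for two descriptions is the El Gamal-Cover region; in its standard form (recalled here for context, not taken from the abstract), a rate pair (R_1, R_2) is achievable with distortions (D_0, D_1, D_2) if there exist reconstructions \hat{X}_0, \hat{X}_1, \hat{X}_2 jointly distributed with the source X, meeting the distortion constraints, such that
\[
  R_1 > I(X; \hat{X}_1), \qquad
  R_2 > I(X; \hat{X}_2), \qquad
  R_1 + R_2 > I(X; \hat{X}_0, \hat{X}_1, \hat{X}_2) + I(\hat{X}_1; \hat{X}_2).
\]
Random-coding constructions of this type are the baseline that the linear-coding scheme above is shown to improve upon.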
Successive Wyner-Ziv Coding Scheme and its Application to the Quadratic Gaussian CEO Problem
We introduce a distributed source coding scheme called successive Wyner-Ziv
coding. We show that any point in the rate region of the quadratic Gaussian CEO
problem can be achieved via successive Wyner-Ziv coding. The concept of
successive refinement in single-source coding is generalized to the
distributed source coding scenario, which we refer to as distributed successive
refinement. For the quadratic Gaussian CEO problem, we establish a necessary
and sufficient condition for distributed successive refinement, in which the
successive Wyner-Ziv coding scheme plays an important role.
Comment: 28 pages, submitted to the IEEE Transactions on Information Theory
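As background on the Wyner-Ziv building block (a standard result stated here for context, not a claim from the paper): for a Gaussian source X with jointly Gaussian side information Y at the decoder and squared-error distortion, the Wyner-Ziv rate-distortion function coincides with the conditional rate-distortion function,
\[
  R_{\mathrm{WZ}}(D) \;=\; \frac{1}{2}\log^{+}\!\frac{\sigma^2_{X|Y}}{D},
\]
so binning against decoder side information incurs no rate loss in the quadratic Gaussian setting. Successive Wyner-Ziv coding, as the name suggests, chains this primitive stage by stage, with previously decoded compression indices acting as side information for later stages.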
Compress-and-Estimate Source Coding for a Vector Gaussian Source
We consider the remote vector source coding problem in which a vector
Gaussian source is to be estimated from noisy linear measurements. For this
problem, we derive the performance of the compress-and-estimate (CE) coding
scheme and compare it to the optimal performance. In the CE coding scheme, the
remote encoder compresses the noisy source observations so as to minimize the
local distortion measure, independently of the joint distribution between the
source and the observations. In reconstruction, the decoder estimates the
original source realization from the lossy-compressed noisy observations. For
CE coding in the Gaussian vector case, we show that, if the code rate is
below a threshold, then the CE coding scheme attains the same performance
as the optimal coding scheme. We also provide lower and upper bounds on the
performance gap above this threshold. In addition, an example with two
observations and two sources is studied to illustrate the behavior of the
performance gap.
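To make the scheme concrete, here is a minimal scalar sketch (a single source and a single observation; the setting and notation are illustrative assumptions, not the vector case analyzed above). Let X ~ N(0, \sigma_X^2) be observed as Y = X + N with N ~ N(0, \sigma_N^2) independent of X, and set \sigma_Y^2 = \sigma_X^2 + \sigma_N^2. The CE encoder compresses Y with a rate-R code that is optimal for Y itself, so E[(Y - \hat{Y})^2] = \sigma_Y^2 2^{-2R}; the decoder then forms the MMSE estimate \hat{X} = (\sigma_X^2/\sigma_Y^2)\hat{Y}. Under the usual Gaussian test channel this gives
\[
  D_{\mathrm{CE}}(R) \;=\; \frac{\sigma_X^2\,\sigma_N^2}{\sigma_Y^2} \;+\; \frac{\sigma_X^4}{\sigma_Y^2}\,2^{-2R},
\]
which in this scalar case equals the classical remote rate-distortion function, i.e. no loss relative to the optimal scheme; the vector result summarized above characterizes when this matching persists and bounds the gap when it does not.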
Coding Scheme for Negative Utterances
This document contains an abbreviated version of a coding scheme employed for the pragmatic 2-coder analysis of negation types and their felicity. It was used for the coding of negative utterances originating from human-robot dialogues gathered in the experiments described in the articles in the reference list. Some theoretical parts as well as sections on future work have been removed for space reasons. The complete scheme is contained in the author's thesis. The scheme was devised by the author, who also acted as first coder. Additionally, a second coder was employed, and those parts of the coding scheme handed to the latter as a coding manual are marked as such.
On the Capacity of Symmetric Gaussian Interference Channels with Feedback
In this paper, we propose a new coding scheme for symmetric Gaussian
interference channels with feedback based on the ideas of time-varying coding
schemes. The proposed scheme improves on the Suh-Tse and Kramer inner bounds on
the channel capacity in the cases of weak and not-very-strong interference.
This improvement is more significant when the signal-to-noise ratio (SNR) is
not very high. It is shown theoretically and numerically that our coding scheme
can outperform the Kramer code. In addition, the generalized degrees of freedom
of our proposed coding scheme equals that of the Suh-Tse scheme in the strong
interference case. Numerical results show that our coding scheme attains
better performance than the Suh-Tse coding scheme for all channel parameters.
Furthermore, the simplicity of the encoding/decoding algorithms is another
strong point of our proposed coding scheme compared with the Suh-Tse coding
scheme. More importantly, our results show that an optimal coding scheme for
the symmetric Gaussian interference channels with feedback can be achieved by
using only marginal posterior distributions under a better cooperation strategy
between transmitters.
Comment: To appear in Proc. of IEEE International Symposium on Information
Theory (ISIT), Hong Kong, June 14-19, 2015
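For reference, the generalized degrees of freedom mentioned above is the usual high-SNR metric for the symmetric Gaussian interference channel (definition recalled here in its standard form, not quoted from the paper): with the interference-to-noise ratio scaling as INR = SNR^\alpha,
\[
  d_{\mathrm{sym}}(\alpha) \;=\; \lim_{\mathrm{SNR}\to\infty} \frac{C_{\mathrm{sym}}(\mathrm{SNR}, \mathrm{SNR}^{\alpha})}{C_{\mathrm{awgn}}(\mathrm{SNR})},
\]
i.e. the per-user symmetric rate normalized by the capacity of an interference-free point-to-point AWGN link at the same SNR.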
