Multiaccess Channels with State Known to Some Encoders and Independent Messages
We consider a state-dependent multiaccess channel (MAC) with state
non-causally known to some encoders. We derive an inner bound for the capacity
region in the general discrete memoryless case and specialize to a binary
noiseless case. In the case of maximum entropy channel state, we obtain the
capacity region for binary noiseless MAC with one informed encoder by deriving
a non-trivial outer bound for this case. For a Gaussian state-dependent MAC
with one encoder being informed of the channel state, we present an inner bound
by applying a slightly generalized dirty paper coding (GDPC) at the informed
encoder that allows for partial state cancellation, and a trivial outer bound
by providing the channel state to the decoder as well. The uninformed encoders
benefit from the state cancellation in terms of achievable rates; however, it
appears that GDPC cannot completely eliminate the effect of the channel state
on the achievable rate region, in contrast to the case in which all encoders
are informed. In the case of infinite state variance, we use the inner bound to
analyze how the uninformed encoder benefits from the informed encoder's
actions, and we also provide a non-trivial outer bound for this case that is
tighter than the trivial outer bound.
Comment: Accepted to EURASIP Journal on Wireless Communications and Networking,
Feb. 200
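The reason dirty paper coding removes the state penalty at an informed encoder can be illustrated numerically with Costa's achievable rate for the single-user Gaussian channel Y = X + S + Z as a function of the inflation parameter alpha. This is a minimal sketch of the classical DPC trade-off only (the paper's GDPC and multiaccess setting add further parameters); the power values P, Q, N below are illustrative, not taken from the paper.

```python
import math

def dpc_rate(alpha, P, Q, N):
    """Costa's achievable rate (bits/use) for Y = X + S + Z with signal
    power P, state power Q, noise power N, and auxiliary U = X + alpha*S.
    alpha = 0 treats the state as extra noise; alpha = P/(P+N) is
    Costa's optimal choice."""
    num = P * (P + Q + N)
    den = P * Q * (1 - alpha) ** 2 + N * (P + alpha ** 2 * Q)
    return 0.5 * math.log2(num / den)

P, Q, N = 10.0, 5.0, 1.0           # illustrative powers
alpha_star = P / (P + N)

r_naive = dpc_rate(0.0, P, Q, N)         # state treated as noise
r_dpc = dpc_rate(alpha_star, P, Q, N)    # full dirty paper coding
r_no_state = 0.5 * math.log2(1 + P / N)  # state-free AWGN capacity

# With the optimal alpha the state penalty vanishes entirely,
# whereas ignoring the state costs rate:
assert abs(r_dpc - r_no_state) < 1e-9
assert r_naive < r_dpc
```

Intermediate alpha values give the kind of partial state cancellation that GDPC exploits; in the single-user case above they are suboptimal, which is consistent with the abstract's observation that partial cancellation helps the *uninformed* users rather than the informed one.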
The Capacity of the Quantum Multiple Access Channel
We define classical-quantum multiway channels for transmission of classical
information, after recent work by Allahverdyan and Saakian. Bounds on the
capacity region are derived in a uniform way, which are analogous to the
classically known ones, simply replacing Shannon entropy with von Neumann
entropy. For the single receiver case (multiple access channel) the exact
capacity region is determined. These results are applied to the case of noisy
channels with arbitrary input signal states. A second contribution of this
work is a calculus of quantum information quantities, based on the algebraic
formulation of quantum theory.
Comment: 7 pages, requires IEEEtran2e.cl
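The substitution of von Neumann entropy for Shannon entropy mentioned above can be made concrete numerically. The sketch below computes S(rho) = -Tr[rho log2 rho] from eigenvalues and evaluates a Holevo-style upper bound for a toy classical-quantum channel; the example states are illustrative and not taken from the paper.

```python
import numpy as np

def von_neumann_entropy(rho, tol=1e-12):
    """S(rho) = -Tr[rho log2 rho] for a density matrix rho
    (Hermitian, positive semidefinite, unit trace)."""
    eigs = np.linalg.eigvalsh(rho)
    eigs = eigs[eigs > tol]  # 0 * log 0 = 0 by convention
    return float(-np.sum(eigs * np.log2(eigs)))

# A pure state carries no entropy; the maximally mixed qubit
# carries one full bit.
pure = np.array([[1.0, 0.0], [0.0, 0.0]])   # |0><0|
mixed = np.eye(2) / 2
assert abs(von_neumann_entropy(pure)) < 1e-9
assert abs(von_neumann_entropy(mixed) - 1.0) < 1e-9

# Holevo-style bound for two equiprobable pure signal states
# |0> and |+>: chi = S(average state) - average S(state), and the
# second term vanishes here because both states are pure.
plus = np.array([[0.5, 0.5], [0.5, 0.5]])   # |+><+|
avg = 0.5 * pure + 0.5 * plus
chi = von_neumann_entropy(avg)
assert 0.0 < chi < 1.0  # non-orthogonal states: strictly under 1 bit
```

The capacity-region bounds described in the abstract have exactly this shape: the classical mutual-information expressions with Shannon entropies replaced by von Neumann entropies of the relevant signal-state ensembles.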
Asymptotic Estimates in Information Theory with Non-Vanishing Error Probabilities
This monograph presents a unified treatment of single- and multi-user
problems in Shannon's information theory where we depart from the requirement
that the error probability decays asymptotically in the blocklength. Instead,
the error probabilities for various problems are bounded above by a
non-vanishing constant and the spotlight is shone on achievable coding rates as
functions of the growing blocklengths. This represents the study of asymptotic
estimates with non-vanishing error probabilities.
In Part I, after reviewing the fundamentals of information theory, we discuss
Strassen's seminal result for binary hypothesis testing where the type-I error
probability is non-vanishing and the rate of decay of the type-II error
probability with growing number of independent observations is characterized.
In Part II, we use this basic hypothesis testing result to develop second- and
sometimes, even third-order asymptotic expansions for point-to-point
communication. Finally, in Part III, we consider network information theory
problems for which the second-order asymptotics are known. These problems
include some classes of channels with random state, the multiple-encoder
distributed lossless source coding (Slepian-Wolf) problem and special cases of
the Gaussian interference and multiple-access channels. We close by discussing
avenues for further research.
Comment: Further comments welcome
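The flavor of these fixed-error asymptotics can be seen in the normal approximation for a binary symmetric channel: the best rate at blocklength n and error probability eps behaves as C - sqrt(V/n) * Qinv(eps) + O((log n)/n), where V is the channel dispersion. The sketch below evaluates the second-order term only, using only the standard library; the crossover probability and blocklengths are illustrative.

```python
import math
from statistics import NormalDist

def bsc_capacity(p):
    """C = 1 - h(p) bits/use for a BSC with crossover probability p."""
    h = -p * math.log2(p) - (1 - p) * math.log2(1 - p)
    return 1.0 - h

def bsc_dispersion(p):
    """V = p(1-p) * (log2((1-p)/p))^2, the variance of the
    information density of the BSC."""
    return p * (1 - p) * math.log2((1 - p) / p) ** 2

def normal_approximation(p, n, eps):
    """Second-order estimate of the best rate at blocklength n and
    error probability eps (third-order (log n)/(2n) term omitted)."""
    q_inv = NormalDist().inv_cdf(1 - eps)  # Q^{-1}(eps)
    return bsc_capacity(p) - math.sqrt(bsc_dispersion(p) / n) * q_inv

p, eps = 0.11, 1e-3
C = bsc_capacity(p)
rates = [normal_approximation(p, n, eps) for n in (500, 2000, 8000)]

# Non-vanishing eps < 1/2 still forces a rate backoff from capacity,
# but the backoff shrinks like 1/sqrt(n):
assert all(r < C for r in rates)
assert rates[0] < rates[1] < rates[2]
```

The point of the monograph's fixed-error viewpoint is precisely this sqrt(V/n) backoff: unlike in the vanishing-error regime, capacity alone no longer summarizes finite-blocklength performance.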
Compute-and-Forward: Harnessing Interference through Structured Codes
Interference is usually viewed as an obstacle to communication in wireless
networks. This paper proposes a new strategy, compute-and-forward, that
exploits interference to obtain significantly higher rates between users in a
network. The key idea is that relays should decode linear functions of
transmitted messages according to their observed channel coefficients rather
than ignoring the interference as noise. After decoding these linear equations,
the relays simply send them towards the destinations, which given enough
equations, can recover their desired messages. The underlying codes are based
on nested lattices whose algebraic structure ensures that integer combinations
of codewords can be decoded reliably. Encoders map messages from a finite field
to a lattice and decoders recover equations of lattice points which are then
mapped back to equations over the finite field. This scheme is applicable even
if the transmitters lack channel state information.
Comment: IEEE Trans. Info Theory, to appear. 23 pages, 13 figures
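The finite-field layer of compute-and-forward — relays forwarding linear equations of messages, a destination solving them — can be sketched without the lattice machinery. In the sketch below the noisy lattice-decoding step is abstracted away (each relay is simply assumed to have decoded its integer combination reliably), and the field size, messages, and coefficient vectors are arbitrary illustrative choices.

```python
# Messages live in the prime field F_p; each relay is assumed to have
# reliably decoded one integer linear combination of them (the role
# the nested lattice code plays in the actual scheme).
p = 11
w = [3, 7]  # the two transmitters' messages in F_p

# Coefficient vectors the two relays decoded (in the real scheme,
# chosen to match their channel gains); together they must form an
# invertible matrix mod p.
A = [[1, 1],
     [1, 2]]

# Equations forwarded to the destination: u = A w (mod p).
u = [(A[i][0] * w[0] + A[i][1] * w[1]) % p for i in range(2)]

# The destination inverts the 2x2 system over F_p using the
# adjugate and a modular inverse of the determinant.
det = (A[0][0] * A[1][1] - A[0][1] * A[1][0]) % p
det_inv = pow(det, -1, p)  # modular inverse (Python 3.8+)
recovered = [
    (det_inv * (A[1][1] * u[0] - A[0][1] * u[1])) % p,
    (det_inv * (A[0][0] * u[1] - A[1][0] * u[0])) % p,
]

assert recovered == w  # both messages recovered from the equations
```

This mirrors the abstract's pipeline: lattice decoding turns interference into reliable integer equations, and everything after that is linear algebra over the finite field.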