Capacity of The Discrete-Time Non-Coherent Memoryless Gaussian Channels at Low SNR
We address the capacity of a discrete-time memoryless Gaussian channel, where
the channel state information (CSI) is neither available at the transmitter nor
at the receiver. The optimal capacity-achieving input distribution at low
signal-to-noise ratio (SNR) is precisely characterized, and the exact capacity
of a non-coherent channel is derived. The derived relations allow a better
understanding of the capacity of non-coherent channels at low SNR. Then, we
compute the non-coherence penalty and give a more precise characterization of
the sub-linear term in SNR. Finally, in order to get more insight on how the
optimal input varies with SNR, upper and lower bounds on the non-zero mass
point location of the capacity-achieving input are given.
Comment: 5 pages and 4 figures. To appear in Proceedings of the International Symposium on Information Theory (ISIT 2008).
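The low-SNR behavior the abstract describes can be probed numerically. Below is a minimal sketch (not from the paper) that estimates the mutual information of on-off (peaky) signaling over a noncoherent memoryless Rayleigh-fading channel, assuming unit-variance fading and noise so that, given input power x², the output power is exponential with mean 1 + x². The function name and grid parameters are our own choices.

```python
import numpy as np

def onoff_mutual_info(a2, p, ymax=200.0, n=200_000):
    """Estimate I(X;Y) in nats for on-off signaling X in {0, sqrt(a2)},
    P(X != 0) = p, over a noncoherent Rayleigh channel: given input
    power x2, the output power Y is Exp with mean 1 + x2 (unit-variance
    fading and noise assumed). Illustrative sketch, not the paper's code."""
    y = np.linspace(1e-9, ymax, n)
    f0 = np.exp(-y)                        # density of Y given x2 = 0
    f1 = np.exp(-y / (1 + a2)) / (1 + a2)  # density of Y given x2 = a2
    fy = (1 - p) * f0 + p * f1             # output density
    integrand = (1 - p) * f0 * np.log(f0 / fy) + p * f1 * np.log(f1 / fy)
    return float(np.trapz(integrand, y))
```

Sweeping p downward at fixed average SNR p·a2 shows the "peaky" inputs that are characteristic of the noncoherent low-SNR regime.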
High-SNR Capacity of Wireless Communication Channels in the Noncoherent Setting: A Primer
This paper, mostly tutorial in nature, deals with the problem of
characterizing the capacity of fading channels in the high signal-to-noise
ratio (SNR) regime. We focus on the practically relevant noncoherent setting,
where neither transmitter nor receiver know the channel realizations, but both
are aware of the channel law. We present, in an intuitive and accessible form,
two tools, first proposed by Lapidoth & Moser (2003), of fundamental importance
to high-SNR capacity analysis: the duality approach and the escape-to-infinity
property of capacity-achieving distributions. Furthermore, we apply these tools
to refine some of the results that appeared previously in the literature and to
simplify the corresponding proofs.
Comment: To appear in Int. J. Electron. Commun. (AEÜ), Aug. 201
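For context, the flagship result of the Lapidoth–Moser framework this primer builds on is the double-logarithmic capacity growth of noncoherent fading channels at high SNR; as we recall the standard statement (not quoted from this abstract),

```latex
C(\mathrm{SNR}) = \log\log \mathrm{SNR} + \chi + o(1), \qquad \mathrm{SNR}\to\infty,
```

where the constant χ is the fading number of the channel; for i.i.d. Rayleigh fading it evaluates to χ = −1 − γ, with γ Euler's constant. The duality approach and the escape-to-infinity property mentioned above are precisely the tools used to pin down the χ term.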
On Marton's inner bound for broadcast channels
Marton's inner bound is the best known achievable region for a general
discrete memoryless broadcast channel. To compute Marton's inner bound one has
to solve an optimization problem over a set of joint distributions on the input
and auxiliary random variables. The optimizers turn out to be structured in
many cases. Finding properties of optimizers not only results in efficient
evaluation of the region, but it may also help one to prove factorization of
Marton's inner bound (and thus its optimality). The first part of this paper
formulates this factorization approach explicitly and states some conjectures
and results along this line. The second part of this paper focuses primarily on
the structure of the optimizers. This section is inspired by a new binary
inequality that recently resulted in a very simple characterization of the
sum-rate of Marton's inner bound for binary input broadcast channels. This
prompted us to investigate whether this inequality can be extended to larger
cardinality input alphabets. We show that several of the results for the binary
input case do carry over for higher cardinality alphabets and we present a
collection of results that help restrict the search space of probability
distributions to evaluate the boundary of Marton's inner bound in the general
case. We also prove a new inequality for the binary skew-symmetric broadcast
channel that yields a very simple characterization of the entire Marton inner
bound for this channel.
Comment: Submitted to ISIT 201
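The optimization the abstract refers to can be made concrete. A minimal sketch (our own illustration, assuming the private-message form of Marton's bound without a common part): for a fixed joint pmf p(u,v,x) and channel matrices W1, W2, evaluate the sum-rate objective I(U;Y1) + I(V;Y2) − I(U;V); the inner bound's sum-rate is the maximum of this quantity over all such pmfs. The helper names are hypothetical.

```python
import numpy as np

def mi(pxy):
    """Mutual information (nats) of a joint pmf given as a 2-D array."""
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    mask = pxy > 0
    return float(np.sum(pxy[mask] * np.log(pxy[mask] / (px @ py)[mask])))

def marton_sum_rate_term(p_uvx, W1, W2):
    """Objective I(U;Y1) + I(V;Y2) - I(U;V) for one choice of p(u,v,x);
    the true sum-rate bound maximizes this over all joint pmfs.
    p_uvx[u,v,x] is a joint pmf; W1[x,y1], W2[x,y2] are channel matrices."""
    p_uv = p_uvx.sum(axis=2)
    p_uy1 = np.einsum('uvx,xy->uy', p_uvx, W1)  # joint pmf of (U, Y1)
    p_vy2 = np.einsum('uvx,xy->vy', p_uvx, W2)  # joint pmf of (V, Y2)
    return mi(p_uy1) + mi(p_vy2) - mi(p_uv)
```

Restricting the search space of p(u,v,x), as the results in the paper do, is exactly what makes maximizing this objective tractable.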
Second-Order Coding Rates for Channels with State
We study the performance limits of state-dependent discrete memoryless
channels with a discrete state available at both the encoder and the decoder.
We establish the epsilon-capacity as well as necessary and sufficient
conditions for the strong converse property for such channels when the sequence
of channel states is not necessarily stationary, memoryless or ergodic. We then
seek a finer characterization of these capacities in terms of second-order
coding rates. The general results are supplemented by several examples
including i.i.d. and Markov states and mixed channels.
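To illustrate what a second-order characterization looks like in the simplest setting, here is a sketch of the normal approximation for a plain BSC (a stand-in of ours; the paper treats state-dependent channels and more general state processes): the best rate at blocklength n and error ε behaves like C − √(V/n)·Q⁻¹(ε), where V is the channel dispersion.

```python
import math
from statistics import NormalDist

def normal_approx_rate(n, eps, delta):
    """Normal (second-order) approximation R ≈ C - sqrt(V/n) * Qinv(eps)
    for a BSC(delta), in bits per channel use. Illustrative sketch only:
    the paper's results cover state-dependent and mixed channels."""
    h = lambda p: -p * math.log2(p) - (1 - p) * math.log2(1 - p)
    C = 1 - h(delta)                                              # capacity
    V = delta * (1 - delta) * math.log2((1 - delta) / delta) ** 2  # dispersion
    qinv = NormalDist().inv_cdf(1 - eps)                          # Q^{-1}(eps)
    return C - math.sqrt(V / n) * qinv
```

The √(V/n) backoff from capacity is the "finer characterization" that second-order coding rates capture; the strong converse corresponds to this backoff vanishing as n grows for every fixed ε.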