
    Cooperative Strategies for Simultaneous and Broadcast Relay Channels

    Consider the \emph{simultaneous relay channel} (SRC), which consists of a set of relay channels where the source wishes to transmit common and private information to each of the destinations. This problem is recognized as being equivalent to that of sending common and private information to several destinations in the presence of helper relays, where each channel outcome becomes a branch of the \emph{broadcast relay channel} (BRC). Cooperative schemes and the capacity region for a set of two memoryless relay channels are investigated. The proposed coding schemes, based on \emph{Decode-and-Forward} (DF) and \emph{Compress-and-Forward} (CF), must be capable of transmitting information simultaneously to all destinations in such a set. Depending on the quality of the source-to-relay and relay-to-destination channels, inner bounds on the capacity of the general BRC are derived. Three cases of particular interest are considered: cooperation based on the DF strategy for both users (referred to as the DF-DF region), cooperation based on the CF strategy for both users (referred to as the CF-CF region), and cooperation based on the DF strategy for one destination and CF for the other (referred to as the DF-CF region). These results can be seen as a generalization, and hence a unification, of previous works. An outer bound on the capacity of the general BRC is also derived. Capacity results are obtained for the specific cases of semi-degraded and degraded Gaussian simultaneous relay channels. Rates are evaluated for Gaussian models where the source must guarantee a minimum amount of information to both users while additional information is sent to each of them.

    Comment: 32 pages, 7 figures. To appear in IEEE Trans. on Information Theory.
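    For context (a background sketch, not taken from the paper), the DF-DF, CF-CF, and DF-CF regions build on the classical single-relay achievable rates of Cover and El Gamal, which for a memoryless relay channel $p(y, y_1 \mid x, x_1)$ read:

    \[
    R_{\mathrm{DF}} = \max_{p(x, x_1)} \min\{\, I(X, X_1; Y),\; I(X; Y_1 \mid X_1) \,\},
    \]
    \[
    R_{\mathrm{CF}} = \max\, I(X; \hat{Y}_1, Y \mid X_1) \quad \text{subject to} \quad I(X_1; Y) \geq I(Y_1; \hat{Y}_1 \mid X_1, Y),
    \]

    where the CF maximization is over distributions of the form $p(x)\,p(x_1)\,p(\hat{y}_1 \mid y_1, x_1)$. The regions studied in the paper combine such strategies across the two branches of the BRC.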

    Re-proving Channel Polarization Theorems: An Extremality and Robustness Analysis

    The general subject considered in this thesis is a recently discovered coding technique, polar coding, which is used to construct a class of error-correcting codes with unique properties. In his ground-breaking work, Arıkan proved that this class of codes, called polar codes, achieves the symmetric capacity --- the mutual information evaluated at the uniform input distribution --- of any stationary binary discrete memoryless channel, with low-complexity encoders and decoders requiring on the order of $O(N\log N)$ operations in the block length $N$. This discovery settled the long-standing open problem, left by Shannon, of finding low-complexity codes achieving the channel capacity. Polar coding settled an open problem in information theory, yet opened plenty of challenging problems that need to be addressed. A significant part of this thesis is dedicated to advancing the knowledge about this technique in two directions. The first provides a better understanding of polar coding by generalizing some of the existing results and discussing their implications, and the second studies the robustness of the theory over communication models that introduce various forms of uncertainty or variations into the probabilistic model of the channel.

    Comment: Preview of my PhD Thesis, EPFL, Lausanne, 2014. For the full version, see http://people.epfl.ch/mine.alsan/publication
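    As background for the polarization phenomenon mentioned above (a minimal sketch, not taken from the thesis), the binary erasure channel is the one case where the recursion on channel qualities is exact: applying Arıkan's 2x2 transform to a BEC with erasure probability eps produces two synthetic BECs with erasure probabilities 2*eps - eps**2 and eps**2, and iterating drives almost every synthetic channel toward being either nearly perfect or nearly useless. The Python sketch below illustrates this; the function name polarize and all parameter values are illustrative.

    # Background sketch (not code from the thesis): channel polarization for a
    # binary erasure channel BEC(eps). Each application of Arikan's transform
    # maps a BEC(e) into BEC(2e - e^2) and BEC(e^2).

    def polarize(eps, levels):
        """Erasure probabilities of the 2**levels synthetic channels."""
        channels = [eps]
        for _ in range(levels):
            nxt = []
            for e in channels:
                nxt.append(2 * e - e * e)  # degraded ("minus") channel
                nxt.append(e * e)          # upgraded ("plus") channel
            channels = nxt
        return channels

    eps = 0.5
    chans = polarize(eps, levels=12)          # 4096 synthetic channels
    good = sum(1 for e in chans if e < 1e-3)  # nearly noiseless channels
    # The fraction of nearly noiseless channels approaches the symmetric
    # capacity of the BEC, which is 1 - eps, as the number of levels grows.
    print(f"fraction of nearly perfect channels: {good / len(chans):.3f}")
    print(f"symmetric capacity 1 - eps:          {1 - eps:.3f}")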

    Refined Strong Converse for the Constant Composition Codes

    A strong converse bound for constant composition codes of the form $P_{e}^{(n)} \geq 1 - A\, n^{-0.5(1-E_{sc}'(R,W,p))} e^{-n E_{sc}(R,W,p)}$ is established using the Berry-Esseen theorem through the concepts of Augustin information and Augustin mean, where $A$ is a constant determined by the channel $W$, the composition $p$, and the rate $R$, i.e., $A$ does not depend on the block length $n$.

    Comment: 7 pages
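    For background (a sketch of the standard definitions as commonly stated in the literature, not quoted from the paper, whose exact conventions may differ), the Augustin information of order $\alpha$ for an input distribution $p$ and channel $W$ is defined through the Rényi divergence:

    \[
    D_{\alpha}(W(\cdot \mid x) \,\|\, q) = \frac{1}{\alpha - 1} \ln \sum_{y} W(y \mid x)^{\alpha}\, q(y)^{1-\alpha},
    \]
    \[
    I_{\alpha}^{A}(p, W) = \inf_{q} \sum_{x} p(x)\, D_{\alpha}(W(\cdot \mid x) \,\|\, q),
    \]

    and the minimizing output distribution $q_{\alpha,p}$ is called the Augustin mean.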