
    On the Multiple Access Channel with Asymmetric Noisy State Information at the Encoders

    We consider the problem of reliable communication over multiple-access channels (MAC) where the channel is driven by an independent and identically distributed state process and the encoders and the decoder are provided with various degrees of asymmetric noisy channel state information (CSI). For the case where the encoders observe causal, asymmetric noisy CSI and the decoder observes complete CSI, we provide inner and outer bounds on the capacity region, which are tight for the sum-rate capacity. We then observe that, under a Markov assumption, similar capacity results also hold when the receiver observes noisy CSI. Furthermore, we provide a single-letter characterization of the capacity region when the CSI at the encoders consists of asymmetric deterministic functions of the CSI at the decoder and the encoders have non-causal noisy CSI (the causal version was recently solved in \cite{como-yuksel}). When the encoders observe asymmetric noisy CSI with asymmetric delays and the decoder observes complete CSI, we provide a single-letter characterization of the capacity region. Finally, we consider a cooperative scenario with common and private messages, with asymmetric noisy CSI at the encoders and complete CSI at the decoder, and provide a single-letter expression for the capacity region of such channels. For the cooperative scenario, we also note that as soon as the common-message encoder does not have access to CSI, a single-letter characterization of the capacity region can be obtained in any noisy setup, covering the cases of no CSI or noisy CSI at the decoder. The main component in these results is a generalization of a converse coding approach, recently introduced in [1] for the MAC with asymmetric quantized CSI at the encoders, which is considerably extended and adapted here for the noisy CSI setup. Comment: Submitted to the IEEE Transactions on Information Theory.
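    For orientation, achievable regions of this type for the causal case are usually stated in terms of Shannon strategies. The display below is only a generic sketch of that form, assuming hypothetical strategy variables $T_1, T_2$ (each mapping an encoder's noisy CSI $S_1, S_2$ to a channel input) and complete CSI $S$ at the decoder; it is not quoted from the paper.

$$
\begin{aligned}
R_1 &\le I(T_1; Y \mid T_2, S),\\
R_2 &\le I(T_2; Y \mid T_1, S),\\
R_1 + R_2 &\le I(T_1, T_2; Y \mid S),
\end{aligned}
\qquad X_1 = T_1(S_1),\quad X_2 = T_2(S_2),
$$

    with the union taken over independent distributions on $(T_1, T_2)$.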

    On Cooperative Multiple Access Channels with Delayed CSI at Transmitters

    We consider a cooperative two-user multiaccess channel in which the transmission is controlled by a random state. Both encoders transmit a common message, and one of the encoders also transmits an individual message. We study the capacity region of this communication model for different degrees of availability of the states at the encoders, causally or strictly causally. In the case in which the states are revealed causally to both encoders but not to the decoder, we find an explicit characterization of the capacity region in the discrete memoryless case. In the case in which the states are revealed only strictly causally to both encoders, we establish inner and outer bounds on the capacity region. The outer bound is non-trivial and has a relatively simple form; it has the advantage of involving only one auxiliary random variable. We then introduce a class of cooperative multiaccess channels with states known strictly causally at both encoders for which the inner and outer bounds agree, and so we characterize the capacity region for this class. In this class of channels, the state can be obtained as a deterministic function of the channel inputs and output. We also study the model in which the states are revealed, strictly causally, in an asymmetric manner to only one encoder. Throughout the paper, we discuss a number of examples and compute the capacity region of some of them. The results shed more light on the utility of delayed channel state information for increasing the capacity region of state-dependent cooperative multiaccess channels, and tie in with recent progress in this framework. Comment: 54 pages. To appear in IEEE Transactions on Information Theory. arXiv admin note: substantial text overlap with arXiv:1201.327
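    For context, explicit characterizations of this kind in the causal case are typically written as a superposition of Shannon strategies, with one auxiliary carrying the common message and another the individual message. The display below sketches only that generic shape, with hypothetical auxiliaries $U$ (common layer) and $V$ (individual layer); it should not be read as the exact region established in the paper.

$$
\begin{aligned}
R_1 &\le I(V; Y \mid U),\\
R_c + R_1 &\le I(U, V; Y),
\end{aligned}
\qquad X_1 = f_1(U, V, S),\quad X_2 = f_2(U, S),
$$

    with the union taken over joint laws $p(u, v)$ chosen independently of the state $S$.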

    Integer-Forcing Source Coding

    Integer-Forcing (IF) is a new framework, based on compute-and-forward, for decoding multiple integer linear combinations from the output of a Gaussian multiple-input multiple-output channel. This work applies the IF approach to arrive at a new low-complexity scheme, IF source coding, for distributed lossy compression of correlated Gaussian sources under a minimum mean squared error distortion measure. All encoders use the same nested lattice codebook. Each encoder quantizes its observation using the fine lattice as a quantizer and reduces the result modulo the coarse lattice, which plays the role of binning. Rather than directly recovering the individual quantized signals, the decoder first recovers a full-rank set of judiciously chosen integer linear combinations of the quantized signals, and then inverts it. In general, the linear combinations have smaller average powers than the original signals. This makes it possible to increase the density of the coarse lattice, which in turn translates to smaller compression rates. We also propose and analyze a one-shot version of IF source coding that is simple enough to potentially lead to a new design principle for analog-to-digital converters that can exploit spatial correlations between the sampled signals. Comment: Submitted to IEEE Transactions on Information Theory.
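    To make the encode/decode flow concrete, here is a minimal scalar (one-dimensional nested lattice) sketch of the quantize / modulo / recover-combinations / invert pipeline. The step sizes `q` and `Q`, the integer matrix `A`, and the toy correlation model are illustrative assumptions rather than parameters from the paper, and the sketch omits dithering and the optimization of `A`.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical scalar nested lattices: fine lattice q*Z (quantizer),
# coarse lattice Q*Z (binning), with nesting ratio L = Q/q.
q = 0.25
L = 32
Q = L * q

def centered_mod(v, m):
    """Reduce v into the fundamental cell [-m/2, m/2) of the lattice m*Z."""
    return v - m * np.round(v / m)

# Two highly correlated Gaussian sources (toy correlation model).
n = 10_000
z = rng.standard_normal(n)
x = np.vstack([z + 0.1 * rng.standard_normal(n),
               z + 0.1 * rng.standard_normal(n)])      # shape (2, n)

# Encoders: quantize with the fine lattice, then reduce modulo the coarse one.
xq = q * np.round(x / q)              # quantized signals (fine-lattice points)
v = centered_mod(xq, Q)               # what each encoder sends (coarse cosets)

# Decoder: recover a full-rank set of integer linear combinations of the
# quantized signals, then invert.  A is a hand-picked example; in the scheme
# it is chosen so that the combinations have small average power.
A = np.array([[1, -1],
              [0,  1]])

# The modulo operation commutes with integer combinations, so this equals
# A @ xq whenever each combination stays inside the coarse cell (< Q/2).
combos = centered_mod(A @ v, Q)
xq_hat = np.linalg.inv(A) @ combos

print("fraction of samples recovered exactly:",
      np.mean(np.all(np.isclose(xq_hat, xq), axis=0)))
```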

    Multiaccess Channels with State Known to Some Encoders and Independent Messages

    We consider a state-dependent multiaccess channel (MAC) with state non-causally known to some encoders. We derive an inner bound on the capacity region in the general discrete memoryless case and specialize it to a binary noiseless case. In the case of a maximum-entropy channel state, we obtain the capacity region for the binary noiseless MAC with one informed encoder by deriving a non-trivial outer bound for this case. For a Gaussian state-dependent MAC with one encoder informed of the channel state, we present an inner bound by applying a slightly generalized dirty paper coding (GDPC) at the informed encoder that allows for partial state cancellation, and a trivial outer bound obtained by also providing the channel state to the decoder. The uninformed encoders benefit from the state cancellation in terms of achievable rates; however, it appears that GDPC cannot completely eliminate the effect of the channel state on the achievable rate region, in contrast to the case where all encoders are informed. In the case of infinite state variance, we analyze how the uninformed encoder benefits from the informed encoder's actions using the inner bound, and we also provide a non-trivial outer bound for this case which is better than the trivial outer bound. Comment: Accepted to EURASIP Journal on Wireless Communications and Networking, Feb. 200
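    To fix ideas about what "partial state cancellation" adds to plain dirty paper coding in the Gaussian case, one common way to parameterize such a scheme is sketched below. The split parameter $\beta$, the inflation factor $\alpha$, and the power notation are illustrative assumptions; the paper's exact parameterization may differ.

$$
X_1 = \underbrace{-\beta S}_{\text{partial cancellation}} + X_1', \qquad
U = X_1' + \alpha\,(1-\beta) S, \qquad
\mathbb{E}\big[X_1'^2\big] = P_1 - \beta^2 \sigma_S^2,
$$

    so that $\beta = 0$ recovers standard dirty paper coding against the full state, while $\beta > 0$ spends part of the informed encoder's power on cancelling the state that the uninformed encoders must contend with.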

    Distributed Joint Source-Channel Coding in Wireless Sensor Networks

    Given that sensors are energy-limited and that wireless channel conditions in wireless sensor networks are unreliable, there is a pressing need for a low-complexity coding method with a high compression ratio and resistance to channel noise. This paper reviews the progress made in distributed joint source-channel coding, which can address this issue. The main existing approaches, from theory to practice, to distributed joint source-channel coding over independent channels, multiple-access channels, and broadcast channels are introduced in turn. We also present a practical scheme for compressing multiple correlated sources over independent channels. Simulation results demonstrate the desired efficiency.
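    As a toy illustration of the binning idea underlying many practical distributed-compression schemes for correlated sources, the sketch below compresses one of two correlated binary blocks down to its (7,4) Hamming syndrome and recovers it at a decoder that holds the other block as side information. This is a generic Slepian-Wolf-style example under assumed parameters (correlation noise of weight one per block), not the scheme proposed in the paper, and it covers only the source-coding half of joint source-channel coding.

```python
import numpy as np

rng = np.random.default_rng(1)

# Parity-check matrix of the (7,4) Hamming code: column j (1-indexed) is the
# binary expansion of j, so a weight-1 difference at position j has syndrome j.
H = np.array([[0, 0, 0, 1, 1, 1, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [1, 0, 1, 0, 1, 0, 1]])

def syndrome(word):
    return H @ word % 2

# Correlated binary sources: y differs from x in exactly one random position.
x = rng.integers(0, 2, size=7)
e = np.zeros(7, dtype=int)
e[rng.integers(0, 7)] = 1
y = (x + e) % 2

# Encoder for y: transmit only the 3-bit syndrome instead of all 7 bits.
s_y = syndrome(y)

# Decoder: uses x as side information.  The syndrome of the difference pattern
# x XOR y equals s_y XOR syndrome(x); for weight-<=1 patterns it is simply the
# index of the flipped position (0 means x and y already agree).
s_e = (s_y + syndrome(x)) % 2
pos = 4 * int(s_e[0]) + 2 * int(s_e[1]) + int(s_e[2])
y_hat = x.copy()
if pos > 0:
    y_hat[pos - 1] ^= 1

assert np.array_equal(y_hat, y)
print("recovered y from 3 bits plus side information:", y_hat)
```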