
    Communicating Remote Gaussian Sources over Gaussian Multiple Access Channels

    We study a multiple-terminal joint source-channel coding problem, where two remote correlated Gaussian sources are transmitted over a Gaussian multiple-access channel with two transmitters. Each transmitter observes one of the sources contaminated by Gaussian noise. The receiver wishes to reconstruct both sources. We derive necessary conditions and sufficient conditions for the receiver to be able to reconstruct the sources with given expected squared-error distortions. These conditions establish the optimality of uncoded transmission below some signal-to-noise ratio (SNR) threshold, and they also establish the high-SNR asymptotics. To achieve the latter, a coding scheme is proposed that superimposes analog uncoded transmission and digital combined source-channel Gaussian vector quantization.
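
    A minimal numerical sketch of the analog (uncoded) part of such a scheme, assuming unit-variance sources with correlation rho, Gaussian observation and channel noise, and linear MMSE reconstruction at the receiver; all parameter values are illustrative assumptions, and this does not reproduce the paper's actual construction:

        import numpy as np

        rng = np.random.default_rng(0)
        N = 100_000
        rho, P, var_w, var_z = 0.7, 1.0, 0.1, 0.2   # assumed correlation, power, noise variances

        # Two correlated unit-variance Gaussian sources
        s1 = rng.standard_normal(N)
        s2 = rho * s1 + np.sqrt(1 - rho**2) * rng.standard_normal(N)

        # Each transmitter observes its source in Gaussian noise and simply rescales it (uncoded)
        x1 = s1 + np.sqrt(var_w) * rng.standard_normal(N)
        x2 = s2 + np.sqrt(var_w) * rng.standard_normal(N)
        a = np.sqrt(P / (1.0 + var_w))              # meet the per-transmitter power constraint
        y = a * x1 + a * x2 + np.sqrt(var_z) * rng.standard_normal(N)   # MAC output

        # Linear MMSE reconstruction of each source from the single channel output
        c1 = np.mean(s1 * y) / np.mean(y * y)
        c2 = np.mean(s2 * y) / np.mean(y * y)
        print("empirical distortions:",
              np.mean((s1 - c1 * y) ** 2), np.mean((s2 - c2 * y) ** 2))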

    Erasure Multiple Descriptions

    We consider a binary erasure version of the n-channel multiple descriptions problem with symmetric descriptions, i.e., the rates of the n descriptions are the same and the distortion constraint depends only on the number of messages received. We consider the case where there is no excess rate for every k out of n descriptions. Our goal is to characterize the achievable distortions D_1, D_2, ..., D_n. We measure the fidelity of reconstruction using two distortion criteria: an average-case distortion criterion, under which distortion is measured by taking the average of the per-letter distortion over all source sequences, and a worst-case distortion criterion, under which distortion is measured by taking the maximum of the per-letter distortion over all source sequences. We present achievability schemes, based on random binning for average-case distortion and systematic MDS (maximum distance separable) codes for worst-case distortion, and prove optimality results for the corresponding achievable distortion regions. We then use the binary erasure multiple descriptions setup to propose a layered coding framework for multiple descriptions, which we then apply to vector Gaussian multiple descriptions and prove its optimality for symmetric scalar Gaussian multiple descriptions with two levels of receivers and no excess rate for the central receiver. We also prove a new outer bound for the general multi-terminal source coding problem and use it to prove an optimality result for the robust binary erasure CEO problem. For the latter, we provide a tight lower bound on the distortion for ℓ messages for any coding scheme that achieves the minimum achievable distortion for k messages, where k ≤ ℓ. Comment: 48 pages, 2 figures, submitted to IEEE Trans. Inf. Theory.
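
    The worst-case-distortion scheme is built on systematic MDS codes. As a toy illustration of the "any k of n descriptions recover the source exactly, with no excess rate" property, the sketch below uses the simplest systematic MDS code, a single parity check with (n, k) = (3, 2); it is not the paper's construction:

        # Toy (n, k) = (3, 2) systematic MDS code: two source bits plus one parity bit.
        def encode(b1: int, b2: int) -> list[int]:
            return [b1, b2, b1 ^ b2]            # three descriptions

        def decode(received: dict[int, int]) -> tuple[int, int]:
            """Recover (b1, b2) from any two descriptions, given as {index: bit}."""
            if 0 in received and 1 in received:
                return received[0], received[1]
            if 0 in received and 2 in received:
                return received[0], received[0] ^ received[2]
            if 1 in received and 2 in received:
                return received[1] ^ received[2], received[1]
            raise ValueError("need at least two descriptions for exact recovery")

        d = encode(1, 0)
        assert decode({0: d[0], 2: d[2]}) == (1, 0)   # e.g. descriptions 0 and 2 suffice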

    Achievable Rate Regions for Two-Way Relay Channel using Nested Lattice Coding

    This paper studies the Gaussian two-way relay channel, where two communication nodes exchange messages with each other via a relay. It is assumed that all nodes operate in half-duplex mode and that there is no direct link between the communication nodes. A compress-and-forward relaying strategy using nested lattice codes is first proposed. The proposed scheme is then improved by layered coding: a common layer is decoded by both receivers, and a refinement layer is recovered only by the receiver with the better channel conditions. The achievable rates of the new scheme are characterized and shown to be higher than those provided by the decode-and-forward strategy in some regions. Comment: 27 pages, 13 figures, submitted to IEEE Transactions on Wireless Communications (October 2013).
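
    The building block here is nested lattice coding. A one-dimensional sketch of the nesting, assuming a fine lattice d·Z inside a coarse (shaping) lattice (M·d)·Z with arbitrarily chosen d and M; the actual scheme relies on high-dimensional nested lattices, so this only illustrates the quantize-then-reduce-modulo-the-coarse-lattice operation:

        import numpy as np

        d, M = 0.25, 8                  # fine-lattice step and nesting ratio (arbitrary choices)
        L = M * d                       # coarse-lattice step

        def quantize_fine(x):
            return d * np.round(x / d)             # nearest point of the fine lattice d*Z

        def mod_coarse(x):
            return x - L * np.round(x / L)         # reduce into the coarse cell around 0

        y_relay = np.array([1.37, -0.52, 2.91])    # e.g. a signal observed at the relay
        q = quantize_fine(y_relay)                 # fine-lattice quantization
        msg = mod_coarse(q)                        # coset representative carries the digital part
        print(q, msg)                              # msg stays within [-L/2, L/2]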

    Communicating over Filter-and-Forward Relay Networks with Channel Output Feedback

    Relay networks aid in increasing the rate of communication from source to destination. However, the capacity of even a three-terminal relay channel is an open problem. In this work, we propose a new lower bound for the capacity of the three-terminal relay channel with destination-to-source feedback in the presence of correlated noise. Our lower bound improves on the existing bounds in the literature. We then extend our lower bound to general relay network configurations using an arbitrary number of filter-and-forward relay nodes. Such network configurations are common in many multi-hop communication systems where the intermediate nodes can only perform minimal processing due to limited computational power. Simulation results show that significant improvements in the achievable rate can be obtained through our approach. We next derive a coding strategy (optimized using the post-processed signal-to-noise ratio as a criterion) for the three-terminal relay channel with noisy channel output feedback for two transmissions. This coding scheme can be used in conjunction with open-loop codes for applications like automatic repeat request (ARQ) or hybrid-ARQ. Comment: 15 pages, 8 figures, to appear in IEEE Transactions on Signal Processing.
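
    In a filter-and-forward node, the relay passes its noisy observation through a fixed FIR filter and rescales it to meet its power budget. A small sketch that computes the resulting end-to-end response and a post-filtering SINR for a single relay hop, assuming flat source-relay and relay-destination gains, an arbitrary 3-tap relay filter, and white Gaussian noise; it ignores the direct link and the feedback path, so it is far simpler than the schemes in the paper:

        import numpy as np

        h_sr, h_rd = 0.9, 0.8                   # assumed flat channel gains
        f = np.array([1.0, 0.3, 0.1])           # assumed FIR filter applied at the relay
        P_x = P_r = 1.0                         # source and relay power budgets
        var_r, var_d = 0.1, 0.1                 # relay and destination noise variances

        # Relay output power for a white unit-power input is ||f||^2 (h_sr^2 P_x + var_r)
        alpha = np.sqrt(P_r / (np.sum(f**2) * (h_sr**2 * P_x + var_r)))

        h_eff = alpha * h_rd * h_sr * f         # end-to-end impulse response, source -> destination
        noise_var = alpha**2 * h_rd**2 * var_r * np.sum(f**2) + var_d

        # Treat the strongest tap as the desired symbol and the remaining taps as ISI
        d0 = int(np.argmax(np.abs(h_eff)))
        sinr = h_eff[d0]**2 / (np.sum(np.delete(h_eff, d0)**2) + noise_var)
        print(f"post-filtering SINR at the main tap: {10 * np.log10(sinr):.2f} dB")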

    A Lattice Coding Scheme for Secret Key Generation from Gaussian Markov Tree Sources

    In this article, we study the problem of secret key generation in the multiterminal source model, where the terminals have access to correlated Gaussian sources. We assume that the sources form a Markov chain on a tree. We give a nested lattice-based key generation scheme whose computational complexity is polynomial in the number, N, of independent and identically distributed samples observed by each source. We also compute the achievable secret key rate and give a class of examples where our scheme is optimal in the fine quantization limit. However, we also give examples that show that our scheme is not always optimal in the limit of fine quantization. Comment: 10 pages, 3 figures. A 5-page version of this article has been submitted to the 2016 IEEE International Symposium on Information Theory (ISIT).
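
    The source model is a set of jointly Gaussian terminals whose dependence structure is a Markov chain on a tree. A small sketch that samples such sources, assuming unit-variance marginals, a hypothetical 5-node tree, and illustrative edge correlations; the nested-lattice key generation scheme itself is not reproduced here:

        import numpy as np

        # Hypothetical tree: parent[i] is the parent of node i (root has parent -1),
        # rho[i] is the correlation between node i and its parent. Values are illustrative.
        parent = [-1, 0, 0, 1, 1]
        rho    = [0.0, 0.8, 0.6, 0.9, 0.7]
        N = 100_000                                  # i.i.d. samples seen by each terminal

        rng = np.random.default_rng(0)
        X = np.zeros((len(parent), N))
        for i, p in enumerate(parent):               # parents are listed before their children
            z = rng.standard_normal(N)
            if p < 0:
                X[i] = z                             # root: standard normal
            else:
                X[i] = rho[i] * X[p] + np.sqrt(1 - rho[i]**2) * z   # Markov step along the edge

        # Sanity check: along the edge (1, 3) the empirical correlation should be near rho[3]
        print(np.corrcoef(X[1], X[3])[0, 1])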

    Multi Terminal Probabilistic Compressed Sensing

    In this paper, the 'Approximate Message Passing' (AMP) algorithm, initially developed for compressed sensing of signals under i.i.d. Gaussian measurement matrices, is extended to a multi-terminal setting (the MAMP algorithm). It is shown that, similar to its single-terminal counterpart, the behavior of the MAMP algorithm is fully characterized by a 'State Evolution' (SE) equation for large block-lengths. This equation is used to obtain the rate-distortion curve of a multi-terminal memoryless source. It is observed that by spatially coupling the measurement matrices, the rate-distortion curve of the MAMP algorithm undergoes a phase transition, where the measurement rate region corresponding to a low-distortion (approximately zero distortion) regime is fully characterized by the joint and conditional Rényi information dimension (RID) of the multi-terminal source. This measurement rate region is very similar to the rate region of the Slepian-Wolf distributed source coding problem, where the RID plays a role similar to the discrete entropy. Simulations investigate the empirical behavior of the MAMP algorithm, and the results match the predictions of the SE equation very well for reasonably large block-lengths. Comment: 11 pages, 13 figures. arXiv admin note: text overlap with arXiv:1112.0708 by other authors.
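
    For reference, the single-terminal baseline that MAMP extends is the standard AMP iteration with soft thresholding, whose pseudo-data noise level is the quantity tracked by state evolution. A minimal sketch, assuming an i.i.d. Gaussian measurement matrix, a sparse Gaussian signal, and a heuristic threshold rule; the sizes and tuning constants are illustrative and this is not the multi-terminal MAMP algorithm of the paper:

        import numpy as np

        rng = np.random.default_rng(0)
        n, m, k = 1000, 400, 50                 # signal length, measurements, sparsity (assumed)
        x_true = np.zeros(n)
        x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
        A = rng.standard_normal((m, n)) / np.sqrt(m)     # i.i.d. Gaussian measurement matrix
        y = A @ x_true + 0.01 * rng.standard_normal(m)

        def soft(r, theta):                     # soft-thresholding denoiser
            return np.sign(r) * np.maximum(np.abs(r) - theta, 0.0)

        x, z = np.zeros(n), y.copy()
        alpha = 2.0                             # threshold tuning constant (heuristic)
        for t in range(30):
            r = x + A.T @ z                     # pseudo-data; SE tracks its effective noise level
            theta = alpha * np.sqrt(np.mean(z**2))
            x = soft(r, theta)
            z = y - A @ x + z * (np.count_nonzero(x) / m)   # residual with Onsager correction
        print("relative MSE:", np.mean((x - x_true)**2) / np.mean(x_true**2))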