
    Sparse Regression Codes for Multi-terminal Source and Channel Coding

    We study a new class of codes for Gaussian multi-terminal source and channel coding. These codes are designed using the statistical framework of high-dimensional linear regression and are called Sparse Superposition or Sparse Regression codes. Codewords are linear combinations of subsets of columns of a design matrix. These codes were recently introduced by Barron and Joseph and shown to achieve the channel capacity of AWGN channels with computationally feasible decoding. They have also recently been shown to achieve the optimal rate-distortion function for Gaussian sources. In this paper, we demonstrate how to implement random binning and superposition coding using sparse regression codes. In particular, with minimum-distance encoding/decoding it is shown that sparse regression codes attain the optimal information-theoretic limits for a variety of multi-terminal source and channel coding problems.
    Comment: 9 pages; appeared in the Proceedings of the 50th Annual Allerton Conference on Communication, Control, and Computing, 2012.
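    The encoding described above can be sketched concretely. In the following minimal Python sketch (function and variable names, and the equal power allocation, are illustrative assumptions, not taken from the paper), the design matrix is split into L sections of M columns each, and a message selects one column per section, so the codeword is a sum of L scaled columns:

    ```python
    import numpy as np

    # Minimal sketch of SPARC encoding, assuming L sections of M columns
    # and an equal power allocation; names and values are illustrative.
    def sparc_codeword(A, message, L, M, power=1.0):
        """message[l] in {0, ..., M-1} picks one column in section l."""
        n = A.shape[0]
        beta = np.zeros(L * M)
        coeff = np.sqrt(n * power / L)        # per-section coefficient
        for l, idx in enumerate(message):
            beta[l * M + idx] = coeff         # one non-zero entry per section
        return A @ beta                       # sum of L scaled columns

    rng = np.random.default_rng(0)
    n, L, M = 256, 32, 16                     # rate R = L*log2(M)/n bits/dim
    A = rng.normal(0.0, 1.0 / np.sqrt(n), size=(n, L * M))
    msg = rng.integers(0, M, size=L)
    x = sparc_codeword(A, msg, L, M)          # average power ~ `power`
    ```

    Because the codebook is fully determined by A, whose size n x LM is a low-order polynomial in the block length, the code has a compact representation even though it contains M^L codewords.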

    Unsourced Multiuser Sparse Regression Codes achieve the Symmetric MAC Capacity

    Unsourced random-access (U-RA) is a type of grant-free random access with a virtually unlimited number of users, of which only a certain number $K_a$ are active in the same time slot. Users employ exactly the same codebook, and the task of the receiver is to decode the list of transmitted messages. Recently, a concatenated coding construction for U-RA on the AWGN channel was presented, in which a sparse regression code (SPARC) is used as an inner code to create an effective outer OR-channel; an outer code is then used to resolve the multiple-access interference in the OR-MAC. In this work we show that this concatenated construction can achieve a vanishing per-user error probability, in the limit of large blocklength and a large number of active users, at sum-rates up to the symmetric Shannon capacity, i.e. as long as $K_a R < \frac{1}{2}\log_2(1 + K_a\,\mathrm{SNR})$. This extends previous point-to-point optimality results about SPARCs to the unsourced multiuser scenario. Additionally, we calculate the algorithmic threshold, that is, a bound on the sum-rate up to which the inner decoding can be done reliably with the low-complexity AMP algorithm.
    Comment: 7 pages; submitted to ISIT 2020. arXiv admin note: substantial text overlap with arXiv:1901.0623
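    As a quick illustration of the achievability condition quoted above, here is a hedged one-line check (the function name and the example numbers are hypothetical, not from the paper):

    ```python
    import math

    # Check the condition K_a * R < (1/2) * log2(1 + K_a * SNR) for a
    # per-user rate R; illustrative sketch only.
    def within_symmetric_capacity(num_active, per_user_rate, snr):
        sum_rate = num_active * per_user_rate
        capacity = 0.5 * math.log2(1.0 + num_active * snr)
        return sum_rate < capacity

    # 100 active users at 0.025 bits/channel use each, per-user SNR 0.5:
    # sum-rate 2.5 vs. capacity 0.5*log2(51) ~ 2.84, so the condition holds.
    print(within_symmetric_capacity(100, 0.025, 0.5))  # True
    ```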

    Spatially Coupled Sparse Regression Codes: Design and State Evolution Analysis

    We consider the design and analysis of spatially coupled sparse regression codes (SC-SPARCs), which were recently introduced by Barbier et al. for efficient communication over the additive white Gaussian noise channel. SC-SPARCs can be efficiently decoded using an Approximate Message Passing (AMP) decoder, whose performance in each iteration can be predicted via a set of equations called state evolution. In this paper, we give an asymptotic characterization of the state evolution equations for SC-SPARCs. For any given base matrix (which defines the coupling structure of the SC-SPARC) and rate, this characterization can be used to predict whether or not AMP decoding will succeed in the large system limit. We then consider a simple base matrix defined by two parameters $(\omega, \Lambda)$, and show that AMP decoding succeeds in the large system limit for all rates $R < \mathcal{C}$. The asymptotic result also indicates how the parameters of the base matrix affect the decoding progression. Simulation results are presented to evaluate the performance of SC-SPARCs defined with the proposed base matrix.
    Comment: 8 pages, 6 figures. A shorter version of this paper to appear in ISIT 201
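    The following is a schematic sketch of such a state evolution recursion for an $(\omega, \Lambda)$ base matrix, assuming uniform coupling weights $1/\omega$ and a simple scalar function standing in for the paper's section-wise MMSE map; it illustrates the bookkeeping of coupled state evolution, not the exact equations of the paper:

    ```python
    import numpy as np

    def base_matrix(omega, Lam):
        """(omega, Lambda) base matrix: Lam column blocks, Lam+omega-1 row
        blocks, each column coupled to omega consecutive row blocks."""
        W = np.zeros((Lam + omega - 1, Lam))
        for c in range(Lam):
            W[c:c + omega, c] = 1.0 / omega   # uniform coupling weights
        return W

    def state_evolution(W, snr, num_iters=300,
                        mmse=lambda s: 1.0 / (1.0 + s)):
        """Track per-column-block undecoded variance across iterations."""
        psi = np.ones(W.shape[1])             # start fully undecoded
        for _ in range(num_iters):
            tau = 1.0 / snr + W @ psi         # effective noise per row block
            eff_snr = W.T @ (1.0 / tau)       # effective SNR per column block
            psi = mmse(eff_snr)               # stand-in for the true MMSE map
        return psi                            # small values => decoding succeeds

    W = base_matrix(omega=6, Lam=32)
    print(state_evolution(W, snr=15.0).round(3))
    ```

    Even with the crude placeholder for the MMSE map, the recursion shows the characteristic behavior of coupling: the edge column blocks see less interference, decode first, and pull the interior blocks along.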

    Lossy Compression via Sparse Linear Regression: Performance under Minimum-distance Encoding

    We study a new class of codes for lossy compression with the squared-error distortion criterion, designed using the statistical framework of high-dimensional linear regression. Codewords are linear combinations of subsets of columns of a design matrix. Called a Sparse Superposition or Sparse Regression codebook, this structure is motivated by an analogous construction proposed recently by Barron and Joseph for communication over an AWGN channel. For i.i.d. Gaussian sources and minimum-distance encoding, we show that such a code can attain the Shannon rate-distortion function with the optimal error exponent, for all distortions below a specified value. It is also shown that sparse regression codes are robust in the following sense: a codebook designed to compress an i.i.d. Gaussian source of variance $\sigma^2$ with (squared-error) distortion $D$ can compress any ergodic source of variance less than $\sigma^2$ to within distortion $D$. Thus the sparse regression ensemble retains many of the good covering properties of the i.i.d. random Gaussian ensemble, while having a compact representation in terms of a matrix whose size is a low-order polynomial in the block-length.
    Comment: This version corrects a typo in the statement of Theorem 2 of the published paper.
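    For reference, the Shannon rate-distortion function attained here is $R(D) = \frac{1}{2}\log_2(\sigma^2/D)$ for $0 < D < \sigma^2$. A small sketch (assuming $D > 0$; the function name is illustrative):

    ```python
    import math

    # Gaussian rate-distortion function R(D) = (1/2) log2(sigma^2 / D),
    # valid for 0 < D < sigma^2, and zero for D >= sigma^2.
    def gaussian_rate_distortion(sigma2, D):
        if D >= sigma2:
            return 0.0                # describing zero suffices at this distortion
        return 0.5 * math.log2(sigma2 / D)

    print(gaussian_rate_distortion(1.0, 0.25))  # 1.0 bit per source symbol
    ```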