Sparse Regression Codes for Multi-terminal Source and Channel Coding
We study a new class of codes for Gaussian multi-terminal source and channel
coding. These codes are designed using the statistical framework of
high-dimensional linear regression and are called Sparse Superposition or
Sparse Regression codes. Codewords are linear combinations of subsets of
columns of a design matrix. These codes were recently introduced by Barron and
Joseph and shown to achieve the channel capacity of AWGN channels with
computationally feasible decoding. They have also recently been shown to
achieve the optimal rate-distortion function for Gaussian sources. In this
paper, we demonstrate how to implement random binning and superposition coding
using sparse regression codes. In particular, with minimum-distance
encoding/decoding it is shown that sparse regression codes attain the optimal
information-theoretic limits for a variety of multi-terminal source and channel
coding problems.
Comment: 9 pages, appeared in the Proceedings of the 50th Annual Allerton Conference on Communication, Control, and Computing, 2012
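The codeword structure described above (a linear combination of one column chosen from each section of a design matrix) can be sketched as follows; the parameter values and the helper name `sparc_encode` are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative sizes: L sections of M columns each, codeword length n
L, M, n = 4, 8, 32
A = rng.normal(0, 1 / np.sqrt(n), (n, L * M))   # Gaussian design matrix

def sparc_encode(A, msg, L, M, P=1.0):
    """Map a message (one column index per section) to a codeword A @ beta,
    where beta has exactly one nonzero entry per section."""
    n = A.shape[0]
    beta = np.zeros(L * M)
    beta[np.arange(L) * M + np.asarray(msg)] = np.sqrt(n * P / L)
    return A @ beta

msg = [3, 0, 7, 5]          # carries L * log2(M) = 12 bits over n = 32 uses
x = sparc_encode(A, msg, L, M)
print(x.shape)              # (32,)
```

The rate is L log2(M) / n bits per channel use, controlled by how the sizes L, M, n scale together.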
Capacity-achieving Sparse Regression Codes via approximate message passing decoding
Sparse superposition codes were recently introduced by Barron and Joseph for reliable communication over the AWGN channel at rates approaching the channel capacity. In these codes, the codewords are sparse linear combinations of columns of a design matrix. In this paper, we propose an approximate message passing decoder for sparse superposition codes. The complexity of the decoder scales linearly with the size of the design matrix. The performance of the decoder is rigorously analyzed and it is shown to asymptotically achieve the AWGN capacity. We also provide simulation results to demonstrate the performance of the decoder at finite block lengths, and introduce a power allocation that significantly improves the empirical performance.
RV would like to acknowledge support from a Marie Curie Career Integration Grant (GA Number 631489). AG is supported by an EPSRC Doctoral Training Award.
This is the author accepted manuscript. The final version is available from IEEE via http://dx.doi.org/10.1109/ISIT.2015.728280
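As a rough illustration of the kind of decoder described above, here is a minimal AMP iteration for a SPARC with flat power allocation, using the section-wise softmax (posterior-mean) denoiser and an Onsager correction term. All parameter values are toy choices, and the paper's actual decoder uses a carefully designed non-uniform power allocation and a rigorous state evolution analysis:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy SPARC: L sections of M columns, codeword length n, flat power allocation
L, M, n = 8, 16, 256
P, sigma2 = 2.0, 0.1
N = L * M

A = rng.normal(0, 1 / np.sqrt(n), (n, N))   # i.i.d. Gaussian design matrix
c = np.sqrt(n * P / L)                      # nonzero coefficient per section

# Encode: one column index per section, then transmit over the AWGN channel
msg = rng.integers(0, M, size=L)
beta = np.zeros(N)
beta[np.arange(L) * M + msg] = c
y = A @ beta + rng.normal(0, np.sqrt(sigma2), n)

# AMP decoding: effective-noise estimate, softmax denoiser, Onsager term
beta_t = np.zeros(N)
z = y.copy()
for _ in range(25):
    tau2 = np.linalg.norm(z) ** 2 / n         # estimate effective noise variance
    s = (beta_t + A.T @ z).reshape(L, M) * c / tau2
    s -= s.max(axis=1, keepdims=True)         # stabilise the softmax
    w = np.exp(s)
    beta_t = (c * w / w.sum(axis=1, keepdims=True)).ravel()
    # Residual with Onsager correction (vanishes as ||beta_t||^2 -> n*P)
    z = y - A @ beta_t + (z / tau2) * (P - np.linalg.norm(beta_t) ** 2 / n)

decoded = beta_t.reshape(L, M).argmax(axis=1)
print(np.array_equal(decoded, msg))
```

Each step costs one multiplication by A and one by its transpose, which is the sense in which the complexity scales linearly with the size of the design matrix.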
Unsourced Multiuser Sparse Regression Codes achieve the Symmetric MAC Capacity
Unsourced random-access (U-RA) is a type of grant-free random access with a
virtually unlimited number of users, of which only a certain number are
active in the same time slot. Users employ exactly the same codebook, and the
task of the receiver is to decode the list of transmitted messages. Recently a
concatenated coding construction for U-RA on the AWGN channel was presented, in
which a sparse regression code (SPARC) is used as an inner code to create an
effective outer OR-channel. Then an outer code is used to resolve the
multiple-access interference in the OR-MAC. In this work we show that this
concatenated construction can achieve a vanishing per-user error probability in
the limit of large blocklength and a large number of active users at sum-rates
up to the symmetric Shannon capacity, i.e. as long as K_a R < 0.5 log2(1 + K_a SNR). This extends previous point-to-point optimality results
about SPARCs to the unsourced multiuser scenario. Additionally, we calculate
the algorithmic threshold, that is a bound on the sum-rate up to which the
inner decoding can be done reliably with the low-complexity AMP algorithm.
Comment: 7 pages, submitted to ISIT 2020. arXiv admin note: substantial text overlap with arXiv:1901.0623
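The achievability condition quoted above is straightforward to evaluate; this small helper (with illustrative names) checks the sum-rate K_a·R against the symmetric Shannon sum-capacity 0.5·log2(1 + K_a·SNR):

```python
import math

def symmetric_mac_sum_capacity(K_a, snr):
    """Symmetric Shannon sum-capacity bound 0.5 * log2(1 + K_a * SNR)."""
    return 0.5 * math.log2(1 + K_a * snr)

def achievable(K_a, R, snr):
    """Check the condition K_a * R < 0.5 * log2(1 + K_a * SNR)."""
    return K_a * R < symmetric_mac_sum_capacity(K_a, snr)

# 100 active users at per-user rate 0.001: sum-rate 0.1 vs capacity ~4.98
print(achievable(100, 0.001, 10))   # True
```

Note that the bound grows only logarithmically in the number of active users, so the per-user rate must shrink as K_a grows.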
Spatially Coupled Sparse Regression Codes: Design and State Evolution Analysis
We consider the design and analysis of spatially coupled sparse regression
codes (SC-SPARCs), which were recently introduced by Barbier et al. for
efficient communication over the additive white Gaussian noise channel.
SC-SPARCs can be efficiently decoded using an Approximate Message Passing (AMP)
decoder, whose performance in each iteration can be predicted via a set of
equations called state evolution. In this paper, we give an asymptotic
characterization of the state evolution equations for SC-SPARCs. For any given
base matrix (that defines the coupling structure of the SC-SPARC) and rate,
this characterization can be used to predict whether or not AMP decoding will
succeed in the large system limit. We then consider a simple base matrix
defined by two parameters, and show that AMP decoding succeeds in the large system limit for all rates below the channel capacity. The
asymptotic result also indicates how the parameters of the base matrix affect
the decoding progression. Simulation results are presented to evaluate the
performance of SC-SPARCs defined with the proposed base matrix.
Comment: 8 pages, 6 figures. A shorter version of this paper to appear in ISIT 201
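The band-diagonal coupling structure discussed above can be illustrated with a small base-matrix builder. The (omega, Lambda) parameterisation and the normalisation used here are one common convention and may differ in detail from the paper's:

```python
import numpy as np

def base_matrix(omega, Lam):
    """Band-diagonal (omega, Lambda) base matrix: column c couples rows
    c .. c + omega - 1. Nonzero entries are scaled so that the average
    entry equals 1 (one common convention; exact scaling varies)."""
    R = Lam + omega - 1            # number of row blocks
    W = np.zeros((R, Lam))
    for c in range(Lam):
        W[c:c + omega, c] = R / omega
    return W

W = base_matrix(3, 6)
print(W.shape)            # (8, 6)
print((W > 0).sum(0))     # each column has omega = 3 nonzero entries
```

Each entry of W sets the variance of one block of the design matrix, so the band structure is what gives SC-SPARCs their block-wise band-diagonal design matrices.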
Spatially Coupled Sparse Regression Codes for Single- and Multi-user Communications
Sparse regression codes (SPARCs) are a class of channel codes for efficient communication over the single-user additive white Gaussian noise (AWGN) channel at rates approaching the channel capacity. In a standard SPARC, codewords are sparse linear combinations of columns of an i.i.d. Gaussian design matrix, and the user message is encoded in the indices of those columns. Techniques such as power allocation and spatial coupling have been proposed to improve the performance of low-complexity iterative decoding algorithms such as approximate message passing (AMP).
In this thesis we investigate spatially coupled SPARCs, where the design matrix has a block-wise band-diagonal structure, and modulated SPARCs, which generalise standard SPARCs by introducing modulation to the encoding of user messages. We introduce a base matrix framework which provides a unified way to construct power allocated and spatially coupled design matrices, and propose AMP decoders for modulated SPARCs constructed using base matrices.
We prove that phase shift keying modulated and spatially coupled SPARCs with AMP decoding asymptotically achieve the capacity of the (complex) AWGN channel. We also show via numerical simulations that they can achieve lower error rates than standard coded modulation schemes at finite code lengths. A sliding window AMP decoder is proposed for spatially coupled SPARCs that significantly reduces the decoding latency and complexity.
We then investigate coding schemes based on random linear models and AMP decoding for the multi-user Gaussian multiple access channel in the asymptotic regime where the number of users grows linearly with the code length. For a fixed target error rate and message size per user (in bits), we obtain the exact trade-off between energy-per-bit and the user density achievable in the large system limit. We show that a coding scheme based on spatially coupled Gaussian matrices and AMP decoding achieves a near-optimal trade-off for a large range of user densities. To the best of our knowledge, this is the first efficient coding scheme to do so in this multiple access regime. Moreover, the spatially coupled coding scheme has a practical interpretation: it can be viewed as block-wise time-division with overlap.
Funded by a Doctoral Training Partnership Award from the Engineering and Physical Sciences Research Council
Design Techniques for Efficient Sparse Regression Codes
Sparse regression codes (SPARCs) are a recently introduced coding scheme for the additive white Gaussian noise channel, for which polynomial-time decoding algorithms that provably achieve the Shannon channel capacity have been proposed. One such algorithm is the approximate message passing (AMP) decoder. However, directly implementing these decoders does not yield good empirical performance at practical block lengths. This thesis develops techniques for improving both the error rate performance and the time and memory complexity of the AMP decoder. It focuses on practical and efficient implementations for both single- and multi-user scenarios.
A key design parameter for SPARCs is the power allocation, a vector of coefficients that determines how codewords are constructed. In this thesis, novel power allocation schemes are proposed which result in several orders of magnitude improvement in error rate compared to previous designs. Further improvements to the error rate come from investigating the role of other SPARC construction parameters, and from performing an online estimation of a key AMP parameter instead of using a pre-computed value.
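As an illustration of what a power allocation vector looks like, here is the classic exponentially decaying shape from the Barron-Joseph analysis, normalised to a total power budget. This is a sketch of the baseline shape only; the allocation schemes proposed in the thesis refine it:

```python
import numpy as np

def exp_power_allocation(L, P, snr):
    """Exponentially decaying allocation P_l proportional to 2^(-2*C*l/L),
    where C = 0.5 * log2(1 + snr), normalised so the allocations sum to P.
    This is the classic Barron-Joseph shape, shown here for illustration."""
    C = 0.5 * np.log2(1 + snr)
    p = 2.0 ** (-2 * C * np.arange(L) / L)
    return P * p / p.sum()

alloc = exp_power_allocation(L=8, P=1.0, snr=15.0)
print(np.isclose(alloc.sum(), 1.0))         # True: allocations sum to P
print(bool(np.all(np.diff(alloc) < 0)))     # True: strictly decreasing
```

Earlier sections get more power and so are decoded first; later sections then benefit from the reduced interference, which is the intuition behind the decaying shape.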
Another significant improvement to error rates comes from a novel three-stage decoder
which combines SPARCs with an outer code based on low-density parity-check codes. This
construction protects only vulnerable sections of the SPARC codeword with the outer code,
minimising the impact on the code rate. The combination provides a sharp waterfall in bit error
rates and very low overall codeword error rates.
Two changes to the basic SPARC structure are proposed to reduce computational and
memory complexity. First, the design matrix is replaced with an efficient in-place transform
based on Hadamard matrices, which dramatically reduces the overall decoder time and memory complexity with no impact on error rate. Second, an alternative SPARC design is developed, called Modulated SPARCs. These are shown to also achieve the Shannon channel capacity, while obtaining similar empirical error rates to the original SPARC, and permitting a
further reduction in time and memory complexity.
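A minimal sketch of the in-place transform idea mentioned above: the fast Walsh-Hadamard transform below applies an n x n orthogonal-up-to-scaling matrix in O(n log n) time without ever storing it. The thesis' actual construction additionally involves subsampling and sign randomisation, which are omitted here:

```python
import numpy as np

def fwht(x):
    """Fast Walsh-Hadamard transform: multiplies by the n x n Hadamard
    matrix in O(n log n) operations using butterfly updates, instead of
    O(n^2) for an explicit matrix-vector product. Requires n a power of 2.
    Self-inverse up to scaling: fwht(fwht(x)) == n * x."""
    x = np.asarray(x, dtype=float).copy()
    n = len(x)
    h = 1
    while h < n:
        for i in range(0, n, 2 * h):
            a = x[i:i + h].copy()
            b = x[i + h:i + 2 * h].copy()
            x[i:i + h] = a + b           # butterfly: sums
            x[i + h:i + 2 * h] = a - b   # butterfly: differences
        h *= 2
    return x

y = fwht([1.0, 0.0, 1.0, 0.0])
print(y)   # [2. 2. 0. 0.]
```

Replacing the dense Gaussian design matrix with such a transform removes the need to store the matrix at all, which is where the dramatic memory savings come from.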
Finally, SPARCs are implemented for the broadcast and multiple access channels, and for
the multiple description and Wyner-Ziv source coding models. Designs for appropriate power
allocations and decoding strategies are proposed and are found to give good empirical results,
demonstrating that SPARCs are also well suited to these multi-user settings.
Funded by a Doctoral Training Award from the Engineering and Physical Sciences Research Council
Lossy Compression via Sparse Linear Regression: Performance under Minimum-distance Encoding
We study a new class of codes for lossy compression with the squared-error
distortion criterion, designed using the statistical framework of
high-dimensional linear regression. Codewords are linear combinations of
subsets of columns of a design matrix. Called a Sparse Superposition or Sparse
Regression codebook, this structure is motivated by an analogous construction
proposed recently by Barron and Joseph for communication over an AWGN channel.
For i.i.d. Gaussian sources and minimum-distance encoding, we show that such a code can attain the Shannon rate-distortion function with the optimal error exponent, for all distortions below a specified value. It is also shown that sparse regression codes are robust in the following sense: a codebook designed to compress an i.i.d. Gaussian source of variance σ² with (squared-error) distortion D can compress any ergodic source of variance less than σ² to within distortion D. Thus the sparse regression ensemble retains many of the good covering properties of the i.i.d. random Gaussian ensemble, while having a compact representation in terms of a matrix whose size is a low-order polynomial in the block-length.
Comment: This version corrects a typo in the statement of Theorem 2 of the published paper
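Minimum-distance encoding, as analysed above, can be stated directly as an exhaustive search over the codebook. This toy sketch (illustrative sizes; `min_distance_encode` is a hypothetical helper) also makes clear why it is an information-theoretic benchmark rather than a practical encoder — the search is exponential in the number of sections:

```python
import itertools
import numpy as np

rng = np.random.default_rng(2)

# Tiny illustrative codebook: L sections of M columns, M^L codewords total
L, M, n = 3, 4, 12
A = rng.normal(0, 1 / np.sqrt(n), (n, L * M))
c = np.sqrt(n / L)   # coefficient giving (approximately) unit-power codewords

def min_distance_encode(x):
    """Brute-force search for the column choice minimising ||x - codeword||^2."""
    best, best_d = None, np.inf
    for idx in itertools.product(range(M), repeat=L):
        beta = np.zeros(L * M)
        beta[np.arange(L) * M + np.array(idx)] = c
        d = np.sum((x - A @ beta) ** 2)
        if d < best_d:
            best, best_d = idx, d
    return best, best_d / n   # chosen indices, per-symbol distortion

src = rng.normal(0, 1, n)    # a unit-variance source block
idx, dist = min_distance_encode(src)
print(len(idx), dist)
```

Here the search visits M^L = 64 codewords; at practical block lengths the count is astronomically large, which motivates the computationally feasible encoders studied elsewhere in this collection.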