3 research outputs found
Analysis and Design of Serially Concatenated LDGM Codes
In this paper, we first present the asymptotic performance of serially
concatenated low-density generator-matrix (SCLDGM) codes for binary input
additive white Gaussian noise channels using discretized density evolution
(DDE). We then provide a necessary condition for the successful decoding of
these codes. The error-floor analysis, along with lower-bound formulas for
both LDGM and SCLDGM codes, is also provided and verified. We further show that
by concatenating an inner LDGM code with a high-rate outer LDPC code, instead
of concatenating two LDGM codes as in SCLDGM codes, good codes without error
floors can be constructed. Finally, with an efficient DDE-based optimization
approach that utilizes the necessary condition for the successful decoding, we
construct optimized SCLDGM codes that approach the Shannon limit. The improved
performance of our optimized SCLDGM codes is demonstrated through both
asymptotic and simulation results.
Comment: This work has been submitted to the IEEE for possible publication.
Copyright may be transferred without notice, after which this version may no
longer be accessible.
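Density evolution of the kind used above tracks how the probability of decoding error evolves across message-passing iterations, and the decoding threshold is the worst channel parameter for which this probability still converges to zero. The paper applies discretized density evolution to the BI-AWGN channel; as a much simpler illustrative sketch (not the paper's method), the recursion below is the standard density evolution for a regular (dv, dc) LDPC ensemble on a binary erasure channel, where the function name is our own:

```python
def bec_density_evolution(eps, dv, dc, iters=100):
    """Track the erasure probability of variable-to-check messages for a
    regular (dv, dc) LDPC ensemble on a BEC with erasure probability eps.

    One iteration: a check-to-variable message is erased unless all dc-1
    incoming messages are known; a variable-to-check message is erased only
    if the channel output and all dv-1 incoming messages are erased.
    """
    x = eps
    for _ in range(iters):
        x = eps * (1 - (1 - x) ** (dc - 1)) ** (dv - 1)
    return x

# Below the (3,6) ensemble's threshold (~0.4294) the erasure probability
# dies out; above it, decoding stalls at a nonzero fixed point.
below = bec_density_evolution(0.40, dv=3, dc=6)
above = bec_density_evolution(0.45, dv=3, dc=6)
```

The discretized AWGN version tracked in the paper replaces the scalar erasure probability with a quantized message density, but the fixed-point structure of the recursion is analogous.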
Capacity Achieving Linear Codes with Random Binary Sparse Generating Matrices
In this paper, we prove the existence of capacity achieving linear codes with
random binary sparse generating matrices. The results on the existence of
capacity achieving linear codes in the literature are limited to the random
binary codes with equal probability generating matrix elements and sparse
parity-check matrices. Moreover, the codes with sparse generating matrices
reported in the literature are not proved to be capacity achieving.
As opposed to the existing results in the literature, which are based on
optimal maximum a posteriori decoders, the proposed approach is based on a
different decoder and consequently is suboptimal. We also demonstrate an
interesting trade-off between the sparsity of the generating matrix and the
error exponent (a constant which determines how exponentially fast the
probability of error decays as block length tends to infinity). An interesting
observation is that for small block sizes, less sparse generating matrices
perform better, while for large block sizes, the performance of the random
generating matrices becomes independent of the sparsity. Moreover, we prove the
existence of capacity achieving linear codes with a given (arbitrarily low)
density of ones on rows of the generating matrix. In addition to proving the
existence of capacity achieving sparse codes, an important conclusion of our
paper is that for a sufficiently large code length, no search is necessary in
practice to find a deterministic matrix by proving that any arbitrarily
selected sequence of sparse generating matrices is capacity achieving with high
probability. The focus in this paper is on the binary symmetric and binary
erasure channels, as well as other discrete memoryless symmetric channels.
Comment: Submitted to IEEE Transactions on Information Theory.
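To make the object of study concrete, the sketch below builds a random binary generating matrix in which each entry is 1 independently with a small probability (the density), and encodes a message by a mod-2 matrix multiplication. The function names and the density value are illustrative, not taken from the paper:

```python
import random

def random_sparse_generator(k, n, density=0.05, seed=0):
    """k x n binary generator matrix; each entry is 1 w.p. `density`."""
    rng = random.Random(seed)
    return [[1 if rng.random() < density else 0 for _ in range(n)]
            for _ in range(k)]

def encode(message, G):
    """Codeword bit j is the mod-2 inner product of the message with
    column j of G, so sparse columns mean cheap encoding."""
    k, n = len(G), len(G[0])
    return [sum(message[i] & G[i][j] for i in range(k)) % 2
            for j in range(n)]

G = random_sparse_generator(k=8, n=16, density=0.2)
codeword = encode([1, 0, 1, 1, 0, 0, 1, 0], G)
```

Because encoding is linear over GF(2), the XOR of two codewords is the codeword of the XOR of the two messages, which is the property the paper's suboptimal decoder exploits.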
Coding for Crowdsourced Classification with XOR Queries
This paper models the crowdsourced labeling/classification problem as a
sparsely encoded source coding problem, where each query answer, regarded as a
code bit, is the XOR of a small number of labels, which serve as the source
information bits.
In this paper, we leverage the connections between this problem and well-studied
codes with sparse representations for the channel coding problem to provide
querying schemes with an almost optimal number of queries, each involving
only a constant number of labels. We also extend this scenario to the case
where some workers can be unresponsive. For this case, we propose querying
schemes where each query involves only log n items, where n is the total number
of items to be labeled. Furthermore, we consider classification of two
correlated labeling systems and provide two-stage querying schemes with an
almost optimal number of queries, each involving a constant number of labels.
Comment: 6 pages.
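The query model in this abstract can be sketched directly: each query names a small subset of items, and a (noiseless) worker returns the XOR of those items' binary labels, i.e., one parity bit of the unknown label vector. The random subset choice below is our own illustration, not the paper's code construction:

```python
import random

def make_queries(n, num_queries, query_size=3, seed=1):
    """Each query is a size-`query_size` subset of the n item indices;
    random subsets here stand in for the paper's structured design."""
    rng = random.Random(seed)
    return [rng.sample(range(n), query_size) for _ in range(num_queries)]

def answer(labels, query):
    """A noiseless worker's answer: the XOR of the queried labels."""
    bit = 0
    for i in query:
        bit ^= labels[i]
    return bit

labels = [1, 0, 1, 1, 0, 1, 0, 0]          # unknown ground-truth labels
queries = make_queries(n=len(labels), num_queries=12)
answers = [answer(labels, q) for q in queries]
```

Recovering the labels from such parity answers is exactly decoding a sparse linear code, which is the connection the paper exploits to get an almost optimal number of queries with constant query size.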