Optimized puncturing distributions for irregular non-binary LDPC codes
In this paper we design non-uniform bit-wise puncturing distributions for
irregular non-binary LDPC (NB-LDPC) codes. The puncturing distributions are
optimized by minimizing the decoding threshold of the punctured LDPC code, the
threshold being computed with a Monte-Carlo implementation of Density
Evolution. First, we show that Density Evolution computed with Monte-Carlo
simulations provides accurate (very close) and precise (small variance)
estimates of NB-LDPC code ensemble thresholds. Based on the proposed method, we
analyze several puncturing distributions for regular and semi-regular codes,
obtained either by clustering punctured bits, or spreading them over the
symbol-nodes of the Tanner graph. Finally, optimized puncturing distributions
for non-binary LDPC codes with small maximum degree are presented, which
exhibit a gap to the channel capacity between 0.2 and 0.5 dB, for rates of the
punctured code varying from 0.5 to 0.9. Comment: 6 pages, ISITA1
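The threshold-search idea underlying the paper can be illustrated on a much simpler case. The sketch below (an assumption on my part, not the paper's method) runs plain binary density evolution for a regular (3,6) LDPC ensemble on the binary erasure channel and locates the BP threshold by bisection; the paper instead uses a Monte-Carlo density evolution for non-binary codes.

```python
def bec_de_converges(eps, dv=3, dc=6, iters=200, tol=1e-9):
    """True if erasure probability eps is below the (dv,dc) BP threshold.

    Illustrative binary BEC analogue; the paper's Monte-Carlo DE for
    NB-LDPC codes tracks message densities rather than a scalar.
    """
    x = eps
    for _ in range(iters):
        # Standard BEC density-evolution recursion for a regular ensemble.
        x = eps * (1.0 - (1.0 - x) ** (dc - 1)) ** (dv - 1)
        if x < tol:
            return True
    return False

def bp_threshold(dv=3, dc=6, steps=40):
    """Bisection on the channel erasure probability."""
    lo, hi = 0.0, 1.0
    for _ in range(steps):
        mid = 0.5 * (lo + hi)
        if bec_de_converges(mid, dv, dc):
            lo = mid
        else:
            hi = mid
    return lo
```

For the (3,6) ensemble this bisection recovers the well-known BP threshold of about 0.4294; optimizing a puncturing distribution amounts to repeating such a threshold computation for each candidate distribution.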
Blind Reconciliation
Information reconciliation is a crucial procedure in the classical
post-processing of quantum key distribution (QKD). Poor reconciliation
efficiency, revealing more information than strictly needed, may compromise the
maximum attainable distance, while poor performance of the algorithm limits the
practical throughput in a QKD device. Historically, reconciliation has been
mainly done using procedures with close-to-minimal information disclosure but
heavy interactivity, like Cascade, or using less efficient but also less
interactive procedures (just one message is exchanged), like those based on
low-density parity-check (LDPC) codes. The price to pay in the LDPC case is
that good efficiency is only attained for very long codes and in a very narrow
range centered around the quantum bit error rate (QBER) that the code was
designed to reconcile, thus forcing the use of several codes if a broad range of
QBER needs to be catered for. Real world implementations of these methods are
thus very demanding, either on computational or communication resources or
both, to the extent that the last generation of GHz-clocked QKD systems is
finding a bottleneck in the classical part. In order to produce compact, high
performance and reliable QKD systems it would be highly desirable to remove
these problems. Here we analyse the use of short-length LDPC codes in the
information reconciliation context using a low interactivity, blind, protocol
that avoids an a priori error rate estimation. We demonstrate that LDPC codes of
length 2x10^3 bits are suitable for blind reconciliation. Such codes are of high
interest in practice, since they can be used for hardware implementations with
very high throughput. Comment: 22 pages, 8 figures
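The blind protocol's control loop can be sketched as follows. This is a minimal sketch under my own assumptions: `try_decode` stands in for a syndrome decoder over a punctured/shortened LDPC code, and `oracle` stands in for the extra message in which the other party reveals selected bits; neither name comes from the paper.

```python
def blind_reconcile(frame, syndrome, try_decode, hidden, reveal_step, oracle):
    """Rate-adaptive reconciliation without a prior QBER estimate.

    Start with all modulated positions hidden (punctured). On decoding
    failure, ask the other party to reveal `reveal_step` of them (turning
    punctured positions into shortened, known bits) and retry, so the
    effective code rate drops until decoding succeeds.
    """
    known = {}                               # position -> revealed bit value
    while True:
        word = try_decode(frame, syndrome, known)
        if word is not None:
            return word                      # decoding succeeded
        if not hidden:
            return None                      # nothing left to reveal: failure
        batch, hidden = hidden[:reveal_step], hidden[reveal_step:]
        for pos in batch:
            known[pos] = oracle(pos)         # one extra exchanged message
```

The point of the loop is that no error-rate estimate is needed up front: interactivity stays low (one reveal round per failed attempt), and the revealed positions adapt the rate to the actual QBER.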
Spatially Coupled Turbo Codes: Principles and Finite Length Performance
In this paper, we give an overview of spatially coupled turbo codes (SC-TCs),
the spatial coupling of parallel and serially concatenated convolutional codes,
recently introduced by the authors. For presentation purposes, we focus on
spatially coupled serially concatenated codes (SC-SCCs). We review the main
principles of SC-TCs and discuss their exact density evolution (DE) analysis on
the binary erasure channel. We also consider the construction of a family of
rate-compatible SC-SCCs with simple 4-state component encoders. For all
considered code rates, threshold saturation of the belief propagation (BP) to
the maximum a posteriori threshold of the uncoupled ensemble is demonstrated,
and it is shown that the BP threshold approaches the Shannon limit as the
coupling memory increases. Finally, we give some simulation results for finite
lengths. Comment: Invited paper, IEEE Int. Symp. Wireless Communications Systems
(ISWCS), Aug. 201
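Threshold saturation can be reproduced numerically in a simpler setting. The sketch below is my own illustrative analogue, not the paper's SC-TC analysis: it runs density evolution for a spatially coupled (3,6) LDPC ensemble on the BEC with known boundary values, and shows that coupling lets BP succeed at erasure rates where the uncoupled ensemble stalls.

```python
def uncoupled_residual(eps, dv=3, dc=6, iters=2000):
    """Residual erasure of plain (dv,dc) density evolution on the BEC."""
    x = eps
    for _ in range(iters):
        x = eps * (1.0 - (1.0 - x) ** (dc - 1)) ** (dv - 1)
    return x

def coupled_residual(eps, L=20, w=3, dv=3, dc=6, iters=10000, tol=1e-8):
    """Residual erasure of coupled DE; boundary positions are known (x=0),
    which seeds the decoding wave that propagates inwards."""
    x = [eps] * L

    def get(i):
        return x[i] if 0 <= i < L else 0.0   # termination at the boundary

    for _ in range(iters):
        new = []
        for i in range(L):
            s = 0.0
            for j in range(w):
                avg_in = sum(get(i + j - k) for k in range(w)) / w
                s += (1.0 - (1.0 - avg_in) ** (dc - 1)) ** (dv - 1)
            new.append(eps * s / w)
        x = new
        if max(x) < tol:
            break
    return max(x)
```

At an erasure rate slightly above the uncoupled BP threshold (about 0.4294 for (3,6)), the uncoupled recursion stalls at a nonzero fixed point while the coupled chain decodes completely; the same saturation effect toward the MAP threshold is what the paper demonstrates for SC-TCs.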
Asymptotic Weight Enumerators of Randomly Punctured, Expurgated, and Shortened Code Ensembles
In this paper, we examine the effect of random
puncturing, expurgating, and shortening on the asymptotic
weight enumerator of certain linear code ensembles. We begin
by discussing the actions of the three alteration methods on
individual codes. We derive expressions for the average resulting
code weight enumerator under each alteration. We then extend
these results to the spectral shape of linear code ensembles
whose original spectral shape is known, and demonstrate our
findings on two specific code ensembles: the Shannon ensemble
and the regular (j, k) Gallager ensemble.
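For the Shannon ensemble, the baseline spectral shape that the alterations act on has a closed form. The sketch below (standard material, not taken from the paper) computes the random-coding spectral shape in bits, H2(delta) - (1 - R), and bisects for its zero crossing, the Gilbert-Varshamov relative distance.

```python
import math

def h2(p):
    """Binary entropy function in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def shannon_spectral_shape(delta, rate):
    """Asymptotic exponent (bits per symbol) of the expected weight
    enumerator of the random 'Shannon' ensemble at relative weight delta."""
    return h2(delta) - (1 - rate)

def gv_distance(rate, steps=60):
    """Smallest relative weight where the spectral shape turns non-negative."""
    lo, hi = 0.0, 0.5
    for _ in range(steps):
        mid = 0.5 * (lo + hi)
        if shannon_spectral_shape(mid, rate) < 0:
            lo = mid
        else:
            hi = mid
    return lo
```

At rate 1/2 the zero crossing lands near delta = 0.11, the familiar GV distance; puncturing, expurgating, or shortening each shift this curve in a way the paper derives explicitly.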
Untainted Puncturing for Irregular Low-Density Parity-Check Codes
Puncturing is a well-known coding technique widely used for constructing
rate-compatible codes. In this paper, we consider the problem of puncturing
low-density parity-check codes and propose a new algorithm for intentional
puncturing. The algorithm is based on the puncturing of untainted symbols, i.e.
nodes with no punctured symbols within their neighboring set. It is shown that
the algorithm proposed here performs better than previous proposals for a range
of coding rates and small proportions of punctured symbols. Comment: 4 pages, 3 figures
Repeat-Accumulate Codes for Reconciliation in Continuous Variable Quantum Key Distribution
This paper investigates the design of low-complexity error correction codes
for the verification step in continuous variable quantum key distribution
(CVQKD) systems. We design new coding schemes based on quasi-cyclic
repeat-accumulate codes which demonstrate good performance for CVQKD
reconciliation.
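The repeat-accumulate structure itself is simple enough to sketch. The toy encoder below is an assumption-laden illustration: it shows only the generic repeat / interleave / accumulate pipeline, and does not model the quasi-cyclic structure or the CVQKD-specific design choices of the paper.

```python
def ra_encode(bits, q=3, perm=None):
    """Toy systematic repeat-accumulate encoder.

    Repeat each information bit q times, permute the repeated stream,
    then accumulate (running XOR) to produce the parity bits.
    """
    repeated = [b for b in bits for _ in range(q)]
    if perm is None:
        perm = list(range(len(repeated)))    # identity interleaver (toy)
    interleaved = [repeated[p] for p in perm]
    acc, parity = 0, []
    for b in interleaved:
        acc ^= b                             # 1/(1+D) accumulator
        parity.append(acc)
    return bits + parity                     # systematic output
```

Encoding is linear-time in the block length, which is the low-complexity property that makes this family attractive for the verification step in CVQKD post-processing.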
Capacity-Achieving Ensembles of Accumulate-Repeat-Accumulate Codes for the Erasure Channel with Bounded Complexity
The paper introduces ensembles of accumulate-repeat-accumulate (ARA) codes
which asymptotically achieve capacity on the binary erasure channel (BEC) with
{\em bounded complexity}, per information bit, of encoding and decoding. It
also introduces symmetry properties which play a central role in the
construction of capacity-achieving ensembles for the BEC with bounded
complexity. The results here improve on the tradeoff between performance and
complexity provided by previous constructions of capacity-achieving ensembles
of codes defined on graphs. The superiority of ARA codes with moderate to large
block length is exemplified by computer simulations which compare their
performance with those of previously reported capacity-achieving ensembles of
LDPC and IRA codes. The ARA codes also have the advantage of being systematic.
Comment: Submitted to IEEE Trans. on Information Theory, December 1st, 2005.
Includes 50 pages and 13 figures