Array Convolutional Low-Density Parity-Check Codes
This paper presents a design technique for obtaining regular time-invariant
low-density parity-check convolutional (RTI-LDPCC) codes with low complexity
and good performance. We start from previous approaches which unwrap a
low-density parity-check (LDPC) block code into an RTI-LDPCC code, and we
obtain a new method to design RTI-LDPCC codes with better performance and
shorter constraint length. Unlike previous techniques, we start the design from an array LDPC block code. We show that, for high-rate codes, a
performance gain and a reduction in the constraint length are achieved with
respect to previous proposals. Additionally, an increase in the minimum
distance is observed.
Comment: 4 pages, 2 figures, accepted for publication in IEEE Communications Letters
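For readers unfamiliar with the starting point, the classical array LDPC block code construction arranges circulant permutation matrices in a grid; the sketch below (with our own parameter names q, j, k, and not the unwrapping step described in the paper) builds such a parity-check matrix:

```python
import numpy as np

def array_ldpc_parity_check(q, j, k):
    """Parity-check matrix of an array LDPC block code.

    H is a j x k array of q x q circulant permutation matrices, with
    block (a, b) equal to P^(a*b), where P cyclically shifts the
    identity matrix by one position and q is prime.
    """
    assert j <= k <= q
    P = np.roll(np.eye(q, dtype=int), 1, axis=1)  # single cyclic shift
    blocks = [[np.linalg.matrix_power(P, (a * b) % q) for b in range(k)]
              for a in range(j)]
    return np.block(blocks)

H = array_ldpc_parity_check(q=7, j=3, k=5)
print(H.shape)          # (21, 35)
print(H.sum(axis=0))    # every column has weight j = 3
```

Because q is prime, two columns of such a matrix overlap in at most one row, which is why array codes are known to have Tanner graphs free of 4-cycles.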
Progressive Differences Convolutional Low-Density Parity-Check Codes
We present a new family of low-density parity-check (LDPC) convolutional
codes that can be designed using ordered sets of progressive differences. We
study their properties and define a subset of codes in this class that have
some desirable features, such as fixed minimum distance and Tanner graphs
without short cycles. The design approach we propose ensures that these
properties are guaranteed independently of the code rate. This makes these
codes of interest in many practical applications, particularly when high rate
codes are needed for saving bandwidth. We provide some examples of coded
transmission schemes exploiting this new class of codes.
Comment: 8 pages, 2 figures. Accepted for publication in IEEE Communications Letters. Copyright transferred to IEEE
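The absence of short cycles claimed above can be verified mechanically on any candidate parity-check or syndrome former matrix; as a generic illustration (not the authors' design procedure), a Tanner graph is free of 4-cycles exactly when no two columns of the matrix have ones in more than one common row:

```python
import numpy as np
from itertools import combinations

def has_four_cycles(H):
    """True if the Tanner graph of the binary matrix H contains a 4-cycle,
    i.e. if some pair of columns has ones in two (or more) common rows."""
    H = np.asarray(H, dtype=int)
    for c1, c2 in combinations(range(H.shape[1]), 2):
        if int(H[:, c1] @ H[:, c2]) >= 2:
            return True
    return False

H = np.array([[1, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 1, 1]])
print(has_four_cycles(H))   # False: every pair of columns overlaps at most once
```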
Time-Invariant Spatially Coupled Low-Density Parity-Check Codes with Small Constraint Length
We consider a special family of spatially coupled low-density parity-check (SC-LDPC) codes, namely time-invariant LDPC convolutional (LDPCC)
codes, which have been known in the literature for a long time. Codes of this kind
are usually designed by starting from QC block codes, and applying suitable
unwrapping procedures. We show that, by directly designing the LDPCC code
syndrome former matrix without the constraints of the underlying QC block code,
it is possible to achieve smaller constraint lengths with respect to the best
solutions available in the literature. We also find theoretical lower bounds on
the syndrome former constraint length for codes with a specified minimum length
of the local cycles in their Tanner graphs. For this purpose, we exploit a new
approach based on a numerical representation of the syndrome former matrix,
which generalizes a technique we previously used to study a special subclass of the codes considered here.
Comment: 5 pages, 4 figures, to be presented at IEEE BlackSeaCom 201
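As a concrete reminder of the quantity being minimized, here is a small sketch computing the memory and syndrome former constraint length from a symbolic matrix H(D); the convention v_s = (m_h + 1)·c, with c code bits per period, is assumed here and may differ slightly from the one used in the paper:

```python
def syndrome_former_constraint_length(H_exps):
    """Memory and syndrome former constraint length of a time-invariant
    SC-LDPC convolutional code given by its symbolic matrix H(D).

    H_exps[i][j] is the list of exponents of D in entry (i, j) of H(D);
    an empty list means a zero entry.  With the convention assumed here,
    the memory m_h is the largest exponent and v_s = (m_h + 1) * c,
    where c is the number of columns of H(D), i.e. code bits per period.
    """
    c = len(H_exps[0])
    m_h = max(e for row in H_exps for entry in row for e in entry)
    return m_h, (m_h + 1) * c

# Example: a 2 x 3 symbolic matrix H(D) with monomial entries
H_exps = [[[0], [1], [3]],
          [[2], [0], [5]]]
m_h, v_s = syndrome_former_constraint_length(H_exps)
print(m_h, v_s)   # 5, 18
```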
Punctured Binary Simplex Codes as LDPC codes
Digital data transfer can be protected by means of suitable error correcting
codes. Among the families of state-of-the-art codes, LDPC (Low Density
Parity-Check) codes have received a great deal of attention recently because of their performance and flexibility of operation in wireless and mobile radio channels, as well as in cable transmission systems. In this paper, we present a
class of rate-adaptive LDPC codes, obtained as properly punctured simplex
codes. These codes allow for the use of an efficient soft-decision decoding
algorithm, provided that a condition called row-column constraint is satisfied.
This condition is tested on small-length codes, and then extended to
medium-length codes. The puncturing operations we apply do not influence the
satisfaction of the row-column constraint, ensuring that a wide range of code
rates can be obtained. We can reach code rates remarkably higher than those
obtainable by the original simplex code, and the price in terms of minimum
distance turns out to be relatively small, leading to interesting trade-offs in
the resulting asymptotic coding gain.
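As background on why a simplex code lends itself to LDPC-style decoding, recall that the check polynomial of the cyclic simplex code is the primitive polynomial p(x) itself, so a textbook parity-check matrix is a banded matrix whose rows repeat the few coefficients of p(x). The sketch below illustrates this; it is not the specific matrix or puncturing pattern used in the paper:

```python
import numpy as np

def simplex_parity_check(p_coeffs):
    """Banded parity-check matrix of the cyclic simplex code defined by
    a primitive polynomial p(x) of degree m.

    p_coeffs lists the coefficients of p(x) from degree 0 up to degree m,
    e.g. x^4 + x + 1 -> [1, 1, 0, 0, 1].  The code has length n = 2^m - 1
    and dimension m; since its check polynomial is p(x), the textbook
    (n - m) x n parity-check matrix is built from shifts of the reversed
    coefficient vector of p(x).
    """
    m = len(p_coeffs) - 1
    n = 2**m - 1
    rev = p_coeffs[::-1]                      # [p_m, ..., p_1, p_0]
    H = np.zeros((n - m, n), dtype=int)
    for r in range(n - m):
        H[r, r:r + m + 1] = rev
    return H

H = simplex_parity_check([1, 1, 0, 0, 1])     # p(x) = x^4 + x + 1
print(H.shape)         # (11, 15): a [15, 4] simplex code
print(H.sum(axis=1))   # every row has weight 3, since p(x) is a trinomial
```

The low row weight (3 for a trinomial, 5 for a pentanomial) is what makes iterative soft-decision decoding over this matrix attractive.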
Design and Analysis of Time-Invariant SC-LDPC Convolutional Codes With Small Constraint Length
In this paper, we deal with time-invariant spatially coupled low-density
parity-check convolutional codes (SC-LDPC-CCs). Classic design approaches
usually start from quasi-cyclic low-density parity-check (QC-LDPC) block codes
and exploit suitable unwrapping procedures to obtain SC-LDPC-CCs. We show that
the direct design of the SC-LDPC-CC syndrome former matrix or, equivalently,
the symbolic parity-check matrix, leads to codes with smaller syndrome former
constraint lengths with respect to the best solutions available in the
literature. We provide theoretical lower bounds on the syndrome former
constraint length for the most relevant families of SC-LDPC-CCs, under
constraints on the minimum length of cycles in their Tanner graphs. We also
propose new code design techniques that approach or achieve such theoretical
limits.
Comment: 30 pages, 5 figures, accepted for publication in IEEE Transactions on Communications
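To make the equivalence between the symbolic parity-check matrix H(D) and the syndrome former concrete, here is a small sketch (our own helper, assuming the usual rate-b/c convention) that expands H(D) = H_0 + H_1 D + ... + H_m D^m into the sliding parity-check matrix over a finite number of periods:

```python
import numpy as np

def sliding_parity_check(H_blocks, L):
    """Expand the symbolic matrix H(D) = H_0 + H_1 D + ... + H_m D^m of a
    time-invariant SC-LDPC-CC into its sliding parity-check matrix over L
    periods.  H_blocks = [H_0, ..., H_m]; each block has c columns, one per
    code bit of a period.  Row block t checks the bits of periods t-m, ..., t."""
    H_blocks = [np.asarray(Hi, dtype=int) for Hi in H_blocks]
    r, c = H_blocks[0].shape
    H = np.zeros((L * r, L * c), dtype=int)
    for t in range(L):                       # row block index (time)
        for i, Hi in enumerate(H_blocks):    # delay of the block
            u = t - i                        # period of the bits it checks
            if 0 <= u < L:
                H[t*r:(t+1)*r, u*c:(u+1)*c] = Hi
    return H

# Toy example: rate-2/3 code with memory 1, H(D) = [1+D, 1, 1+D]
H = sliding_parity_check([np.array([[1, 1, 1]]), np.array([[1, 0, 1]])], L=4)
print(H)   # the familiar diagonal band [H_1 H_0] sliding one period per row block
```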
LSZ reduction formula in the worldline formalism
A new path integral representation is proposed for the Green's function within the worldline formalism. This version can be LSZ-reduced directly at the level of the path integral, working directly in position space. As an application, some sums of Feynman diagrams are computed using the proposed path integral. A further application is the study of the classical limit and of KLT-like relations directly in the classical limit.
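For orientation, the standard momentum-space LSZ relation for a real scalar field (the textbook form, not the worldline version developed in the paper) reads:

```latex
% LSZ reduction formula for a real scalar field (textbook conventions):
% S-matrix elements are the residues of the multi-particle poles of the
% momentum-space time-ordered Green's function, one factor per external leg.
\prod_{i=1}^{n}\int d^4x_i\, e^{i p_i \cdot x_i}
\prod_{j=1}^{m}\int d^4y_j\, e^{-i k_j \cdot y_j}\,
\langle \Omega |\, T\{\phi(x_1)\cdots\phi(x_n)\,\phi(y_1)\cdots\phi(y_m)\}\,|\Omega\rangle
\;\sim\;
\left(\prod_{i=1}^{n}\frac{i\sqrt{Z}}{p_i^2-m^2+i\epsilon}\right)
\left(\prod_{j=1}^{m}\frac{i\sqrt{Z}}{k_j^2-m^2+i\epsilon}\right)
\langle \mathbf{p}_1\cdots\mathbf{p}_n|\,S\,|\mathbf{k}_1\cdots\mathbf{k}_m\rangle ,
```

where the asymptotic relation holds as each external energy approaches its on-shell value and Z is the field-strength renormalization constant.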
Interleaved Product LDPC Codes
Product LDPC codes take advantage of LDPC decoding algorithms and the high
minimum distance of product codes. We propose to add suitable interleavers to
improve the waterfall performance of LDPC decoding. Interleaving also reduces
the number of low-weight codewords, which gives a further advantage in the error floor region.
Comment: 11 pages, 5 figures, accepted for publication in IEEE Transactions on Communications
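As a reminder of the product construction the interleavers act on, every row and every column of a codeword array must belong to a component code. A minimal sketch with single-parity-check components follows (the interleaver placement proposed in the paper is not reproduced here):

```python
import numpy as np

def spc_product_encode(info_bits, k1, k2):
    """Systematic encoding of a toy product code whose row and column
    component codes are single-parity-check (SPC) codes.

    The k2 x k1 information array is extended with a parity column
    (row code), a parity row (column code) and a check-on-checks bit.
    Real product LDPC codes use stronger components; SPC codes are used
    here only to keep the illustration short.
    """
    U = np.asarray(info_bits, dtype=int).reshape(k2, k1)
    row_parity = U.sum(axis=1) % 2                 # one parity bit per row
    X = np.hstack([U, row_parity[:, None]])
    col_parity = X.sum(axis=0) % 2                 # one parity bit per column
    return np.vstack([X, col_parity[None, :]])     # (k2+1) x (k1+1) codeword array

C = spc_product_encode([1, 0, 1, 1, 0, 0], k1=3, k2=2)
print(C)
print(C.sum(axis=1) % 2, C.sum(axis=0) % 2)        # all rows/columns have even parity
```

With stronger component codes the same checks-on-checks structure applies, and the minimum distance of the product code is the product of the component minimum distances.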
Rate-compatible LDPC Codes based on Primitive Polynomials and Golomb Rulers
We introduce and study a family of rate-compatible Low-Density Parity-Check
(LDPC) codes characterized by very simple encoders. The design of these codes
starts from simplex codes, which are defined by parity-check matrices having a
straightforward form stemming from the coefficients of a primitive polynomial.
For this reason, we call the new codes Primitive Rate-Compatible LDPC
(PRC-LDPC) codes. By applying puncturing to these codes, we obtain a bit-level
granularity of their code rates. We show that, in order to achieve good LDPC
codes, the underlying polynomials, besides being primitive, must meet conditions more stringent than those of classical punctured simplex
codes. We leverage non-modular Golomb rulers to take the new requirements into
account. We characterize the minimum distance properties of PRC-LDPC codes, and
study and discuss their encoding and decoding complexity. Finally, we assess
their error rate performance under iterative decoding.
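The (non-modular) Golomb ruler property mentioned above is easy to test: all pairwise differences between the marks must be distinct. A small illustrative check follows (the mapping from rulers to polynomial exponents used in the paper is not reproduced):

```python
from itertools import combinations

def is_golomb_ruler(marks):
    """Check whether a set of integer marks forms a (non-modular) Golomb
    ruler, i.e. whether all pairwise differences are distinct."""
    diffs = [b - a for a, b in combinations(sorted(marks), 2)]
    return len(diffs) == len(set(diffs))

print(is_golomb_ruler([0, 1, 4, 9, 11]))   # True: an optimal order-5 ruler
print(is_golomb_ruler([0, 1, 2, 4]))       # False: differences 1 and 2 repeat
```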
Hindering reaction attacks by using monomial codes in the McEliece cryptosystem
In this paper we study recent reaction attacks against QC-LDPC and QC-MDPC
code-based cryptosystems, which allow an opponent to recover the private
parity-check matrix through its distance spectrum by observing a sufficiently
high number of decryption failures. We consider a special class of codes, known
as monomial codes, to form private keys with the desirable property of having a
unique and complete distance spectrum. We verify that for these codes the
problem of recovering the secret key from the distance spectrum is equivalent
to that of finding cliques in a graph, and use this equivalence to prove that
current reaction attacks are not applicable when codes of this type are used in
the McEliece cryptosystem.
Comment: 5 pages, 0 figures, 1 table, accepted for presentation at the 2018 IEEE International Symposium on Information Theory (ISIT)
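The distance spectrum that reaction attacks reconstruct is, for each circulant block of the private matrix, simply the set of cyclic distances between the ones of its first row; a minimal sketch of that quantity (set-valued here, ignoring multiplicities):

```python
from itertools import combinations

def distance_spectrum(support, p):
    """Cyclic distance spectrum of a length-p circulant row whose ones sit
    in the positions listed in `support` (the quantity targeted by reaction
    attacks on QC-LDPC/QC-MDPC code-based cryptosystems)."""
    spectrum = set()
    for a, b in combinations(support, 2):
        d = abs(a - b)
        spectrum.add(min(d, p - d))
    return sorted(spectrum)

print(distance_spectrum([0, 2, 7, 11], p=16))   # [2, 4, 5, 7]
```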
Iterative Soft-Decision Decoding of Binary Cyclic Codes
Binary cyclic codes achieve good error correction performance and allow the implementation of very simple encoder and decoder circuits. Among them, BCH codes represent a very important class of t-error correcting codes, with known structural properties and error correction capability. Decoding of binary cyclic codes is often accomplished through hard-decision decoders, although it is recognized that soft-decision decoding algorithms can produce significant coding gain with respect to hard-decision techniques. Several approaches have been proposed to implement iterative soft-decision decoding of binary cyclic codes. We study the technique based on “extended parity-check matrices”, and show that this method is not suitable for high-rate or long codes. We propose a new approach, based on “reduced parity-check matrices” and “spread parity-check matrices”, that can achieve better correction performance in many practical cases, without increasing the complexity.
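One common ingredient in iterative soft-decision decoding of cyclic codes is a redundant parity-check matrix collecting all cyclic shifts of a low-weight dual codeword. The sketch below illustrates that generic idea; it is not claimed to coincide with the “extended”, “reduced” or “spread” matrices studied in the paper:

```python
import numpy as np

def redundant_parity_checks(dual_codeword):
    """All n cyclic shifts of a (preferably low-weight) dual codeword of a
    length-n cyclic code, stacked as an n x n redundant parity-check matrix.
    Such redundant matrices are a common starting point for iterative
    (belief-propagation style) decoding of cyclic codes."""
    v = np.asarray(dual_codeword, dtype=int)
    n = len(v)
    return np.vstack([np.roll(v, s) for s in range(n)])

# Example: the [7, 4] Hamming code is cyclic with generator x^3 + x + 1;
# 1 + x^2 + x^3 + x^4 is a codeword of its dual (the [7, 3] simplex code).
H_red = redundant_parity_checks([1, 0, 1, 1, 1, 0, 0])
print(H_red.shape)        # (7, 7); every row is a valid weight-4 parity check
```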