
    Fundamental Limits of "Ankylography" due to Dimensional Deficiency

    Single-shot diffractive imaging of truly 3D structures suffers from a dimensional deficiency and does not scale. The applicability of "ankylography" is limited to objects that are small in at least one dimension or that are essentially 2D otherwise. (Comment: 2 pages, no figures.)

    A Homological Approach to Belief Propagation and Bethe Approximations

    We introduce a differential complex of local observables given a decomposition of a global set of random variables into subsets. Its boundary operator allows us to define a transport equation equivalent to Belief Propagation. This definition reveals a set of quantities conserved under Belief Propagation and gives new insight into the relationship of its equilibria with the critical points of the Bethe free energy. (Comment: 14 pages, submitted to the 2019 Geometric Science of Information colloquium.)
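
    The transport equation mentioned above reformulates the standard message-passing updates. For reference, here is a minimal sketch of ordinary loopy Belief Propagation on a small pairwise model, whose fixed points are the equilibria the abstract relates to critical points of the Bethe free energy; the graph, potentials, and iteration count are illustrative choices, not from the paper.

        # Minimal loopy Belief Propagation sketch on a triangle (pairwise model).
        # All numbers are illustrative; the paper's homological formulation is
        # not reproduced here.
        import numpy as np

        n = 3
        edges = [(0, 1), (1, 2), (2, 0)]                 # a loopy graph (one cycle)
        phi = [np.array([0.6, 0.4]) for _ in range(n)]   # unary potentials
        psi = np.array([[1.2, 0.8],
                        [0.8, 1.2]])                     # shared pairwise potential

        nbrs = {i: [] for i in range(n)}
        for a, b in edges:
            nbrs[a].append(b)
            nbrs[b].append(a)

        # msgs[(i, j)] is the message from variable i to variable j
        msgs = {}
        for a, b in edges:
            msgs[(a, b)] = np.ones(2) / 2
            msgs[(b, a)] = np.ones(2) / 2

        for _ in range(100):                      # iterate toward a fixed point
            new = {}
            for (i, j) in msgs:
                prod = phi[i].copy()
                for k in nbrs[i]:
                    if k != j:                    # exclude the recipient
                        prod = prod * msgs[(k, i)]
                m = psi.T @ prod                  # sum out x_i
                new[(i, j)] = m / m.sum()         # normalise (scale freedom)
            msgs = new

        for i in range(n):                        # Bethe (approximate) marginals
            b = phi[i].copy()
            for k in nbrs[i]:
                b = b * msgs[(k, i)]
            print(i, b / b.sum())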

    On the Decoding Failure Rate of QC-MDPC Bit-Flipping Decoders

    Quasi-cyclic moderate-density parity-check (QC-MDPC) codes allow the design of McEliece-like public-key encryption schemes with compact keys and a security that provably reduces to hard decoding problems for quasi-cyclic codes. In particular, QC-MDPC codes are among the most promising code-based key encapsulation mechanisms (KEMs) proposed to the NIST call for standardization of quantum-safe cryptography (two proposals, BIKE and QC-MDPC KEM).

    The first generation of decoding algorithms suffers from a small but non-negligible decoding failure rate (DFR on the order of 10⁻⁷ to 10⁻¹⁰). This enables the key recovery attack presented by Guo, Johansson, and Stankovski (the GJS attack) at Asiacrypt 2016, which exploits a small correlation between the faulty message patterns and the secret key of the scheme, and limits the usage of the scheme to KEMs using ephemeral public keys. It does not impact the interactive establishment of secure communications (e.g. TLS), but it renders the use of static public keys for asynchronous applications (e.g. email) dangerous.

    Understanding and improving the decoding of QC-MDPC codes is thus of interest for cryptographic applications. In particular, finding parameters for which the failure rate is provably negligible (typically as low as 2⁻⁶⁴ or 2⁻¹²⁸) would allow static keys and increase the applicability of the mentioned cryptosystems.

    We study here a simple variant of bit-flipping decoding, which we call step-by-step decoding. It has a higher DFR, but its evolution can be modeled by a Markov chain within the theoretical framework of Julia Chaulet's PhD thesis. We also study two other, more efficient decoders: the textbook algorithm and one that is close to the BIKE decoder. For all these algorithms we provide simulation results and, assuming an evolution similar to that of the step-by-step decoder, we extrapolate the DFR as a function of the block length. This gives an indication of how much the code parameters must be increased to ensure resistance to the GJS attack.
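
    To make the object of study concrete, the sketch below shows the textbook parallel bit-flipping decoder; the step-by-step variant analysed in the paper flips one bit at a time instead. The toy parity-check matrix and threshold rule are illustrative, not QC-MDPC or BIKE parameters.

        # Textbook bit-flipping decoding, minimal sketch (toy parameters).
        import numpy as np

        def bit_flip_decode(H, y, max_iters=20):
            """Try to correct the noisy word y; return (word, success)."""
            y = y.copy()
            for _ in range(max_iters):
                syndrome = H @ y % 2              # unsatisfied parity checks
                if not syndrome.any():
                    return y, True                # all checks satisfied
                upc = H.T @ syndrome              # per-bit count of unsatisfied
                                                  # checks it participates in
                threshold = upc.max()             # simple threshold rule
                y ^= (upc >= threshold).astype(y.dtype)   # flip the worst bits
            return y, False                       # decoding failure (a DFR event)

        # Toy example: a length-7 parity-check matrix and one flipped bit.
        H = np.array([[1, 1, 0, 1, 1, 0, 0],
                      [1, 0, 1, 1, 0, 1, 0],
                      [0, 1, 1, 1, 0, 0, 1]])
        noisy = np.zeros(7, dtype=int)
        noisy[2] ^= 1                             # error on the all-zero codeword
        decoded, ok = bit_flip_decode(H, noisy)
        print(ok, decoded)

    Estimating the DFR then amounts to running this loop over many random error patterns and counting the failures.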

    Analysis of reaction and timing attacks against cryptosystems based on sparse parity-check codes

    In this paper we study reaction and timing attacks against cryptosystems based on sparse parity-check codes, which encompass low-density parity-check (LDPC) codes and moderate-density parity-check (MDPC) codes. We show that the feasibility of these attacks is not strictly associated with the quasi-cyclic (QC) structure of the code but is related to the intrinsically probabilistic decoding of any sparse parity-check code. These attacks therefore not only work against QC codes but generalize to broader classes of codes. We provide a novel algorithm that, in the case of a QC code, recovers a larger amount of information than is retrievable through existing attacks, and we use this algorithm to characterize new side-channel information leakages. We devise a theoretical model for the decoder that describes and justifies our results. Numerical simulations confirm the effectiveness of our approach.
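
    Attacks of this family correlate observed decoding failures or timings with a simple statistic of the secret key. As a hedged illustration, the sketch below computes the cyclic distance spectrum that GJS-style key recovery reconstructs from failure statistics; the block length and secret support are toy values, not the parameters of any concrete scheme.

        # Cyclic distance spectrum of a sparse vector, minimal sketch.
        from itertools import combinations

        def distance_spectrum(support, r):
            """support: positions of the ones in a length-r cyclic vector."""
            spectrum = set()
            for a, b in combinations(sorted(support), 2):
                d = (b - a) % r
                spectrum.add(min(d, r - d))       # fold to a cyclic distance
            return spectrum

        r = 31                                    # toy block length
        secret_support = [0, 2, 11, 19, 25]       # toy sparse secret block
        print(sorted(distance_spectrum(secret_support, r)))

    An attacker estimates, for each distance d, the failure rate of error patterns containing d; distances present in the secret spectrum show a measurably different rate, from which the spectrum, and then the key, can be reconstructed.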

    High girth column-weight-two LDPC codes based on distance graphs

    LDPC codes with column weight two are constructed from minimal distance graphs, or cages. Distance graphs are used to represent LDPC code matrices: graph vertices correspond to rows and edges to columns. Converting a distance graph into matrix form therefore yields a parity-check matrix with column weight two whose girth is double that of the graph. The number of 1's in each row (the row weight) equals the degree of the corresponding vertex, so constructing graphs with different vertex degrees varies the rate of the resulting LDPC codes. Cage graphs are used as examples of distance graphs to design codes with different girths and rates. The performance of the obtained codes depends on the girth and structure of the corresponding distance graphs.
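
    A minimal sketch of the graph-to-matrix conversion described above, using the Petersen graph (the (3,5)-cage, girth 5) as an illustrative example: vertices become rows, edges become columns, so every column has weight two and the resulting code has girth 10.

        # Column-weight-two parity-check matrix from a distance graph.
        import numpy as np

        def graph_to_H(n_vertices, edges):
            H = np.zeros((n_vertices, len(edges)), dtype=int)
            for col, (u, v) in enumerate(edges):
                H[u, col] = H[v, col] = 1         # exactly two 1's per column
            return H

        # Petersen graph: outer 5-cycle, inner pentagram, five spokes.
        petersen = ([(i, (i + 1) % 5) for i in range(5)]
                    + [(5 + i, 5 + (i + 2) % 5) for i in range(5)]
                    + [(i, i + 5) for i in range(5)])
        H = graph_to_H(10, petersen)
        print(H.shape, H.sum(axis=0), H.sum(axis=1))
        # (10, 15); every column has weight 2; every row has weight 3 (the
        # vertex degree), so the code rate follows from the chosen degrees.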

    Analysis of Error-Correcting Codes for Lattice-Based Key Exchange

    Lattice problems allow the construction of very efficient key exchange and public-key encryption schemes. When using the Learning with Errors (LWE) or Ring-LWE (RLWE) problem, such schemes exhibit an interesting trade-off between decryption error rate and security: secret and error distributions with a larger standard deviation lead to better security but also increase the chance of decryption failures. As a consequence, various message/key encoding and reconciliation techniques have been proposed that usually encode one payload bit into several coefficients. In this work, we analyze how error-correcting codes can be used to enhance the error resilience of protocols like NewHope, Frodo, or Kyber. For our case study, we focus on the recently introduced NewHope Simple and propose and analyze four options for error correction: (i) a BCH code; (ii) a combination of a BCH code and additive threshold encoding; (iii) an LDPC code; and (iv) a combination of BCH and LDPC codes. We show that lattice-based cryptography can profit from classical and modern codes by combining BCH and LDPC codes. This way we achieve quasi-error-free communication, an increase of the estimated post-quantum bit-security level by 20.39%, and a decrease of the communication overhead by 12.8%.
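
    As an illustration of option (ii)'s threshold component, the sketch below repeats one payload bit into k coefficients mod q and decodes with a summed distance test. The modulus, redundancy k, and noise magnitude are illustrative, not the NewHope Simple parameters.

        # Additive threshold encoding of one bit, minimal sketch (toy values).
        import random

        q, k = 12289, 4                           # toy modulus and redundancy

        def encode_bit(b):
            return [(b * (q // 2)) % q] * k       # bit -> 0 or q/2, k times

        def add_noise(coeffs, magnitude=1500):
            return [(c + random.randint(-magnitude, magnitude)) % q
                    for c in coeffs]

        def cyc_dist(c, t):                       # distance on Z_q
            d = abs(c - t)
            return min(d, q - d)

        def decode_bit(coeffs):
            dist0 = sum(cyc_dist(c, 0) for c in coeffs)
            dist1 = sum(cyc_dist(c, q // 2) for c in coeffs)
            return 1 if dist1 < dist0 else 0      # nearer codeword wins

        bit = 1
        print(decode_bit(add_noise(encode_bit(bit))) == bit)  # True w.h.p.

    Summing distances over k coefficients before thresholding means a single badly perturbed coefficient cannot flip the bit on its own, which is the error resilience the abstract exploits.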

    Improving Network-on-Chip-based Turbo Decoder Architectures

    In this work novel results concerning Network-on-Chip-based turbo decoder architectures are presented. Building on previous publications, this work first concentrates on improving throughput by exploiting adaptive-bandwidth-reduction techniques; in the best case, this technique improves throughput by more than 60 Mb/s. Moreover, double-binary turbo decoders are known to require a larger area than binary ones, which has the negative effect of increasing the data width of the network nodes. The second contribution of this work is therefore to reduce the network complexity required to support double-binary codes by exploiting bit-level and pseudo-floating-point representations of the extrinsic information. These two techniques allow an area reduction of more than 40% with a performance degradation of about 0.2 dB.
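
    A hedged sketch of what a pseudo-floating-point representation of extrinsic information could look like: each fixed-point magnitude travels over the network as a short mantissa plus exponent, narrowing the flit width. The bit widths are illustrative and the paper's exact format is not reproduced; the sign of an extrinsic value would be carried as a separate bit.

        # Pseudo-floating-point compression of a non-negative extrinsic value.
        def to_pseudo_float(x, mant_bits=4, exp_bits=3):
            exp = 0
            while x >= (1 << mant_bits) and exp < (1 << exp_bits) - 1:
                x >>= 1                           # drop LSBs, remember the shift
                exp += 1
            return x, exp                         # mant_bits + exp_bits in total

        def from_pseudo_float(mant, exp):
            return mant << exp                    # lossy reconstruction

        for v in (3, 27, 200, 1000):
            m, e = to_pseudo_float(v)
            print(v, "->", (m, e), "->", from_pseudo_float(m, e))

    The reconstruction is lossy for large magnitudes, the kind of approximation that trades a small decoding-performance penalty for narrower network links.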