
    Improving the efficiency of the LDPC code-based McEliece cryptosystem through irregular codes

    We consider the framework of the McEliece cryptosystem based on LDPC codes, which is a promising post-quantum alternative to classical public key cryptosystems. The use of LDPC codes in this context makes it possible to achieve good security levels with very compact keys, an important advantage over the classical McEliece cryptosystem based on Goppa codes. However, only regular LDPC codes have been considered so far, while further improvement can be achieved by using irregular LDPC codes, which are known to offer better error correction performance than regular ones. This paper shows such an improvement, to the best of our knowledge for the first time. The possible use of irregular transformation matrices is also investigated, which further increases the efficiency of the system, especially with regard to the public key size.
    Comment: 6 pages, 3 figures, presented at ISCC 201
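    For illustration, the difference between a regular and an irregular LDPC code lies in the column-degree (variable-node degree) profile of the parity-check matrix. The Python sketch below is a toy construction under assumed dimensions and an assumed degree distribution; it is not the codes or parameters proposed in the paper.

```python
# Toy sketch (not the paper's construction): build a binary parity-check matrix
# whose column (variable-node) degrees follow a chosen profile, contrasting a
# regular profile with an irregular one. All sizes/degrees are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def parity_check_from_degrees(n, r, col_degrees):
    """Return an r x n binary matrix H whose column j has col_degrees[j] ones."""
    H = np.zeros((r, n), dtype=np.uint8)
    for j, d in enumerate(col_degrees):
        rows = rng.choice(r, size=int(d), replace=False)   # place d ones in column j
        H[rows, j] = 1
    return H

n, r = 32, 16
regular = [3] * n                                               # every column has degree 3
irregular = rng.choice([2, 3, 8], size=n, p=[0.5, 0.3, 0.2])    # mixed column degrees

H_reg = parity_check_from_degrees(n, r, regular)
H_irr = parity_check_from_degrees(n, r, irregular)
print("regular column degrees:  ", sorted(set(int(x) for x in H_reg.sum(axis=0))))
print("irregular column degrees:", sorted(set(int(x) for x in H_irr.sum(axis=0))))
```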

    New cryptanalysis of LFSR-based stream ciphers and decoders for p-ary QC-MDPC codes

    The security of modern cryptography is based on the hardness of solving certain problems. In this context, a problem is considered hard if there is no known polynomial-time algorithm to solve it. Initially, the security assessment of cryptographic systems only considered adversaries with classical computational resources, i.e., digital computers. It is now known that there exist polynomial-time quantum algorithms that would render certain cryptosystems insecure if large-scale quantum computers were available. Thus, adversaries with access to such computers should also be considered. In particular, cryptosystems based on the hardness of integer factorisation or the discrete logarithm problem would be broken. For some others, such as symmetric-key cryptosystems, the impact seems not to be as serious; it is recommended to at least double the key size of currently used systems to preserve their security level. The potential threat posed by sufficiently powerful quantum computers motivates the continued study and development of post-quantum cryptography, that is, cryptographic systems that are secure against adversaries with access to quantum computation. It is believed that symmetric-key cryptosystems should be secure against quantum attacks. In this manuscript, we study the security of one such family of systems, namely stream ciphers. They are mainly used in applications where high throughput is required in software or low resource usage is required in hardware. Our focus is on the cryptanalysis of stream ciphers employing linear feedback shift registers (LFSRs). This is modelled as the problem of finding solutions to systems of linear equations with associated probability distributions on the set of right-hand sides. To solve this problem, we first present a multivariate version of the correlation attack introduced by Siegenthaler. Building on the ideas of the multivariate attack, we propose a new cryptanalytic method with lower time complexity. Alongside this, we introduce the notion of relations modulo a matrix B, which may be seen as a generalisation of the parity-checks used in fast correlation attacks. The latter form one of the most important classes of attacks against LFSR-based stream ciphers. Our new method is successfully applied to hard instances of the filter generator and requires less keystream than other attacks in the literature. We also perform a theoretical attack against the Grain-v1 cipher and an experimental attack against a toy Grain-like cipher. Compared to the best previous attack, our technique requires fewer keystream bits but has a higher time complexity. This is the result of joint work with Semaev.

    Public-key cryptosystems based on error-correcting codes are also believed to be secure against quantum attacks. To this end, we develop a new technique in code-based cryptography. Specifically, we propose new decoders for quasi-cyclic moderate density parity-check (QC-MDPC) codes. These codes were proposed by Misoczki et al. for use in the McEliece scheme. The use of QC-MDPC codes avoids attacks applicable when using low-density parity-check (LDPC) codes and also allows for short keys. Although we focus on decoding for a particular instance of the p-ary QC-MDPC scheme, our new decoding algorithm is also a general decoding method for p-ary MDPC-like schemes. The algorithm is a bit-flipping decoder, and its performance is improved by varying the threshold across iterations. Experimental results demonstrate that our decoders enjoy a very low decoding failure rate for the chosen p-ary QC-MDPC instance. This is the result of joint work with Guo and Johansson. (Doctoral dissertation.)
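    As a rough illustration of the decoding idea, the following Python sketch implements a plain binary bit-flipping decoder with a separate threshold per iteration. The decoders developed in the thesis operate on p-ary QC-MDPC codes, so this binary toy, with assumed inputs and thresholds, only conveys the general mechanism.

```python
# Minimal binary bit-flipping decoder with per-iteration thresholds.
# Simplified illustration only: the thesis treats the p-ary QC-MDPC case;
# H, y and the threshold schedule here are assumptions.
import numpy as np

def bit_flip_decode(H, y, thresholds):
    """Attempt to correct errors in the 0/1 vector y so that H @ c = 0 (mod 2)."""
    H = np.asarray(H)
    c = np.asarray(y).copy()
    for T in thresholds:                    # one threshold per iteration
        syndrome = H @ c % 2
        if not syndrome.any():
            return c, True                  # all parity checks satisfied
        upc = H.T @ syndrome                # per-bit count of unsatisfied checks
        flip = upc >= T                     # flip bits whose count reaches the threshold
        if not flip.any():
            break                           # no bit qualifies; give up
        c[flip] ^= 1
    return c, not (H @ c % 2).any()
```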

    Decryption Failure Attacks on Post-Quantum Cryptography

    This dissertation mainly presents new cryptanalytic results related to securely implementing the next generation of asymmetric cryptography, or public-key cryptography (PKC). PKC, as it has been deployed until today, depends heavily on the integer factorization and discrete logarithm problems. Unfortunately, it has been well known since the mid-90s that these mathematical problems can be solved in polynomial time on a quantum computer using Peter Shor's algorithm. The recently accelerated pace of R&D towards quantum computers, eventually of sufficient size and power to threaten cryptography, has led the crypto research community towards a major shift of focus. A project towards standardization of post-quantum cryptography (PQC) was launched by the US-based standardization organization NIST. PQC is the name given to algorithms designed to run on classical hardware/software while being resistant to attacks from quantum computers. PQC is well suited to replace the current asymmetric schemes. A primary motivation for the project is to guide publicly available research toward the singular goal of finding weaknesses in the proposed next generation of PKC. For public-key encryption (PKE) or digital signature (DS) schemes to be considered secure, they must be shown to rely heavily on well-known mathematical problems, with theoretical proofs of security under established models such as indistinguishability under chosen-ciphertext attack (IND-CCA). They must also withstand serious attack attempts by well-renowned cryptographers, concerning both theoretical security and the actual software/hardware instantiations. It is well known that security models such as IND-CCA are not designed to capture the intricacies of inner-state leakages. Such leakages are called side-channels, currently a major topic of interest in the NIST PQC project. This dissertation focuses on two questions: 1) how does the low but non-zero probability of decryption failures affect the cryptanalysis of these new PQC candidates? And 2) how might side-channel vulnerabilities inadvertently be introduced when going from theory to the practice of software/hardware implementations? The main concern is PQC algorithms based on lattice theory and coding theory. The primary contributions are the discovery of novel decryption failure side-channel attacks, improvements on existing attacks, an alternative implementation of a part of a PQC scheme, and some more theoretical cryptanalytic results.

    Error-Correction Coding and Decoding: Bounds, Codes, Decoders, Analysis and Applications

    Coding; Communications; Engineering; Networks; Information Theory; Algorithm

    Cryptography based on the Hardness of Decoding

    This thesis provides progress in the fields of lattice-based and code-based cryptography. The first contribution consists of constructions of IND-CCA2 secure public key cryptosystems from both the McEliece assumption and the low-noise learning parity with noise (LPN) assumption. The second contribution is a novel instantiation of the lattice-based learning with errors problem which uses uniform errors.
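    For reference, the learning parity with noise (LPN) problem underlying the first contribution asks to recover a secret s from noisy inner products. A minimal sketch of LPN sample generation, with illustrative parameters that are not those used in the thesis, might look as follows.

```python
# Toy sketch of (low-noise) LPN sample generation: samples (A, b) with
# b = A @ s + e mod 2 and e ~ Bernoulli(tau). Parameters are illustrative
# assumptions, not the instantiation from the thesis.
import numpy as np

rng = np.random.default_rng(1)

def lpn_samples(s, m, tau):
    """Return m samples: random vectors A and noisy inner products b."""
    n = len(s)
    A = rng.integers(0, 2, size=(m, n))       # uniformly random public vectors
    e = (rng.random(m) < tau).astype(int)     # sparse (low-noise) error vector
    b = (A @ s + e) % 2                       # noisy inner products with the secret
    return A, b

s = rng.integers(0, 2, size=128)              # secret vector (assumed length)
A, b = lpn_samples(s, m=512, tau=1 / 64)      # low noise rate tau (assumed)
```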

    Silver: Silent VOLE and Oblivious Transfer from Hardness of Decoding Structured LDPC Codes

    We put forth new protocols for oblivious transfer extension and vector OLE, called Silver, for SILent Vole and oblivious transfER. Silver offers extremely high performance: generating 10 million random OTs on one core of a standard laptop requires only 300ms of computation and 122KB of communication. This represents 37% less computation and ~1300x less communication than the standard IKNP protocol, as well as ~4x less computation and ~4x less communication than the recent protocol of Yang et al. (CCS 2020). Silver is silent: after a one-time cheap interaction, two parties can store small seeds, from which they can later locally generate a large number of OTs while remaining offline. Neither IKNP nor Yang et al. enjoys this feature; compared to the best known silent OT extension protocol of Boyle et al. (CCS 2019), upon which we build, Silver has 19x less computation and the same communication. Due to its attractive efficiency features, Silver yields major efficiency improvements in numerous MPC protocols. Our approach is a radical departure from the standard paradigm for building MPC protocols, in that we do not attempt to base our constructions on a well-studied assumption. Rather, we follow an approach closer in spirit to the standard paradigm in the design of symmetric primitives: we identify a set of fundamental structural properties that allow us to withstand all known attacks, and put forth a candidate design guided by our analysis. We also rely on extensive experimentation to analyze our candidate and experimentally validate its properties. In essence, our approach boils down to constructing new families of linear codes with (plausibly) high minimum distance and extremely low encoding time. While further analysis is of course warranted to confidently assess the security of Silver, we hope and believe that initiating this approach to the design of MPC primitives will pave the way to new secure primitives with extremely attractive efficiency features.
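    The efficiency claim rests on codes whose encoding touches only a few input bits per output bit. The Python sketch below illustrates that general idea with a sparse random generator under assumed dimensions and column weight; it is not Silver's code family and makes no claim about minimum distance.

```python
# Illustrative sparse encoding: each output bit is the XOR of a few input bits,
# so encoding cost scales with the (small) number of ones per output position.
# Dimensions and sparsity are assumptions; this is not Silver's code family.
import numpy as np

rng = np.random.default_rng(2)
k, n, d = 64, 128, 5    # message length, code length, ones per output (assumed)

# the generator is described by, for each output position, the d inputs to XOR
columns = [rng.choice(k, size=d, replace=False) for _ in range(n)]

def encode(msg):
    """Encode a 0/1 message of length k; cost is d XORs per output symbol."""
    return np.array([msg[idx].sum() % 2 for idx in columns])

codeword = encode(rng.integers(0, 2, size=k))
```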

    Applications of Sparse Codes: Batched Zigzag Fountain Codes and WOM Codes

    Thesis (Ph.D.), Department of Electrical and Computer Engineering, Graduate School of Seoul National University, February 2017 (advisor: Jong-Seon No).

    This dissertation contains the following two contributions on the applications of sparse codes:
    - Fountain codes: batched zigzag (BZ) fountain codes and two-phase batched zigzag (TBZ) fountain codes
    - Write-once memory (WOM) codes: WOM codes implemented by rate-compatible low-density generator matrix (RC-LDGM) codes

    First, two classes of fountain codes, called batched zigzag fountain codes and two-phase batched zigzag fountain codes, are proposed for the symbol erasure channel. At the cost of slightly lengthened code symbols, the message symbols involved in each batch of the proposed codes can be recovered by a low-complexity zigzag decoding algorithm. Thus, the proposed codes have low buffer occupancy during the decoding process. These features are suitable for receivers with limited hardware resources in the broadcasting channel. A method to obtain degree distributions of code symbols for the proposed codes via ripple size evolution is also proposed, taking into account the code symbols released from the batches. It is shown that the proposed codes outperform Luby transform codes and zigzag decodable fountain codes with respect to intermediate recovery rate and coding overhead when the message length is short, the symbol erasure rate is low, and the available buffer size is limited.

    In the second part of this dissertation, WOM codes constructed from sparse codes are presented. Recently, WOM codes have been adopted in NAND flash-based solid-state drives (SSDs) in order to extend their lifetime by reducing the number of erasure operations. Here, a new rewriting scheme for the SSD is proposed, which is implemented by multiple binary erasure quantization (BEQ) codes. The corresponding BEQ codes are constructed from RC-LDGM codes. Moreover, by combining RC-LDGM codes with a page selection method, writing efficiency can be improved. It is verified via simulation that an SSD with the proposed rewriting scheme outperforms SSDs both without WOM codes and with conventional WOM codes, for single-level cell (SLC) and multi-level cell (MLC) flash memories.

    Contents:
    1 Introduction
      1.1 Background
      1.2 Overview of Dissertation
    2 Sparse Codes
      2.1 Linear Block Codes
      2.2 LDPC Codes
      2.3 Message Passing Decoder
    3 New Fountain Codes with Improved Intermediate Recovery Based on Batched Zigzag Coding
      3.1 Preliminaries
        3.1.1 Definitions and Notation
        3.1.2 LT Codes
        3.1.3 Zigzag Decodable Codes
        3.1.4 Bit-Level Overhead
      3.2 New Fountain Codes Based on Batched Zigzag Coding
        3.2.1 Construction of Shift Matrix
        3.2.2 Encoding and Decoding of the Proposed BZ Fountain Codes
        3.2.3 Storage and Computational Complexity
      3.3 Degree Distribution of BZ Fountain Codes
        3.3.1 Relation Between Ψ(x) and Ω(x)
        3.3.2 Derivation of Ω(x) via Ripple Size Evolution
      3.4 Two-Phase Batched Zigzag Fountain Codes with Additional Memory
        3.4.1 Code Construction
        3.4.2 Bit-Level Overhead
      3.5 Numerical Analysis
    4 Write-Once Memory Codes Using Rate-Compatible LDGM Codes
      4.1 Preliminaries
        4.1.1 NAND Flash Memory
        4.1.2 Rewriting Schemes for Flash Memory
        4.1.3 Construction of Rewriting Codes by BEQ Codes
      4.2 Proposed Rewriting Codes
        4.2.1 System Model
        4.2.2 Multi-rate Rewriting Codes
        4.2.3 Page Selection for Rewriting
      4.3 RC-LDGM Codes
      4.4 Numerical Analysis
    5 Conclusions
    Bibliography
    Abstract (in Korean)
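    For context, the baseline that BZ fountain codes refine is plain LT encoding, where each code symbol is the XOR of a few randomly chosen message symbols. The Python sketch below is a minimal illustration of that baseline under an assumed degree distribution; it is not the BZ construction or the distributions derived via ripple size evolution in the dissertation.

```python
# Toy LT-style fountain encoder over the binary field: each code symbol is the
# XOR of d randomly chosen message symbols, with d drawn from a degree
# distribution. The distribution here is an assumption for illustration only.
import numpy as np

rng = np.random.default_rng(3)

def lt_symbol(message, degrees, probs):
    """Produce one code symbol and the indices (neighbors) it was built from."""
    d = rng.choice(degrees, p=probs)                              # sample a degree
    neighbors = rng.choice(len(message), size=d, replace=False)   # pick message symbols
    value = 0
    for i in neighbors:                                           # XOR the chosen symbols
        value ^= int(message[i])
    return value, neighbors                                       # neighbor set travels with the symbol

message = rng.integers(0, 2, size=20)
stream = [lt_symbol(message, [1, 2, 3, 4], [0.1, 0.5, 0.3, 0.1]) for _ in range(30)]
```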

    A survey of timing channels and countermeasures

    A timing channel is a communication channel that can transfer information to a receiver/decoder by modulating the timing behavior of an entity. Examples of such an entity include the inter-packet delays of a packet stream, the reordering of packets in a packet stream, or the resource access time of a cryptographic module. Advances in information and coding theory and the availability of high-performance computing systems interconnected by high-speed networks have spurred interest in and the development of various types of timing channels. With the emergence of complex timing channels, novel detection and prevention techniques are also being developed to counter them. In this article, we provide a detailed survey of timing channels, broadly categorized into network timing channels, in which the communicating entities are connected by a network, and in-system timing channels, in which the communicating entities are within a computing system. This survey builds on the last comprehensive survey by Zander et al. [2007] and considers all three canonical applications of timing channels, namely covert communication, timing side channels, and network flow watermarking. We survey the theoretical foundations, the implementations, and the various detection and prevention techniques that have been reported in the literature. Based on the analysis of the current literature, we discuss potential future research directions in both the design and application of timing channels and their detection and prevention techniques.
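    As a concrete example of the covert-communication application, a minimal network timing channel can encode each bit as an inter-packet delay. The Python sketch below is a toy under assumed delay values and an assumed emit_packet callback; practical channels add synchronization, framing, and error-correcting codes.

```python
# Minimal covert timing channel: the sender encodes each bit as an inter-packet
# delay (short = 0, long = 1); the receiver decodes by thresholding observed
# gaps. Delay values, the threshold and emit_packet are illustrative assumptions.
import time

DELAY_0, DELAY_1, THRESHOLD = 0.01, 0.05, 0.03     # seconds (assumed values)

def send(bits, emit_packet):
    """Covertly transmit bits by spacing otherwise innocuous packets."""
    emit_packet()                                  # reference packet
    for b in bits:
        time.sleep(DELAY_1 if b else DELAY_0)      # the gap encodes the bit
        emit_packet()

def decode(arrival_times):
    """Recover bits from observed inter-arrival gaps by thresholding."""
    gaps = [t2 - t1 for t1, t2 in zip(arrival_times, arrival_times[1:])]
    return [1 if g > THRESHOLD else 0 for g in gaps]
```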

    Contributions to Confidentiality and Integrity Algorithms for 5G

    The confidentiality and integrity algorithms in cellular networks protect the transmission of user and signaling data over the air between users and the network, e.g., the base stations. There are three standardised cryptographic suites for confidentiality and integrity protection in 4G, which are based on the AES, SNOW 3G, and ZUC primitives, respectively. These primitives provide a 128-bit security level and are usually implemented in hardware, e.g., using IP (intellectual property) cores, and can thus be quite efficient. When we come to 5G, the innovative network architecture and high performance demands pose new challenges to security. For confidentiality and integrity protection, there are new requirements on the underlying cryptographic algorithms. Specifically, these algorithms should: 1) provide 256 bits of security to protect against attackers equipped with quantum computing capabilities; and 2) provide at least 20 Gbps (gigabits per second) speed in pure software environments, which is the downlink peak data rate in 5G. The reason for considering software environments is that encryption in 5G will likely be moved to the cloud and implemented in software. Therefore, it is crucial to investigate existing 4G algorithms, checking whether they can satisfy the 5G requirements in terms of security and speed, and possibly to propose new dedicated algorithms targeting these goals. This is the motivation of this thesis, which focuses on the confidentiality and integrity algorithms for 5G. The results can be summarised as follows.

    1. We investigate the security of SNOW 3G under 256-bit keys and propose two linear attacks against it with complexities 2^172 and 2^177, respectively. These cryptanalysis results indicate that SNOW 3G cannot provide the full 256-bit security level.
    2. We design spectral tools for linear cryptanalysis and apply these tools to investigate the security of ZUC-256, the 256-bit version of ZUC. We propose a distinguishing attack against ZUC-256 with complexity 2^236, which is 2^20 times faster than exhaustive key search.
    3. We design a new stream cipher called SNOW-V in response to the new requirements for 5G confidentiality and integrity protection, in terms of security and speed. SNOW-V can provide a 256-bit security level and achieve a speed as high as 58 Gbps in software, based on our extensive evaluation. The cipher is currently under evaluation in ETSI SAGE (Security Algorithms Group of Experts) as a promising candidate for 5G confidentiality and integrity algorithms.
    4. We perform deeper cryptanalysis of SNOW-V to ensure that two common cryptanalysis techniques, guess-and-determine attacks and linear cryptanalysis, do not apply to SNOW-V faster than exhaustive key search.
    5. We introduce two minor modifications to SNOW-V and propose an extreme-performance variant, called SNOW-Vi, in response to feedback that some use cases are not fully covered by SNOW-V. SNOW-Vi covers more use cases, especially platforms with fewer capabilities. Its software speeds are increased by 50% on average over SNOW-V and can reach up to 92 Gbps.

    Besides these works on 5G confidentiality and integrity algorithms, the thesis is also devoted to local pseudorandom generators (PRGs).

    6. We investigate the security of local PRGs and propose two attacks against some constructions instantiated on the P5 predicate. The attacks improve existing results by a large gap and narrow down the secure parameter regime. We also extend the attacks to other local PRGs instantiated on general XOR-AND and XOR-MAJ predicates and provide some insight into the choice of safe parameters.
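    For readers unfamiliar with the class of primitives analysed here, LFSR-based stream ciphers such as SNOW 3G combine a linear feedback shift register with a nonlinear component. The Python toy below shows the simplest member of this family, a binary filter generator, with assumed tap positions and an assumed filter function; it is not SNOW 3G, ZUC, SNOW-V, or SNOW-Vi.

```python
# Toy binary filter generator: an LFSR updates the state and a nonlinear
# Boolean function of a few state bits produces the keystream. Taps and the
# filter function are assumptions chosen only to illustrate the structure.
def lfsr_filter_keystream(state, n_bits):
    state = list(state)                      # e.g. a 16-bit register, LSB first
    out = []
    for _ in range(n_bits):
        # nonlinear filter on a few assumed tap positions
        z = (state[0] & state[5]) ^ state[9] ^ (state[12] & state[3])
        out.append(z)
        # linear feedback from an assumed tap set
        fb = state[0] ^ state[2] ^ state[3] ^ state[5]
        state = state[1:] + [fb]
    return out

keystream = lfsr_filter_keystream([1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0, 0, 1, 0, 1], 32)
```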