
    Variations of the McEliece Cryptosystem

    Two variations of the McEliece cryptosystem are presented. The first is based on a relaxation of the column permutation in the classical McEliece scrambling process, done in such a way that the Hamming weight of the error added in the encryption process can be controlled so that efficient decryption remains possible. The second variation is based on the use of spatially coupled moderate-density parity-check codes as secret codes. These codes are known for their excellent error-correction performance and allow for a relatively small key size in the cryptosystem. For both variants, the security with respect to known attacks is discussed.
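    For readers unfamiliar with the scrambling process referred to above, the following is a minimal toy sketch of the textbook McEliece scheme over GF(2), using the [7,4] Hamming code (t = 1) as a stand-in for a real secret code; the matrices, helper names and parameters are illustrative assumptions, not the construction of the paper.

        import numpy as np

        def gf2_inv(M):
            """Invert a square 0/1 matrix over GF(2); return None if it is singular."""
            n = M.shape[0]
            A = np.hstack([M % 2, np.eye(n, dtype=int)])
            for col in range(n):
                pivots = np.nonzero(A[col:, col])[0]
                if len(pivots) == 0:
                    return None
                A[[col, col + pivots[0]]] = A[[col + pivots[0], col]]
                for r in range(n):
                    if r != col and A[r, col]:
                        A[r] = (A[r] + A[col]) % 2
            return A[:, n:]

        # Secret code: [7,4] Hamming code, corrects t = 1 error (toy stand-in for a Goppa code).
        G = np.array([[1,0,0,0,1,1,0],
                      [0,1,0,0,1,0,1],
                      [0,0,1,0,0,1,1],
                      [0,0,0,1,1,1,1]])
        H = np.array([[1,1,0,1,1,0,0],
                      [1,0,1,1,0,1,0],
                      [0,1,1,1,0,0,1]])

        rng = np.random.default_rng(1)
        while True:                                   # random invertible scrambler S
            S = rng.integers(0, 2, (4, 4))
            S_inv = gf2_inv(S)
            if S_inv is not None:
                break
        P = np.eye(7, dtype=int)[rng.permutation(7)]  # secret column permutation
        G_pub = S @ G @ P % 2                         # public generator matrix

        # Encryption: a codeword of the public code plus a weight-t error.
        m = np.array([1, 0, 1, 1])
        e = np.zeros(7, dtype=int)
        e[rng.integers(7)] = 1
        c = (m @ G_pub + e) % 2

        # Decryption: undo P, syndrome-decode the single error, undo S.
        y = c @ P.T % 2                               # = m S G + e P^{-1}
        s = H @ y % 2
        if s.any():
            y[next(j for j in range(7) if np.array_equal(H[:, j], s))] ^= 1
        assert np.array_equal(y[:4] @ S_inv % 2, m)   # G is systematic, so the first 4 bits are m S

    The variants in the abstract change the roles played here by P (relaxed to more general column operations with controlled error weight) and by the secret code (spatially coupled MDPC instead of an algebraic code).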

    Error-correcting pairs: a new approach to code-based cryptography

    McEliece proposed the first public-key cryptosystem based on linear error-correcting codes. A code with an efficient bounded-distance decoding algorithm is chosen as the secret key, and it is assumed that the chosen code looks like a random code. The known efficient bounded-distance decoding algorithms of the code families proposed for code-based cryptography, such as Reed-Solomon codes, Goppa codes, alternant codes and algebraic geometry codes, can be described in terms of error-correcting pairs (ECP). This means that the McEliece cryptosystem is not only based on the intractability of bounded-distance decoding but also on the problem of retrieving an error-correcting pair from the public code. In this article we propose the class of codes with a t-ECP whose error-correcting pair is not easily reconstructed from a given generator matrix.
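    As background for the t-ECP notion used above, the following is the formulation commonly found in the error-correcting pair literature, recalled here as a sketch; the exact conditions used in the article may differ slightly.

        % Componentwise (star) product of vectors and codes:
        \[
          a \ast b = (a_1 b_1, \dots, a_n b_n), \qquad
          A \ast B = \langle \{\, a \ast b : a \in A,\ b \in B \,\} \rangle .
        \]
        % A pair $(A, B)$ of linear codes of length $n$ is a $t$-error-correcting
        % pair for the code $C$ if
        \[
          A \ast B \subseteq C^{\perp}, \qquad
          \dim A > t, \qquad
          d(B^{\perp}) > t, \qquad
          d(A) + d(C) > n .
        \]

    Given such a pair, any received word y = c + e with c in C and wt(e) <= t can be decoded in polynomial time, which is why recovering an ECP for the public code is the key-recovery problem discussed in the abstract.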

    A Survey on Homomorphic Encryption Schemes: Theory and Implementation

    Legacy encryption systems depend on sharing a key (public or private) among the peers involved in exchanging an encrypted message. However, this approach poses privacy concerns, especially with popular cloud services, where control over the privacy of sensitive data is lost. Even when the keys are not shared, the encrypted material is shared with a third party that does not necessarily need to access the content. Moreover, untrusted servers, providers, and cloud operators can keep identifying elements of users long after users end their relationship with the services. Homomorphic Encryption (HE), a special kind of encryption scheme, can address these concerns, as it allows any third party to operate on the encrypted data without decrypting it in advance. Although this extremely useful feature of HE has been known for over 30 years, the first plausible and achievable Fully Homomorphic Encryption (FHE) scheme, which allows any computable function to be performed on the encrypted data, was introduced by Craig Gentry in 2009. Even though this was a major achievement, implementations so far have demonstrated that FHE still needs to improve significantly to be practical on every platform. First, we present the basics of HE and the details of the well-known Partially Homomorphic Encryption (PHE) and Somewhat Homomorphic Encryption (SWHE) schemes, which are important pillars on the way to FHE. Then, the main FHE families, which have become the base for the follow-up FHE schemes, are presented. Furthermore, the implementations and recent improvements in Gentry-type FHE schemes are surveyed. Finally, further research directions are discussed. This survey is intended to give a clear foundation to researchers and practitioners interested in learning about, applying, and extending state-of-the-art HE, PHE, SWHE, and FHE systems.
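    As a minimal illustration of the partially homomorphic property the survey builds on: textbook (unpadded) RSA is multiplicatively homomorphic, so the product of two ciphertexts decrypts to the product of the plaintexts. The sketch below uses toy parameters for demonstration only; it is not the survey's construction and is not secure.

        # Unpadded RSA as a Partially Homomorphic Encryption (PHE) example.
        p, q = 61, 53
        n = p * q
        phi = (p - 1) * (q - 1)
        e = 17
        d = pow(e, -1, phi)                  # modular inverse of e (Python 3.8+)

        def enc(m):
            return pow(m, e, n)

        def dec(c):
            return pow(c, d, n)

        m1, m2 = 12, 7
        c_prod = (enc(m1) * enc(m2)) % n     # operate on ciphertexts only
        assert dec(c_prod) == (m1 * m2) % n  # equals the product of the plaintexts

    FHE extends this idea from a single operation (here, multiplication) to arbitrary computable functions on ciphertexts, which is what makes Gentry's 2009 result significant.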

    Some New Mathematical Tools in Cryptology

    In this paper, some new mathematical techniques used in the design and analysis of cipher systems have been reviewed. Firstly, some modern cryptosystems, such as stream ciphers, permutation-based systems and public-key encryption systems, are described and the mathematical tools used in their design are outlined. Special emphasis has been laid on the problems related to the application of computational complexity to cryptosystems. Recent work on the design of systems based on combined encryption and coding for error correction has also been reviewed, and some recent system-oriented techniques of cryptanalysis have been discussed. It has been brought out that, with the increase in the complexity of cryptosystems, it is necessary to apply statistical and classification techniques both for identifying a cryptosystem and for classifying the total key set into smaller classes. Finally, some very recent work on the application of artificial intelligence techniques in cryptography and cryptanalysis has been mentioned.

    Folding Alternant and Goppa Codes with Non-Trivial Automorphism Groups

    The main practical limitation of the McEliece public-key encryption scheme is probably the size of its key. A well-known approach to overcoming this issue is to focus on subclasses of alternant/Goppa codes with a non-trivial automorphism group. Such codes then display symmetries that allow compact parity-check or generator matrices. For instance, a key reduction is obtained by taking quasi-cyclic (QC) or quasi-dyadic (QD) alternant/Goppa codes. We show that the use of such symmetric alternant/Goppa codes in cryptography introduces a fundamental weakness. It is indeed possible to reduce key recovery on the original symmetric public code to key recovery on a (much) smaller code that no longer has any symmetries. This result is obtained thanks to a new operation on codes called folding, which exploits the knowledge of the automorphism group. This operation consists in adding the coordinates of codewords that belong to the same orbit under the action of the automorphism group, as sketched below. The advantage is twofold: the reduction factor can be as large as the size of the orbits, and the operation preserves a fundamental property: folding the dual of an alternant (resp. Goppa) code provides the dual of an alternant (resp. Goppa) code. A key point is to show that all the existing constructions of alternant/Goppa codes with symmetries follow a common principle of taking codes whose support is globally invariant under the action of affine transformations (building upon prior works of T. Berger and A. Dür). This enables us not only to present a unified view but also to generalize the construction of QC, QD and even quasi-monoidic (QM) Goppa codes. All in all, our results can be harnessed to boost any key-recovery attack on McEliece systems based on symmetric alternant or Goppa codes, and in particular algebraic attacks.
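    The folding operation described above amounts to summing, over the base field, the coordinates of a codeword that lie in the same orbit of the automorphism group. The following is a hypothetical sketch over GF(2); the function name, the toy orbit layout and the example codeword are assumptions for illustration, not the paper's construction.

        # Hypothetical folding sketch: sum the coordinates inside each orbit (here over GF(2)).
        def fold(codeword, orbits, q=2):
            """codeword: sequence of field elements; orbits: a partition of {0, ..., n-1}."""
            return [sum(codeword[i] for i in orbit) % q for orbit in orbits]

        # Toy example: length 6, automorphism group generated by the permutation that
        # cyclically shifts positions (0 1 2) and (3 4 5); its orbits have size 3.
        orbits = [(0, 1, 2), (3, 4, 5)]
        c = [1, 0, 1, 1, 1, 0]
        print(fold(c, orbits))   # -> [0, 0], a word of the folded (length-2) code

    The reduction factor mentioned in the abstract corresponds to the orbit size: here a length-6 word folds to a length-2 word, and in the cryptanalytic setting the folded code is the smaller, symmetry-free code on which key recovery is performed.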

    Cryptanalysis of Two McEliece Cryptosystems Based on Quasi-Cyclic Codes

    We cryptanalyse here two variants of the McEliece cryptosystem based on quasi-cyclic codes. Both aim at reducing the key size by restricting the public and secret generator matrices to be in quasi-cyclic form. The first variant considers subcodes of a primitive BCH code. We prove that this variant is not secure by finding and solving a linear system satisfied by the entries of the secret permutation matrix. The other variant uses quasi-cyclic low-density parity-check (LDPC) codes. This scheme was devised to be immune against general attacks on McEliece-type cryptosystems based on LDPC codes by choosing, in the McEliece scheme, more general one-to-one mappings than permutation matrices. We present a structural attack that exploits the quasi-cyclic structure of the code and a weakness in the choice of the linear transformations that hide the generator matrix of the code. Our analysis shows that, with high probability, a parity-check matrix of a punctured version of the secret code can be recovered in time cubic in its length. The complete reconstruction of the secret parity-check matrix of the quasi-cyclic LDPC code requires a search for codewords of low weight, which can be done with about 2^37 operations for the specific parameters proposed.
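    For readers unfamiliar with the structure exploited by such attacks, a quasi-cyclic parity-check matrix is built from circulant blocks, each fully determined by its first row. The sketch below uses an arbitrary toy block size and arbitrary first rows, chosen only to illustrate how much redundancy the quasi-cyclic form imposes.

        import numpy as np

        def circulant(first_row):
            """Binary circulant block: row k is the first row cyclically shifted right by k."""
            first_row = np.asarray(first_row) % 2
            return np.array([np.roll(first_row, k) for k in range(len(first_row))])

        # Toy quasi-cyclic parity-check matrix made of two 5x5 circulant blocks,
        # so the whole 5x10 matrix is described by just two first rows.
        H = np.hstack([circulant([1, 1, 0, 1, 0]), circulant([1, 0, 0, 1, 1])])

    This compactness is exactly what makes the keys small, and it is also what gives a structural attack so few unknowns to recover.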