13 research outputs found

    Learning strikes again: The case of the DRS signature scheme

    Lattice signature schemes generally require particular care when it comes to preventing secret information from leaking through signature transcripts. For example, the Goldreich-Goldwasser-Halevi (GGH) signature scheme and the NTRUSign scheme were completely broken by the parallelepiped-learning attack of Nguyen and Regev (Eurocrypt 2006). Several heuristic countermeasures were also shown vulnerable to similar statistical attacks. At PKC 2008, Plantard, Susilo and Win proposed a new variant of GGH, informally arguing resistance to such attacks. Based on this variant, Plantard, Sipasseuth, Dumondelle and Susilo proposed a concrete signature scheme, called DRS, that was accepted into round 1 of the NIST post-quantum cryptography project. In this work, we propose yet another statistical attack and demonstrate a weakness of the DRS scheme: one can recover partial information on the secret key from sufficiently many signatures. One difficulty is that, due to the DRS reduction algorithm, the relation between the statistical leak and the secret seems more intricate. We work around this difficulty by training a statistical model, using a few features that we designed according to a simple heuristic analysis. While we only recover partial information on the secret key, this information is easily exploited by lattice attacks, significantly decreasing their complexity. Concretely, we claim that, provided that sufficiently many signatures are available, the secret key may be recovered using BKZ-138 for the first set of DRS parameters submitted to NIST. This puts the security level of this parameter set below 80 bits (maybe even 70 bits), compared to an original claim of 128 bits.
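
    The parallelepiped-learning attack cited above starts from the second moments of signature transcripts. Below is a minimal sketch of that first step under the simplifying hidden-parallelepiped model, where signatures behave like $y = B^\top x$ with $x$ uniform on $[-1,1]^n$, so that $\mathbb{E}[yy^\top] = \frac{1}{3}B^\top B$ and the Gram matrix of the secret basis leaks; the dimensions and sample count are illustrative, and the DRS attack itself replaces these raw moments with engineered features and a trained model.

        import numpy as np

        rng = np.random.default_rng(0)
        n, samples = 8, 200_000                 # illustrative toy sizes

        B = rng.integers(-5, 6, size=(n, n)).astype(float)  # stand-in secret basis

        # Model signatures as y = B^T x with x uniform on [-1, 1]^n
        # (the hidden-parallelepiped assumption of Nguyen and Regev).
        x = rng.uniform(-1.0, 1.0, size=(samples, n))
        y = x @ B                               # row i holds B^T x_i

        # Empirical second moment: E[y y^T] = (1/3) B^T B, so the Gram
        # matrix of the secret basis is estimated from transcripts alone.
        gram_estimate = 3.0 / samples * (y.T @ y)

        print(np.abs(gram_estimate - B.T @ B).max())   # shrinks as samples grow

    Recovering the rows of $B$ themselves then requires fourth-moment minimization in the Nguyen-Regev attack; the point of the sketch is only that transcripts leak key-dependent statistics.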

    Cryptanalysis and Secure Implementation of Modern Cryptographic Algorithms

    Cryptanalytic attacks can be divided into two classes: pure mathematical attacks and Side Channel Attacks (SCAs). Pure mathematical attacks are traditional cryptanalytic techniques that rely on known or chosen input-output pairs of the cryptographic function and exploit the inner structure of the cipher to reveal the secret key information. In SCAs, on the other hand, the attacker is assumed to have some access to the cryptographic device and to gain information from its physical implementation. The cold-boot attack is an SCA that exploits the data remanence property of Random Access Memory (RAM) to retrieve its content, which remains readable shortly after power has been removed. Fault analysis is another example of an SCA, in which the attacker is assumed to be able to induce faults in the cryptographic device and observe the faulty output. Then, by careful inspection of the faulty outputs, the attacker recovers secret information, such as a secret inner state or the secret key. Scan-based Design-For-Test (DFT) is a widely deployed technique for testing hardware chips. Scan-based SCAs exploit the information obtained by analyzing the scanned data in order to retrieve secret information from cryptographic hardware devices that are designed with this testability feature. In the first part of this work, we investigate the use of an off-the-shelf SAT solver, CryptoMiniSat, to improve the recovery of Advanced Encryption Standard (AES-128) key schedules from the corresponding decayed memory images, which can be obtained using cold-boot attacks. We also present a fault analysis of both the NTRUEncrypt and NTRUSign cryptosystems. For the original instantiation of the NTRU encryption system with parameters $(N, p, q)$, our attack succeeds with probability $\approx 1 - \frac{1}{p}$ and, when the number of faulted coefficients is upper bounded by $t$, requires $O((pN)^t)$ polynomial inversions in $\mathbb{Z}/p\mathbb{Z}[x]/(x^N - 1)$. We also investigate several techniques to strengthen hardware implementations of NTRUEncrypt against this class of attacks. For NTRUSign with parameters $(N, q = p^l, \mathcal{B}, \text{standard}, \mathcal{N})$, when the attacker is able to skip the norm-bound signature checking step, our attack needs one fault to succeed with probability $\approx 1 - \frac{1}{p}$ and requires $O((qN)^t)$ steps when the number of faulted polynomial coefficients is upper bounded by $t$. The attack is also applicable to NTRUSign utilizing the transpose NTRU lattice, but it requires double the number of fault injections. Different countermeasures against the proposed attack are also investigated. Furthermore, we present a scan-based SCA on NTRUEncrypt hardware implementations that employ scan-based DFT techniques. Our attack determines the scan chain structure of the polynomial multiplication circuits used in the decryption algorithm, which allows the cryptanalyst to efficiently retrieve the secret key. Several key agreement schemes based on matrices were recently proposed. For example, Álvarez et al. proposed a scheme in which the secret key is obtained by multiplying powers of block upper triangular matrices whose elements are defined over $\mathbb{Z}_p$. Climent et al. identified the elements of the endomorphism ring $\mathrm{End}(\mathbb{Z}_p \times \mathbb{Z}_{p^2})$ with elements of a set, $E_p$, of $2 \times 2$ matrices whose first-row entries belong to $\mathbb{Z}_p$ and whose second-row entries belong to $\mathbb{Z}_{p^2}$. Keith Salvin presented a key exchange protocol using matrices in the general linear group $GL(r, \mathbb{Z}_n)$, where $n$ is the product of two distinct large primes. The system is fully specified in US patent number 7346162, issued in 2008. In the second part of this work, we present mathematical cryptanalytic attacks against these three schemes and show that they can be easily broken for all practical choices of their security parameters.
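
    To make the shape of such matrix-based key agreement concrete, the following toy sketch runs a Diffie-Hellman-style exchange with powers of a public matrix over $\mathbb{Z}_n$; the modulus, matrix size and exponents are illustrative stand-ins rather than parameters of any of the three schemes above.

        import numpy as np

        def mat_pow_mod(M, e, n):
            """Square-and-multiply exponentiation of a matrix modulo n."""
            R = np.eye(M.shape[0], dtype=np.int64)
            M = M % n
            while e:
                if e & 1:
                    R = (R @ M) % n
                M = (M @ M) % n
                e >>= 1
            return R

        n = 3233                                        # toy modulus 61 * 53, far too small
        M = np.array([[1, 2], [3, 4]], dtype=np.int64)  # public base matrix over Z_n

        a, b = 123, 456                                 # Alice's and Bob's secret exponents
        A = mat_pow_mod(M, a, n)                        # Alice -> Bob
        B = mat_pow_mod(M, b, n)                        # Bob -> Alice

        # Powers of a fixed matrix commute, so both parties obtain M^(a*b) mod n.
        assert (mat_pow_mod(B, a, n) == mat_pow_mod(A, b, n)).all()

    The security of such proposals rests on the hardness of recovering the exponents (or the shared power) from the exchanged matrices; the attacks summarized above show that, for these three schemes, this is easy for all practical parameter choices.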

    NTRU-KE: A Lattice-based Public Key Exchange Protocol

    Public key exchange protocols are an important application of public-key cryptography. Most existing public key exchange schemes are Diffie-Hellman (DH)-type, whose security is based on DH problems over different groups. Since Shor's polynomial-time algorithm solves these DH problems once a quantum computer is available, we are motivated to seek a non-DH-type, quantum-resistant key exchange protocol. To this end, we turn our attention to lattice-based cryptography. The methodology behind our roadmap is that, in analogy to the link between ElGamal, DSA, and DH, one should expect an NTRU lattice-based key exchange primitive related to NTRUEncrypt and NTRUSign. However, this expected key exchange protocol has not been presented yet. In this paper, this missing key exchange protocol is found, hereafter referred to as NTRU-KE, and studied with respect to security and key-mismatch failure. In comparison with ECDH (Elliptic Curve-based Diffie-Hellman), NTRU-KE features faster computation, resistance to quantum attack, and higher communication overhead. Accordingly, we conclude that NTRU-KE is currently comparable with ECDH, but a decisive advantage of NTRU-KE will emerge when quantum computers become a reality.
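
    For contrast, here is the DH-type template the abstract refers to, as a toy finite-field exchange with insecure demonstration parameters; NTRU-KE keeps the same two-message pattern but replaces group exponentiation with NTRU polynomial arithmetic, taking it outside the reach of Shor's algorithm.

        import secrets

        # Toy DH-type exchange over Z_p*; p and g are demonstration values only,
        # real deployments use 2048-bit-plus moduli or elliptic-curve groups.
        p = 4294967291                     # largest prime below 2^32
        g = 5

        a = secrets.randbelow(p - 2) + 1   # Alice's secret exponent
        b = secrets.randbelow(p - 2) + 1   # Bob's secret exponent

        A = pow(g, a, p)                   # Alice -> Bob
        B = pow(g, b, p)                   # Bob -> Alice

        # Both parties derive g^(a*b) mod p. Shor's algorithm recovers a from
        # (g, A) in quantum polynomial time, which is what motivates a
        # lattice-based replacement for this template.
        assert pow(B, a, p) == pow(A, b, p)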

    On the Design and Improvement of Lattice-based Cryptosystems

    Digital signatures and encryption schemes arguably constitute an integral part of cryptography, with the goal of meeting the security needs of present and future private and business applications. However, almost all public key cryptosystems applied in practice are put at risk by their vulnerability to quantum attacks as a result of Shor's quantum algorithm. The potential economic and social impact is tremendous, inherently calling for alternatives to replace classical schemes should large-scale quantum computers be built. Lattice-based cryptography has emerged as a powerful candidate, attracting attention not only due to its conjectured resistance to quantum attacks, but also because of its unique security guarantee: worst-case hardness of average-case instances. Hence, the requirement of imposing further assumptions on the hardness of randomly chosen instances disappears, resulting in more efficient instantiations of cryptographic schemes. The best known lattice attack algorithms run in exponential time. In this thesis we contribute to a smooth transition into a world with practically efficient lattice-based cryptographic schemes, by designing new algorithms and cryptographic schemes as well as improving existing ones. Our contributions are threefold. First, we construct new encryption schemes that fully exploit the error term in LWE instances. To this end, we introduce a novel computational problem that we call Augmented LWE (A-LWE), differing from the original LWE problem only in the way the error term is produced. In fact, we embed arbitrary data into the error term without changing the target distributions. Following this, we prove that A-LWE instances are indistinguishable from LWE samples. This allows us to build powerful encryption schemes on top of the A-LWE problem that are simple in their representation and efficient in practice, encrypting large amounts of data while realizing message expansion factors close to 1. This improves, to our knowledge, upon all existing encryption schemes. Due to the versatility of the error term, we further add various security features such as CCA and RCCA security, or even plug lattice-based signatures into parts of the error term, thus providing an additional mechanism to authenticate encrypted data. Based on this methodology of embedding arbitrary data into the error term while preserving the target distributions, we realize a novel CDT-like discrete Gaussian sampler that beats the best known samplers, such as Knuth-Yao or the standard CDT sampler, in terms of running time. At run time, the table size, amounting to 44 elements, is constant for every discrete Gaussian parameter, and the total space requirements are exactly as large as for the standard CDT sampler. Further results include a very efficient inversion algorithm for ring elements in special classes of cyclotomic rings: by use of the NTT it is possible to efficiently check for invertibility and deduce a representation of the corresponding unit group. Moreover, we generalize the LWE inversion algorithm for the trapdoor candidate of Micciancio and Peikert from power-of-two moduli to arbitrary composite integers using a different approach. In the second part of this thesis, we present an efficient trapdoor construction for ideal lattices and an associated description of the GPV signature scheme. Furthermore, we improve the signing step using a different representation of the involved perturbation matrix, leading to improved memory usage and running times. Subsequently, we introduce an advanced compression algorithm for GPV signatures, which previously suffered from huge signature sizes as a result of the construction or of requirements of the security proof. We circumvent this problem by introducing the notion of public and secret randomness for signatures. In particular, we generate the public portion of a signature from a short uniformly random seed without violating the previous conditions. This concept is subsequently transferred to the multi-signer setting, which increases the efficiency of the compression scheme in the presence of multiple signers. Finally, in this part we propose the first lattice-based sequential aggregate signature scheme, which enables a group of signers to sequentially generate an aggregate signature of reduced storage size such that the verifier is still able to check that each signer indeed signed a message. This approach is realized using lattice-based trapdoor functions and has many application areas, such as wireless sensor networks. In the final part of this thesis, we extend the theoretical foundations of lattices and propose new representations of lattice problems by use of Cauchy integrals. Considering lattice points as simple poles of certain complex functions allows us to operate on lattice points via Cauchy integrals and their generalizations. For instance, in the one- and two-dimensional cases we deduce simple expressions for the number of lattice points inside a domain using trigonometric or elliptic functions.
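
    For background on the sampler comparison above, the following is a minimal sketch of the standard CDT (cumulative distribution table) discrete Gaussian sampler, i.e. the baseline the thesis's variant is measured against, not the thesis's own construction; the tail-cut factor and parameter names are illustrative.

        import bisect
        import math
        import random

        def build_cdt(sigma, tail=12):
            """Cumulative table Pr[|X| <= k] for the discrete Gaussian on Z.

            Symmetry around zero is exploited, so only the non-negative side
            of the support, k = 0 .. ceil(tail * sigma), is tabulated.
            """
            bound = int(math.ceil(tail * sigma))
            w = [math.exp(-(k * k) / (2.0 * sigma * sigma)) for k in range(bound + 1)]
            total = w[0] + 2.0 * sum(w[1:])       # +k and -k each carry mass w[k]
            cdt, acc = [], w[0] / total
            cdt.append(acc)
            for k in range(1, bound + 1):
                acc += 2.0 * w[k] / total
                cdt.append(acc)
            return cdt

        def sample(cdt):
            """Draw one value: binary-search the table, then pick a sign."""
            u = random.random()
            k = bisect.bisect_right(cdt, u)       # smallest k with cdt[k] > u
            return 0 if k == 0 else (k if random.getrandbits(1) else -k)

        table = build_cdt(sigma=3.2)              # illustrative parameter
        print([sample(table) for _ in range(8)])

    Production samplers replace the floating-point probabilities and the binary search with fixed-precision table entries and constant-time scans, since a data-dependent search time is itself a side channel.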

    Signing Information in the Quantum Era

    Signatures are primarily used as a mark of authenticity, to demonstrate that the sender of a message is who they claim to be. In the current digital age, signatures underpin trust in the vast majority of information that we exchange, particularly on public networks such as the internet. However, schemes for signing digital information which are based on assumptions of computational complexity are facing challenges from advances in mathematics, the capability of computers, and the advent of the quantum era. Here we present a review of digital signature schemes, looking at their origins and where they are under threat. Next, we introduce post-quantum digital schemes, which are being developed with the specific intent of mitigating threats from quantum algorithms whilst still relying on digital processes and infrastructure. Finally, we review schemes for signing information carried on quantum channels, which promise provable security metrics. Signatures were invented as a practical means of authenticating communications, and the practicality of novel signature schemes deserves equally careful consideration; this is a common theme throughout this review.

    Sécurité étendue de la cryptographie fondée sur les réseaux euclidiens

    Lattice-based cryptography is considered a quantum-safe alternative for the replacement of currently deployed schemes based on RSA and the discrete logarithm over prime fields or elliptic curves. It offers strong theoretical security guarantees, a large array of achievable primitives, and a competitive level of efficiency. Nowadays, in the context of the NIST post-quantum standardization process, future standards may ultimately be chosen, and several new lattice-based schemes are high-profile candidates. The cryptographic research community has been encouraged to analyze lattice-based cryptosystems, with a particular focus on practical aspects. This thesis is rooted in this effort. In addition to black-box cryptanalysis with classical computing resources, we investigate the extended security of these new lattice-based cryptosystems under a broad spectrum of attack models, e.g. quantum, misuse, timing, or physical attacks. Since these models have already been applied to a large variety of pre-quantum asymmetric and symmetric schemes, we concentrate our efforts on leveraging and addressing the new features introduced by lattice structures. Our contribution is twofold: defensive, i.e. countermeasures for implementations of lattice-based schemes, and offensive, i.e. cryptanalysis. On the defensive side, in view of the numerous recent timing and physical attacks, we wear our designer's hat and investigate algorithmic protections. We introduce new algorithmic and mathematical tools to construct provable algorithmic countermeasures in order to systematically prevent all timing and physical attacks. We thus participate in the provable protection of the GLP, BLISS, qTesla and Falcon lattice-based signature schemes. On the offensive side, we estimate the applicability and complexity of novel attacks leveraging the lack of perfect correctness introduced in certain lattice-based encryption schemes to improve their performance. We show that such a compromise may enable decryption failure attacks in a misuse or quantum model. We finally introduce an algorithmic cryptanalysis tool that assesses the security of the mathematical problem underlying lattice-based schemes when partial knowledge of the secret is available. The usefulness of this new framework is demonstrated through the improvement and automation of several known classical, decryption-failure, and side-channel attacks.

    Lattice-based cryptography represents a promising alternative to the asymmetric cryptography used today, owing to its presumed resistance to a universal quantum computer. This new family of asymmetric schemes has several assets, among them strong theoretical security guarantees, a wide choice of primitives and, for some of its representatives, performance comparable to current standards. A post-quantum standardization campaign organized by NIST is under way, and several schemes based on Euclidean lattices are among the favorites. The scientific community has been encouraged to analyze them, since they could in the future be deployed in all our systems. The goal of this thesis is to contribute to this effort. We study the security of these new cryptosystems not only in the sense of their resistance to black-box cryptanalysis with classical computing means, but also under a broader spectrum of security models, such as quantum attacks, attacks assuming misuse, or side-channel attacks. These types of attacks have already been extensively formalized and studied in the past for pre-quantum asymmetric and symmetric schemes. In this thesis, we analyze their application to the new structures induced by Euclidean lattices. Our work is divided into two complementary parts: countermeasures and attacks. The first part gathers our contributions to the ongoing effort to design new algorithmic protections in response to the many recent publications on side-channel attacks. The team efforts in which we took part led to the introduction of new mathematical tools for building algorithmic countermeasures, supported by formal proofs, that systematically prevent physical and timing attacks. We thus participated in the protection of several lattice-based signature schemes such as GLP, BLISS, qTesla and Falcon. In a second part devoted to cryptanalysis, we first study new attacks that take advantage of the fact that certain public-key encryption or key-establishment schemes may fail with small probability. These failures are in fact weakly correlated with the secret. Our work exhibits so-called decryption failure attacks in misuse or quantum models. We then introduce an algorithmic cryptanalysis tool for estimating the security of the underlying mathematical problem when partial information about the secret is given. This tool has proved useful for automating and improving several known attacks, such as decryption failure attacks, classical attacks, and side-channel attacks.
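
    To illustrate the lack of perfect correctness exploited on the offensive side, the toy simulation below measures the decryption-failure rate of a one-bit LWE-style encryption: after decryption, the message bit sits under the noise term $e\cdot r - s\cdot e_1 + e_2$, and the bit flips whenever this noise reaches $q/4$ in absolute value. The dimension, modulus, and noise widths are illustrative, not the parameters of any submitted scheme.

        import numpy as np

        rng = np.random.default_rng(1)
        n, q, trials = 256, 3329, 20_000           # illustrative toy parameters

        def failure_rate(width):
            """Empirical failure rate with all noise uniform in [-width, width]."""
            failures = 0
            for _ in range(trials):
                s, e, r, e1 = (rng.integers(-width, width + 1, n) for _ in range(4))
                e2 = rng.integers(-width, width + 1)
                noise = e @ r - s @ e1 + e2        # residual term after decryption
                failures += abs(noise) >= q // 4   # rounding then flips the bit
            return failures / trials

        for width in (4, 6, 8, 10):
            print(width, failure_rate(width))      # rate climbs with the noise width

    Because the residual noise involves $s$ directly, failure events are weakly correlated with the secret, which is precisely what the decryption-failure attacks studied in this thesis exploit.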

    Post Quantum Cryptography

    Master's thesis summary: Post Quantum Cryptography. Candidate: VAIRA Antonio. In writing my thesis, on which I worked during this last year of my university career, I tried to answer the question: what is the state of the art of cryptography able to withstand an attack by a malicious user in possession of a "large" quantum computer? By "large" we mean a quantum computer with a register of several thousand qubits, and therefore able to run quantum algorithms of actual practical use. To date, only two quantum algorithms have been devised for cryptanalytic purposes, and one of them, Shor's algorithm, factors large numbers in reduced time. This "simple" mathematical (or rather algorithmic) problem underlies the security of the public-key cryptosystems used today. In a modern public-key system such as RSA, the difficulty of tampering is tied to the algorithmic difficulty of factoring large numbers, where difficulty means not impossibility but rather an expenditure of resources and time greater than the value of the information that the attack would yield. In my work, I first examined, in broad terms, both the cryptosystems used today and the quantum algorithms that could be used in a hypothetical future to compromise them. I then focused my attention on the "mainstream" alternatives to today's cryptosystems, that is, the post-quantum algorithms proposed by the community of cryptanalysts active in the field, and I tried to build a framework for evaluating their actual deployment. Within this framework, I chose to study in depth the lattice-based family (algorithms whose security is based on the difficulty of solving problems over multidimensional lattices). Within this family of cryptosystems I singled out one in particular, NTRU, as especially promising for deployment in the immediate future within a corporate PKI (an example close to me is the PKI developed inside Airbus, where I worked during this last year). I then deepened my study of another algorithm belonging to the same family of lattice-based cryptosystems, ring-LWE, which is much more interesting from an academic point of view but still relatively little known. I subsequently devised some modifications to this algorithm with the aim of making it faster and more reliable, thereby increasing the probability of recovering the correct message from the corresponding ciphertext. As a final step, I implemented the modified encryption algorithm in a high-level language (very close to Python) and compared both the execution times and the resources used against an unmodified version implemented in the same language, making the comparisons as consistent as possible. The results show that the modified version of ring-LWE is extremely promising but requires deeper cryptanalytic analysis: the algorithm must be stressed to understand whether or not it introduces further vulnerabilities with respect to the original version. In conclusion, the ring-LWE algorithm, which I studied most closely, offers several interesting points for reflection but is far from ready to be deployed in real infrastructure. Should an immediate replacement be needed, it is generally good practice in cryptography to fall back on well-trodden paths and older, trusted implementations; among post-quantum algorithms, one example is certainly NTRU (an IEEE standard, Std 1363.1, since 2008). As a final personal reflection, I would add that although the physics content of this thesis may appear marginal, it is only thanks to the preparation gained during my studies that I could carry it out without great difficulty, while having a truly constructive experience in a large company like Airbus. After all, a physics background makes us "problem-solvers" in the most disparate situations.
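
    A minimal sketch of the unmodified, textbook ring-LWE encryption flow discussed above (one message bit per coefficient); the parameters and helper names are illustrative, and this is not the candidate's modified variant.

        import numpy as np

        n, q = 256, 3329                     # toy ring R_q = Z_q[x] / (x^n + 1)
        rng = np.random.default_rng(2)

        def ring_mul(a, b):
            """Multiply polynomials modulo x^n + 1 and q (negacyclic wrap-around)."""
            c = np.convolve(a, b)
            return (c[:n] - np.concatenate([c[n:], [0]])) % q

        def small(width=2):
            """Small noise/secret polynomial with coefficients in [-width, width]."""
            return rng.integers(-width, width + 1, n)

        # Key generation: (a, b = a*s + e) is public, s is secret.
        a = rng.integers(0, q, n)
        s, e = small(), small()
        b = (ring_mul(a, s) + e) % q

        # Encryption of a bit vector m: u = a*r + e1, v = b*r + e2 + (q//2)*m.
        m = rng.integers(0, 2, n)
        r, e1, e2 = small(), small(), small()
        u = (ring_mul(a, r) + e1) % q
        v = (ring_mul(b, r) + e2 + (q // 2) * m) % q

        # Decryption: v - u*s = (q//2)*m + (e*r - e1*s + e2); round per coefficient.
        noisy = (v - ring_mul(u, s)) % q
        m_dec = ((noisy > q // 4) & (noisy < 3 * q // 4)).astype(int)
        assert (m_dec == m).all()            # holds w.h.p. at these toy settings

    The modifications evaluated in the thesis target exactly the failure probability guarded by the final assertion: shrinking the residual noise term $e\cdot r - e_1\cdot s + e_2$ raises the probability that rounding recovers $m$.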