24 research outputs found
A Generic Attack on Lattice-based Schemes using Decryption Errors with Application to ss-ntru-pke
Hard learning problems are central topics in recent cryptographic research. Many cryptographic primitives relate their security to difficult problems in lattices, such as the shortest vector problem, and such schemes admit decryption errors with some very small probability. In this paper we propose and discuss a generic attack for secret-key recovery based on generating decryption errors. In a standard PKC setting, the attack model consists of three phases: a precomputation phase in which special messages and their corresponding error vectors are generated; a query phase in which the messages are submitted for decryption and some decryption errors are observed; and a final phase in which a statistical analysis of the messages and error vectors that caused the decryption errors reveals the secret key. The idea is that, conditioned on certain secret keys, the decryption error probability is significantly higher than the average case used in the error-probability estimation. The attack is demonstrated in detail on one NIST post-quantum proposal, ss-ntru-pke, which is attacked with complexity below the claimed security level
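The key-dependence of the failure rate that this attack exploits can be illustrated with a toy simulation. This is a minimal sketch, not ss-ntru-pke itself: decryption of a ciphertext with noise vector e under secret key s is modeled as failing when the inner product of e and s exceeds a fixed threshold, so keys with larger norm fail noticeably more often. All parameters below are illustrative.

```python
import random

# Toy model: decryption fails when |<e, s>| > q/4. The statistical
# phase of the attack exploits exactly this correlation between the
# secret key's norm and the observed failure rate.
N, Q, TRIALS = 256, 3329, 5000
THRESH = Q // 4
NOISE_STD = 60.0  # illustrative noise width, chosen so failures occur

def sample_ternary(n, rng):
    """Sample a ternary secret key with coefficients in {-1, 0, 1}."""
    return [rng.choice((-1, 0, 1)) for _ in range(n)]

def failure_rate(s, rng, trials=TRIALS):
    """Estimate the decryption failure probability for secret key s."""
    fails = 0
    for _ in range(trials):
        e = [rng.gauss(0.0, NOISE_STD) for _ in range(len(s))]
        if abs(sum(ei * si for ei, si in zip(e, s))) > THRESH:
            fails += 1
    return fails / trials

rng = random.Random(1)
small_key = sample_ternary(N, rng)
big_key = [2 * c for c in small_key]  # same support, double the norm
print(failure_rate(small_key, rng), failure_rate(big_key, rng))
```

Running this shows the larger-norm key failing markedly more often than the average key, which is the signal the statistical-analysis phase extracts.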
Decryption Failure Attacks on Post-Quantum Cryptography
This dissertation discusses mainly new cryptanalytical results related to issues of securely implementing the next generation of asymmetric cryptography, or Public-Key Cryptography (PKC).PKC, as it has been deployed until today, depends heavily on the integer factorization and the discrete logarithm problems.Unfortunately, it has been well-known since the mid-90s, that these mathematical problems can be solved due to Peter Shor's algorithm for quantum computers, which achieves the answers in polynomial time.The recently accelerated pace of R&D towards quantum computers, eventually of sufficient size and power to threaten cryptography, has led the crypto research community towards a major shift of focus.A project towards standardization of Post-quantum Cryptography (PQC) was launched by the US-based standardization organization, NIST. PQC is the name given to algorithms designed for running on classical hardware/software whilst being resistant to attacks from quantum computers.PQC is well suited for replacing the current asymmetric schemes.A primary motivation for the project is to guide publicly available research toward the singular goal of finding weaknesses in the proposed next generation of PKC.For public key encryption (PKE) or digital signature (DS) schemes to be considered secure they must be shown to rely heavily on well-known mathematical problems with theoretical proofs of security under established models, such as indistinguishability under chosen ciphertext attack (IND-CCA).Also, they must withstand serious attack attempts by well-renowned cryptographers both concerning theoretical security and the actual software/hardware instantiations.It is well-known that security models, such as IND-CCA, are not designed to capture the intricacies of inner-state leakages.Such leakages are named side-channels, which is currently a major topic of interest in the NIST PQC project.This dissertation focuses on two things, in general:1) how does the low but non-zero 
probability of decryption failures affect the cryptanalysis of these new PQC candidates?And 2) how might side-channel vulnerabilities inadvertently be introduced when going from theory to the practice of software/hardware implementations?Of main concern are PQC algorithms based on lattice theory and coding theory.The primary contributions are the discovery of novel decryption failure side-channel attacks, improvements on existing attacks, an alternative implementation to a part of a PQC scheme, and some more theoretical cryptanalytical results
(One) Failure Is Not an Option: Bootstrapping the Search for Failures in Lattice-Based Encryption Schemes
Lattice-based encryption schemes are often subject to the possibility of decryption failures, in which valid encryptions are decrypted incorrectly. Such failures, in large number, leak information about the secret key, enabling an attack strategy alternative to pure lattice reduction. Extending the "failure boosting" technique of D'Anvers et al. in PKC 2019, we propose an approach that we call "directional failure boosting" that uses previously found failing ciphertexts to accelerate the search for new ones. We analyse in detail the case where the lattice is defined over a quotient polynomial ring module and demonstrate it on a simple Mod-LWE-based scheme parametrized à la Kyber768/Saber. We show that, using our technique, for a given secret key (single-target setting), the cost of searching for additional failing ciphertexts after one or more have already been found can be sped up dramatically. We thus demonstrate that, in this single-target model, these schemes should be designed so that it is hard to even obtain one decryption failure. Moreover, in a wider security model with many target secret keys (multi-target setting), our attack greatly improves over the state of the art
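The basic (non-directional) failure-boosting idea can be sketched under a simplified model: approximate a ciphertext's failure probability as an increasing function of its noise norm, precompute many candidate noise vectors offline, and submit only those with the largest norms. The Gaussian failure model and all parameters below are illustrative assumptions, not the paper's actual cost model.

```python
import math
import random

def boost(candidates, keep_fraction, threshold):
    """Compare the average failure probability over all candidate
    noise vectors against the average over only the largest-norm
    fraction that a failure-boosting attacker would submit."""
    def p_fail(e):
        # Toy model: failure probability grows with the noise norm r,
        # approximated here as exp(-(T/r)^2 / 2) for threshold T.
        r = math.hypot(*e)
        return math.exp(-0.5 * (threshold / r) ** 2)

    ranked = sorted(candidates, key=lambda e: -math.hypot(*e))
    kept = ranked[: max(1, int(len(ranked) * keep_fraction))]
    avg_all = sum(map(p_fail, candidates)) / len(candidates)
    avg_kept = sum(map(p_fail, kept)) / len(kept)
    return avg_all, avg_kept

rng = random.Random(7)
cands = [tuple(rng.gauss(0.0, 1.0) for _ in range(8)) for _ in range(5000)]
base, boosted = boost(cands, keep_fraction=0.01, threshold=10.0)
print(base, boosted)
```

The kept ciphertexts fail far more often per query; the attacker pays for this with the offline work of generating and ranking candidates, which is exactly the tradeoff that failure boosting (and, more aggressively, directional failure boosting) optimizes.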
Studies on the Security of Selected Advanced Asymmetric Cryptographic Primitives
The main goal of asymmetric cryptography is to provide confidential communication, which allows two parties to communicate securely even in the presence of adversaries. Ever since its invention in the seventies, asymmetric cryptography has been improved and developed further, and a formal security framework has been established around it. This framework includes different security goals, attack models, and security notions. As progress was made in the field, more advanced asymmetric cryptographic primitives were proposed, with other properties in addition to confidentiality. These new primitives also have their own definitions and notions of security.
This thesis consists of two parts, where the first relates to the security of fully homomorphic encryption and related primitives. The second part presents a novel cryptographic primitive, and defines what security goals the primitive should achieve.
The first part of the thesis consists of Article I, II, and III, which all pertain to the security of homomorphic encryption schemes in one respect or another. Article I demonstrates that a particular fully homomorphic encryption scheme is insecure in the sense that an adversary with access only to the public material can recover the secret key. It is also shown that this insecurity mainly stems from the operations necessary to make the scheme fully homomorphic. Article II presents an adaptive key recovery attack on a leveled homomorphic encryption scheme. The scheme in question claimed to withstand precisely such attacks, and was the only scheme of its kind to do so at the time. This part of the thesis culminates with Article III, which is an overview article on the IND-CCA1 security of all acknowledged homomorphic encryption schemes.
The second part of the thesis consists of Article IV, which presents Vetted Encryption (VE), a novel asymmetric cryptographic primitive. The primitive is designed to allow a recipient to vet who may send them messages, by setting up a public filter with a public verification key and providing each vetted sender with their own encryption key. There are three different variants of VE, based on whether the sender is identifiable to the filter and/or the recipient. Security definitions, general constructions, and comparisons to already existing cryptographic primitives are provided for all three variants
Envisioning the Future of Cyber Security in Post-Quantum Era: A Survey on PQ Standardization, Applications, Challenges and Opportunities
The rise of quantum computers exposes vulnerabilities in current public-key cryptographic protocols, necessitating the development of secure post-quantum (PQ) schemes. Hence, we conduct a comprehensive study of various PQ approaches, covering their constructional design and structural vulnerabilities, and offering security assessments and implementation evaluations, with a particular focus on side-channel attacks. We analyze global standardization processes, evaluate their metrics in relation to real-world applications, and primarily focus on standardized PQ schemes, selected additional signature-competition candidates, and PQ-secure cutting-edge schemes beyond standardization. Finally, we present visions and potential future directions for a seamless transition to the PQ era
Achieving secure and efficient lattice-based public-key encryption: the impact of the secret-key distribution
Lattice-based public-key encryption has a large number of design choices that can be combined in diverse ways to obtain different tradeoffs. One of these choices is the distribution from which secret keys are sampled. Numerous secret-key distributions exist in the state of the art, including (discrete) Gaussian, binomial, ternary, and fixed-weight ternary. Although the secret-key distribution impacts both the concrete security and the performance of a scheme, how this choice affects the resulting tradeoff has not been compared in detail. In this paper, we compare different aspects of the secret-key distributions used in submissions to the NIST post-quantum standardization effort. We consider their impact on concrete security (influenced by the entropy and variance of the distribution) and on decryption failures and IND-CCA2 security (influenced by the probability of sampling keys with "non-average, large" norm). Next, we select concrete parameters of an encryption scheme instantiated with the above distributions to identify which distribution(s) offer the best tradeoffs between security and key sizes. The conclusions of the paper are twofold. First, the above optimization shows that fixed-weight ternary secret keys result in the smallest key sizes in the analyzed scheme. The reason is that such secret keys reduce the decryption failure rate and hence allow for a higher noise-to-modulus ratio, alleviating the slight increase in lattice dimension required to counter the specialized attacks that apply in this case. Second, compared to secret keys with independently sampled components, secret keys with a fixed composition (i.e., the number of secret-key components equal to any possible value is fixed) result in the scheme becoming more secure against active attacks based on decryption failures
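The entropy and variance figures that drive this comparison are easy to compute per coefficient. The following sketch uses standard textbook formulas; the concrete parameters (eta = 2, weight h = 128 out of n = 256) are illustrative and not tied to any particular NIST submission.

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def variance(support, probs):
    """Variance of a discrete distribution over the given support."""
    mean = sum(x * p for x, p in zip(support, probs))
    return sum(p * (x - mean) ** 2 for x, p in zip(support, probs))

# Centered binomial with parameter eta: difference of 2*eta coin flips.
eta = 2
binom = {k: math.comb(2 * eta, k + eta) / 4 ** eta
         for k in range(-eta, eta + 1)}

# Uniform ternary on {-1, 0, 1}.
ternary = {-1: 1 / 3, 0: 1 / 3, 1: 1 / 3}

# Fixed-weight ternary: exactly h of n coefficients are nonzero, so
# each coefficient is nonzero with marginal probability h/n. (The
# per-coefficient entropy below is the marginal one; the coefficients
# of a fixed-weight key are not independent.)
h, n = 128, 256
fixed_weight = {-1: h / (2 * n), 0: 1 - h / n, 1: h / (2 * n)}

for name, dist in (("binomial", binom), ("ternary", ternary),
                   ("fixed-weight", fixed_weight)):
    sup, pr = list(dist), list(dist.values())
    print(f"{name:>12}: H = {entropy(pr):.3f} bits, Var = {variance(sup, pr):.3f}")
```

For these parameters, the binomial distribution has the highest variance and entropy, ternary sits in between, and fixed-weight ternary has the lowest variance, which matches the abstract's observation that fixed-weight keys reduce the failure rate.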
Multitarget decryption failure attacks and their application to Saber and Kyber
Many lattice-based encryption schemes are subject to a very small probability of decryption failures. It has been shown that an adversary can efficiently recover the secret key using a number of ciphertexts that cause such decryption failures. In PKC 2019, D'Anvers et al. introduced "failure boosting", a technique to speed up the search for decryption failures. In this work we first improve the state-of-the-art multitarget failure boosting attacks. We then improve the cost calculation of failure boosting and extend its applicability to permit cost calculations for real-world schemes. Using our newly developed methodologies, we determine the multitarget decryption failure attack cost for all parameter sets of Saber and Kyber, showing among other things that the quantum security of Saber can theoretically be reduced from 172 bits to 145 bits in specific circumstances. We then discuss the applicability of decryption failure attacks in real-world scenarios, showing that such an attack might not be practical to execute
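The advantage of the multitarget setting can be conveyed with a back-of-the-envelope query-count model. This is an illustrative simplification, not the paper's actual cost calculation: with per-query failure probability p against any single key, querying T independent target keys in parallel reduces the expected number of queries until the first failure from 1/p to roughly 1/(T·p).

```python
def expected_queries_single(p):
    """Expected queries until the first failure against one key
    (geometric distribution with success probability p)."""
    return 1.0 / p

def expected_queries_multitarget(p, targets):
    """Expected rounds until some key fails, where one query is sent
    to each of `targets` keys per round."""
    # Probability that at least one of the round's queries fails.
    p_round = 1.0 - (1.0 - p) ** targets
    return 1.0 / p_round

print(expected_queries_single(1e-4))
print(expected_queries_multitarget(1e-4, 1000))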
SƩcuritƩ Ʃtendue de la cryptographie fondƩe sur les rƩseaux euclidiens
Lattice-based cryptography is considered as a quantum-safe alternative for the replacement of currently deployed schemes based on RSA and discrete logarithm on prime fields or elliptic curves. It offers strong theoretical security guarantees, a large array of achievable primitives, and a competitive level of efficiency. Nowadays, in the context of the NIST post-quantum standardization process, future standards may ultimately be chosen and several new lattice-based schemes are high-profile candidates. The cryptographic research has been encouraged to analyze lattice-based cryptosystems, with a particular focus on practical aspects. This thesis is rooted in this effort.In addition to black-box cryptanalysis with classical computing resources, we investigate the extended security of these new lattice-based cryptosystems, employing a broad spectrum of attack models, e.g. quantum, misuse, timing or physical attacks. Accounting that these models have already been applied to a large variety of pre-quantum asymmetric and symmetric schemes before, we concentrate our efforts on leveraging and addressing the new features introduced by lattice structures. Our contribution is twofold: defensive, i.e. countermeasures for implementations of lattice-based schemes and offensive, i.e. cryptanalysis.On the defensive side, in view of the numerous recent timing and physical attacks, we wear our designerās hat and investigate algorithmic protections. We introduce some new algorithmic and mathematical tools to construct provable algorithmic countermeasures in order to systematically prevent all timing and physical attacks. We thus participate in the actual provable protection of the GLP, BLISS, qTesla and Falcon lattice-based signatures schemes.On the offensive side, we estimate the applicability and complexity of novel attacks leveraging the lack of perfect correctness introduced in certain lattice-based encryption schemes to improve their performance. 
We show that such a compromise may enable decryption failures attacks in a misuse or quantum model. We finally introduce an algorithmic cryptanalysis tool that assesses the security of the mathematical problem underlying lattice-based schemes when partial knowledge of the secret is available. The usefulness of this new framework is demonstrated with the improvement and automation of several known classical, decryption-failure, and side-channel attacks.La cryptographie fondeĢe sur les reĢseaux euclidiens repreĢsente une alternative prometteuse aĢ la cryptographie asymeĢtrique utiliseĢe actuellement, en raison de sa reĢsistance preĢsumeĢe aĢ un ordinateur quantique universel. Cette nouvelle famille de scheĢmas asymeĢtriques dispose de plusieurs atouts parmi lesquels de fortes garanties theĢoriques de seĢcuriteĢ, un large choix de primitives et, pour certains de ses repreĢsentants, des performances comparables aux standards actuels. Une campagne de standardisation post-quantique organiseĢe par le NIST est en cours et plusieurs scheĢmas utilisant des reĢseaux euclidiens font partie des favoris. La communauteĢ scientifique a eĢteĢ encourageĢe aĢ les analyser car ils pourraient aĢ lāavenir eĢtre implanteĢs dans tous nos systeĢmes. Lāobjectif de cette theĢse est de contribuer aĢ cet effort.Nous eĢtudions la seĢcuriteĢ de ces nouveaux cryptosysteĢmes non seulement au sens de leur reĢsistance aĢ la cryptanalyse en āboiĢte noireā aĢ lāaide de moyens de calcul classiques, mais aussi selon un spectre plus large de modeĢles de seĢcuriteĢ, comme les attaques quantiques, les attaques supposant des failles dāutilisation, ou encore les attaques par canaux auxiliaires. Ces diffeĢrents types dāattaques ont deĢjaĢ eĢteĢ largement formaliseĢs et eĢtudieĢs par le passeĢ pour des scheĢmas asymeĢtriques et symeĢtriques preĢ-quantiques. Dans ce meĢmoire, nous analysons leur application aux nouvelles structures induites par les reĢseaux euclidiens. 
Notre travail est diviseĢ en deux parties compleĢmentaires : les contremesures et les attaques.La premieĢre partie regroupe nos contributions aĢ lāeffort actuel de conception de nouvelles protections algorithmiques afin de reĢpondre aux nombreuses publications reĢcentes dāattaques par canaux auxiliaires. Les travaux reĢaliseĢs en eĢquipe auxquels nous avons pris part on abouti aĢ lāintroduction de nouveaux outils matheĢmatiques pour construire des contre-mesures algorithmiques, appuyeĢes sur des preuves formelles, qui permettent de preĢvenir systeĢmatiquement les attaques physiques et par analyse de temps dāexeĢcution. Nous avons ainsi participeĢ aĢ la protection de plusieurs scheĢmas de signature fondeĢs sur les reĢseaux euclidiens comme GLP, BLISS, qTesla ou encore Falcon.Dans une seconde partie consacreĢe aĢ la cryptanalyse, nous eĢtudions dans un premier temps de nouvelles attaques qui tirent parti du fait que certains scheĢmas de chiffrement aĢ cleĢ publique ou dāeĢtablissement de cleĢ peuvent eĢchouer avec une faible probabiliteĢ. Ces eĢchecs sont effectivement faiblement correĢleĢs au secret. Notre travail a permis dāexhiber des attaques dites Ā« par eĢchec de deĢchiffrement Ā» dans des modeĢles de failles dāutilisation ou des modeĢles quantiques. Nous avons dāautre part introduit un outil algorithmique de cryptanalyse permettant dāestimer la seĢcuriteĢ du probleĢme matheĢmatique sous-jacent lorsquāune information partielle sur le secret est donneĢe. Cet outil sāest aveĢreĢ utile pour automatiser et ameĢliorer plusieurs attaques connues comme des attaques par eĢchec de deĢchiffrement, des attaques classiques ou encore des attaques par canaux auxiliaires
On the IND-CCA1 Security of FHE Schemes
Fully homomorphic encryption (FHE) is a powerful tool in cryptography that allows one to perform arbitrary computations on encrypted material without having to decrypt it first. There are numerous FHE schemes, all of which are expanded from somewhat homomorphic encryption (SHE) schemes, and some of which are considered viable in practice. However, while these FHE schemes are semantically (IND-CPA) secure, the question of their IND-CCA1 security is much less studied, and we therefore provide an overview of the IND-CCA1 security of all acknowledged FHE schemes in this paper. To give this overview, we grouped the SHE schemes into broad categories based on their similarities and underlying hardness problems. For each category, we show that the SHE schemes are susceptible to either known adaptive key recovery attacks, a natural extension of known attacks, or our proposed attacks. Finally, we discuss the known techniques to achieve IND-CCA1-secure FHE and SHE schemes. We concluded that none of the proposed schemes were IND-CCA1-secure and that the known general constructions all had their shortcomings.publishedVersio
Implementation and Benchmarking of Round 2 Candidates in the NIST Post-Quantum Cryptography Standardization Process Using Hardware and Software/Hardware Co-design Approaches
Performance in hardware has typically played a major role in differentiating among leading candidates in cryptographic standardization efforts. Winners of two past NIST cryptographic contests (Rijndael in case of AES and Keccak in case of SHA-3) were ranked consistently among the two fastest candidates when implemented using FPGAs and ASICs. Hardware implementations of cryptographic operations may quite easily outperform software implementations for at least a subset of major performance metrics, such as speed, power consumption, and energy usage, as well as in terms of security against physical attacks, including side-channel analysis. Using hardware also permits much higher flexibility in trading one subset of these properties for another. A large number of candidates at the early stages of the standardization process makes the accurate and fair comparison very challenging. Nevertheless, in all major past cryptographic standardization efforts, future winners were identified quite early in the evaluation process and held their lead until the standard was selected. Additionally, identifying some candidates as either inherently slow or costly in hardware helped to eliminate a subset of candidates, saving countless hours of cryptanalysis. Finally, early implementations provided a baseline for future design space explorations, paving a way to more comprehensive and fairer benchmarking at the later stages of a given cryptographic competition. In this paper, we first summarize, compare, and analyze results reported by other groups until mid-May 2020, i.e., until the end of Round 2 of the NIST PQC process. We then outline our own methodology for implementing and benchmarking PQC candidates using both hardware and software/hardware co-design approaches. We apply our hardware approach to 6 lattice-based CCA-secure Key Encapsulation Mechanisms (KEMs), representing 4 NIST PQC submissions. 
We then apply a software-hardware co-design approach to 12 lattice-based CCA-secure KEMs, representing 8 Round 2 submissions. We hope that, combined with results reported by other groups, our study will provide NIST with helpful information regarding the relative performance of a significant subset of Round 2 PQC candidates, assuming that at least their major operations, and possibly the entire algorithms, are off-loaded to hardware