Accelerating Bliss: the geometry of ternary polynomials
The signature scheme Bliss, proposed by Ducas, Durmus, Lepoint and Lyubashevsky at Crypto'13, is currently the most compact and efficient lattice-based signature scheme that is provably secure under lattice assumptions. It compares favourably with the standardized schemes RSA and ECDSA in both software and hardware.
In this work, we introduce a new technique that improves the above scheme, offering an acceleration factor of up to 2.8, depending on the parameter set.
Namely, we improve the unnatural geometric bound used in Bliss to a tighter and much more natural bound by exploiting an extra degree of freedom: the ternary representations of binary challenges. Precisely, we efficiently choose a ternary representation that makes the result deterministically shorter than the expected length for a random challenge.
Our modified scheme Bliss-b is rather close to the original scheme, and both versions are compatible. The patch has been implemented on the open-source software implementation of Bliss, and will be released under a similar license.
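To illustrate the idea (a minimal sketch, not the authors' exact algorithm): since +1 ≡ -1 (mod 2), each nonzero coefficient of a binary challenge can be assigned a sign greedily so that the running sum of the corresponding secret-dependent columns stays short. The matrix S below is an illustrative stand-in for the columns that the challenge coefficients multiply.

```python
import numpy as np

def greedy_ternary(S, c):
    """Greedily choose signs for the nonzero coefficients of the binary
    challenge c so that v = S @ c_ter is deterministically short.
    S: (m x n) integer matrix (illustrative stand-in); c: 0/1 vector."""
    v = np.zeros(S.shape[0])
    c_ter = np.zeros(len(c), dtype=int)
    for i in np.flatnonzero(c):
        col = S[:, i]
        # the sign making <v, col> non-positive shortens the running sum
        if v @ col <= 0:
            v, c_ter[i] = v + col, 1
        else:
            v, c_ter[i] = v - col, -1
    return c_ter, v

# toy usage: c_ter is congruent to c mod 2, yet S @ c_ter is shorter
rng = np.random.default_rng(0)
S = rng.integers(-2, 3, size=(32, 32))
c = rng.integers(0, 2, size=32)
c_ter, v = greedy_ternary(S, c)
assert np.all((c_ter - c) % 2 == 0)
print(np.linalg.norm(S @ c), "->", np.linalg.norm(v))
```

Because each step picks the sign with non-positive cross term, the final squared norm is at most the sum of the squared column norms, whereas random signs only achieve that bound in expectation.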
Second order statistical behavior of LLL and BKZ
The LLL algorithm (from Lenstra, Lenstra and Lovász) and its generalization BKZ (from Schnorr and Euchner) are widely used in cryptanalysis, especially for lattice-based cryptography. Precisely understanding their behavior is crucial for deriving appropriate key sizes for cryptographic schemes subject to lattice-reduction attacks. Current models, e.g. the Geometric Series Assumption and Chen-Nguyen's BKZ simulator, have provided a decent first-order analysis of the behavior of LLL and BKZ. However, they only focused on the average behavior and were not perfectly accurate. In this work, we initiate a second-order analysis of this behavior. We confirm and quantify discrepancies between models and experiments, in particular in the head and tail regions, and study their consequences. We also provide statistics on variations around the mean and on correlations, and study their impact. While mostly based on experiments, by pointing at and quantifying unaccounted-for phenomena, our study sets the ground for a theoretical and predictive understanding of LLL and BKZ performance at the second order.
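For reference, a small NumPy sketch of the first-order model in question: under the Geometric Series Assumption, the log Gram-Schmidt norms of a reduced basis lie on a straight line. The root-Hermite factor delta is an assumed input; the head and tail deviations studied in the paper are exactly what this straight-line prediction misses.

```python
import numpy as np

def gsa_log_profile(n, delta, log_vol=0.0):
    """Geometric Series Assumption: log ||b_i*|| decreases linearly in i.
    n: lattice rank; delta: root-Hermite factor (e.g. ~1.0219 for LLL);
    log_vol: log-volume of the lattice. Normalized so the values sum
    to log_vol."""
    log_b1 = n * np.log(delta) + log_vol / n      # predicted first vector
    slope = -2 * n * np.log(delta) / (n - 1)      # constant decay rate
    return log_b1 + slope * np.arange(n)

profile = gsa_log_profile(100, 1.0219)  # LLL-like prediction, rank 100
assert abs(profile.sum()) < 1e-9         # consistent with unit volume
```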
Provable lattice reduction of Z^n with blocksize n/2
The Lattice Isomorphism Problem (LIP) is the computational task of recovering, assuming it exists, an orthogonal linear transformation sending one lattice to another. For cryptographic purposes, the case of the trivial lattice Z^n is of particular interest (ZLIP). Heuristic analysis suggests that the BKZ algorithm with blocksize β = n/2 + o(n) solves such instances (Ducas, Postlethwaite, Pulles, van Woerden, ASIACRYPT 2022). In this work, I propose a provable version of this statement, namely, that ZLIP can indeed be solved by making polynomially many calls to a Shortest Vector Problem oracle in dimension at most n/2 + 1.
Sanitization of FHE ciphertexts
By definition, fully homomorphic encryption (FHE) schemes support homomorphic decryption, and all known FHE constructions are bootstrapped from a Somewhat Homomorphic Encryption (SHE) scheme via this technique. Additionally, when a public key is provided, ciphertexts are also re-randomizable, e.g., by adding to them fresh encryptions of 0. From those two operations we devise an algorithm to sanitize a ciphertext by making its distribution canonical. In particular, the distribution of the ciphertext does not depend on the circuit that led to it via homomorphic evaluation, thus providing circuit privacy in the honest-but-curious model. Unlike the previous approach based on noise flooding, our approach does not significantly degrade the security/efficiency trade-off of the underlying FHE scheme. The technique can be applied to all lattice-based FHE schemes proposed so far, without substantially affecting their concrete parameters.
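A minimal sketch of the resulting sanitization loop, with the two operations named in the abstract passed in as callables (rerandomize and bootstrap stand for assumed scheme primitives, not a real library API):

```python
def sanitize(ct, pk, rerandomize, bootstrap, rounds=2):
    """Alternate ciphertext re-randomization (e.g. adding a fresh
    encryption of 0) with bootstrapping (homomorphic decryption).
    After enough rounds the output distribution becomes canonical,
    i.e. independent of the circuit that produced ct. The number of
    rounds acts as a statistical-security parameter."""
    for _ in range(rounds):
        ct = rerandomize(ct, pk)  # inject fresh randomness
        ct = bootstrap(ct, pk)    # reset via homomorphic decryption
    return ct
```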
Lattice Attacks on NTRU and LWE: A History of Refinements
Since its invention in 1982, the LLL lattice reduction algorithm (Lenstra, Lenstra, Lovász 1982) has found countless applications. In cryptanalysis, the two most prominent applications of LLL and its generalisations (e.g. Slide, BKZ and SD-BKZ) are factoring RSA keys with extra information on the secret key via Coppersmith's method, and the cryptanalysis of lattice-based schemes.
After almost 40 years of cryptanalytic applications, predicting and optimising lattice reduction algorithms remains an active area of research. While we do have theorems bounding the worst-case performance of these algorithms, those bounds are asymptotic and not necessarily tight when applied to practical or even cryptographic instances. Reasoning about the behaviour of those algorithms relies on heuristics and approximations, some of which are known to fail for relevant corner cases.
Decades after Lenstra, Lenstra and Lovász gave birth to this fascinating and lively research area, this state of affairs has recently become a more pressing issue. Motivated by post-quantum security, standardisation bodies, governments and industry have started to move towards deploying lattice-based cryptographic algorithms. This has spurred the refinement of those heuristics and approximations, leading to a better understanding of the behaviour of these algorithms over the last few years.
Lattice reduction algorithms, such as LLL and BKZ, proceed by repeated local improvements to the lattice basis, and each such local improvement means solving the short(est) vector problem in a lattice of smaller dimension. Two questions therefore arise: how costly is it to find those local improvements, and what is the global behaviour as those improvements are applied?
While those two questions may not be perfectly independent, we will, in this survey, focus on the second one, namely the global behaviour of such algorithms, given oracle access for finding local improvements. Our focus on the global behaviour is motivated by our intent to draw more of the community's attention to this aspect. We will take a particular interest in the behaviour of such algorithms on a specific class of lattices underlying the most popular lattice problems used to build cryptographic primitives, namely the LWE problem and the NTRU problem. We will emphasise the approximations that have been made, their progressive refinements, and highlight open problems to be addressed.
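The survey's object of study can be summarized by the following schematic (a sketch of the common structure, not any particular implementation): the global loop sweeps a size-β window along the basis, and every local detail is delegated to an abstract improvement step, here an assumed callable built on an SVP oracle.

```python
def bkz_like(B, beta, local_improve, tours=8):
    """Global loop common to BKZ-style reduction: repeatedly sweep a
    window [k, k+beta) along the basis and let the local step (an
    SVP-oracle-based improvement on the projected block) update B.
    B: list of basis vectors; local_improve: assumed callable."""
    n = len(B)
    for _ in range(tours):      # in practice: until no more progress
        for k in range(n - 1):
            B = local_improve(B, k, min(k + beta, n))
    return B
```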
Does the Dual-Sieve Attack on Learning with Errors even Work?
Guo and Johansson (ASIACRYPT 2021), and MATZOV (technical report, 2022) have independently claimed improved attacks against various NIST lattice candidates by adding a Fast Fourier Transform (FFT) trick on top of the so-called Dual-Sieve attack. Recently, there has been more follow-up work in this line, adding new practical improvements.
However, from a theoretical perspective, all of these works are painfully specific to Learning with Errors, while the principle of the Dual-Sieve attack is more general (Laarhoven & Walter, CT-RSA 2021). More critically, all of these works are based on heuristics that have received very little theoretical and experimental attention.
This work attempts to rectify the above deficiencies of the literature.
We first propose a generalization of the FFT trick by Guo and Johansson to arbitrary Bounded Distance Decoding instances. This generalization offers a new improvement to the attack.
We then theoretically explore the underlying heuristics and show that these are in contradiction with formal, unconditional theorems in some regimes, and with well-tested heuristics in other regimes. The specific instantiations of the recent literature fall into this second regime.
We confirm these contradictions with experiments, documenting several phenomena that are not predicted by the analysis, including a “waterfall-floor” phenomenon, reminiscent of Low-Density Parity-Check decoding failures.
We conclude that the success probability of the recent Dual-Sieve-FFT attacks is presumably significantly overestimated. We further discuss the adequate way forward towards fixing the attack and its analysis.
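For context, the statistic at the heart of the Dual-Sieve attack is easy to state. A minimal NumPy sketch, where the list W of short dual vectors (as output by a sieve) is assumed given:

```python
import numpy as np

def dual_score(W, t):
    """Dual distinguisher: average of cos(2*pi*<w, t>) over short dual
    vectors w (rows of W). On a BDD sample t (close to the lattice) the
    score concentrates away from 0; on a uniform t it is near 0. The
    contested heuristics concern the precise distribution of this score."""
    return float(np.mean(np.cos(2 * np.pi * (W @ t))))
```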
Polynomial Time Bounded Distance Decoding near Minkowski's Bound in Discrete Logarithm Lattices
We propose a concrete family of dense lattices of arbitrary dimension n in which the lattice Bounded Distance Decoding (BDD) problem can be solved in deterministic polynomial time. This construction is directly adapted from the Chor-Rivest cryptosystem (IEEE-TIT 1988). The lattice construction needs discrete logarithm computations that can be made in deterministic polynomial time for well-chosen parameters. Each lattice comes with a deterministic polynomial time decoding algorithm able to decode up to a large radius. Namely, we reach a decoding radius within a factor O(log n) of Minkowski's bound, for both the ℓ1 and ℓ2 norms.
Hull Attacks on the Lattice Isomorphism Problem
The lattice isomorphism problem (LIP) asks one to find an isometry between two lattices. It has recently been proposed as a foundation for cryptography in two independent works [Ducas & van Woerden, EUROCRYPT 2022; Bennett et al., preprint 2021]. This problem is the lattice variant of the code equivalence problem, on which the notion of the hull of a code can lead to devastating attacks.
In this work we study the cryptanalytic role of an adaptation of the hull to the lattice setting, namely, the s-hull. We first show that the s-hull is not helpful for creating an arithmetic distinguisher. More specifically, the genus of the s-hull can be efficiently predicted from s and the original genus, and therefore carries no extra information.
However, we also show that the hull can be helpful for geometric attacks: for certain lattices the minimal distance of the hull is relatively smaller than that of the original lattice, and this can be exploited. The attack cost remains exponential, but the constant in the exponent is halved. This second result gives a counterexample to the general hardness conjecture of LIP proposed by Ducas & van Woerden.
Our results suggest that one should be very considerate about the geometry of hulls when instantiating LIP for cryptography. They also point to unimodular lattices as attractive options, as they are equal to their duals and their hulls, leaving only the original lattice to an attacker. Remarkably, this is already the case in proposed instantiations, namely the trivial lattice and the Barnes-Wall lattices.
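For reference, the adaptation in question can be sketched as follows (assuming the normalization H_s(L) = L ∩ sL^∨; the paper's exact conventions may differ):

```latex
% s-hull of a lattice L, by analogy with the hull C \cap C^\perp of a code
H_s(L) \;=\; L \,\cap\, s L^{\vee},
\qquad
L^{\vee} \;=\; \{\, x \in \operatorname{span}(L) : \langle x, y\rangle \in \mathbb{Z} \ \text{for all } y \in L \,\}.
% A unimodular lattice satisfies L = L^{\vee}, hence H_1(L) = L:
% the hull then reveals nothing beyond the lattice itself.
```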
On the Shortness of Vectors to be found by the Ideal-SVP Quantum Algorithm
The hardness of finding short vectors in ideals of cyclotomic number fields (hereafter, Ideal-SVP) can serve as a worst-case assumption for numerous efficient cryptosystems, via the average-case problems Ring-SIS and Ring-LWE. For a while, it could be assumed that the Ideal-SVP problem was as hard as the analogous problem for arbitrary lattices.