Improving post-quantum cryptography through cryptanalysis

Abstract

Large quantum computers pose a threat to our public-key cryptographic infrastructure. The possible responses are: do nothing, and accept that quantum computers might be used to break widely deployed protocols; mitigate the threat by switching entirely to symmetric-key protocols; or mitigate the threat by switching to different public-key protocols. Each user of public-key cryptography will make one of these choices, and we should not expect consensus. Some users will do nothing, perhaps because they view the threat as too remote. And some users will find that they never needed public-key cryptography in the first place. The work that I present here is for people who need public-key cryptography and want to switch to new protocols. Each of the three articles raises the security estimate of a cryptosystem by showing that some attack is less effective than was previously believed. Each article thereby reduces the cost of using a protocol by letting the user choose smaller (or more efficient) parameters at a fixed level of security.

In Part 1, I present joint work with Samuel Jaques in which we revise security estimates for the Supersingular Isogeny Key Encapsulation (SIKE) protocol. We show that known quantum claw-finding algorithms do not outperform classical claw-finding algorithms. This allows us to recommend 434-bit primes for use in SIKE at the security level for which 503-bit primes had previously been recommended.

In Part 2, I present joint work with Martin Albrecht, Vlad Gheorghiu, and Eamonn Postlethwaite that examines the impact of quantum search on sieving algorithms for the shortest vector problem. Cryptographers commonly assume that the cost of solving the shortest vector problem in dimension $d$ is $2^{(0.265\ldots + o(1))d}$ quantumly and $2^{(0.292\ldots + o(1))d}$ classically. These are upper bounds based on a near neighbor search algorithm due to Becker–Ducas–Gama–Laarhoven. Naively, one might think that $d$ must be at least $483$ ($\approx 128/0.265$) to avoid attacks that cost fewer than $2^{128}$ operations. Our analysis accounts for terms in the $o(1)$ that were previously ignored. In a realistic model of quantum computation, we find that applying the Becker–Ducas–Gama–Laarhoven algorithm in dimension $d > 376$ will cost more than $2^{128}$ operations. We also find reason to believe that the classical algorithm will outperform the quantum algorithm in dimensions $d < 288$.

In Part 3, I present solo work on a variant of post-quantum RSA. The original pqRSA proposal by Bernstein–Heninger–Lou–Valenta uses terabyte keys of the form $n = p_1 p_2 p_3 p_4 \cdots p_i \cdots p_{2^{31}}$, where each $p_i$ is a $4096$-bit prime. My variant uses terabyte keys of the form $n = p_1^2 p_2^3 p_3^5 p_4^7 \cdots p_i^{\pi_i} \cdots p_{20044}^{225287}$, where each $p_i$ is a $4096$-bit prime and $\pi_i$ is the $i$-th prime. Prime generation is the most expensive part of post-quantum RSA in practice, so the smaller number of prime factors in my proposal gives a large speedup in key generation. The repeated factors help an attacker identify an element of small order, and thereby allow the attacker to use a small-order variant of Shor's algorithm. I analyze small-order attacks and discuss the cost of the classical pre-computation that they require.
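To make the key shape concrete, the following minimal Python sketch (an illustration written for this summary, not code from the thesis; the `toy_key` helper and its parameters are hypothetical) builds a toy modulus $n = \prod_i p_i^{\pi_i}$ from small random primes. With the real parameters, $4096$-bit primes and $i$ running to $20044$, the key length is roughly $4096 \cdot \sum_{i=1}^{20044} \pi_i$ bits, which is on the order of a terabyte, consistent with the terabyte-key claim above.

```python
# Toy sketch of the key shape n = p_1^2 * p_2^3 * p_3^5 * ... * p_i^{pi_i},
# where pi_i is the i-th prime. Illustration only (not the thesis code):
# tiny random primes keep the product small, whereas the real proposal
# uses 4096-bit primes with i running to 20044.
from sympy import prime, randprime

def toy_key(num_factors, prime_bits=16):
    """Return (n, shape) where n = prod p_i^{pi_i} over distinct primes p_i."""
    exponents = [prime(i) for i in range(1, num_factors + 1)]  # 2, 3, 5, 7, ...
    factors = set()
    while len(factors) < num_factors:  # sample distinct random primes
        factors.add(randprime(2**(prime_bits - 1), 2**prime_bits))
    shape = list(zip(sorted(factors), exponents))
    n = 1
    for p, e in shape:
        n *= p**e
    return n, shape

n, shape = toy_key(5)
print(shape)           # e.g. [(p_1, 2), (p_2, 3), (p_3, 5), (p_4, 7), (p_5, 11)]
print(n.bit_length())  # roughly prime_bits * (2 + 3 + 5 + 7 + 11) bits
```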
