13 research outputs found

    MECHANISM TO SPEED UP SECURE COMMUNICATION HANDSHAKES IN CONSTRAINED CONDITIONS

    It is desirable to speed up secure tunnel negotiations in constrained media where numerous clients are competing to form secure connections to destination servers or endpoints. In support of that objective, techniques are presented herein that minimize the authentication data that is transferred within encrypted tunnel handshakes under constrained conditions where that data can introduce unacceptable slowness or failures. The techniques may apply to constrained Internet of Things (IoT) environments in which bandwidth is scarce and multiple devices are competing for it. The techniques may also be used on the Internet with post-quantum algorithms, which can introduce unnecessary slowness due to their long keys and signatures. Other environments that may benefit include, for example, a Wireless Smart Utility Network (Wi-SUN), an Institute of Electrical and Electronics Engineers (IEEE) 802.15.4 network, virtual private networks (VPNs) and Zero-Trust Access networks, Web acceleration or proxy functions, etc.

    Do we need to change some things? Open questions posed by the upcoming post-quantum migration to existing standards and deployments

    Cryptographic algorithms are vital components ensuring the privacy and security of computer systems. They have constantly improved and evolved over the years following new developments, attacks, breaks, and lessons learned. A recent example is that of quantum-resistant cryptography, which has gained a lot of attention in the last decade and is leading to new algorithms being standardized today. These algorithms, however, present a real challenge: they come with strikingly different size and performance characteristics than their classical counterparts. At the same time, common foundational aspects of our transport protocols have lagged behind, as the Internet remains a very diverse space in which different use-cases and parts of the world have different needs. This vision paper motivates more research and possible standards updates related to the upcoming quantum-resistant cryptography migration. It stresses the importance of amplification-reflection-attack and congestion-control concerns in transport protocols and presents research and standardization takeaways for assessing their impact and the efficacy of potential countermeasures. It emphasizes the need to go beyond the standardization of key encapsulation mechanisms in order to address the numerous protocols and deployments of public-key encryption while avoiding pitfalls. Finally, it motivates the critical need for research in anonymous credentials and blind signatures at the core of numerous deployments and standardization efforts aimed at providing privacy-preserving trust signals.

    LMS vs XMSS: Comparison of two Hash-Based Signature Standards

    Quantum computing poses challenges to public key signatures as we know them today. LMS and XMSS are two hash-based signature schemes that have been proposed in the IETF as quantum-secure. Both schemes are based on well-studied hash trees, but their similarities and differences have not yet been discussed. In this work, we attempt to compare the two standards. We compare their security assumptions and quantify their signature and public key sizes. We also address the computation overhead they introduce. Our goal is to provide a clear understanding of the schemes’ similarities and differences so that implementers and protocol designers are able to decide which standard to choose.
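    As a rough, supplementary illustration of the size comparison, the sketch below recomputes public-key and signature sizes for one common parameter set of each scheme, following the encodings in RFC 8554 (LMS) and RFC 8391 (XMSS). The specific parameter choices (SHA-256, n = 32, tree height h = 10) are assumptions made for illustration, not necessarily the sets evaluated in the paper.

        # Size arithmetic for LMS (RFC 8554) and XMSS (RFC 8391) signatures.

        def lms_sizes(n=32, p=34, h=10):
            """LMS_SHA256_M32_H10 with LMOTS_SHA256_N32_W8 (p = 34 hash chains)."""
            ots_sig = 4 + n + p * n           # OTS type || C || p hash-chain values
            sig = 4 + ots_sig + 4 + h * n     # q || OTS signature || LMS type || auth path
            pub = 4 + 4 + 16 + n              # LMS type || OTS type || I || root
            return pub, sig

        def xmss_sizes(n=32, wots_len=67, h=10):
            """XMSS-SHA2_10_256 (WOTS+ len = 67)."""
            sig = 4 + n + wots_len * n + h * n   # idx || r || WOTS+ signature || auth path
            pub = 4 + n + n                      # OID || root || seed
            return pub, sig

        for name, (pub, sig) in (("LMS ", lms_sizes()), ("XMSS", xmss_sizes())):
            print(f"{name}: public key {pub} B, signature {sig} B")
        # LMS : public key 56 B, signature 1452 B
        # XMSS: public key 68 B, signature 2500 B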

    The impact of data-heavy, post-quantum TLS 1.3 on the Time-To-Last-Byte of real-world connections

    It has been shown that post-quantum key exchange and authentication with ML-KEM and ML-DSA, NIST’s post-quantum algorithm picks, will have an impact on the performance of TLS 1.3 as used on the Web and in other applications. Studies so far have focused on the overhead of quantum-resistant algorithms on the TLS time-to-first-byte (handshake time). Although these works have been important in quantifying the slowdown in connection establishment, they do not capture the full picture regarding real-world TLS 1.3 connections, which carry sizable amounts of data. Intuitively, the introduction of an extra 10 KB of ML-KEM and ML-DSA exchanges in the connection negotiation will inflate the connection establishment time proportionally more than it will increase the total connection time of a Web connection carrying 200 KB of data. In this work, we quantify the impact of ML-KEM and ML-DSA on typical TLS 1.3 connections which transfer a few hundred KB from the server to the client. We study the slowdown in the time-to-last-byte of post-quantum connections under normal network conditions and in more unstable environments with high packet delay variability and loss probabilities. We show that the impact of ML-KEM and ML-DSA on the TLS 1.3 time-to-last-byte under stable network conditions is lower than the impact on the handshake and diminishes as the transferred data increases. The time-to-last-byte increase stays below 5% for high-bandwidth, stable networks. It drops from a 32% increase in handshake time to an under-15% increase in time-to-last-byte when transferring 50 KiB of data or more under low-bandwidth, stable network conditions. Even when congestion control affects connection establishment, the additional slowdown drops below 10% as the connection data increases to 200 KiB. We also show that connections under lossy or volatile network conditions could see a higher impact from post-quantum handshakes, but these connections’ time-to-last-byte increase still drops as the transferred data increases. Finally, we show that such connections are already significantly slow and volatile regardless of the TLS handshake.
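    The intuition in the abstract can be illustrated with a toy, purely byte-proportional model: a fixed amount of extra handshake data matters less as the connection carries more payload. The ~10 KB figure comes from the abstract; the assumed ~4 KB classical handshake volume and the byte-level model are illustrative assumptions, not the paper's methodology, which measures actual time-to-last-byte under real network conditions.

        # Toy model: relative increase in bytes on the wire from ~10 KB of extra
        # post-quantum handshake data, as a function of the payload the connection carries.
        BASELINE_HANDSHAKE_KB = 4    # assumed classical TLS 1.3 handshake volume
        EXTRA_PQ_KB = 10             # extra ML-KEM + ML-DSA data, per the abstract

        for payload_kb in (0, 50, 200, 1000):
            classical = BASELINE_HANDSHAKE_KB + payload_kb
            post_quantum = classical + EXTRA_PQ_KB
            increase = (post_quantum - classical) / classical * 100
            print(f"{payload_kb:>5} KB payload: ~{increase:5.1f}% more bytes on the wire")
        # 0 KB -> 250%, 50 KB -> ~18.5%, 200 KB -> ~4.9%, 1000 KB -> ~1.0%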

    PQ-HPKE: Post-Quantum Hybrid Public Key Encryption

    Public key cryptography is used to asymmetrically establish keys and to authenticate or encrypt data between communicating parties, at a relatively high performance cost. To reduce computational overhead, modern network protocols combine asymmetric primitives for key establishment and authentication with symmetric ones. Similarly, Hybrid Public Key Encryption (HPKE), a relatively new scheme, uses public key cryptography for key derivation and symmetric key cryptography for data encryption. In this paper, we present the first quantum-resistant implementation of HPKE to address the concerns that quantum computers bring to asymmetric algorithms. We propose PQ-only and PQ-hybrid HPKE variants and analyze their performance for two post-quantum key encapsulation mechanisms and various plaintext sizes. We compare these variants with RSA and classical HPKE and show that the additional post-quantum overhead is amortized over the plaintext size. Our PQ-hybrid variant with a lattice-based KEM shows an overhead of 52% for 1 KB of encrypted data, which is reduced to 17% for 1 MB of plaintext. We report 1.83, 1.78, and 2.15 ×10^6 clock cycles for encrypting 1 MB of plaintext with classical, PQ-only, and PQ-hybrid HPKE respectively, and note that the cost of introducing quantum resistance to HPKE is relatively low.
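    A minimal sketch of the PQ-hybrid pattern described above: one classical key exchange (X25519 here) and one post-quantum KEM both contribute to the key schedule, and a symmetric AEAD encrypts the payload. The pq_kem_encapsulate() call is a hypothetical placeholder for an ML-KEM-style library; everything else uses the pyca/cryptography package. This shows only the combine-then-derive idea, not the exact RFC 9180 HPKE key schedule or the paper's implementation.

        import os
        from cryptography.hazmat.primitives import hashes
        from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey, X25519PublicKey
        from cryptography.hazmat.primitives.ciphers.aead import AESGCM
        from cryptography.hazmat.primitives.kdf.hkdf import HKDF

        def pq_kem_encapsulate(pq_public_key: bytes) -> tuple[bytes, bytes]:
            """Hypothetical placeholder for a post-quantum KEM encapsulation (e.g. ML-KEM)."""
            raise NotImplementedError("plug in a real PQ KEM library here")

        def hybrid_seal(recipient_x25519: X25519PublicKey, recipient_pq: bytes,
                        plaintext: bytes, aad: bytes = b"") -> dict:
            # Classical shared secret from an ephemeral X25519 exchange.
            eph = X25519PrivateKey.generate()
            ss_classical = eph.exchange(recipient_x25519)
            # Post-quantum shared secret from KEM encapsulation (placeholder above).
            pq_ciphertext, ss_pq = pq_kem_encapsulate(recipient_pq)
            # Combine both secrets into one AEAD key; concatenate-then-HKDF is one
            # common combiner, not necessarily the construction the paper uses.
            key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                       info=b"pq-hybrid-demo").derive(ss_classical + ss_pq)
            nonce = os.urandom(12)
            ciphertext = AESGCM(key).encrypt(nonce, plaintext, aad)
            return {"enc_classical": eph.public_key(), "enc_pq": pq_ciphertext,
                    "nonce": nonce, "ciphertext": ciphertext}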

    Post-Quantum Authentication in TLS 1.3: A Performance Study

    The potential development of large-scale quantum computers is raising concerns among IT and security research professionals due to their ability to solve (elliptic curve) discrete logarithm and integer factorization problems in polynomial time. All currently used public key algorithms would be deemed insecure in a post-quantum (PQ) setting. In response, the National Institute of Standards and Technology (NIST) has initiated a process to standardize quantum-resistant crypto algorithms, focusing primarily on their security guarantees. Since PQ algorithms differ significantly from classical ones, their overall evaluation should not be performed out of context. This work presents a detailed performance evaluation of the NIST signature algorithm candidates and investigates the latency they impose on TLS 1.3 connection establishment under realistic network conditions. In addition, we investigate their impact on TLS session throughput and analyze the trade-off between lengthy PQ signatures and computationally heavy PQ cryptographic operations. Our results demonstrate that the adoption of at least two PQ signature algorithms would be viable with little additional overhead over current signature algorithms. Also, we argue that many NIST PQ candidates can effectively be used for less time-sensitive applications, and provide an in-depth discussion on the integration of PQ authentication in encrypted tunneling protocols, along with the related challenges, improvements, and alternatives. Finally, we propose and evaluate the combination of different PQ signature algorithms across the same certificate chain in TLS. Results show a reduction in the TLS handshake time and a significant increase in a server's TLS tunnel connection rate compared to using a single PQ signature scheme.
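    The mixed-chain idea can be made concrete with some rough size arithmetic: the authentication bytes a TLS 1.3 server sends depend on every public key and signature in the chain, so placing a small-signature scheme where signatures dominate pays off. The byte counts below are commonly cited sizes for Dilithium2 and Falcon-512; the two-certificate chain (root certificate not sent) and the omission of X.509 metadata are simplifying assumptions, not the paper's measurements.

        # (public key bytes, signature bytes) for two NIST PQ signature candidates
        SIZES = {"dilithium2": (1312, 2420), "falcon512": (897, 666)}

        def chain_auth_bytes(ee: str, intermediate: str, root: str) -> int:
            ee_pk, ee_sig = SIZES[ee]
            int_pk, int_sig = SIZES[intermediate]
            _, root_sig = SIZES[root]
            # EE certificate: EE public key, signed by the intermediate.
            # Intermediate certificate: its public key, signed by the root (root cert not sent).
            # CertificateVerify: a fresh signature made with the EE key.
            return (ee_pk + int_sig) + (int_pk + root_sig) + ee_sig

        print("all Dilithium2:                 ", chain_auth_bytes("dilithium2", "dilithium2", "dilithium2"))
        print("Falcon-512 leaf, Dilithium2 CAs:", chain_auth_bytes("falcon512", "dilithium2", "dilithium2"))
        print("all Falcon-512:                 ", chain_auth_bytes("falcon512", "falcon512", "falcon512"))
        # 9884, 7715 and 3792 bytes respectively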

    Post-Quantum Hash-Based Signatures for Secure Boot

    The potential development of large-scale quantum computers is raising concerns among IT and security research professionals due to their ability to solve (elliptic curve) discrete logarithm and integer factorization problems in polynomial time. All currently used public-key cryptography algorithms would be deemed insecure in a post-quantum setting. In response, the United States National Institute of Standards and Technology has initiated a process to standardize quantum-resistant cryptographic algorithms, focusing primarily on their security guarantees. Additionally, the Internet Engineering Task Force has published two quantum-secure signature schemes and has been looking into adding quantum-resistant algorithms to protocols. In this work, we investigate two post-quantum, hash-based signature schemes published by the Internet Engineering Task Force and submitted to the National Institute of Standards and Technology for use in secure boot. We evaluate various parameter sets for the use-cases in question and we prove that post-quantum signatures would not have a material impact on image signing. We also study the hierarchical design of these signatures in different scenarios of hardware secure boot.
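    For the hierarchical designs mentioned above, a rough size calculation helps convey the trade-off. The sketch assumes HSS (the multi-level construction from RFC 8554) with the same SHA-256, h = 10, w = 8 LMS parameters at every level; a common deployment pattern (not necessarily the paper's) is to certify lower-level signing trees with a long-lived top-level tree.

        # HSS (RFC 8554) signature size for an L-level LMS hierarchy.
        LMS_SIG = 1452   # LMS_SHA256_M32_H10 + LMOTS_SHA256_N32_W8 signature, bytes
        LMS_PUB = 56     # matching LMS public key, bytes

        def hss_signature_bytes(levels: int) -> int:
            # Nspk counter + (L - 1) signed public keys + bottom-tree signature
            return 4 + (levels - 1) * (LMS_SIG + LMS_PUB) + LMS_SIG

        for levels in (1, 2, 3):
            print(f"{levels}-level HSS signature: {hss_signature_bytes(levels)} bytes")
        # 1456, 2964 and 4472 bytes respectively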

    The Viability of Post-quantum X.509 Certificates

    If quantum computers were built, they would pose concerns for public key cryptography as we know it. Among other cryptographic techniques, they would jeopardize the use of PKI X.509 certificates (RSA, ECDSA) used today for authentication. To overcome the concern, new quantum-secure signature schemes have been proposed in the literature. Most of these schemes have significantly larger public key and signature sizes than the ones used today. Even though post-quantum signatures could work well for some use-cases like software signing, there are concerns about the effect their size and processing cost would have on technologies using X.509 certificates. In this work, we investigate the viability of post-quantum signatures in X.509 certificates and protocols that use them (e.g. TLS, IKEv2). We prove that, in spite of common concerns, they could work in today's protocols and could be a viable solution to the emergence of quantum computing. We also quantify the overhead they introduce in protocol connection establishment and show that even though it is significant, it is not detrimental. Finally, we formalize the areas of further testing necessary to conclusively establish that the signature schemes standardized in NIST's PQ Project can work with X.509 certificates in a post-quantum Internet.

    Post-Quantum LMS and SPHINCS+ Hash-Based Signatures for UEFI Secure Boot

    The potential development of large-scale quantum computers is raising concerns among IT and security research professionals due to their ability to solve (elliptic curve) discrete logarithm and integer factorization problems in polynomial time. This would jeopardize IT security as we know it. In this work, we investigate two quantum-safe, hash-based signature schemes published by the Internet Engineering Task Force and submitted to the National Institute of Standards and Technology for use in secure boot. We evaluate various parameter sets for the use-case in question and we prove that post-quantum signatures with less than one second of signing time and less than 10 ms of verification time would not have a material impact (less than 1‰) on secure boot. We evaluate the hierarchical design of these signatures in hardware-based and virtual secure boot. In addition, we develop Hardware Description Language code and show that the code footprint is just a few kilobytes in size, which would fit easily in almost all modern FPGAs. We also analyze and evaluate potential challenges for integration in existing technologies and we discuss considerations for vendors embarking on a journey of image signing with hash-based signatures.
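    As a quick sanity check on the "less than 1‰" figure (an illustration in which the total boot time is an assumed value, not one reported in the abstract): signing happens offline at build time, so only verification sits on the boot path.

        # Relative cost of hash-based signature verification during secure boot.
        boot_time_ms = 10_000   # assumed total secure-boot time (~10 s); not from the paper
        verify_ms = 10          # per-image verification time, per the abstract
        images_verified = 1     # assumed single image verified at boot
        print(f"verification overhead: {images_verified * verify_ms / boot_time_ms:.1%}")
        # -> 0.1%, i.e. 1 per mille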

    Two PQ Signature Use-cases: Non-issues, challenges and potential solutions.

    The recent advances and attention to quantum computing have raised serious security concerns among IT professionals. The ability of a quantum computer to efficiently solve (elliptic curve) discrete logarithm and integer factorization problems poses a threat to current public key exchange, encryption, and digital signature schemes. Consequently, in 2016 NIST initiated an open call for quantum-resistant crypto algorithms. This process, currently in its second round, has yielded nine signature algorithms for possible standardization. In this work, we evaluate two post-quantum signature use-cases and analyze the signature schemes that seem most appropriate for them. We first consider hash-based signatures for software signing and secure boot. We propose suitable parameters and show that their acceptable performance makes them good candidates for image signing. We then evaluate NIST candidate post-quantum signatures for TLS 1.3. We show that Dilithium and Falcon are the best available options, but they come with an impact on TLS performance. Finally, we present the challenges introduced by these algorithms and potential solutions.