
    Can NSEC5 be practical for DNSSEC deployments?

    NSEC5 is a proposed modification to DNSSEC that simultaneously guarantees two security properties: (1) privacy against offline zone enumeration, and (2) integrity of zone contents, even if an adversary compromises the authoritative nameserver responsible for responding to DNS queries for the zone. This paper redesigns NSEC5 to make it both practical and performant. Our NSEC5 redesign features a new fast verifiable random function (VRF) based on elliptic curve cryptography (ECC), along with a cryptographic proof of its security. This VRF is also of independent interest, as it is being standardized by the IETF and used by several other projects. We show how to integrate NSEC5 using our ECC-based VRF into the DNSSEC protocol, leveraging precomputation to improve performance and DNS protocol-level optimizations to shorten responses. Next, we present the first full-fledged implementation of NSEC5, extending widely used DNS software to provide a nameserver and recursive resolver that support NSEC5, and evaluate their performance under aggressive DNS query loads. Our performance results indicate that our redesigned NSEC5 can be viable even for high-throughput scenarios. https://eprint.iacr.org/2017/099.pdf (first-author draft)
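
    As a rough illustration of the idea, the sketch below simulates the VRF with an HMAC; the real protocol uses the ECVRF, signed NSEC5 records, and DNS wire formats, so every name and function here is a simplification rather than the actual mechanism. Existing names are hashed with a keyed function known only to the zone owner, the hashes are sorted, and a denial answer points at the gap that covers the hash of the queried name, so a resolver learns nothing about other names in the zone.

    # Conceptual sketch of NSEC5-style authenticated denial of existence.
    # An HMAC stands in for the VRF so the example runs self-contained; a real
    # deployment would return a VRF proof plus a signed gap ("NSEC5") record.
    import bisect
    import hashlib
    import hmac

    SECRET_VRF_KEY = b"zone-vrf-key"   # held by the zone owner, never by resolvers

    def vrf_hash(name: str) -> bytes:
        """Deterministic but unpredictable hash of a name (stand-in for a VRF output)."""
        return hmac.new(SECRET_VRF_KEY, name.encode(), hashlib.sha256).digest()

    # The zone owner precomputes and sorts the hashes of all names that exist.
    zone_names = ["a.example.", "mail.example.", "www.example."]
    hashed_zone = sorted(vrf_hash(n) for n in zone_names)

    def deny(query_name: str):
        """Return the query's hash and the adjacent pair of zone hashes covering it,
        showing that no existing name hashes into that gap."""
        h = vrf_hash(query_name)
        i = bisect.bisect_left(hashed_zone, h)
        if i < len(hashed_zone) and hashed_zone[i] == h:
            raise ValueError("name exists; no denial needed")
        prev_hash = hashed_zone[i - 1]              # wraps to the last hash when i == 0
        next_hash = hashed_zone[i % len(hashed_zone)]
        return h, (prev_hash, next_hash)

    h, (lo, hi) = deny("nonexistent.example.")
    print("query hash", h.hex()[:8], "is covered by the gap", lo.hex()[:8], "->", hi.hex()[:8])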

    On the Adoption of the Elliptic Curve Digital Signature Algorithm (ECDSA) in DNSSEC

    The Domain Name System Security Extensions (DNSSEC) are steadily being deployed across the Internet. DNSSEC extends the DNS protocol with two vital security properties, authenticity and integrity, using digital signatures. While DNSSEC is meant to solve security issues in the DNS, it also introduces a new one: the digital signatures significantly increase DNS packet sizes, making DNSSEC an attractive vector to abuse in amplification denial-of-service attacks. By default, DNSSEC uses RSA for digital signatures. Earlier work has shown that alternative signature schemes, based on elliptic curve cryptography, can significantly reduce the impact of signatures on DNS response sizes. In this paper we study the actual adoption of ECDSA by DNSSEC operators, based on longitudinal datasets covering over 50% of the global DNS namespace over a period of 1.5 years. Adoption is still marginal, with just 2.3% of DNSSEC-signed domains in the .com TLD using ECDSA. Nevertheless, use of ECDSA is growing, with at least one large operator leading the pack, and adoption could be up to 42% higher. As we demonstrate, there are barriers to deployment that hamper adoption. Operators wishing to deploy DNSSEC using current recommendations (with ECDSA as the signing algorithm) must be mindful of this when planning their deployment.
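
    The size difference that drives this work is easy to reproduce with a short, hedged sketch using the third-party Python 'cryptography' package (the record data below is made up, and the library emits DER-encoded ECDSA signatures, whereas DNSSEC carries a fixed 64-byte encoding on the wire):

    # Compare signature sizes for RSA-2048 and ECDSA P-256 over the same data,
    # illustrating why elliptic curves shrink DNSSEC responses.
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import ec, padding, rsa

    rrset = b"example.com. 3600 IN A 192.0.2.1"     # stand-in for canonical RRset data

    rsa_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    rsa_sig = rsa_key.sign(rrset, padding.PKCS1v15(), hashes.SHA256())

    ec_key = ec.generate_private_key(ec.SECP256R1())
    ec_sig = ec_key.sign(rrset, ec.ECDSA(hashes.SHA256()))   # DER-encoded (r, s)

    print(f"RSA-2048 signature:    {len(rsa_sig)} bytes")    # 256 bytes
    print(f"ECDSA P-256 signature: {len(ec_sig)} bytes")     # ~70-72 bytes DER, 64 on the DNSSEC wire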

    The Reality of Algorithm Agility: Studying the DNSSEC Algorithm Life-Cycle

    The DNS Security Extensions (DNSSEC) add data origin authentication and data integrity to the Domain Name System (DNS), the naming system of the Internet. With DNSSEC, signatures are added to the information provided in the DNS using public key cryptography. Advances in both cryptography and cryptanalysis make it necessary to deploy new algorithms in DNSSEC, as well as deprecate those with weakened security. If this process is easy, then the protocol has achieved what the IETF terms "algorithm agility". In this paper, we study the lifetime of algorithms for DNSSEC. This includes: (i) standardizing the algorithm, (ii) implementing support in DNS software, (iii) deploying new algorithms at domains and recursive resolvers, and (iv) replacing deprecated algorithms. Using data from more than 6.7 million signed domains and over 10,000 vantage points in the DNS, combined with qualitative studies, we show that DNSSEC has only partially achieved algorithm agility. Standardizing new algorithms and deprecating insecure ones can take years. We highlight the main barriers to getting new algorithms deployed, but also discuss success factors. This study provides key insights to take into account when new algorithms are introduced, for example when the Internet must transition to quantum-safe public key cryptography.
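
    Checking which algorithms a given signed zone announces is a small exercise; the sketch below uses the third-party dnspython package and is an illustration of the idea only, not the measurement pipeline used in this study:

    # Look up a zone's DNSKEY RRset and map the algorithm numbers to names.
    # Algorithm numbers follow the IANA DNSSEC registry.
    import dns.resolver

    ALGORITHMS = {5: "RSASHA1", 7: "RSASHA1-NSEC3-SHA1", 8: "RSASHA256",
                  10: "RSASHA512", 13: "ECDSAP256SHA256", 14: "ECDSAP384SHA384",
                  15: "ED25519", 16: "ED448"}

    def zone_algorithms(zone: str) -> set:
        """Return the signing algorithms announced in the zone's DNSKEY RRset."""
        answer = dns.resolver.resolve(zone, "DNSKEY")
        return {ALGORITHMS.get(int(rr.algorithm), f"unknown({int(rr.algorithm)})") for rr in answer}

    print(zone_algorithms("example.com."))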

    NSEC5, DNSSEC authenticated denial of existence

    The Domain Name System Security Extensions (DNSSEC) introduced two resource records (RR) for authenticated denial of existence: the NSEC RR and the NSEC3 RR. This document introduces NSEC5 as an alternative mechanism for DNSSEC authenticated denial of existence. NSEC5 uses verifiable random functions (VRFs) to prevent offline enumeration of zone contents. NSEC5 also protects the integrity of the zone contents even if an adversary compromises one of the authoritative servers for the zone. Integrity is preserved because NSEC5 does not require private zone-signing keys to be present on all authoritative servers for the zone, in contrast to DNSSEC online signing schemes like NSEC3 White Lies. https://datatracker.ietf.org/doc/draft-vcelak-nsec5/ (first-author draft)

    Retrofitting Post-Quantum Cryptography in Internet Protocols: A Case Study of DNSSEC

    Quantum computing is threatening current cryptography, especially the asymmetric algorithms used in many Internet protocols. More secure algorithms, colloquially referred to as Post-Quantum Cryptography (PQC), are under active development. These new algorithms differ significantly from current ones: they can have larger signatures or keys, and often require more computational power. This means we cannot simply replace existing algorithms with PQC alternatives, but need to evaluate whether they meet the requirements of the Internet protocols that rely on them. In this paper we provide a case study, analyzing the impact of PQC on the Domain Name System (DNS) and its Security Extensions (DNSSEC). In its main role, DNS translates human-readable domain names to IP addresses, and DNSSEC guarantees message integrity and authenticity. DNSSEC is particularly challenging to transition to PQC, since DNSSEC and its underlying transport protocols require small signatures and keys and efficient validation. We evaluate the candidate PQC signature algorithms in the third round of the NIST competition on their suitability for use in DNSSEC. We show that three algorithms partially meet DNSSEC's requirements, but also show where and how we would still need to adapt DNSSEC. Thus, our research lays the foundation for making DNSSEC, and protocols with similar constraints, ready for PQC.
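
    The size constraint can be made concrete with a back-of-the-envelope sketch; the per-algorithm figures are approximate published parameter-set sizes and the 200-byte allowance for the rest of the message is an assumption, so none of the numbers are taken from this paper:

    # Do signatures of various algorithms leave room in a DNS response under the
    # commonly recommended 1232-byte EDNS0 UDP buffer?
    EDNS_BUFFER = 1232        # bytes; widely used default to avoid fragmentation
    HEADER_BUDGET = 200       # rough allowance for the rest of the DNS message (assumption)

    SIGNATURE_BYTES = {       # approximate sizes of a single signature
        "RSA-2048": 256,
        "ECDSA P-256": 64,
        "Ed25519": 64,
        "Falcon-512": 666,      # NIST round 3, padded signature
        "Dilithium2": 2420,     # NIST round 3
        "SPHINCS+-128s": 7856,  # NIST round 3, small-signature variant
    }

    for algorithm, sig in SIGNATURE_BYTES.items():
        verdict = "fits in UDP" if sig + HEADER_BUDGET <= EDNS_BUFFER else "needs TCP or fragmentation"
        print(f"{algorithm:<14} {sig:>5} B  ->  {verdict}")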

    Making the Case for Elliptic Curves in DNSSEC

    The Domain Name System Security Extensions (DNSSEC) add authenticity and integrity to the DNS, improving its security. Unfortunately, DNSSEC is not without problems. DNSSEC adds digital signatures to the DNS, significantly increasing the size of DNS responses. This means DNSSEC is more susceptible to packet fragmentation and makes DNSSEC an attractive vector to abuse in amplification-based denial-of-service attacks. Additionally, key management policies are often complex. This makes DNSSEC fragile and leads to operational failures. In this paper, we argue that the choice of RSA as the default cryptosystem in DNSSEC is a major factor in these three problems. Alternative cryptosystems, based on elliptic curve cryptography (ECDSA and EdDSA), exist but are rarely used in DNSSEC. We show that these are highly attractive for use in DNSSEC, although they also have disadvantages. To address these, we have initiated research that aims to investigate the viability of deploying ECC at a large scale in DNSSEC.
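
    The amplification concern boils down to simple arithmetic; the sizes below are rough assumptions for illustration, not measurements from this paper:

    # Rough amplification-factor estimate: a small spoofed query elicits a much
    # larger signed response, and smaller ECC signatures shrink that ratio.
    QUERY_BYTES = 68                      # typical query with an EDNS0 option (assumption)

    def response_size(num_sigs: int, sig_bytes: int, base: int = 300) -> int:
        """Rough response size: fixed records plus one RRSIG per signed RRset."""
        return base + num_sigs * sig_bytes

    for label, sig in [("RSA-2048", 256), ("ECDSA P-256", 64)]:
        resp = response_size(num_sigs=4, sig_bytes=sig)
        print(f"{label}: response ~{resp} B, amplification ~{resp / QUERY_BYTES:.1f}x")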

    Making DNSSEC Future Proof


    A mechanism for guaranteeing the authenticity of digital identity documents using digital signatures, DNSSEC and Blockchain

    Undergraduate final project (Trabalho de Conclusão de Curso), Universidade de Brasília, Faculdade de Tecnologia, Departamento de Engenharia Elétrica, 2018. This project proposes a mechanism for guaranteeing the authenticity of digital identity documents using Digital Signatures, DNSSEC and Blockchain. Although there are papers proposing similar implementations, none of them addresses public key distribution via the DNS or provides a way of implementing trusted timestamping. The goal of this project is to propose a mechanism with the aforementioned characteristics, implement a proof of concept and evaluate it using crafted attack scenarios. Finally, a few improvements that were not implemented are presented as future work.

    A kilobit hidden SNFS discrete logarithm computation

    We perform a special number field sieve discrete logarithm computation in a 1024-bit prime field. To our knowledge, this is the first kilobit-sized discrete logarithm computation ever reported for prime fields. This computation took a little over two months of calendar time on an academic cluster using the open-source CADO-NFS software. Our chosen prime p looks random, and p-1 has a 160-bit prime factor, in line with recommended parameters for the Digital Signature Algorithm. However, our p has been trapdoored in such a way that the special number field sieve can be used to compute discrete logarithms in F_p^*, yet detecting that p has this trapdoor seems out of reach. Twenty-five years ago, there was considerable controversy around the possibility of back-doored parameters for DSA. Our computations show that trapdoored primes are entirely feasible with current computing technology. We also describe special number field sieve discrete log computations carried out for multiple weak primes found in use in the wild. As can be expected from a trapdoor mechanism which we say is hard to detect, our research did not reveal any trapdoored prime in wide use. The only way for a user to defend against a hypothetical trapdoor of this kind is to require verifiably random primes.
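
    The recommended defense, verifiably random primes, can be sketched in a few lines: derive candidates deterministically from a published seed so that anyone can rerun the derivation and convince themselves no hidden structure was planted. The sketch below is a simplification for illustration and intentionally does not follow any particular standard (FIPS 186-4 Appendix A, for instance, prescribes a different procedure):

    # Simplified sketch of a "verifiably random" prime: candidates are derived
    # deterministically from a published seed, so anyone can rerun the derivation
    # and check that the prime was not hand-picked with a hidden SNFS-friendly
    # structure. Illustrative only.
    import hashlib
    import random

    def is_probable_prime(n: int, rounds: int = 40) -> bool:
        """Miller-Rabin probabilistic primality test."""
        if n < 2:
            return False
        for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
            if n % p == 0:
                return n == p
        d, r = n - 1, 0
        while d % 2 == 0:
            d //= 2
            r += 1
        for _ in range(rounds):
            a = random.randrange(2, n - 1)
            x = pow(a, d, n)
            if x in (1, n - 1):
                continue
            for _ in range(r - 1):
                x = pow(x, 2, n)
                if x == n - 1:
                    break
            else:
                return False
        return True

    def seeded_prime(seed: bytes, bits: int) -> tuple:
        """Hash seed || counter into candidates until one is prime; return (prime, counter)."""
        counter = 0
        while True:
            blocks = b"".join(
                hashlib.sha256(seed + counter.to_bytes(4, "big") + i.to_bytes(4, "big")).digest()
                for i in range((bits + 255) // 256)
            )
            candidate = int.from_bytes(blocks, "big") & ((1 << bits) - 1)
            candidate |= (1 << (bits - 1)) | 1          # force the right size and oddness
            if is_probable_prime(candidate):
                return candidate, counter
            counter += 1

    # 512 bits keeps the demo quick; a DSA-style prime would use 1024 bits or more.
    prime, counter = seeded_prime(b"public, published seed", bits=512)
    print(f"prime found at counter {counter}: {hex(prime)[:20]}...")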

    Automatic generation of high speed elliptic curve cryptography code

    Apparently, trust is a rare commodity when power, money or life itself are at stake. History is full of examples. Julius Caesar did not trust his generals, so that: "If he had anything confidential to say, he wrote it in cipher, that is, by so changing the order of the letters of the alphabet, that not a word could be made out. If anyone wishes to decipher these, and get at their meaning, he must substitute the fourth letter of the alphabet, namely D, for A, and so with the others." And so the history of cryptography took its first steps. Nowadays, encryption is no longer an emperor's prerogative and has become a daily-life operation. Cryptography is pervasive, ubiquitous and, best of all, completely transparent to the unaware user. Each time we buy something on the Internet we use it. Each time we search for something on Google we use it. All without (almost) realizing that it silently protects our privacy and our secrets. Encryption is a very interesting instrument in the "toolbox of security" because it has very few side effects, at least on the user side. A particularly important one is the intrinsic slowdown that its use imposes on communications. High-speed cryptography is very important for the Internet, where busy servers proliferate. Being faster is a double advantage: more throughput and less server overhead. In this context, however, public key algorithms start with a big handicap: they perform poorly compared to their symmetric counterparts. For this reason their use is often reduced to the essential operations, most notably key exchanges and digital signatures. The high-speed public key cryptography challenge is a very practical topic with serious repercussions in our technocentric world. Using weak algorithms with a reduced key length to increase the performance of a system can lead to catastrophic results.

    In 1985, Miller and Koblitz independently proposed to use the group of rational points of an elliptic curve over a finite field to create an asymmetric algorithm. Elliptic Curve Cryptography (ECC) is based on a problem known as the ECDLP (Elliptic Curve Discrete Logarithm Problem) and offers several advantages with respect to more traditional encryption systems such as RSA and DSA. The main benefit is that it requires smaller keys to provide the same security level, since breaking the ECDLP is much harder. In addition, a good ECC implementation can be very efficient both in time and in memory consumption, thus being a good candidate for performing high-speed public key cryptography. Moreover, some elliptic curve based techniques are known to be extremely resilient to quantum computing attacks, such as SIDH (Supersingular Isogeny Diffie-Hellman).

    Traditional elliptic curve cryptography implementations are optimized by hand, taking into account the mathematical properties of the underlying algebraic structures, the target machine architecture and the compiler facilities. This process is time-consuming, requires a high degree of expertise and, ultimately, is error-prone. This dissertation's ultimate goal is to automate the whole optimization process of cryptographic code, with a special focus on ECC. The framework presented in this thesis is able to produce high-speed cryptographic code by automatically choosing the best algorithms and applying a number of code-improving techniques inspired by compiler theory. Its central component is a flexible and powerful compiler able to translate an algorithm written in a high-level language and produce highly optimized C code for a particular algebraic structure and hardware platform. The system is generic enough to accommodate a wide array of number-theory-related algorithms; however, this document focuses only on optimizing primitives based on elliptic curves defined over binary fields.
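
    Since the dissertation revolves around elliptic-curve arithmetic, a toy version of the group law may help fix ideas; the curve below is over a tiny prime field with made-up parameters purely for readability, whereas the thesis targets large binary-field curves with constant-time code:

    # Toy elliptic-curve arithmetic over a small prime field: y^2 = x^3 + a*x + b (mod p).
    # Real implementations use large fields and constant-time formulas; this only
    # shows the group law behind the ECDLP.
    p, a, b = 97, 2, 3                 # tiny, made-up curve parameters
    O = None                           # point at infinity (group identity)

    def add(P, Q):
        """Add two points on the curve using the chord-and-tangent rule."""
        if P is O:
            return Q
        if Q is O:
            return P
        (x1, y1), (x2, y2) = P, Q
        if x1 == x2 and (y1 + y2) % p == 0:
            return O                                         # P + (-P) = O
        if P == Q:
            s = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p   # tangent slope
        else:
            s = (y2 - y1) * pow(x2 - x1, -1, p) % p          # chord slope
        x3 = (s * s - x1 - x2) % p
        return (x3, (s * (x1 - x3) - y1) % p)

    def scalar_mul(k, P):
        """Double-and-add: compute k*P; recovering k from k*P is the ECDLP."""
        R = O
        while k:
            if k & 1:
                R = add(R, P)
            P = add(P, P)
            k >>= 1
        return R

    G = (3, 6)                          # on the curve: 6^2 = 36 = 27 + 6 + 3 (mod 97)
    assert (G[1] ** 2 - (G[0] ** 3 + a * G[0] + b)) % p == 0
    print("7*G =", scalar_mul(7, G))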