
    Optimizing Hash-Based Signatures in Java

    Hash-based signature schemes are an extensively studied and well-understood choice for quantum-safe digital signatures. However, certain operations, most notably key generation, can be comparatively expensive. It is therefore essential to use well-optimized implementations. This thesis explores, implements, and evaluates optimization strategies for hash-based signature implementations in Java. These include the use of special hardware features, such as vector instructions and hardware acceleration for hash functions, as well as the parallelization of key generation. Overall, we are able to reduce the time required for an XMSS key generation with SHA-2 by up to 96.4% (on four CPU cores) compared to the unmodified BouncyCastle implementation. For SPHINCS+ with the Haraka hash function family, we achieve a reduction of up to 95.7% on a single CPU core. Furthermore, we investigate two scheme variants proposed in the literature for verification-optimized signatures, WOTS-BR and WOTS+C. We improve the existing theoretical analysis of both, provide a comparison, and experimentally validate our improved analysis.
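
    The speedup from parallelization comes directly from the structure of the scheme: the many one-time key pairs at the leaves of an XMSS tree can be derived independently, so key generation is embarrassingly parallel. The following is a minimal sketch of that idea using Java parallel streams; the seed expansion, chain length, and class names are simplified illustrations, not the thesis's or BouncyCastle's actual code.

        import java.nio.ByteBuffer;
        import java.security.MessageDigest;
        import java.util.stream.IntStream;

        public class ParallelLeafGen {
            // Derive one leaf: hash(seed || index), then a short hash chain
            // standing in for a full WOTS+ public-key computation.
            static byte[] leaf(byte[] seed, int index) {
                try {
                    MessageDigest md = MessageDigest.getInstance("SHA-256");
                    byte[] v = md.digest(ByteBuffer.allocate(seed.length + 4)
                            .put(seed).putInt(index).array());
                    for (int i = 0; i < 16; i++) {   // toy chain length
                        v = md.digest(v);
                    }
                    return v;
                } catch (Exception e) {
                    throw new RuntimeException(e);
                }
            }

            public static void main(String[] args) {
                byte[] seed = new byte[32];          // toy all-zero seed
                int leaves = 1 << 10;                // 2^10 one-time keys
                // Each leaf is independent, so the work spreads across cores.
                byte[][] tree = IntStream.range(0, leaves)
                        .parallel()
                        .mapToObj(i -> leaf(seed, i))
                        .toArray(byte[][]::new);
                System.out.println("generated " + tree.length + " leaves");
            }
        }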

    Merkle-Damgård Construction Method and Alternatives: A Review

    A cryptographic hash function is an important tool in the field of information security. The design of the most widely used hash functions, such as MD5 and SHA-1, is based on iterating a compression function via the Merkle-Damgård construction with a constant initialization vector. The Merkle-Damgård construction shows that the security of the hash function reduces to the security of the compression function. Several attacks on hash functions based on the Merkle-Damgård construction motivated researchers to propose different cryptographic constructions to strengthen hash functions against differential and generic attacks. The cryptographic community has been looking for replacements for these weak hash functions, and new hash functions based on different variants of the Merkle-Damgård construction have been proposed. As a result of an open competition, NIST announced Keccak as the SHA-3 standard. This paper provides a review of cryptographic hash functions, their security requirements, and different design methods for compression functions.
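
    For reference, the iteration the paper reviews is compact enough to state in code: pad the message with a 1 bit, zeros, and the 64-bit message length (the MD strengthening), then fold the blocks through the compression function starting from the IV. A minimal sketch follows, with a deliberately insecure toy compression function standing in for a real one such as SHA-256's.

        import java.nio.ByteBuffer;
        import java.util.Arrays;

        public class MerkleDamgard {
            static final int BLOCK = 64;   // 512-bit message blocks
            static final int STATE = 32;   // 256-bit chaining value

            // Toy compression function: mixes the block into the state.
            // NOT secure -- a stand-in for a real compression function.
            static byte[] compress(byte[] state, byte[] block) {
                byte[] out = state.clone();
                for (int i = 0; i < BLOCK; i++) {
                    int j = i % STATE;
                    out[j] = (byte) (31 * out[j] + block[i] + i);
                }
                return out;
            }

            // MD strengthening: append 0x80, zero-pad, then 64-bit bit length.
            static byte[] pad(byte[] msg) {
                long bits = (long) msg.length * 8;
                int total = ((msg.length + 8) / BLOCK + 1) * BLOCK;
                ByteBuffer buf = ByteBuffer.allocate(total);
                buf.put(msg).put((byte) 0x80);
                buf.position(total - 8);
                buf.putLong(bits);
                return buf.array();
            }

            static byte[] hash(byte[] iv, byte[] msg) {
                byte[] h = iv;
                byte[] padded = pad(msg);
                for (int off = 0; off < padded.length; off += BLOCK) {
                    h = compress(h, Arrays.copyOfRange(padded, off, off + BLOCK));
                }
                return h;
            }
        }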

    Revisiting Dedicated and Block Cipher based Hash Functions

    A hash function maps a variable-length input to a fixed-length output. Hash functions used in information-security applications are referred to as cryptographic hash functions, and they serve as building blocks of many complex cryptographic mechanisms and protocols. The construction of a hash function consists of two components: a compression function and a domain extender. The various hash function design philosophies approach the design of the compression function from different angles. Two major categories of hash functions are dedicated hash functions and block cipher-based hash functions. These two design philosophies are revisited in this paper. Two dedicated hash functions from the MD4 family, MD4 and SHA-256, are detailed. To limit the scope of this paper, discussions of attacks on hash functions and of the SHA-3 finalists are excluded.
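
    The standard way to turn a block cipher into a compression function is the Davies-Meyer mode, H_i = E_{m_i}(H_{i-1}) XOR H_{i-1}: the message block keys the cipher, and the previous chaining value is encrypted and fed forward. Below is a minimal sketch using AES-128 as the block cipher; it illustrates the general mode, not a specific construction from the paper.

        import javax.crypto.Cipher;
        import javax.crypto.spec.SecretKeySpec;

        public class DaviesMeyer {
            // One compression step: H' = E_m(H) XOR H, with a 16-byte chaining
            // value and a 16-byte message block acting as the AES-128 key.
            static byte[] compress(byte[] chain, byte[] msgBlock) throws Exception {
                Cipher aes = Cipher.getInstance("AES/ECB/NoPadding");
                aes.init(Cipher.ENCRYPT_MODE, new SecretKeySpec(msgBlock, "AES"));
                byte[] enc = aes.doFinal(chain);      // E_m(H)
                for (int i = 0; i < enc.length; i++) {
                    enc[i] ^= chain[i];               // feed-forward XOR
                }
                return enc;
            }

            public static void main(String[] args) throws Exception {
                byte[] iv = new byte[16];             // toy all-zero IV
                byte[] block = "sixteen byte msg".getBytes();
                System.out.println(compress(iv, block).length + "-byte chaining value");
            }
        }

    The feed-forward XOR is essential: without it, anyone holding the message block could decrypt the output and invert the step.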

    The suffix-free-prefix-free hash function construction and its indifferentiability security analysis

    In this paper, we observe that in the seminal work on indifferentiability analysis of iterated hash functions by Coron et al., and in subsequent works, the initial value (IV) of hash functions is fixed. In addition, these indifferentiability results do not depend on the Merkle–Damgård (MD) strengthening in the padding functionality of the hash functions. We propose a generic n-bit iterated hash function framework based on an n-bit compression function, called suffix-free-prefix-free (SFPF), that works for arbitrary IVs and does not possess MD strengthening. We formally prove that SFPF is indifferentiable from a random oracle (RO) when the compression function is viewed as a fixed-input-length random oracle (FIL-RO). We show that some hash function constructions proposed in the literature fit the SFPF framework, while others that do not fit this framework are not indifferentiable from a RO. We also show that the SFPF hash function framework with the provision of MD strengthening generalizes any n-bit iterated hash function that is based on an n-bit compression function with an n-bit chaining value and is proven indifferentiable from a RO.
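
    The paper's SFPF padding is not reproduced here, but prefix-freeness itself is easy to illustrate: if no valid encoding of one message is a prefix of the encoding of another, distinct messages can never give the iteration input streams that extend one another. One textbook way to achieve this, shown below purely as an illustration (it is not the SFPF scheme), is to prepend the message length.

        import java.nio.ByteBuffer;

        public class PrefixFreeEncode {
            static final int BLOCK = 64;   // pad encodings to 64-byte blocks

            // Length-prepended encoding: the total message length is committed
            // in the first 8 bytes, so the encoding of one message can never
            // be a proper prefix of the encoding of a different message.
            static byte[] encode(byte[] msg) {
                int body = 8 + msg.length;
                int total = ((body + BLOCK - 1) / BLOCK) * BLOCK;  // zero-pad up
                ByteBuffer buf = ByteBuffer.allocate(total);
                buf.putLong(msg.length);
                buf.put(msg);
                return buf.array();
            }

            public static void main(String[] args) {
                System.out.println(encode("abc".getBytes()).length);  // 64
                System.out.println(encode(new byte[100]).length);     // 128
            }
        }

    Length-prepending has a well-known drawback: the full message length must be known before hashing begins, which is one motivation for alternative framings such as SFPF.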

    Backdoored Hash Functions: Immunizing HMAC and HKDF

    Security of cryptographic schemes is traditionally measured as the inability of resource-constrained adversaries to violate a desired security goal. The security argument usually relies on a sound design of the underlying components. Arguably, one of the most devastating failures of this approach can be observed when considering adversaries such as intelligence agencies that can influence the design, implementation, and standardization of cryptographic primitives. While the most prominent example of a cryptographic backdoor is NIST's Dual_EC_DRBG, believing that such attempts ended there would be naive. The security of many cryptographic tasks, such as digital signatures, pseudorandom generation, and password protection, crucially relies on the security of hash functions. In this work, we consider the question of how backdoors can endanger the security of hash functions and, especially, whether and how we can thwart such backdoors. We particularly focus on immunizing arbitrarily backdoored versions of HMAC (RFC 2104) and the hash-based key derivation function HKDF (RFC 5869), which are widely deployed in critical protocols such as TLS. We give evidence that the weak pseudorandomness property of the compression function in the hash function is in fact robust against backdooring. This positive result allows us to build a backdoor-resistant pseudorandom function, i.e., a variant of HMAC, and we show that HKDF can be immunized against backdoors at little cost. Unfortunately, we also argue that safeguarding unkeyed hash functions against backdoors is presumably hard.
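
    For context, HKDF (RFC 5869) is an extract-then-expand design built entirely from HMAC: PRK = HMAC(salt, IKM), then output blocks T(i) = HMAC(PRK, T(i-1) || info || i) are concatenated. A minimal sketch of the standard construction follows (the plain RFC 5869 scheme, not the paper's immunized variant; the helper names are ours).

        import javax.crypto.Mac;
        import javax.crypto.spec.SecretKeySpec;
        import java.io.ByteArrayOutputStream;

        public class Hkdf {
            static byte[] hmac(byte[] key, byte[]... chunks) throws Exception {
                Mac mac = Mac.getInstance("HmacSHA256");
                mac.init(new SecretKeySpec(key, "HmacSHA256"));
                for (byte[] c : chunks) mac.update(c);
                return mac.doFinal();
            }

            // Extract: PRK = HMAC(salt, IKM)
            static byte[] extract(byte[] salt, byte[] ikm) throws Exception {
                return hmac(salt, ikm);
            }

            // Expand: T(i) = HMAC(PRK, T(i-1) || info || i), concatenated
            static byte[] expand(byte[] prk, byte[] info, int length) throws Exception {
                ByteArrayOutputStream okm = new ByteArrayOutputStream();
                byte[] t = new byte[0];
                for (int i = 1; okm.size() < length; i++) {
                    t = hmac(prk, t, info, new byte[]{(byte) i});
                    okm.write(t, 0, Math.min(t.length, length - okm.size()));
                }
                return okm.toByteArray();
            }

            public static void main(String[] args) throws Exception {
                byte[] prk = extract("salt".getBytes(), "input key material".getBytes());
                byte[] okm = expand(prk, "context info".getBytes(), 42);
                System.out.println(okm.length + " bytes of output key material");
            }
        }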

    Cryptanalysis of Hash Functions

    The aim of this thesis is to evaluate the applicability of the recently developed biclique technique [KRS11] to the preimage attack performed by Sasaki and Aoki [SA09]. This leads to a slightly improved time complexity of 2^{121.3} compression function evaluations and a greatly improved memory complexity of 2^{20.7} 32-bit memory words. Thanks to this reasonable memory requirement, an attack faster than brute force can actually be implemented, though its execution time would still be infeasible.

    Security Definitions For Hash Functions: Combining UCE and Indifferentiability

    Hash functions are one of the most important cryptographic primitives, but their desired security properties have proven remarkably hard to formalize. To prove the security of a protocol using a hash function, the random oracle model (ROM) is nowadays often used due to its simplicity and its strong security guarantees. Moreover, hash function constructions are commonly proven secure by showing them to be indifferentiable from a random oracle when using an ideal compression function. However, it is well known that no hash function realizes a random oracle and no real compression function realizes an ideal one. As an alternative to the ROM, Bellare et al. recently proposed the notion of universal computational extractors (UCE). This notion formalizes that a family of functions "behaves like a random oracle" for "real-world" protocols while avoiding the general impossibility results. However, in contrast to the indifferentiability framework, UCE is formalized as a multi-stage game without clear composition guarantees. As a first contribution, we introduce context-restricted indifferentiability (CRI), a generalization of indifferentiability that allows us to model that the random oracle does not compose generally but can only be used within a well-specified set of protocols run by the honest parties, thereby making the provided composition guarantees explicit. We then show that UCE and its variants can be phrased as a special case of CRI. Moreover, we show how our notion of CRI leads to generalizations of UCE. As a second contribution, we prove that the hash function constructed by Merkle-Damgård satisfies one of the well-known UCE variants if we assume that the compression function satisfies one of our generalizations of UCE, basing the overall security on a plausible assumption. This result further validates the Merkle-Damgård construction and shows that UCE-like assumptions can serve both as a valid reference point for modular protocol analyses and for the design of hash functions, linking those two aspects in a framework with explicit composition guarantees.

    The Symbiosis between Collision and Preimage Resistance

    We revisit the definitions of preimage resistance, focusing on the question of finding a definition that is simple enough to prove security against, yet flexible enough to be of use for most applications. We give an in-depth analysis of existing preimage resistance notions, introduce several new notions, and establish relations and separations between the known and new preimage notions. This establishes a clear separation between domain-oriented and range-oriented preimage resistance notions. For the former, an element is chosen from the domain and hashed to form the target digest; for the latter, the target digest is chosen directly from the range. In particular, we show that Rogaway and Shrimpton's notion of everywhere preimage resistance is, on its own, less powerful than previously thought. However, we prove that, in conjunction with collision resistance, everywhere preimage resistance implies 'ordinary' (domain-based) preimage resistance. We show the implications of our result for iterated hash functions and hash chains, where the latter is related to the Winternitz one-time signature scheme.
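
    To make the domain/range split concrete, the two flavors can be written out as advantage definitions in the Rogaway-Shrimpton style. The notation below is ours, a sketch rather than the paper's exact formalization, for a keyed hash H with key space K, domain D, and range R.

        % Domain-oriented (ordinary) preimage resistance: the target digest
        % is produced by hashing a random domain point.
        \mathrm{Adv}^{\mathrm{pre}}_{H}(A) =
          \Pr\!\left[\, K \leftarrow\$ \mathcal{K};\; x \leftarrow\$ D;\;
            y \leftarrow H_K(x);\; x' \leftarrow A(K, y) \;:\; H_K(x') = y \,\right]

        % Range-oriented "everywhere" preimage resistance: the target digest
        % is fixed before the key is chosen, so the bound must hold for
        % every point of the range.
        \mathrm{Adv}^{\mathrm{epre}}_{H}(A) =
          \max_{y \in R}\, \Pr\!\left[\, K \leftarrow\$ \mathcal{K};\;
            x' \leftarrow A(K) \;:\; H_K(x') = y \,\right]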

    Where's Crypto?: Automated Identification and Classification of Proprietary Cryptographic Primitives in Binary Code

    The continuing use of proprietary cryptography in embedded systems across many industry verticals, from physical access control systems and telecommunications to machine-to-machine authentication, presents a significant obstacle to black-box security-evaluation efforts. In-depth security analysis requires locating and classifying the algorithm in often very large binary images, rendering manual inspection, even when aided by heuristics, time-consuming. In this paper, we present a novel approach to automating the identification and classification of (proprietary) cryptographic primitives within binary code. Our approach is based on Data Flow Graph (DFG) isomorphism, previously proposed by Lestringant et al. Unfortunately, their DFG isomorphism approach is limited to known primitives only and relies on heuristics for selecting code fragments for analysis. By combining their approach with symbolic execution, we overcome these limitations and are able to extend the analysis into the domain of unknown, proprietary cryptographic primitives. To demonstrate that our proposal is practical, we develop various signatures, each targeted at a distinct class of cryptographic primitives, and present experimental evaluations for each of them on a set of binaries, both publicly available (and thus providing reproducible results) and proprietary. Lastly, we provide a free and open-source implementation of our approach, called Where's Crypto?, in the form of a plug-in for the popular IDA disassembler.
    Comment: A proof-of-concept implementation can be found at https://github.com/wheres-crypto/wheres-crypto
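
    As a rough illustration of the signature idea (much simplified: a tree-pattern match with wildcards rather than the paper's graph isomorphism on DFGs, and all names are ours), a signature can be a small operation pattern that is searched for inside the data-flow graph lifted from the binary.

        import java.util.List;

        public class DfgSignatureMatch {
            // A data-flow node: an operation label and its operand sub-graphs.
            record Node(String op, List<Node> args) {
                static Node leaf(String name) { return new Node(name, List.of()); }
            }

            // A signature is itself a small pattern graph; "ANY" matches any
            // sub-graph, so patterns capture structure, not concrete values.
            static boolean matches(Node pattern, Node node) {
                if (pattern.op().equals("ANY")) return true;
                if (!pattern.op().equals(node.op())) return false;
                if (pattern.args().size() != node.args().size()) return false;
                for (int i = 0; i < pattern.args().size(); i++) {
                    if (!matches(pattern.args().get(i), node.args().get(i))) return false;
                }
                return true;
            }

            // Does the signature occur anywhere in the code's DFG?
            static boolean contains(Node pattern, Node node) {
                if (matches(pattern, node)) return true;
                return node.args().stream().anyMatch(a -> contains(pattern, a));
            }

            public static void main(String[] args) {
                Node any = Node.leaf("ANY");
                // Signature: an addition fed by an XOR -- a crude ARX indicator.
                Node sig = new Node("ADD",
                        List.of(new Node("XOR", List.of(any, any)), any));
                // DFG lifted from the expression (x ^ y) + k
                Node dfg = new Node("ADD", List.of(
                        new Node("XOR", List.of(Node.leaf("x"), Node.leaf("y"))),
                        Node.leaf("k")));
                System.out.println("ARX-like pattern found: " + contains(sig, dfg));
            }
        }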