1,458 research outputs found

    On the Design of Secure and Fast Double Block Length Hash Functions

    In this work, the security of rate-1 double block length hash functions, which are based on a block cipher with a block length of n bits and a key length of 2n bits, is reconsidered. Counter-examples and new attacks are presented on this general class of double block length hash functions with rate 1, which expose flaws in the necessary conditions given by Satoh et al. and Hirose. Preimage and second preimage attacks are presented on Hirose's two examples, which were left as an open problem. Although all rate-1 hash functions in this general class thus fail to be optimally (second) preimage resistant, the necessary conditions are refined to ensure that this general class of rate-1 hash functions is optimally secure against collision attacks. In particular, two typical examples, designed under the refined conditions, are proven to be indifferentiable from a random oracle in the ideal cipher model. The security results are extended to a new class of double block length hash functions with rate 1, where one block cipher used in the compression function has a key length equal to the block length, while the other has a key length twice the block length.
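    As a concrete reference point, here is a minimal Python sketch of a double block length compression function in the style of Hirose's well-known two-call construction, instantiated with AES-256 so that the block length is n = 128 bits and the key length is 2n = 256 bits. The choice of AES-256 and of the constant C are illustrative assumptions; this is the classic rate-1/2 scheme, not one of the rate-1 examples attacked in the paper.

        # Hirose-style double block length compression (illustrative sketch).
        # Requires the third-party 'cryptography' package.
        from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

        N = 16                            # block length in bytes (n = 128 bits)
        C = b"\x01" + b"\x00" * (N - 1)   # any fixed nonzero constant

        def E(key: bytes, block: bytes) -> bytes:
            """Stand-in for the ideal cipher: one AES-256 block encryption."""
            enc = Cipher(algorithms.AES(key), modes.ECB()).encryptor()
            return enc.update(block) + enc.finalize()

        def xor(a: bytes, b: bytes) -> bytes:
            return bytes(x ^ y for x, y in zip(a, b))

        def compress(g: bytes, h: bytes, m: bytes) -> tuple:
            """One step: 2n-bit state (g, h), n-bit message block m.
            Both cipher calls share the 2n-bit key h || m."""
            key = h + m
            g_next = xor(E(key, g), g)
            masked = xor(g, C)
            h_next = xor(E(key, masked), masked)
            return g_next, h_next

        g, h = b"\x00" * N, b"\xff" * N
        g, h = compress(g, h, b"message-block-16")   # one 16-byte block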

    Preimage resistance beyond the birthday bound: Double-length hashing revisited

    Security proofs are an essential part of modern cryptography. Often the challenge is not to come up with appropriate schemes but rather to technically prove that these satisfy the desired security properties. We provide, for the first time, techniques for proving asymptotically optimal preimage resistance bounds for block cipher based double-length, double-call hash functions. More precisely, for some key length k > n we consider compression functions H: {0,1}^{k+n} → {0,1}^{2n} using two calls to an ideal block cipher with an n-bit block size. Optimally, an adversary trying to find a preimage for H should require Ω(2^{2n}) queries to the underlying block cipher. As a matter of fact, there have been several attempts to prove the preimage resistance of such compression functions, but no proof went beyond the Ω(2^n) barrier, leaving a huge gap to the optimal bound. In this paper, we introduce two new techniques for lifting this bound to Ω(2^{2n}). We demonstrate our new techniques for a simple and natural design of H: the concatenation of two instances of the well-known Davies-Meyer compression function.
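    The named design is easy to render in code. Below is a minimal Python sketch of two concatenated Davies-Meyer instances over an ideal cipher with n-bit blocks and k-bit keys, using AES-256 (n = 128, k = 2n = 256). How the two keys are derived from the inputs is an illustrative assumption (each call is keyed by the message block concatenated with the other half of the chaining value); the paper's concrete wiring may differ.

        # Concatenation of two Davies-Meyer calls: (k+n)-bit input -> 2n-bit output.
        from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

        N = 16  # n = 128 bits; AES-256 supplies the k = 2n-bit key input

        def E(key: bytes, block: bytes) -> bytes:
            enc = Cipher(algorithms.AES(key), modes.ECB()).encryptor()
            return enc.update(block) + enc.finalize()

        def xor(a: bytes, b: bytes) -> bytes:
            return bytes(x ^ y for x, y in zip(a, b))

        def davies_meyer(key: bytes, h: bytes) -> bytes:
            """Classic Davies-Meyer: DM(h) = E_key(h) XOR h."""
            return xor(E(key, h), h)

        def H(h1: bytes, h2: bytes, m: bytes) -> tuple:
            # Assumed keying: k = 2n, key_i = m || other chaining half.
            return davies_meyer(m + h2, h1), davies_meyer(m + h1, h2)

        out = H(b"\x00" * N, b"\x01" * N, b"0123456789abcdef")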

    Construction of secure and fast hash functions using nonbinary error-correcting codes


    Quantum-secure message authentication via blind-unforgeability

    Formulating and designing unforgeable authentication of classical messages in the presence of quantum adversaries has been a challenge, as the familiar classical notions of unforgeability do not directly translate into meaningful notions in the quantum setting. A particular difficulty is how to fairly capture the notion of "predicting an unqueried value" when the adversary can query in quantum superposition. In this work, we uncover serious shortcomings in existing approaches and propose a new definition. We then support its viability by a number of constructions and characterizations. Specifically, we demonstrate a function which is secure according to the existing definition by Boneh and Zhandry, but is clearly vulnerable to a quantum forgery attack, whereby a query supported only on inputs that start with 0 divulges the value of the function on an input that starts with 1. We then propose a new definition, which we call "blind-unforgeability" (or BU). This notion matches "intuitive unpredictability" in all examples studied thus far. It defines a function to be predictable if there exists an adversary which can use "partially blinded" oracle access to predict values in the blinded region. Our definition (BU) coincides with standard unpredictability (EUF-CMA) in the classical-query setting. We show that quantum-secure pseudorandom functions are BU-secure MACs. In addition, we show that BU satisfies a composition property (Hash-and-MAC) using "Bernoulli-preserving" hash functions, a new notion which may be of independent interest. Finally, we show that BU is amenable to security reductions by giving a precise bound on the extent to which quantum algorithms can deviate from their usual behavior due to the blinding in the BU security experiment. Comment: 23+9 pages; v3: published version, with one theorem statement in the summary of results corrected.
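    A classical-query rendering of the blinding experiment described above is a useful mental model: every input falls into the blinded region B independently with probability eps, the oracle answers ⊥ on B, and the adversary wins by tagging a point inside B. The sketch below uses HMAC-SHA256 as the MAC; all names and parameters are illustrative, and the quantum-superposition aspect of BU is of course not captured.

        # Classical toy version of the blind-unforgeability (BU) experiment.
        import hashlib
        import hmac
        import random

        def make_blinded_mac(key: bytes, eps: float, seed: int):
            rng = random.Random(seed)
            membership = {}                  # lazily sampled membership in B

            def in_B(x: bytes) -> bool:
                if x not in membership:
                    membership[x] = rng.random() < eps
                return membership[x]

            def mac(x: bytes) -> bytes:
                return hmac.new(key, x, hashlib.sha256).digest()

            def blinded_oracle(x: bytes):
                return None if in_B(x) else mac(x)   # None stands for ⊥

            def wins(x: bytes, tag: bytes) -> bool:
                # A forgery counts only on a blinded input.
                return in_B(x) and hmac.compare_digest(tag, mac(x))

            return blinded_oracle, wins

        oracle, wins = make_blinded_mac(b"secret-key", eps=0.25, seed=42)
        tag = oracle(b"hello")               # None whenever b"hello" is in B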

    Quantum attacks on Bitcoin, and how to protect against them

    The key cryptographic protocols used to secure the internet and financial transactions of today are all susceptible to attack by the development of a sufficiently large quantum computer. One particular area at risk is cryptocurrencies, a market currently worth over 150 billion USD. We investigate the risk of Bitcoin, and other cryptocurrencies, to attacks by quantum computers. We find that the proof-of-work used by Bitcoin is relatively resistant to substantial speedup by quantum computers in the next 10 years, mainly because specialized ASIC miners are extremely fast compared to the estimated clock speed of near-term quantum computers. On the other hand, the elliptic curve signature scheme used by Bitcoin is much more at risk and could be completely broken by a quantum computer as early as 2027, by the most optimistic estimates. We analyze an alternative proof-of-work called Momentum, based on finding collisions in a hash function, that is even more resistant to speedup by a quantum computer. We also review the available post-quantum signature schemes to see which one would best meet the security and efficiency requirements of blockchain applications. Comment: 21 pages, 6 figures. For a rough update on the progress of quantum devices and prognostications on the time from now to break digital signatures, see https://www.quantumcryptopocalypse.com/quantum-moores-law
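    The Momentum idea, using a hash collision as the proof of work, can be sketched with a birthday search. Classically this costs about 2^(b/2) hash calls for a b-bit truncation, and known quantum collision algorithms buy only a modest further speedup at a large memory cost, which is the intuition behind the abstract's claim. SHA-256, the truncation width, and the validity rule below are illustrative stand-ins, not Momentum's actual parameters.

        # Birthday-style collision search as a Momentum-like proof of work.
        import hashlib

        def h(header: bytes, nonce: int, bits: int) -> int:
            d = hashlib.sha256(header + nonce.to_bytes(8, "big")).digest()
            return int.from_bytes(d, "big") >> (256 - bits)   # keep top `bits`

        def find_collision(header: bytes, bits: int = 24):
            """Return two nonces whose truncated hashes collide
            (about 2^(bits/2) evaluations in expectation)."""
            seen = {}
            nonce = 0
            while True:
                v = h(header, nonce, bits)
                if v in seen:
                    return seen[v], nonce
                seen[v] = nonce
                nonce += 1

        a, b = find_collision(b"example-block-header")   # fast demo at 24 bits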

    Random hypergraphs for hashing-based data structures

    This thesis concerns dictionaries and related data structures that rely on providing several random possibilities for storing each key. Imagine information on a set S of m = |S| keys should be stored in n memory locations, indexed by [n] = {1,…,n}. Each object x ∈ S is assigned a small set e(x) ⊆ [n] of locations by a random hash function, independently of other objects. Information on x must then be stored in the locations from e(x) only. It is possible that too many objects compete for the same locations, in particular if the load c = m/n is high. Successfully storing all information may then be impossible. For most distributions of e(x), however, success or failure can be predicted very reliably, since the success probability is close to 1 for loads c less than a certain load threshold c* and close to 0 for loads greater than this threshold. We mainly consider two types of data structures:
    • A cuckoo hash table is a dictionary data structure where each key x ∈ S is stored together with an associated value f(x) in one of the memory locations with an index from e(x). The distribution of e(x) is controlled by the hashing scheme. We analyse three known hashing schemes and determine their exact load thresholds. The schemes are unaligned blocks, double hashing, and a scheme for dynamically growing key sets.
    • A retrieval data structure also stores a value f(x) for each x ∈ S. This time, the values stored in the memory locations from e(x) must satisfy a linear equation that characterises the value f(x). The resulting data structure is extremely compact, but unusual. It cannot answer questions of the form "is y ∈ S?". Given a key y, it returns a value z. If y ∈ S, then z = f(y) is guaranteed; otherwise z may be an arbitrary value. We consider two new hashing schemes, where the elements of e(x) are contained in one or two contiguous blocks. This yields good access times on a word RAM and high cache efficiency.
    An important question is whether these types of data structures can be constructed in linear time. The success probability of a natural linear-time greedy algorithm exhibits, once again, threshold behaviour with respect to the load c. We identify a hashing scheme that leads to a particularly high threshold value in this regard. In the mathematical model, the memory locations [n] correspond to vertices, and the sets e(x) for x ∈ S correspond to hyperedges. Three properties of the resulting hypergraphs turn out to be important: peelability, solvability and orientability. Large parts of this thesis therefore examine how hyperedge distribution and load affect the probabilities with which these properties hold, and derive corresponding thresholds. Translated back into the world of data structures, we achieve low access times, high memory efficiency and low construction times. We complement and support the theoretical results by experiments.
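    The cuckoo hash table described in this abstract is simple to sketch: each key has |e(x)| = 2 admissible slots, and insertion evicts a resident item to its other slot when both are occupied. The hash functions and the eviction bound below are illustrative choices, and real implementations trigger a rehash when the kick limit is exceeded.

        # Toy cuckoo hash table with two choices per key.
        import random

        class CuckooTable:
            def __init__(self, n: int):
                self.n = n
                self.slots = [None] * n              # entries are (key, value)
                self.seeds = (random.random(), random.random())

            def e(self, key):
                """The two admissible locations e(key)."""
                return [hash((key, s)) % self.n for s in self.seeds]

            def insert(self, key, value, max_kicks: int = 100) -> bool:
                item = (key, value)
                pos = self.e(key)[0]
                for _ in range(max_kicks):
                    if self.slots[pos] is None:
                        self.slots[pos] = item
                        return True
                    item, self.slots[pos] = self.slots[pos], item   # evict resident
                    p0, p1 = self.e(item[0])
                    pos = p1 if pos == p0 else p0    # send evictee to its other slot
                return False                         # load too high: rehash needed

            def get(self, key):
                for pos in self.e(key):
                    if self.slots[pos] is not None and self.slots[pos][0] == key:
                        return self.slots[pos][1]
                return None

        t = CuckooTable(n=8)
        t.insert("x", 42)
        assert t.get("x") == 42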

    Design and Analysis of Cryptographic Hash Functions

    Cryptographic Hash Functions (CHFs) are building blocks of many cryptographic algorithms. For instance, they are indispensable tools for efficient digital signatures and authentication tags. Their security properties have a tremendous impact on the security level of systems that use cryptographic hashing. This thesis analyzes CHFs and studies design principles for the construction of secure and efficient CHFs. The dissertation investigates the security of both dedicated hash functions (BMW, Shabal, SIMD, BLAKE2, Skein) and hash functions based on block ciphers (Crypton, Hierocrypt-3, IDEA, SAFER++, Square). The main cryptanalytic tools applied are truncated differential, rotational, and shift analysis. The findings show weaknesses in the analyzed designs.
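    Of the techniques named above, rotational analysis is the easiest to demonstrate: in ARX designs, XOR and rotation commute exactly, but modular addition commutes with rotation only probabilistically, which is what rotational distinguishers exploit. The quick empirical check below (word size and trial count are arbitrary choices) reproduces the well-known probability of about 2^-1.415 ≈ 0.375 for rotation by one bit.

        # Empirical probability that rotation commutes with modular addition.
        import random

        W = 32
        MASK = (1 << W) - 1

        def rotl(x: int, r: int) -> int:
            return ((x << r) | (x >> (W - r))) & MASK

        def rotational_survival(r: int, trials: int = 100_000) -> float:
            hits = 0
            for _ in range(trials):
                x, y = random.getrandbits(W), random.getrandbits(W)
                lhs = (rotl(x, r) + rotl(y, r)) & MASK
                rhs = rotl((x + y) & MASK, r)
                hits += (lhs == rhs)
            return hits / trials

        print(rotational_survival(r=1))   # roughly 2**-1.415, i.e. about 0.375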