
    Decorrelation: A Theory for Block Cipher Security

    Pseudorandomness is a classical model for the security of block ciphers. In this paper we propose convenient tools in order to study it in connection with the Shannon Theory, the Carter-Wegman universal hash functions paradigm, and the Luby-Rackoff approach. This enables the construction of new ciphers with security proofs under specific models. We show how to ensure security against basic differential and linear cryptanalysis and even more general attacks. We propose practical construction schemes.

    Quantum authentication with key recycling

    We show that a family of quantum authentication protocols introduced in [Barnum et al., FOCS 2002] can be used to construct a secure quantum channel and additionally recycle all of the secret key if the message is successfully authenticated, and recycle part of the key if tampering is detected. We give a full security proof that constructs the secure channel given only insecure noisy channels and a shared secret key. We also prove that the number of recycled key bits is optimal for this family of protocols, i.e., there exists an adversarial strategy to obtain all non-recycled bits. Previous works recycled less key and only gave partial security proofs, since they did not consider all possible distinguishers (environments) that may be used to distinguish the real setting from the ideal secure quantum channel and secret key resource.
    Comment: 38+17 pages, 13 figures. v2: constructed ideal secure channel and secret key resource have been slightly redefined; also added a proof in the appendix for quantum authentication without key recycling that has better parameters and only requires weak purity testing code

    On the Key-Uncertainty of Quantum Ciphers and the Computational Security of One-way Quantum Transmission

    We consider the scenario where Alice wants to send a secret (classical) n-bit message to Bob using a classical key, and where only one-way transmission from Alice to Bob is possible. In this case, quantum communication cannot help to obtain perfect secrecy with key length smaller than n. We study the question of whether there might still be fundamental differences between the case where quantum as opposed to classical communication is used. In this direction, we show that there exist ciphers with perfect security producing quantum ciphertext where, even if an adversary knows the plaintext and applies an optimal measurement on the ciphertext, his Shannon uncertainty about the key used is almost maximal. This is in contrast to the classical case where the adversary always learns n bits of information on the key in a known plaintext attack. We also show that there is a limit to how different the classical and quantum cases can be: the most probable key, given matching plain- and ciphertexts, has the same probability in both the quantum and the classical cases. We suggest an application of our results in the case where only a short secret key is available and the message is much longer.
    Comment: 19 pages, 2 figures. This is a revised version of an earlier version that appeared in the proc. of Eurocrypt'04: LNCS 3027, 200
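
    As a concrete illustration of the classical baseline this abstract contrasts against, the Python sketch below (an illustrative assumption, not code from the paper) uses a one-time-pad-style XOR cipher: with a classical perfectly secret cipher, a single known plaintext reveals every bit of the key to the adversary.

        import secrets

        def xor_encrypt(message: bytes, key: bytes) -> bytes:
            # Classical perfectly secret cipher: ciphertext = message XOR key.
            return bytes(m ^ k for m, k in zip(message, key))

        # Known-plaintext attack on the classical cipher: one matching
        # (plaintext, ciphertext) pair reveals the key completely, i.e. the
        # adversary learns all n bits of information about it.
        key = secrets.token_bytes(16)
        plaintext = b"attack at dawn!!"          # 16 bytes, matching the key length
        ciphertext = xor_encrypt(plaintext, key)
        recovered_key = bytes(c ^ m for c, m in zip(ciphertext, plaintext))
        assert recovered_key == key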

    Multidimensional linear cryptanalysis

    Linear cryptanalysis is an important tool for studying the security of symmetric ciphers. In 1993 Matsui proposed two algorithms, called Algorithm 1 and Algorithm 2, for recovering information about the secret key of a block cipher. The algorithms exploit a biased probabilistic relation between the input and output of the cipher. This relation is called the (one-dimensional) linear approximation of the cipher. Mathematically, the problem of key recovery is a binary hypothesis testing problem that can be solved with appropriate statistical tools. The same mathematical tools can be used for realising a distinguishing attack against a stream cipher. The distinguisher outputs whether the given sequence of keystream bits is derived from a cipher or a random source. Sometimes, it is even possible to recover a part of the initial state of the LFSR used in a keystream generator. Several authors considered using many one-dimensional linear approximations simultaneously in a key recovery attack and various solutions have been proposed. In this thesis a unified methodology for using multiple linear approximations in distinguishing and key recovery attacks is presented. This methodology, which we call multidimensional linear cryptanalysis, allows removing unnecessary and restrictive assumptions. We model the key recovery problems mathematically as hypothesis testing problems and show how to use standard statistical tools for solving them. We also show how the data complexity of linear cryptanalysis on stream ciphers and block ciphers can be reduced by using multiple approximations. We use well-known mathematical theory for comparing different statistical methods for solving the key recovery problems. We also test the theory in practice with reduced round Serpent. Based on our results, we give recommendations on how multidimensional linear cryptanalysis should be used.
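
    To make the notion of a biased one-dimensional linear approximation concrete, here is a small Python sketch (illustrative only; the 4-bit S-box is PRESENT's, chosen as an assumption rather than taken from the thesis) that scans all input/output masks for the approximation with the largest bias, the kind of relation Matsui's Algorithm 1 and Algorithm 2 exploit.

        # Bias of a one-dimensional linear approximation <a,x> XOR <b,S(x)> over a
        # toy 4-bit S-box (PRESENT's S-box, used purely as an illustration).
        SBOX = [0xC, 0x5, 0x6, 0xB, 0x9, 0x0, 0xA, 0xD,
                0x3, 0xE, 0xF, 0x8, 0x4, 0x7, 0x1, 0x2]

        def parity(x: int) -> int:
            # GF(2) inner product reduces to the parity of the masked value.
            return bin(x).count("1") & 1

        def bias(a: int, b: int) -> float:
            # Pr[<a,x> = <b,S(x)>] - 1/2 over all 16 inputs.
            hits = sum(parity(a & x) == parity(b & SBOX[x]) for x in range(16))
            return hits / 16 - 0.5

        # Scan all non-trivial mask pairs for the strongest linear approximation.
        best = max(((a, b, bias(a, b)) for a in range(1, 16) for b in range(1, 16)),
                   key=lambda t: abs(t[2]))
        print("best masks a=%x b=%x, bias=%+.3f" % best)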

    Secure Block Ciphers - Cryptanalysis and Design


    Some Words on Cryptanalysis of Stream Ciphers

    In the world of cryptography, stream ciphers are known as primitives used to ensure privacy over a communication channel. One common way to build a stream cipher is to use a keystream generator to produce a pseudo-random sequence of symbols. In such algorithms, the ciphertext is the sum of the keystream and the plaintext, resembling the one-time pad principle. Although the idea behind stream ciphers is simple, serious investigation of these primitives started only in the late 20th century. Therefore, cryptanalysis and design of stream ciphers are important. In recent years, many designs of stream ciphers have been proposed in an effort to find a proper candidate to be chosen as a world standard for data encryption. That potential candidate should be proven good by time and by the results of cryptanalysis. Different methods of analysis, in fact, explain how a stream cipher should be constructed. Thus, techniques for cryptanalysis are also important. This thesis starts with an overview of cryptography in general, and introduces the reader to modern cryptography. Later, we focus on basic principles of design and analysis of stream ciphers. Since statistical methods are the most important cryptanalysis techniques, they will be described in detail. The practice of statistical methods reveals several bottlenecks when implementing various analysis algorithms. For example, a common property of a cipher to produce n-bit words instead of just bits makes it more natural to perform a multidimensional analysis of such a design. However, in practice, one often has to truncate the words simply because the tools needed for analysis are missing. We propose a set of algorithms and data structures for multidimensional cryptanalysis when distributions over a large probability space have to be constructed. This thesis also includes results of cryptanalysis for various cryptographic primitives, such as A5/1, Grain, SNOW 2.0, Scream, Dragon, VMPC, RC4, and RC4A. Most of these results were achieved with the help of intensive use of the proposed tools for cryptanalysis.
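
    As a rough illustration of the additive stream cipher principle described above, the Python sketch below (a toy assumption, not one of the ciphers analysed in the thesis) derives a keystream from a small LFSR and XORs it with the plaintext; decryption is the same operation with the same seed.

        # Additive stream cipher sketch: ciphertext = plaintext XOR keystream.
        # The 16-bit Fibonacci LFSR and its taps are an arbitrary toy choice;
        # real keystream generators are far more involved.
        def lfsr_keystream(state: int, taps=(0, 2, 3, 5), nbits: int = 16):
            while True:
                feedback = 0
                for t in taps:
                    feedback ^= (state >> t) & 1
                yield state & 1                      # output the low bit
                state = (state >> 1) | (feedback << (nbits - 1))

        def stream_xor(data: bytes, seed: int) -> bytes:
            ks = lfsr_keystream(seed)
            out = bytearray()
            for byte in data:
                k = 0
                for _ in range(8):                   # collect 8 keystream bits
                    k = (k << 1) | next(ks)
                out.append(byte ^ k)
            return bytes(out)

        msg = b"hello stream"
        ct = stream_xor(msg, 0xACE1)
        assert stream_xor(ct, 0xACE1) == msg         # decryption = re-encryption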