    Multidimensional linear cryptanalysis

    Linear cryptanalysis is an important tool for studying the security of symmetric ciphers. In 1993, Matsui proposed two algorithms, called Algorithm 1 and Algorithm 2, for recovering information about the secret key of a block cipher. The algorithms exploit a biased probabilistic relation between the input and output of the cipher. This relation is called the (one-dimensional) linear approximation of the cipher. Mathematically, the problem of key recovery is a binary hypothesis testing problem that can be solved with appropriate statistical tools. The same mathematical tools can be used for realising a distinguishing attack against a stream cipher. The distinguisher outputs whether the given sequence of keystream bits is derived from a cipher or a random source. Sometimes, it is even possible to recover a part of the initial state of the LFSR used in a keystream generator. Several authors have considered using many one-dimensional linear approximations simultaneously in a key recovery attack, and various solutions have been proposed. In this thesis, a unified methodology for using multiple linear approximations in distinguishing and key recovery attacks is presented. This methodology, which we call multidimensional linear cryptanalysis, allows removing unnecessary and restrictive assumptions. We model the key recovery problems mathematically as hypothesis testing problems and show how to use standard statistical tools for solving them. We also show how the data complexity of linear cryptanalysis on stream ciphers and block ciphers can be reduced by using multiple approximations. We use well-known mathematical theory for comparing different statistical methods for solving the key recovery problems. We also test the theory in practice with reduced-round Serpent. Based on our results, we give recommendations on how multidimensional linear cryptanalysis should be used.
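
    To make the counting step concrete, the following is a minimal Python sketch of Matsui's Algorithm 1 for a single approximation; the function names and the bias_sign convention are illustrative assumptions, not code from the thesis.

    # Sketch of Matsui's Algorithm 1: a hypothesis test on one linear
    # approximation <alpha, p> XOR <beta, c> = <gamma, k>, given known
    # (plaintext, ciphertext) pairs. Masks and names are illustrative.

    def parity(x: int) -> int:
        """Parity of the bits of x, i.e. <mask, value> over GF(2)."""
        return bin(x).count("1") & 1

    def algorithm1(pairs, alpha, beta, bias_sign):
        """Guess the key parity <gamma, k> from a list of known pairs.

        pairs     : list of (plaintext, ciphertext) integers
        alpha     : input mask; beta : output mask
        bias_sign : +1 if the approximation holds with probability
                    1/2 + eps for eps > 0, -1 otherwise
        """
        ones = sum(parity(alpha & p) ^ parity(beta & c) for p, c in pairs)
        # With a positive bias the relation equals the key parity more
        # often than not, so a minority of ones indicates key parity 0.
        if bias_sign > 0:
            return 0 if ones < len(pairs) / 2 else 1
        return 1 if ones < len(pairs) / 2 else 0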

    Linear Cryptanalysis of DES with Asymmetries

    Linear cryptanalysis of DES, proposed by Matsui in 1993, has had a seminal impact on symmetric-key cryptography, having seen massive research efforts over the past two decades. It has spawned many variants, including multidimensional and zero-correlation linear cryptanalysis. These variants can claim best attacks on several ciphers, including PRESENT, Serpent, and CLEFIA. For DES, none of these variants have improved upon Matsui's original linear cryptanalysis, which has been the best known-plaintext key-recovery attack on the cipher ever since. In a revisit, Junod concluded that when using 2^43 known plaintexts, this attack has a complexity of 2^41 DES evaluations. His analysis relies on the standard assumptions of right-key equivalence and wrong-key randomisation. In this paper, we first investigate the validity of these fundamental assumptions when applied to DES. For the right key, we observe that strong linear approximations of DES have more than just one dominant trail and, thus, that the right keys are in fact inequivalent with respect to linear correlation. We therefore develop a new right-key model using Gaussian mixtures for approximations with several dominant trails. For the wrong key, we observe that the correlation of a strong approximation after the partial decryption with a wrong key still shows much non-randomness. To remedy this, we propose a novel wrong-key model that expresses the wrong-key linear correlation using a version of DES with more rounds. We extend the two models to the general case of multiple approximations, propose a likelihood-ratio classifier based on this generalisation, and show that it performs better than the classical Bayesian classifier. On the practical side, we find that the distributions of right-key correlations for multiple linear approximations of DES exhibit exploitable asymmetries. In particular, not all sign combinations in the correlation values are possible. This results in our improved multiple linear attack on DES using 4 linear approximations at a time. The lowest computational complexity of 2^38.86 DES evaluations is achieved when using 2^42.78 known plaintexts. Alternatively, using 2^41 plaintexts results in a computational complexity of 2^49.75 DES evaluations. We perform practical experiments to confirm our model. To our knowledge, this is the best attack on DES.
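
    To illustrate the likelihood-ratio idea in Python (a minimal sketch under a simplifying independence assumption; all mixture weights, means, and variances are placeholders, and the zero-mean wrong-key Gaussian stands in for the paper's more-round DES model):

    import numpy as np

    def gauss_pdf(x, mu, var):
        return np.exp(-(x - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

    def mixture_pdf(x, weights, means, var):
        """Gaussian-mixture density: several dominant trails, several modes."""
        return sum(w * gauss_pdf(x, m, var) for w, m in zip(weights, means))

    def llr_score(corrs, right_models, wrong_var):
        """Log-likelihood ratio for a vector of empirical correlations.

        corrs        : one observed correlation per approximation
        right_models : list of (weights, means, var) mixture parameters
        wrong_var    : variance of the simplified wrong-key model
        """
        score = 0.0
        for c, (weights, means, var) in zip(corrs, right_models):
            score += np.log(mixture_pdf(c, weights, means, var))
            score -= np.log(gauss_pdf(c, 0.0, wrong_var))
        return score  # classify as right key if score exceeds a threshold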

    Multivariate Profiling of Hulls for Linear Cryptanalysis

    Extensions of linear cryptanalysis making use of multiple approximations, such as multiple and multidimensional linear cryptanalysis, are an important tool in symmetric-key cryptanalysis, among others being responsible for the best known attacks on ciphers such as Serpent and PRESENT. At CRYPTO 2015, Huang et al. provided a refined analysis of the key-dependent capacity leading to a refined key equivalence hypothesis, albeit at the cost of additional assumptions. Their analysis was extended by Blondeau and Nyberg to also cover an updated wrong-key randomization hypothesis, using similar assumptions. However, a recent result by Nyberg shows the equivalence of linear dependence and statistical dependence of linear approximations, which essentially invalidates a crucial assumption on which all these multidimensional models are based. In this paper, we develop a model for linear cryptanalysis using multiple linearly independent approximations which takes key-dependence into account and complies with Nyberg's result. Our model considers an arbitrary multivariate joint distribution of the correlations and, in particular, avoids any assumptions regarding normality. The analysis of this distribution is then tailored to concrete ciphers in a practically feasible way by combining a signal/noise decomposition approach for the linear hulls with a profiling of the actual multivariate distribution of the signal correlations for a large number of keys, thereby entirely avoiding assumptions regarding the shape of this distribution. As an application of our model, we provide an attack on 26 rounds of PRESENT which is faster and requires less data than previous attacks, while using more realistic assumptions and far fewer approximations. We successfully extend the attack to 27 rounds, obtaining the first 27-round attack which takes key-dependence into account.
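
    As a rough illustration of the profiling step (a sketch under stated assumptions: trail_correlation(key, trail) is a hypothetical helper returning one dominant trail's signed correlation under a fixed key; the paper's actual procedure is cipher-specific):

    import numpy as np

    def profile_hulls(sample_keys, hulls, trail_correlation):
        """Return an (n_keys x n_hulls) matrix of signal correlations.

        Each row is the correlation vector of the chosen approximations
        under one key; together the rows form an empirical model of the
        multivariate distribution, used directly in place of a fitted
        normal distribution.
        """
        profile = np.empty((len(sample_keys), len(hulls)))
        for i, key in enumerate(sample_keys):
            for j, trails in enumerate(hulls):
                # Signal part of the hull: sum of its dominant trails'
                # signed correlations; the rest is treated as noise.
                profile[i, j] = sum(trail_correlation(key, t) for t in trails)
        return profile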

    Improved Integral and Zero-correlation Linear Cryptanalysis of Reduced-round CLEFIA Block Cipher

    CLEFIA is a block cipher developed by Sony Corporation in 2007. It is a recommended cipher of CRYPTREC and has been adopted as an ISO/IEC international standard in lightweight cryptography. In this paper, some new 9-round zero-correlation linear distinguishers of CLEFIA are constructed with the input masks and output masks being independent, which allow multiple zero-correlation linear attacks on 14/15-round CLEFIA-192/256 with the partial sum technique. Furthermore, the relations between integral distinguishers and zero-correlation linear approximations are improved, and some new integral distinguishers over 9 rounds are deduced from zero-correlation linear approximations. By using these integral distinguishers and the partial sum technique, the previous integral results on CLEFIA are improved. The two results either cover one more round or have lower time complexity than previous attacks by means of integral and zero-correlation linear cryptanalysis.
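
    For reference (standard notation, added for clarity rather than taken from the abstract), the linear correlation of an input mask \alpha and output mask \beta over an n-bit cipher E_k is

        c_k(\alpha, \beta) = 2^{-n} \sum_{x \in \{0,1\}^n} (-1)^{\langle \alpha, x \rangle \oplus \langle \beta, E_k(x) \rangle},

    and a zero-correlation linear approximation is one with c_k(\alpha, \beta) = 0 for every key k; the distinguishers above are sets of such approximations.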

    Multidimensional Zero-Correlation Linear Cryptanalysis of the Block Cipher KASUMI

    The block cipher KASUMI is widely used for security in many synchronous wireless standards. It was proposed by ETSI SAGE for usage in the 3GPP (3rd Generation Partnership Project) ciphering algorithms in 2001. There is a great deal of cryptanalytic work on KASUMI; however, its security evaluation against the recent zero-correlation linear attacks has been lacking so far. In this paper, we select some special input masks to refine the general 5-round zero-correlation linear approximations, combine them with some observations on the FL functions, and then propose a 6-round zero-correlation linear attack on KASUMI. Moreover, zero-correlation linear attacks on the last 7 rounds of KASUMI are also introduced under some weak-key conditions. These weak keys make up a 2^-14 fraction of the whole key space. The new zero-correlation linear attack on 6-round KASUMI needs about 2^85 encryptions with 2^62.8 known plaintexts. For the weak-key attack on the last 7 rounds, the data complexity is about 2^62.1 known plaintexts and the time complexity is about 2^110.5 encryptions.
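
    For scale (simple arithmetic, not stated in the abstract): KASUMI takes a 128-bit key, so a 2^-14 fraction of the key space corresponds to 2^128 * 2^-14 = 2^114 weak keys.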

    Rigorous Upper Bounds on Data Complexities of Block Cipher Cryptanalysis

    Statistical analysis of symmetric-key attacks aims to obtain an expression for the data complexity, which is the number of plaintext-ciphertext pairs needed to achieve the parameters of the attack. Existing statistical analyses invariably use some kind of approximation, the most common being the approximation of the distribution of a sum of random variables by a normal distribution. Such an approach leads to expressions for data complexities which are inherently approximate. Prior works do not provide any analysis of the error involved in such approximations. In contrast, this paper takes a rigorous approach to analysing attacks on block ciphers. In particular, no approximations are used. Expressions for upper bounds on the data complexities of several basic and advanced attacks are obtained. The analysis is based on the hypothesis testing framework. Probabilities of Type-I and Type-II errors are upper bounded using standard tail inequalities. In the cases of single linear and differential cryptanalysis, we use the Chernoff bound. For the cases of multiple linear and multiple differential cryptanalysis, Hoeffding bounds are used. This allows bounding the error probabilities and obtaining expressions for data complexities. We believe that our method provides important results for the attacks considered here and, more generally, the techniques that we develop should have much wider applicability.
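
    As a flavour of such bounds (an illustrative textbook instance, not the paper's exact expressions): for a single linear approximation with bias \epsilon, Hoeffding's inequality

        P(|S_N / N - p| \ge t) \le 2 e^{-2 N t^2}

    applied with the decision threshold 1/2 + \epsilon/2 (so t = \epsilon/2) makes both the Type-I and Type-II error probabilities at most \delta whenever N \ge (2/\epsilon^2) \ln(2/\delta), which is an exact upper bound on the data complexity rather than a normal approximation.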

    Aerodynamic Stability of Satellites in Elliptic Low Earth Orbits

    Topical observations of the thermosphere at altitudes below 200 km are of great benefit in advancing the understanding of the global distribution of mass, composition, dynamical responses to geomagnetic forcing, and momentum transfer via waves. The perceived risks associated with such low-altitude and short-duration orbits have prohibited the launch of Discovery-class missions. Miniaturization of instruments such as mass spectrometers and advances in nano-satellite technology, associated with the relatively low cost of nano-satellite manufacturing and operation, open an avenue for performing low-altitude missions. The time-dependent coefficients of the second-order non-homogeneous ODE which describes the motion have a doubly periodic shape. Hence, they will be approximated using Jacobi elliptic functions. Through a change of variables, the original ODE will be converted into Hill's ODE for stability analysis using Floquet theory. We are interested in how changes in the coefficients of the ODE affect the stability of the solution. The expected result will be an allowable range of parameters for which the motion is dynamically stable. A possible extension of the application is a computational tool for the rapid evaluation of the stability of entry or re-entry vehicles in rarefied flow regimes and of satellites flying in relatively low orbits.
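
    As a sketch of the Floquet-theory stability test described above (the coefficient q(t) below is a stand-in Mathieu-type example, not the paper's satellite model):

    # Build the monodromy matrix of Hill's equation y'' + q(t) y = 0 over
    # one period and test |trace| < 2 (|trace| = 2 is the stability boundary).
    import numpy as np
    from scipy.integrate import solve_ivp

    def monodromy_trace(q, period):
        """Trace of the monodromy matrix of y'' + q(t) y = 0."""
        def rhs(t, y):
            return [y[1], -q(t) * y[0]]
        cols = []
        for y0 in ([1.0, 0.0], [0.0, 1.0]):  # fundamental solutions
            sol = solve_ivp(rhs, (0.0, period), y0, rtol=1e-10, atol=1e-12)
            cols.append(sol.y[:, -1])        # state after one full period
        return np.trace(np.column_stack(cols))

    # Example: Mathieu-type coefficient q(t) = 1 + 0.3 cos(t), period 2*pi.
    trace = monodromy_trace(lambda t: 1.0 + 0.3 * np.cos(t), 2 * np.pi)
    print("stable" if abs(trace) < 2.0 else "unstable")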

    Polynomial Response Surface Approximations for the Multidisciplinary Design Optimization of a High Speed Civil Transport

    Surrogate functions have become an important tool in multidisciplinary design optimization (MDO) to deal with noisy functions, high computational cost, and the practical difficulty of integrating legacy disciplinary computer codes. A combination of mathematical, statistical, and engineering techniques, well known in other contexts, has made polynomial surrogate functions viable for MDO. Despite the obvious limitations imposed by sparse high-fidelity data in high dimensions and the locality of low-order polynomial approximations, the success of the panoply of techniques based on polynomial response surface approximations for MDO shows that the implementation details are more important than the underlying approximation method (polynomial, spline, DACE, kernel regression, etc.). This paper surveys some of the ancillary techniques (statistics, global search, parallel computing, variable complexity modeling) that augment the construction and use of polynomial surrogates.
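
    As a small illustration of the basic surrogate under discussion, a full quadratic response surface fitted by least squares (a generic sketch; the variable names and sample data are invented for illustration):

    import numpy as np
    from itertools import combinations_with_replacement

    def quadratic_design_matrix(X):
        """Columns: 1, each x_i, and every product x_i * x_j with i <= j."""
        n, d = X.shape
        cols = [np.ones(n)]
        cols += [X[:, i] for i in range(d)]
        cols += [X[:, i] * X[:, j]
                 for i, j in combinations_with_replacement(range(d), 2)]
        return np.column_stack(cols)

    def fit_response_surface(X, y):
        """Least-squares coefficients of the quadratic surrogate."""
        A = quadratic_design_matrix(X)
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        return coef

    def predict(coef, X):
        return quadratic_design_matrix(X) @ coef

    # Usage: surrogate of a noisy 2-D "simulation" on random samples.
    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(50, 2))
    y = 3 + X[:, 0] - 2 * X[:, 0] * X[:, 1] + 0.01 * rng.standard_normal(50)
    coef = fit_response_surface(X, y)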