950 research outputs found

    A survey on machine learning applied to symmetric cryptanalysis

    In this work we give a short review of recent progress in machine learning techniques applied to the cryptanalysis of symmetric ciphers, with a particular focus on artificial neural networks. We start with some terminology and basics of neural networks, and then classify the recent works into two categories: "black-box cryptanalysis", techniques that do not require prior information about the cipher, and "neuro-aided cryptanalysis", techniques used to improve existing cryptanalytic methods.

    Performance comparison between deep learning-based and conventional cryptographic distinguishers

    While many similarities between Machine Learning and cryptanalysis tasks exist, so far no major result in cryptanalysis has been reached with the aid of Machine Learning techniques. One exception is the recent work of Gohr, presented at CRYPTO 2019, where for the first time conventional cryptanalysis was combined with the use of neural networks to build a more efficient distinguisher and, consequently, a key recovery attack on Speck32/64. Along the same lines, in this work we propose two Deep Learning (DL) based distinguishers against the Tiny Encryption Algorithm (TEA) and its evolution RAIDEN. Both ciphers have block and key sizes twice those of Speck32/64. We show how these two distinguishers outperform a conventional statistical distinguisher that uses no prior information on the cipher, as well as a differential distinguisher based on the differential trails presented by Biryukov and Velichkov at FSE 2014. We also present some variations of the DL-based distinguishers, discuss some of their extra features, and propose some directions for future research.
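    As a rough illustration of the setup such DL-based distinguishers build on, the Python sketch below generates Gohr-style training data for a round-reduced TEA: ciphertext pairs obtained from plaintexts with a fixed XOR input difference are labelled 1, and pairs obtained from unrelated random plaintexts are labelled 0. The cycle count, the input difference and all helper names are illustrative assumptions, not the configuration used in the paper.

        import random

        DELTA = 0x9E3779B9   # TEA key schedule constant
        MASK = 0xFFFFFFFF    # 32-bit word mask

        def tea_encrypt(v0, v1, key, cycles=8):
            """Round-reduced TEA on a 64-bit block (two 32-bit words); full TEA uses 32 cycles."""
            k0, k1, k2, k3 = key
            s = 0
            for _ in range(cycles):
                s = (s + DELTA) & MASK
                v0 = (v0 + ((((v1 << 4) + k0) & MASK) ^ ((v1 + s) & MASK) ^ (((v1 >> 5) + k1) & MASK))) & MASK
                v1 = (v1 + ((((v0 << 4) + k2) & MASK) ^ ((v0 + s) & MASK) ^ (((v0 >> 5) + k3) & MASK))) & MASK
            return v0, v1

        def make_samples(n, input_diff=(0x80000000, 0x00000000), cycles=8):
            """Gohr-style dataset: label 1 = fixed input difference, label 0 = random plaintext pair."""
            data = []
            for _ in range(n):
                key = tuple(random.getrandbits(32) for _ in range(4))
                p0 = (random.getrandbits(32), random.getrandbits(32))
                if random.random() < 0.5:
                    p1 = (p0[0] ^ input_diff[0], p0[1] ^ input_diff[1])
                    label = 1
                else:
                    p1 = (random.getrandbits(32), random.getrandbits(32))
                    label = 0
                c0 = tea_encrypt(*p0, key, cycles)
                c1 = tea_encrypt(*p1, key, cycles)
                data.append((c0, c1, label))
            return data

    A neural network is then trained to predict the label from the ciphertext pair; accuracy significantly above 0.5 indicates the network has found a usable distinguishing property for the chosen number of cycles.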

    Monte Carlo Tree Search for automatic differential characteristics search: application to SPECK

    The search for differential characteristics on block ciphers is a difficult combinatorial problem. In this paper, we investigate the performance of an AI-originated technique, Single Player Monte-Carlo Tree Search (SP-MCTS), in finding good differential characteristics on ARX ciphers, with an application to the block cipher SPECK. In order to make this approach competitive, we include several heuristics, such as the combination of forward and backward searches, and achieve significantly faster results than state-of-the-art works that are not based on automatic solvers. We reach 9, 11, 13, 13 and 15 rounds for SPECK32, SPECK48, SPECK64, SPECK96 and SPECK128 respectively. In order to build our algorithm, we revisit Lipmaa and Moriai's algorithm for listing all optimal differential transitions through modular addition, and propose a variant to enumerate all transitions with probability close (up to a fixed threshold) to the optimal, while fixing a minor bug in the original algorithm.
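    For context, the Lipmaa-Moriai result revisited here gives a closed-form validity test and probability for an XOR differential (alpha, beta) -> gamma through addition modulo 2^n, which is what makes differential search on ARX ciphers tractable. The following minimal Python sketch implements only that classical formula, not the paper's enumeration variant; the default word size of 16 bits (matching SPECK32 words) is an assumption.

        def xdp_add(alpha, beta, gamma, n=16):
            """Probability of the XOR differential (alpha, beta) -> gamma through
            addition mod 2^n (Lipmaa-Moriai, FSE 2001, Theorem 2)."""
            mask = (1 << n) - 1

            def eq(x, y, z):
                # Bit i is set iff x, y and z all agree in position i.
                return ~(x ^ y) & ~(x ^ z) & mask

            # The differential is possible iff this expression vanishes.
            if eq(alpha << 1, beta << 1, gamma << 1) & (alpha ^ beta ^ gamma ^ (beta << 1)) & mask:
                return 0.0
            # Its probability is 2^-w, where w counts the non-MSB positions in which
            # alpha, beta and gamma do not all agree.
            w = bin(~eq(alpha, beta, gamma) & (mask >> 1)).count("1")
            return 2.0 ** (-w)

    For example, xdp_add(0x0001, 0x0000, 0x0001) returns 0.5: flipping only the least significant bit of one addend always flips the sum's LSB, and the carry out of that position stays unchanged with probability 1/2.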

    A Cipher-Agnostic Neural Training Pipeline with Automated Finding of Good Input Differences

    Neural cryptanalysis is the study of cryptographic primitives through machine learning techniques. Following Gohr’s seminal paper at CRYPTO 2019, a focus has been placed on improving the accuracy of such distinguishers against specific primitives, using dedicated training schemes, in order to obtain better key recovery attacks based on machine learning. These distinguishers are highly specialized and not trivially applicable to other primitives. In this paper, we focus on the opposite problem: building a generic pipeline for neural cryptanalysis. Our tool is composed of two parts. The first part is an evolutionary algorithm for the search of good input differences for neural distinguishers. The second part is DBitNet, a neural distinguisher architecture agnostic to the structure of the cipher. We show that this fully automated pipeline is competitive with a highly specialized approach, in particular for SPECK32 and SIMON32. We provide new neural distinguishers for several primitives (XTEA, LEA, HIGHT, SIMON128, SPECK128) and improve over the state-of-the-art for PRESENT, KATAN, TEA and GIMLI.
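    The evolutionary part of such a pipeline can be pictured with a toy sketch like the one below: candidate input differences for round-reduced SPECK32 are mutated by single bit flips and ranked by a cheap empirical proxy (the largest per-bit bias of the output difference). This is only an illustration of the idea; the paper's actual optimizer, fitness function and parameters differ, and every name and constant below is a hypothetical choice (independent random round keys replace the real key schedule for brevity).

        import random

        MASK16 = 0xFFFF

        def ror(x, r):
            return ((x >> r) | (x << (16 - r))) & MASK16

        def rol(x, r):
            return ((x << r) | (x >> (16 - r))) & MASK16

        def speck32_rounds(x, y, round_keys):
            """Round-reduced SPECK32 with externally supplied round keys."""
            for k in round_keys:
                x = ((ror(x, 7) + y) & MASK16) ^ k
                y = rol(y, 2) ^ x
            return x, y

        def bias_score(diff, rounds=5, samples=2000):
            """Fitness proxy: largest per-bit bias of the output difference over random samples."""
            dx, dy = diff
            counts = [0] * 32
            for _ in range(samples):
                keys = [random.getrandbits(16) for _ in range(rounds)]
                x, y = random.getrandbits(16), random.getrandbits(16)
                c0 = speck32_rounds(x, y, keys)
                c1 = speck32_rounds(x ^ dx, y ^ dy, keys)
                out = ((c0[0] ^ c1[0]) << 16) | (c0[1] ^ c1[1])
                for b in range(32):
                    counts[b] += (out >> b) & 1
            return max(abs(c / samples - 0.5) for c in counts)

        def evolve(generations=20, pop_size=16, rounds=5):
            """Toy evolutionary loop: keep the best quarter, refill by single-bit-flip mutations."""
            pop = [(random.getrandbits(16), random.getrandbits(16)) for _ in range(pop_size)]
            for _ in range(generations):
                parents = sorted(pop, key=lambda d: bias_score(d, rounds), reverse=True)[:pop_size // 4]
                pop = list(parents)
                while len(pop) < pop_size:
                    dx, dy = random.choice(parents)
                    bit = random.randrange(32)
                    if bit < 16:
                        dy ^= 1 << bit
                    else:
                        dx ^= 1 << (bit - 16)
                    pop.append((dx, dy))
            return max(pop, key=lambda d: bias_score(d, rounds))

    The best difference found this way would then serve as the fixed input difference when training a distinguisher such as DBitNet, analogous to the data-generation sketch shown earlier for TEA.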

    Allylic and Allenylic Dearomatization of Indoles promoted by Graphene Oxide via Covalent Grafting Activation Mode

    The site-selective allylative and allenylative dearomatization of indoles with alcohols is performed under a carbocatalytic regime in the presence of graphene oxide (GO, 10 wt% loading) as the promoter. Metal-free conditions, the absence of stoichiometric additives, environmentally friendly conditions (H2O/CH3CN, 55 °C, 6 h), broad substrate scope (33 examples, yields up to 92%) and excellent site- and stereoselectivity characterize the present methodology. Moreover, a covalent activation model exerted by the GO functionalities was corroborated by spectroscopic, experimental and computational evidence. Recovery and regeneration of the GO catalyst via a simple acidic treatment were also documented.
