55 research outputs found

    Analysis of BCNS and Newhope Key-exchange Protocols

    Lattice-based cryptographic primitives are believed to offer resilience against attacks by quantum computers. Following increasing interest from both companies and government agencies in building quantum computers, a number of works have proposed instantiations of practical post-quantum key-exchange protocols based on hard lattice problems, mainly the Ring Learning With Errors (R-LWE) problem. In this work we present an analysis of Ring-LWE based key-exchange mechanisms and compare two implementations of Ring-LWE based key exchange: BCNS and NewHope. This matters because the NewHope implementation outperforms the state-of-the-art elliptic-curve Diffie-Hellman key exchange X25519, showing that quantum-safe key exchange is not only a viable option but also a faster one. Specifically, this thesis compares the different reconciliation methods, parameter choices, noise sampling algorithms and performance of the two protocols.
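    As a rough illustration of the unreconciled Ring-LWE key agreement that both BCNS and NewHope build on, the following Python sketch shows two parties deriving noisy shared polynomials that differ only in small coefficients. The parameters and the uniform noise sampler are illustrative stand-ins, not the BCNS or NewHope choices, and no reconciliation step is included.

```python
import numpy as np

# Toy Ring-LWE key-agreement core (illustrative only: no reconciliation,
# no security claims; parameters and noise are stand-ins, not the BCNS
# or NewHope choices).
N, Q = 256, 12289                      # ring R_q = Z_q[x] / (x^N + 1)
rng = np.random.default_rng(1)

def poly_mul(a, b):
    """Negacyclic convolution: multiplication in Z_q[x]/(x^N + 1)."""
    c = np.zeros(2 * N, dtype=np.int64)
    for i in range(N):
        c[i:i + N] += a[i] * b
    return (c[:N] - c[N:]) % Q         # x^N = -1 folds the upper half back

def small_noise():
    """Small centered noise; a stand-in for a binomial/Gaussian sampler."""
    return rng.integers(-2, 3, N)

a = rng.integers(0, Q, N)              # public uniform polynomial

s_a, e_a = small_noise(), small_noise()
b_a = (poly_mul(a, s_a) + e_a) % Q     # Alice's public value b = a*s + e

s_b, e_b = small_noise(), small_noise()
b_b = (poly_mul(a, s_b) + e_b) % Q     # Bob's public value

k_alice = poly_mul(b_b, s_a)           # ~ a*s_a*s_b + small error
k_bob   = poly_mul(b_a, s_b)           # ~ a*s_a*s_b + small error

gap = (k_alice - k_bob) % Q
gap = np.minimum(gap, Q - gap)         # centered coefficient-wise difference
print("max coefficient gap:", gap.max())   # small, so reconciliation can succeed
```

    Reconciliation, one of the aspects the thesis compares, is the step that turns these two close polynomials into identical shared key bits.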

    Modern Cryptography Volume 1

    This open access book systematically explores the statistical characteristics of cryptographic systems, the computational complexity theory of cryptographic algorithms, and the mathematical principles behind various encryption and decryption algorithms, with the theory developed from the underlying technology. Building on Shannon's information theory, the book systematically introduces the information theory, statistical characteristics and computational complexity theory of public-key cryptography, focusing on its three main algorithm families: RSA, discrete logarithm and elliptic curve cryptosystems. It aims to explain both what each system is and why it works. Its systematic simplification and organization of the theory and technology of lattice-based cryptography is the book's greatest feature. Readers need a good knowledge of algebra, number theory, and probability and statistics. Senior undergraduates majoring in mathematics, as well as science and engineering postgraduates taking compulsory cryptography courses, will find this book helpful. It can also serve as the main reference book for researchers in cryptography and cryptographic engineering.

    Decoding by Embedding: Correct Decoding Radius and DMT Optimality

    The closest vector problem (CVP) and the shortest (nonzero) vector problem (SVP) are the core algorithmic problems on Euclidean lattices. They are central to the applications of lattices in many problems of communications and cryptography. Kannan's embedding technique is a powerful technique for solving the approximate CVP, yet its remarkable practical performance is not well understood. In this paper, the embedding technique is analyzed from a bounded distance decoding (BDD) viewpoint. We present two complementary analyses of the embedding technique: we establish a reduction from BDD to Hermite SVP (via unique SVP), which can be used along with any Hermite SVP solver (including, among others, the Lenstra, Lenstra and Lovász (LLL) algorithm), and show that, in the special case of LLL, it performs at least as well as Babai's nearest plane algorithm (LLL-aided SIC). The former analysis helps to explain the folklore practical observation that unique SVP is easier than standard approximate SVP. It is proven that when the LLL algorithm is employed, the embedding technique can solve the CVP provided that the noise norm is smaller than a decoding radius λ_1/(2γ), where λ_1 is the minimum distance of the lattice and γ ≈ O(2^{n/4}). This substantially improves the previously best known correct decoding bound γ ≈ O(2^n). Focusing on the applications of BDD to decoding of multiple-input multiple-output (MIMO) systems, we also prove that BDD of the regularized lattice is optimal in terms of the diversity-multiplexing gain tradeoff (DMT), and propose practical variants of embedding decoding which require no knowledge of the minimum distance of the lattice and/or further improve the error performance. Comment: To appear in IEEE Transactions on Information Theory.
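    As a concrete, minimal sketch of the embedding technique analysed in the paper (not the authors' implementation), the following Python code builds Kannan's embedded basis for a small CVP instance and uses a textbook LLL reduction to recover the close lattice point. The embedding factor M = 1 and the toy basis are illustrative assumptions.

```python
import numpy as np

def gram_schmidt(B):
    """Gram-Schmidt orthogonalization of the rows of B; returns B* and mu."""
    n = B.shape[0]
    Bs = np.zeros(B.shape, dtype=float)
    mu = np.zeros((n, n))
    for i in range(n):
        Bs[i] = B[i].astype(float)
        for j in range(i):
            mu[i, j] = np.dot(B[i], Bs[j]) / np.dot(Bs[j], Bs[j])
            Bs[i] -= mu[i, j] * Bs[j]
    return Bs, mu

def lll(B, delta=0.75):
    """Textbook LLL reduction of an integer row basis (slow but clear)."""
    B = B.astype(np.int64).copy()
    n = B.shape[0]
    Bs, mu = gram_schmidt(B)
    k = 1
    while k < n:
        for j in range(k - 1, -1, -1):          # size reduction
            q = int(round(mu[k, j]))
            if q:
                B[k] -= q * B[j]
                Bs, mu = gram_schmidt(B)
        if np.dot(Bs[k], Bs[k]) >= (delta - mu[k, k - 1] ** 2) * np.dot(Bs[k - 1], Bs[k - 1]):
            k += 1                               # Lovász condition holds
        else:
            B[[k - 1, k]] = B[[k, k - 1]]        # swap and step back
            Bs, mu = gram_schmidt(B)
            k = max(k - 1, 1)
    return B

def embed_cvp(B, t, M=1):
    """Kannan's embedding: turn CVP(B, t) into SVP on an (n+1)-dim lattice."""
    n = B.shape[0]
    E = np.zeros((n + 1, n + 1), dtype=np.int64)
    E[:n, :n] = B
    E[n, :n] = t
    E[n, n] = M                                  # embedding factor (heuristic choice)
    for row in lll(E):
        if abs(row[n]) == M:                     # short vector of the form (+-error, +-M)
            return t - row[:n] * int(np.sign(row[n]))
    return None

B = np.array([[7, 0, 1], [1, 17, 3], [2, 1, 31]], dtype=np.int64)
v = 2 * B[0] - 3 * B[1] + B[2]                   # a lattice point
t = v + np.array([1, 0, -1])                     # perturb it by small noise
print("recovered:", embed_cvp(B, t), "expected:", v)
```

    The last coordinate of the short vector found by LLL exposes the error t - v, which is exactly the bounded distance decoding view taken in the paper.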


    Cryptanalysis of a Message Authentication Code due to Cary and Venkatesan


    Ring Learning With Errors: A crossroads between postquantum cryptography, machine learning and number theory

    The present survey reports on the state of the art of the different cryptographic functionalities built upon the ring learning with errors problem and its interplay with several classical problems in algebraic number theory. The survey is based to a certain extent on an invited course given by the author at the Basque Center for Applied Mathematics in September 2018. Comment: arXiv admin note: text overlap with arXiv:1508.01375 by other authors. Comment of the author: a quotation has been added to Theorem 5.

    Development of Visual Cryptography Technique for Authentication Using Facial Images

    Security in the real world is an important issue that must be addressed with various preventive measures. In the present era, as the web moves from text data to multimedia data, a major security concern is the protection of this multimedia content. Images make up the highest percentage of multimedia data, so their protection is very important; they may contain military secrets, commercial secrets or information about individuals. This can be achieved by visual cryptography, a kind of image encryption. In current techniques, most visual cryptography schemes embed a secret using multiple shares. Visual secret sharing, the technique used in visual cryptography, divides the secret image into multiple shares, and superimposing those shares reveals the original secret image; this creates a threat, however, because an intruder who obtains the shares can easily decrypt the image. In this project work, therefore, a bitwise operation is performed on every pixel with the help of a key, provided by a new sterilization-algorithm concept. Initially the red, green and blue channels are separated from the image and encrypted at multiple levels using multiple shares, converting the image into an unreadable format; combining all the shares in the proper sequence reveals the original secret image.
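    The core idea of splitting an image into shares with a per-pixel bitwise operation can be illustrated with a minimal XOR-based (2, 2) secret-sharing sketch in Python. This is a generic construction for illustration only, not the multi-level RGB scheme or the key-generation algorithm proposed in the paper.

```python
import numpy as np

def make_shares(secret, rng=np.random.default_rng(0)):
    """XOR-based (2, 2) secret sharing: each share alone looks like noise."""
    share1 = rng.integers(0, 256, secret.shape, dtype=np.uint8)  # random mask
    share2 = np.bitwise_xor(secret, share1)                      # secret XOR mask
    return share1, share2

def reveal(share1, share2):
    """Superimposing (XOR-ing) both shares restores the original pixels."""
    return np.bitwise_xor(share1, share2)

# A stand-in 4x4 RGB "image"; in practice this would be one color channel
# (R, G or B) separated from the secret facial image.
secret = np.arange(48, dtype=np.uint8).reshape(4, 4, 3)
s1, s2 = make_shares(secret)
print("reconstruction matches:", np.array_equal(reveal(s1, s2), secret))
```

    Unlike classical visual cryptography, which stacks printed transparencies, the XOR variant reconstructs the image exactly and extends naturally to the per-channel, multi-share pipeline described above.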

    Shifted Eisenstein polynomials, irreducible compositions of polynomials and group key exchanges

    In my dissertation, I cover several different topics. First, we consider the concept of natural density over the integers and extend it to holomorphy rings over function fields. This allows us to give a function-field analogue of Cesàro's theorem, which gives the "probability" that an m-tuple of random elements of the holomorphy ring is coprime. We also generalize this and consider the density of k × m matrices over holomorphy rings which can be extended to unimodular m × m matrices. In the second part, we determine the natural density of shifted Eisenstein polynomials; that is, we compute the density of integer polynomials f(x) of a fixed degree n for which some shift f(x + i), for an integer i, satisfies Eisenstein's irreducibility criterion. We then also compute the density of affine Eisenstein polynomials. Thirdly, we consider an arbitrary set of monic quadratic polynomials over a finite field and ask which compositions of copies of them are irreducible. We first give a criterion to decide whether all such compositions are irreducible, and then show that, in general, the irreducible compositions have the structure of a regular language. In the final chapter, we study cryptographic protocols for key exchange in ad-hoc groups. We first translate some protocols from the literature to the more general setting of semigroup actions, and then propose our own variants of these protocols, which aim to have improved security or efficiency. We then demonstrate a couple of active attacks on certain such protocols which are in some ways more powerful than man-in-the-middle attacks.
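    To make the notion of a shifted Eisenstein polynomial concrete, here is a small, self-contained Python sketch (an illustration only, not the dissertation's density computation) that tests whether some integer shift f(x + i) of a given integer polynomial satisfies Eisenstein's criterion; the shift bound is a heuristic assumption.

```python
from math import comb, gcd
from sympy import primefactors

def shift(coeffs, i):
    """Coefficients of f(x + i), where coeffs[k] is the coefficient of x^k."""
    n = len(coeffs) - 1
    return [sum(coeffs[j] * comb(j, k) * i ** (j - k) for j in range(k, n + 1))
            for k in range(n + 1)]

def is_eisenstein(coeffs):
    """Eisenstein: some prime p divides a_0..a_{n-1}, p does not divide a_n,
    and p^2 does not divide a_0."""
    *lower, lead = coeffs
    g = 0
    for c in lower:
        g = gcd(g, c)
    if g in (0, 1):
        return False
    return any(lead % p and coeffs[0] % (p * p) for p in primefactors(g))

def is_shifted_eisenstein(coeffs, bound=50):
    """Search small shifts i for which f(x + i) is Eisenstein (bound is heuristic)."""
    return any(is_eisenstein(shift(coeffs, i)) for i in range(-bound, bound + 1))

# Example: f(x) = x^2 + 1 is not Eisenstein itself, but f(x + 1) = x^2 + 2x + 2
# satisfies the criterion at p = 2, so f is a shifted Eisenstein polynomial.
print(is_shifted_eisenstein([1, 0, 1]))   # coefficient list a_0, a_1, a_2 -> True
```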

    On new stream algorithms for creating sensitive digests of electronic documents

    Specialists must use well-checked documents to make well-founded decisions and plans in the socio-economic field. The verification tools include cryptographically stable algorithms for compressing a large file into a digest of a specified size that is sensitive to any change in the input characters. New fast compression algorithms are proposed, whose cryptographic stability is tied to hard algebraic problems, such as the study of systems of algebraic equations of large degree and the problem of decomposing a nonlinear mapping of the space in terms of generators. The proposed algorithms for creating change-sensitive document digests will be used to detect cyberattacks and to audit all system files after a registered intrusion.
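    The sensitivity property described here, where any single-character change in the input flips the fixed-size digest, can be illustrated in Python with a standard hash function. SHA-256 is used purely as a stand-in; the paper's own algebraic constructions are not shown.

```python
import hashlib

def digest(document: bytes) -> str:
    """Fixed-size, change-sensitive digest of an arbitrarily large document."""
    return hashlib.sha256(document).hexdigest()

original = b"planning document v1: approve budget of 100 units"
tampered = b"planning document v1: approve budget of 900 units"  # one character changed

d1, d2 = digest(original), digest(tampered)
print(d1)
print(d2)
# The two digests differ in roughly half of their bits (avalanche effect),
# so re-hashing audited files exposes any post-intervention modification.
print("digests match:", d1 == d2)
```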