417 research outputs found

    A Computational Study of Icart's Function

    A hash function maps elements of a larger initial set to elements of a smaller resultant set. By nature this leads to collisions, and sometimes not every element of the smaller set is mapped to. The set considered here is the set of points on an elliptic curve, a special class of curve in two variables that takes the form y^2 = x^3 + ax + b. A hash function offers a deterministic way to map an input to a pair of x and y values satisfying such an equation. This paper experimentally verifies that an asymptotic result of Fouque and Tibouchi on the size of the image of Icart's hash function holds for small primes less than 2^19 and for all curves of conductor less than or equal to 100. Combined with Fouque and Tibouchi's asymptotic result, this shows that the image of Icart's hash function covers a 5/8 fraction of the curve's points (up to an error term).
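    For concreteness, the following is a minimal sketch of Icart's deterministic encoding for a curve y^2 = x^3 + ax + b over F_p with p ≡ 2 (mod 3), the case in which cube roots are unique; the prime and curve coefficients used below are illustrative toy values, not parameters from the paper.

        def icart_encode(u, a, b, p):
            # Map a nonzero field element u to a point (x, y) on y^2 = x^3 + a*x + b.
            # Assumes p is prime with p % 3 == 2, so z -> z^((2p-1)/3) is the unique cube root.
            inv = lambda z: pow(z, p - 2, p)                  # modular inverse via Fermat
            v = (3 * a - pow(u, 4, p)) * inv(6 * u) % p
            w = (v * v - b - pow(u, 6, p) * inv(27)) % p
            x = (pow(w, (2 * p - 1) // 3, p) + u * u * inv(3)) % p
            y = (u * x + v) % p
            assert (y * y - (x ** 3 + a * x + b)) % p == 0    # the output lies on the curve
            return x, y

        # Toy example: a prime p ≡ 2 (mod 3) and arbitrary small coefficients.
        p, a, b = 1019, 2, 3
        print(icart_encode(5, a, b, p))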

    Quantum attacks on Bitcoin, and how to protect against them

    The key cryptographic protocols used to secure the internet and financial transactions of today are all susceptible to attack by the development of a sufficiently large quantum computer. One particular area at risk is cryptocurrencies, a market currently worth over 150 billion USD. We investigate the risk posed to Bitcoin, and other cryptocurrencies, by attacks using quantum computers. We find that the proof-of-work used by Bitcoin is relatively resistant to substantial speedup by quantum computers in the next 10 years, mainly because specialized ASIC miners are extremely fast compared to the estimated clock speed of near-term quantum computers. On the other hand, the elliptic curve signature scheme used by Bitcoin is much more at risk, and could be completely broken by a quantum computer as early as 2027, by the most optimistic estimates. We analyze an alternative proof-of-work called Momentum, based on finding collisions in a hash function, that is even more resistant to speedup by a quantum computer. We also review the available post-quantum signature schemes to see which one would best meet the security and efficiency requirements of blockchain applications. Comment: 21 pages, 6 figures. For a rough update on the progress of quantum devices and prognostications on the time to break digital signatures, see https://www.quantumcryptopocalypse.com/quantum-moores-law
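    To illustrate the idea behind a collision-based proof-of-work such as Momentum, the sketch below searches for two distinct nonces whose hashes agree on a short prefix; the 3-byte collision width, the nonce encoding, and the function names are illustrative choices, not the scheme's actual parameters.

        import hashlib

        def find_collision(header: bytes, width: int = 3, limit: int = 2 ** 22):
            # Birthday-style search: remember truncated digests until two nonces collide.
            seen = {}
            for nonce in range(limit):
                d = hashlib.sha256(header + nonce.to_bytes(8, "big")).digest()[:width]
                if d in seen:
                    return seen[d], nonce          # colliding pair of distinct nonces
                seen[d] = nonce
            return None

        def verify(header: bytes, pair, width: int = 3) -> bool:
            # Verification is cheap: two hash evaluations and a comparison.
            n1, n2 = pair
            h = lambda n: hashlib.sha256(header + n.to_bytes(8, "big")).digest()[:width]
            return n1 != n2 and h(n1) == h(n2)

        header = b"example block header"
        pair = find_collision(header)
        print(pair, verify(header, pair))

    Verification stays cheap while the search cost is governed by the birthday bound, and, as the abstract notes, collision finding is believed to admit less quantum speedup than the single-target search underlying Bitcoin's proof-of-work.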

    Progress in Cryptology - LATINCRYPT 2010

    This book constitutes the proceedings of the First International Conference on Cryptology and Information Security in Latin America, LATINCRYPT 2010, held in Puebla, Mexico, on August 8-11, 2010. The 19 papers presented together with four invited talks were carefully reviewed and selected from 62 submissions. The topics covered are encryption, elliptic curves, implementation of pairings, implementation of cryptographic algorithms, cryptographic protocols and foundations, cryptanalysis of symmetric primitives, post-quantum cryptography, and side-channel attacks.

    Efficient Indifferentiable Hashing into Ordinary Elliptic Curves

    We provide the first construction of a hash function into ordinary elliptic curves that is indifferentiable from a random oracle, based on Icart's deterministic encoding from Crypto 2009. While almost as efficient as Icart's encoding, this hash function can be plugged into any cryptosystem that requires hashing into elliptic curves, while not compromising proofs of security in the random oracle model. We also describe a more general (but less efficient) construction that works for a large class of encodings into elliptic curves, for example the Shallue-Woestijne-Ulas (SWU) algorithm. Finally we describe the first deterministic encoding algorithm into elliptic curves in characteristic 3.
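    A standard way to obtain such an indifferentiable hash from a deterministic encoding f is to hash twice and add the results under the group law, H(m) = f(h1(m)) + f(h2(m)). The sketch below illustrates that recipe over a toy curve; the curve parameters, hash choices, and domain-separation tags are illustrative assumptions, not the paper's exact instantiation.

        import hashlib

        p, a, b = 1019, 2, 3                       # toy curve y^2 = x^3 + a*x + b, p ≡ 2 (mod 3)

        def icart(u):
            # Icart's deterministic encoding of a nonzero field element to a curve point.
            inv = lambda z: pow(z, p - 2, p)
            v = (3 * a - pow(u, 4, p)) * inv(6 * u) % p
            w = (v * v - b - pow(u, 6, p) * inv(27)) % p
            x = (pow(w, (2 * p - 1) // 3, p) + u * u * inv(3)) % p
            return x, (u * x + v) % p

        def ec_add(P, Q):
            # Affine short-Weierstrass point addition; None represents the point at infinity.
            if P is None:
                return Q
            if Q is None:
                return P
            (x1, y1), (x2, y2) = P, Q
            if x1 == x2 and (y1 + y2) % p == 0:
                return None
            if P == Q:
                lam = (3 * x1 * x1 + a) * pow(2 * y1, p - 2, p) % p
            else:
                lam = (y2 - y1) * pow((x2 - x1) % p, p - 2, p) % p
            x3 = (lam * lam - x1 - x2) % p
            return x3, (lam * (x1 - x3) - y1) % p

        def h(m: bytes, tag: bytes) -> int:
            # Hash a message to a nonzero field element; the tag separates h1 from h2.
            return int.from_bytes(hashlib.sha256(tag + m).digest(), "big") % (p - 1) + 1

        def hash_to_curve(m: bytes):
            return ec_add(icart(h(m, b"h1")), icart(h(m, b"h2")))

        print(hash_to_curve(b"hello"))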

    Overview of blockchain technology cryptographic security

    This thesis develops an understanding of the hash functions and algorithms used in blockchain technologies, comparing Bitcoin with Ethereum and with hash functions used in private blockchains. The study attempts to answer one fundamental research question: “What considerations are important in assessing blockchain cryptographic security, with an emphasis on hash functions?” The study was carried out qualitatively using a desk research approach, combined with case studies of two public blockchain-based cryptocurrencies, Ethereum and Bitcoin. The research aims to provide a holistic view of blockchain cryptographic security, comparing Bitcoin and Ethereum as use cases, and thus to provide a consolidated document that students studying cryptography can use to obtain a better understanding of what is involved in blockchain security. From an academic perspective, the research aims to provide a model for assessing what is important to consider in the cryptographic security of blockchains. The proposed model presents three main categories of factors: strategic factors, complexity attributes, and technical drivers. These yield a set of crucial base metrics such as absence of secret seeds, efficiency of verification, preimage and collision resistance, fixed output size, low collision probability, and an even distribution of preimages over the output.

    Computationally efficient search for large primes

    To satisfy the speed of communication and to meet the demand for ever larger prime numbers, primality testing and prime number generating algorithms require continuous advancement. To find the most efficient algorithm, a survey of methods is needed, together with an analysis of the algorithms' performances. The critical criteria in the analysis of prime number generation are the number of probes, the number of generated primes, and the average time required to produce one prime. Hence, the purpose of this thesis is to identify the best performing algorithm. Surveying the methods, establishing the comparison criteria, and comparing the approaches are the steps required to do so. In the first step of this research the methods were surveyed and classified using the approach described in Menezes [66]: chapter 2 sorted, described, compared, and summarized primality testing methods, while chapter 3 did the same for prime number generating methods. In the next step, applying a uniform technique, computer programs were written for the selected algorithms. The programs were installed on a Unix operating system running on a Sun 5.8 server to perform the computer experiments. The results of these experiments provided the parameters required to compare the algorithms' performances, and were tabulated to compare those parameters and to indicate the best performing algorithm. The survey of methods indicated that deterministic and randomized approaches are the main approaches to prime number generation. Random number generation has found application in cryptographic key generation, while a need for deterministically generated provable primes has emerged in encryption, decryption, and other cryptographic areas. The analysis of performances indicated that primes generated through randomized techniques required a smaller number of probes; this is due to a method that eliminates non-primes in an initial step by pre-testing randomly generated candidates for possible divisibility factors, and the smaller number of probes increases an algorithm's efficiency. Further analysis indicated that the ratio of randomly generated primes to the expected number of primes in a specific interval is smaller than for deterministically generated primes. In this comparison the Miller-Rabin and Gordon algorithms, which generate primes randomly, were compared against the SFA and the Sequences Containing Primes algorithm (abbreviated in this thesis as 6kseq). In the interval [99000, 100000] the Miller-Rabin method generated 57 out of 87 expected primes, while the SFA algorithm generated 83 out of 87 approximated primes; the expected number of primes was computed using the approximation n/ln(n) presented in Menezes [66]. The average time to originate one prime in the [99000, 100000] interval was 0.056 [s] for the Miller-Rabin test, 0.0001 [s] for SFA, and 0.0003 [s] for 6kseq. Gordon's algorithm in the interval [1, 100000] required 100578 probes and generated 32 out of 8686 expected primes.
    The Parametric Representation of Composite Twins and Generation of Prime and Quasi Prime Numbers algorithm invented by Doctor Verkhovsky [108] verifies and generates primes and quasi primes using special mathematical constructs. This algorithm showed the best performance in the interval [1, 1000], generating and verifying 3585 variants of provable primes or quasi primes, and consumed an average time per prime or quasi prime of 0.0022315 [s]. It implements a unique method of testing both primes and quasi-primes; because of this uniqueness, it cannot be compared directly with the other primality testing or prime number generating algorithms. The ((a!)^2)*((-1)^b) Function in Generating Primes algorithm [105], developed by Doctor Verkhovsky, was compared against the extended Fermat algorithm. In the range [1, 10000] the [105] algorithm consumed an average of 0.00001 [s] per prime and originated 167 primes, while the extended Fermat algorithm also produced 167 primes but consumed an average of 0.00599 [s] per prime. Thus, the computer experiments and the comparison of methods showed that the SFA algorithm is deterministic and originates provable primes. The survey of methods and the analysis of the selected approaches indicated that the SFA sieve algorithm, which sequentially generates primes, is computationally efficient, showing the best performance with respect to computational speed, simplicity of method, and the number of primes generated in the specified intervals.
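    As a point of reference for the probabilistic side of the comparison, the following is a minimal sketch of the Miller-Rabin test mentioned above; the witness count k is an illustrative choice, while the interval in the usage example is the one used in the comparison.

        import random

        def miller_rabin(n: int, k: int = 20) -> bool:
            # Probabilistic primality test: a composite n passes one round with
            # probability at most 1/4, so k rounds give error probability at most 4**-k.
            if n < 2:
                return False
            for small in (2, 3, 5, 7, 11, 13):
                if n % small == 0:
                    return n == small
            d, r = n - 1, 0
            while d % 2 == 0:
                d //= 2
                r += 1                             # n - 1 = d * 2^r with d odd
            for _ in range(k):
                a = random.randrange(2, n - 1)
                x = pow(a, d, n)
                if x in (1, n - 1):
                    continue
                for _ in range(r - 1):
                    x = pow(x, 2, n)
                    if x == n - 1:
                        break
                else:
                    return False                   # a witnesses that n is composite
            return True                            # n is probably prime

        # Count probable primes in [99000, 100000], the interval used in the comparison above.
        print(sum(miller_rabin(n) for n in range(99000, 100001)))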

    Divisibility, Smoothness and Cryptographic Applications

    This paper deals with products of moderate-size primes, familiarly known as smooth numbers. Smooth numbers play a crucial role in information theory, signal processing and cryptography. We present various properties of smooth numbers relating to their enumeration, distribution and occurrence in various integer sequences. We then turn our attention to cryptographic applications in which smooth numbers play a pivotal role.
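    As a small illustration of the central definition, the sketch below checks B-smoothness by trial division: an integer is B-smooth if every prime factor is at most B. The bound and the test values are illustrative only.

        def is_smooth(n: int, B: int) -> bool:
            # Divide out every factor up to B; anything left over is a prime factor > B.
            for d in range(2, B + 1):
                while n % d == 0:
                    n //= d
            return n == 1

        # 18144 = 2^5 * 3^4 * 7 is 100-smooth; 202 = 2 * 101 is not.
        print(is_smooth(18144, 100), is_smooth(202, 100))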

    Theoretical and practical efficiency aspects in cryptography

    EThOS - Electronic Theses Online Service, United Kingdom
    • …