
    Implementing the asymptotically fast version of the elliptic curve primality proving algorithm

    The elliptic curve primality proving (ECPP) algorithm is one of the fastest practical algorithms currently available for proving the primality of large numbers. Its running time cannot be proven rigorously, but heuristic arguments show that it should run in time O((log N)^5) to prove the primality of N. An asymptotically fast version of it, attributed to J. O. Shallit, runs in time O((log N)^4). The aim of this article is to describe this version in more detail, leading to actual implementations able to handle numbers with several thousand decimal digits.
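
    To give a rough sense of what the asymptotic improvement means at the sizes mentioned above, the short sketch below computes the natural logarithm of a hypothetical 5000-digit number and the resulting heuristic cost ratio (log N)^5 / (log N)^4 = log N. The digit count is an illustrative assumption, not a figure from the article.

        import math

        digits = 5000                      # illustrative size, not taken from the article
        log_N = digits * math.log(10)      # natural log of a number with that many digits
        ratio = log_N                      # (log N)^5 / (log N)^4 collapses to log N
        print(f"log N ~ {log_N:.0f}, heuristic speedup factor ~ {ratio:.0f}")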

    Cryptanalysis of the Randomized Version of a Lattice-Based Signature Scheme from PKC'08

    In PKC'08, Plantard, Susilo and Win proposed a lattice-based signature scheme whose security is based on the hardness of the closest vector problem with the infinity norm (CVP∞). This signature scheme was proposed as a countermeasure against the Nguyen-Regev attack and improves the security and efficiency of the Goldreich, Goldwasser and Halevi (GGH) scheme. Furthermore, to resist potential side-channel attacks, the authors suggested modifying the deterministic signing algorithm to be randomized. In this paper, we propose a chosen-message attack against the randomized version. Note that the randomized signing algorithm generates different signature vectors inside a relatively small cube for the same message, so the difference of any two such signature vectors is a relatively short lattice vector. Once enough such short difference vectors are collected, we can recover all or part of the secret key by lattice reduction algorithms, which implies that the randomized version is insecure under a chosen-message attack.
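
    A minimal toy sketch of the observation driving the attack, under invented parameters (the sign() stand-in, the dimension and the cube size are hypothetical, not the PKC'08 scheme): signatures of a single message land in a small cube, so their pairwise differences are short vectors that could then be handed to a lattice reduction algorithm such as LLL/BKZ.

        import random

        DIM, CUBE = 8, 4                   # toy dimension and cube radius (assumptions)

        def sign(message_point):
            # Stand-in for the randomized signer: each signature is the message point
            # perturbed by small noise, so signatures of one message fill a small cube.
            return [c + random.randint(-CUBE, CUBE) for c in message_point]

        message_point = [random.randint(-1000, 1000) for _ in range(DIM)]
        sigs = [sign(message_point) for _ in range(50)]

        # Differences of two signatures of the same message are short lattice vectors.
        diffs = [[a - b for a, b in zip(s, sigs[0])] for s in sigs[1:]]
        print(max(abs(c) for row in diffs for c in row))   # bounded by 2*CUBE

        # In the actual attack, enough such short difference vectors are stacked into a
        # basis and reduced (e.g. with LLL/BKZ) to recover all or part of the secret key.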

    Post-quantum cryptography

    Cryptography is essential for the security of online communication, cars and implanted medical devices. However, many commonly used cryptosystems will be completely broken once large quantum computers exist. Post-quantum cryptography is cryptography under the assumption that the attacker has a large quantum computer; post-quantum cryptosystems strive to remain secure even in this scenario. This relatively young research area has seen some successes in identifying mathematical operations for which quantum algorithms offer little advantage in speed, and then building cryptographic systems around those. The central challenge in post-quantum cryptography is to meet demands for cryptographic usability and flexibility without sacrificing confidence.

    Factorization of a 512 bit RSA modulus

    This paper reports on the factorization of the 512-bit number RSA-155 by the Number Field Sieve factoring method (NFS) and discusses the implications for RSA.
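
    As a quick consistency check on the naming (not a result from the paper), a 512-bit modulus has about 155 decimal digits, which is why the number is known as RSA-155:

        import math

        bits = 512
        decimal_digits = bits * math.log10(2)   # ~154.1, i.e. a 155-digit number
        print(round(decimal_digits, 1))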

    Knowledge discovery from trajectories

    Dissertation submitted in partial fulfilment of the requirements for the Degree of Master of Science in Geospatial Technologies. As a newly proliferating study area, knowledge discovery from trajectories has attracted more and more researchers from different backgrounds. Until now, however, there has been no theoretical framework that gives researchers a systematic view of the ongoing research. The complexity of spatial and temporal information, and of their combination, produces numerous spatio-temporal patterns. In addition, a pattern may well have different definitions and mining methodologies for researchers from different backgrounds, such as Geographic Information Science, Data Mining, Databases, and Computational Geometry. How can these patterns be defined systematically, so that the whole community can make better use of previous research? This paper tackles this challenge in three steps. First, the input trajectory data is classified; second, a taxonomy of spatio-temporal patterns is developed from a data mining point of view; lastly, the spatio-temporal patterns that have appeared in previous publications are discussed and placed into the theoretical framework. In this way, researchers can easily find the methodology needed to mine a specific pattern within this framework, and the algorithms that still need to be developed can be identified for further research. Under the guidance of this framework, an application to a real data set from the Starkey Project is performed. Two questions are answered by applying data mining algorithms: first, where the elk prefer to stay within their range, and second, whether there are corridors among these regions of interest.
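
    As one hypothetical illustration of the kind of mining step described above (not the dissertation's actual method), the sketch below marks grid cells as regions of interest when enough trajectory fixes fall inside them; the points, cell size and visit threshold are all invented for the example.

        from collections import Counter

        # Toy (x, y) fixes; real input would be GPS trajectories such as the Starkey data.
        points = [(1.00, 1.20), (1.10, 1.30), (1.05, 1.25),
                  (5.00, 5.10), (5.20, 5.00), (1.08, 1.22)]
        CELL, MIN_VISITS = 0.5, 3          # grid resolution and density threshold (assumptions)

        counts = Counter((int(x // CELL), int(y // CELL)) for x, y in points)
        regions_of_interest = [cell for cell, n in counts.items() if n >= MIN_VISITS]
        print(regions_of_interest)         # cells visited often enough to count as stay regions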

    Fast Lattice Basis Reduction Suitable for Massive Parallelization and Its Application to the Shortest Vector Problem

    The hardness of the shortest vector problem (SVP) for lattices is a fundamental assumption underpinning the security of many lattice-based cryptosystems, and it is therefore important to evaluate its difficulty. Recent advances in studying the hardness of problems in large-scale lattice computation have pointed to the need to study the design and methodology for exploiting the performance of massively parallel computing environments. In this paper, we propose a lattice basis reduction algorithm suitable for massive parallelization. Our parallelization strategy is an extension of the Fukase-Kashiwabara algorithm (J. Information Processing, Vol. 23, No. 1, 2015). In our algorithm, given a lattice basis as input, variants of the lattice basis are generated, and each process then reduces its own basis; the processes cooperate and share auxiliary information with each other to accelerate lattice basis reduction. In addition, we propose a new strategy based on our evaluation function of a lattice basis in order to decrease the sum of squared lengths of the orthogonal basis vectors. We applied our algorithm to problem instances from the SVP Challenge. We solved a 150-dimension problem instance in about 394 days by using large clusters, and we also solved problem instances of dimensions 134, 138, 140, 142, 144, 146, and 148. Since the previous world record was a problem of dimension 132, these results demonstrate the effectiveness of our proposal.
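
    For intuition about the quantity the evaluation strategy above tries to decrease, the sketch below computes the sum of squared lengths of the Gram-Schmidt orthogonalized vectors of a small toy basis; the basis is invented, and this is only the metric, not the paper's parallel algorithm.

        import numpy as np

        def gram_schmidt(B):
            # Row-wise Gram-Schmidt orthogonalization without normalization.
            B = np.asarray(B, dtype=float)
            Q = np.zeros_like(B)
            for i in range(len(B)):
                v = B[i].copy()
                for j in range(i):
                    v -= (B[i] @ Q[j]) / (Q[j] @ Q[j]) * Q[j]
                Q[i] = v
            return Q

        basis = [[4, 1, 0], [1, 3, 1], [0, 1, 5]]   # toy lattice basis (assumption)
        Q = gram_schmidt(basis)
        print(sum(float(q @ q) for q in Q))         # sum of squared orthogonal lengths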