
    Prime and Prejudice: Primality Testing Under Adversarial Conditions

    This work provides a systematic analysis of primality testing under adversarial conditions, where the numbers being tested for primality are not generated randomly, but instead provided by a possibly malicious party. Such a situation can arise in secure messaging protocols where a server supplies Diffie-Hellman parameters to the peers, or in a secure communications protocol like TLS where a developer can insert such a number to be able to later passively spy on client-server data. We study a broad range of cryptographic libraries and assess their performance in this adversarial setting. As examples of our findings, we are able to construct 2048-bit composites that are declared prime with probability 1/16 by OpenSSL's primality testing in its default configuration; the advertised performance is 2^{-80}. We can also construct 1024-bit composites that always pass the primality testing routine in GNU GMP when configured with the recommended minimum number of rounds. And, for a number of libraries (Cryptlib, LibTomCrypt, JavaScript Big Number, WolfSSL), we can construct composites that always pass the supplied primality tests. We explore the implications of these security failures in applications, focusing on the construction of malicious Diffie-Hellman parameters. We show that, unless careful primality testing is performed, an adversary can supply parameters (p, q, g) which on the surface look secure, but where the discrete logarithm problem in the subgroup of order q generated by g is easy. We close by making recommendations for users and developers. In particular, we promote the Baillie-PSW primality test, which is both efficient and conjectured to be robust even in the adversarial setting for numbers up to a few thousand bits.
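    For orientation, here is a minimal sketch (my own, not code from the paper or from any audited library) of the Miller-Rabin test that most of these routines implement. With uniformly random bases, each round errs on any odd composite with probability at most 1/4; the failures reported in the paper come from implementations that use too few rounds (tuned for random inputs) or non-random bases.

```python
import random

def miller_rabin(n: int, rounds: int = 64) -> bool:
    """Miller-Rabin probable-prime test with uniformly random bases.

    Each random-base round errs on an odd composite with probability
    at most 1/4, so `rounds=64` gives a 4^-64 worst-case error bound.
    Adversarial composites defeat implementations that use too few
    rounds or fixed bases."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):       # quick trial division
        if n % p == 0:
            return n == p
    d, s = n - 1, 0                       # write n - 1 = d * 2^s with d odd
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False                  # witness found: n is composite
    return True                           # no witness: probably prime
```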

    Some Remarks on Lucas Pseudoprimes

    We present a way of viewing Lucas pseudoprimes, Euler-Lucas pseudoprimes and strong Lucas pseudoprimes in the context of group schemes. This enables us to treat the Lucas pseudoprimalities in parallel with the established notions of pseudoprimes, Euler pseudoprimes and strong pseudoprimes.
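    To make the objects concrete, a small sketch of the defining congruence (the function name is mine): a Lucas pseudoprime for parameters (P, Q) with D = P^2 - 4Q is a composite n, coprime to 2QD, satisfying U_{n - (D/n)} ≡ 0 (mod n), where (D/n) is the Jacobi symbol.

```python
def lucas_u(P: int, Q: int, k: int, n: int) -> int:
    """U_k of the Lucas sequence U_0 = 0, U_1 = 1,
    U_{m+1} = P*U_m - Q*U_{m-1}, reduced mod n.
    Plain iteration for clarity; real tests use binary doubling."""
    a, b = 0, 1
    for _ in range(k):
        a, b = b, (P * b - Q * a) % n
    return a

# 323 = 17 * 19 is the smallest Lucas pseudoprime for (P, Q) = (1, -1):
# here D = 5, the Jacobi symbol (5/323) = -1, and U_324 ≡ 0 (mod 323).
assert lucas_u(1, -1, 324, 323) == 0
```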

    Strengthening the Baillie-PSW primality test

    The Baillie-PSW primality test combines Fermat and Lucas probable prime tests. It reports that a number is either composite or probably prime. No odd composite integer has been reported to pass this combination of primality tests if the parameters are chosen in an appropriate way. Here, we describe a significant strengthening of this test that comes at almost no additional computational cost. This is achieved by including in the test what we call Lucas-V pseudoprimes, of which there are only five less than 10^{15}.
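    A sketch of the extra check, under my reading of the abstract: on top of the usual Baillie-PSW stages, the strengthened test also verifies the congruence V_{n+1} ≡ 2Q (mod n), which every prime n with (D/n) = -1 satisfies. Parameter selection below follows Selfridge's "Method A"; the helper names are mine, and the O(n) Lucas iteration is for small demonstrations only.

```python
def jacobi(a: int, n: int) -> int:
    """Jacobi symbol (a/n) for odd n > 0."""
    a %= n
    result = 1
    while a:
        while a % 2 == 0:
            a //= 2
            if n % 8 in (3, 5):
                result = -result
        a, n = n, a
        if a % 4 == 3 and n % 4 == 3:
            result = -result
        a %= n
    return result if n == 1 else 0

def selfridge_params(n: int):
    """Method A: first D in 5, -7, 9, -11, ... with (D/n) = -1,
    then P = 1, Q = (1 - D)/4.  Assumes n is odd and not a square."""
    D = 5
    while jacobi(D, n) != -1:
        D = -(D + 2) if D > 0 else -(D - 2)
    return D, 1, (1 - D) // 4

def lucas_v(P: int, Q: int, k: int, n: int) -> int:
    """V_k mod n by plain iteration (demonstration sizes only)."""
    a, b = 2 % n, P % n   # V_0, V_1
    for _ in range(k):
        a, b = b, (P * b - Q * a) % n
    return a

def lucas_v_check(n: int) -> bool:
    """The added congruence: V_{n+1} ≡ 2Q (mod n)."""
    D, P, Q = selfridge_params(n)
    return lucas_v(P, Q, n + 1, n) == 2 * Q % n
```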

    Fooling primality tests on smartcards

    We analyse whether the smartcards of the JavaCard platform correctly validate primality of domain parameters. The work is inspired by the paper Prime and Prejudice: Primality Testing Under Adversarial Conditions, where the authors analysed many open-source libraries and constructed pseudoprimes fooling the primality testing functions. However, in the case of smartcards, there is often no way to invoke the primality test directly, so we trigger it by replacing (EC)DSA and (EC)DH prime domain parameters with adversarial composites. Such a replacement results in vulnerability to Pohlig-Hellman style attacks, leading to private key recovery. Out of the nine smartcards (produced by five major manufacturers) we tested, all but one lack a primality test in parameter validation. As the JavaCard platform provides no public primality testing API, the problem cannot be fixed by an extra parameter check, making it difficult to mitigate in already deployed smartcards.
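    A toy sketch of why a composite modulus masquerading as a prime is fatal: if the order of the working subgroup factors into small primes, each piece of the private exponent falls to a tiny discrete-log search, and the Chinese Remainder Theorem reassembles the key. The function name and the 5-bit demo numbers are illustrative, not from the paper.

```python
def pohlig_hellman(g: int, h: int, p: int, order: int, factors: list[int]) -> int:
    """Recover x with g^x ≡ h (mod p), given the (squarefree, for
    simplicity) prime factorization of the order of g.  When the order
    is smooth -- as with maliciously composite domain parameters --
    every subproblem is tiny."""
    x, m = 0, 1
    for q in factors:
        gq = pow(g, order // q, p)   # generator of the order-q subgroup
        hq = pow(h, order // q, p)   # h projected into that subgroup
        xq = next(i for i in range(q) if pow(gq, i, p) == hq)  # brute-force DLP
        # CRT: extend x from modulus m to modulus m * q
        x += m * ((xq - x) * pow(m, -1, q) % q)
        m *= q
    return x

# Demo: p = 31, g = 3 generates the full group of order 30 = 2 * 3 * 5.
# The 'private key' 17 is recovered from the 'public key' 3^17 mod 31 = 22.
assert pohlig_hellman(3, 22, 31, 30, [2, 3, 5]) == 17
```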

    Primality Tests on Commutator Curves

    This thesis is about efficient primality tests. First, the commutator curve, which is described by one scalar parameter in the two-dimensional special linear group, is introduced. After establishing the fundamentals of this curve, it is incorporated into different compositeness tests (e.g. Fermat's test, Solovay-Strassen test). The most important of these is the Commutator Curve Test. It is proved that, after a fixed number of trial divisions (by all primes up to 80), this test returns the result 'true' for a composite number with probability less than 1/16. Moreover, it is shown that, under the assumption that the Extended Riemann Hypothesis is true, Miller's test for a number n only has to be carried out for all prime bases less than (3/2)*ln(n)^2. In the proof of G. L. Miller's primality test, the necessity of the Extended Riemann Hypothesis could thereby be narrowed down to a single key lemma.
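    A compact sketch of what that ERH-conditional bound buys (helper names are mine; the abstract's bound (3/2)*ln(n)^2 replaces the classical 2*ln(n)^2): a fully deterministic Miller test that simply checks every base up to the bound.

```python
import math

def is_strong_probable_prime(n: int, a: int) -> bool:
    """Single Miller round to base a, for odd n > 3."""
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    x = pow(a, d, n)
    if x in (1, n - 1):
        return True
    for _ in range(s - 1):
        x = pow(x, 2, n)
        if x == n - 1:
            return True
    return False

def miller_erh(n: int) -> bool:
    """Deterministic Miller test, correct if the Extended Riemann
    Hypothesis holds, using the base bound (3/2)*ln(n)^2 stated in
    the abstract."""
    if n < 2:
        return False
    if n in (2, 3):
        return True
    if n % 2 == 0:
        return False
    bound = int(1.5 * math.log(n) ** 2)
    for a in range(2, min(bound, n - 2) + 1):
        # only prime bases are required; composite bases are redundant
        if not is_strong_probable_prime(n, a):
            return False
    return True
```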

    Computationally efficient search for large primes

    To satisfy the speed of communication and to meet the demand for continuously larger prime numbers, primality testing and prime number generating algorithms require continuous advancement. To find the most efficient algorithm, a need for a survey of methods arises; concurrently, an urge for the analysis of the algorithms' performances emanates. The critical criteria in the analysis of prime number generation are the number of probes, the number of generated primes, and the average time required to produce one prime. Hence, the purpose of this thesis is to indicate the best performing algorithm. The survey of the methods, the establishment of the comparison criteria, and the comparison of approaches are the required steps to find the best performing algorithm. In the first step of this research, the methods were surveyed and classified using the approach described in Menezes [66]. While chapter 2 sorted, described, compared, and summarized primality testing methods, chapter 3 did the same for prime number generating methods. In the next step, applying a uniform technique, computer programs were written for the selected algorithms. The programs were installed on a Unix (Sun 5.8) server to perform the computer experiments. The computer experiments' results pertaining to the selected algorithms provided the parameters required to compare the algorithms' performances. The results from the computer experiments were tabulated to compare the parameters and to indicate the best performing algorithm. The survey of methods indicated that deterministic and randomized approaches are the main approaches in prime number generation. Random number generation has found application in cryptographic key generation. Contemporaneously, a need for deterministically generated provable primes emerged in code encryption, decryption, and other cryptographic areas. The analysis of the algorithms' performances indicated that prime numbers generated through randomized techniques required a smaller number of probes. This is due to the method that eliminates non-primes in the initial step, pre-testing randomly generated candidates for small divisibility factors. The analysis indicated that a smaller number of probes increases an algorithm's efficiency. Further analysis indicated that the ratio of randomly generated primes to the expected number of primes in a specific interval is smaller than for deterministically generated primes. In this comparison, the Miller-Rabin and Gordon algorithms, which randomly generate primes, were compared against the SFA and the Sequences Containing Primes algorithms. The name Sequences Containing Primes is abbreviated in this thesis as 6kseq. In the interval [99000, 100000] the Miller-Rabin method generated 57 out of 87 expected primes, while the SFA algorithm generated 83 out of 87 approximated primes. The expected number of primes was computed using the approximation n/ln(n) presented by Menezes [66]. The average time consumed to originate one prime in the [99000, 100000] interval was 0.056 [s] for the Miller-Rabin test, 0.0001 [s] for SFA, and 0.0003 [s] for 6kseq. Gordon's algorithm in the interval [1, 100000] required 100578 probes and generated 32 out of 8686 expected primes.
    The algorithm Parametric Representation of Composite Twins and Generation of Prime and Quasi Prime Numbers, invented by Doctor Verkhovsky [108], verifies and generates primes and quasi primes using special mathematical constructs. This algorithm indicated the best performance in the interval [1, 1000], generating and verifying 3585 variants of provable primes or quasi primes. The Parametric Representation of Composite Twins algorithm consumed an average time per prime or quasi prime of 0.0022315 [s]. The Parametric Representation of Composite Twins and Generation of Prime and Quasi Prime Numbers algorithm implements a unique method of testing both primes and quasi-primes. Because of the uniqueness of this method, the algorithm cannot be compared with the other primality testing or prime number generating algorithms. The ((a!)^2)*((-1)^b) Function in Generating Primes algorithm [105], developed by Doctor Verkhovsky, was compared against the extended Fermat algorithm. In the range [1, 1000] the [105] algorithm consumed an average of 0.00001 [s] per prime and originated 167 primes, while the extended Fermat algorithm also produced 167 primes but consumed an average of 0.00599 [s] per prime. Thus, the computer experiments and the comparison of methods showed that the SFA algorithm is deterministic and originates provable primes. The survey of methods and the analysis of selected approaches indicated that the SFA sieve algorithm, which sequentially generates primes, is computationally efficient, showing better performance in computational speed, simplicity of method, and the number of primes generated in the specified intervals.
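    To ground the comparison, a generic sketch (not the thesis code; the SFA algorithm itself is not specified in the abstract) of the two families being benchmarked: a deterministic sieve that enumerates every prime in an interval, and a randomized search that pre-screens candidates by trial division before spending expensive probable-prime probes.

```python
import random

def sieve_interval(lo: int, hi: int) -> list[int]:
    """Deterministic family: sieve of Eratosthenes restricted to [lo, hi]."""
    flags = [True] * (hi + 1)
    flags[0:2] = [False, False]
    for p in range(2, int(hi ** 0.5) + 1):
        if flags[p]:
            for m in range(p * p, hi + 1, p):
                flags[m] = False
    return [i for i in range(lo, hi + 1) if flags[i]]

SMALL_PRIMES = (3, 5, 7, 11, 13, 17, 19, 23, 29)

def random_probable_prime(lo: int, hi: int, probes: int = 20) -> int:
    """Randomized family: random odd candidate, cheap trial-division
    pre-test, then probable-prime probes (Fermat rounds here for
    brevity; Miller-Rabin in serious use)."""
    while True:
        n = random.randrange(lo | 1, hi, 2)
        if any(n % p == 0 for p in SMALL_PRIMES):
            continue                  # rejected without an expensive probe
        if all(pow(random.randrange(2, n - 1), n - 1, n) == 1
               for _ in range(probes)):
            return n

# The interval from the experiments above: roughly 1000/ln(100000) ≈ 87
# primes are expected between 99000 and 100000.
print(len(sieve_interval(99000, 100000)), random_probable_prime(99000, 100000))
```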