
    Number Theoretic Transform and Its Applications in Lattice-based Cryptosystems: A Survey

    The number theoretic transform (NTT) is the most efficient method for multiplying two high-degree polynomials with integer coefficients, owing to a series of algorithmic and implementation advantages, and it is consequently widely used and fundamental in practical implementations of lattice-based cryptographic schemes. In particular, recent works have shown that NTT can be utilized even in schemes without NTT-friendly rings, where it can outperform other multiplication algorithms. In this paper, we first review the basic concepts of polynomial multiplication, convolution and NTT. Subsequently, we systematically introduce the basic radix-2 fast NTT algorithms in an algebraic way via the Chinese Remainder Theorem. We then elaborate on recent advances in methods for weakening the restrictions on NTT parameter conditions. Furthermore, we systematically discuss how to choose an appropriate NTT strategy for various given rings. Later, we introduce the applications of NTT in the lattice-based cryptographic schemes of the NIST post-quantum cryptography standardization competition. Finally, we present some possible future research directions.
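    To make the transform concrete, here is a minimal, self-contained sketch of NTT-based polynomial multiplication for the simplest (cyclic, x^n - 1) case; the prime q = 7681 and length n = 8 are illustrative toy parameters, not taken from the survey, and lattice schemes typically need the negacyclic (x^n + 1) variant discussed there.

```python
q = 7681                  # toy NTT-friendly prime: q - 1 = 7680 = 2^9 * 3 * 5
n = 8                     # power-of-two transform length; n divides q - 1

def primitive_root(q, factors):
    """Brute-force search for a generator of the multiplicative group mod q."""
    for g in range(2, q):
        if all(pow(g, (q - 1) // p, q) != 1 for p in factors):
            return g

def ntt(a, root):
    """Recursive radix-2 Cooley-Tukey NTT over Z_q; len(a) must be a power of two."""
    if len(a) == 1:
        return list(a)
    even = ntt(a[0::2], root * root % q)   # subtransforms use the squared root
    odd = ntt(a[1::2], root * root % q)
    half, out, w = len(a) // 2, [0] * len(a), 1
    for i in range(half):
        t = w * odd[i] % q                 # butterfly: combine even/odd halves
        out[i] = (even[i] + t) % q
        out[i + half] = (even[i] - t) % q
        w = w * root % q
    return out

def poly_mul_cyclic(a, b):
    """a * b mod (x^n - 1, q): forward NTT, pointwise product, inverse NTT."""
    g = pow(primitive_root(q, [2, 3, 5]), (q - 1) // n, q)  # primitive n-th root of unity
    c_hat = [x * y % q for x, y in zip(ntt(a, g), ntt(b, g))]
    c = ntt(c_hat, pow(g, q - 2, q))       # inverse transform uses g^{-1} (Fermat)
    n_inv = pow(n, q - 2, q)               # undo the factor of n
    return [x * n_inv % q for x in c]

# Inputs of degree < n/2, so the cyclic wraparound does not kick in:
print(poly_mul_cyclic([1, 2, 3, 0, 0, 0, 0, 0], [4, 5, 0, 0, 0, 0, 0, 0]))
# -> [4, 13, 22, 15, 0, 0, 0, 0], i.e. (1 + 2x + 3x^2)(4 + 5x)
```

    The pointwise multiplication in the transformed domain is what reduces the cost from the O(n^2) of schoolbook multiplication to O(n log n), which is the efficiency advantage the survey builds on.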

    Post Quantum Cryptography

    Master's thesis summary: Post Quantum Cryptography. Candidate: VAIRA Antonio. While writing my thesis, on which I worked during this final year of my university career, I tried to answer the question: what is the state of the art of present-day cryptography capable of withstanding an attack by a malicious user in possession of a "large" quantum computer? By "large" we mean a quantum computer with a register of several thousand qubits, and thus able to run quantum algorithms that are actually usable. To date, only two quantum algorithms have been devised for cryptanalytic purposes, and one of them, Shor's algorithm, makes it possible to factor large numbers in a short time. This "simple" mathematical (or rather algorithmic) problem underlies the security of the public-key cryptosystems in use today. In a present-day public-key system such as RSA, the difficulty of breaking the system is tied to the algorithmic difficulty of factoring large numbers, where difficulty implies not impossibility but rather an expenditure of resources and time greater than the value of the information the attack would yield. In my work I first examined, in broad terms, both the cryptosystems used today and the quantum algorithms that could be used in a hypothetical future to compromise them. I then focused my attention on the "mainstream" alternatives to the cryptosystems employed today, namely the post-quantum algorithms proposed by the community of cryptanalysts active in the field, and I tried to build a framework for evaluating their practical deployment. Within this framework I chose to study in depth the lattice-based family (algorithms whose security rests on the hardness of solving problems on multidimensional lattices). Within this family of cryptosystems I identified one in particular, NTRU, as especially promising for deployment in the near future within a corporate PKI (an example close to me is the PKI developed within Airbus, where I worked during this past year). I then studied another algorithm belonging to the same lattice-based family: ring-LWE, which is much more interesting from an academic point of view but still relatively unknown. I subsequently devised some modifications to this algorithm aimed at making it faster and more reliable, thereby increasing the probability of recovering the correct message from the corresponding ciphertext. As a final step, I implemented the modified encryption algorithm in a high-level language (very close to Python) and compared both the execution times and the resources used against an unmodified version implemented in the same language, making the comparison as consistent as possible. The results show that the modified version of ring-LWE is extremely promising but needs deeper cryptanalytic scrutiny: the algorithm must be stress-tested to determine whether or not it introduces additional vulnerabilities compared with the original version. In conclusion, the ring-LWE algorithm, which I studied most closely, offers several interesting points for reflection but is far from ready to be deployed in a real infrastructure.
Should an immediate replacement ever be needed, it is generally good practice in cryptography to fall back on well-trodden paths and older, trusted implementations; among the post-quantum algorithms, one such example is certainly NTRU (an IEEE standard, Std 1363.1, since 2008). As a final personal reflection, I would add that although the physics content of this thesis may appear marginal, it is only thanks to the training acquired during my studies that I was able to carry it out without great difficulty, while gaining a truly constructive experience in a large company such as Airbus. After all, a background in physics makes us "problem-solvers" in the most varied situations.
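    For orientation, a toy sketch of the textbook ring-LWE ("LPR") encryption scheme that the thesis builds on follows; the parameters and the ternary noise are illustrative assumptions, far too small to be secure, and the thesis's actual modifications are not reproduced here.

```python
import random

n, q = 16, 12289          # toy ring dimension and modulus for R_q = Z_q[x]/(x^n + 1)

def poly_mul(a, b):
    """Schoolbook negacyclic product a*b mod (x^n + 1, q)."""
    c = [0] * n
    for i in range(n):
        for j in range(n):
            k = i + j
            if k < n:
                c[k] = (c[k] + a[i] * b[j]) % q
            else:              # x^n = -1 wraps around with a sign flip
                c[k - n] = (c[k - n] - a[i] * b[j]) % q
    return c

def add(a, b):
    return [(x + y) % q for x, y in zip(a, b)]

def small():
    """Small noise polynomial with coefficients in {-1, 0, 1}."""
    return [random.randint(-1, 1) for _ in range(n)]

def keygen():
    a = [random.randrange(q) for _ in range(n)]
    s, e = small(), small()
    b = add(poly_mul(a, s), e)             # public key: b = a*s + e
    return (a, b), s

def encrypt(pk, bits):
    a, b = pk
    r, e1, e2 = small(), small(), small()
    m = [(q // 2) * bit for bit in bits]   # encode each bit as 0 or q/2
    u = add(poly_mul(a, r), e1)
    v = add(add(poly_mul(b, r), e2), m)
    return u, v

def decrypt(sk, ct):
    u, v = ct
    d = [(x - y) % q for x, y in zip(v, poly_mul(u, sk))]    # v - u*s ≈ m + noise
    return [1 if q // 4 < x < 3 * q // 4 else 0 for x in d]  # round to nearest encoding

pk, sk = keygen()
bits = [random.randint(0, 1) for _ in range(n)]
assert decrypt(sk, encrypt(pk, bits)) == bits  # holds w.h.p. at this noise level
```

    Decryption works because v - u*s = m + (e*r + e2 - e1*s): as long as the accumulated noise stays below q/4 per coefficient, rounding recovers every bit, and this decryption-failure probability is exactly what the thesis's modifications aim to improve.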

    The Cryptographic Imagination

    Originally published in 1996. In The Cryptographic Imagination, Shawn Rosenheim uses the writings of Edgar Allan Poe to pose a set of questions pertaining to literary genre, cultural modernity, and technology. Rosenheim argues that Poe's cryptographic writing—his essays on cryptography and the short stories that grew out of them—requires that we rethink the relation of poststructural criticism to Poe's texts and, more generally, reconsider the relation of literature to communication. Cryptography serves not only as a template for the language, character, and themes of much of Poe's late fiction (including his creation, the detective story) but also as a "secret history" of literary modernity itself. "Both postwar fiction and literary criticism," the author writes, "are deeply indebted to the rise of cryptography in World War II." Still more surprising, in Rosenheim's view, Poe is not merely a source for such literary instances of cryptography as the codes in Conan Doyle's "The Dancing-Men" or in Jules Verne, but, through his effect on real cryptographers, Poe's writing influenced the outcome of World War II and the development of the Cold War. However unlikely such ideas sound, The Cryptographic Imagination offers compelling evidence that Poe's cryptographic writing clarifies one important avenue by which the twentieth century called itself into being. "The strength of Rosenheim's work extends to a revisionistic understanding of the entirety of literary history (as a repression of cryptography) and then, in a breathtaking shift of register, interlinks Poe's exercises in cryptography with the hyperreality of the CIA, the Cold War, and the Internet. What enables this extensive range of applications is the stipulated tension Rosenheim discerns in the relationship between the forms of the literary imagination and the condition of its mode of production. Cryptography, in this account, names the technology of literary production—the diacritical relationship between decoding and encoding—that the literary imagination dissimulates as hieroglyphics—the hermeneutic relationship between a sign and its content."—Donald E. Pease, Dartmouth College

    Non-Markovian Dynamics in Continuous Variable Quantum Systems

    The present manuscript represents the completion of a research path carried forward during my doctoral studies at the University of Turku. It contains information regarding my scientific contribution to the field of open quantum systems, accomplished in collaboration with other scientists. The main subject investigated in the thesis is the non-Markovian dynamics of open quantum systems, with a focus on continuous variable quantum channels, e.g. quantum Brownian motion models. Non-Markovianity is here interpreted as a manifestation of the existence of a flow of information exchanged by the system and environment during the dynamical evolution. While in Markovian systems the flow is unidirectional, i.e. from the system to the environment, in non-Markovian systems there are time windows in which the flow is reversed and the quantum state of the system may regain coherence and correlations previously lost. Signatures of non-Markovian behaviour have been studied in connection with the dynamics of quantum correlations such as entanglement and quantum discord. Moreover, in an attempt to recognise non-Markovianity as a resource for quantum technologies, it is proposed, for the first time, to consider its effects in practical quantum key distribution protocols. It is proven that the security of coherent-state protocols can be enhanced using non-Markovian properties of the transmission channels. The thesis is divided into two parts: in the first part I introduce the reader to the world of continuous variable open quantum systems and non-Markovian dynamics. The second part consists of a collection of five publications pertinent to the topic.
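    The information-flow picture described above corresponds to the widely used trace-distance witness of Breuer, Laine and Piilo; the following standard definitions are given for orientation (the thesis may employ this or a related measure):

```latex
% Trace distance between two evolving system states:
%   D(\rho_1, \rho_2) = (1/2) || \rho_1 - \rho_2 ||_1.
% Its rate of change acts as an information-flow witness:
\[
  \sigma(t) = \frac{\mathrm{d}}{\mathrm{d}t}\,
              D\bigl(\rho_1(t), \rho_2(t)\bigr),
  \qquad
  \mathcal{N} = \max_{\rho_{1,2}(0)} \int_{\sigma(t) > 0} \sigma(t)\,\mathrm{d}t .
\]
% Markovian dynamics give \sigma(t) \le 0 at all times (one-way flow to the
% environment); the "time windows in which the flow is reversed" are exactly
% those with \sigma(t) > 0, and \mathcal{N} > 0 certifies non-Markovianity.
```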

    Rational Exchange Protocols

    An exchange protocol describes a sequence of steps by which several entities are capable of exchanging certain pieces of information in a particular context. Rational-exchange protocols serve that core purpose with several important advantages over the existing exchange paradigms, those referred to as fair-exchange solutions. Traditional fair-exchange protocols impose strong restrictions on the protocol execution context. They ensure fairness to participants, but at the expense of entities such as TTPs (trusted third parties) having to be involved in the exchange. By contrast, rational schemes, although not ensuring fairness, assure that rational entities would have no reason to deviate from the steps described in the protocol, and they have the enormous advantage of not needing the services of a TTP. Rational-exchange protocols therefore represent the only viable option in many modern ad-hoc and unstructured environments. The main goal of this thesis is to apply concepts from Game Theory to both the analysis and design of rational-exchange protocols. In our opinion, significant contributions have been made in both directions:
    - In terms of the formal analysis of these schemes, our work has focused on the proposal of two extensions to an existing formalism. The viability and effectiveness of our proposals is corroborated by the application of both formalisms to the analysis and verification of several exchange schemes.
    - With regard to the design of rational protocols, our approach is based on applying heuristic search to automate the process and to generate exchange protocols which can be proven rational within an underlying game-theoretical framework.
    Experimental work is carried out to illustrate the proposed methodology in a particular three-entity exchange scenario as well as in several randomized environments. Different heuristic techniques are implemented and their results compared, measuring success rates and the average number of protocols evaluated until an optimal solution is obtained. Furthermore, as a result of this experimental work, a whole family of multi-party rational exchange protocols is presented.
    For centuries, the rational behaviour of the human species has been studied extensively by philosophers, sociologists, psychologists, and others. Long treated as an abstract concept, it was only with the development of Game Theory in the mid-twentieth century that a mathematical framework first became available for formally defining the rational behaviour of the entities taking part in a game. Since then, Game Theory has become the mathematical model underpinning important results in fields as diverse as Biology, Economics, Artificial Intelligence, and Cryptography. This work belongs to the field of Rational Cryptography, which arises from the application of game-theoretic results to Cryptography. Nielsen et al. [Nielsen et al., 2007] survey the most significant advances achieved so far in this recently created area. Particularly relevant to this thesis are the works of Syverson [Syverson, 1998] and Buttyán et al. [Buttyán, 2001], focused respectively on the design and formal analysis of secure rational exchange protocols.
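    The rationality requirement stated above amounts to protocol-following being a Nash equilibrium: no party can gain by unilaterally deviating from the prescribed steps. A minimal sketch with hypothetical payoffs (not taken from the thesis) illustrates the check:

```python
# payoff[(action_A, action_B)] = (payoff_A, payoff_B); values are illustrative.
payoff = {
    ("follow", "follow"): (2, 2),    # both complete the exchange
    ("follow", "deviate"): (1, 1),   # a deviation aborts the exchange: no gain
    ("deviate", "follow"): (1, 1),
    ("deviate", "deviate"): (0, 0),
}

def is_rational(payoff):
    """Check that ('follow', 'follow') is a Nash equilibrium of the game."""
    pa, pb = payoff[("follow", "follow")]
    no_gain_a = payoff[("deviate", "follow")][0] <= pa   # A cannot profit alone
    no_gain_b = payoff[("follow", "deviate")][1] <= pb   # B cannot profit alone
    return no_gain_a and no_gain_b

print(is_rational(payoff))   # True: neither party benefits from deviating
```

    The thesis's heuristic search can be read as exploring candidate protocols until one whose induced game passes a check of this kind, within its full game-theoretical framework, is found.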

    Implementation and Evaluation of Algorithmic Skeletons: Parallelisation of Computer Algebra Algorithms

    This thesis presents design and implementation approaches for parallel algorithms of computer algebra. We use algorithmic skeletons as well as further approaches, such as data parallel arithmetic and actors. We have implemented skeletons for divide and conquer algorithms and for some special parallel loops that we call 'repeated computation with a possibility of premature termination'. We introduce in this thesis a rational data parallel arithmetic. We focus on parallel symbolic computation algorithms, for which our arithmetic provides a generic parallelisation approach. The implementation is carried out in Eden, a parallel functional programming language based on Haskell. This choice enables us to encode both the skeletons and the programs in the same language; moreover, it allows us to refrain from using two different languages, one for the implementation and one for the interface, in our implementation of computer algebra algorithms. Further, this thesis presents methods for the evaluation and estimation of parallel execution times. We partition the parallel execution time into two components: one accounts for the quality of the parallelisation, and we call it the 'parallel penalty'; the other is the sequential execution time. For the estimation, we predict both components separately, using statistical methods. This enables very confident estimations while using drastically fewer measurement points than other methods. We have applied both our evaluation and estimation approaches to the parallel programs presented in this thesis, and we have also used existing estimation methods. We developed divide and conquer skeletons for the implementation of fast parallel multiplication. We have implemented the Karatsuba algorithm, Strassen's matrix multiplication algorithm and the fast Fourier transform. The latter was used to implement polynomial convolution, which leads to a further fast multiplication algorithm. Especially for our implementation of Strassen's algorithm we have designed and implemented a divide and conquer skeleton based on actors. For the parallel fast Fourier transform we not only used new divide and conquer skeletons but also developed a map-and-transpose skeleton, which enables good parallelisation of the Fourier transform. The parallelisation of Karatsuba multiplication shows very good performance. We have analysed the parallel penalty of our programs and compared it to the serial fraction, an approach known from the literature. We also performed execution time estimations of our divide and conquer programs. This thesis presents a parallel map+reduce skeleton scheme. It allows us to combine the usual parallel map skeletons, like parMap, farm and workpool, with a premature termination property. We use this to implement the so-called 'parallel repeated computation', a special form of a speculative parallel loop. We have implemented two probabilistic primality tests, the Rabin–Miller test and the Jacobi sum test, and parallelised both with our approach. We analysed the task distribution and stated fitting configurations for the Jacobi sum test. We have shown formally that the Jacobi sum test can be implemented in parallel. Subsequently, we parallelised it, analysed the load balancing issues, and produced an optimisation; the latter enabled a good implementation, as verified using the parallel penalty. We have also estimated the performance of the tests for further input sizes and numbers of processing elements.
Parallelisation of the Jacobi sum test and our generic parallelisation scheme for the repeated computation are our original contributions. The data parallel arithmetic was defined not only for integers, which is already known, but also for rationals. We handled common factors of the numerator or denominator of the fraction with the modulus in a novel manner; this is required to obtain a true multiple-residue arithmetic, a novel result of our research. Using these mathematical advances, we have parallelised determinant computation using Gauß elimination. As always, we have performed a task distribution analysis and an estimation of the parallel execution time of our implementation. A similar computation in Maple emphasised the potential of our approach. Data parallel arithmetic enables the parallelisation of entire classes of computer algebra algorithms. Summarising, this thesis presents and thoroughly evaluates new and existing design decisions for high-level parallelisations of computer algebra algorithms.
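    As a point of reference, the Karatsuba recursion mentioned above can be sketched sequentially as follows; this plain Python version is an illustration only, whereas the thesis implements it as a parallel divide and conquer skeleton in Eden, distributing the three half-size products across processing elements.

```python
def karatsuba(x, y):
    """Multiply non-negative integers with three half-size products instead of four."""
    if x < 10 or y < 10:                         # small operands: multiply directly
        return x * y
    m = max(x.bit_length(), y.bit_length()) // 2
    xh, xl = x >> m, x & ((1 << m) - 1)          # split x = xh * 2^m + xl
    yh, yl = y >> m, y & ((1 << m) - 1)
    hi = karatsuba(xh, yh)                       # three recursive products...
    lo = karatsuba(xl, yl)
    mid = karatsuba(xh + xl, yh + yl) - hi - lo  # ...recover the cross term
    return (hi << (2 * m)) + (mid << m) + lo

assert karatsuba(12345678, 87654321) == 12345678 * 87654321
```

    The divide and conquer structure is what the thesis's skeletons capture generically: the same combine step applies whether the three subproducts are computed sequentially, as here, or in parallel.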

    Annales Mathematicae et Informaticae 2020
