
    On the prediction of pseudo relative permeability curves: meta-heuristics versus Quasi-Monte Carlo

    Get PDF
    This article reports the first application of the Quasi-Monte Carlo (QMC) method to the estimation of pseudo relative permeability curves. In this regard, the performance of several meta-heuristic algorithms, including the Genetic Algorithm (GA), Particle Swarm Optimization (PSO), and the Artificial Bee Colony (ABC), has also been compared against QMC. The mechanism by which each method minimizes the objective function has been studied. QMC outperformed its counterparts in terms of accuracy and in sweeping the entire search domain efficiently. Nevertheless, its computational time exceeds that of the meta-heuristic algorithms.
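    The abstract gives no implementation details; as a rough, hedged illustration of how a QMC sample can sweep a search domain when minimizing an objective function, the sketch below uses SciPy's scrambled Sobol generator. The objective, bounds, and sample size are hypothetical placeholders, not the article's permeability-curve misfit.

        # Minimal sketch (not the article's code): sweep a search domain with a Sobol
        # low-discrepancy sample and keep the point with the smallest objective value.
        import numpy as np
        from scipy.stats import qmc

        def objective(x):
            # Hypothetical objective; a misfit between simulated and reference
            # pseudo relative permeability curves would go here.
            return np.sum((x - 0.3) ** 2, axis=1)

        lower = np.array([0.0, 0.0, 0.0])
        upper = np.array([1.0, 1.0, 1.0])

        sampler = qmc.Sobol(d=3, scramble=True, seed=0)
        points = qmc.scale(sampler.random_base2(m=10), lower, upper)  # 2^10 points

        values = objective(points)
        best = points[np.argmin(values)]
        print("best point:", best, "objective:", values.min())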

    Architectures for Code-based Post-Quantum Cryptography

    Get PDF
    The abstract is provided in the attachment.

    A Framework for Efficient Adaptively Secure Composable Oblivious Transfer in the ROM

    Get PDF
    Oblivious Transfer (OT) is a fundamental cryptographic protocol that finds a number of applications, in particular, as an essential building block for two-party and multi-party computation. We construct a round-optimal (2 rounds) universally composable (UC) protocol for oblivious transfer secure against active adaptive adversaries from any OW-CPA secure public-key encryption scheme with certain properties in the random oracle model (ROM). In terms of computation, our protocol only requires the generation of a public/secret-key pair, two encryption operations and one decryption operation, apart from a few calls to the random oracle. In terms of communication, our protocol only requires the transfer of one public-key, two ciphertexts, and three binary strings of roughly the same size as the message. Next, we show how to instantiate our construction under the low noise LPN, McEliece, QC-MDPC, LWE, and CDH assumptions. Our instantiations based on the low noise LPN, McEliece, and QC-MDPC assumptions are the first UC-secure OT protocols based on coding assumptions to achieve: 1) adaptive security, 2) optimal round complexity, 3) low communication and computational complexities. Previous results in this setting only achieved static security and used costly cut-and-choose techniques. Our instantiation based on CDH achieves adaptive security at the small cost of communicating only two more group elements as compared to the gap-DH based Simplest OT protocol of Chou and Orlandi (Latincrypt 15), which only achieves static security in the ROM.
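    The paper's UC-secure construction is not reproduced here. As a hedged illustration of the stated cost profile (one useful key pair, two encryptions, one decryption), the toy sketch below shows the message pattern of the classic, naive PKE-based 1-out-of-2 OT, with RSA-OAEP standing in for the OW-CPA scheme; unlike the paper's protocol it sends two public keys, gives no UC or adaptive security, and all function names are hypothetical.

        # Toy sketch only: the message pattern of the naive PKE-based 1-out-of-2 OT.
        # NOT the paper's protocol; a receiver who keeps both secret keys breaks it.
        from cryptography.hazmat.primitives.asymmetric import rsa, padding
        from cryptography.hazmat.primitives import hashes

        OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                            algorithm=hashes.SHA256(), label=None)

        def receiver_keys(choice):
            """Receiver knows the secret key only for slot `choice`."""
            sk = rsa.generate_private_key(public_exponent=65537, key_size=2048)
            dummy = rsa.generate_private_key(public_exponent=65537, key_size=2048)
            # In a real protocol the second key must be obliviously sampled
            # (e.g. via the random oracle) so nobody knows its secret key;
            # here the dummy secret key is simply discarded.
            pks = [None, None]
            pks[choice] = sk.public_key()
            pks[1 - choice] = dummy.public_key()
            return sk, pks

        def sender_encrypt(pks, m0, m1):
            """Sender encrypts each message under the corresponding public key."""
            return pks[0].encrypt(m0, OAEP), pks[1].encrypt(m1, OAEP)

        choice = 1
        sk, pks = receiver_keys(choice)              # receiver -> sender: public keys
        c0, c1 = sender_encrypt(pks, b"m0", b"m1")   # sender -> receiver: c0, c1
        learned = sk.decrypt(c1 if choice else c0, OAEP)
        print(learned)                               # b"m1"; an honest receiver cannot open c0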

    Optimization in Quasi-Monte Carlo Methods for Derivative Valuation

    No full text
    Computational complexity in financial theory and practice has risen immensely in recent years. Monte Carlo simulation has proved to be a robust and adaptable approach, well suited to supplying numerical solutions to a large class of complex problems. Although Monte Carlo simulation has been widely applied in the pricing of financial derivatives, it has been argued that sampling the relevant region as uniformly as possible is very important. This led to the development of quasi-Monte Carlo methods, which use deterministic points to minimize the integration error. A major disadvantage of low-discrepancy number generators is that they tend to lose their homogeneous coverage as the dimensionality increases. This thesis develops a novel approach to quasi-Monte Carlo methods for evaluating complex financial derivatives more accurately by optimizing the sample coordinates so as to minimize the discrepancies that appear when using low-discrepancy sequences. The main focus is to develop new methods to optimize the sample coordinate vector, and to test their performance against existing quasi-Monte Carlo methods in pricing complicated multidimensional derivatives. Three new methods are developed: the Gear, Simulated Annealing, and Stochastic Tunneling methods. These methods are used to evaluate complex multi-asset financial derivatives (geometric average and rainbow options) for dimensions up to 2000. It is shown that the two stochastic methods, Simulated Annealing and Stochastic Tunneling, perform better than the existing quasi-Monte Carlo methods of Faure and Sobol'. This difference in performance is more evident in higher dimensions, particularly when a low number of points is used in the Monte Carlo simulations. Overall, the Stochastic Tunneling method yields the smallest percentage root mean square relative error and requires less computational time to converge to a global solution, proving to be the most promising method for pricing complex derivatives.
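    The thesis' optimized sequences (Gear, Simulated Annealing, Stochastic Tunneling) are not reproduced here; the sketch below is only a plain scrambled-Sobol baseline for pricing a geometric-average (Asian) call under Black-Scholes dynamics, the kind of multidimensional integrand the thesis targets. Contract parameters, dimension, and sample size are hypothetical.

        # Minimal baseline sketch (not the thesis' optimized methods): price a
        # geometric-average Asian call with a scrambled Sobol sequence.
        import numpy as np
        from scipy.stats import norm, qmc

        S0, K, r, sigma, T, d = 100.0, 100.0, 0.05, 0.2, 1.0, 16   # hypothetical contract
        dt = T / d

        sobol = qmc.Sobol(d=d, scramble=True, seed=1)
        u = sobol.random_base2(m=14)                 # 2^14 quasi-random points in [0, 1)^d
        z = norm.ppf(np.clip(u, 1e-12, 1 - 1e-12))   # map to standard normal increments

        # Simulate log-price paths and the geometric average of the monitored prices.
        log_paths = np.log(S0) + np.cumsum((r - 0.5 * sigma**2) * dt
                                           + sigma * np.sqrt(dt) * z, axis=1)
        geo_avg = np.exp(log_paths.mean(axis=1))

        payoff = np.maximum(geo_avg - K, 0.0)
        price = np.exp(-r * T) * payoff.mean()
        print(f"QMC estimate of the geometric-average call price: {price:.4f}")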

    Hardware Architectures for Post-Quantum Cryptography

    Get PDF
    The rapid development of quantum computers poses severe threats to many commonly used cryptographic algorithms that are embedded in different hardware devices to ensure the security and privacy of data and communication. In the search for new solutions that are potentially resistant against attacks from quantum computers, a new research field called Post-Quantum Cryptography (PQC) has emerged; that is, cryptosystems deployed on classical computers that are conjectured to be secure against attacks utilizing large-scale quantum computers. In order to secure data during storage and communication, as well as many other future applications, this dissertation focuses on the design, implementation, and evaluation of efficient PQC schemes in hardware. Four PQC algorithms, each from a different family, are studied in this dissertation.
    The first hardware architecture presented in this dissertation is focused on the code-based scheme Classic McEliece. The research presented here is the first that builds a hardware architecture for the Classic McEliece cryptosystem. This research successfully demonstrated that a complex code-based PQC algorithm can be run efficiently on hardware. Furthermore, this dissertation shows that the implementation of this scheme on hardware can be easily tuned to different configurations by implementing support for flexible choices of security parameters as well as configurable hardware performance parameters. The successful prototype of the Classic McEliece scheme on hardware increased confidence in this scheme, and helped Classic McEliece to be recognized as one of the seven finalists in the third round of the NIST PQC standardization process.
    While Classic McEliece serves as a ready-to-use candidate for many high-end applications, PQC solutions are also needed for low-end embedded devices. Embedded devices play an important role in our daily life. Despite their typically constrained resources, these devices require strong security measures to protect them against cyber attacks. Toward securing this type of device, the second research direction presented in this dissertation focuses on the hash-based digital signature scheme XMSS. This research is the first that explores and presents a practical hardware-based XMSS solution for low-end embedded devices. In the design of the XMSS hardware, a heterogeneous software-hardware co-design approach was adopted, which combined the flexibility of the soft core with the acceleration from the hard core. The practicability and efficiency of the XMSS software-hardware co-design are further demonstrated by providing a hardware prototype on an open-source RISC-V based System-on-a-Chip (SoC) platform.
    The third research direction covered in this dissertation focuses on lattice-based cryptography, which represents one of the most promising and popular alternatives to today's widely adopted public-key solutions. Prior research has presented hardware designs targeting the computing blocks that are necessary for the implementation of lattice-based systems. However, a recurrent issue in most existing designs is that they are not fully scalable or parameterized, and are hence limited to specific cryptographic primitives and security parameter sets. The research presented in this dissertation is the first that develops hardware accelerators designed to be fully parameterized to support different lattice-based schemes and parameters. Further, these accelerators are utilized to realize the first software-hardware co-design of provably-secure instances of qTESLA, which is a lattice-based digital signature scheme. This dissertation demonstrates that even demanding, provably-secure schemes can be realized efficiently with proper use of software-hardware co-design.
    The final research presented in this dissertation is focused on the isogeny-based scheme SIKE, which recently made it to the final round of the PQC standardization process. This research shows that hardware accelerators can be designed to offload compute-intensive elliptic curve and isogeny computations to hardware in a versatile fashion. These hardware accelerators are designed to be fully parameterized to support different security parameter sets of SIKE as well as flexible hardware configurations targeting different user applications. This research is the first that presents versatile hardware accelerators for SIKE that can be mapped efficiently to both FPGA and ASIC platforms. Based on these accelerators, an efficient software-hardware co-design is constructed for speeding up SIKE. In the end, this dissertation demonstrates that, despite being embedded with expensive arithmetic, the isogeny-based SIKE scheme can be run efficiently by exploiting specialized hardware.
    These four research directions combined demonstrate the practicability of building efficient hardware architectures for complex PQC algorithms. The exploration of efficient PQC solutions for different hardware platforms will eventually help migrate high-end servers and low-end embedded devices towards the post-quantum era.

    Tree based credible set estimation

    Get PDF
    Estimating a joint Highest Posterior Density (HPD) credible set for a multivariate posterior density is challenging as the dimension gets larger. Credible intervals for univariate marginals are usually presented for ease of computation and visualisation. There are often two layers of approximation, as we may need to compute a credible set for a target density which is itself only an approximation to the true posterior density. We obtain joint HPD credible sets for the density estimation trees of Li et al. (2016), approximating a density truncated to a compact subset of R^d, as this is preferred to a copula construction. These trees approximate a joint posterior distribution from posterior samples using a piecewise constant function defined by sequential binary splits. We use a consistent estimator to measure the symmetric difference between our credible set estimate and the true HPD set of the target density. This quality measure can be computed without the need to know the true set. We show how the true-posterior-coverage of an approximate credible set estimated for an approximate target density may be estimated in doubly intractable cases where posterior samples are not available. We illustrate our methods with simulation studies and find that our estimator is competitive with existing methods.
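    The paper's tree-based estimator is not reproduced here; as a hedged sketch of the generic idea of estimating an HPD set from posterior samples, the code below thresholds a plug-in density estimate (a Gaussian KDE stands in for the density estimation tree) so that the desired posterior mass lies above the threshold. The toy 2-dimensional posterior and the 95% level are hypothetical.

        # Minimal sketch (KDE-based, not the paper's density estimation tree):
        # approximate a 95% HPD credible set from posterior samples by thresholding
        # the estimated density at the 5th percentile of its values at the draws.
        import numpy as np
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(0)
        samples = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.6], [0.6, 1.0]], size=5000)

        kde = gaussian_kde(samples.T)            # plug-in density estimate
        dens = kde(samples.T)                    # density evaluated at each posterior draw
        level = np.quantile(dens, 0.05)          # threshold so ~95% of draws lie above it

        def in_hpd(points):
            """Indicator of the estimated 95% HPD set for an array of shape (n, 2)."""
            return kde(np.atleast_2d(points).T) >= level

        print("fraction of draws inside the estimated set:", in_hpd(samples).mean())
        print("is the origin inside?", bool(in_hpd(np.array([[0.0, 0.0]]))[0]))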

    Quantum algorithms for problems in number theory, algebraic geometry, and group theory

    Full text link
    Quantum computers can execute algorithms that sometimes dramatically outperform classical computation. Undoubtedly the best-known example of this is Shor's discovery of an efficient quantum algorithm for factoring integers, whereas the same problem appears to be intractable on classical computers. Understanding what other computational problems can be solved significantly faster using quantum algorithms is one of the major challenges in the theory of quantum computation, and such algorithms motivate the formidable task of building a large-scale quantum computer. This article will review the current state of quantum algorithms, focusing on algorithms for problems with an algebraic flavor that achieve an apparent superpolynomial speedup over classical computation. Comment: 20 pages; lecture notes for the 2010 Summer School on Diversities in Quantum Computation/Information at Kinki University.

    How to Backdoor (Classic) McEliece and How to Guard Against Backdoors

    Get PDF
    We show how to backdoor the McEliece cryptosystem such that a backdoored public key is indistinguishable from a usual public key, but allows one to efficiently retrieve the underlying secret key. For good cryptographic reasons, McEliece uses a small random seed that generates, via some pseudo-random generator (PRG), the randomness that determines the secret key. Our backdoor mechanism works by encoding an encryption of the seed into the public key. Retrieving the seed then allows one to efficiently recover the (backdoored) secret key. Interestingly, McEliece can itself be used to encrypt the seed, thereby protecting our backdoor mechanism with strong post-quantum security guarantees. Our construction also works for the current Classic McEliece NIST standard proposal for non-compressed secret keys, and therefore opens the door for widespread maliciously backdoored implementations. Fortunately, our backdoor mechanism can be detected by the owner of the (backdoored) secret key if the seed is stored after key generation, as specified by the Classic McEliece proposal. Thus, our results provide strong advice for implementers to store the seed inside the secret key and use it to guard against backdoor mechanisms.
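    The countermeasure described at the end (re-derive the key material from the stored seed and compare it with the published key) can be illustrated with a toy sketch. The hash-based keygen below is a stand-in, not Classic McEliece key generation, and the names prg, toy_keygen, and detect_backdoor are hypothetical.

        # Toy sketch of the seed-based check described in the abstract: if key
        # generation is deterministic in the seed, the key owner can re-derive the
        # public key from the stored seed and compare it with the published one.
        # The "keygen" here is a hash-based stand-in, NOT Classic McEliece.
        import hashlib

        def prg(seed: bytes, n: int) -> bytes:
            """Expand a seed into n pseudo-random bytes (toy PRG, SHA-256 in counter mode)."""
            out, counter = b"", 0
            while len(out) < n:
                out += hashlib.sha256(seed + counter.to_bytes(4, "big")).digest()
                counter += 1
            return out[:n]

        def toy_keygen(seed: bytes):
            """Derive a (public, secret) key pair deterministically from the seed."""
            randomness = prg(seed, 64)
            sk = randomness[:32]
            pk = hashlib.sha256(b"pk" + randomness).digest()   # stand-in public key
            return pk, sk

        def detect_backdoor(stored_seed: bytes, published_pk: bytes) -> bool:
            """Return True if the published public key does not match the seed-derived one."""
            expected_pk, _ = toy_keygen(stored_seed)
            return expected_pk != published_pk

        seed = b"\x01" * 32
        honest_pk, _ = toy_keygen(seed)
        tampered_pk = bytes([honest_pk[0] ^ 1]) + honest_pk[1:]   # key with embedded data

        print(detect_backdoor(seed, honest_pk))    # False: key matches the stored seed
        print(detect_backdoor(seed, tampered_pk))  # True: mismatch flags a possible backdoor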

    A survey on adaptive random testing

    Get PDF
    Random testing (RT) is a well-studied testing method that has been widely applied to the testing of many applications, including embedded software systems, SQL database systems, and Android applications. Adaptive random testing (ART) aims to enhance RT's failure-detection ability by more evenly spreading the test cases over the input domain. Since its introduction in 2001, there have been many contributions to the development of ART, including various approaches, implementations, assessment and evaluation methods, and applications. This paper provides a comprehensive survey on ART, classifying techniques, summarizing application areas, and analyzing experimental evaluations. This paper also addresses some misconceptions about ART, and identifies open research challenges to be further investigated in future work.
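    As a hedged illustration of the core ART idea the survey describes (spreading test cases more evenly than pure random testing), the sketch below implements the classic fixed-size-candidate-set strategy (FSCS-ART) for a numeric input domain; the domain, dimension, and candidate-set size are hypothetical.

        # Minimal sketch of fixed-size-candidate-set ART (FSCS-ART): each new test
        # case is the random candidate farthest from the already-executed test
        # cases, which spreads tests more evenly than pure random testing.
        import math
        import random

        def fscs_art(n_tests, dim=2, k=10, lo=0.0, hi=1.0, seed=0):
            rng = random.Random(seed)
            draw = lambda: tuple(rng.uniform(lo, hi) for _ in range(dim))
            executed = [draw()]                      # first test case is purely random
            while len(executed) < n_tests:
                candidates = [draw() for _ in range(k)]
                # Pick the candidate whose nearest executed test case is farthest away.
                best = max(candidates,
                           key=lambda c: min(math.dist(c, e) for e in executed))
                executed.append(best)
            return executed

        for test_case in fscs_art(n_tests=20)[:5]:
            print(test_case)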