30 research outputs found

    Applications of a quantum random number generator to simulations in condensed matter physics

    We study the importance of the quality of random numbers in Monte Carlo simulations of 2D Ising systems. Simulations are carried out at the critical temperature to find the dynamic scaling law of the linear relaxation time. Our aim is to show that statistical correlations that appear in large Ising simulations performed with pseudorandom numbers can be corrected using a quantum random number generator (QRNG). To achieve high speeds and large systems, Ising lattices are simulated on a field-programmable gate array (FPGA) with an optical QRNG.

    Randomized Algorithms over Finite Fields for the Exact Parity Base Problem

    We present three randomized pseudo-polynomial algorithms for the problem of finding a base of specified value in a weighted represented matroid subject to parity conditions. These algorithms, the first two being improved versions of those presented by P. M. Camerini et al. (1992, J. Algorithms 13, 258–273), use fast arithmetic working over a finite field chosen at random among a set of appropriate fields. We show that the choice of a best algorithm among those presented depends on a conjecture related to the best value of the so-called Linnik constant concerning the distribution of prime numbers in arithmetic progressions. This conjecture, which we call the C-conjecture, is a strengthened version of a conjecture formulated in 1934 by S. Chowla. If the C-conjecture is true, the choice of a best algorithm is simple, since the last algorithm exhibits the best performance, whether performance is measured in arithmetic operations or, under mild assumptions, in bit operations. If the C-conjecture is false, we are still able to identify a best algorithm, but in this case the choice is between the first two algorithms and depends on the asymptotic growth of m with respect to that of U and n, where 2n, 2m, and U are the rank, the number of elements, and the maximum weight assigned to the elements of the matroid, respectively.
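    A core ingredient of such randomized pseudo-polynomial algorithms is arithmetic over a randomly chosen finite field, typically used to decide whether a polynomial encoding the combinatorial structure is identically zero. A minimal Schwartz–Zippel-style sketch of that generic ingredient (not the paper's algorithms themselves; names are illustrative):

```python
import random

def is_probably_zero(poly, nvars, prime, trials=20, rng=None):
    """Schwartz–Zippel randomized zero test over GF(prime): a nonzero
    polynomial of degree d < prime evaluates to zero at a uniformly
    random point with probability at most d / prime per trial."""
    rng = rng or random.Random()
    for _ in range(trials):
        point = tuple(rng.randrange(prime) for _ in range(nvars))
        if poly(point) % prime != 0:
            return False  # witness found: definitely nonzero
    return True  # very probably the zero polynomial
```

The field size (here a fixed prime) is exactly where results on primes in arithmetic progressions, and hence the Linnik constant, enter the running-time analysis.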

    Master index: volumes 31–40


    Privacy-preserving record linkage using Bloom filters

    Background: Combining multiple databases with disjunctive or additional information on the same person is occurring increasingly throughout research. If unique identification numbers for these individuals are not available, probabilistic record linkage is used for the identification of matching record pairs. In many applications, identifiers have to be encrypted due to privacy concerns.
    Methods: A new protocol for privacy-preserving record linkage with encrypted identifiers, allowing for errors in identifiers, has been developed. The protocol is based on Bloom filters on q-grams of identifiers.
    Results: Tests on simulated and actual databases yield linkage results comparable to non-encrypted identifiers and superior to results from phonetic encodings.
    Conclusion: We propose a protocol for privacy-preserving record linkage with encrypted identifiers that allows for errors in identifiers. Since the protocol can easily be enhanced and has a low computational burden, it might be useful for many applications requiring privacy-preserving record linkage.
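    The idea can be sketched directly: each identifier is split into q-grams, every q-gram sets k positions of an m-bit Bloom filter via double hashing, and encoded records are compared with a set similarity such as the Dice coefficient. Parameters (m, k, q) and the hash choice here are illustrative, not the protocol's.

```python
import hashlib

def bloom_encode(name, q=2, m=1000, k=20):
    """Set k bit positions per q-gram of `name` in an m-bit Bloom
    filter, using double hashing over two standard digests."""
    padded = " " + name.lower() + " "  # pad so edges form q-grams too
    bits = set()
    for i in range(len(padded) - q + 1):
        gram = padded[i:i + q].encode()
        h1 = int.from_bytes(hashlib.md5(gram).digest()[:8], "big")
        h2 = int.from_bytes(hashlib.sha1(gram).digest()[:8], "big")
        for j in range(k):
            bits.add((h1 + j * h2) % m)
    return bits

def dice(a, b):
    """Dice coefficient of two filters, kept here as sets of set bits."""
    return 2 * len(a & b) / (len(a) + len(b))
```

Because similar spellings share most q-grams, their filters share most bit positions, which is what lets the linkage tolerate typographical errors in the encrypted identifiers.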

    A comparison of several algorithms for the single individual SNP haplotyping reconstruction problem

    Motivation: Single nucleotide polymorphisms are the most common form of variation in human DNA and are involved in many research fields, from molecular biology to medical therapy. The technological opportunity to deal with long DNA sequences using shotgun sequencing has raised the problem of fragment recombination. In this regard, the Single Individual Haplotyping (SIH) problem has received considerable attention over the past few years.
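    The flavour of the SIH problem can be shown with a toy greedy heuristic in the spirit of Minimum Error Correction: each fragment is assigned to whichever of two growing haplotypes it disagrees with least, and each haplotype is read off by majority vote. This is an illustration only, not one of the algorithms compared in the paper.

```python
def greedy_haplotype_partition(fragments):
    """Toy MEC-style heuristic: greedily assign each fragment (a dict
    position -> allele in {0, 1}) to the haplotype it disagrees with
    least, then take the per-position majority consensus."""
    haplos = [{}, {}]  # per haplotype: position -> {allele: count}
    assignment = []
    for frag in fragments:
        costs = []
        for h in haplos:
            # Count disagreements with the current majority allele.
            costs.append(sum(1 for p, a in frag.items()
                             if p in h and max(h[p], key=h[p].get) != a))
        side = 0 if costs[0] <= costs[1] else 1
        assignment.append(side)
        for p, a in frag.items():
            counts = haplos[side].setdefault(p, {0: 0, 1: 0})
            counts[a] += 1
    consensus = [{p: max(c, key=c.get) for p, c in h.items()}
                 for h in haplos]
    return assignment, consensus
```

Real SIH algorithms differ precisely in how they escape the local decisions this greedy pass commits to, which is what the paper's comparison evaluates.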

    Partial Loopholes Free Device Independent Quantum Random Number Generator Using IBM's Quantum Computers

    Random numbers form an intrinsic part of modern computing, with applications in a wide variety of fields. Because of their inherent limitations, pseudorandom number generators (PRNGs) are undesirable for sensitive applications. Quantum systems, due to their intrinsic randomness, are suitable candidates for the generation of true random numbers that can also be certified. In this work, the violation of the CHSH inequality is used to propose a scheme for generating device-independent quantum random numbers on the IBM quantum computers available on the cloud. The generated random numbers have been tested for their source of origin through experiments based on CHSH inequality tests on the available IBM quantum computers. The performance of each quantum computer against the CHSH test has been plotted and characterized. Further, efforts have been made to close as many loopholes as possible to produce device-independent quantum random number generators. This study will provide new directions for the development of self-testing and semi-self-testing random number generators using quantum computers.
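    The certification step rests on the CHSH statistic: from the four measured correlators E(a, b), one computes S, which is at most 2 for any local hidden-variable source but can reach 2√2 quantum mechanically, so S > 2 certifies the randomness of the output bits. A minimal sketch (setting labels are illustrative):

```python
import math

def correlator(counts):
    """E(a, b) from coincidence counts keyed by the outcome pair,
    e.g. counts = {"00": n00, "01": n01, "10": n10, "11": n11}."""
    same = counts["00"] + counts["11"]
    diff = counts["01"] + counts["10"]
    return (same - diff) / (same + diff)

def chsh_value(E):
    """CHSH statistic S = |E(a0,b0) + E(a0,b1) + E(a1,b0) - E(a1,b1)|.
    S <= 2 classically; quantum states allow up to 2*sqrt(2)."""
    return abs(E[("a0", "b0")] + E[("a0", "b1")]
               + E[("a1", "b0")] - E[("a1", "b1")])
```

On a cloud backend, each correlator would come from the measured counts of one circuit (one measurement-setting pair), and the four values are combined as above to characterize the device.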

    Applying Genetic Algorithm to Generation of High-Dimensional Item Response Data

    Item response data is an n × m data matrix of the responses made by m examinees to a questionnaire consisting of n items. It is used to estimate the ability of examinees and item parameters in educational evaluation. For the estimates to be valid, the simulation input data must reflect reality. This paper presents an effective combination of the genetic algorithm (GA) and Monte Carlo methods for the generation of item response data as simulation input data similar to real data. To this end, we generated four types of item response data using Monte Carlo methods and the GA and evaluated how similarly the generated item response data represents the real item response data with respect to the item parameters (item difficulty and discrimination). We adopt two measures, root mean square error and Kullback–Leibler divergence, for the comparison of item parameters between the real data and the four types of generated data. The results show that applying the GA to an initial population generated by Monte Carlo methods is the most effective approach for generating item response data that is most similar to real item response data. This study is meaningful in that we found that the GA contributes to the generation of more realistic simulation input data.
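    The Monte Carlo half of such a pipeline is straightforward: each response is drawn from a two-parameter logistic (2PL) IRT model given an examinee ability and item parameters. This sketch omits the GA refinement of the initial population; function and parameter names are illustrative.

```python
import math
import random

def generate_responses(abilities, difficulties, discriminations, rng):
    """Monte Carlo generation of a 2PL item-response matrix: examinee
    with ability theta answers item (b, a) correctly with probability
    P = 1 / (1 + exp(-a * (theta - b)))."""
    data = []
    for theta in abilities:
        row = []
        for b, a in zip(difficulties, discriminations):
            p = 1.0 / (1.0 + math.exp(-a * (theta - b)))
            row.append(1 if rng.random() < p else 0)
        data.append(row)
    return data
```

A GA step would then mutate and recombine such generated matrices, scoring candidates by how closely their estimated item parameters match those of the real data.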

    On Derandomized Approximation Algorithms

    With the design of powerful randomized algorithms, the transformation of a randomized algorithm or probabilistic existence result for combinatorial problems into an efficient deterministic algorithm (called derandomization) has become an important issue in algorithmic discrete mathematics. In recent years several interesting examples of derandomization have been published, such as discrepancy in hypergraph colouring, packing integer programs, and an algorithmic version of the Lovász Local Lemma. In this paper the derandomization method of conditional probabilities of Raghavan and Spencer is extended using discrete martingales. As a main result, pessimistic estimators are constructed for combinatorial approximation problems involving non-linear objective functions with bounded martingale differences. The theory gives polynomial-time algorithms for the linear and quadratic lattice approximation problems and a quadratic variant of the matrix balancing problem, extending results of Spencer, Beck/Fiala, and Raghavan. Finally, a probabilistic existence result of Erdős on average graph bisection is transformed into a deterministic algorithm.
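    The method of conditional probabilities is easiest to see on MAX-CUT: a uniformly random cut crosses half the edges in expectation, and fixing vertices one at a time to the side that keeps the conditional expectation highest yields at least m/2 edges deterministically. This is a textbook sketch of the general technique, not the paper's martingale-based pessimistic estimators.

```python
def derandomized_max_cut(n, edges):
    """Method of conditional expectations for MAX-CUT: place each
    vertex on the side with more already-placed neighbours on the
    opposite side, so the conditional expected cut never drops below
    m / 2, and the final cut has at least m / 2 edges."""
    side = {}
    for v in range(n):
        gain = {0: 0, 1: 0}  # cut edges gained by each placement
        for u, w in edges:
            other = w if u == v else u if w == v else None
            if other is not None and other in side:
                gain[1 - side[other]] += 1
        side[v] = 0 if gain[0] >= gain[1] else 1
    cut = sum(1 for u, w in edges if side[u] != side[w])
    return side, cut
```

The pessimistic-estimator machinery in the paper plays the role of `gain` here: a tractable lower bound on the conditional expectation that can be evaluated and compared at every branching step.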