
    Efficiently Generating Random Bits from Finite State Markov Chains

    The problem of random number generation from an uncorrelated random source (of unknown probability distribution) dates back to von Neumann's work of 1951. Elias (1972) generalized von Neumann's scheme and showed how to achieve optimal efficiency in unbiased random bit generation. A natural question is then: what if the sources are correlated? Both Elias and Samuelson proposed methods for generating unbiased random bits from correlated sources (of unknown probability distribution); specifically, they considered finite Markov chains. However, their proposed methods are either inefficient or have implementation difficulties. Blum (1986) devised an algorithm for efficiently generating random bits from degree-2 finite Markov chains in expected linear time; however, his beautiful method is still far from optimal in information efficiency. In this paper, we generalize Blum's algorithm to finite Markov chains of arbitrary degree and combine it with Elias's method for the efficient generation of unbiased bits. As a result, we provide the first known algorithm that generates unbiased random bits from an arbitrary finite Markov chain, operates in expected linear time, and achieves the information-theoretic upper bound on efficiency.
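    For context, here is a minimal Python sketch of von Neumann's 1951 scheme referenced in the abstract: it turns an independent but biased bit source into unbiased bits by reading the input in pairs. This illustrates only the uncorrelated-source case, not the paper's Markov-chain generalization, and the function name is my own.

```python
import random

def von_neumann_extract(bits):
    """Von Neumann's 1951 scheme: read the input in non-overlapping pairs,
    emit 0 for the pair (0, 1), emit 1 for (1, 0), and discard (0, 0) and
    (1, 1).  If the source bits are i.i.d., the two kept pairs are equally
    likely, so the output bits are unbiased even when the bias is unknown."""
    out = []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:
            out.append(a)   # (0, 1) -> 0 and (1, 0) -> 1
    return out

# Example: a heavily biased but independent source with P[1] = 0.8.
biased = [1 if random.random() < 0.8 else 0 for _ in range(10_000)]
unbiased = von_neumann_extract(biased)
print(len(unbiased), sum(unbiased) / len(unbiased))   # ratio is roughly 0.5
```

    The scheme discards most input bits; Elias's 1972 generalization extracts more bits per input block, and the paper's contribution is to carry these ideas over to sources that are finite Markov chains.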

    Multiple scattering in random mechanical systems and diffusion approximation

    This paper is concerned with stochastic processes that model multiple (or iterated) scattering in classical mechanical systems of billiard type, defined below. From a given (deterministic) system of billiard type, a random process with transition probability operator P is introduced by assuming that some of the dynamical variables are random with prescribed probability distributions. Of particular interest are systems with weak scattering, which are associated to parametric families of operators P_h, depending on a geometric or mechanical parameter h, that approach the identity as h goes to 0. It is shown that (P_h - I)/h converges, as h goes to 0, to a second-order elliptic differential operator L on compactly supported functions, and that the Markov chain process associated to P_h converges to a diffusion with infinitesimal generator L. Both P_h and L are self-adjoint and densely defined on the space L^2(H, η) of square-integrable functions over the (lower) half-space H in R^m, where η is a stationary measure. The density of this measure is either the (post-collision) Maxwell-Boltzmann distribution or the Knudsen cosine law, and the random processes with infinitesimal generator L correspond, respectively, to what we call MB diffusion and (generalized) Legendre diffusion. Concrete examples of simple mechanical systems are given and illustrated by numerically simulating the random processes. Comment: 34 pages, 13 figures.
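    In display form, the convergence claim stated in the abstract reads as follows (same notation; the precise function spaces and mode of convergence are as specified in the paper):

```latex
\[
  \frac{(P_h - I)\,f}{h} \;\xrightarrow{\;h \to 0\;}\; L f
  \qquad \text{for compactly supported } f,
\]
\[
  P_h,\ L \ \text{self-adjoint and densely defined on } L^2(H,\eta),
  \qquad H \subset \mathbb{R}^m \ \text{the (lower) half-space},\quad
  \eta \ \text{a stationary measure}.
\]
```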

    Sorting using complete subintervals and the maximum number of runs in a randomly evolving sequence

    We study the space requirements of a sorting algorithm in which only items that will end up adjacent are kept together. This is equivalent to the following combinatorial problem: consider a string of fixed length n that starts as a string of 0's and then evolves by changing each 0 to 1, with the changes done in random order. What is the maximal number of runs of 1's? We give asymptotic results for the distribution and the mean. It turns out that, as in many problems involving a maximum, the maximum is asymptotically normal, with fluctuations of order n^{1/2}, and is to first order well approximated by the number of runs at the instant when the expectation is maximized, in this case when half the elements have changed to 1; there is also a second-order term of order n^{1/3}. We also treat some variations, including priority queues. The proofs use methods originally developed for random graphs. Comment: 31 pages.
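    A small Monte Carlo sketch (Python) of the combinatorial process described in the abstract, not of the paper's analysis; the function name and sample sizes are my own. It tracks the number of runs of 1's after each change and records the maximum over the whole evolution.

```python
import random

def max_runs_of_ones(n):
    """Start with n zeros, change the positions to 1 one at a time in a
    uniformly random order, and track the number of runs (maximal blocks)
    of 1's after every change.  Return the maximum number of runs seen."""
    s = [0] * n
    order = list(range(n))
    random.shuffle(order)
    runs = best = 0
    for pos in order:
        left = pos > 0 and s[pos - 1] == 1
        right = pos < n - 1 and s[pos + 1] == 1
        s[pos] = 1
        if not left and not right:
            runs += 1      # a new isolated run appears
        elif left and right:
            runs -= 1      # two existing runs merge into one
        # exactly one neighbour already 1: a run is extended, count unchanged
        best = max(best, runs)
    return best

n = 10_000
samples = [max_runs_of_ones(n) for _ in range(20)]
print(sum(samples) / len(samples))   # close to n/4 for large n (cf. the abstract)
```

    For large n the simulated maximum comes out close to n/4, consistent with the first-order approximation at the point where half the elements are 1 (there, each position starts a run of 1's with probability roughly 1/4).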

    Random Number Generation: Types and Techniques

    What does it mean to have random numbers? Without understanding where a group of numbers came from, it is impossible to know whether they were randomly generated. However, common sense suggests that if the process used to generate these numbers is truly understood, then the numbers cannot be random. This paper sets out to describe methods whose internal workings can be made known without sacrificing the randomness of their results. Beginning with a study of what it really means for something to be random, the paper dives into the topic of random number generators and summarizes the key areas. It covers the two main groups of generators, true-random and pseudo-random, and gives practical examples of both. To make the information more applicable, real-life examples of currently used and currently available generators are provided as well. Knowing the how and why of a number sequence without knowing the values that will come is possible, and this thesis explains how that is accomplished.
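    As an illustration of the two groups the abstract contrasts (the abstract does not say which concrete generators the thesis uses as its examples, so these are stand-ins), a short Python sketch comparing a seeded pseudo-random generator with the operating system's entropy source:

```python
import os
import random

# Pseudo-random: a deterministic algorithm (CPython's Mersenne Twister).
# Knowing the seed and the algorithm reproduces the entire sequence, yet
# the output passes standard statistical tests of randomness.
prng = random.Random(42)
print([prng.randint(0, 9) for _ in range(10)])   # identical list on every run

# Standing in for a true-random source: os.urandom draws on the operating
# system's entropy pool (fed by hardware and timing events), so the bytes
# are not reproducible from a seed.
print(list(os.urandom(10)))                      # different on every run
```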