10,893 research outputs found

    RAGE: A Java-implemented Visual Random Generator

    Carefully designed Java applications turn out to be efficient and platform-independent tools that can compete well with classical implementations of statistical software. The project presented here is an example underlining this statement for random variate generation. An end-user application called RAGE (Random Variate Generator) is developed to generate random variates from probability distributions. A Java class library called JDiscreteLib has been designed and implemented for the simulation of random variables from the most common discrete distributions inside RAGE. For each distribution, both specific and general algorithms are available for this purpose. RAGE can also be used as an interactive simulation tool for data and data summary visualization.
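
    A minimal Python sketch of the general inversion method for discrete distributions is given below; it only illustrates the kind of general-purpose algorithm the abstract mentions and is not the JDiscreteLib or RAGE API (the function name is illustrative).

```python
import random

def discrete_inverse_transform(values, probabilities, rng=random.random):
    """General-purpose discrete sampler: walk the cumulative distribution
    until it exceeds a uniform draw. A library can fall back on a method
    like this when no distribution-specific algorithm applies."""
    u = rng()
    cumulative = 0.0
    for value, p in zip(values, probabilities):
        cumulative += p
        if u < cumulative:
            return value
    return values[-1]  # guard against floating-point round-off

# Example: sample from a small custom discrete distribution.
sample = discrete_inverse_transform([0, 1, 2], [0.2, 0.5, 0.3])
```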

    Neutron monitor generated data distributions in quantum variational Monte Carlo

    We have assessed the potential applications of the neutron monitor hardware as a random number generator for normal and uniform distributions. The data tables from the acquisition channels with no extreme changes in the signal level were chosen as the retrospective model. The stochastic component was extracted by fitting the raw data with splines and then subtracting the fit. Scaling the extracted data to zero mean and unit variance is sufficient to obtain a stable standard normal random variate. The distributions under consideration pass all available normality tests. Inverse transform sampling is suggested for use as a source of the uniform random numbers. The variational Monte Carlo method for the quantum harmonic oscillator was used to test the quality of our random numbers. If the data delivery rate is of importance and the conventional one-minute-resolution neutron count is insufficient, we could always settle for an efficient seed generator to feed into a faster algorithmic random number generator or create a buffer.
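
    The last two steps of the procedure (standardizing the detrended data and turning standard normal variates into uniforms via the normal CDF) can be sketched as follows; the spline detrending and the detector interface are omitted, and the code is only an illustration of the probability integral transform, not the authors' implementation.

```python
import math
import statistics

def standardize(residuals):
    """Scale detrended residuals to zero mean and unit variance,
    giving approximately standard normal variates."""
    mean = statistics.fmean(residuals)
    std = statistics.pstdev(residuals)
    return [(x - mean) / std for x in residuals]

def normal_to_uniform(z):
    """Probability integral transform: applying the standard normal CDF,
    Phi(z) = 0.5 * (1 + erf(z / sqrt(2))), to a standard normal variate
    yields a Uniform(0, 1) variate."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Toy usage with made-up residuals standing in for detrended count data.
residuals = [0.3, -1.2, 0.7, 0.1, -0.5, 1.4, -0.8]
uniforms = [normal_to_uniform(z) for z in standardize(residuals)]
```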

    Microprogramming For Probability Distribution Sampling

    Microprogramming special instructions for sampling random variates from any probability distribution is a means of increasing sampling speed. The diversity of sampling techniques is narrowed to one general algorithm: conditional bit sampling. Conditional bit sampling uses a high-speed uniform random number generator based on feedback shift registers to sample one bit at a time. The probability of a bit being a one in the j-th position of a binary-expanded variate is stored in a table of conditional probabilities. A comparison with the pseudorandom number yields a one or a zero. The table of conditional probabilities is generated once and passed through an instruction to the microprogram, which performs the sampling. One user instruction is issued for each variate returned.
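
    A small software model of conditional bit sampling is sketched below, with Python's pseudorandom generator standing in for the feedback-shift-register hardware; the table layout and names are illustrative, not the microprogram's actual format.

```python
import random

def build_conditional_table(pmf, n_bits):
    """Table of P(bit_j = 1 | higher-order bits already fixed) for a pmf
    on {0, ..., 2**n_bits - 1}, indexed by (level j, value of the prefix)."""
    table = {}
    for j in range(n_bits):
        for prefix in range(2 ** j):
            lo = prefix << (n_bits - j)          # values whose top j bits equal prefix
            hi = lo + (1 << (n_bits - j))
            total = sum(pmf[v] for v in range(lo, hi))
            ones = sum(pmf[v] for v in range(lo + (1 << (n_bits - j - 1)), hi))
            table[(j, prefix)] = ones / total if total > 0 else 0.0
    return table

def conditional_bit_sample(table, n_bits, rng=random.random):
    """Draw a variate one bit at a time, comparing each conditional
    probability with a fresh uniform pseudorandom number."""
    value = 0
    for j in range(n_bits):
        p_one = table[(j, value)]
        bit = 1 if rng() < p_one else 0
        value = (value << 1) | bit
    return value

# Example: a skewed distribution on {0, 1, 2, 3}.
pmf = [0.1, 0.2, 0.3, 0.4]
table = build_conditional_table(pmf, n_bits=2)
draw = conditional_bit_sample(table, n_bits=2)
```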

    Class library ranlip for multivariate nonuniform random variate generation

    This paper describes the generation of nonuniform random variates from Lipschitz-continuous densities using acceptance/rejection, and the class library ranlip, which implements this method. It is assumed that the required distribution has a Lipschitz-continuous density, which is either given analytically or as a black box. The algorithm builds a piecewise constant upper approximation to the density (the hat function), using a large number of its values and a subdivision of the domain into hyperrectangles. The class library ranlip provides very competitive preprocessing and generation times, and yields a small rejection constant, a measure of the efficiency of the generation step. It exhibits good performance for up to five variables, and provides the user with a black-box nonuniform random variate generator for a large class of distributions, in particular multimodal distributions. It will be valuable for researchers who frequently face the task of sampling from unusual distributions for which specialized random variate generators are not available.
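
    A one-dimensional sketch of the same acceptance/rejection idea is shown below; ranlip itself builds the hat over hyperrectangles in several dimensions with far more careful domain subdivision, so this is only an illustration, not the library's API.

```python
import math
import random

def build_hat(density, lipschitz, a, b, n_cells):
    """Piecewise-constant upper bound (hat) for a Lipschitz density on [a, b]:
    on each cell, density(center) + L * width / 2 dominates the density."""
    width = (b - a) / n_cells
    centers = [a + (i + 0.5) * width for i in range(n_cells)]
    heights = [density(c) + lipschitz * width / 2 for c in centers]
    return centers, heights, width

def hat_rejection_sample(density, centers, heights, width, rng=random.random):
    """Acceptance/rejection against the piecewise-constant hat: pick a cell
    with probability proportional to its hat mass, draw uniformly inside it,
    and accept with probability density / hat."""
    masses = [h * width for h in heights]
    total = sum(masses)
    while True:
        u = rng() * total
        acc = 0.0
        for c, h, m in zip(centers, heights, masses):
            acc += m
            if u <= acc:
                x = c + (rng() - 0.5) * width
                if rng() * h <= density(x):
                    return x
                break  # rejected: start over

def normal_density(x):
    """Standard normal density, used here on [-3, 3]; Lipschitz constant ~0.242."""
    return math.exp(-0.5 * x * x) / math.sqrt(2 * math.pi)

centers, heights, width = build_hat(normal_density, 0.242, -3.0, 3.0, n_cells=32)
x = hat_rejection_sample(normal_density, centers, heights, width)
```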

    Driving Markov chain Monte Carlo with a dependent random stream

    Markov chain Monte Carlo is a widely used technique for generating a dependent sequence of samples from complex distributions. Conventionally, these methods require a source of independent random variates. Most implementations use pseudo-random numbers instead because generating truly independent variates with a physical system is not straightforward. In this paper we show how to modify some commonly used Markov chains to use a dependent stream of random numbers in place of independent uniform variates. The resulting Markov chains have the correct invariant distribution without requiring detailed knowledge of the stream's dependencies or even its marginal distribution. As a side effect, sometimes far fewer random numbers are required to obtain accurate results. Comment: 16 pages, 4 figures.
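
    The interface involved, a Markov chain driven entirely by an external stream of numbers in [0, 1), can be sketched with an ordinary random-walk Metropolis sampler; the paper's actual chain modifications for dependent streams are not reproduced here.

```python
import math
import random

def metropolis_from_stream(log_density, x0, uniforms, n_steps, step_size=1.0):
    """Random-walk Metropolis driven by an external stream of numbers in
    [0, 1): one draw per step for the symmetric uniform proposal and one
    for the accept/reject test."""
    x = x0
    samples = []
    for _ in range(n_steps):
        u_prop = next(uniforms)
        u_acc = next(uniforms)
        proposal = x + step_size * (2.0 * u_prop - 1.0)
        log_alpha = log_density(proposal) - log_density(x)
        if u_acc < math.exp(min(0.0, log_alpha)):
            x = proposal
        samples.append(x)
    return samples

def standard_normal_log_density(x):
    """Log-density of the standard normal target (up to a constant)."""
    return -0.5 * x * x

# Driven here by an ordinary pseudorandom stream; a suitably well-behaved
# dependent stream could be substituted without changing the sampler.
stream = iter(random.random, None)   # endless stream of uniforms
chain = metropolis_from_stream(standard_normal_log_density, 0.0, stream, n_steps=10_000)
```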

    Pseudorandom Generators from Polarizing Random Walks

    We propose a new framework for constructing pseudorandom generators for n-variate Boolean functions. It is based on two new notions. First, we introduce fractional pseudorandom generators, which are pseudorandom distributions taking values in [-1,1]^n. Next, we use a fractional pseudorandom generator as steps of a random walk in [-1,1]^n that converges to {-1,1}^n. We prove that this random walk converges fast (in time logarithmic in n) due to polarization. As an application, we construct pseudorandom generators for Boolean functions with bounded Fourier tails. We use this to obtain a pseudorandom generator for functions with sensitivity s, whose seed length is polynomial in s. Other examples include functions computed by branching programs of various sorts or by bounded-depth circuits.
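
    A toy numerical illustration of polarization is given below; the update rule is a simplified stand-in for the fractional-PRG steps analysed in the paper, not the paper's construction.

```python
import random

def polarizing_walk(n_coords=8, n_steps=30, rng=random.uniform):
    """Toy illustration of polarization: a coordinate-wise random walk in
    [-1, 1]^n whose step size shrinks as a coordinate approaches +/-1.
    The update x <- x + (1 - |x|) * y with E[y] = 0 keeps each coordinate
    in [-1, 1] and drives it toward {-1, +1} in the long run."""
    x = [0.0] * n_coords
    for _ in range(n_steps):
        x = [xi + (1.0 - abs(xi)) * rng(-1.0, 1.0) for xi in x]
    return x

final = polarizing_walk()
rounded = [1 if xi >= 0 else -1 for xi in final]   # round to a Boolean point
```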

    Joint Mixability of Elliptical Distributions and Related Families

    In this paper, we further develop the theory of complete mixability and joint mixability for some distribution families. We generalize a result of Rüschendorf and Uckelmann (2002) on the complete mixability of a continuous distribution function having a symmetric and unimodal density. Two different proofs of a result of Wang and Wang (2016) on the joint mixability of elliptical distributions with the same characteristic generator are presented. We solve Open Problem 7 in Wang (2015) by constructing a bimodal-symmetric distribution. The joint mixability of slash-elliptical distributions and skew-elliptical distributions is studied, and the extension to multivariate distributions is also investigated. Comment: 15 pages.
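
    For reference, the standard definitions underlying the abstract (following the complete/joint mixability literature it cites) are:

```latex
\begin{itemize}
  \item A distribution $F$ is \emph{$n$-completely mixable} if there exist
        random variables $X_1,\dots,X_n$, each with distribution $F$, such that
        $X_1 + \cdots + X_n = nk$ almost surely for some constant $k$ (the center).
  \item Distributions $F_1,\dots,F_n$ are \emph{jointly mixable} if there exist
        random variables $X_i \sim F_i$, $i=1,\dots,n$, such that
        $X_1 + \cdots + X_n = C$ almost surely for some constant $C$.
\end{itemize}
```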

    On Buffon Machines and Numbers

    The well-known needle experiment of Buffon can be regarded as an analog (i.e., continuous) device that stochastically "computes" the number 2/pi ~ 0.63661, the experiment's probability of success. Generalizing the experiment and simplifying the computational framework, we consider probability distributions that can be produced perfectly from a discrete source of unbiased coin flips. We describe and analyse a few simple Buffon machines that generate geometric, Poisson, and logarithmic-series distributions. We provide human-accessible Buffon machines, which require a dozen coin flips or less, on average, and produce experiments whose probabilities of success are expressible in terms of numbers such as exp(-1), log 2, sqrt(3), cos(1/4), and zeta(5). Generally, we develop a collection of constructions based on simple probabilistic mechanisms that enable one to design Buffon experiments involving compositions of exponentials and logarithms, polylogarithms, direct and inverse trigonometric functions, algebraic and hypergeometric functions, as well as functions defined by integrals, such as the Gaussian error function. Comment: Largely revised version with references and figures added. 12 pages. In ACM-SIAM Symposium on Discrete Algorithms (SODA'2011).
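
    Two toy discrete Buffon machines, driven only by fair coin flips, are sketched below in Python; they are in the spirit of, but much simpler than, the constructions analysed in the paper.

```python
import random

def coin(rng=random.random):
    """An unbiased coin flip: the only source of randomness allowed."""
    return rng() < 0.5

def geometric_half():
    """Buffon machine for the geometric(1/2) distribution:
    count tails before the first head."""
    k = 0
    while not coin():
        k += 1
    return k

def bernoulli_one_third():
    """Buffon machine whose success probability is exactly 1/3:
    two flips pick one of {0, 1, 2, 3} uniformly; retry on 3, succeed on 0."""
    while True:
        value = 2 * coin() + coin()
        if value != 3:
            return value == 0

# Rough empirical check of the success probability (should be near 1/3).
trials = 100_000
estimate = sum(bernoulli_one_third() for _ in range(trials)) / trials
```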