A Comparative Study of Some Pseudorandom Number Generators
We present results of an extensive test program of a group of pseudorandom
number generators which are commonly used in the applications of physics, in
particular in Monte Carlo simulations. The generators include public domain
programs, manufacturer installed routines and a random number sequence produced
from physical noise. We start with traditional statistical tests, followed by
detailed bit level and visual tests. The computational speed of various
algorithms is also scrutinized. Our results allow direct comparisons between
the properties of different generators, as well as an assessment of the
efficiency of the various test methods. This information provides the best
available criteria for choosing a suitable generator for a given problem.
However, in light of recent problems reported with some of these generators, we
also discuss the importance of developing more refined physical tests to find
possible correlations not revealed by the present test methods.
Comment: University of Helsinki preprint HU-TFT-93-22 (minor changes in Tables 2 and 7, and in the text, correspondingly).
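A bit-level frequency test of the kind mentioned above can be sketched in a few lines: count how often each bit position of a generator's output words is set, and compare against the expected 50% with a chi-square statistic. This is an illustrative sketch, not the paper's actual test battery, and it uses Python's built-in Mersenne Twister purely as a stand-in generator.

```python
import random

def bit_frequency_chi2(bits_per_word=31, n_words=10_000, seed=12345):
    """Chi-square frequency test applied to each bit position of a
    generator's output words: for an unbiased generator, each bit
    should be set in about half of the words."""
    rng = random.Random(seed)
    ones = [0] * bits_per_word
    for _ in range(n_words):
        word = rng.getrandbits(bits_per_word)
        for b in range(bits_per_word):
            if (word >> b) & 1:
                ones[b] += 1
    expected = n_words / 2
    # One chi-square statistic (1 degree of freedom) per bit position.
    return [2 * (c - expected) ** 2 / expected for c in ones]

chi2 = bit_frequency_chi2()
# For a sound generator, most per-bit statistics should fall well
# below the 5% critical value of about 3.84 for 1 degree of freedom.
print(max(chi2))
```

A generator with a systematic bias in some bit position (a known weakness of the low-order bits of some congruential generators) shows up here as a large statistic for that position.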
Portable random number generators
Computers are deterministic devices, and a computer-generated random number is a contradiction in terms. As a result, computer-generated pseudorandom numbers are fraught with peril for the unwary. We summarize much that is known about the best-known family of pseudorandom number generators: congruential generators. We also provide machine-independent programs that implement the generators in any language that has 32-bit signed integers, for example C, C++, and FORTRAN. Based on an extensive search, we provide parameter values better than those previously available.
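A portable multiplicative congruential generator of the kind described can be sketched with Schrage's decomposition, which evaluates x &lt;- a*x mod m without any intermediate value exceeding 2^31 - 1, so the same code works in any language with 32-bit signed integers. The Park-Miller "minimal standard" parameters used here are one classic choice for illustration, not necessarily the improved parameters the paper reports.

```python
# Portable multiplicative congruential generator x <- A*x mod M using
# Schrage's decomposition, which keeps every intermediate result inside
# the range of a 32-bit signed integer.
A = 16807          # Park-Miller "minimal standard" multiplier (illustrative)
M = 2**31 - 1      # Mersenne prime modulus 2147483647
Q = M // A         # 127773
R = M % A          # 2836

def lcg_next(x):
    """One step of x <- A*x mod M without ever exceeding 2^31 - 1."""
    x = A * (x % Q) - R * (x // Q)
    return x if x > 0 else x + M

# Park & Miller's published check value: starting from seed 1,
# the state after 10000 steps must be 1043618065.
x = 1
for _ in range(10_000):
    x = lcg_next(x)
print(x)  # 1043618065
```

Python's arbitrary-precision integers make the decomposition unnecessary here, but the point of the sketch is that the identical arithmetic is safe in 32-bit C or FORTRAN.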
GeantV: Results from the prototype of concurrent vector particle transport simulation in HEP
Full detector simulation was among the largest CPU consumer in all CERN
experiment software stacks for the first two runs of the Large Hadron Collider
(LHC). In the early 2010s, the projections were that simulation demands would
scale linearly with luminosity increase, compensated only partially by an
increase of computing resources. The extension of fast simulation approaches to
more use cases, covering a larger fraction of the simulation budget, is only
part of the solution due to intrinsic precision limitations. The remainder
corresponds to speeding up the simulation software by several factors, which is
out of reach using simple optimizations on the current code base. In this
context, the GeantV R&D project was launched, aiming to redesign the legacy
particle transport codes in order to make them benefit from fine-grained
parallelism features such as vectorization, but also from increased code and
data locality. This paper presents extensively the results and achievements of
this R&D, as well as the conclusions and lessons learnt from the beta
prototype.
Comment: 34 pages, 26 figures, 24 tables.
Distribution of Random Streams for Simulation Practitioners
There is an increasing interest in the distribution of parallel random number streams in the high-performance computing community, particularly with the manycore shift. Even if we have at our disposal statistically sound random number generators according to the latest and most thorough testing libraries, their parallelization can still be a delicate problem. Indeed, a set of recent publications shows it still has to be mastered by the scientific community. With the arrival of multi-core and manycore processor architectures on the scientist's desktop, modelers who are not specialists in parallelizing stochastic simulations need help and advice in rigorously distributing their experimental plans and replications according to the state of the art in pseudo-random number parallelization techniques. In this paper, we discuss the different partitioning techniques currently in use to provide independent streams, along with their corresponding software. In addition to the classical approaches used to parallelize stochastic simulations on regular processors, this paper also presents recent advances in pseudo-random number generation for general-purpose graphical processing units. The state of the art given in this paper is written for simulation practitioners.
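One classical partitioning technique the survey covers, block splitting (also called sequence splitting), can be sketched as follows: each process receives a contiguous, non-overlapping block of a single generator's sequence, reached analytically by jumping ahead rather than by stepping. The sketch below uses a small multiplicative congruential generator with the Park-Miller parameters purely for illustration; production work would use a tested generator and library, as the paper discusses.

```python
# Block splitting: give each of P processes a contiguous block of the
# sequence by jumping ahead k steps analytically. For a multiplicative
# congruential generator x <- A*x mod M, a k-step jump is simply
# x <- (A^k mod M) * x mod M, computed in O(log k) time with pow().
A = 16807
M = 2**31 - 1  # illustrative Park-Miller parameters

def make_stream(seed, start, length):
    """Yield `length` values beginning `start` steps into the sequence."""
    x = (pow(A, start, M) * seed) % M   # analytic jump-ahead
    for _ in range(length):
        x = (A * x) % M
        yield x

# Four non-overlapping streams of one million values each, all derived
# from the same seed; stream i covers sequence positions
# i*block+1 .. (i+1)*block.
block = 1_000_000
streams = [make_stream(seed=1, start=i * block, length=5) for i in range(4)]
for i, s in enumerate(streams):
    print(f"stream {i}:", list(s))
```

Because the jump is computed by modular exponentiation rather than by generating and discarding values, assigning blocks stays cheap even when each replication consumes millions of draws.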
Digitally modulated bit error rate measurement system for microwave component evaluation
The NASA Lewis Research Center has developed a unique capability for evaluating the microwave components of a digital communication system. This digitally modulated bit-error-rate (BER) measurement system (DMBERMS) features a continuous-data digital BER test set, a data processor, a serial minimum shift keying (SMSK) modem, noise generation, and computer automation. Application of the DMBERMS has provided useful information for the evaluation of existing microwave components and of design goals for future components. The design and applications of this system for digitally modulated BER measurements are discussed.
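The core quantity such a system measures can be stated in a few lines: the bit error rate is the fraction of received bits that differ from the known transmitted pattern. The following is a software sketch of that definition only, not a model of the DMBERMS hardware.

```python
# Bit-error-rate measurement reduces to comparing a received bit stream
# against the known transmitted pattern and counting mismatches.
def bit_error_rate(sent, received):
    """Fraction of positions where the received bit differs from the
    transmitted bit."""
    assert len(sent) == len(received)
    errors = sum(s != r for s, r in zip(sent, received))
    return errors / len(sent)

sent     = [1, 0, 1, 1, 0, 0, 1, 0]
received = [1, 0, 0, 1, 0, 1, 1, 0]
print(bit_error_rate(sent, received))  # 2 errors in 8 bits -> 0.25
```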
Computers and Liquid State Statistical Mechanics
The advent of electronic computers has revolutionised the application of
statistical mechanics to the liquid state. Computers have permitted, for
example, the calculation of the phase diagram of water and ice, the folding
of proteins, the behaviour of alkanes adsorbed in zeolites, the formation of
liquid crystal phases, and the process of nucleation. Computer simulations
provide, on one hand, new insights into the physical processes in action, and
on the other, quantitative results of greater and greater precision. Insights
into physical processes facilitate the reductionist agenda of physics, whilst
large scale simulations bring out emergent features that are inherent (although
far from obvious) in complex systems consisting of many bodies. It is safe to
say that computer simulations are now an indispensable tool for both the
theorist and the experimentalist, and in the future their usefulness will only
increase.
This chapter presents a selective review of some of the incredible advances
in condensed matter physics that could only have been achieved with the use of
computers.
Comment: 22 pages, 2 figures. Chapter for a book.
Pseudo-Random Number Generators for Vector Processors and Multicore Processors
Large-scale Monte Carlo applications need a good pseudo-random number generator capable of utilizing both the vector-processing and multiprocessing capabilities of modern computers in order to get the maximum performance. The requirements for such a generator are discussed. New ways of avoiding overlapping subsequences by combining two generators are proposed. Some fundamental philosophical problems in proving the independence of random streams are discussed. Remedies for hitherto ignored quantization errors are offered. An open-source C++ implementation is provided for a generator that meets these needs.
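The idea of combining two generators can be illustrated with a classic construction in the style of L'Ecuyer's 1988 combined multiplicative generators: two components with distinct prime moduli run side by side, and their states are combined. Because the component periods are coprime, the combined period is nearly their product. This sketch is not the paper's C++ implementation; the constants are the well-known 1988 parameters, used here purely for illustration.

```python
# Combined multiplicative congruential generator in the style of
# L'Ecuyer (1988): two components with distinct prime moduli are
# stepped together and their states combined. The coprime component
# periods make the combined period nearly their product (~2.3e18),
# which greatly reduces the risk of overlapping subsequences.
M1, A1 = 2147483563, 40014
M2, A2 = 2147483399, 40692

class CombinedLCG:
    def __init__(self, seed1=12345, seed2=67890):
        self.s1, self.s2 = seed1, seed2

    def next(self):
        """Return the next combined value in the range 1 .. M1 - 1."""
        self.s1 = (A1 * self.s1) % M1
        self.s2 = (A2 * self.s2) % M2
        z = (self.s1 - self.s2) % (M1 - 1)
        return z if z > 0 else z + M1 - 1

rng = CombinedLCG()
u = rng.next() / M1   # uniform variate in (0, 1)
print(u)
```

Seeding the two components differently per stream is one simple way to hand out subsequences whose overlap probability is negligible, which is the property the abstract is after.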