On a new limit theorem in probability theory (Translation of 'Sur un nouveau théorème-limite de la théorie des probabilités')
This is a translation of Harald Cramér's article, 'On a new limit theorem
in probability theory', published in French in 1938 and deriving what is
considered by mathematicians to be the first large deviation result. My hope is
that this translation will help disseminate this historically important work,
80 years after its publication.
Comment: 20 pages, endnote links are not supported by the hyperref package, Hugo Touchette was the translator; v2: minor corrections, 1 reference added
The phase transition in the configuration model
Let G = G(d) be a random graph with a given degree sequence d, such as a
random r-regular graph where r ≥ 3 is fixed and n = |G| → ∞. We study
the percolation phase transition on such graphs G, i.e., the emergence as
p increases of a unique giant component in the random subgraph G[p] obtained by
keeping edges independently with probability p. More generally, we study the
emergence of a giant component in G itself as d varies. We show that a
single method can be used to prove very precise results below, inside and above
the 'scaling window' of the phase transition, matching many of the known
results for the much simpler model G(n,p). This method is a natural extension
of that used by Bollobás and the author to study G(n,p), itself based on work
of Aldous and of Nachmias and Peres; the calculations are significantly more
involved in the present setting.
Comment: 37 pages
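The threshold behaviour this abstract describes is easy to observe numerically. The following sketch is ours, not code from the paper: it builds a configuration-model random d-regular multigraph by randomly pairing half-edges, keeps each edge independently with probability p, and reports the largest component's share of vertices. For a random d-regular graph the percolation threshold is p_c = 1/(d-1), i.e. 1/2 for d = 3, so the giant fraction jumps from near 0 to a positive constant as p crosses 1/2.

```python
import random

def percolated_giant_fraction(n=2000, d=3, p=0.6, seed=0):
    """Configuration model: give each vertex d half-edges, pair them at
    random, percolate each resulting edge with probability p, and return
    the fraction of vertices in the largest component."""
    rng = random.Random(seed)
    stubs = [v for v in range(n) for _ in range(d)]  # d half-edges per vertex
    rng.shuffle(stubs)
    # Pair consecutive stubs into edges (self-loops/multi-edges allowed,
    # as in the configuration model), keeping each with probability p.
    edges = [(stubs[i], stubs[i + 1]) for i in range(0, len(stubs), 2)
             if rng.random() < p]
    # Union-find to measure component sizes.
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for u, v in edges:
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
    sizes = {}
    for v in range(n):
        r = find(v)
        sizes[r] = sizes.get(r, 0) + 1
    return max(sizes.values()) / n
```

Running this with p well below and well above 1/2 (say 0.1 and 0.9) shows a tiny largest component in the first case and one spanning most of the graph in the second.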
Optimal evaluation of single-molecule force spectroscopy experiments
The forced rupture of single chemical bonds under external load is addressed.
A general framework is put forward to optimally utilize the experimentally
observed rupture force data for estimating the parameters of a theoretical
model. As an application we explore to what extent a distinction between
several recently proposed models is feasible on the basis of realistic
experimental data sets.
Comment: 4 pages, 3 figures, accepted for publication in Phys. Rev.
Using simulation studies to evaluate statistical methods
Simulation studies are computer experiments that involve creating data by
pseudorandom sampling. The key strength of simulation studies is the ability to
understand the behaviour of statistical methods because some 'truth' (usually
some parameter/s of interest) is known from the process of generating the data.
This allows us to consider properties of methods, such as bias. While widely
used, simulation studies are often poorly designed, analysed and reported. This
tutorial outlines the rationale for using simulation studies and offers
guidance for design, execution, analysis, reporting and presentation. In
particular, this tutorial provides: a structured approach for planning and
reporting simulation studies, which involves defining aims, data-generating
mechanisms, estimands, methods and performance measures ('ADEMP'); coherent
terminology for simulation studies; guidance on coding simulation studies; a
critical discussion of key performance measures and their estimation; guidance
on structuring tabular and graphical presentation of results; and new graphical
presentations. With a view to describing recent practice, we review 100
articles taken from Volume 34 of Statistics in Medicine that included at least
one simulation study and identify areas for improvement.
Comment: 31 pages, 9 figures (2 in appendix), 8 tables (1 in appendix)
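A toy example in the ADEMP spirit may make the structure concrete. The sketch below is ours, not from the tutorial; all parameter values are illustrative. Aim: compare two estimators of a population mean mu. Data-generating mechanism: n i.i.d. normal draws. Estimand: mu. Methods: sample mean vs. sample median. Performance measures: bias of each estimator together with its Monte Carlo standard error.

```python
import random
import statistics

def simulate(n_sim=2000, n=50, mu=1.0, sigma=2.0, seed=1):
    """Run n_sim repetitions; return bias and Monte Carlo SE per method."""
    rng = random.Random(seed)
    means, medians = [], []
    for _ in range(n_sim):
        sample = [rng.gauss(mu, sigma) for _ in range(n)]
        means.append(statistics.fmean(sample))
        medians.append(statistics.median(sample))
    results = {}
    for name, estimates in (("mean", means), ("median", medians)):
        bias = statistics.fmean(estimates) - mu  # truth is known: mu
        mc_se = statistics.stdev(estimates) / len(estimates) ** 0.5
        results[name] = (bias, mc_se)
    return results
```

Because the truth mu is known by construction, bias is directly estimable, and reporting its Monte Carlo SE makes clear how much of the observed bias is simulation noise.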
Mixed state Pauli channel parameter estimation
The accuracy of any physical scheme used to estimate the parameter describing
the strength of a single qubit Pauli channel can be quantified using standard
techniques from quantum estimation theory. It is known that the optimal
estimation scheme, with m channel invocations, uses initial states for the
systems which are pure and unentangled and provides an uncertainty of
O[1/m^(1/2)]. This protocol is analogous to a classical repetition and
averaging scheme. We consider estimation schemes where the initial states
available are not pure and compare a protocol involving quantum correlated
states to independent state protocols analogous to classical repetition
schemes. We show that, unlike the pure state case, the quantum correlated state
protocol can yield greater estimation accuracy than any independent state
protocol. We show that these gains persist even when the system states are
separable and, in some cases, when quantum discord is absent after channel
invocation. We describe the relevance of these protocols to nuclear magnetic
resonance measurements.
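The classical repetition-and-averaging baseline mentioned above can be sketched as follows (our own illustration; the parameter names are ours). Each channel invocation flips a probe with unknown probability q; averaging m independent outcomes estimates q with standard error sqrt(q(1-q)/m), i.e. the O(1/m^(1/2)) scaling quoted in the abstract.

```python
import random
import statistics

def estimate_flip_prob(q=0.2, m=10000, trials=200, seed=7):
    """Repeat the m-invocation averaging protocol many times and report
    the mean estimate and its empirical spread (standard error)."""
    rng = random.Random(seed)
    estimates = [sum(rng.random() < q for _ in range(m)) / m
                 for _ in range(trials)]
    return statistics.fmean(estimates), statistics.stdev(estimates)

mean_hat, se_hat = estimate_flip_prob()
# For q = 0.2 and m = 10000 the predicted standard error is
# sqrt(0.2 * 0.8 / 10000) = 0.004.
```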
Primes in short intervals
Contrary to what would be predicted on the basis of Cramér's model
concerning the distribution of prime numbers, we develop evidence that the
distribution of psi(x + H) - psi(x), for 0 ≤ x ≤ N, is approximately
normal with mean ~H and variance ~H log(N/H), when N^δ ≤ H ≤ N^(1-δ).
Comment: 29 pages
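At small scales one can tabulate prime counts in short intervals directly. The sketch below is our own rough illustration, using raw prime counts rather than the weighted Chebyshev function psi, and at ranges far too small to probe the regime the paper concerns; it merely shows how such interval statistics are computed.

```python
import statistics

def prime_sieve(limit):
    """Sieve of Eratosthenes; returns a bytearray of 0/1 primality flags."""
    is_prime = bytearray([1]) * (limit + 1)
    is_prime[0:2] = b"\x00\x00"
    for i in range(2, int(limit ** 0.5) + 1):
        if is_prime[i]:
            is_prime[i * i :: i] = bytearray(len(is_prime[i * i :: i]))
    return is_prime

def interval_counts(N=200000, H=1000):
    """Mean and variance of the number of primes in [x, x + H) for x < N."""
    is_prime = prime_sieve(N + H)
    # Prefix sums give each interval count in O(1).
    prefix = [0] * (N + H + 1)
    for i in range(1, N + H + 1):
        prefix[i] = prefix[i - 1] + is_prime[i]
    counts = [prefix[x + H] - prefix[x] for x in range(0, N, H)]
    return statistics.fmean(counts), statistics.variance(counts)
```

By the prime number theorem the mean count should be roughly H / log N (about 82 for these illustrative values of N and H).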
Chains of large gaps between primes
Let denote the -th prime, and for any and sufficiently
large , define the quantity which measures the occurrence of
chains of consecutive large gaps of primes. Recently, with Green and
Konyagin, the authors showed that for sufficiently large . In this
note, we combine the arguments in that paper with the Maier matrix method to
show that for any fixed and sufficiently large . The
implied constant is effective and independent of .Comment: 16 pages, no figure
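For tiny X the quantity in question can be computed by brute force. The sketch below is ours, under our reading of the definition: G_k(X) is the largest m such that some k consecutive prime gaps, among primes up to X, are all at least m (so G_1(X) is simply the largest prime gap below X).

```python
def primes_up_to(limit):
    """Sieve of Eratosthenes; returns the list of primes <= limit."""
    is_prime = bytearray([1]) * (limit + 1)
    is_prime[0:2] = b"\x00\x00"
    for i in range(2, int(limit ** 0.5) + 1):
        if is_prime[i]:
            is_prime[i * i :: i] = bytearray(len(is_prime[i * i :: i]))
    return [i for i in range(limit + 1) if is_prime[i]]

def G(k, X):
    """Max over all runs of k consecutive gaps of the smallest gap in the run."""
    p = primes_up_to(X)
    gaps = [b - a for a, b in zip(p, p[1:])]
    return max(min(gaps[i : i + k]) for i in range(len(gaps) - k + 1))
```

For example, G(1, 100) is 8 (the gap from 89 to 97), and G(1, 1000) is 20 (the gap from 887 to 907). Such brute force says nothing asymptotic, of course; it only makes the definition tangible.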
Large deviation principles for non-uniformly hyperbolic rational maps
We show some level-2 large deviation principles for rational maps satisfying
a strong form of non-uniform hyperbolicity, called "Topological
Collet-Eckmann". More precisely, we prove a large deviation principle for the
distribution of iterated preimages, periodic points, and Birkhoff averages. For
this purpose we show that each Hölder continuous potential admits a unique
equilibrium state, and that the pressure function can be characterized in terms
of iterated preimages, periodic points, and Birkhoff averages. Then we use a
variant of a general result of Kifer.
Comment: Final version; to appear in Ergodic Theory and Dynamical Systems
MintHint: Automated Synthesis of Repair Hints
Being able to automatically repair programs is an extremely challenging task.
In this paper, we present MintHint, a novel technique for program repair that
is a departure from most of today's approaches. Instead of trying to fully
automate program repair, which is often an unachievable goal, MintHint performs
statistical correlation analysis to identify expressions that are likely to
occur in the repaired code and generates, using pattern-matching based
synthesis, repair hints from these expressions. Intuitively, these hints
suggest how to rectify a faulty statement and help developers find a complete,
actual repair. MintHint can address a variety of common faults, including
incorrect, spurious, and missing expressions.
We present a user study that shows that developers' productivity can improve
manyfold with the use of repair hints generated by MintHint -- compared to
having only traditional fault localization information. We also apply MintHint
to several faults of a widely used Unix utility program to further assess the
effectiveness of the approach. Our results show that MintHint performs well
even in situations where (1) the repair space searched does not contain the
exact repair, and (2) the operational specification obtained from the test
cases for repair is incomplete or even imprecise.