Bayesian Analysis
After making some general remarks, I consider two examples that illustrate
the use of Bayesian Probability Theory. The first is a simple one, the
physicist's favorite "toy," which provides a forum for a discussion of the key
conceptual issue of Bayesian analysis: the assignment of prior probabilities.
The other example illustrates the use of Bayesian ideas in the real world of
experimental physics.
Comment: 14 pages, 5 figures, Workshop on Confidence Limits, CERN, 17-18 January, 2000
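The abstract does not spell out the toy problem; a minimal sketch, assuming it is the standard Poisson counting experiment with a known background, shows how the choice of prior enters the posterior. The counts, background, and priors below are illustrative only.

# Posterior for a Poisson signal mean s given n observed counts and a known
# background b, under two hypothetical priors, to show that the prior matters.
import numpy as np
from scipy.stats import poisson

n, b = 5, 3.0                     # observed counts and known background (made-up numbers)
s = np.linspace(0.0, 20.0, 2001)  # grid of signal-mean values
ds = s[1] - s[0]

likelihood = poisson.pmf(n, s + b)                # L(n | s), Poisson with mean s + b
for name, prior in [("flat", np.ones_like(s)),
                    ("1/sqrt(s+b)", 1.0 / np.sqrt(s + b))]:
    posterior = likelihood * prior
    posterior /= posterior.sum() * ds             # normalize on the grid
    print(f"{name:12s} posterior mean = {(s * posterior).sum() * ds:.2f}")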
Strategy for discovering a low-mass Higgs boson at the Fermilab Tevatron
We have studied the potential of the CDF and DZero experiments to discover a
low-mass Standard Model Higgs boson, during Run II, via the processes
p pbar -> WH -> l nu b bbar, p pbar -> ZH -> l+ l- b bbar
and p pbar -> ZH -> nu nubar b bbar. We
show that a multivariate analysis using neural networks, which exploits all the
information contained within a set of event variables, leads to a significant
reduction, with respect to {\em any} equivalent conventional analysis, in the
integrated luminosity required to find a Standard Model Higgs boson in the mass
range 90 GeV/c**2 < M_H < 130 GeV/c**2. The luminosity reduction is sufficient
to bring the discovery of the Higgs boson within reach of the Tevatron
experiments, given the anticipated integrated luminosities of Run II, whose
scope has recently been expanded.
Comment: 26 pages, 8 figures, 7 tables, to appear in Physical Review D, Minor fixes and revisions
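The event variables and network architecture are not given in the abstract; a minimal sketch, assuming a small feed-forward classifier trained on a few synthetic kinematic variables (hypothetical stand-ins, not the paper's inputs), illustrates the kind of multivariate discriminant described.

# A small neural network as a signal/background discriminant; its output
# approximates P(signal | x) and can replace a set of conventional cuts.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
# synthetic "events" with three hypothetical variables (e.g. dijet mass, missing ET, lepton pT)
sig = rng.normal([115.0, 40.0, 35.0], [15.0, 15.0, 10.0], size=(5000, 3))
bkg = rng.normal([ 80.0, 25.0, 25.0], [30.0, 20.0, 12.0], size=(5000, 3))
X = np.vstack([sig, bkg])
y = np.r_[np.ones(len(sig)), np.zeros(len(bkg))]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
net = MLPClassifier(hidden_layer_sizes=(10,), max_iter=1000, random_state=0)
net.fit(X_tr, y_tr)
print("test accuracy:", net.score(X_te, y_te))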
Optimizing Event Selection with the Random Grid Search
The random grid search (RGS) is a simple, but efficient, stochastic algorithm
to find optimal cuts that was developed in the context of the search for the
top quark at Fermilab in the mid-1990s. The algorithm, and associated code,
have been enhanced recently with the introduction of two new cut types, one of
which has been successfully used in searches for supersymmetry at the Large
Hadron Collider. The RGS optimization algorithm is described along with the
recent developments, which are illustrated with two examples from particle
physics. One explores the optimization of the selection of vector boson fusion
events in the four-lepton decay mode of the Higgs boson and the other optimizes
SUSY searches using boosted objects and the razor variables.
Comment: 26 pages, 9 figures
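The RGS code itself is not reproduced in the abstract; a minimal sketch of the underlying idea, assuming simple one-sided cuts and a naive s/sqrt(b) figure of merit on synthetic data, is shown below.

# Random grid search: each randomly chosen signal event defines a candidate set
# of one-sided cuts (one threshold per variable); keep the set that maximizes
# the figure of merit. Data and figure of merit are illustrative stand-ins.
import numpy as np

rng = np.random.default_rng(1)
sig = rng.normal([1.0, 2.0], [1.0, 1.0], size=(2000, 2))    # synthetic signal events
bkg = rng.normal([0.0, 0.0], [1.0, 1.5], size=(20000, 2))   # synthetic background events

best_cut, best_fom = None, -np.inf
for cut in sig[rng.choice(len(sig), size=500, replace=False)]:
    s = np.count_nonzero(np.all(sig > cut, axis=1))   # signal passing ">" cuts
    b = np.count_nonzero(np.all(bkg > cut, axis=1))   # background passing the same cuts
    if b > 0 and s / np.sqrt(b) > best_fom:
        best_cut, best_fom = cut, s / np.sqrt(b)

print("best cut point:", best_cut, " s/sqrt(b) =", round(best_fom, 2))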
Simulation-Based Frequentist Inference with Tractable and Intractable Likelihoods
High-fidelity simulators that connect theoretical models with observations
are indispensable tools in many sciences. When coupled with machine learning, a
simulator makes it possible to infer the parameters of a theoretical model
directly from real and simulated observations without explicit use of the
likelihood function. This is of particular interest when the latter is
intractable. We introduce a simple modification of the recently proposed
likelihood-free frequentist inference (LF2I) approach that has some
computational advantages. The utility of our algorithm is illustrated by
applying it to three pedagogically interesting examples: the first is from
cosmology, the second from high-energy physics and astronomy, both with
tractable likelihoods, while the third, with an intractable likelihood, is from
epidemiology.
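The algorithm itself is only summarized above; a minimal sketch of the underlying test-inversion (Neyman construction) idea, assuming a one-parameter Poisson model and brute-force Monte Carlo in place of the learned test statistic and critical values, is shown below.

# Confidence set for a Poisson mean mu by inverting a simulated hypothesis test:
# at each mu on a grid, simulate the test statistic, estimate its (1 - alpha)
# quantile, and keep the mu values at which the observed data are not rejected.
import numpy as np

rng = np.random.default_rng(2)
n_obs, alpha = 7, 0.32                          # hypothetical observation, 68% interval
mu_grid = np.linspace(0.1, 20.0, 200)

def lam(n, mu):
    """-2 ln(likelihood ratio) for a Poisson mean mu; the MLE of mu is n."""
    return 2.0 * (mu - n + n * np.log(np.maximum(n, 1e-12) / mu))

accepted = []
for mu in mu_grid:
    sims = rng.poisson(mu, size=5000)                 # pseudo-experiments at this mu
    critical = np.quantile(lam(sims, mu), 1 - alpha)  # simulated critical value
    if lam(n_obs, mu) <= critical:                    # invert the test
        accepted.append(mu)

print(f"approx. {100 * (1 - alpha):.0f}% CL interval: [{min(accepted):.2f}, {max(accepted):.2f}]")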
Reference priors for high energy physics
Bayesian inferences in high energy physics often use uniform prior
distributions for parameters about which little or no information is available
before data are collected. The resulting posterior distributions are therefore
sensitive to the choice of parametrization for the problem and may even be
improper if this choice is not carefully considered. Here we describe an
extensively tested methodology, known as reference analysis, which allows one
to construct parametrization-invariant priors that embody the notion of minimal
informativeness in a mathematically well-defined sense. We apply this
methodology to general cross section measurements and show that it yields
sensible results. A recent measurement of the single top quark cross section
illustrates the relevant techniques in a realistic situation.
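In the simplest special case, a single counting channel with known efficiency ε, integrated luminosity L, and background b (notation assumed here, not the paper's general construction), the reference prior for the cross section σ reduces to Jeffreys' prior, obtained from the Fisher information of the Poisson model n ~ Poisson(εLσ + b):

I(\sigma) = \mathrm{E}\!\left[-\frac{\partial^{2}\ln P(n\mid\sigma)}{\partial\sigma^{2}}\right]
          = \frac{(\epsilon L)^{2}}{\epsilon L\sigma + b},
\qquad
\pi_{\mathrm{ref}}(\sigma) \propto \sqrt{I(\sigma)} \propto \frac{1}{\sqrt{\epsilon L\sigma + b}} .

Unlike a prior that is flat in σ, this prior transforms consistently under a change of parametrization, which is the invariance property referred to in the abstract.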