Bayesian Estimation of Hardness Ratios: Modeling and Computations
A commonly used measure to summarize the nature of a photon spectrum is the
so-called Hardness Ratio, which compares the number of counts observed in
different passbands. The hardness ratio is especially useful to distinguish
between and categorize weak sources as a proxy for detailed spectral fitting.
However, in this regime classical methods of error propagation fail, and the
estimates of spectral hardness become unreliable. Here we develop a rigorous
statistical treatment of hardness ratios that properly models detected
photons as independent Poisson random variables and correctly handles the
non-Gaussian error propagation. The method is Bayesian in nature,
and thus can be generalized to carry out a multitude of
source-population-based analyses. We verify our method with simulation
studies and compare it with the classical method. We apply the method to
real-world examples, such as the identification of candidate quiescent
low-mass X-ray binaries in globular clusters and the tracking of the time
evolution of a flare on a low-mass star. Comment: 43 pages, 10 figures, 3 tables; submitted to ApJ
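As a concrete illustration of the Poisson treatment, the following is a minimal Monte Carlo sketch, assuming Jeffreys-type gamma priors on the soft- and hard-band intensities and ignoring the background and instrument effects that the full method treats properly; the counts below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

def hardness_ratio_posterior(soft_counts, hard_counts, n_draws=100_000):
    """Posterior draws of HR = (H - S) / (H + S) for Poisson counts.

    Assumes independent Poisson likelihoods with Jeffreys Gamma(1/2)
    priors on the band intensities; background subtraction and
    effective-area corrections are omitted for clarity.
    """
    # The gamma distribution is conjugate to the Poisson likelihood:
    # posterior shape = counts + prior shape (unit exposure assumed).
    lam_s = rng.gamma(shape=soft_counts + 0.5, scale=1.0, size=n_draws)
    lam_h = rng.gamma(shape=hard_counts + 0.5, scale=1.0, size=n_draws)
    return (lam_h - lam_s) / (lam_h + lam_s)

# A weak source: 3 soft and 7 hard counts.
hr = hardness_ratio_posterior(3, 7)
lo, med, hi = np.percentile(hr, [16, 50, 84])
print(f"HR = {med:.2f} (+{hi - med:.2f} / -{med - lo:.2f})")
```

Unlike Gaussian error propagation, the resulting interval always stays inside the physical range [-1, 1], which is the practical payoff in the low-count regime.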
Incorporating Uncertainties in Atomic Data Into the Analysis of Solar and Stellar Observations: A Case Study in Fe XIII
The physical properties of astrophysical objects cannot be measured
directly but are inferred by interpreting spectroscopic observations in
the context of atomic physics calculations. Ratios of emission lines, for
example, can be used to infer the electron density of the emitting plasma.
Similarly, the relative intensities of emission lines formed over a wide range
of temperatures yield information on the temperature structure. A critical
component of this analysis is understanding how uncertainties in the underlying
atomic physics propagate to the uncertainties in the inferred plasma
parameters. At present, however, atomic physics databases do not include
uncertainties on the atomic parameters and there is no established methodology
for using them even if they did. In this paper we develop simple models for the
uncertainties in the collision strengths and decay rates for Fe XIII and apply
them to the interpretation of density-sensitive lines observed with the EUV
Imaging Spectrometer (EIS) on Hinode. We incorporate these uncertainties in a
Bayesian framework. We consider both a pragmatic Bayesian method where the
atomic physics information is unaffected by the observed data, and a fully
Bayesian method where the data can be used to probe the physics. The former
generally increases the uncertainty in the inferred density by about a factor
of 5 compared with models that incorporate only statistical uncertainties. The
latter reduces the uncertainties on the inferred densities, but identifies
areas of possible systematic problems with either the atomic physics or the
observed intensities. Comment: in press at ApJ
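The pragmatic Bayesian idea can be sketched in a few lines: average the likelihood over draws of the uncertain atomic parameters without letting the data update them. The ratio-density curve, the 10% lognormal spread, and all numbers below are illustrative assumptions, not the paper's Fe XIII model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy density-sensitive line ratio: rises with electron density n_e and
# saturates above an assumed critical density (illustrative value only).
N_CRIT = 1e9  # cm^-3

def model_ratio(log_ne):
    ne = 10.0 ** log_ne
    return ne / (ne + N_CRIT)

R_OBS, R_ERR = 0.40, 0.02   # observed ratio and its statistical error
ATOMIC_SIGMA = 0.10         # assumed lognormal atomic-data uncertainty

log_ne = np.linspace(7.0, 11.0, 400)

# Statistical-only posterior (flat prior in log n_e).
post_stat = np.exp(-0.5 * ((R_OBS - model_ratio(log_ne)) / R_ERR) ** 2)
post_stat /= post_stat.sum()

# Pragmatic Bayesian: marginalise over the atomic scaling by simple
# Monte Carlo, keeping the atomic prior unaffected by the data.
scales = rng.lognormal(mean=0.0, sigma=ATOMIC_SIGMA, size=2000)
like = np.zeros_like(log_ne)
for s in scales:
    like += np.exp(-0.5 * ((R_OBS - s * model_ratio(log_ne)) / R_ERR) ** 2)
post_prag = like / like.sum()

def width68(post):
    lo, hi = np.interp([0.16, 0.84], np.cumsum(post), log_ne)
    return hi - lo

print(f"68% width in log n_e: stat-only {width68(post_stat):.2f}, "
      f"with atomic uncertainty {width68(post_prag):.2f}")
```

The fully Bayesian variant would instead place the atomic scaling inside the joint posterior, so that tension between the data and the atomic physics could pull it away from its prior.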
Anatomy of the Higgs fits: a first guide to statistical treatments of the theoretical uncertainties
Studies of the Higgs boson couplings based on recent and upcoming LHC
data open a new window on physics beyond the Standard Model. In this paper,
we propose a statistical guide to the consistent treatment of the theoretical
uncertainties entering the Higgs rate fits. Both the Bayesian and frequentist
approaches are systematically analysed in a unified formalism. We present
analytical expressions for the marginal likelihoods, useful to implement
simultaneously the experimental and theoretical uncertainties. We review the
various origins of the theoretical errors (QCD, EFT, PDF, production mode
contamination...). All these individual uncertainties are thoroughly combined
with the help of moment-based considerations. The theoretical correlations
among Higgs detection channels appear to affect the location and size of the
best-fit regions in the space of Higgs couplings. We discuss the recurrent
question of the shape of the prior distributions for the individual theoretical
errors and find that a nearly Gaussian prior arises from the error
combinations. We also develop the bias approach, an alternative to
marginalisation that provides more conservative results. The statistical framework
to apply the bias principle is introduced and two realisations of the bias are
proposed. Finally, depending on the statistical treatment, the Standard Model
prediction for the Higgs signal strengths is found to lie within either the
68% or 95% confidence level region obtained from the latest analyses of
the 7 and 8 TeV LHC datasets. Comment: 62 pages, 10 figures
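As a toy illustration of how marginalisation and the bias approach differ, consider a single signal-strength measurement with a Gaussian experimental error and a Gaussian theory bias. The one-channel setup and all numbers are assumptions for the sketch; the paper's moment-based combinations and its two bias realisations are considerably richer.

```python
import numpy as np

MU_HAT, SIG_EXP, SIG_TH = 1.1, 0.2, 0.1  # illustrative values

def likelihood(mu, delta):
    # Gaussian likelihood of the measured strength, with a
    # multiplicative theory bias (1 + delta) on the prediction.
    return np.exp(-0.5 * ((MU_HAT - mu * (1.0 + delta)) / SIG_EXP) ** 2)

mu_grid = np.linspace(0.0, 2.0, 401)
deltas = np.linspace(-0.5, 0.5, 201)

# Bayesian marginalisation: integrate the bias against its prior.
prior = np.exp(-0.5 * (deltas / SIG_TH) ** 2)
prior /= prior.sum()
marg = np.array([(likelihood(mu, deltas) * prior).sum() for mu in mu_grid])

# A simple bias-style envelope: scan the bias over +/- one theory
# sigma and keep, for each mu, the most favourable value.
bias_vals = np.linspace(-SIG_TH, SIG_TH, 21)
env = np.array([likelihood(mu, bias_vals).max() for mu in mu_grid])

for name, like in [("marginalised", marg), ("bias envelope", env)]:
    inside = mu_grid[like >= like.max() * np.exp(-0.5)]  # ~68% region
    print(f"{name}: mu in [{inside.min():.2f}, {inside.max():.2f}]")
```

The envelope interval comes out wider, which is the sense in which a bias-style treatment is more conservative than marginalisation.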
Bayesian Methods for Exoplanet Science
Exoplanet research is carried out at the limits of the capabilities of
current telescopes and instruments. The studied signals are weak, and often
embedded in complex systematics from instrumental, telluric, and astrophysical
sources. Combining repeated observations of periodic events, simultaneous
observations with multiple telescopes, different observation techniques, and
existing information from theory and prior research can help to disentangle the
systematics from the planetary signals, and offers synergistic advantages over
analysing observations separately. Bayesian inference provides a
self-consistent statistical framework that addresses both the necessity for
complex systematics models, and the need to combine prior information and
heterogeneous observations. This chapter offers a brief introduction to
Bayesian inference in the context of exoplanet research, with a focus on
time-series analysis, and finishes with an overview of freely available
programming libraries. Comment: Invited review
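To make the framework concrete, the following is a minimal sketch of Bayesian parameter estimation for a noisy periodic signal, using a plain Metropolis sampler in numpy; the simulated data, priors, and step sizes are all assumptions, and a real analysis would use the dedicated samplers from libraries such as those surveyed in the chapter.

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulated radial-velocity-like series: one sinusoid plus white noise.
t = np.sort(rng.uniform(0.0, 30.0, 60))              # days
P_TRUE, K_TRUE, SIG = 5.3, 4.0, 1.5                  # period, amplitude, noise
y = K_TRUE * np.sin(2 * np.pi * t / P_TRUE) + rng.normal(0.0, SIG, t.size)

def log_posterior(theta):
    P, K = theta
    if not (1.0 < P < 20.0 and 0.0 < K < 20.0):      # uniform priors
        return -np.inf
    resid = y - K * np.sin(2 * np.pi * t / P)
    return -0.5 * np.sum((resid / SIG) ** 2)         # Gaussian likelihood

# Plain Metropolis random walk, started near a periodogram-style guess.
theta = np.array([5.2, 3.0])
lp = log_posterior(theta)
samples = []
for _ in range(20_000):
    prop = theta + rng.normal(0.0, [0.01, 0.2])
    lp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta)
samples = np.array(samples[5_000:])                  # discard burn-in

print(f"P = {samples[:, 0].mean():.3f} +/- {samples[:, 0].std():.3f} d")
print(f"K = {samples[:, 1].mean():.2f} +/- {samples[:, 1].std():.2f}")
```

Priors enter here only through the hard bounds; informative priors from earlier observations would simply add terms to `log_posterior`, which is how heterogeneous data sources are combined in this framework.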
Longitudinal quantile regression in presence of informative drop-out through longitudinal-survival joint modeling
We propose a joint model for a time-to-event outcome and a quantile of a
continuous response repeatedly measured over time. The quantile and survival
processes are associated via shared latent and manifest variables. Our joint
model provides a flexible approach to handle informative drop-out in quantile
regression. A general Monte Carlo Expectation Maximization strategy based on
importance sampling is proposed, which is directly applicable under any
distributional assumption for the longitudinal outcome and random effects, and
under both parametric and non-parametric assumptions for the baseline hazard. Model
properties are illustrated through a simulation study and an application to an
original data set on dilated cardiomyopathies.
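The quantile-regression building block of such a model can be sketched via the check (pinball) loss, whose minimiser is the conditional quantile. Everything below is a simplified illustration with made-up data; the paper's joint model additionally links this to a survival submodel through shared variables and fits both by Monte Carlo EM.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)

# Simulated data with heteroscedastic noise, so that different
# quantiles of y given x have genuinely different slopes.
x = rng.uniform(0.0, 10.0, 300)
y = 1.0 + 0.5 * x + rng.normal(0.0, 0.2 + 0.1 * x)

def pinball_loss(beta, tau):
    """Check-function loss; its minimiser estimates the tau-th quantile."""
    resid = y - (beta[0] + beta[1] * x)
    return np.mean(np.maximum(tau * resid, (tau - 1.0) * resid))

for tau in (0.1, 0.5, 0.9):
    fit = minimize(pinball_loss, x0=[0.0, 0.0], args=(tau,),
                   method="Nelder-Mead")
    print(f"tau={tau}: intercept {fit.x[0]:.2f}, slope {fit.x[1]:.2f}")
```

Informative drop-out breaks this simple estimator because subjects with extreme trajectories tend to leave the study early; the shared-variable joint model is what restores valid inference in that case.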