Statistical methods in cosmology
The advent of large data sets in cosmology has meant that in the past 10 or 20
years our knowledge and understanding of the Universe has changed not only
quantitatively but also, and most importantly, qualitatively. Cosmologists rely
on data in which a host of useful information is enclosed, but encoded in a
non-trivial way. The challenges in extracting this information must be overcome
to make the most of a large experimental effort. Even after having converged to
a standard cosmological model (the LCDM model) we should keep in mind that this
model is described by 10 or more physical parameters and, if we want to study
deviations from it, the number of parameters is even larger. Dealing with such
a high-dimensional parameter space and finding parameter constraints is a
challenge in itself. Cosmologists want to be able to compare and combine
different data sets both for testing for possible disagreements (which could
indicate new physics) and for improving parameter determinations. Finally,
cosmologists in many cases want to find out, before actually doing the
experiment, how much one would be able to learn from it. For all these reasons,
sophisticated statistical techniques are being employed in cosmology, and it
has become crucial to know some statistical background to understand recent
literature in the field. I will introduce some statistical tools that any
cosmologist should know about in order to be able to understand recently
published results from the analysis of cosmological data sets. I will not
present a complete and rigorous introduction to statistics as there are several
good books which are reported in the references. The reader should refer to
those.
Comment: 31 pages, 6 figures; notes from the 2nd Trans-Regio Winter School in
Passo del Tonale. To appear in Lecture Notes in Physics, "Lectures on
cosmology: Accelerated expansion of the universe", Feb 201
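As a concrete illustration of the kind of forecasting and data-set combination the notes refer to, here is a minimal sketch using the Fisher-matrix formalism, one standard tool for this purpose: two hypothetical experiments are combined by adding their Fisher matrices, and marginalized parameter errors are read off the inverse. The parameter names and all matrix entries below are invented for illustration and are not taken from the notes.

    import numpy as np

    # Hypothetical Fisher matrices for two independent experiments, over the
    # same (illustrative) parameter ordering: Omega_m, sigma_8, n_s.
    # The numbers are invented; only the procedure is the point.
    F_a = np.array([[4.0e4, 1.2e4, 0.5e4],
                    [1.2e4, 2.5e4, 0.8e4],
                    [0.5e4, 0.8e4, 1.5e4]])
    F_b = np.array([[2.0e4, 0.4e4, 0.1e4],
                    [0.4e4, 3.0e4, 0.6e4],
                    [0.1e4, 0.6e4, 2.2e4]])

    # Independent data sets combine by adding their Fisher matrices; the
    # marginalized 1-sigma forecast errors are the square roots of the
    # diagonal of the inverse combined matrix.
    F_comb = F_a + F_b
    sigma = np.sqrt(np.diag(np.linalg.inv(F_comb)))
    for name, s in zip(["Omega_m", "sigma_8", "n_s"], sigma):
        print(f"forecast sigma({name}) = {s:.2e}")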
Modern Statistical Methods for GLAST Event Analysis
We describe a statistical reconstruction methodology for the GLAST LAT. The
methodology incorporates in detail the statistics of the interactions of
photons and charged particles with the tungsten layers in the LAT, and uses the
scattering distributions to compute the full probability distribution over the
energy and direction of the incident photons. It uses model selection methods
to estimate the probabilities of the possible geometrical configurations of the
particles produced in the detector, and numerical marginalization over the
energy loss and scattering angles at each layer. Preliminary results show that
it can improve on the tracker-only energy estimates for muons and electrons
incident on the LAT.
Comment: To appear in the proceedings of the First GLAST Symposium (held at
Stanford University, 5-8 February 2007)
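To make the marginalization step concrete, the toy sketch below integrates a nuisance scattering angle out of a joint likelihood on a grid to obtain a posterior over the incident energy. The Gaussian "instrument model", the grids, and the measured values are invented for illustration only; this is not the actual LAT reconstruction code.

    import numpy as np

    # Toy numerical marginalization: posterior over incident energy E after
    # integrating out a nuisance scattering angle theta on a grid.
    E_grid = np.linspace(1.0, 100.0, 400)     # candidate energies (arbitrary units)
    theta_grid = np.linspace(0.0, 0.1, 200)   # scattering angles (rad)
    d_theta = theta_grid[1] - theta_grid[0]
    d_E = E_grid[1] - E_grid[0]

    def likelihood(E, theta, measured_angle=0.03, measured_signal=40.0):
        # Made-up Gaussian model: the observed deflection and the calorimeter
        # signal both depend on the true energy and the scattering angle.
        return (np.exp(-0.5 * ((theta - measured_angle) * E / 1.5) ** 2) *
                np.exp(-0.5 * ((measured_signal - 0.8 * E) / 5.0) ** 2))

    # Joint likelihood on the grid, then marginalize theta by summation.
    L = likelihood(E_grid[:, None], theta_grid[None, :])
    posterior_E = L.sum(axis=1) * d_theta
    posterior_E /= posterior_E.sum() * d_E    # normalize over the energy grid

    print("posterior mean energy:", (E_grid * posterior_E).sum() * d_E)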
Quantum Monte Carlo Methods in Statistical Mechanics
This paper deals with the optimization of trial states for the computation of
dominant eigenvalues of operators and very large matrices. In addition to
preliminary results for the energy spectrum of van der Waals clusters, we
review results of the application of this method to the computation of
relaxation times of independent relaxation modes at the Ising critical point in
two dimensions.
Comment: 11 pages, 1 figure
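The deterministic sketch below only illustrates the underlying idea of projecting a trial state onto the dominant eigenvector, here by plain power iteration on a small synthetic matrix with a known spectrum; the quantum Monte Carlo methods reviewed in the paper perform this projection stochastically for spaces far too large to store, so the code is an analogy, not the method itself.

    import numpy as np

    # Build a synthetic symmetric matrix with a known, well-separated
    # dominant eigenvalue, purely so the example is self-contained.
    rng = np.random.default_rng(0)
    n = 400
    Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
    eigs = np.linspace(0.0, 1.0, n)
    eigs[-1] = 2.0                       # dominant eigenvalue, gap of a factor 2
    A = (Q * eigs) @ Q.T                 # A = Q diag(eigs) Q^T

    v = rng.standard_normal(n)           # crude trial state
    v /= np.linalg.norm(v)

    for _ in range(100):                 # repeated application of A filters out
        v = A @ v                        # the sub-dominant components
        v /= np.linalg.norm(v)

    # Rayleigh quotient gives the dominant-eigenvalue estimate (close to 2.0).
    print("power-iteration estimate:", v @ A @ v)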
Statistical Software for State Space Methods
In this paper we review the state space approach to time series analysis and establish the notation that is adopted in this special volume of the Journal of Statistical Software. We first provide some background on the history of state space methods for the analysis of time series. This is followed by a concise overview of linear Gaussian state space analysis including the modelling framework and appropriate estimation methods. We discuss the important class of unobserved component models which incorporate a trend, a seasonal, a cycle, and fixed explanatory and intervention variables for the univariate and multivariate analysis of time series. We continue the discussion by presenting methods for the computation of different estimates for the unobserved state vector: filtering, prediction, and smoothing. Estimation approaches for the other parameters in the model are also considered. Next, we discuss how the estimation procedures can be used for constructing confidence intervals, detecting outlier observations and structural breaks, and testing model assumptions of residual independence, homoscedasticity, and normality. We then show how ARIMA and ARIMA components models fit into the state space framework for time series analysis. We also provide a basic introduction to non-Gaussian state space models. Finally, we present an overview of the software tools currently available for the analysis of time series with state space methods as they are discussed in the other contributions to this special volume.
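As a minimal, self-contained illustration of the filtering recursions described above, the sketch below runs a Kalman filter for the local level model, the simplest unobserved components model; the variances and the simulated data are made-up assumptions, and the code is not tied to any of the software packages in the special volume.

    import numpy as np

    # Local level model: y_t = mu_t + eps_t,  mu_{t+1} = mu_t + eta_t.
    rng = np.random.default_rng(42)
    T = 200
    sigma2_eps, sigma2_eta = 1.0, 0.1     # illustrative variances

    # Simulate a random-walk level observed with noise.
    level = np.cumsum(rng.normal(0.0, np.sqrt(sigma2_eta), T))
    y = level + rng.normal(0.0, np.sqrt(sigma2_eps), T)

    # Kalman filter recursions for the filtered level a and its variance P.
    a, P = 0.0, 1e6                       # near-diffuse initialization
    filtered = np.empty(T)
    for t in range(T):
        v = y[t] - a                      # prediction error
        F = P + sigma2_eps                # prediction-error variance
        K = P / F                         # Kalman gain
        a = a + K * v                     # filtered state estimate
        P = P * (1.0 - K)                 # filtered state variance
        filtered[t] = a
        P = P + sigma2_eta                # one-step-ahead prediction variance

    print("last filtered level:", filtered[-1], "true level:", level[-1])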
