Statistical techniques in cosmology
In these lectures I cover a number of topics in cosmological data analysis. I concentrate on general techniques which are common in cosmology, or techniques which have been developed in a cosmological context. In fact they have very general applicability, for problems in which the data are interpreted in the context of a theoretical model, and thus lend themselves to a Bayesian treatment. We consider the general problem of estimating parameters from data, and consider how one can use Fisher matrices to analyse survey designs before any data are taken, to see whether the survey will actually do what is required. We outline numerical methods for estimating parameters from data, including Monte Carlo Markov Chains and the Hamiltonian Monte Carlo method. We also look at Model Selection, which covers various scenarios such as whether an extra parameter is preferred by the data, or answering wider questions such as which theoretical framework is favoured, using General Relativity and braneworld gravity as an example. These notes are not a literature review, so there are relatively few references. Comment: Typos corrected and exercises added.
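The lectures mention Markov Chain Monte Carlo among the numerical methods for estimating parameters from data. As a minimal sketch (not the lectures' own code), a Metropolis-Hastings sampler for a toy one-dimensional Gaussian posterior could look like the following; the target distribution, step size, and chain length are all illustrative assumptions:

```python
import numpy as np

# Toy target: an unnormalised log-posterior, here N(mean=1, variance=1).
# All numbers are illustrative, not taken from the lectures.
rng = np.random.default_rng(42)

def log_posterior(theta):
    return -0.5 * (theta - 1.0) ** 2

n_steps, step_size = 20000, 1.0
chain = np.empty(n_steps)
theta = 0.0
logp = log_posterior(theta)
for i in range(n_steps):
    # Symmetric Gaussian proposal around the current point.
    proposal = theta + step_size * rng.standard_normal()
    logp_new = log_posterior(proposal)
    # Metropolis rule: accept with probability min(1, p_new / p_old).
    if np.log(rng.uniform()) < logp_new - logp:
        theta, logp = proposal, logp_new
    chain[i] = theta

burned = chain[n_steps // 4:]       # discard the first quarter as burn-in
print(burned.mean(), burned.std())  # should approach the true values 1.0, 1.0
```

Hamiltonian Monte Carlo, also discussed in the lectures, replaces the random-walk proposal with gradient-guided trajectories, but the accept/reject skeleton is the same.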
Generalisations of Fisher Matrices
Fisher matrices play an important role in experimental design and in data
analysis. Their primary role is to make predictions for the inference of model
parameters - both their errors and covariances. In this short review, I outline
a number of extensions to the simple Fisher matrix formalism, covering a number
of recent developments in the field. These are: (a) situations where the data
(in the form of (x,y) pairs) have errors in both x and y; (b) modifications to
parameter inference in the presence of systematic errors, or through fixing the
values of some model parameters; (c) Derivative Approximation for LIkelihoods
(DALI) - higher-order expansions of the likelihood surface, going beyond the
Gaussian shape approximation; (d) extensions of the Fisher-like formalism, to
treat model selection problems with Bayesian evidence. Comment: Invited review article for Entropy special issue on 'Applications of Fisher Information in Sciences'. Accepted version.
Intrinsic Galaxy Alignments and Weak Gravitational Lensing
Gravitational lensing causes background galaxy images to become aligned, and
the statistical characteristics of the image alignments can then be used to
constrain the power spectrum of mass fluctuations. Analyses of gravitational
lensing assume that intrinsic galaxy alignments are negligible, but if this
assumption does not hold, then the interpretation of image alignments will be
in error. As gravitational lensing experiments become more ambitious and seek
very low-level alignments arising from lensing by large-scale structure, it
becomes more important to estimate the level of intrinsic alignment in the
galaxy population. In this article, I review the handful of independent
theoretical studies of this issue, as well as the current observational status.
Theoretically, the calculation of intrinsic alignments is by no means
straightforward, but some consensus has emerged from the existing works,
despite each making very different assumptions. This consensus is that a)
intrinsic alignments are a small but non-negligible (< 10%) contaminant of the
lensing ellipticity correlation function, for samples with a median redshift z
= 1; b) intrinsic alignments dominate the signal for low-redshift samples (z =
0.1), as expected in the SuperCOSMOS lensing survey and the Sloan Digital Sky
Survey. Comment: 8 pages. Invited talk at Yale Workshop on 'The Shapes of
Galaxies and their halos', May 200
Weak gravitational lensing: reducing the contamination by intrinsic alignments
Intrinsic alignments of galaxies can mimic to an extent the effects of shear
caused by weak gravitational lensing. Previous studies have shown that for
shallow surveys with median redshifts z_m = 0.1, the intrinsic alignment
dominates the lensing signal. For deep surveys with z_m = 1, intrinsic
alignments are believed to be a significant contaminant of the lensing signal,
preventing high-precision measurements of the matter power spectrum. In this
paper we show how distance information, either spectroscopic or photometric
redshifts, can be used to down-weight nearby pairs in an optimised way, to
reduce the errors in the shear signal arising from intrinsic alignments.
Provided a conservatively large intrinsic alignment is assumed, the optimised
weights will essentially remove all traces of contamination. For the Sloan
spectroscopic galaxy sample, residual shot noise continues to render it
unsuitable for weak lensing studies. However, a dramatic improvement for the
slightly deeper Sloan photometric survey is found, whereby the intrinsic
contribution, at angular scales greater than 1 arcminute, is reduced from about
80 times the lensing signal to a 10% effect. For deeper surveys such as the
COMBO-17 survey with z_m = 0.6, the optimisation reduces the error from a
largely systematic 220% error at small angular scales to a much smaller and
largely statistical error of only 17% of the expected lensing signal. We
therefore propose that future weak lensing surveys be accompanied by the
acquisition of photometric redshifts, in order to remove fully the unknown
intrinsic alignment errors from weak lensing detections. Comment: 10 pages, 6
figures, MNRAS accepted. Minor changes to match accepted version. RCS and ODT
predictions are modified.
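The paper derives optimised pair weights; the toy function below is not that derivation, only a qualitative sketch of the idea it rests on: galaxy pairs at nearly the same redshift (where intrinsic alignments arise) are down-weighted, while well-separated pairs (whose correlation can only come from lensing) keep full weight. The Gaussian form and the width `delta_z`, standing in for a typical photometric-redshift uncertainty, are illustrative assumptions:

```python
import numpy as np

def pair_weight(z1, z2, delta_z=0.05):
    """Suppress pairs whose redshifts agree to within ~delta_z.

    Illustrative only: close pairs (likely physically associated, so
    prone to intrinsic alignment) get weight near 0; distant pairs
    (lensing-dominated) get weight near 1.
    """
    return 1.0 - np.exp(-0.5 * ((z1 - z2) / delta_z) ** 2)

print(pair_weight(0.50, 0.50))  # physically close pair: weight near 0
print(pair_weight(0.30, 0.90))  # widely separated pair: weight near 1
```

The optimised weights in the paper additionally balance this suppression against the shot noise it introduces, which is why residual shot noise still matters for the shallow spectroscopic sample.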
Objective Bayesian analysis of neutrino masses and hierarchy
Given the precision of current neutrino data, priors still impact noticeably
the constraints on neutrino masses and their hierarchy. To avoid our
understanding of neutrinos being driven by prior assumptions, we construct a
prior that is mathematically minimally informative. Using the constructed
uninformative prior, we find that the normal hierarchy is favoured but with
inconclusive posterior odds of 5.1:1. Better data are hence needed before the
neutrino masses and their hierarchy can be well constrained. We find that the
next decade of cosmological data should provide conclusive evidence if the
normal hierarchy with negligible minimum mass is correct, and if the
uncertainty in the sum of neutrino masses drops below 0.025 eV. On the other
hand, if neutrinos obey the inverted hierarchy, achieving strong evidence will
be difficult with the same uncertainties. Our uninformative prior was
constructed from principles of the Objective Bayesian approach. The prior is
called a reference prior and is minimally informative in the specific sense
that the information gain after collection of data is maximised. The prior is
computed for the combination of neutrino oscillation data and cosmological data
and still applies if the data improve. Comment: 15 pages. Minor changes to match accepted version in JCAP.
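The quoted posterior odds of 5.1:1 for the normal hierarchy can be translated into a posterior probability and placed on the log-Bayes-factor scale commonly used for model selection. The 5.1:1 figure is from the abstract; the "strong evidence" threshold of roughly 150:1 (|ln B| of about 5) is the conventional Jeffreys-style calibration, not a number from this paper:

```python
import numpy as np

# Posterior odds P(normal)/P(inverted), assuming equal prior model
# probabilities, as quoted in the abstract.
odds = 5.1

p_normal = odds / (1.0 + odds)  # posterior probability of normal hierarchy
ln_bayes = np.log(odds)         # same information on the ln-Bayes-factor scale

print(round(p_normal, 3))       # 0.836: favoured, but far from decisive
print(round(ln_bayes, 2))       # 1.63, well below the strong-evidence ~5
```

This makes the abstract's wording concrete: odds of 5.1:1 favour the normal hierarchy, yet sit far below the level conventionally called strong evidence.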