5 research outputs found
Bayesian analysis of Friedmannless cosmologies
Assuming only a homogeneous and isotropic universe and using both the 'Gold'
Supernova Type Ia sample of Riess et al. and the results from the Supernova
Legacy Survey, we calculate the Bayesian evidence of a range of different
parameterizations of the deceleration parameter. We consider both spatially
flat and curved models. Our results show that although there is strong evidence
in the data for an accelerating universe, there is little evidence that the
deceleration parameter varies with redshift.
Comment: 7 pages, 3 figures.
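The kind of evidence comparison described above can be illustrated with a toy sketch (not the paper's analysis): a constant deceleration parameter q(z) = q0 versus a linear q(z) = q0 + q1 z, with the Bayesian evidence of each computed by direct integration of the likelihood over flat priors. All data values, error bars and prior ranges below are hypothetical.

```python
import math

# Hypothetical "measurements" of the deceleration parameter at several redshifts.
zs = [0.1, 0.3, 0.5, 0.8, 1.2]
qobs = [-0.55, -0.50, -0.48, -0.52, -0.47]
sigma = 0.15

def loglike(q0, q1):
    # Gaussian log-likelihood for the model q(z) = q0 + q1*z
    chi2 = sum(((q0 + q1 * z) - q) ** 2 / sigma ** 2 for z, q in zip(zs, qobs))
    return -0.5 * chi2

def evidence_const():
    # Model 1: constant q0, flat prior on [-2, 2]; midpoint-rule integration.
    lo, hi, n = -2.0, 2.0, 400
    dx = (hi - lo) / n
    return sum(math.exp(loglike(lo + (i + 0.5) * dx, 0.0)) for i in range(n)) * dx / (hi - lo)

def evidence_linear():
    # Model 2: q(z) = q0 + q1*z, flat priors on [-2, 2] for both parameters.
    lo, hi, n = -2.0, 2.0, 120
    dx = (hi - lo) / n
    s = 0.0
    for i in range(n):
        for j in range(n):
            s += math.exp(loglike(lo + (i + 0.5) * dx, lo + (j + 0.5) * dx))
    return s * dx * dx / (hi - lo) ** 2

Z1, Z2 = evidence_const(), evidence_linear()
# ln B12 > 0 means the data favour the simpler, constant-q model:
# the extra parameter is penalised by its diluted prior (Occam's razor).
print("ln B12 =", math.log(Z1 / Z2))
```

With data showing no redshift trend, the Bayes factor favours the constant model even though the two-parameter model fits slightly better, which is the Occam penalty at work.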
Fisher Matrix Preloaded -- Fisher4Cast
The Fisher Matrix is the backbone of modern cosmological forecasting. We
describe the Fisher4Cast software: a general-purpose, easy-to-use, Fisher
Matrix framework. It is open source, rigorously designed and tested and
includes a Graphical User Interface (GUI) with automated LaTeX file creation
capability and point-and-click Fisher ellipse generation. Fisher4Cast was
designed for ease of extension and, although written in Matlab, is easily
portable to open-source alternatives such as Octave and Scilab. Here we use
Fisher4Cast to present new 3-D and 4-D visualisations of the forecasting
landscape and to investigate the effects of growth and curvature on future
cosmological surveys. Early releases have been available at
http://www.cosmology.org.za since May 2008 with 750 downloads in the first
year. Version 2.2 is made public with this paper and includes a Quick Start
guide and the code used to produce the figures in this paper, in the hope that
it will be useful to the cosmology and wider scientific communities.
Comment: 30 pages, 15 figures. Minor revisions to match published version,
with some additional functionality described to match the current version
(2.2) of the code. Software available at http://www.cosmology.org.za. Usage,
structure and flow of the software, as well as tests performed are described
in the accompanying Users' Manual.
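A minimal illustration of the kind of computation a Fisher Matrix framework automates (this is not Fisher4Cast code, and all numbers are hypothetical): build the Fisher matrix for a two-parameter model from the derivatives of the observable, invert it for the forecast covariance, and read off the marginal errors and the 1-sigma error-ellipse axes.

```python
import math

# Toy Fisher forecast for a two-parameter model y = a + b*x observed at
# points xs with Gaussian error sigma (all values hypothetical).
xs = [0.2, 0.5, 1.0, 1.5, 2.0]
sigma = 0.1

# Fisher matrix F_ij = sum_k (dy/dp_i)(dy/dp_j) / sigma^2 with p = (a, b);
# the derivatives are dy/da = 1 and dy/db = x.
F = [[0.0, 0.0], [0.0, 0.0]]
for x in xs:
    d = [1.0, x]
    for i in range(2):
        for j in range(2):
            F[i][j] += d[i] * d[j] / sigma ** 2

# Invert the 2x2 Fisher matrix to get the forecast parameter covariance.
det = F[0][0] * F[1][1] - F[0][1] * F[1][0]
C = [[F[1][1] / det, -F[0][1] / det],
     [-F[1][0] / det, F[0][0] / det]]

# Marginalised 1-sigma errors are the square roots of the diagonal.
sig_a, sig_b = math.sqrt(C[0][0]), math.sqrt(C[1][1])

# Semi-axes of the 1-sigma ellipse: square roots of the covariance eigenvalues.
tr, dt = C[0][0] + C[1][1], C[0][0] * C[1][1] - C[0][1] ** 2
lam1 = 0.5 * (tr + math.sqrt(tr ** 2 - 4 * dt))
lam2 = 0.5 * (tr - math.sqrt(tr ** 2 - 4 * dt))
print(sig_a, sig_b, math.sqrt(lam1), math.sqrt(lam2))
```

The marginalised error on each parameter is never smaller than the conditional error 1/sqrt(F_ii), which is why correlated parameters produce elongated ellipses.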
Gamma Ray Bursts as standard candles to constrain the cosmological parameters
Gamma Ray Bursts (GRBs) are among the most powerful sources in the Universe:
they emit up to 10^54 erg in the hard X-ray band in a few tens of seconds. The
cosmological origin of GRBs has been confirmed by several spectroscopic
measurements of their redshifts, distributed in the range 0.1-6.3. These two
properties make GRBs very appealing to investigate the far Universe. The
energetics implied by the observed fluences and redshifts span at least four
orders of magnitude. Therefore, at first sight, GRBs are anything but standard
candles. But there are correlations among some observed quantities which allow
us to know the total energy or the peak luminosity emitted by a specific burst
with great accuracy. Through these correlations, GRBs become "known" candles
to constrain the cosmological parameters. One of these correlations is between
the rest frame peak spectral energy E_peak and the total energy emitted in
gamma--rays E_gamma, properly corrected for the collimation factor. Another
correlation, discovered very recently, relates the total GRB luminosity L_iso,
its peak spectral energy E_peak and a characteristic timescale T_0.45, related
to the variability of the prompt emission. Based only on prompt-emission
properties, it is completely phenomenological, model-independent and
assumption-free. The constraints found through these correlations on the
Omega_M and Omega_Lambda parameters are consistent with the concordance model.
The present limited sample of bursts and the lack of low-redshift events,
necessary to calibrate these correlations, mean that the cosmological
constraints obtained with GRBs are still weak compared to those obtained with other
cosmological probes (e.g. SNIa or CMB). However, the newly born field of
GRB cosmology is very promising for the future.
Comment: 39 pages, 23 figures, 2 tables. Accepted for publication in the New
Journal of Physics focus issue, "Focus on Gamma-Ray Bursts in the Swift Era"
(Eds. D. H. Hartmann, C. D. Dermer, J. Greiner).
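How such a correlation turns a burst into a "known" candle can be sketched numerically. Assuming a Ghirlanda-type power law E_gamma = A (E_peak/E0)^g with purely illustrative calibration constants (A, g and E0 below are hypothetical, not fitted values), the collimation-corrected energy follows from the observed E_peak, and equating it with f_beam * 4*pi*d_L^2 * S/(1+z) yields the luminosity distance.

```python
import math

# Illustrative use of an E_peak--E_gamma power-law correlation to infer a
# GRB luminosity distance; all calibration numbers here are hypothetical.
def lum_distance_cm(E_peak_keV, fluence, z, f_beam,
                    A=1.0e51, g=1.5, E0=300.0):
    # Correlation: collimation-corrected energy E_gamma = A*(E_peak/E0)^g [erg]
    E_gamma = A * (E_peak_keV / E0) ** g
    # Fluence S [erg/cm^2] relates to E_gamma through
    # E_gamma = f_beam * 4*pi*d_L^2 * S / (1+z)  =>  solve for d_L [cm]
    return math.sqrt(E_gamma * (1.0 + z) / (4.0 * math.pi * f_beam * fluence))

# A toy burst: E_peak = 600 keV, fluence 1e-5 erg/cm^2, z = 1, beaming factor 0.01.
dl = lum_distance_cm(E_peak_keV=600.0, fluence=1.0e-5, z=1.0, f_beam=0.01)
mu = 5.0 * math.log10(dl / 3.086e18) - 5.0   # distance modulus, d_L in cm (1 pc = 3.086e18 cm)
print(dl, mu)
```

A set of such (z, mu) pairs is what then constrains Omega_M and Omega_Lambda, exactly as SNIa distance moduli do.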
AIC, BIC, Bayesian evidence against the interacting dark energy model
Recent astronomical observations have indicated that the Universe is in the
phase of accelerated expansion. While there are many cosmological models which
try to explain this phenomenon, we focus on the interacting ΛCDM model
where the interaction between the dark energy and dark matter sectors takes
place. This model is compared to its simpler alternative, the ΛCDM
model. To choose between these models the likelihood ratio test was applied as
well as the model comparison methods (employing Occam's principle): the Akaike
information criterion (AIC), the Bayesian information criterion (BIC) and the
Bayesian evidence. Using the current astronomical data: SNIa (Union2.1),
BAO, the Alcock-Paczynski test and the CMB, we evaluated both models. The
analyses based on the AIC indicated that there is less support for the
interacting ΛCDM model when compared to the ΛCDM model, while
those based on the BIC indicated strong evidence against it
in favor of the ΛCDM model. Given the weak or almost nonexistent support for
the interacting ΛCDM model, and bearing in mind Occam's razor, we are
inclined to reject this model.
Comment: LaTeX svjour3, 12 pages, 3 figures.
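The two criteria differ only in how they penalise extra parameters, which a small numerical sketch makes concrete (the maximised log-likelihoods and sample size below are hypothetical, not the paper's fitted values).

```python
import math

# AIC = 2k - 2 ln L_max and BIC = k ln N - 2 ln L_max, for a model with
# k free parameters fitted to N data points.
def aic(lnL_max, k):
    return 2 * k - 2 * lnL_max

def bic(lnL_max, k, N):
    return k * math.log(N) - 2 * lnL_max

# Hypothetical fits: an interacting model with one extra parameter improves
# ln L_max only marginally over the simpler two-parameter model.
N = 580                      # illustrative sample size (e.g. an SNIa compilation)
lnL_simple, k_simple = -272.5, 2
lnL_inter,  k_inter  = -272.1, 3

dAIC = aic(lnL_inter, k_inter) - aic(lnL_simple, k_simple)
dBIC = bic(lnL_inter, k_inter, N) - bic(lnL_simple, k_simple, N)
print(dAIC, dBIC)   # positive values disfavour the extra-parameter model
```

Because ln N > 2 for any realistic data set, the BIC penalty per parameter is harsher than the AIC's, which is why the BIC verdict against the richer model is typically stronger, as in the abstract above.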
Bayesian experimental design and model selection forecasting
Introduction
Common applications of Bayesian methods in cosmology involve the computation of model probabilities and of posterior probability distributions for the parameters of those models. However, Bayesian statistics is not limited to applications based on existing data: it can equally well handle questions about the expected future performance of planned experiments, based on our current knowledge. This is an important topic, especially with a number of future cosmology experiments and surveys currently being planned. To give a taste, they include: large-scale optical surveys such as Pan-STARRS (the Panoramic Survey Telescope and Rapid Response System), DES (the Dark Energy Survey) and LSST (the Large Synoptic Survey Telescope); massive spectroscopic surveys such as WFMOS (the Wide-Field Fibre-fed Multi-Object Spectrograph); satellite missions such as JDEM (the Joint Dark Energy Mission) and EUCLID; continental-sized radio telescopes such as SKA (the Square Kilometre Array); and future cosmic microwave background experiments such as B-Pol, searching for primordial gravitational waves.
As the amount of available resources is limited, the question of how to optimize them in order to obtain the greatest possible science return, given present knowledge, will be of increasing importance. In this chapter we address the issue of experimental forecasting and optimization, starting with the general aspects and a simple example. We then discuss the so-called Fisher Matrix approach, which allows one to compute forecasts rapidly, before looking at a real-world application. Finally, we cover forecasts of model comparison outcomes and model selection Figures of Merit.
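One widely used forecast summary related to the Figures of Merit mentioned above is the DETF-style parameter-constraint Figure of Merit for a dark-energy pair (w0, wa). A minimal sketch, with a hypothetical forecast Fisher matrix (not from any real survey), takes FoM = sqrt(det F), which is proportional to the inverse area of the 1-sigma error ellipse.

```python
import math

# DETF-style Figure of Merit from a (w0, wa) Fisher matrix: up to a constant,
# FoM = sqrt(det F) is the inverse area of the 1-sigma error ellipse.
# The Fisher matrix entries below are hypothetical forecast numbers.
F = [[120.0, -38.0],
     [-38.0,  14.0]]

detF = F[0][0] * F[1][1] - F[0][1] * F[1][0]
fom = math.sqrt(detF)

# Marginalised 1-sigma errors come from the diagonal of the inverse matrix.
sigma_w0 = math.sqrt(F[1][1] / detF)
sigma_wa = math.sqrt(F[0][0] / detF)
print(fom, sigma_w0, sigma_wa)
```

A larger FoM means a smaller joint error ellipse, so competing survey designs can be ranked by a single number, which is precisely the role optimization studies assign to such figures.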