Cosmic Shear Systematics: Software-Hardware Balance
Cosmic shear measurements rely on our ability to measure and correct the
Point Spread Function (PSF) of the observations. This PSF is measured using
stars in the field, which give a noisy measure at random points in the field.
Using Wiener filtering, we show how errors in this PSF correction process
propagate into shear power spectrum errors. This allows us to test future
space-based missions, such as Euclid or JDEM, and to set clear
engineering specifications on PSF variability. For ground-based surveys, where
the variability of the PSF is dominated by the environment, we briefly discuss
how our approach can also be used to study the potential of mitigation
techniques such as correlating galaxy shapes in different exposures. To
illustrate our approach we show that for a Euclid-like survey to be statistics
limited, an initial pre-correction PSF ellipticity power spectrum with a
power-law slope of -3 must have an amplitude at l = 1000 of less than 2 x
10^{-13}. This is 1500 times smaller than the typical lensing signal at this
scale. We also find that the power spectrum of PSF size (\delta(R^2)) at this scale
must be below 2 x 10^{-12}. Public code available as part of iCosmo at
http://www.icosmo.org
Comment: 5 pages, 3 figures. Submitted to MNRAS
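The engineering requirement quoted above can be sketched as a simple check against a power-law PSF power spectrum. A minimal sketch, assuming illustrative function names; the lensing-signal level below is back-computed from the quoted factor of 1500 and is not taken from the paper:

```python
import numpy as np

# Requirement from the abstract: a pre-correction PSF ellipticity power
# spectrum with slope -3 must have amplitude below 2e-13 at l = 1000.
SLOPE = -3.0
L_PIVOT = 1000
A_MAX = 2e-13             # amplitude ceiling at l = 1000
C_LENS_AT_PIVOT = 3e-10   # illustrative lensing signal, ~1500x the ceiling

def psf_power(ell, amplitude, slope=SLOPE, l_pivot=L_PIVOT):
    """Power-law model of the PSF ellipticity power spectrum."""
    return amplitude * (ell / l_pivot) ** slope

def meets_requirement(amplitude):
    """True if the PSF power at l = 1000 stays below the quoted ceiling."""
    return psf_power(L_PIVOT, amplitude) < A_MAX

print(meets_requirement(1e-13))   # amplitude below the ceiling
print(C_LENS_AT_PIVOT / A_MAX)    # signal-to-requirement ratio of 1500
```

The check is deliberately trivial: the point of the paper is that the requirement can be stated as a single amplitude at a single multipole.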
Optimal PSF modeling for weak lensing: complexity and sparsity
We investigate the impact of point spread function (PSF) fitting errors on
cosmic shear measurements using the concepts of complexity and sparsity.
Complexity, introduced in a previous paper, characterizes the number of degrees
of freedom of the PSF. For instance, fitting an underlying PSF with a model
with low complexity will lead to small statistical errors on the model
parameters; however, these parameters could suffer from large biases.
Alternatively, fitting with a large number of parameters will tend to reduce
biases at the expense of statistical errors. We perform an optimisation of
scatters and biases by studying the mean squared error of a PSF model. We also
characterize a model's sparsity, which describes how efficiently the model is
able to represent the underlying PSF using a limited number of free parameters.
We present the general case and illustrate it for a realistic example of PSF
fitted with shapelet basis sets. We derive the relation between complexity and
sparsity of the PSF model, signal-to-noise ratio of stars and systematic errors
on cosmological parameters. With the constraint of maintaining the systematics
below the statistical uncertainties, this leads to a relation between the
required number of stars to calibrate the PSF and the sparsity. We discuss the
impact of our results for current and future cosmic shear surveys. In the
typical case where the biases can be represented as a power law of the
complexity, we show that current weak lensing surveys can calibrate the PSF
with few stars, while future surveys will require hard constraints on the
sparsity in order to calibrate the PSF with 50 stars.
Comment: accepted by A&A, 9 pages, 6 figures
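The scatter/bias optimisation described above is a bias/variance trade-off in the mean squared error of the PSF model. A minimal sketch under the "typical case" where the bias is a power law of the complexity n; all constants are illustrative assumptions, not values from the paper:

```python
import numpy as np

def mse(n, n_stars, b0=1e-2, alpha=2.0, sigma2=1e-3):
    """Toy mean squared error of a PSF model with n free parameters.

    Bias shrinks as a power law of complexity; statistical scatter grows
    with the number of fitted parameters and averages down over stars.
    """
    bias = b0 * n ** (-alpha)      # bias falls with model complexity
    var = sigma2 * n / n_stars     # scatter grows with free parameters
    return bias ** 2 + var

# Optimise the complexity for a 50-star calibration set:
n_grid = np.arange(1, 50)
best_n = n_grid[np.argmin(mse(n_grid, n_stars=50))]
print(best_n)
```

The optimum shifts to higher complexity as more stars become available, which is the qualitative content of the star-count/sparsity relation derived in the paper.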
Information Gains from Cosmological Probes
In light of the growing number of cosmological observations, it is important
to develop versatile tools to quantify the constraining power and consistency
of cosmological probes. Originally motivated from information theory, we use
the relative entropy to compute the information gained by Bayesian updates in
units of bits. This measure quantifies both the improvement in precision and
the 'surprise', i.e. the tension arising from shifts in central values. Our
starting point is a WMAP9 prior which we update with observations of the
distance ladder, supernovae (SNe), baryon acoustic oscillations (BAO), and weak
lensing as well as the 2015 Planck release. We consider the parameters of the
flat ΛCDM concordance model and some of its extensions, which include
curvature and the Dark Energy equation of state parameter w. We find that,
relative to WMAP9 and within these model spaces, the probes that have provided
the greatest gains are Planck (10 bits), followed by BAO surveys (5.1 bits) and
SNe experiments (3.1 bits). The other cosmological probes, including weak
lensing (1.7 bits) and H_0 measures (1.7 bits), have contributed
information but at a lower level. Furthermore, we do not find any significant
surprise when updating the constraints of WMAP9 with any of the other
experiments, meaning that they are consistent with WMAP9. However, when we
choose Planck15 as the prior, we find that, accounting for the full
multi-dimensionality of the parameter space, the weak lensing measurements of
CFHTLenS produce a large surprise of 4.4 bits which is statistically
significant at the 8σ level. We discuss how the relative entropy
provides a versatile and robust framework to compare cosmological probes in the
context of current and future surveys.
Comment: 26 pages, 5 figures
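The relative entropy in bits has a closed form for Gaussian distributions. A sketch for the simplest 1-D case; the function name and the example numbers are illustrative, not taken from the paper:

```python
import numpy as np

def relative_entropy_bits(mu0, sig0, mu1, sig1):
    """KL divergence D(posterior || prior) in bits for 1-D Gaussians.

    (mu0, sig0) describe the prior, (mu1, sig1) the posterior.
    """
    nats = (np.log(sig0 / sig1)
            + (sig1**2 + (mu1 - mu0)**2) / (2 * sig0**2)
            - 0.5)
    return nats / np.log(2)

# Halving the error bar with no shift in the central value:
print(relative_entropy_bits(0.0, 1.0, 0.0, 0.5))
# A shift in the central value (the "surprise") adds further bits:
print(relative_entropy_bits(0.0, 1.0, 2.0, 0.5))
```

This illustrates the two contributions the abstract mentions: improved precision (the log-variance terms) and tension from shifted central values (the squared-mean term).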
PSF calibration requirements for dark energy from cosmic shear
The control of systematic effects when measuring galaxy shapes is one of the
main challenges for cosmic shear analyses. In this context, we study the
fundamental limitations on shear accuracy due to the measurement of the Point
Spread Function (PSF) from the finite number of stars. In order to do that, we
translate the accuracy required for cosmological parameter estimation to the
minimum number of stars over which the PSF must be calibrated. We first derive
our results analytically in the case of infinitely small pixels (i.e.
infinitely high resolution). Then image simulations are used to validate these
results and investigate the effect of finite pixel size in the case of an
elliptical gaussian PSF. Our results are expressed in terms of the minimum
number of stars required to calibrate the PSF in order to ensure that
systematic errors are smaller than statistical errors when estimating the
cosmological parameters. On scales smaller than the area containing this
minimum number of stars, there is not enough information to model the PSF. In
the case of an elliptical gaussian PSF and in the absence of dithering, 2
pixels per PSF Full Width at Half Maximum (FWHM) implies a 20% increase of the
minimum number of stars compared to the ideal case of infinitely small pixels;
0.9 pixels per PSF FWHM implies a factor 100 increase. In the case of a good
resolution and a typical Signal-to-Noise Ratio distribution of stars, we find
that current surveys need the PSF to be calibrated over a few stars, which may
explain residual systematics on scales smaller than a few arcmins. Future
all-sky cosmic shear surveys require the PSF to be calibrated over a region
containing about 50 stars.
Comment: 13 pages, 4 figures, accepted by A&A
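The star-counting argument above rests on the calibration error of the mean PSF averaging down as 1/sqrt(N_stars) until it falls below a statistical target. A minimal sketch with illustrative noise numbers (not the paper's values):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma_star = 0.05   # per-star ellipticity measurement noise (assumption)
target = 0.01       # required accuracy on the mean PSF ellipticity (assumption)

# Error on the mean scales as sigma_star / sqrt(N), so the minimum
# number of calibration stars is (sigma_star / target)^2:
n_min = int(np.ceil((sigma_star / target) ** 2))
print(n_min)   # 25 stars under these illustrative numbers

# Quick Monte Carlo check that n_min stars reach the target on average:
errors = [abs(rng.normal(0, sigma_star, n_min).mean()) for _ in range(2000)]
print(np.mean(errors) < target)
```

On patches containing fewer than n_min stars there is, in this toy picture, simply not enough information to reach the target, mirroring the paper's statement about small scales.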
iCosmo: an Interactive Cosmology Package
Aims: The interactive software package iCosmo, designed to perform
cosmological calculations, is described. Methods: iCosmo is a software package
to perform interactive cosmological calculations for the low redshift universe.
Computing distance measures, the matter power spectrum, and the growth factor
is supported for any values of the cosmological parameters. It also computes
derived observed quantities for several cosmological probes such as cosmic
shear, baryon acoustic oscillations and type Ia supernovae. The associated
errors for these observables can be derived for customised surveys, or for
pre-set values corresponding to current or planned instruments. The code also
allows for the calculation of cosmological forecasts with Fisher matrices which
can be manipulated to combine different surveys and cosmological probes. The
code is written in the IDL language and thus benefits from the convenient
interactive features and scientific library available in this language. iCosmo
can also be used as an engine to perform cosmological calculations in batch
mode, and forms a convenient adaptive platform for the development of further
cosmological modules. With its extensive documentation, it may also serve as a
useful resource for teaching and for newcomers in the field of cosmology.
Results: The iCosmo package is described with various examples and command
sequences. The code is freely available with documentation at
http://www.icosmo.org, along with an interactive web interface and is part of
the Initiative for Cosmology, a common archive for cosmological resources.
Comment: 6 pages including 2 tables and 4 figures. Accepted and published in
Astronomy and Astrophysics. Public code and further resources available at
http://www.icosmo.org
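The Fisher-matrix combination that iCosmo supports can be sketched in a few lines. iCosmo itself is written in IDL; the sketch below is in Python, and the 2x2 matrices are illustrative stand-ins for two probes, not iCosmo output:

```python
import numpy as np

# Toy Fisher matrices for two independent probes over two parameters,
# e.g. (Omega_m, sigma_8); the numbers are purely illustrative.
F_shear = np.array([[4.0e4, -1.5e4],
                    [-1.5e4, 1.0e4]])
F_sne   = np.array([[2.0e4,  0.5e4],
                    [ 0.5e4, 0.5e4]])

# Independent probes combine additively; marginalised 1-sigma errors
# are the square roots of the diagonal of the inverse Fisher matrix.
F_comb = F_shear + F_sne
errors = np.sqrt(np.diag(np.linalg.inv(F_comb)))
print(errors)
```

Combining probes always tightens (or at worst preserves) the marginalised errors, which is why Fisher manipulation of this kind is useful for survey forecasts.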
Computer Vision Pedestrian Awareness System
Pedestrians face dangers during their daily commutes through urban and rural
environments. With fast-moving cars, bicycles, and even other pedestrians to
contend with, vigilance is imperative to maintaining pedestrian safety. The
Computer Vision Pedestrian Awareness System is a wearable device that seeks to
increase pedestrians' rear awareness of potential dangers by alerting users to
dangerous events before they occur.
Cosmological models discrimination with Weak Lensing
Weak gravitational lensing provides a unique method to map directly the dark
matter in the Universe. The majority of lensing analyses use the two-point
statistics of the cosmic shear field to constrain the cosmological model,
yielding degeneracies such as that between sigma_8 and Omega_M (respectively,
the r.m.s. of the mass fluctuations at a scale of 8 Mpc/h and the matter
density parameter, both at z = 0). However, the two-point statistics only measure
the Gaussian properties of the field and the weak lensing field is
non-Gaussian. It has been shown that the estimation of non-Gaussian statistics
on weak lensing data can improve the constraints on cosmological parameters. In
this paper, we systematically compare a wide range of non-Gaussian estimators
in order to determine which one provides tighter constraints on the
cosmological parameters. These statistical methods include skewness, kurtosis
and the Higher Criticism test in several sparse representations such as wavelet
and curvelet, as well as the bispectrum, peak counting, and a newly introduced
statistic called Wavelet Peak Counting (WPC). Comparisons based on sparse
representations show that the wavelet transform is the most sensitive to
non-Gaussian cosmological structures. It appears also that the best statistic
for non-Gaussian characterization in weak lensing mass maps is the WPC.
Finally, we show that the sigma_8-Omega_M degeneracy could be even better
broken if the WPC estimation is performed on weak lensing mass maps filtered by
the wavelet method, MRLens.
Comment: Submitted to A&A
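Peak counting, one of the statistics compared above, amounts to counting local maxima above a threshold in a convergence map. A minimal sketch on a plain Gaussian random map (a stand-in, not a lensing simulation; WPC would apply the same count to wavelet-filtered maps):

```python
import numpy as np

rng = np.random.default_rng(1)
kappa = rng.normal(size=(128, 128))  # mock convergence map in noise units

def count_peaks(m, threshold):
    """Count pixels above threshold that exceed all 8 neighbours."""
    c = m[1:-1, 1:-1]                       # interior pixels only
    is_max = np.ones_like(c, dtype=bool)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if dy == dx == 0:
                continue
            is_max &= c > m[1 + dy:127 + dy, 1 + dx:127 + dx]
    return int(np.count_nonzero(is_max & (c > threshold)))

print(count_peaks(kappa, 2.0))
```

On real data the threshold is scanned and the resulting peak counts compared to simulations of different cosmologies; non-Gaussian structure shows up as an excess of high peaks over the Gaussian expectation.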
Cosmological systematics beyond nuisance parameters: form-filling functions
In the absence of any compelling physical model, cosmological systematics are
often misrepresented as statistical effects, and the approach of marginalizing
over extra nuisance systematic parameters is used to gauge the effect of the
systematic. In this article, we argue that such an approach is risky at best,
since the key choice of function can have a large effect on the resultant
cosmological errors. As an alternative we present a functional form-filling
technique in which an unknown, residual systematic is treated as such. Since
the underlying function is unknown, we evaluate the effect of every functional
form allowed by the information available (either a hard boundary or some
data). Using a simple toy model, we introduce the formalism of functional form
filling. We show that parameter errors can be dramatically affected by the
choice of function in the case of marginalizing over a systematic, but that in
contrast the functional form-filling approach is independent of the choice of
basis set. We then apply the technique to cosmic shear shape measurement
systematics and show that a shear calibration bias of |m(z)| ≲ 10^{-3}
(1+z)^{0.7} is required for a future all-sky photometric survey to yield
unbiased cosmological parameter constraints to per cent accuracy. A module
associated with the work in this paper is available through the open-source
iCosmo code available at http://www.icosmo.org
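The form-filling idea can be sketched as scanning many residual functions m(z) allowed by an envelope and recording the worst-case effect. The envelope below uses the tolerance quoted in the abstract; the random polynomial forms and the toy "observable" (a mean calibration offset) are illustrative assumptions, not the paper's likelihood machinery:

```python
import numpy as np

rng = np.random.default_rng(2)
z = np.linspace(0.0, 2.0, 50)
m_max = 1e-3   # amplitude of the quoted tolerance |m(z)| <= m_max (1+z)^0.7

def random_allowed_form():
    """Draw a random smooth function lying within the allowed envelope."""
    coeffs = rng.uniform(-1, 1, size=4)
    f = sum(c * z**k for k, c in enumerate(coeffs))  # random cubic in z
    f /= np.max(np.abs(f))                           # normalise to envelope
    return f * m_max * (1 + z) ** 0.7

# Worst-case bias on the toy observable over many allowed forms, rather
# than over one parametrised family as in nuisance marginalisation:
worst = max(abs(random_allowed_form().mean()) for _ in range(500))
print(worst)
```

Because every allowed form is evaluated, the result does not depend on a choice of basis set, which is the key contrast with nuisance-parameter marginalisation drawn in the abstract.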