10 research outputs found
Correlations in the three-dimensional Lyman-alpha forest contaminated by high column density absorbers
Correlations measured in three dimensions in the Lyman-alpha forest are
contaminated by the presence of the damping wings of high column density (HCD)
absorbing systems of neutral hydrogen (HI), which
extend significantly beyond the redshift-space location of the absorber. We
measure this effect as a function of the column density of the HCD absorbers
and redshift by measuring 3D flux power spectra in cosmological hydrodynamical
simulations from the Illustris project. Survey pipelines exclude regions
containing the largest damping wings. We find that, even after this procedure,
there is a scale-dependent correction to the 3D Lyman-alpha forest flux power
spectrum from residual contamination. We model this residual using a simple
physical model of the HCD absorbers as linearly biased tracers of the matter
density distribution, convolved with their Voigt profiles and integrated over
the column density distribution function. We recommend the use of this model
over existing models used in data analysis, which approximate the damping wings
as top-hats and so miss shape information in the extended wings. The simple
'linear Voigt model' is statistically consistent with our simulation results
for a mock residual contamination down to small scales. It does not account
for the effect of the highest column density absorbers (e.g., small damped
Lyman-alpha absorbers) on the smallest scales. However, these systems are in any
case preferentially removed from survey data. Our model is appropriate for an
accurate analysis of the baryon acoustic oscillations feature. It is
additionally essential for reconstructing the full shape of the 3D flux power
spectrum.
Comment: 13 pages, 11 figures. Minor changes to match version published in MNRAS.
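The damping wings central to this analysis come from the Voigt profile, the convolution of a Gaussian (Doppler) core with a Lorentzian. A minimal sketch of evaluating it via the Faddeeva function (the parameter values are illustrative, not taken from the paper):

```python
import numpy as np
from scipy.special import wofz

def voigt(x, sigma, gamma):
    """Voigt profile: a Gaussian of standard deviation sigma convolved
    with a Lorentzian of half-width gamma, computed from the real part
    of the Faddeeva function. Normalised to unit area."""
    z = (x + 1j * gamma) / (sigma * np.sqrt(2.0))
    return wofz(z).real / (sigma * np.sqrt(2.0 * np.pi))

# Illustrative grid: the Lorentzian wings fall off like 1/x^2, far more
# slowly than a Gaussian, which is why HCD damping wings extend well
# beyond the redshift-space location of the absorber.
x = np.linspace(-200.0, 200.0, 400001)
phi = voigt(x, sigma=1.0, gamma=0.5)
area = float(np.sum(phi) * (x[1] - x[0]))
wing = float(np.interp(10.0, x, phi))
```

At ten standard deviations from line centre the profile is still of order the Lorentzian tail, many orders of magnitude above the corresponding Gaussian value.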
An Emulator for the Lyman-alpha Forest
We present methods for interpolating between the 1-D flux power spectrum of
the Lyman-alpha forest, as output by cosmological hydrodynamic simulations.
Interpolation is necessary for cosmological parameter estimation due to the
limited number of simulations possible. We construct an emulator for the
Lyman-alpha forest flux power spectrum from small simulations using
Latin hypercube sampling and Gaussian process interpolation. We show that this
emulator has a typical accuracy of 1.5% and a worst-case accuracy of 4%, which
compares well to the current statistical error of 3-5% from BOSS
DR9. We compare to the previous state of the art, quadratic polynomial
interpolation. The Latin hypercube samples the entire volume of parameter
space, while quadratic polynomial emulation samples only lower-dimensional
subspaces. The Gaussian process provides an estimate of the emulation error and
we show using test simulations that this estimate is reasonable. We construct a
likelihood function and use it to show that the posterior constraints generated
using the emulator are unbiased. We show that our Gaussian process emulator has
lower emulation error than quadratic polynomial interpolation and thus produces
tighter posterior confidence intervals, which will be essential for future
Lyman-alpha surveys such as DESI.
Comment: 28 pages, 10 figures, accepted to JCAP with minor changes.
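The Latin hypercube plus Gaussian process recipe can be sketched end to end on a toy one-dimensional problem; the "simulation" here is a cheap stand-in function, and the kernel settings are illustrative, not those of the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def latin_hypercube(n_samples, n_dim, rng):
    """Latin hypercube in [0, 1]^d: exactly one sample per stratum
    along every axis, with an independent permutation per dimension."""
    u = (rng.random((n_samples, n_dim)) + np.arange(n_samples)[:, None]) / n_samples
    for d in range(n_dim):
        u[:, d] = rng.permutation(u[:, d])
    return u

def rbf_kernel(a, b, length=0.2):
    d2 = np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=-1)
    return np.exp(-0.5 * d2 / length**2)

def gp_predict(x_train, y_train, x_test, noise=1e-8):
    """Gaussian-process regression (zero prior mean, RBF kernel):
    returns the predictive mean and variance at the test points."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf_kernel(x_test, x_train)
    mean = Ks @ np.linalg.solve(K, y_train)
    cov = rbf_kernel(x_test, x_test) - Ks @ np.linalg.solve(K, Ks.T)
    return mean, np.diag(cov)

# Cheap stand-in for a simulated flux power spectrum's dependence on
# one cosmological parameter (purely illustrative).
def toy_simulation(theta):
    return np.sin(3.0 * theta[:, 0]) + 0.5 * theta[:, 0]

x_train = latin_hypercube(10, 1, rng)
y_train = toy_simulation(x_train)
x_test = np.linspace(0.0, 1.0, 50)[:, None]
mean, var = gp_predict(x_train, y_train, x_test)
max_err = float(np.max(np.abs(mean - toy_simulation(x_test))))
```

Because the hypercube places one sample in every axis-aligned stratum, the GP is conditioned on points spread across the full parameter range, and its predictive variance doubles as the emulation-error estimate discussed above.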
Simulating the effect of high column density absorbers on the one-dimensional Lyman-alpha forest flux power spectrum
We measure the effect of high column density absorbing systems of neutral
hydrogen (HI) on the one-dimensional (1D) Lyman-alpha forest flux power
spectrum using cosmological hydrodynamical simulations from the Illustris
project. High column density absorbers (defined by a threshold on HI column
density) cause broadened absorption lines
with characteristic damping wings. These damping wings bias the 1D Lyman-alpha
forest flux power spectrum by causing absorption in quasar spectra away from
the location of the absorber itself. We investigate the effect of high column
density absorbers on the Lyman-alpha forest using hydrodynamical simulations
for the first time. We provide templates as a function of column density and
redshift, allowing the flexibility to accurately model residual contamination,
e.g., the contamination remaining after an analysis selectively clips out the largest damping wings. This
flexibility will improve cosmological parameter estimation, e.g., allowing more
accurate measurement of the shape of the power spectrum, with implications for
cosmological models containing massive neutrinos or a running of the spectral
index. We provide fitting functions to reproduce these results so that they can
be incorporated straightforwardly into a data analysis pipeline.
Comment: 11 pages, 6 figures. Minor changes to match version published in MNRAS.
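For concreteness, the 1D flux power spectrum being contaminated here is estimated from the flux contrast delta_F = F/&lt;F&gt; - 1 along each line of sight. A minimal sketch on a random mock skewer (the pixel count, velocity spacing, and normalisation convention are illustrative, not the paper's pipeline):

```python
import numpy as np

rng = np.random.default_rng(1)

# Mock transmitted-flux skewer (hypothetical numbers, not simulation
# data): 1024 pixels at 10 km/s spacing with mean flux around 0.8.
n_pix, dv = 1024, 10.0
flux = np.clip(0.8 + 0.05 * rng.standard_normal(n_pix), 0.0, 1.0)

# Flux contrast delta_F = F / <F> - 1.
delta = flux / flux.mean() - 1.0

# 1D flux power spectrum estimate: P(k) = L |FFT(delta)/N|^2,
# with k in s/km when working in velocity units.
length = n_pix * dv
dft = np.fft.rfft(delta) / n_pix
p1d = length * np.abs(dft) ** 2
k = 2.0 * np.pi * np.fft.rfftfreq(n_pix, d=dv)
```

Damping wings add correlated absorption across many pixels of such a skewer, which is how HCD absorbers bias the estimated P(k) rather than affecting only the pixels at the absorber location.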
Bayesian emulator optimisation for cosmology: application to the Lyman-alpha forest
The Lyman-alpha forest provides strong constraints on both cosmological
parameters and intergalactic medium astrophysics, which are forecast to improve
further with the next generation of surveys including eBOSS and DESI. As is
generic in cosmological inference, extracting this information requires a
likelihood to be computed throughout a high-dimensional parameter space.
Evaluating the likelihood requires a robust and accurate mapping between the
parameters and observables, in this case the 1D flux power spectrum.
Cosmological simulations enable such a mapping, but due to computational time
constraints can only be evaluated at a handful of sample points; "emulators"
are designed to interpolate between these. The problem then reduces to placing
the sample points such that an accurate mapping is obtained while minimising
the number of expensive simulations required. To address this, we introduce an
emulation procedure that employs Bayesian optimisation of the training set for
a Gaussian process interpolation scheme. Starting with a Latin hypercube
sampling (other schemes with good space-filling properties can be used), we
iteratively augment the training set with extra simulations at new parameter
positions which balance the need to reduce interpolation error while focussing
on regions of high likelihood. We show that smaller emulator error from the
Bayesian optimisation propagates to smaller widths on the posterior
distribution. Even with fewer simulations than a Latin hypercube, Bayesian
optimisation shrinks the 95% credible volume by 90% and, e.g., the 1 sigma
error on the amplitude of small-scale primordial fluctuations by 38%. This is
the first demonstration of Bayesian optimisation applied to large-scale
structure emulation, and we anticipate the technique will generalise to many
other probes such as galaxy clustering, weak lensing and 21 cm.
Comment: 23 pages, 4 figures. Minor changes to match version published in JCAP.
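The iterative loop can be sketched with a toy one-parameter likelihood and a small hand-rolled Gaussian process. The GP-UCB-style acquisition below is a generic stand-in for the paper's actual acquisition function, and every number is illustrative:

```python
import numpy as np

def rbf(a, b, length=0.25):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length**2)

def gp(x_train, y_train, x_test, noise=1e-8):
    """Gaussian-process predictive mean and variance (zero prior mean)."""
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf(x_test, x_train)
    mean = Ks @ np.linalg.solve(K, y_train)
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)
    return mean, np.maximum(var, 0.0)

# Toy one-parameter log-likelihood standing in for an expensive
# simulation-based likelihood (purely illustrative).
def log_like(x):
    return -0.5 * ((x - 0.6) / 0.1) ** 2

x_train = np.array([0.1, 0.5, 0.9])   # small initial space-filling design
x_cand = np.linspace(0.0, 1.0, 201)   # candidate new simulation points
for _ in range(5):
    mean, var = gp(x_train, log_like(x_train), x_cand)
    # GP-UCB-style acquisition: reward high predicted likelihood
    # (exploitation) and high emulator uncertainty (exploration).
    acq = mean + 2.0 * np.sqrt(var)
    x_train = np.append(x_train, x_cand[np.argmax(acq)])

best = x_train[np.argmax(log_like(x_train))]
```

Each pass adds one "simulation" where the acquisition is largest, so later evaluations concentrate near the likelihood peak instead of being spread uniformly as in a pure space-filling design.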
Cicada-MET: an efficient ecological monitoring protocol of cicada populations
Cicadas are a fascinating group of insects that play an essential role in terrestrial ecosystems. Their long-lasting association with plant roots encourages their use as indicators of vegetation and soil integrity. Cicada-MET is a novel, standardized method for monitoring cicada populations by counting cicada exuviae (i.e., the skins of emerged nymphs), providing an effective and efficient means to study their distribution, abundance, and ecology. The method involves annual exuviae counts along transects and fixed plots sampled throughout the emergence season. We validated Cicada-MET using a database of 466 counts from 64 transects over 10 years and the sampling of 60 plots for one season. Methodological aspects tested included sampling speed, exuviae detectability in successive counts, exuviae loss due to weather, and cicada species detection performance using exuviae counts compared to auditory methods. Transects captured approximately 10% of the total number of emerged nymphs across one season, demonstrating the protocol's reliability in estimating emerging cicada population numbers. However, caution is needed when inferring densities for larger areas away from the paths where transects are located. The standardized nature of Cicada-MET reduces spatial and temporal biases, allowing for interspecific comparisons and monitoring of interannual variations in abundance and emergence timing. This method is well-suited for studying the impact of natural and anthropogenic disturbances. The high-resolution data obtained can be easily combined with environmental variables, enhancing the value of cicadas as bioindicators. In summary, Cicada-MET offers a versatile and efficient tool for monitoring cicada populations, with applications in ecological indication, conservation, and management.
The adaptability of Cicada-MET to various research questions, spatial scales, and long-term approaches, along with its quantitative accuracy and ease of use, makes it a valuable resource for researchers and practitioners working with cicadas and their associated ecosystems.
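The ~10% transect capture fraction reported above implies a simple scaling from a transect count to an estimate of total emergence, sketched below with a hypothetical count (the 10% figure is from the abstract; everything else is made up for illustration):

```python
# Minimal sketch of the scaling implied by the reported ~10% capture
# fraction: a transect exuviae count is scaled up to an estimate of the
# total number of emerged nymphs. The count itself is hypothetical.
capture_fraction = 0.10   # share of emerged nymphs recovered on transects
transect_count = 142      # exuviae counted over one season (made up)
estimated_emergence = transect_count / capture_fraction
```

As the abstract cautions, such an estimate applies to the area the transect samples; extrapolating the implied density to larger areas away from the transect paths requires care.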
Astro2020 Science White Paper: Primordial Non-Gaussianity
5 pages + references; submitted to the Astro2020 call for science white papers. This version: fixed author list.
Our current understanding of the Universe is established through the pristine measurements of structure in the cosmic microwave background (CMB) and the distribution and shapes of galaxies tracing the large scale structure (LSS) of the Universe. One key ingredient that underlies cosmological observables is that the field that sources the observed structure is assumed to be initially Gaussian with high precision. Nevertheless, a minimal deviation from Gaussianity is perhaps the most robust theoretical prediction of models that explain the observed Universe; it is necessarily present even in the simplest scenarios. In addition, most inflationary models produce far higher levels of non-Gaussianity. Since non-Gaussianity directly probes the dynamics in the early Universe, a detection would present a monumental discovery in cosmology, providing clues about physics at energy scales as high as the GUT scale.
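The "minimal deviation from Gaussianity" is commonly parameterised by the local ansatz (a standard convention in the field, not something specific to this white paper), in which the primordial potential is a quadratic correction to a Gaussian field:

```latex
\Phi(\mathbf{x}) = \phi(\mathbf{x})
  + f_{\mathrm{NL}} \left[ \phi^{2}(\mathbf{x}) - \langle \phi^{2} \rangle \right]
```

Here phi is a Gaussian random field and f_NL sets the amplitude of the non-Gaussian correction, so constraining f_NL with CMB and LSS data directly tests the dynamics of the early Universe.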
The DESI experiment part I: science, targeting, and survey design
DESI (Dark Energy Spectroscopic Instrument) is a Stage IV ground-based dark energy experiment that will study baryon acoustic oscillations (BAO) and the growth of structure through redshift-space distortions with a wide-area galaxy and quasar redshift survey. To trace the underlying dark matter distribution, spectroscopic targets will be selected in four classes from imaging data: luminous red galaxies; bright [O II] emission line galaxies, which probe the Universe out to even higher redshift; quasars, targeted both as direct tracers of the underlying dark matter distribution and, at higher redshifts, for the Lyman-alpha forest absorption features in their spectra, which will be used to trace the distribution of neutral hydrogen; and, when moonlight prevents efficient observations of the faint targets of the baseline survey, a magnitude-limited Bright Galaxy Survey comprising approximately 10 million galaxies. In total, more than 30 million galaxy and quasar redshifts will be obtained to measure the BAO feature and determine the matter power spectrum, including redshift-space distortions.
The DESI Experiment Part II: Instrument Design
DESI (Dark Energy Spectroscopic Instrument) is a Stage IV ground-based dark energy experiment that will study baryon acoustic oscillations and the growth of structure through redshift-space distortions with a wide-area galaxy and quasar redshift survey. The DESI instrument is a robotically-actuated, fiber-fed spectrograph capable of taking up to 5,000 simultaneous spectra over a wavelength range from 360 nm to 980 nm. The fibers feed ten three-arm spectrographs with resolution between 2000 and 5500, depending on wavelength. The DESI instrument will be used to conduct a five-year survey designed to cover 14,000 square degrees. This powerful instrument will be installed at prime focus on the 4-m Mayall telescope at Kitt Peak, Arizona, along with a new optical corrector, which will provide a three-degree diameter field of view. The DESI collaboration will also deliver a spectroscopic pipeline and data management system to reduce and archive all data for eventual public use.