The LSST DESC data challenge 1: Generation and analysis of synthetic images for next-generation surveys
Data Challenge 1 (DC1) is the first synthetic data set produced by the Rubin Observatory Legacy Survey of Space and Time (LSST) Dark Energy Science Collaboration (DESC). DC1 is designed to develop and validate data reduction and analysis and to study the impact of systematic effects that will affect the LSST data set. DC1 comprises r-band observations of 40 deg² to 10-yr LSST depth. We present each stage of the simulation and analysis process: (a) generation, by synthesizing sources from cosmological N-body simulations in individual sensor-visit images with different observing conditions; (b) reduction using a development version of the LSST Science Pipelines; and (c) matching to the input cosmological catalogue for validation and testing. We verify that testable LSST requirements pass within the fidelity of DC1. We establish a selection procedure that produces a sufficiently clean extragalactic sample for clustering analyses, and we discuss residual sample contamination, including contributions from inefficiency in star-galaxy separation and imperfect deblending. We compute the galaxy power spectrum on the simulated field and conclude that: (i) survey properties have an impact of 50 per cent of the statistical uncertainty for the scales and models used in DC1; (ii) a selection to eliminate artefacts in the catalogues is necessary to avoid biases in the measured clustering; and (iii) the presence of bright objects has a significant impact (2σ-6σ) on the estimated power spectra at small scales (ℓ > 1200), highlighting the impact of blending in studies at small angular scales with LSST.
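The clustering test above rests on estimating a galaxy angular power spectrum from a pixelized catalogue. As a hedged illustration only (a minimal flat-sky estimator for a square count map, not the DESC pipeline, which handles the curved sky, masks, and systematics; the function name is hypothetical), the basic estimator can be sketched as:

```python
import numpy as np

def flat_sky_power_spectrum(counts, pixel_scale_arcmin, n_bins=10):
    """Binned flat-sky angular power spectrum of a square galaxy count map."""
    n = counts.shape[0]
    nbar = counts.mean()
    delta = counts / nbar - 1.0                      # galaxy overdensity field
    pix = np.radians(pixel_scale_arcmin / 60.0)      # pixel size in radians
    area = (n * pix) ** 2                            # map area in steradians
    # FFT convention chosen so that C_ell carries units of steradians
    dk = np.fft.fft2(delta) * pix ** 2
    power2d = np.abs(dk) ** 2 / area
    # Flat-sky multipole ell for each Fourier mode
    freqs = np.fft.fftfreq(n, d=pix)
    lx, ly = np.meshgrid(2 * np.pi * freqs, 2 * np.pi * freqs)
    ell = np.hypot(lx, ly)
    edges = np.linspace(ell[ell > 0].min(), ell.max() / np.sqrt(2), n_bins + 1)
    idx = np.digitize(ell.ravel(), edges)
    p = power2d.ravel()
    cl = np.array([p[idx == i].mean() if np.any(idx == i) else 0.0
                   for i in range(1, n_bins + 1)])
    return 0.5 * (edges[1:] + edges[:-1]), cl
```

For a pure Poisson map this recovers the shot-noise plateau pix²/n̄, which is one way such an estimator can be sanity-checked before applying it to a simulated field.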
Spurious Shear in Weak Lensing with LSST
The complete 10-year survey from the Large Synoptic Survey Telescope (LSST) will image ~20,000 square degrees of sky in six filter bands every few nights, bringing the final survey depth to r ~ 27.5, with over 4 billion well-measured galaxies. To take full advantage of this unprecedented statistical power, the systematic errors associated with weak lensing measurements need to be controlled to a level similar to the statistical errors. This work is the first attempt to quantitatively estimate the absolute level and statistical properties of the systematic errors on weak lensing shear measurements due to the most important physical effects in the LSST system via high-fidelity ray-tracing simulations. We identify and isolate the different sources of algorithm-independent, additive systematic errors on shear measurements for LSST and predict their impact on the final cosmic shear measurements using conventional weak lensing analysis techniques. We find that the main source of the errors is an inability to adequately characterise the atmospheric point spread function (PSF) due to its high-frequency spatial variation on angular scales smaller than ~10′ in the single short exposures, which propagates into a spurious shear correlation function at the 10⁻⁴-10⁻³ level on these scales. With the large multi-epoch dataset that will be acquired by LSST, the stochastic errors average out, bringing the final spurious shear correlation function to a level very close to the statistical errors. Our results imply that the cosmological constraints from LSST will not be severely limited by these algorithm-independent, additive systematic effects. Comment: 22 pages, 12 figures, accepted by MNRAS
Evidence for the presence of dust in intervening QSO absorbers from the Sloan Digital Sky Survey
We find evidence for dust in intervening QSO absorbers from the spectra of QSOs in the Sloan Digital Sky Survey Data Release 1. No evidence is found for the 2175 Å feature that is present in the Milky Way dust extinction curve; the extinction curve instead resembles the SMC extinction curve. The observed Δ(g-i) excess for QSOs with strong absorption systems appears to result from reddening due to dust in the intervening absorbers. Comment: Poster paper presented at IAU Colloquium #199 on "Probing Galaxies through Quasar Absorption Lines", held in Shanghai, China, from March 14th to 18th, 2005
The Sloan Digital Sky Survey Reverberation Mapping Project: Hα and Hβ reverberation measurements from first-year spectroscopy and photometry
Funding: UK Science and Technology Facilities Council (STFC) grant ST/M001296/1 (KH). We present reverberation mapping results from the first year of combined spectroscopic and photometric observations of the Sloan Digital Sky Survey Reverberation Mapping Project. We successfully recover reverberation time delays between the g+i band emission and the broad Hβ emission line for a total of 44 quasars, and for the broad Hα emission line in 18 quasars. Time delays are computed using the JAVELIN and CREAM software and the traditional interpolated cross-correlation function (ICCF): using well-defined criteria, we report measurements of 32 Hβ and 13 Hα lags with JAVELIN, 42 Hβ and 17 Hα lags with CREAM, and 16 Hβ and eight Hα lags with the ICCF. Lag values are generally consistent among the three methods, though we typically measure smaller uncertainties with JAVELIN and CREAM than with the ICCF, given the more physically motivated light curve interpolation and more robust statistical modeling of the former two methods. The median redshift of our Hβ-detected sample of quasars is 0.53, significantly higher than that of the previous reverberation mapping sample. We find that in most objects, the time delay of the Hα emission is consistent with or slightly longer than that of Hβ. We measure black hole masses using our measured time delays and line widths for these quasars. These black hole mass measurements are mostly consistent with expectations based on the local M-σ relationship, and are also consistent with single-epoch black hole mass measurements. This work increases the current sample size of reverberation-mapped active galaxies by about two-thirds and represents the first large sample of reverberation mapping observations beyond the local (z < 0.3) universe. Postprint. Peer reviewed.
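The ICCF approach mentioned above can be sketched in a few lines: for each trial lag, interpolate the continuum light curve onto the shifted line epochs, compute the Pearson correlation coefficient, and take the lag that maximizes it. This is a bare-bones illustration with a hypothetical function name; the real ICCF analysis also includes flux-randomization/subset-sampling error estimation, which is omitted here.

```python
import numpy as np

def iccf_peak_lag(t_cont, f_cont, t_line, f_line, lags):
    """Return r(tau) and the lag maximizing the interpolated cross-correlation."""
    r = np.empty(len(lags))
    for i, tau in enumerate(lags):
        # Continuum interpolated at the line epochs shifted back by the trial lag
        f_shift = np.interp(t_line - tau, t_cont, f_cont)
        r[i] = np.corrcoef(f_shift, f_line)[0, 1]
    return r, lags[np.argmax(r)]
```

On a synthetic pair of light curves where the "line" simply echoes the continuum 10 days later, the peak of r(τ) lands at a lag of 10 days, which is the basic recovery the lag-quality criteria in the survey analysis are designed to vet.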
Narrative Models: a Database Approach to Modeling Medieval Cairo
This paper explores the use of three-dimensional simulations to investigate transformations of urban form in medieval Cairo, and draws lessons about using computers to support historical visualization. Our first attempt to create a single, extremely detailed model of Cairo proved unworkable. From this experience we developed a database approach to organizing modelling projects of complex urban environments. The database consists of several complete models at different levels of abstraction. This approach has three advantages over the earlier one: the model is never viewed as incomplete, the framework supports both additive and subtractive chronological studies, and, finally, the database is viewed as infinitely expandable. Using modelling software as a tool for inquiry into architectural history becomes more feasible with this new approach.
Atmospheric PSF Interpolation for Weak Lensing in Short Exposure Imaging Data
A main science goal for the Large Synoptic Survey Telescope (LSST) is to measure the cosmic shear signal from weak lensing to extreme accuracy. One difficulty, however, is that with the short exposure time (~15 seconds) proposed, the spatial variation of the Point Spread Function (PSF) shapes may be dominated by the atmosphere, in addition to optics errors. While optics errors mainly cause the PSF to vary on angular scales similar to or larger than a single CCD sensor, the atmosphere generates stochastic structures on a wide range of angular scales. It thus becomes a challenge to infer the multi-scale, complex atmospheric PSF patterns by interpolating the sparsely sampled stars in the field. In this paper we present a new method, psfent, for interpolating the PSF shape parameters, based on reconstructing underlying shape parameter maps with a multi-scale maximum entropy algorithm. We demonstrate, using images from the LSST Photon Simulator, the performance of our approach relative to a 5th-order polynomial fit (representing the current standard) and a simple boxcar smoothing technique. Quantitatively, psfent predicts more accurate PSF models in all scenarios, and the residual PSF errors are spatially less correlated. This improvement in PSF interpolation leads to a factor of 3.5 lower systematic errors in the shear power spectrum on scales smaller than ~13′, compared to polynomial fitting. We estimate that with psfent and for stellar densities greater than ~1 arcmin⁻², the spurious shear correlation from PSF interpolation, after combining a complete 10-year dataset from LSST, is lower than the corresponding statistical uncertainties on the cosmic shear power spectrum, even under a conservative scenario.
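The 5th-order polynomial baseline that psfent is compared against can be sketched as an ordinary least-squares fit of a PSF shape parameter over the field. This is a generic illustration with hypothetical function names, not the pipeline's implementation or the maximum-entropy method itself:

```python
import numpy as np

def poly_design(x, y, order=5):
    """Design matrix of all monomials x^i * y^j with i + j <= order."""
    return np.column_stack([x**i * y**j
                            for i in range(order + 1)
                            for j in range(order + 1 - i)])

def fit_psf_shape(x_star, y_star, e_star, order=5):
    """Least-squares polynomial coefficients for a PSF shape parameter
    (e.g. one ellipticity component) measured at sparse star positions."""
    coeffs, *_ = np.linalg.lstsq(poly_design(x_star, y_star, order),
                                 e_star, rcond=None)
    return coeffs

def eval_psf_shape(x, y, coeffs, order=5):
    """Evaluate the fitted shape parameter at arbitrary field positions."""
    return poly_design(x, y, order) @ coeffs
```

A global fit like this captures smooth, optics-like patterns well but, being a single low-order surface, cannot track the stochastic small-scale atmospheric structure described above; that limitation is what motivates multi-scale reconstructions such as psfent.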
Erratum: The Sloan Digital Sky Survey Reverberation Mapping Project: Hα and Hβ reverberation measurements from first-year spectroscopy and photometry (Astrophysical Journal (2017) 851 (21) DOI: 10.3847/1538-4357/aa98dc)
© 2018. The American Astronomical Society. All rights reserved. We found a bug in the formula used to calculate the uncertainties in the virial products. This error produced incorrect uncertainties for the virial products and, consequently, for the black hole mass (MBH) measurements reported in Tables 4 and 5. These uncertainties were used to produce Figures 12, 13, and 14. The error is minor and does not affect any of our results or their interpretation, so no edits to the text are necessary. We here provide updated tables and figures produced with the correct MBH uncertainties.