26 research outputs found

    LSST: from Science Drivers to Reference Design and Anticipated Data Products

    (Abridged) We describe here the most ambitious survey currently planned in the optical, the Large Synoptic Survey Telescope (LSST). A vast array of science will be enabled by a single wide-deep-fast sky survey, and LSST will have unique survey capability in the faint time domain. The LSST design is driven by four main science themes: probing dark energy and dark matter, taking an inventory of the Solar System, exploring the transient optical sky, and mapping the Milky Way. LSST will be a wide-field ground-based system sited at Cerro Pachón in northern Chile. The telescope will have an 8.4 m (6.5 m effective) primary mirror, a 9.6 deg² field of view, and a 3.2 Gigapixel camera. The standard observing sequence will consist of pairs of 15-second exposures in a given field, with two such visits in each pointing in a given night. With these repeats, the LSST system is capable of imaging about 10,000 square degrees of sky in a single filter in three nights. The typical 5σ point-source depth in a single visit in r will be ~24.5 (AB). The project is in the construction phase and will begin regular survey operations by 2022. The survey area will be contained within 30,000 deg² with δ < +34.5°, and will be imaged multiple times in six bands, ugrizy, covering the wavelength range 320-1050 nm. About 90% of the observing time will be devoted to a deep-wide-fast survey mode which will uniformly observe an 18,000 deg² region about 800 times (summed over all six bands) during the anticipated 10 years of operations, and yield a coadded map to r ~ 27.5. The remaining 10% of the observing time will be allocated to projects such as a Very Deep and Fast time domain survey. The goal is to make LSST data products, including a relational database of about 32 trillion observations of 40 billion objects, available to the public and scientists around the world.
    Comment: 57 pages, 32 color figures, version with high-resolution figures available from https://www.lsst.org/overvie
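    The quoted coadded depth follows from the standard √N scaling of point-source depth with visit count: for sky-limited, uncorrelated noise the flux limit improves as 1/√N, so the magnitude limit deepens by 2.5·log10(√N) = 1.25·log10(N). A minimal back-of-envelope sketch with the abstract's r-band numbers; the per-band visit count (~800/6) is an illustrative assumption, since the abstract only gives the six-band total.

```python
import math

def coadd_depth(m_single: float, n_visits: int) -> float:
    """5-sigma point-source depth of a coadd of n_visits exposures.

    Assumes sky-limited, uncorrelated noise, so the limiting flux scales
    as 1/sqrt(N) and the magnitude limit deepens by 1.25*log10(N).
    A back-of-envelope model, not the LSST pipeline.
    """
    return m_single + 1.25 * math.log10(n_visits)

# Abstract's numbers: single-visit r depth ~24.5 AB; ~800 visits are summed
# over six bands, so assume ~800/6 ~ 133 land in r (illustrative only).
print(coadd_depth(24.5, 800 // 6))  # ~27.2, a few tenths shy of the quoted r~27.5
```

    The real survey allocates visits unevenly across bands (more in r and i than in u or y), which closes most of the remaining gap to r ~ 27.5.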

    Separating the Early Universe from the Late Universe: cosmological parameter estimation beyond the black box

    We present a method for measuring the cosmic matter budget without assumptions about speculative Early Universe physics, and for measuring the primordial power spectrum P*(k) non-parametrically, either by combining CMB and LSS information or by using CMB polarization. Our method complements currently fashionable "black box" cosmological parameter analysis, constraining cosmological models in a more physically intuitive fashion by mapping measurements of CMB, weak lensing and cluster abundance into k-space, where they can be directly compared with each other and with galaxy and Lyman-alpha forest clustering. Including the new CBI results, we find that CMB measurements of P(k) overlap with those from 2dF galaxy clustering by over an order of magnitude in scale, and even overlap with weak lensing measurements. We describe how our approach can be used to raise the ambition level beyond cosmological parameter fitting as data improve, testing rather than assuming the underlying physics.
    Comment: Replaced to match accepted PRD version. Refs added. Combined CMB data and window functions at http://www.hep.upenn.edu/~max/pwindows.html or from [email protected]. 18 figs, 19 journal pages
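    The core move here is projecting heterogeneous probes onto the common axis of comoving wavenumber k so they can be confronted directly. A minimal sketch of that comparison step, assuming each probe has been reduced to band-power estimates of P(k) with Gaussian errors; the numbers and the interpolation scheme below are toy placeholders, not the paper's actual data or machinery.

```python
import numpy as np

# Toy band-power measurements from two probes: rows of (k [h/Mpc], P(k), sigma_P).
# Placeholder values for illustration only.
cmb_bands = np.array([[0.001, 9.2e4, 1.5e4],
                      [0.010, 2.8e4, 4.0e3],
                      [0.050, 1.1e4, 2.0e3]])
gal_bands = np.array([[0.020, 2.0e4, 5.0e3],
                      [0.050, 1.0e4, 1.8e3],
                      [0.200, 2.5e3, 6.0e2]])

def overlap_ratio(a, b):
    """Compare two probes where their k-ranges overlap by interpolating
    probe a onto probe b's k-grid (log-log) and forming the ratio, with
    naive error propagation that ignores probe a's errors for brevity."""
    lo, hi = max(a[0, 0], b[0, 0]), min(a[-1, 0], b[-1, 0])
    sel = (b[:, 0] >= lo) & (b[:, 0] <= hi)
    k = b[sel, 0]
    p_a = np.exp(np.interp(np.log(k), np.log(a[:, 0]), np.log(a[:, 1])))
    ratio = b[sel, 1] / p_a
    err = ratio * b[sel, 2] / b[sel, 1]
    return k, ratio, err

for k, r, e in zip(*overlap_ratio(cmb_bands, gal_bands)):
    print(f"k = {k:.3g} h/Mpc   P_gal/P_cmb = {r:.2f} +/- {e:.2f}")
```

    A ratio consistent with a constant over the overlap range is the kind of direct, model-light cross-check the abstract advocates, in place of funneling everything through a parametric fit.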

    The last stand before MAP: cosmological parameters from lensing, CMB and galaxy clustering

    Cosmic shear measurements have now improved to the point where they deserve to be treated on a par with CMB and galaxy clustering data for cosmological parameter analysis, using the full measured aperture-mass variance curve rather than a mere phenomenological parametrization thereof. We perform a detailed 9-parameter analysis of recent lensing (RCS), CMB (up to Archeops) and galaxy clustering (2dF) data, both separately and jointly. CMB and 2dF data are consistent with a simple flat adiabatic scale-invariant model with Ω_Λ = 0.72 ± 0.09, ω_cdm = 0.115 ± 0.013, ω_b = 0.024 ± 0.003, and a hint of reionization around z ~ 8. Lensing helps further tighten these constraints, but reveals tension regarding the power spectrum normalization: including the RCS survey results raises σ8 significantly and forces other parameters to uncomfortable values. Indeed, σ8 is emerging as the currently most controversial cosmological parameter, and we discuss possible resolutions of this σ8 problem. We also comment on the disturbing fact that many recent analyses (including this one) obtain error bars smaller than the Fisher matrix bound. We produce a CMB power spectrum combining all existing experiments, and using it for a "MAP versus world" comparison next month will provide a powerful test of how realistic the error estimates have been in the cosmology community.
    Comment: Added references and Fisher error discussion. Combined CMB data, window and covariance matrix for January "MAP vs World" contest at http://www.hep.upenn.edu/~max/cmblsslens.html or from [email protected]
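    The "Fisher matrix bound" invoked here is the Cramér-Rao floor: an unbiased analysis cannot report a marginalized error on parameter p_i below sqrt((F⁻¹)_ii), with F_ij = Σ_l (∂C_l/∂p_i)(∂C_l/∂p_j)/σ_l² for Gaussian errors. A toy numerical illustration of computing that floor; the two-parameter power-law spectrum and 5% errors are made up for the example and are not the paper's 9-parameter setup.

```python
import numpy as np

# Toy model: band powers C_l = A * l^n with fiducial parameters (A, n).
ells = np.arange(2, 1500)
A_fid, n_fid = 1.0, -2.0

def model(A, n):
    return A * ells.astype(float) ** n

sigma = 0.05 * model(A_fid, n_fid)   # assumed 5% Gaussian error per multipole

def deriv(i, eps=1e-5):
    """Finite-difference derivative of the model w.r.t. parameter i."""
    hi = np.array([A_fid, n_fid]); lo = hi.copy()
    hi[i] += eps; lo[i] -= eps
    return (model(*hi) - model(*lo)) / (2 * eps)

# Fisher matrix and its inverse: the Cramer-Rao covariance floor.
F = np.array([[np.sum(deriv(i) * deriv(j) / sigma**2) for j in range(2)]
              for i in range(2)])
cov = np.linalg.inv(F)
print("Cramer-Rao floor on (A, n):", np.sqrt(np.diag(cov)))
```

    Any quoted error bar smaller than this floor, for matching data and model, signals that some assumption in the error analysis (priors, fixed parameters, underestimated noise) is doing hidden work, which is the paper's point.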

    Sodium Beacon Used for Adaptive Optics on the Multiple Mirror Telescope


    Strategies to enable large-scale proteomics for reproducible research

    Reproducible research is the bedrock of experimental science. To enable the deployment of large-scale proteomics, we assess the reproducibility of mass spectrometry (MS) over time and across instruments and develop computational methods for improving quantitative accuracy. We perform 1560 data-independent acquisition (DIA)-MS runs of eight samples containing known proportions of ovarian and prostate cancer tissue and yeast, or control HEK293T cells. Replicates are run on six mass spectrometers operating continuously with varying maintenance schedules over four months, interspersed with ~5000 other runs. We utilise negative controls and replicates to remove unwanted variation and enhance biological signal, outperforming existing methods. We also design a method for reducing missing values. Integrating these computational modules into a pipeline (ProNorM), we mitigate variation among instruments over time and accurately predict tissue proportions. We demonstrate how to improve the quantitative analysis of large-scale DIA-MS data, providing a pathway toward clinical proteomics.
    ISSN: 2041-172
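    Using negative controls and replicates to "remove unwanted variation" reads like an RUV-style factor correction: estimate nuisance factors from control analytes that should not vary biologically, then regress them out of every feature. A minimal sketch of that general idea; it is not the paper's ProNorM implementation, and the matrix shapes, control set, and factor count below are assumptions.

```python
import numpy as np

def ruv_correct(Y: np.ndarray, control_idx: np.ndarray, k: int = 1) -> np.ndarray:
    """RUV-style correction for a runs x features log-intensity matrix Y.

    Nuisance factors W (runs x k) are estimated by SVD of the centered
    negative-control columns, whose variation is assumed purely technical;
    their contribution is then regressed out of all features.
    A sketch of the general RUV idea, not the ProNorM pipeline.
    """
    Yc = Y[:, control_idx] - Y[:, control_idx].mean(axis=0)
    U, s, _ = np.linalg.svd(Yc, full_matrices=False)
    W = U[:, :k] * s[:k]                          # estimated unwanted factors
    beta, *_ = np.linalg.lstsq(W, Y, rcond=None)  # per-feature loadings
    return Y - W @ beta                           # corrected intensities

# Toy demo: 60 runs x 500 features with a shared technical (batch) effect;
# the first 50 features play the role of negative controls.
rng = np.random.default_rng(0)
batch = np.outer(rng.normal(size=60), rng.normal(size=500))
Y = rng.normal(loc=20.0, scale=1.0, size=(60, 500)) + batch
cleaned = ruv_correct(Y, control_idx=np.arange(50), k=1)
```

    The design choice worth noting is that the correction is learned only from the controls, so genuine biological signal in the remaining features is not regressed away along with the instrument drift.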