
    The role of the Virtual Astronomical Observatory in the era of massive data sets


    The ESSENCE Supernova Survey: Survey Optimization, Observations, and Supernova Photometry

    We describe the implementation and optimization of the ESSENCE supernova survey, which we have undertaken to measure the equation-of-state parameter of dark energy. We present a method for optimizing the survey exposure times and cadence to maximize our sensitivity to the dark energy equation-of-state parameter w = P/(rho c^2) for a given fixed amount of telescope time. For our survey on the CTIO 4m telescope, measuring the luminosity distances and redshifts of supernovae at modest redshifts (z ~ 0.5 +- 0.2) is optimal for determining w. We describe the data analysis pipeline, which uses reliable and robust image subtraction to find supernovae automatically and in near real time. Since making cosmological inferences with supernovae relies crucially on accurate measurement of their brightnesses, we describe our efforts to establish a thorough calibration of the CTIO 4m natural photometric system. In its first four years, ESSENCE has discovered and spectroscopically confirmed 102 type Ia SNe at redshifts from 0.10 to 0.78, identified through an impartial, effective methodology for spectroscopic classification and redshift determination. We present the resulting light curves for all type Ia supernovae found by ESSENCE and used in our measurement of w, presented in Wood-Vasey et al. (2007).
    Comment: Submitted to ApJ. Companion paper to Wood-Vasey et al. (2007). Electronic tables available at http://www.ctio.noao.edu/essence/wresult
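
    The survey-design claim above, that supernovae near z ~ 0.5 are the sweet spot for constraining w, can be made concrete with a short calculation. The following is a minimal sketch, assuming a flat wCDM cosmology with illustrative parameter values (H0 = 70 km/s/Mpc, Omega_m = 0.3) rather than the ESSENCE analysis itself, of how the distance modulus responds to w at different redshifts. The real optimization also folds in the photometric precision achievable per unit of telescope time, which degrades at high z; this sketch shows only the cosmological lever arm.

        # Sketch: sensitivity of the supernova distance modulus to the
        # dark-energy equation-of-state parameter w, versus redshift.
        # Illustrative flat-wCDM parameter values; NOT the ESSENCE analysis.
        import numpy as np
        from scipy.integrate import quad

        H0 = 70.0      # Hubble constant, km/s/Mpc (assumed)
        OM = 0.3       # matter density parameter (assumed)
        C = 2.998e5    # speed of light, km/s

        def E(z, w):
            """Dimensionless expansion rate E(z) = H(z)/H0 for flat wCDM."""
            return np.sqrt(OM * (1 + z)**3 + (1 - OM) * (1 + z)**(3 * (1 + w)))

        def mu(z, w):
            """Distance modulus 5*log10(d_L / 10 pc)."""
            dc = (C / H0) * quad(lambda zp: 1.0 / E(zp, w), 0.0, z)[0]  # comoving distance, Mpc
            dl = (1 + z) * dc                                           # luminosity distance, Mpc
            return 5.0 * np.log10(dl * 1e5)                             # Mpc -> units of 10 pc

        for z in (0.1, 0.3, 0.5, 0.7, 1.0):
            dmu_dw = (mu(z, -0.9) - mu(z, -1.1)) / 0.2   # finite-difference sensitivity
            print(f"z = {z:.1f}:  dmu/dw ≈ {dmu_dw:+.3f} mag")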

    A galaxy cluster finding algorithm for large-scale photometric surveys

    As the largest gravitationally bound objects in the Universe, galaxy clusters can be used to probe a variety of topics in astrophysics and cosmology. This thesis describes the development of an algorithm to find galaxy clusters using non-parametric methods applied to catalogs of galaxies generated from multi-colour CCD observations. It is motivated by the emergence of increasingly large photometric galaxy surveys and the measurement of key cosmological parameters through the evolution of the cluster mass function. The algorithm presented herein, AperC4, reconstructs the successful spectroscopic cluster-finding algorithm C4 (Miller et al., 2005) and adapts it to large photometric surveys, with the goal of applying it to data from the Dark Energy Survey (DES). AperC4 uses statistical techniques to identify collections of galaxies that are unusually clustered in a multi-dimensional space. To characterize the new algorithm, it is tested with simulations produced by the DES Collaboration, and I evaluate its application to photometric datasets. In doing so, I show how AperC4 functions as a cosmology-independent cluster finder and formulate metrics for a "successful" cluster finder. Finally, I produce a galaxy catalog appropriate for statistical analysis. C4 is applied to the SDSS galaxy catalog, and the resulting cluster catalog is presented with some initial analyses.
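
    To make the phrase "unusually clustered in a multi-dimensional space" concrete, here is a toy sketch in the same spirit as the approach described above: neighbour counts in a joint position-colour aperture are compared against a colour-shuffled null catalogue, so that no parametric cluster model is assumed. All scales, thresholds and column choices below are illustrative assumptions, not the published AperC4 parameters.

        # Sketch: non-parametric detection of galaxies that are unusually
        # clustered in a joint (RA, Dec, colour) space, by comparing neighbour
        # counts against a colour-shuffled (unclustered) null catalogue.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 5000
        gals = np.column_stack([
            rng.uniform(0, 10, n),     # RA  (deg)
            rng.uniform(0, 10, n),     # Dec (deg)
            rng.normal(1.0, 0.3, n),   # g-r colour
        ])
        # Inject a toy "cluster": compact on the sky, with a tight red sequence.
        gals[:200] = rng.normal([5.0, 5.0, 1.4], [0.05, 0.05, 0.03], (200, 3))

        def aperture_counts(data, scale=np.array([0.1, 0.1, 0.05])):
            """Count neighbours of each galaxy within a box in scaled space."""
            x = data / scale
            return np.array([(np.abs(x - xi).max(axis=1) < 1.0).sum() - 1
                             for xi in x])   # -1 excludes the galaxy itself

        observed = aperture_counts(gals)
        # Null hypothesis: same sky positions, but colours shuffled at random,
        # which destroys the position-colour correlation of real clusters.
        shuffled = gals.copy()
        rng.shuffle(shuffled[:, 2])
        null = aperture_counts(shuffled)
        threshold = np.percentile(null, 99.9)   # assumed significance cut
        print(f"{(observed > threshold).sum()} galaxies flagged as clustered")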

    The SkyMapper Transient Survey

    The SkyMapper 1.3 m telescope at Siding Spring Observatory has now begun regular operations. Alongside the Southern Sky Survey, a comprehensive digital survey of the entire southern sky, SkyMapper will carry out a search for supernovae and other transients. The search strategy, covering a total footprint area of ~2000 deg^2 with a cadence of ≤ 5 days, is optimised for discovery and follow-up of low-redshift type Ia supernovae to constrain cosmic expansion and peculiar velocities. We describe the search operations and infrastructure, including a parallelised software pipeline to discover variable objects in difference imaging; simulations of the performance of the survey over its lifetime; public access to discovered transients; and some first results from the Science Verification data.
    Comment: 13 pages, 11 figures; submitted to PASA
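
    The pipeline's central operation, discovering variable objects in difference imaging, reduces to subtracting a flux-scaled reference frame from each new frame and searching the residual for significant sources. The sketch below illustrates only that step, on synthetic data; a production pipeline (presumably including SkyMapper's) must also register the images and match their point-spread functions, and the 5-sigma cut is an assumed value.

        # Sketch: difference-imaging transient detection in miniature.
        # A reference image is flux-scaled to the new epoch and subtracted;
        # significant positive residuals are transient candidates.
        import numpy as np

        rng = np.random.default_rng(1)
        shape = (256, 256)
        sky, sky_sigma = 100.0, 10.0
        reference = rng.normal(sky, sky_sigma, shape)
        new = rng.normal(sky, sky_sigma, shape)
        new[120:124, 80:84] += 80.0          # inject a toy transient

        # Match the flux zero-points, then subtract.
        scale = np.median(new) / np.median(reference)
        diff = new - scale * reference

        # Robust noise estimate of the residual via the MAD, then threshold.
        noise = 1.4826 * np.median(np.abs(diff - np.median(diff)))
        candidates = np.argwhere(diff > 5 * noise)   # assumed 5-sigma cut
        print(f"{len(candidates)} candidate pixels, e.g. {candidates[:3].tolist()}")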

    Accelerating Spatial Data Processing with MapReduce

    MapReduce is a key-value-based programming model, with an associated implementation, for processing large data sets. It has been adopted in various scenarios and seems promising. However, when spatial computation is expressed directly in this key-value model, difficulties arise from ill-suited primitives and degraded performance. In this paper, we present the following methods: 1) a splitting method for balancing workload; 2) a pending-file structure and redundant data partition for handling relations between spatial objects; 3) a strip-based two-direction plane-sweeping algorithm for accelerating computation. Based on these methods, ANN (all-nearest-neighbors) queries and astronomical cross-certification are developed. Performance evaluation shows that the MapReduce-based spatial applications outperform traditional DBMS-based ones.
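
    The strip-based decomposition maps naturally onto MapReduce: the map phase assigns each point to a horizontal strip, duplicating points that fall within a margin of a strip border (the redundant-partition idea), the shuffle stage groups points by strip, and each reduce task runs a plane sweep within its strip. The pure-Python sketch below imitates that pipeline without a Hadoop dependency; the strip count, the border margin, and the simplification of handling cross-strip neighbours solely through the duplicated margin are illustrative assumptions, not the paper's exact scheme.

        # Sketch: strip-partitioned all-nearest-neighbors in map/shuffle/reduce style.
        from collections import defaultdict
        import math, random

        random.seed(0)
        points = [(random.random(), random.random()) for _ in range(10_000)]
        NSTRIPS, MARGIN = 8, 0.05   # assumed values

        def map_phase(pts):
            """Emit (strip_id, point); points near a border also go to the neighbour strip."""
            for x, y in pts:
                s = min(int(y * NSTRIPS), NSTRIPS - 1)
                yield s, (x, y)
                if s > 0 and y - s / NSTRIPS < MARGIN:
                    yield s - 1, (x, y)
                if s < NSTRIPS - 1 and (s + 1) / NSTRIPS - y < MARGIN:
                    yield s + 1, (x, y)

        def reduce_phase(strip_pts):
            """Plane sweep: scan x-sorted neighbours outward, pruning once dx^2 >= best."""
            pts = sorted(set(strip_pts))
            nn = {}
            for i, (x, y) in enumerate(pts):
                best, who = math.inf, None
                for step in (-1, 1):            # sweep left, then right
                    j = i + step
                    while 0 <= j < len(pts):
                        dx = pts[j][0] - x
                        if dx * dx >= best:     # no closer point possible this way
                            break
                        d = dx * dx + (pts[j][1] - y) ** 2
                        if d < best:
                            best, who = d, pts[j]
                        j += step
                nn[(x, y)] = who
            return nn

        shuffle = defaultdict(list)             # the "shuffle" stage: group by key
        for key, pt in map_phase(points):
            shuffle[key].append(pt)
        results = {}
        for strip_pts in shuffle.values():      # one reduce task per strip
            results.update(reduce_phase(strip_pts))
        print(f"nearest neighbours found for {len(results)} points")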

    The New Horizon Run Cosmological N-Body Simulations

    We present two large cosmological N-body simulations, called Horizon Run 2 (HR2) and Horizon Run 3 (HR3), made using 6000^3 = 216 billion and 7210^3 = 374 billion particles, spanning volumes of (7.200 Gpc/h)^3 and (10.815 Gpc/h)^3, respectively. These simulations improve on our previous Horizon Run 1 (HR1) by up to a factor of 4.4 in volume, and range from 2600 to over 8800 times the volume of the Millennium Run. In addition, they achieve a considerably finer mass resolution, down to 1.25x10^11 M_sun/h, allowing us to resolve galaxy-size halos with mean particle separations of 1.2 Mpc/h and 1.5 Mpc/h, respectively. We have measured the power spectrum, correlation function, mass function and basic halo properties with percent-level accuracy, and verified that they correctly reproduce the LCDM theoretical expectations, in excellent agreement with linear perturbation theory. Our unprecedentedly large-volume N-body simulations can be used for a variety of studies in cosmology and astrophysics, ranging from large-scale structure topology, baryon acoustic oscillations, dark energy and the characterization of the expansion history of the Universe to galaxy formation science, in connection with the new SDSS-III. To this end, we made a total of 35 all-sky mock surveys along the past light cone out to z=0.7 (8 from HR2 and 27 from HR3) to simulate the BOSS geometry. The simulations and mock surveys are already publicly available at http://astro.kias.re.kr/Horizon-Run23/.
    Comment: 18 pages, 10 figures. Added clarification on Fig 6. Published in the Journal of the Korean Astronomical Society (JKAS). The paper with high-resolution figures is available at http://jkas.kas.org/journals/2011v44n6/v44n6.ht
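
    For readers unfamiliar with how a statistic like the power spectrum is measured from a particle snapshot, the toy sketch below shows the standard pipeline at miniature scale: deposit particles on a density grid, Fourier transform the overdensity, and average |delta_k|^2 in spherical k-bins. The box size, grid resolution, particle count and nearest-grid-point assignment are illustrative assumptions; production analyses such as the Horizon Run measurements use far larger grids, higher-order assignment schemes and window-function corrections. For the unclustered random catalogue generated here, the measured P(k) should sit roughly at the shot-noise level V/N ≈ 10 (Mpc/h)^3, suppressed toward high k by the grid assignment window.

        # Sketch: measuring P(k) from a toy particle snapshot.
        import numpy as np

        rng = np.random.default_rng(2)
        L, NG, NP = 100.0, 64, 100_000       # box (Mpc/h), grid cells/side, particles
        pos = rng.uniform(0, L, (NP, 3))     # random (unclustered) toy catalogue

        # Nearest-grid-point mass assignment -> overdensity field delta.
        idx = (pos / L * NG).astype(int) % NG
        rho = np.zeros((NG, NG, NG))
        np.add.at(rho, (idx[:, 0], idx[:, 1], idx[:, 2]), 1.0)
        delta = rho / rho.mean() - 1.0

        # Fourier transform (continuum normalization) and spherical k-binning.
        dk = np.fft.rfftn(delta) * (L / NG) ** 3
        k = np.fft.fftfreq(NG, d=L / NG) * 2 * np.pi
        kr = np.fft.rfftfreq(NG, d=L / NG) * 2 * np.pi
        kx, ky, kz = np.meshgrid(k, k, kr, indexing="ij")
        kmag = np.sqrt(kx**2 + ky**2 + kz**2)
        power = np.abs(dk) ** 2 / L**3       # P(k) estimator, (Mpc/h)^3

        bins = np.linspace(2 * np.pi / L, k.max(), 20)
        which = np.digitize(kmag.ravel(), bins)
        for b in range(1, len(bins)):
            sel = which == b
            if sel.any():
                print(f"k ≈ {kmag.ravel()[sel].mean():.2f} h/Mpc:  "
                      f"P ≈ {power.ravel()[sel].mean():.1f} (Mpc/h)^3")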

    BOSS-LDG: A Novel Computational Framework that Brings Together Blue Waters, Open Science Grid, Shifter and the LIGO Data Grid to Accelerate Gravitational Wave Discovery

    We present a novel computational framework that connects Blue Waters, the NSF-supported, leadership-class supercomputer operated by NCSA, to the Laser Interferometer Gravitational-Wave Observatory (LIGO) Data Grid via Open Science Grid technology. To enable this computational infrastructure, we configured, for the first time, a LIGO Data Grid Tier-1 Center that can submit heterogeneous LIGO workflows using Open Science Grid facilities. In order to enable a seamless connection between the LIGO Data Grid and Blue Waters via Open Science Grid, we utilize Shifter to containerize LIGO's workflow software. This work represents the first time Open Science Grid, Shifter, and Blue Waters have been unified to tackle a scientific problem and, in particular, the first time a framework of this nature has been used in the context of large-scale gravitational wave data analysis. This new framework has been used in the last several weeks of LIGO's second discovery campaign to run the most computationally demanding gravitational wave search workflows on Blue Waters, and to accelerate discovery in the emergent field of gravitational wave astrophysics. We discuss the implications of this novel framework for a wider ecosystem of High Performance Computing users.
    Comment: 10 pages, 10 figures. Accepted as a Full Research Paper to the 13th IEEE International Conference on eScience