
    The Dark Energy Survey Data Management System

    The Dark Energy Survey collaboration will study cosmic acceleration with a 5000 deg2 grizY survey in the southern sky over 525 nights from 2011 to 2016. The DES data management (DESDM) system will be used to process and archive these data and the resulting science-ready data products. The DESDM system consists of an integrated archive, a processing framework, an ensemble of astronomy codes and a data access framework. We are developing the DESDM system for operation in the high performance computing (HPC) environments at NCSA and Fermilab. Operating the DESDM system in an HPC environment offers both speed and flexibility. We will employ it for our regular nightly processing needs, and for more compute-intensive tasks such as large-scale image coaddition campaigns, extraction of weak lensing shear from the full survey dataset, and massive seasonal reprocessing of the DES data. Data products will be available to the Collaboration and later to the public through a virtual-observatory-compatible web portal. Our approach leverages investments in publicly available HPC systems, greatly reducing hardware and maintenance costs to the project, which must deploy and maintain only the storage and database platforms and the orchestration and web-portal nodes that are specific to DESDM. In Fall 2007, we tested the current DESDM system on both simulated and real survey data. We used TeraGrid to process 10 simulated DES nights (3 TB of raw data), ingesting and calibrating approximately 250 million objects into the DES Archive database. We also used DESDM to process and calibrate over 50 nights of survey data acquired with the Mosaic2 camera. Comparison to truth tables in the case of the simulated data, and internal cross-checks in the case of the real data, indicates that astrometric and photometric data quality is excellent.
    Comment: To be published in the proceedings of the SPIE conference on Astronomical Instrumentation (held in Marseille in June 2008). This preprint is made available with the permission of SPIE. Further information, together with a preprint containing full-quality images, is available at http://desweb.cosmology.uiuc.edu/wik

    CERN Storage Systems for Large-Scale Wireless

    The project aims at evaluating the use of CERN computing infrastructure for next-generation sensor-network data analysis. The proposed system allows the simulation of a large-scale sensor array for traffic analysis, streaming its data to CERN storage systems in an efficient way. The data are made available for offline and quasi-online analysis, enabling both long-term planning and fast reaction to the environment.
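    A minimal sketch of the batching idea behind such streaming: buffering readings into fixed-size batches keeps the number of storage operations small, which is what makes writing to a remote storage system efficient. All names here are illustrative assumptions, and a local directory stands in for the real CERN storage backend:

    ```python
    import json
    import tempfile
    from pathlib import Path

    def simulate_readings(n_sensors, n_samples):
        """Yield mock traffic-sensor readings for a simulated sensor array."""
        for t in range(n_samples):
            for s in range(n_sensors):
                # Deterministic mock "vehicle count" in place of a real measurement.
                yield {"sensor": s, "t": t, "count": (s * 7 + t * 3) % 50}

    def stream_to_storage(readings, out_dir, batch_size=100):
        """Buffer readings into fixed-size batches; write each batch as one file."""
        out_dir = Path(out_dir)
        out_dir.mkdir(parents=True, exist_ok=True)
        batch, n_files = [], 0
        for r in readings:
            batch.append(r)
            if len(batch) == batch_size:
                (out_dir / f"batch_{n_files:05d}.json").write_text(json.dumps(batch))
                batch, n_files = [], n_files + 1
        if batch:  # flush the final partial batch
            (out_dir / f"batch_{n_files:05d}.json").write_text(json.dumps(batch))
            n_files += 1
        return n_files

    with tempfile.TemporaryDirectory() as d:
        files = stream_to_storage(simulate_readings(10, 50), d, batch_size=100)
        print(files)  # 10 sensors x 50 samples = 500 readings -> 5 files
    ```

    The same buffer-and-flush pattern applies whether the sink is a local disk, an object store, or a remote filesystem; only the write call changes.
    
    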

    Some statistical and computational challenges, and opportunities in astronomy

    The complexity and volume of astronomical data have increased in recent decades due to major technological improvements in instrumentation and data collection methods. The contemporary astronomer is flooded with terabytes of raw data that produce enormous multidimensional catalogs of objects (stars, galaxies, quasars, etc.) numbering in the billions, with hundreds of measured numbers for each object. The astronomical community thus faces a key task: to enable efficient and objective scientific exploitation of enormous multifaceted data sets and the complex links between data and astrophysical theory. In recognition of this task, the National Virtual Observatory (NVO) initiative recently emerged to federate numerous large digital sky archives, and to develop tools to explore and understand these vast volumes of data. The effective use of such integrated massive data sets presents a variety of new and challenging statistical and algorithmic problems that require methodological advances. An interdisciplinary team of statisticians, astronomers and computer scientists from The Pennsylvania State University, California Institute of Technology and Carnegie Mellon University is developing statistical methodology for the NVO. A brief glimpse into the Virtual Observatory and the work of the Penn State-led team is provided here.

    Open-source digital technologies for low-cost monitoring of historical constructions

    This paper shows new possibilities of using novel, open-source, low-cost platforms for the structural health monitoring of heritage structures. The objective of the study is to assess increasingly available open-source digital modeling and fabrication technologies in order to identify suitable counterparts for the typical components of a continuous static monitoring system for a historical construction. The results of the research include a simple case study, presented with low-cost, open-source, calibrated components, as well as an assessment of different alternatives for deploying basic structural health monitoring arrangements. The results show the great potential of these existing technologies, which may help to promote widespread and cost-efficient monitoring of the built cultural heritage. Such a scenario may contribute to the onset of commonplace digital records of historical constructions in an open-source, versatile and reliable fashion.
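    As a sketch of the simplest analysis such a continuous static monitoring arrangement might run, the snippet below flags slow drift in a sensor channel by comparing the means of two consecutive windows of samples. The crack-width channel, window size and threshold are assumptions for the example, not components described in the paper:

    ```python
    def check_drift(samples, window=24, threshold=0.05):
        """Flag a slow trend in a channel of static readings (e.g. crack width in mm).

        Static monitoring watches slowly varying quantities, so a minimal check
        compares the mean of the latest window of samples against the mean of
        the window before it and flags a change above the threshold.
        """
        if len(samples) < 2 * window:
            return False  # not enough history yet
        recent = sum(samples[-window:]) / window
        previous = sum(samples[-2 * window:-window]) / window
        return abs(recent - previous) > threshold

    stable = [1.00] * 48                      # 48 hourly readings, no movement
    drifting = [1.00] * 24 + [1.10] * 24      # crack opens by 0.10 mm
    print(check_drift(stable), check_drift(drifting))  # False True
    ```

    A real deployment would add per-sensor calibration and temperature compensation, but the windowed comparison above is the core of a low-cost alerting loop.
    
    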

    Pathway to the Square Kilometre Array - The German White Paper -

    The Square Kilometre Array (SKA) is the most ambitious radio telescope ever planned. With a collecting area of about a square kilometre, the SKA will be far superior in sensitivity and observing speed to all current radio facilities. The scientific capability promised by the SKA and its technological challenges provide an ideal base for interdisciplinary research, technology transfer, and collaboration between universities, research centres and industry. The SKA in the radio regime and the European Extremely Large Telescope (E-ELT) in the optical band are on the roadmap of the European Strategy Forum for Research Infrastructures (ESFRI) and have been recognised as essential facilities for European research in astronomy. This "White Paper" outlines the German science and R&D interests in the SKA project and will provide the basis for future funding applications to secure German involvement in the Square Kilometre Array.
    Comment: Editors: H. R. Klöckner, M. Kramer, H. Falcke, D. J. Schwarz, A. Eckart, G. Kauffmann, A. Zensus; 150 pages (low-resolution and colour-scale images), published in July 2012, in English (including a foreword and an executive summary in German); the original file is available via the MPIfR homepage.

    Status Report of the DPHEP Study Group: Towards a Global Effort for Sustainable Data Preservation in High Energy Physics

    Data from high-energy physics (HEP) experiments are collected with significant financial and human effort and are mostly unique. An inter-experimental study group on HEP data preservation and long-term analysis was convened as a panel of the International Committee for Future Accelerators (ICFA). The group was formed by large collider-based experiments and investigated the technical and organisational aspects of HEP data preservation. An intermediate report was released in November 2009 addressing the general issues of data preservation in HEP. This paper includes and extends the intermediate report. It provides an analysis of the research case for data preservation and a detailed description of the various projects at experiment, laboratory and international levels. In addition, the paper provides a concrete proposal for an international organisation in charge of data management and policies in high-energy physics.

    Ringo: Interactive Graph Analytics on Big-Memory Machines

    We present Ringo, a system for analysis of large graphs. Graphs provide a way to represent and analyze systems of interacting objects (people, proteins, webpages) with edges between the objects denoting interactions (friendships, physical interactions, links). Mining graphs provides valuable insights about individual objects as well as the relationships among them. In building Ringo, we take advantage of the fact that machines with large memory and many cores are widely available and relatively affordable. This allows us to build an easy-to-use, interactive, high-performance graph analytics system. Graphs also need to be built from input data, which often resides in relational tables. Thus, Ringo provides rich functionality for transforming raw input data tables into various kinds of graphs. Furthermore, Ringo provides over 200 graph analytics functions that can then be applied to constructed graphs. We show that a single big-memory machine provides a very attractive platform for performing analytics on all but the largest graphs, as it offers excellent performance and ease of use compared to alternative approaches. With Ringo, we also demonstrate how to integrate graph analytics with an iterative process of trial-and-error data exploration and rapid experimentation, common in data mining workloads.
    Comment: 6 pages, 2 figures
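    The table-to-graph step that the abstract describes can be sketched in plain standard-library Python (this is not Ringo's own engine; the toy user-page table, the user-user projection, and the helper names are illustrative assumptions):

    ```python
    from collections import defaultdict

    # A small relational table: each row links a user to a page they edited.
    # Systems like Ringo build graphs from such tables; here we project a
    # bipartite user-page table onto a user-user graph in which two users
    # are linked if they edited a common page.
    edits = [
        ("alice", "p1"), ("bob", "p1"), ("bob", "p2"),
        ("carol", "p2"), ("dave", "p3"),
    ]

    def project_user_graph(rows):
        """Return an adjacency dict linking users who share a page."""
        by_page = defaultdict(set)
        for user, page in rows:
            by_page[page].add(user)
        adj = defaultdict(set)
        for users in by_page.values():
            for u in users:
                adj[u] |= users - {u}  # isolated users get an empty set
        return adj

    def components(adj):
        """Connected components via iterative depth-first search."""
        seen, comps = set(), []
        for start in adj:
            if start in seen:
                continue
            stack, comp = [start], set()
            while stack:
                u = stack.pop()
                if u in comp:
                    continue
                comp.add(u)
                stack.extend(adj[u] - comp)
            seen |= comp
            comps.append(comp)
        return comps

    g = project_user_graph(edits)
    print(sorted(g["bob"]))    # ['alice', 'carol']
    print(len(components(g)))  # 2: {alice, bob, carol} and {dave}
    ```

    On a big-memory machine this whole pattern, table in, graph out, analytics applied in place, fits in RAM for all but the largest datasets, which is the design point the paper argues for.
    
    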

    The s Process: Nuclear Physics, Stellar Models, Observations

    Nucleosynthesis in the s process takes place in the He-burning layers of low-mass AGB stars and during the He- and C-burning phases of massive stars. The s process contributes about half of the element abundances between Cu and Bi in solar system material. Depending on stellar mass and metallicity, the resulting s-abundance patterns exhibit characteristic features, which provide comprehensive information for our understanding of the stellar life cycle and the chemical evolution of galaxies. The rapidly growing body of detailed abundance observations, in particular for AGB and post-AGB stars, for objects in binary systems, and for the very faint metal-poor population, represents exciting challenges and constraints for stellar model calculations. Based on updated and improved nuclear physics data for the s-process reaction network, current models aim at an ab initio solution of the stellar physics related to convection and mixing processes. Progress in the intimately related areas of observations, nuclear and atomic physics, and stellar modeling is reviewed, and the corresponding interplay is illustrated by the general abundance patterns of the elements beyond iron and by the effect of sensitive branching points along the s-process path. The strong variations of the s-process efficiency with metallicity also bear interesting consequences for Galactic chemical evolution.
    Comment: 53 pages, 20 figures, 3 tables; Reviews of Modern Physics, accepted