
    A photometricity and extinction monitor at the Apache Point Observatory

    An unsupervised software "robot" that automatically and robustly reduces and analyzes CCD observations of photometric standard stars is described. The robot measures extinction coefficients and other photometric parameters in real time and, more carefully, on the next day. It also reduces and analyzes data from an all-sky 10 μm camera to detect clouds; photometric data taken during cloudy periods are automatically rejected. The robot reports its findings back to observers and data analysts via the World-Wide Web. It can be used to assess photometricity and to build data on site conditions. The robot's automated and uniform site monitoring represents a minimum standard for any observing site with queue scheduling, a public data archive, or likely participation in any future National Virtual Observatory.
    Comment: accepted for publication in A
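    As a hedged illustration of the kind of fit such a robot performs, the sketch below derives a nightly extinction coefficient from standard-star photometry using the usual linear model m_inst - m_cat = zp + k * X, where X is airmass. The function name, inputs, and numbers are illustrative assumptions, not the paper's implementation.

    ```python
    # Minimal sketch: fitting a nightly extinction coefficient from
    # standard-star observations. The linear model dm = zp + k * X is the
    # textbook approach; the paper's actual pipeline is more careful.
    import numpy as np

    def fit_extinction(airmass, m_inst, m_cat):
        """Least-squares fit of instrumental-minus-catalog magnitude vs. airmass.

        Returns (zeropoint, extinction coefficient k in mag/airmass)."""
        dm = np.asarray(m_inst) - np.asarray(m_cat)
        X = np.asarray(airmass)
        # polyfit with degree 1 returns [slope, intercept].
        k, zp = np.polyfit(X, dm, 1)
        return zp, k

    # Example: three observations of one standard star at increasing airmass.
    zp, k = fit_extinction([1.0, 1.5, 2.0],
                           [18.30, 18.38, 18.46],
                           [17.00, 17.00, 17.00])
    print(f"zeropoint = {zp:.2f} mag, extinction k = {k:.2f} mag/airmass")
    ```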

    Planetary Science Virtual Observatory architecture

    In the framework of the Europlanet-RI program, a prototype Virtual Observatory dedicated to Planetary Science was defined. Most of the activity was devoted to elaborating standards for retrieving and visualizing data in this field, and to providing lightweight procedures for teams who wish to contribute on-line data services. The architecture of this VO system and the selected solutions are presented here, together with existing demonstrators.
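    As a hedged sketch of what such standardized retrieval can look like, the snippet below issues an ADQL query to a TAP-style service of the kind used in planetary VO work (EPN-TAP exposes an epn_core table). The service URL is a placeholder, not a real Europlanet endpoint.

    ```python
    # Illustrative TAP synchronous query; REQUEST/LANG/QUERY are standard
    # TAP parameters. The endpoint below is hypothetical.
    import requests

    TAP_SYNC_URL = "http://example.org/tap/sync"  # placeholder service

    query = """
    SELECT TOP 10 granule_uid, target_name, time_min, time_max
    FROM epn_core
    WHERE target_name = 'Mars'
    """

    resp = requests.get(TAP_SYNC_URL, params={
        "REQUEST": "doQuery",
        "LANG": "ADQL",
        "FORMAT": "votable",
        "QUERY": query,
    })
    resp.raise_for_status()
    print(resp.text[:500])  # VOTable XML describing the matching granules
    ```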

    SONoMA: A Service Oriented Network Measurement Architecture

    Distributed network measurements are essential to characterize the structure, dynamics, and operational state of the Internet. Although several such systems have been created in recent decades, easy access to these infrastructures and the orchestration of complex measurements remain unsolved problems. We propose a system architecture that combines the flexibility of mature network measurement facilities such as PlanetLab or ETOMIC with the general accessibility and popularity of public services like Web-based bandwidth measurement or traceroute servers. To realize these requirements we developed a network measurement platform, called SONoMA, based on Web Services and the basic principles of SOA, a well-established paradigm in distributed business application development. Our approach makes it possible to perform atomic and complex network measurements in real time, handles heterogeneous measurement devices, automatically stores the results in a public database, and protects against malicious users. Furthermore, SONoMA is not only a tool for network researchers: it opens the door to novel applications and services requiring real-time, large-scale network measurements.
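    A hedged sketch of the usage pattern this describes: an authenticated client submits an atomic measurement through a web-service API and later retrieves the stored result. Every endpoint, method, and field name below is invented for illustration; the real SONoMA interface is defined by its own service description.

    ```python
    # Hypothetical client for a SOA-style measurement platform. Names are
    # illustrative assumptions, not the actual SONoMA API.
    import time
    import requests

    BASE = "http://example.org/sonoma/api"  # placeholder endpoint

    def run_traceroute(session_key, source_agent, destination):
        # Submit an atomic measurement on behalf of an authenticated user.
        job = requests.post(f"{BASE}/traceroute", json={
            "sessionKey": session_key,
            "sourceAgent": source_agent,
            "destination": destination,
        }).json()
        # Poll until the platform has stored the result in its database.
        while True:
            result = requests.get(f"{BASE}/results/{job['jobId']}").json()
            if result["status"] == "finished":
                return result["hops"]
            time.sleep(1)
    ```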

    Views from the coalface: chemo-sensors, sensor networks and the semantic sensor web

    Currently millions of sensors are being deployed in sensor networks across the world. These networks generate vast quantities of heterogeneous data across various levels of spatial and temporal granularity. Sensors range from single-point in situ sensors to remote satellite sensors which can cover the globe. The semantic sensor web in principle should allow for the unification of the web with the real world. In this position paper, we discuss the major challenges to this unification from the perspective of sensor developers (especially chemo-sensors) and of integrating sensor data in real-world deployments. These challenges include: (1) identifying the quality of the data; (2) heterogeneity of data sources and data transport methods; (3) integrating data streams from different sources and modalities (especially contextual information); and (4) pushing intelligence to the sensor level.
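    To make challenge (1) concrete, here is a minimal sketch of flagging raw chemo-sensor readings with quality labels before they enter a semantic sensor web; the thresholds and field names are invented assumptions, not a standard.

    ```python
    # Toy data-quality annotation for a sensor stream: flag readings that
    # fall outside the calibration range or follow a gap in the stream.
    from dataclasses import dataclass

    @dataclass
    class Reading:
        timestamp: float   # seconds since epoch
        value: float       # e.g. nitrate concentration in mg/L (assumed)
        quality: str = "unchecked"

    def flag_quality(readings, valid_range=(0.0, 50.0), max_gap_s=300.0):
        """Mark readings out of calibration range or after a data gap."""
        prev_t = None
        for r in readings:
            if not (valid_range[0] <= r.value <= valid_range[1]):
                r.quality = "out_of_range"
            elif prev_t is not None and r.timestamp - prev_t > max_gap_s:
                r.quality = "after_gap"
            else:
                r.quality = "ok"
            prev_t = r.timestamp
        return readings
    ```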

    The Dark Energy Survey Data Management System

    The Dark Energy Survey collaboration will study cosmic acceleration with a 5000 deg^2 grizY survey in the southern sky over 525 nights from 2011-2016. The DES data management (DESDM) system will be used to process and archive these data and the resulting science-ready data products. The DESDM system consists of an integrated archive, a processing framework, an ensemble of astronomy codes, and a data access framework. We are developing the DESDM system for operation in the high performance computing (HPC) environments at NCSA and Fermilab. Operating the DESDM system in an HPC environment offers both speed and flexibility. We will employ it for our regular nightly processing needs, and for more compute-intensive tasks such as large-scale image coaddition campaigns, extraction of weak lensing shear from the full survey dataset, and massive seasonal reprocessing of the DES data. Data products will be available to the Collaboration and later to the public through a virtual-observatory compatible web portal. Our approach leverages investments in publicly available HPC systems, greatly reducing hardware and maintenance costs to the project, which must deploy and maintain only the storage, database platforms, and orchestration and web portal nodes that are specific to DESDM. In Fall 2007, we tested the current DESDM system on both simulated and real survey data. We used TeraGrid to process 10 simulated DES nights (3 TB of raw data), ingesting and calibrating approximately 250 million objects into the DES Archive database. We also used DESDM to process and calibrate over 50 nights of survey data acquired with the Mosaic2 camera. Comparison to truth tables in the case of the simulated data, and internal crosschecks in the case of the real data, indicate that astrometric and photometric data quality is excellent.
    Comment: To be published in the proceedings of the SPIE conference on Astronomical Instrumentation (held in Marseille in June 2008). This preprint is made available with the permission of SPIE. Further information, together with a preprint containing full-quality images, is available at http://desweb.cosmology.uiuc.edu/wik
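    As a hedged illustration of one compute-intensive DESDM task mentioned above, the sketch below shows the core arithmetic of inverse-variance weighted image coaddition. It is a toy version under simplifying assumptions (exposures already registered to a common grid), not DESDM code.

    ```python
    # Toy inverse-variance weighted coadd of aligned exposures.
    import numpy as np

    def coadd(images, variances):
        """Coadd aligned exposures; inputs are lists of 2-D arrays of equal shape.

        Returns (coadded image, per-pixel variance of the weighted mean)."""
        imgs = np.stack(images)
        w = 1.0 / np.stack(variances)      # weight = 1 / sigma^2 per pixel
        coadded = (w * imgs).sum(axis=0) / w.sum(axis=0)
        coadd_var = 1.0 / w.sum(axis=0)    # variance of the weighted mean
        return coadded, coadd_var
    ```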