The Dark Energy Survey Data Management System
The Dark Energy Survey collaboration will study cosmic acceleration with a
5000 deg² grizY survey in the southern sky over 525 nights from 2011-2016. The
DES data management (DESDM) system will be used to process and archive these
data and the resulting science-ready data products. The DESDM system consists
of an integrated archive, a processing framework, an ensemble of astronomy
codes and a data access framework. We are developing the DESDM system for
operation in the high performance computing (HPC) environments at NCSA and
Fermilab. Operating the DESDM system in an HPC environment offers both speed
and flexibility. We will employ it for our regular nightly processing needs,
and for more compute-intensive tasks such as large scale image coaddition
campaigns, extraction of weak lensing shear from the full survey dataset, and
massive seasonal reprocessing of the DES data. Data products will be available
to the Collaboration and later to the public through a virtual-observatory
compatible web portal. Our approach leverages investments in publicly available
HPC systems, greatly reducing hardware and maintenance costs to the project,
which must deploy and maintain only the storage, database platforms and
orchestration and web portal nodes that are specific to DESDM. In Fall 2007, we
tested the current DESDM system on both simulated and real survey data. We used
TeraGrid to process 10 simulated DES nights (3 TB of raw data), ingesting and
calibrating approximately 250 million objects into the DES Archive database. We
also used DESDM to process and calibrate over 50 nights of survey data acquired
with the Mosaic2 camera. Comparison to truth tables in the case of the
simulated data and internal crosschecks in the case of the real data indicate
that astrometric and photometric data quality is excellent.
Comment: To be published in the proceedings of the SPIE conference on
Astronomical Instrumentation (held in Marseille in June 2008). This preprint
is made available with the permission of SPIE. Further information together
with preprint containing full quality images is available at
http://desweb.cosmology.uiuc.edu/wik
Design and implementation of a filter engine for semantic web documents
This report describes our project, which addresses the challenge of change in the semantic web. Some studies of the so-called adaptive semantic web already exist, for example those applying inference rules. In this study we apply the technology of Event Notification Systems (ENS): treating changes as events, we developed a notification system for such events.
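The filter-engine idea the abstract describes can be illustrated with a minimal publish/subscribe sketch: subscribers register predicates over document-change events, and the engine notifies only those whose filter matches. All names here (ChangeEvent, FilterEngine) are illustrative assumptions, not the report's actual API.

```python
# Minimal sketch of an event-notification filter engine for
# semantic-web document changes (illustrative, not the paper's API).
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class ChangeEvent:
    """A change to a semantic-web document: resource, property, new value."""
    resource: str
    prop: str
    new_value: str

@dataclass
class Subscription:
    predicate: Callable[[ChangeEvent], bool]
    inbox: List[ChangeEvent] = field(default_factory=list)

class FilterEngine:
    def __init__(self):
        self._subs: List[Subscription] = []

    def subscribe(self, predicate: Callable[[ChangeEvent], bool]) -> Subscription:
        sub = Subscription(predicate)
        self._subs.append(sub)
        return sub

    def publish(self, event: ChangeEvent) -> None:
        # Deliver the event only to subscribers whose filter matches.
        for sub in self._subs:
            if sub.predicate(event):
                sub.inbox.append(event)

engine = FilterEngine()
# A subscriber interested only in title changes.
sub = engine.subscribe(lambda e: e.prop == "dc:title")
engine.publish(ChangeEvent("ex:Doc1", "dc:title", "New Title"))
engine.publish(ChangeEvent("ex:Doc1", "dc:creator", "Alice"))
# sub.inbox now holds only the dc:title change.
```

The design choice mirrors the abstract's framing: changes are modeled as events, and notification is driven entirely by per-subscriber filters rather than by polling the documents.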
SEAD Virtual Archive: Building a Federation of Institutional Repositories for Long Term Data Preservation
Major research universities are grappling with their response to the deluge of scientific data emerging from research by their faculty. Many are looking to their libraries and the institutional repository for a solution. Scientific data introduces substantial challenges that the document-based institutional repository may be ill-suited to handle. The Sustainable Environment - Actionable Data (SEAD) Virtual Archive specifically addresses the challenges of “long tail” scientific data. In this paper, we propose requirements, policy and architecture to support not only the preservation of scientific data today using institutional repositories, but also rich access to and use of that data into the future
Remote real-time monitoring of subsurface landfill gas migration
The cost of monitoring greenhouse gas emissions from landfill sites is of major concern for regulatory authorities. The current monitoring procedure is recognised as labour intensive, requiring agency inspectors to physically travel to perimeter borehole wells in rough terrain and manually measure gas concentration levels with expensive hand-held instrumentation. In this article we present a cost-effective and efficient system for remotely monitoring landfill subsurface migration of methane and carbon dioxide concentration levels. Based purely on an autonomous sensing architecture, the proposed sensing platform was capable of performing complex analytical measurements in situ and successfully communicating the data remotely to a cloud database. A web tool was developed to present the sensed data to relevant stakeholders. We report our experiences in deploying such an approach in the field over a period of approximately 16 months
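The in-situ logic such a platform might run can be sketched briefly: record gas-concentration readings, flag exceedances of a trigger level, and batch the results into records a cloud database could ingest. The threshold values and field names below are illustrative assumptions, not figures from the deployment described.

```python
# Hedged sketch of autonomous landfill-gas monitoring logic:
# take borehole readings, mark threshold exceedances, and prepare
# a batch of records for remote upload to a cloud database.
from dataclasses import dataclass
from typing import Dict, List

# Illustrative trigger levels (% volume) -- assumed, not regulatory values.
THRESHOLDS = {"CH4": 1.0, "CO2": 1.5}

@dataclass
class Reading:
    well_id: str
    gas: str               # "CH4" or "CO2"
    concentration: float   # % volume

def prepare_upload(readings: List[Reading]) -> List[Dict]:
    """Convert readings into upload records, marking any that exceed
    the assumed trigger level for that gas."""
    records = []
    for r in readings:
        records.append({
            "well": r.well_id,
            "gas": r.gas,
            "value": r.concentration,
            "alert": r.concentration > THRESHOLDS[r.gas],
        })
    return records

batch = prepare_upload([
    Reading("BH-07", "CH4", 0.4),
    Reading("BH-07", "CO2", 2.1),  # exceeds the assumed CO2 trigger
])
```

In a real deployment the batch would be transmitted over a low-power radio or cellular link rather than returned in memory; the sketch only shows the measurement-to-record step performed in situ.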
The SPHERE data center: a reference for high contrast imaging processing
The objective of the SPHERE Data Center is to optimize the scientific return
of SPHERE at the VLT, by providing optimized reduction procedures, services to
users and publicly available reduced data. This paper describes our motivation,
the implementation of the service (partners, infrastructure and developments),
services, description of the on-line data, and future developments. The SPHERE
Data Center is operational and has already delivered reduced data to many
observers with short turnaround. The first public reduced data were made
available in 2017. The SPHERE Data Center is building strong expertise in
SPHERE data and is well placed to offer improved reduction procedures and
new reduced data products in the future.
Comment: SF2A proceedings
1st INCF Workshop on Sustainability of Neuroscience Databases
The goal of the workshop was to discuss issues related to the sustainability of neuroscience databases, identify problems and propose solutions, and formulate recommendations to the INCF. The report summarizes the discussions of invited participants from the neuroinformatics community as well as from other disciplines where sustainability issues have already been approached. The recommendations for the INCF involve rating, ranking, and supporting database sustainability
Technology solutions and Oracle tools used in building executive information systems (EIS)
The database management system, and the facilities it offers, plays a critical role in the success and performance of an executive information system. From this point of view, the analysis considers the facilities for working with advanced databases and data warehouses; the implementation of OLAP and data mining functionality; the integration of data and applications from different sources; the process of extracting, transforming and loading these data into the target warehouses; ease of administration; and the tools offered for developing interfaces. One important point of this analysis is query performance, both on operational databases and when extracting data from data warehouses.
Keywords: executive information system, OLAP, data mining
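The extraction, transformation and loading process the abstract refers to can be sketched in miniature: pull rows from an operational source, cleanse and conform them, and load them into a warehouse-style structure together with a simple rollup of the kind an OLAP query would hit. Pure-Python stand-ins are used here instead of Oracle tooling; all names and values are illustrative.

```python
# Minimal ETL sketch: extract from an operational source, transform
# (type and normalize fields), load into a "fact table" plus a rollup.
from typing import Dict, List

def extract() -> List[Dict]:
    # Stand-in for a query against an operational database.
    return [
        {"id": 1, "amount": "100.50", "region": " north "},
        {"id": 2, "amount": "250.00", "region": "SOUTH"},
    ]

def transform(rows: List[Dict]) -> List[Dict]:
    # Cleanse and conform: typed amounts, normalized region labels.
    return [
        {"id": r["id"],
         "amount": float(r["amount"]),
         "region": r["region"].strip().lower()}
        for r in rows
    ]

def load(rows: List[Dict], warehouse: Dict) -> None:
    # Append into the target fact table and maintain a simple
    # aggregate, the kind of rollup an OLAP query would read.
    warehouse["fact_sales"].extend(rows)
    for r in rows:
        warehouse["sales_by_region"][r["region"]] = (
            warehouse["sales_by_region"].get(r["region"], 0.0) + r["amount"])

warehouse = {"fact_sales": [], "sales_by_region": {}}
load(transform(extract()), warehouse)
```

The separation into three functions mirrors the ETL stages the abstract lists; in practice each stage would be backed by database connectivity and bulk-loading tools rather than in-memory lists.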