CERN Storage Systems for Large-Scale Wireless
The project evaluates the use of CERN computing infrastructure for next-generation sensor-network data analysis. The proposed system allows the simulation of a large-scale sensor array for traffic analysis, streaming its data efficiently to CERN storage systems. The data are made available for offline and quasi-online analysis, enabling both long-term planning and fast reaction to changes in the environment.
Montage: a grid portal and software toolkit for science-grade astronomical image mosaicking
Montage is a portable software toolkit for constructing custom, science-grade
mosaics by composing multiple astronomical images. The mosaics constructed by
Montage preserve the astrometry (position) and photometry (intensity) of the
sources in the input images. The mosaic to be constructed is specified by the
user in terms of a set of parameters, including dataset and wavelength to be
used, location and size on the sky, coordinate system and projection, and
spatial sampling rate. Many astronomical datasets are massive, and are stored
in distributed archives that are, in most cases, remote with respect to the
available computational resources. Montage can be run on both single- and
multi-processor computers, including clusters and grids. Standard grid tools
are used to run Montage in the case where the data or computers used to
construct a mosaic are located remotely on the Internet. This paper describes
the architecture, algorithms, and usage of Montage as both a software toolkit
and as a grid portal. Timing results are provided to show how Montage
performance scales with number of processors on a cluster computer. In
addition, we compare the performance of two methods of running Montage in
parallel on a grid.
Comment: 16 pages, 11 figures
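The abstract above describes a mosaic being specified by a set of user parameters (dataset, wavelength, sky location and size, coordinate system, projection, sampling rate). A minimal sketch of such a specification is given below; the key names and the validation helper are illustrative assumptions, not Montage's actual interface:

```python
# Hypothetical specification of a mosaic request, mirroring the parameters
# listed in the abstract. Key names and values are illustrative only.
mosaic_spec = {
    "dataset": "2MASS",            # survey to draw input images from
    "wavelength": "K",             # band/wavelength selection
    "center": (266.4, -29.0),      # (RA, Dec) of the region, in degrees
    "size_deg": 1.0,               # size of the region on the sky
    "coord_system": "equatorial",  # coordinate system of the output
    "projection": "TAN",           # output projection
    "sampling_arcsec": 1.0,        # spatial sampling rate of the mosaic
}

def validate_spec(spec):
    """Check that a mosaic specification carries every required parameter."""
    required = {"dataset", "wavelength", "center", "size_deg",
                "coord_system", "projection", "sampling_arcsec"}
    missing = required - spec.keys()
    if missing:
        raise ValueError(f"missing parameters: {sorted(missing)}")
    if spec["size_deg"] <= 0 or spec["sampling_arcsec"] <= 0:
        raise ValueError("size and sampling must be positive")
    return True
```

Such a declarative request is what lets the toolkit decide transparently whether to fetch remote data and run on a single machine, a cluster, or a grid.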
Data Driven Discovery in Astrophysics
We review some aspects of the current state of data-intensive astronomy, its
methods, and some outstanding data analysis challenges. Astronomy is at the
forefront of "big data" science, with exponentially growing data volumes and
data rates, and an ever-increasing complexity, now entering the Petascale
regime. Telescopes and observatories from both ground and space, covering a
full range of wavelengths, feed the data via processing pipelines into
dedicated archives, where they can be accessed for scientific analysis. Most of
the large archives are connected through the Virtual Observatory framework,
which provides interoperability standards and services and effectively
constitutes a global data grid for astronomy. Making discoveries in this
overabundance of data requires the application of novel machine-learning tools.
We describe some recent examples of such applications.
Comment: Keynote talk in the proceedings of the ESA-ESRIN Conference: Big Data from Space 2014, Frascati, Italy, November 12-14, 2014; 8 pages, 2 figures
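As a toy illustration of the machine-learning-driven discovery described above, the sketch below flags anomalous measurements in a stream of source brightnesses using a robust median-based cut. The data and threshold are invented; real survey pipelines use far richer features and models:

```python
from statistics import median

def flag_anomalies(fluxes, threshold=3.0):
    """Return indices of measurements deviating from the median by more
    than `threshold` robust sigmas -- a toy stand-in for the novelty
    detection applied to large astronomical data streams."""
    med = median(fluxes)
    mad = median(abs(f - med) for f in fluxes)  # median absolute deviation
    cut = threshold * 1.4826 * mad  # 1.4826 scales MAD to a Gaussian sigma
    return [i for i, f in enumerate(fluxes) if abs(f - med) > cut]

# Synthetic "light curve": steady flux with one strong transient at index 5.
fluxes = [10.0, 10.1, 9.9, 10.2, 9.8, 25.0, 10.0, 10.1]
print(flag_anomalies(fluxes))  # [5]
```

The median absolute deviation is used instead of the standard deviation so that a single strong outlier does not inflate the noise estimate and hide itself.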
AstroGrid-D: Grid Technology for Astronomical Science
We present status and results of AstroGrid-D, a joint effort of
astrophysicists and computer scientists to employ grid technology for
scientific applications. AstroGrid-D provides access to a network of
distributed machines through a set of commands as well as software interfaces. It
offers simple use of compute and storage facilities and supports the scheduling
and monitoring of compute tasks and data management. It is based on the Globus Toolkit middleware
(GT4). Chapter 1 describes the context which led to the demand for advanced
software solutions in Astrophysics, and we state the goals of the project. We
then present characteristic astrophysical applications that have been
implemented on AstroGrid-D in chapter 2. We describe simulations of different
complexity, compute-intensive calculations running on multiple sites, and
advanced applications for specific scientific purposes, such as a connection to
robotic telescopes. These examples show how grid execution improves, for
instance, the scientific workflow. Chapter 3 explains the software tools and
services that we adapted or newly developed. Section 3.1 is focused on the
administrative aspects of the infrastructure, to manage users and monitor
activity. Section 3.2 characterises the central components of our architecture:
The AstroGrid-D information service to collect and store metadata, a file
management system, the data management system, and a job manager for automatic
submission of compute tasks. We summarise the successfully established
infrastructure in chapter 4, concluding with our future plans to establish
AstroGrid-D as a platform for modern e-Astronomy.
Comment: 14 pages, 12 figures. Subjects: data analysis, image processing, robotic telescopes, simulations, grid. Accepted for publication in New Astronomy
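The abstract mentions a job manager for automatic submission of compute tasks across distributed sites. The sketch below models that idea only in miniature: round-robin dispatch with per-job status tracking. The class, its interface, and the site names are invented for illustration; the real system submits through Globus Toolkit (GT4) services, not an in-memory queue:

```python
import itertools

class JobManager:
    """Toy model of a grid job manager: round-robin dispatch of compute
    tasks over a pool of sites, with per-job status tracking."""

    def __init__(self, sites):
        self._sites = itertools.cycle(sites)  # rotate through available sites
        self._next_id = itertools.count(1)
        self.jobs = {}  # job id -> {"task", "site", "status"}

    def submit(self, task):
        """Assign the task to the next site and record it as RUNNING."""
        job_id = next(self._next_id)
        self.jobs[job_id] = {"task": task, "site": next(self._sites),
                             "status": "RUNNING"}
        return job_id

    def monitor(self, job_id):
        """Report the current status of a submitted job."""
        return self.jobs[job_id]["status"]

    def complete(self, job_id):
        self.jobs[job_id]["status"] = "DONE"

# Illustrative site names, not the project's actual resource list.
mgr = JobManager(["siteA.example", "siteB.example", "siteC.example"])
ids = [mgr.submit(f"simulation-{n}") for n in range(4)]
print([mgr.jobs[i]["site"] for i in ids])
```

Four submissions over three sites wrap around, so the fourth job lands back on the first site.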
Cherenkov Telescope Array Data Management
Very High Energy gamma-ray astronomy with the Cherenkov Telescope Array (CTA)
is evolving towards the model of a public observatory. Handling, processing and
archiving the large amount of data generated by the CTA instruments and
delivering scientific products are some of the challenges in designing the CTA
Data Management. The participation of scientists from within the CTA Consortium and
from the greater worldwide scientific community necessitates a sophisticated
scientific analysis system capable of providing unified and efficient user
access to data, software and computing resources. Data Management is designed
to respond to three main issues: (i) the treatment and flow of data from remote
telescopes; (ii) "big-data" archiving and processing; (iii) and open data
access. In this communication the overall technical design of the CTA Data
Management, current major developments and prototypes are presented.
Comment: 8 pages, 2 figures. In Proceedings of the 34th International Cosmic Ray Conference (ICRC2015), The Hague, The Netherlands. All CTA contributions at arXiv:1508.0589
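The three responsibilities listed in the abstract, (i) treatment and flow of data from remote telescopes, (ii) archiving and processing, and (iii) open data access, can be pictured as a staged pipeline. The toy sketch below is a conceptual illustration only; every function, field, and value is invented and bears no relation to CTA's actual data model:

```python
def acquire(raw_events):
    """(i) Treat and flow data from remote telescopes: keep well-formed events."""
    return [e for e in raw_events if e.get("telescope") and "signal" in e]

def archive(events, store):
    """(ii) Archive and process: index event signals by telescope for analysis."""
    for e in events:
        store.setdefault(e["telescope"], []).append(e["signal"])
    return store

def open_access(store, telescope):
    """(iii) Open data access: serve archived signals to any requester."""
    return store.get(telescope, [])

raw = [{"telescope": "tel-1", "signal": 4.2},
       {"telescope": "tel-3", "signal": 1.7},
       {"signal": 9.9}]  # malformed: no telescope id, dropped at stage (i)
store = archive(acquire(raw), {})
print(open_access(store, "tel-1"))  # [4.2]
```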
Using visual analytics to develop situation awareness in astrophysics
We present a novel collaborative visual analytics application for cognitively overloaded users in the astrophysics domain. The system was developed for scientists who need to analyze heterogeneous, complex data under time pressure and to make time-critical predictions and decisions rapidly and correctly under a constant influx of changing data. The Sunfall Data Taking system uses several novel visualization and analysis techniques to enable a team of geographically distributed domain specialists to maneuver a custom-built instrument effectively and remotely under challenging operational conditions. Sunfall Data Taking has been in production use for two years by a major international astrophysics collaboration (the largest-data-volume supernova search currently in operation) and has substantially improved the operational efficiency of its users. We describe the design process carried out by an interdisciplinary team, the system architecture, and the results of an informal usability evaluation of the production system by domain experts in the context of Endsley's three levels of situation awareness.
An Innovative Workspace for The Cherenkov Telescope Array
The Cherenkov Telescope Array (CTA) is an initiative to build the
next-generation ground-based gamma-ray observatory. We present a prototype
workspace developed at INAF that aims at providing innovative solutions for the
CTA community. The workspace leverages open source technologies providing web
access to a set of tools widely used by the CTA community. Two different user
interaction models, connected to an authentication and authorization
infrastructure, have been implemented in this workspace. The first one is a
workflow management system accessed via a science gateway (based on the Liferay
platform) and the second one is an interactive virtual desktop environment. The
integrated workflow system makes it possible to run applications used in
astronomy and physics research on distributed computing infrastructures
(ranging from clusters to grids and clouds). The interactive desktop
environment allows many software packages to be used through their native
graphical user interfaces, without any installation on local desktops. The science gateway and the
interactive desktop environment are connected to the authentication and
authorization infrastructure, composed of a Shibboleth identity provider and a
Grouper authorization solution. The attributes released by Grouper are consumed
by the science gateway to authorize access to specific web resources, and the
role-management mechanism in Liferay provides the attribute-to-role mapping.
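The attribute-to-role mapping described in this abstract can be sketched as a small lookup: group attributes released by the authorization service are translated into portal roles, and each web resource lists the roles it admits. Every group name, role, and resource path below is invented for illustration; this is not CTA's or Liferay's actual configuration:

```python
# Hypothetical mapping from released group attributes to portal roles.
ATTRIBUTE_ROLE_MAP = {
    "cta:workspace:users": "portal-user",
    "cta:workspace:analysts": "workflow-runner",
    "cta:workspace:admins": "administrator",
}

# Hypothetical per-resource access policy: roles allowed on each resource.
RESOURCE_ROLES = {
    "/desktop": {"portal-user", "workflow-runner", "administrator"},
    "/workflows/submit": {"workflow-runner", "administrator"},
    "/admin": {"administrator"},
}

def roles_for(released_attributes):
    """Map released group attributes onto portal roles (unknown groups ignored)."""
    return {ATTRIBUTE_ROLE_MAP[a] for a in released_attributes
            if a in ATTRIBUTE_ROLE_MAP}

def authorize(released_attributes, resource):
    """Grant access when any mapped role is allowed on the resource."""
    return bool(roles_for(released_attributes) & RESOURCE_ROLES.get(resource, set()))

print(authorize(["cta:workspace:analysts"], "/workflows/submit"))  # True
print(authorize(["cta:workspace:users"], "/admin"))                # False
```

Keeping the attribute-to-role table separate from the per-resource policy is what lets the identity infrastructure (Shibboleth/Grouper) and the portal (Liferay) evolve independently.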