
    The Use of HepRep in GLAST

    HepRep is a generic, hierarchical format for describing graphics representables that can be augmented with physics information and relational properties. It was developed for high energy physics event display applications and is especially suited to client/server or component frameworks. The GLAST experiment, an international effort led by NASA to build a gamma-ray telescope for launch in 2006, chose HepRep to provide a flexible, extensible and maintainable framework for its event display without tying its users to any one graphics application. To support HepRep in its GAUDI infrastructure, GLAST developed a HepRep filler and builder architecture. The architecture hides the details of XML and CORBA in a set of base and helper classes, allowing physics experts to focus on what data they want to represent. GLAST has two GAUDI services: HepRepSvc, which registers HepRep fillers in a global registry and allows the HepRep to be exported to XML, and CorbaSvc, which allows the HepRep to be published through a CORBA interface and lets the client application feed commands back to GAUDI (such as starting the next event or running a GAUDI algorithm). GLAST's HepRep solution gives users a choice of client applications, WIRED (written in Java) or FRED (written in C++ and Ruby), and leaves them free to move to any future HepRep-compliant event display.
    Comment: Talk from the 2003 Computing in High Energy and Nuclear Physics conference (CHEP03), La Jolla, CA, USA, March 2003, 9 pages PDF, 15 figures. PSN THLT00
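    The filler/registry pattern the abstract describes can be illustrated with a short C++ sketch. All class and method names below are hypothetical stand-ins, not the actual GLAST or GAUDI API:

        // Hypothetical sketch of the filler/registry pattern described above;
        // names are illustrative, not the real GLAST/GAUDI interfaces.
        #include <map>
        #include <string>
        #include <iostream>

        // Base class hiding the XML/CORBA details from the physics expert.
        class HepRepFiller {
        public:
            virtual ~HepRepFiller() = default;
            // Subclasses describe only what to represent, not how it is serialized.
            virtual void fill() = 0;
        };

        // Global registry analogous to the one a service like HepRepSvc would keep.
        class FillerRegistry {
        public:
            static FillerRegistry& instance() {
                static FillerRegistry reg;
                return reg;
            }
            void add(const std::string& name, HepRepFiller* f) { fillers_[name] = f; }
            void fillAll() {
                for (auto& kv : fillers_) kv.second->fill();
            }
        private:
            std::map<std::string, HepRepFiller*> fillers_;
        };

        // A physics expert writes only this part.
        class TrackFiller : public HepRepFiller {
        public:
            void fill() override {
                std::cout << "adding track representables\n"; // stand-in for real content
            }
        };

        int main() {
            TrackFiller tracks;
            FillerRegistry::instance().add("tracks", &tracks);
            FillerRegistry::instance().fillAll(); // e.g. triggered per event by the service
        }

    The point of the pattern is that the expert writes only the subclass; serialization to XML or publication over CORBA stays inside the base classes and services.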

    Data Management and Mining in Astrophysical Databases

    We analyse the issues involved in the management and mining of astrophysical data. The traditional approach to data management in astrophysics cannot keep up with the increasing size of the data gathered by modern detectors. Automatic tools for extracting information from large datasets, i.e. data mining techniques such as clustering and classification algorithms, will assume an essential role in astrophysical research. This calls for an approach to data management based on data warehousing that emphasizes the efficiency and simplicity of data access: efficiency is obtained using multidimensional access methods, and simplicity is achieved by properly handling metadata. Clustering and classification techniques applied to large datasets pose additional requirements: computational and memory scalability with respect to the data size, and interpretability and objectivity of the clustering or classification results. In this study we address some possible solutions.
    Comment: 10 pages, LaTeX
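    As an illustration of the scalability requirement, a clustering pass can be written to touch each point exactly once and keep only O(k) state in memory, so it extends to datasets that do not fit in RAM. This is a minimal generic sketch (one Lloyd/k-means iteration), not an algorithm from the paper:

        // Memory-scalable clustering pass over streamed points; illustrative only.
        #include <vector>
        #include <cstdio>

        struct Point { double x, y; };

        // One Lloyd iteration: O(k) memory regardless of dataset size.
        void kmeans_pass(const std::vector<Point>& stream, std::vector<Point>& centers) {
            size_t k = centers.size();
            std::vector<Point> sum(k, {0, 0});
            std::vector<size_t> count(k, 0);
            for (const Point& p : stream) {          // each point touched once
                size_t best = 0;
                double bestd = 1e300;
                for (size_t c = 0; c < k; ++c) {
                    double dx = p.x - centers[c].x, dy = p.y - centers[c].y;
                    double d = dx * dx + dy * dy;
                    if (d < bestd) { bestd = d; best = c; }
                }
                sum[best].x += p.x; sum[best].y += p.y; ++count[best];
            }
            for (size_t c = 0; c < k; ++c)
                if (count[c]) centers[c] = {sum[c].x / count[c], sum[c].y / count[c]};
        }

        int main() {
            std::vector<Point> data = {{0,0},{0,1},{10,10},{10,11}};
            std::vector<Point> centers = {{0,0},{10,10}};
            kmeans_pass(data, centers);
            std::printf("centers: (%.1f,%.1f) (%.1f,%.1f)\n",
                        centers[0].x, centers[0].y, centers[1].x, centers[1].y);
        }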

    Self-Organising Networks for Classification: developing Applications to Science Analysis for Astroparticle Physics

    Physics analysis in astroparticle experiments requires the capability of recognizing new phenomena; in order to establish what is new, it is important to develop tools for automatic classification that can compare the final result with data from different detectors. A typical example is the problem of Gamma Ray Burst detection, classification, and possible association with known sources: for this task, physicists will in the coming years need tools to associate data from optical databases, from satellite experiments (EGRET, GLAST), and from Cherenkov telescopes (MAGIC, HESS, CANGAROO, VERITAS).
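    A self-organising map is one concrete form such a classification tool can take. The following is a minimal, self-contained sketch of a SOM training step; the grid size, learning rate and feature vector are arbitrary illustrative choices, not values from the paper:

        // Minimal self-organising map update step; parameters are arbitrary.
        #include <array>
        #include <cmath>
        #include <cstdio>

        constexpr int GRID = 4, DIM = 3;
        using Vec = std::array<double, DIM>;

        double dist2(const Vec& a, const Vec& b) {
            double s = 0;
            for (int i = 0; i < DIM; ++i) s += (a[i] - b[i]) * (a[i] - b[i]);
            return s;
        }

        // One training step: find the best-matching unit, pull neighbours toward input.
        void som_step(Vec (&w)[GRID][GRID], const Vec& x, double rate, double radius) {
            int bi = 0, bj = 0;
            double best = 1e300;
            for (int i = 0; i < GRID; ++i)
                for (int j = 0; j < GRID; ++j)
                    if (double d = dist2(w[i][j], x); d < best) { best = d; bi = i; bj = j; }
            for (int i = 0; i < GRID; ++i)
                for (int j = 0; j < GRID; ++j) {
                    double g = std::exp(-((i-bi)*(i-bi) + (j-bj)*(j-bj)) / (2*radius*radius));
                    for (int k = 0; k < DIM; ++k)
                        w[i][j][k] += rate * g * (x[k] - w[i][j][k]);
                }
        }

        int main() {
            Vec w[GRID][GRID] = {};            // weights start at zero for brevity
            Vec sample = {1.0, 0.5, -0.2};     // e.g. features of one detected event
            for (int t = 0; t < 100; ++t) som_step(w, sample, 0.1, 1.0);
            std::printf("w[0][0][0] = %.3f\n", w[0][0][0]);
        }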

    Grid services for the MAGIC experiment

    Exploring signals from outer space has become an observational science under fast expansion. On the basis of its advanced technology, the MAGIC telescope is the natural building block for the first large-scale ground-based high energy gamma-ray observatory. The low energy threshold for gamma-rays, together with the various background sources, leads to a considerable amount of data. The analysis will be done at different institutes spread over Europe. MAGIC therefore offers the opportunity to use Grid technology to set up a distributed computational and data-intensive analysis system with the technology available today. Benefits of Grid computing for the MAGIC telescope are presented.
    Comment: 5 pages, 1 figure, to be published in the Proceedings of the 6th International Symposium ''Frontiers of Fundamental and Computational Physics'' (FFP6), Udine (Italy), Sep. 26-29, 200

    A systematic approach to the Planck LFI end-to-end test and its application to the DPC Level 1 pipeline

    The Level 1 of the Planck LFI Data Processing Centre (DPC) is devoted to the handling of the scientific and housekeeping telemetry. It is a critical component of the Planck ground segment, which must adhere strictly to the project schedule to be ready for launch and flight operations. In order to guarantee the quality necessary to achieve the objectives of the Planck mission, the design and development of the Level 1 software followed the ESA Software Engineering Standards. A fundamental step in the software life cycle is the verification and validation of the software. The purpose of this work is to show an example of procedures, test development and analysis successfully applied to a key software project of an ESA mission. We present the end-to-end validation tests performed on the Level 1 of the LFI-DPC, detailing the methods used and the results obtained. Different approaches have been used to test the scientific and housekeeping data processing. Scientific data processing has been tested by injecting signals with known properties directly into the acquisition electronics, in order to generate a test dataset of real telemetry data and reproduce nominal conditions as closely as possible. For the housekeeping telemetry processing, validation software has been developed to inject known parameter values into a set of real housekeeping packets and compare them with the corresponding timelines generated by the Level 1. With the proposed validation and verification procedure, in which the on-board and ground processing are viewed as a single pipeline, we demonstrated that the scientific and housekeeping processing of the Planck-LFI raw data is correct and meets the project requirements.
    Comment: 20 pages, 7 figures; this paper is part of the Prelaunch status LFI papers published on JINST: http://www.iop.org/EJ/journal/-page=extra.proc5/jins
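    The housekeeping test logic reduces to injecting known values and asserting that the decoded timeline reproduces them. A hedged C++ sketch of that shape, with an invented packet layout and a trivial stand-in for the decoder under test:

        // Inject-and-compare validation sketch; packet layout and names invented.
        #include <vector>
        #include <cassert>
        #include <cstdio>

        struct HkPacket { double injected_value; };     // stand-in for a real HK packet

        // Stand-in for the Level 1 decoding step under test.
        std::vector<double> level1_decode(const std::vector<HkPacket>& pkts) {
            std::vector<double> timeline;
            for (const auto& p : pkts) timeline.push_back(p.injected_value);
            return timeline;
        }

        int main() {
            std::vector<double> known = {1.5, 2.5, 3.5};    // injected reference values
            std::vector<HkPacket> pkts;
            for (double v : known) pkts.push_back({v});

            std::vector<double> timeline = level1_decode(pkts);
            assert(timeline.size() == known.size());
            for (size_t i = 0; i < known.size(); ++i)
                assert(timeline[i] == known[i]);            // end-to-end agreement
            std::puts("HK validation passed");
        }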

    Off-line radiometric analysis of Planck/LFI data

    The Planck Low Frequency Instrument (LFI) is an array of 22 pseudo-correlation radiometers on board the Planck satellite that measures temperature and polarization anisotropies in the Cosmic Microwave Background (CMB) in three frequency bands (30, 44 and 70 GHz). To calibrate and verify the performance of the LFI, a software suite named LIFE has been developed. Its aim is to provide a common platform for analyzing the results of the tests performed on the single components of the instrument (RCAs, Radiometric Chain Assemblies) and on the integrated Radiometric Array Assembly (RAA). Moreover, its analysis tools are designed to be used during flight as well, to produce periodic reports on the status of the instrument. The LIFE suite has been developed using a multi-layered, cross-platform approach. It implements a number of analysis modules written in RSI IDL, each accessing the data through a portable and heavily optimized library of functions written in C and C++. One of the most important features of LIFE is its ability to run the same data analysis code on both ground test data and real flight data as input. The LIFE software suite has been successfully used during the RCA/RAA tests and the Planck Integrated System Tests. Moreover, the software passed the verification for its in-flight use during the System Operations Verification Tests, held in October 2008.
    Comment: Planck LFI technical papers published by JINST: http://www.iop.org/EJ/journal/-page=extra.proc5/1748-022
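    Running the same analysis code on ground-test and flight data suggests an abstraction over the data source. A minimal sketch of that idea with hypothetical class names; the real LIFE layering (IDL modules on top of a C/C++ library) is not reproduced here:

        // Hypothetical data-source abstraction; names are invented for illustration.
        #include <memory>
        #include <vector>
        #include <cstdio>

        class DataSource {                       // common interface for both inputs
        public:
            virtual ~DataSource() = default;
            virtual std::vector<double> readChunk() = 0;
        };

        class GroundTestSource : public DataSource {
        public:
            std::vector<double> readChunk() override { return {1.0, 1.1, 0.9}; }
        };

        class FlightSource : public DataSource {
        public:
            std::vector<double> readChunk() override { return {0.8, 1.2, 1.0}; }
        };

        // The analysis module never knows which source it is running on.
        double mean(DataSource& src) {
            auto d = src.readChunk();
            double s = 0;
            for (double v : d) s += v;
            return d.empty() ? 0 : s / d.size();
        }

        int main(int argc, char**) {
            std::unique_ptr<DataSource> src;
            if (argc > 1) src = std::make_unique<FlightSource>();     // in-flight use
            else          src = std::make_unique<GroundTestSource>(); // RCA/RAA tests
            std::printf("mean = %.3f\n", mean(*src));
        }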

    Revised planet brightness temperatures using the Planck/LFI 2018 data release

    Aims. We present new estimates of the brightness temperatures of Jupiter, Saturn, Uranus, and Neptune based on the measurements carried out in 2009-2013 by Planck/LFI at 30, 44, and 70 GHz and released to the public in 2018. This work extends the results presented in the 2013 and 2015 Planck/LFI Calibration Papers, based on the data acquired in 2009-2011. Methods. Planck observed each planet up to eight times during the nominal mission. We processed time-ordered data from the 22 LFI radiometers to derive planet antenna temperatures for each planet and transit. We accounted for the beam shape, radiometer bandpasses, and several systematic effects. We compared our results with the results from the ninth year of WMAP, Planck/HFI observations, and existing data and models for planetary microwave emissivity. Results. For Jupiter we obtain Tb = 144.9, 159.8, 170.5 K (±0.2 K at 1σ, with temperatures expressed on the Rayleigh-Jeans scale) at 30, 44 and 70 GHz, respectively, or equivalently a band-averaged Planck temperature Tb(ba) = 144.7, 160.3, 171.2 K, in good agreement with WMAP and existing models. A slight excess at 30 GHz with respect to models is interpreted as an effect of synchrotron emission. Our measurements for Saturn agree with the results from WMAP: for the rings, Tb = 9.2 ± 1.4, 12.6 ± 2.3, 16.2 ± 0.8 K, while for the disc we obtain Tb = 140.0 ± 1.4, 147.2 ± 1.2, 150.2 ± 0.4 K, or equivalently Tb(ba) = 139.7, 147.8, 151.0 K. Our measurements for Uranus (Tb = 152 ± 6, 145 ± 3, 132.0 ± 2 K, or Tb(ba) = 152, 145, 133 K) and Neptune (Tb = 154 ± 11, 148 ± 9, 128 ± 3 K, or Tb(ba) = 154, 149, 128 K) agree closely with WMAP and previous data in the literature.
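    For reference, the scales quoted above follow standard conventions: the Rayleigh-Jeans brightness temperature is the low-frequency inversion of the measured intensity, and a band-averaged temperature weights the spectrum by the radiometer bandpass. These are textbook relations, not formulas taken from the paper:

        T_b^{\mathrm{RJ}}(\nu) = \frac{c^2}{2 k_B \nu^2} B_\nu,
        \qquad
        \bar{T}_b = \frac{\int \tau(\nu)\, T_b(\nu)\, d\nu}{\int \tau(\nu)\, d\nu}

    Here B_\nu is the measured intensity and \tau(\nu) the bandpass response of the radiometer.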

    CIWS-FW: a Customizable Instrument Workstation Software Framework for instrument-independent data handling

    The CIWS-FW is aimed at providing a common and standard solution for the storage, processing and quick-look analysis of the data acquired from scientific instruments for astrophysics. The target system is the instrument workstation, either in the context of the Electrical Ground Support Equipment for space-borne experiments or in the context of the data acquisition system for instrumentation. The CIWS-FW core includes software developed by team members for previous experiments and provides new components and tools that improve the software's reusability, configurability and extensibility. The CIWS-FW mainly consists of two packages: the data processing system and the data access system. The former provides the software components and libraries to support the acquisition, transformation, display and storage, in near real time, of a data packet stream and/or a sequence of data files generated by the instrument. The latter is a metadata and data management system, providing a reusable solution for the archiving and retrieval of the acquired data. A built-in operator GUI allows the instrument workstation to be controlled and configured. In addition, the framework provides mechanisms for error handling and logging. A web portal provides access to the CIWS-FW documentation, software repository and bug tracking tools for CIWS-FW developers. We describe the CIWS-FW architecture and summarize the project status.
    Comment: Accepted for publication in the ADASS Conference Series
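    The data processing package described above amounts to an acquire/transform/store loop over telemetry packets with pluggable stages. A minimal C++ sketch of that shape, with invented packet and stage types (not the CIWS-FW API):

        // Acquire/transform/store loop sketch; types and stages are invented here.
        #include <queue>
        #include <vector>
        #include <cstdio>

        struct Packet { int id; std::vector<unsigned char> payload; };

        int main() {
            std::queue<Packet> stream;                       // stand-in for telemetry input
            for (int i = 0; i < 3; ++i) stream.push({i, {0x01, 0x02}});

            // Pluggable stages, as a configurable framework might chain them.
            auto transform = [](Packet& p) { p.payload.push_back(0xFF); };
            auto store     = [](const Packet& p) {
                std::printf("archived packet %d (%zu bytes)\n", p.id, p.payload.size());
            };

            while (!stream.empty()) {                        // near-real-time loop
                Packet p = stream.front(); stream.pop();
                transform(p);
                store(p);
            }
        }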

    A novel background reduction strategy for high level triggers and processing in gamma-ray Cherenkov detectors

    Gamma-ray astronomy is now at the leading edge of studies related both to fundamental physics and to astrophysics. The sensitivity of gamma detectors is limited by the huge amount of background, consisting of hadronic cosmic rays (typically two to three orders of magnitude more abundant than the signal) and of accidental background in the detectors. By using the information on the temporal evolution of the Cherenkov light, the background can be reduced. We present here the results obtained within the MAGIC experiment using a new technique for background reduction. Particle showers produced by gamma rays show a different temporal distribution from showers produced by hadrons, while the background due to accidental counts shows no dependence on time. This novel strategy can increase the sensitivity of present instruments.
    Comment: 4 pages, 3 figures, Proc. of the 9th Int. Symposium "Frontiers of Fundamental and Computational Physics" (FFP9), (AIP, Melville, New York, 2008, in press
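    The timing idea can be pictured as a statistic on photon arrival times: gamma-like showers are compact in time, hadron-like ones are spread out, and accidentals are flat in time. A toy C++ illustration with arbitrary numbers and an arbitrary threshold, not the MAGIC analysis itself:

        // Toy timing-based cut: narrow arrival-time spread suggests a gamma.
        #include <vector>
        #include <cmath>
        #include <cstdio>

        double rms_spread(const std::vector<double>& t) {
            double m = 0, s = 0;
            for (double v : t) m += v;
            m /= t.size();
            for (double v : t) s += (v - m) * (v - m);
            return std::sqrt(s / t.size());
        }

        int main() {
            std::vector<double> gamma_like  = {10.0, 10.4, 10.2, 10.3};  // ns, compact
            std::vector<double> hadron_like = {10.0, 14.5, 11.8, 17.2};  // ns, spread out
            const double cut_ns = 1.0;                                   // arbitrary toy cut
            for (auto* ev : {&gamma_like, &hadron_like}) {
                double rms = rms_spread(*ev);
                std::printf("rms = %.2f ns -> %s\n", rms,
                            rms < cut_ns ? "keep (gamma candidate)" : "reject (background)");
            }
        }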

    Simulating the High Energy Gamma-ray sky seen by the GLAST Large Area Telescope

    This paper presents the simulation of the GLAST high energy gamma-ray telescope. The simulation package, written in C++, is based on the Geant4 toolkit and is integrated into a general framework used to process events. A detailed simulation of the electronic signals inside the silicon detectors has been developed, and it is used for particle tracking, which is handled by dedicated software. A single repository for the geometrical description of the detector has been realized using the XML language, and a C++ library to access this information has been designed and implemented. A new event display based on the HepRep protocol was implemented. The full simulation was used to simulate a full week of GLAST high energy gamma-ray observations. This paper outlines the contribution developed by the Italian GLAST software group.
    Comment: 6 pages, 4 figures, to be published in the Proceedings of the 6th International Symposium ''Frontiers of Fundamental and Computational Physics'' (FFP6), Udine (Italy), Sep. 26-29, 200
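    A single XML geometry repository with a C++ access library suggests a pattern where every client queries one service for detector parameters. A hedged sketch with invented names; in the real system the values would be parsed once from the XML description rather than hard-coded:

        // Hypothetical single-repository geometry accessor; names are illustrative.
        #include <map>
        #include <string>
        #include <stdexcept>
        #include <cstdio>

        class GeometryService {
        public:
            static GeometryService& instance() {
                static GeometryService g;
                return g;
            }
            double param(const std::string& volume, const std::string& key) const {
                auto it = params_.find(volume + "/" + key);
                if (it == params_.end()) throw std::runtime_error("unknown parameter");
                return it->second;
            }
        private:
            GeometryService() {
                // Stand-in for values loaded once from the XML description.
                params_["tracker/siThickness_mm"] = 0.4;
                params_["tracker/nLayers"]        = 18;
            }
            std::map<std::string, double> params_;
        };

        int main() {
            // Both the simulation and the reconstruction would query the same
            // service, keeping the geometry description in one place.
            auto& geo = GeometryService::instance();
            std::printf("Si thickness: %.2f mm, layers: %.0f\n",
                        geo.param("tracker", "siThickness_mm"),
                        geo.param("tracker", "nLayers"));
        }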