A persistent object manager for HEP
We propose to perform research in the area of a Persistent Object Manager for HEP. Persistent objects are those which continue to exist after process termination and may be accessed by other processes. It is expected that any system based on this research will work primarily, but not necessarily exclusively, in an object-oriented environment. Target applications include follow-on or replacement products for existing packages such as GEANT, HEPDB, FATMEN, HBOOK, and experiment-specific event storage code. In this respect, it is expected that more functionality will be required than simple persistence. One of the goals of the project will be to define this extra layer of functionality. Strong emphasis will be placed on the use of standards and/or existing solutions wherever possible.
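The persistence concept described above, objects outliving the process that created them and being retrievable by another process, can be illustrated with a minimal sketch. This is a toy example using Python's standard shelve module, not the proposed system; the key scheme and event fields are invented for illustration:

```python
import os
import shelve
import tempfile

# A throwaway path for the object store (illustrative only).
store_path = os.path.join(tempfile.mkdtemp(), "events")

# "Process A": create an object and make it persistent by key.
with shelve.open(store_path) as db:
    db["run42/event7"] = {"energy_gev": 13.2, "tracks": 5}

# "Process B" (later, possibly a different process): retrieve it by key.
with shelve.open(store_path) as db:
    event = db["run42/event7"]

print(event["tracks"])  # 5
```

A real HEP object manager would add the "extra layer" the proposal mentions (schema evolution, cross-process access control, references between objects) on top of this basic store-and-retrieve cycle.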
Abstract Interfaces for Data Analysis: Component Architecture for Data Analysis Tools
The fast turnover of software technologies, in particular in the domain of interactivity (covering user interfaces and visualisation), makes it difficult for a small group of people to produce complete and polished software tools before the underlying technologies make them obsolete. At the HepVis '99 workshop, a working group was formed to improve the production of software tools for data analysis in HENP. Besides promoting a distributed development organisation, one goal of the group is to systematically design a set of abstract interfaces using modern OO analysis and OO design techniques. An initial domain analysis identified several categories (components) found in typical data analysis tools: Histograms, Ntuples, Functions, Vectors, Fitter, Plotter, Analyzer and Controller. Special emphasis was put on reducing the couplings between the categories to a minimum, thus optimising the re-use and maintainability of each component individually. The interfaces have been defined in Java and C++, and implementations exist in the form of libraries and tools using C++ (Anaphe/Lizard, OpenScientist) and Java (Java Analysis Studio). A special implementation aims at accessing the Java libraries (through their abstract interfaces) from C++. This paper gives an overview of the architecture and design of the various components for data analysis as discussed in AIDA.
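The decoupling idea, analysis code written against an abstract interface with concrete implementations supplied by independent tools, can be sketched as follows. This is a toy Python analogue: the IHistogram1D name follows AIDA's interface naming, but the methods and the list-backed implementation are simplified inventions for illustration:

```python
from abc import ABC, abstractmethod

class IHistogram1D(ABC):
    """Abstract histogram interface: analysis code depends only on this."""
    @abstractmethod
    def fill(self, x: float, weight: float = 1.0) -> None: ...
    @abstractmethod
    def entries(self) -> int: ...

class ListHistogram1D(IHistogram1D):
    """Toy fixed-binning implementation; real AIDA implementations
    (Anaphe/Lizard, OpenScientist, JAS) would plug in behind the
    same interface."""
    def __init__(self, nbins: int, lo: float, hi: float):
        self.nbins, self.lo, self.hi = nbins, lo, hi
        self.bins = [0.0] * nbins
        self._entries = 0

    def fill(self, x: float, weight: float = 1.0) -> None:
        if self.lo <= x < self.hi:
            i = int((x - self.lo) / (self.hi - self.lo) * self.nbins)
            self.bins[i] += weight
        self._entries += 1  # count every fill attempt

    def entries(self) -> int:
        return self._entries

# Analysis code sees only the abstract type.
h: IHistogram1D = ListHistogram1D(10, 0.0, 1.0)
for v in (0.05, 0.15, 0.15, 0.95):
    h.fill(v)
print(h.entries())  # 4
```

Because the Plotter, Fitter, and Analyzer components talk only to interfaces like this, any one implementation can be replaced without touching the others, which is the low-coupling goal the abstract stresses.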
The LOFT Ground Segment
LOFT, the Large Observatory For X-ray Timing, was one of the ESA M3 mission candidates that completed their assessment phase at the end of 2013. LOFT is equipped with two instruments, the Large Area Detector (LAD) and the Wide Field Monitor (WFM). The LAD performs pointed observations of several targets per orbit (~90 minutes), providing ~80 GB of proprietary data per day (the proprietary period will be 12 months). The WFM continuously monitors about 1/3 of the sky at a time and provides data for ~100 sources a day, resulting in a total of ~20 GB of additional telemetry. The LOFT Burst Alert System additionally identifies bright impulsive events on board (e.g., Gamma-ray Bursts, GRBs) and broadcasts the corresponding position and trigger time to the ground using a dedicated system of ~15 VHF receivers. All WFM data are planned to be made public immediately. In this contribution we summarize the planned organization of the LOFT ground segment (GS), as established in the mission Yellow Book. We describe the expected GS contributions from ESA and the LOFT consortium. A review is provided of the planned LOFT data products and the details of the data flow, archiving and distribution. Although LOFT was not selected for launch within the M3 call, its long assessment phase (> 2 years) led to a very solid mission design and efficient planning of its ground operations.
Comment: Proc. SPIE 9144, Space Telescopes and Instrumentation 2014: Ultraviolet to Gamma Ray, 91446
A systematic approach to the Planck LFI end-to-end test and its application to the DPC Level 1 pipeline
The Level 1 of the Planck LFI Data Processing Centre (DPC) is devoted to the handling of the scientific and housekeeping telemetry. It is a critical component of the Planck ground segment, which has to strictly commit to the project schedule to be ready for the launch and flight operations. In order to guarantee the quality necessary to achieve the objectives of the Planck mission, the design and development of the Level 1 software has followed the ESA Software Engineering Standards. A fundamental step in the software life cycle is the verification and validation of the software. The purpose of this work is to show an example of procedures, test development and analysis successfully applied to a key software project of an ESA mission. We present the end-to-end validation tests performed on the Level 1 of the LFI-DPC, detailing the methods used and the results obtained. Different approaches have been used to test the scientific and housekeeping data processing. Scientific data processing has been tested by injecting signals with known properties directly into the acquisition electronics, in order to generate a test dataset of real telemetry data and reproduce nominal conditions as closely as possible. For the HK telemetry processing, validation software has been developed to inject known parameter values into a set of real housekeeping packets and perform a comparison with the corresponding timelines generated by the Level 1. With the proposed validation and verification procedure, where the on-board and ground processing are viewed as a single pipeline, we demonstrated that the scientific and housekeeping processing of the Planck-LFI raw data is correct and meets the project requirements.
Comment: 20 pages, 7 figures; this paper is part of the Prelaunch status LFI papers published on JINST: http://www.iop.org/EJ/journal/-page=extra.proc5/jins
Optimization of Planck/LFI on--board data handling
To assess stability against 1/f noise, the Low Frequency Instrument (LFI) onboard the Planck mission will acquire data at a rate much higher than the data rate allowed by its telemetry bandwidth of 35.5 kbps. The data are processed by an onboard pipeline, followed on the ground by a reversing step. This paper illustrates the LFI scientific onboard processing used to fit the allowed data rate. This is a lossy process tuned by a set of five parameters (Naver, r1, r2, q, O) for each of the 44 LFI detectors. The paper quantifies the level of distortion introduced by the onboard processing, EpsilonQ, as a function of these parameters, and describes the method of optimizing the onboard processing chain. The tuning procedure is based on an optimization algorithm applied to unprocessed and uncompressed raw data provided either by simulations, prelaunch tests, or data taken from LFI operating in diagnostic mode. All the needed optimization steps are performed by an automated tool, OCA2, which outputs the optimized parameters and produces a set of statistical indicators, among them the compression rate Cr and EpsilonQ. For Planck/LFI the requirements are Cr = 2.4 and EpsilonQ <= 10% of the rms of the instrumental white noise. To speed up the process, an analytical model was developed that is able to extract most of the relevant information on EpsilonQ and Cr as a function of the signal statistics and the processing parameters. This model will be of interest for the instrument data analysis. The method was applied during ground tests when the instrument was operating in conditions representative of flight. Optimized parameters were obtained and the performance was verified: the required data rate of 35.5 kbps was achieved while keeping EpsilonQ at a level of 3.8% of the white-noise rms, well within the requirements.
Comment: 51 pages, 13 figures, 3 tables, pdflatex, needs JINST.csl, graphicx, txfonts, rotating; Issue 1.0 10 Nov 2009; Sub. to JINST 23 Jun 09, Accepted 10 Nov 09, Pub.: 29 Dec 09; This is a preprint, not the final version
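The distortion metric EpsilonQ can be illustrated with a small numerical sketch. This assumes, for illustration only, that the lossy step is a simple requantization round((x - O)/q) inverted on the ground; the actual LFI pipeline also averages Naver samples and mixes detector streams with r1 and r2, which are omitted here:

```python
import numpy as np

rng = np.random.default_rng(0)

def quantize(x, q, O):
    """Onboard lossy step: requantize samples with step q and offset O."""
    return np.round((x - O) / q)

def dequantize(k, q, O):
    """Ground-side reversing step: reconstruct approximate sample values."""
    return k * q + O

# Simulated raw stream: white noise of known rms (stand-in for detector data).
sigma = 1.0
x = rng.normal(0.0, sigma, 100_000)

q, O = 0.3, 0.0  # example processing parameters (not flight-tuned values)
x_rec = dequantize(quantize(x, q, O), q, O)

# Distortion relative to the white-noise rms; for a fine quantizer the
# error is roughly uniform in [-q/2, q/2], so EpsilonQ ~ q / (sigma * sqrt(12)).
eps_q = np.std(x - x_rec) / sigma
print(f"EpsilonQ ~ {eps_q:.1%}")
```

Shrinking q lowers EpsilonQ but also raises the entropy of the quantized stream, lowering the achievable compression rate Cr; the optimization the abstract describes is precisely this trade-off, carried out per detector under the Cr = 2.4 and EpsilonQ <= 10% constraints.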
Off-line radiometric analysis of Planck/LFI data
The Planck Low Frequency Instrument (LFI) is an array of 22 pseudo-correlation radiometers on board the Planck satellite, measuring temperature and polarization anisotropies in the Cosmic Microwave Background (CMB) in three frequency bands (30, 44 and 70 GHz). To calibrate and verify the performance of the LFI, a software suite named LIFE has been developed. Its aim is to provide a common platform for analyzing the results of the tests performed on the individual components of the instrument (RCAs, Radiometric Chain Assemblies) and on the integrated Radiometric Array Assembly (RAA). Moreover, its analysis tools are designed to be used during flight as well, to produce periodic reports on the status of the instrument. The LIFE suite has been developed using a multi-layered, cross-platform approach. It implements a number of analysis modules written in RSI IDL, each accessing the data through a portable and heavily optimized library of functions written in C and C++. One of the most important features of LIFE is its ability to run the same data analysis code on both ground test data and real flight data. The LIFE software suite was successfully used during the RCA/RAA tests and the Planck Integrated System Tests. Moreover, the software has also passed the verification for its in-flight use during the System Operations Verification Tests, held in October 2008.
Comment: Planck LFI technical papers published by JINST: http://www.iop.org/EJ/journal/-page=extra.proc5/1748-022
The Large Observatory for x-ray timing
The Large Observatory For x-ray Timing (LOFT) was studied within the ESA M3 Cosmic Vision framework and participated in the final down-selection for a launch slot in 2022-2024. Thanks to the unprecedented combination of effective area and spectral resolution of its main instrument, LOFT will study the behaviour of matter under extreme conditions, such as the strong gravitational field in the innermost regions of accretion flows close to black holes and neutron stars, and the supra-nuclear densities in the interior of neutron stars. The science payload is based on a Large Area Detector (LAD; 10 m2 effective area, 2-30 keV, 240 eV spectral resolution, 1° collimated field of view) and a Wide Field Monitor (WFM; 2-50 keV, 4 steradian field of view, 1 arcmin source location accuracy, 300 eV spectral resolution). The WFM is equipped with an on-board system for the localization of bright events (e.g. GRBs). The trigger time and position of these events are broadcast to the ground within 30 s of discovery. In this paper we present the status of the mission at the end of its Phase A study.