
    Data Streams from the Low Frequency Instrument On-Board the Planck Satellite: Statistical Analysis and Compression Efficiency

    The expected data rate produced by the Low Frequency Instrument (LFI), planned to fly on the ESA Planck mission in 2007, is more than a factor of 8 larger than the bandwidth allowed by the spacecraft transmission system for downloading the LFI data. We discuss the application of lossless compression to Planck/LFI data streams in order to reduce the overall data flow. We perform both a theoretical analysis and experimental tests on realistically simulated data streams in order to determine the statistical properties of the signal and the maximal compression rate allowed by several lossless compression algorithms. We study the influence of the signal composition and of the acquisition parameters on the compression rate Cr and develop a semiempirical formalism to account for it. The best-performing compressor tested so far is the arithmetic compressor of order 1, designed to optimize the compression of white-noise-like signals, which yields an overall compression rate of 2.65 +/- 0.02. We find that this result is not improved by other lossless compressors, since the signal is almost white-noise dominated. Lossless compression algorithms alone will not solve the bandwidth problem and need to be combined with other techniques. Comment: May 3, 2000 release, 61 pages, 6 figures coded as eps, 9 tables (4 included as eps), LaTeX 2.09 + assms4.sty, style file included, submitted for publication in PASP May 3, 200
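    The white-noise limit quoted above can be reproduced with a simple entropy argument: for a signal dominated by Gaussian noise and digitised on a fixed number of bits, an ideal entropy coder cannot compress beyond the ratio between the word length and the Shannon entropy of the quantised samples. The sketch below illustrates this bound; the noise rms (in ADU) and the 16-bit word length are illustrative assumptions, not values taken from the paper.

        import numpy as np

        def ideal_cr(sigma_adu=10.0, nbits=16, nsamples=1_000_000, seed=0):
            """Ideal lossless compression rate for quantised white noise."""
            rng = np.random.default_rng(seed)
            samples = np.round(rng.normal(0.0, sigma_adu, nsamples)).astype(int)
            # Empirical Shannon entropy (bits/sample) of the quantised stream.
            _, counts = np.unique(samples, return_counts=True)
            p = counts / counts.sum()
            entropy = -np.sum(p * np.log2(p))
            # No lossless coder can do better than nbits / entropy on average.
            return nbits / entropy

        print(f"ideal Cr ~ {ideal_cr():.2f}")

    A real coder such as the order-1 arithmetic compressor approaches, but does not reach, this bound, which is why the achievable Cr saturates once the signal is effectively white noise.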

    A systematic approach to the Planck LFI end-to-end test and its application to the DPC Level 1 pipeline

    The Level 1 of the Planck LFI Data Processing Centre (DPC) is devoted to the handling of the scientific and housekeeping telemetry. It is a critical component of the Planck ground segment which must strictly adhere to the project schedule in order to be ready for launch and flight operations. In order to guarantee the quality necessary to achieve the objectives of the Planck mission, the design and development of the Level 1 software has followed the ESA Software Engineering Standards. A fundamental step in the software life cycle is the Verification and Validation of the software. The purpose of this work is to show an example of procedures, test development and analysis successfully applied to a key software project of an ESA mission. We present the end-to-end validation tests performed on the Level 1 of the LFI-DPC, detailing the methods used and the results obtained. Different approaches have been used to test the scientific and housekeeping data processing. The scientific data processing has been tested by injecting signals with known properties directly into the acquisition electronics, in order to generate a test dataset of real telemetry data and to reproduce nominal conditions as closely as possible. For the HK telemetry processing, validation software has been developed to inject known parameter values into a set of real housekeeping packets and to compare them with the corresponding timelines generated by the Level 1. With the proposed validation and verification procedure, in which the on-board and ground processing are viewed as a single pipeline, we demonstrate that the scientific and housekeeping processing of the Planck-LFI raw data is correct and meets the project requirements. Comment: 20 pages, 7 figures; this paper is part of the Prelaunch status LFI papers published on JINST: http://www.iop.org/EJ/journal/-page=extra.proc5/jins
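    As an illustration of the comparison step used for the housekeeping telemetry, the hypothetical sketch below injects known parameter values into packets, runs them through a processing function standing in for the Level 1 pipeline, and checks that every injected value is recovered in the corresponding timeline. The packet format and the inject_value and run_level1 callables are placeholders, not the actual LFI-DPC interfaces.

        def validate_hk_processing(packets, injected, inject_value, run_level1, tol=0.0):
            """Return the list of injected values not recovered by Level 1."""
            # Overwrite the chosen parameter in each packet with a known value.
            patched = [inject_value(pkt, name, value)
                       for pkt, (name, value) in zip(packets, injected)]
            timelines = run_level1(patched)   # packets -> per-parameter timelines
            failures = []
            for index, (name, expected) in enumerate(injected):
                got = timelines[name][index]  # value reconstructed by Level 1
                if abs(got - expected) > tol:
                    failures.append((name, expected, got))
            return failures                   # an empty list means the test passed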

    Organization of the Euclid Data Processing: Dealing with Complexity

    The data processing development and operations for the Euclid mission (part of the ESA Cosmic Vision 2015-2025 Plan) are distributed within a Consortium composed of 14 countries and 1300+ persons: this imposes a high degree of complexity on the design and implementation of the data processing facilities. The focus of this paper is on the efforts to define an organisational structure capable of handling such complexity in manageable terms.

    Biochar and vermicompost as peat replacement for ornamental-plant production

    Poster presented at the Understanding Biochar Mechanisms for Practical Implementation Final Meeting of the EU-COST Action Biochar. Biochar is a by-product of the C-negative pyrolysis technology for production of bio-energy from organic materials. Containerized plant production in floriculture primarily utilizes soilless substrates such as peat moss. Environmental concerns about draining peat bogs have increased interest in research on complementary products that can be added to peat. Thus, a comparative study was conducted to assess the suitability of a biochar (B) and vermicompost (V) mix as a partial substitute for peat-based growing media for ornamental plant production. This work was partially supported by the project CTQ 2013-46804-C2-1-R of the Spanish Ministry of Economy and Competitiveness and the European Regional Development Fund (ERDF). The authors wish to thank the Horticultural Department and the Carbon Sequestration and Management Center of Ohio State University for providing materials and facilities for this investigation, and are also deeply grateful to Mrs. Loewe and Dr. J. Altland from the Application Technology Research Unit at the Wooster OSU campus for their laboratory assistance in determining the physical properties of the substrate mixes. Peer reviewed

    Applicability of the Langley method for non-geostationary in-orbit satellite effective isotropic radiated power estimation

    The Effective Isotropic Radiated Power (EIRP) is a crucial parameter characterizing the transmitting antennas of a radiofrequency satellite link. During the satellite commissioning phase, the compliance of the communication subsystems with their requirements is tested. One of the required tests concerns the EIRP of the satellite transmitting antenna. Ground-based power measurements of the satellite-emitted signal are collected to measure the EIRP, provided that an estimate of the atmospheric losses is available from independent ancillary measurements or model data. This paper demonstrates the applicability of the so-called Langley method to infer EIRP and atmospheric attenuation simultaneously from ground-based power measurements, with no need for ancillary measurements. It is shown that the proposed method gives results similar to more traditional methods, without prior information on atmospheric attenuation. Thus, the proposed method can be applied to monitor the EIRP throughout the satellite lifetime from ground-based power measurements alone.
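    The core of the Langley method is a straight-line fit of the received power, in dB, against the atmospheric air mass: the slope gives the zenith attenuation and the intercept gives the power extrapolated outside the atmosphere, from which the EIRP follows once the free-space loss and the ground antenna gain are known. The sketch below is a simplified illustration under a plane-parallel 1/sin(elevation) air-mass model; for a non-geostationary satellite the range-dependent free-space loss would have to be removed from the measurements before the fit, and all function and parameter names are assumptions rather than the paper's implementation.

        import numpy as np

        def langley_fit(elevation_deg, rx_power_dbm):
            """Fit received power (dBm) vs. air mass; return intercept and zenith attenuation."""
            airmass = 1.0 / np.sin(np.radians(elevation_deg))
            slope, intercept = np.polyfit(airmass, rx_power_dbm, 1)
            zenith_attenuation_db = -slope    # dB per unit air mass
            p0_dbm = intercept                # power extrapolated to zero air mass
            return p0_dbm, zenith_attenuation_db

        def eirp_dbm(p0_dbm, free_space_loss_db, rx_gain_dbi):
            # Link budget outside the atmosphere: P0 = EIRP - FSL + G_rx
            return p0_dbm + free_space_loss_db - rx_gain_dbi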

    CIWS-FW: a Customizable Instrument Workstation Software Framework for instrument-independent data handling

    The CIWS-FW is aimed at providing a common and standard solution for the storage, processing and quick-look analysis of the data acquired from scientific instruments for astrophysics. The target system is the instrument workstation (IW), either in the context of the Electrical Ground Support Equipment for space-borne experiments or in the context of the data acquisition system for instrumentation. The CIWS-FW core includes software developed by team members for previous experiments and provides new components and tools that improve the reusability, configurability and extensibility of the software. The CIWS-FW mainly consists of two packages: the data processing system and the data access system. The former provides the software components and libraries to support the acquisition, transformation, display and storage, in near real time, of either a data packet stream or a sequence of data files generated by the instrument. The latter is a metadata and data management system, providing a reusable solution for the archiving and retrieval of the acquired data. A built-in operator GUI allows control and configuration of the IW. In addition, the framework provides mechanisms for system error handling and logging. A web portal provides access to the CIWS-FW documentation, software repository and bug tracking tools for CIWS-FW developers. We describe the CIWS-FW architecture and summarize the project status. Comment: Accepted for publication in the ADASS Conference Series
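    A hypothetical sketch of the two-package split described above is given below: a data processing side that acquires, transforms and stores packets in near real time, and a data access side that archives records and retrieves them by metadata. All class and method names are illustrative assumptions, not the CIWS-FW API; the point is that the instrument-specific decoding is a plug-in while the rest of the pipeline stays reusable.

        from typing import Callable, Iterable

        class DataAccessSystem:
            """Toy metadata/data store standing in for the archiving package."""
            def __init__(self):
                self._archive = []

            def store(self, record: dict):
                self._archive.append(record)

            def query(self, **criteria) -> list:
                return [r for r in self._archive
                        if all(r.get(k) == v for k, v in criteria.items())]

        class DataProcessingSystem:
            """Toy near-real-time pipeline: acquire -> transform -> store."""
            def __init__(self, transform: Callable[[bytes], dict], archive: DataAccessSystem):
                self.transform = transform    # instrument-specific decoding plug-in
                self.archive = archive

            def run(self, packet_stream: Iterable[bytes]):
                for packet in packet_stream:
                    self.archive.store(self.transform(packet))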

    Imaging the first light: experimental challenges and future perspectives in the observation of the Cosmic Microwave Background Anisotropy

    Measurements of the cosmic microwave background (CMB) allow high precision observation of the Last Scattering Surface at redshift z ~ 1100. After the success of the NASA satellite COBE, which in 1992 provided the first detection of the CMB anisotropy, results from many ground-based and balloon-borne experiments have shown a remarkable consistency among the different results and provided quantitative estimates of fundamental cosmological properties. During 2003 the team of the NASA WMAP satellite released the first improved full-sky maps of the CMB since COBE, leading to a deeper insight into the origin and evolution of the Universe. The ESA satellite Planck, scheduled for launch in 2007, is designed to provide the ultimate measurement of the CMB temperature anisotropy over the full sky, with an accuracy that will be limited only by astrophysical foregrounds, and a robust detection of the polarisation anisotropy. In this paper we review the experimental challenges in high precision CMB experiments and discuss the future perspectives opened by second and third generation space missions like WMAP and Planck. Comment: To be published in "Recent Research Developments in Astronomy & Astrophysics" - Vol. I

    The Grid in INAF

    This paper presents an overview of the Grid-related projects in which the Institutes of INAF (Istituto Nazionale di Astrofisica) have been involved, from the GRID.IT project up to the recent and currently ongoing participation in EGEE (Enabling Grids for E-sciencE), the main project for the setup of a Grid infrastructure for science in Europe. The paper describes these activities, putting particular emphasis on some key pilot projects, such as the simulations of the Planck mission and the development of tools to widen the Grid capabilities to meet the needs of astrophysical applications.

    Optimization of Planck/LFI on-board data handling

    To assess stability against 1/f noise, the Low Frequency Instrument (LFI) on board the Planck mission will acquire data at a rate much higher than the data rate allowed by its telemetry bandwidth of 35.5 kbps. The data are processed by an on-board pipeline, followed on the ground by a reversing step. This paper illustrates the LFI scientific on-board processing used to fit the allowed data rate. This is a lossy process tuned by a set of 5 parameters (Naver, r1, r2, q, O) for each of the 44 LFI detectors. The paper quantifies the level of distortion introduced by the on-board processing, EpsilonQ, as a function of these parameters, and describes the method used to optimize the on-board processing chain. The tuning procedure is based on an optimization algorithm applied to unprocessed and uncompressed raw data provided either by simulations, pre-launch tests or data taken from LFI operating in diagnostic mode. All the needed optimization steps are performed by an automated tool, OCA2, which ends with the optimized parameters and produces a set of statistical indicators, among them the compression rate Cr and EpsilonQ. For Planck/LFI the requirements are Cr = 2.4 and EpsilonQ <= 10% of the rms of the instrumental white noise. To speed up the process, an analytical model is developed that is able to extract most of the relevant information on EpsilonQ and Cr as a function of the signal statistics and the processing parameters. This model will be of interest for the instrument data analysis. The method was applied during ground tests when the instrument was operating in conditions representative of flight. Optimized parameters were obtained and the performance verified: the required data rate of 35.5 kbps was achieved while keeping EpsilonQ at a level of 3.8% of the white noise rms, well within the requirements. Comment: 51 pages, 13 figures, 3 tables, pdflatex, needs JINST.csl, graphicx, txfonts, rotating; Issue 1.0 10 Nov 2009; Sub. to JINST 23 Jun 09, Accepted 10 Nov 09, Pub.: 29 Dec 09; This is a preprint, not the final version
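    The processing chain tuned by the five parameters can be summarised as follows: Naver consecutive sky and reference-load samples are averaged, the two averaged streams are mixed with the ratios r1 and r2, and the result is requantised with step q and offset O before lossless compression; on the ground the mixing and quantisation are reversed. The sketch below is a simplified illustration of this round trip under stated assumptions, not the actual on-board code; comparing the reconstructed and original averaged streams gives an estimate of the quantisation distortion EpsilonQ.

        import numpy as np

        def onboard_process(sky, load, naver, r1, r2, q, offset):
            """Average, mix and requantise the sky/load streams (the lossy step)."""
            # Assumes len(sky) and len(load) are multiples of naver.
            sky_avg = sky.reshape(-1, naver).mean(axis=1)
            load_avg = load.reshape(-1, naver).mean(axis=1)
            p1 = sky_avg - r1 * load_avg      # first mixed stream
            p2 = sky_avg - r2 * load_avg      # second mixed stream
            q1 = np.round((p1 - offset) / q).astype(int)
            q2 = np.round((p2 - offset) / q).astype(int)
            return q1, q2

        def ground_reverse(q1, q2, r1, r2, q, offset):
            """Undo quantisation and mixing to recover the averaged sky and load."""
            p1 = q1 * q + offset
            p2 = q2 * q + offset
            load = (p2 - p1) / (r1 - r2)      # invert the 2x2 mixing
            sky = p1 + r1 * load
            return sky, load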

    Euclid space mission: a cosmological challenge for the next 15 years

    Euclid is the next ESA mission devoted to cosmology. It aims at observing most of the extragalactic sky, studying both gravitational lensing and clustering over ~15,000 square degrees. The mission is expected to be launched in 2020 and to last six years. The sheer amount of data of different kinds, the variety of (un)known systematic effects and the complexity of the measurements require efforts both in sophisticated simulations and in data analysis techniques. We review the main characteristics of the mission and some aspects of the survey, and highlight some of the areas of interest to this meeting. Comment: to appear in Proceedings IAU Symposium No. 306, 2014, "Statistical Challenges in 21st Century Cosmology", A.F. Heavens, J.-L. Starck & A. Krone-Martins, ed