
    Data Streams from the Low Frequency Instrument On-Board the Planck Satellite: Statistical Analysis and Compression Efficiency

    The expected data rate produced by the Low Frequency Instrument (LFI), planned to fly on the ESA Planck mission in 2007, is more than a factor of 8 larger than the bandwidth allowed by the spacecraft transmission system for downloading the LFI data. We discuss the application of lossless compression to Planck/LFI data streams in order to reduce the overall data flow. We perform both theoretical analysis and experimental tests on realistically simulated data streams in order to determine the statistical properties of the signal and the maximal compression rate achievable by several lossless compression algorithms. We study the influence of the signal composition and of the acquisition parameters on the compression rate Cr, and develop a semiempirical formalism to account for it. The best-performing compressor tested so far is arithmetic compression of order 1, designed to optimize the compression of white-noise-like signals, which allows an overall compression rate Cr = 2.65 +/- 0.02. We find that this result is not improved by other lossless compressors, since the signal is almost white-noise dominated. Lossless compression algorithms alone will not solve the bandwidth problem and need to be combined with other techniques.
    Comment: May 3, 2000 release, 61 pages, 6 figures coded as eps, 9 tables (4 included as eps), LaTeX 2.09 + assms4.sty, style file included, submitted for publication in PASP May 3, 2000
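    The ceiling on any lossless compression rate for a white-noise-dominated stream is set by the signal's entropy. A minimal sketch of that bound, using hypothetical acquisition parameters (a 16-bit quantized Gaussian noise stream; the noise rms is illustrative, not the actual LFI value):

```python
import math
import random
from collections import Counter

def entropy_bits(symbols):
    """Empirical Shannon entropy (bits/symbol) of a symbol stream."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

random.seed(0)
# Hypothetical stand-in for an LFI data stream: Gaussian noise
# quantized to 16-bit samples (sigma in ADU is an assumption).
sigma_adu = 4.0
samples = [round(random.gauss(0.0, sigma_adu)) for _ in range(100_000)]

h = entropy_bits(samples)
cr_max = 16.0 / h  # upper bound on the lossless compression rate
print(f"entropy ~ {h:.2f} bits/sample, Cr upper bound ~ {cr_max:.2f}")
```

    Because the entropy of narrow Gaussian noise is far below 16 bits/sample, even an ideal entropy coder (which order-1 arithmetic coding approaches for memoryless signals) caps out at a modest Cr, consistent with the ~2.65 measured in the paper.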

    Innovative Water-Reduced Injection Grouts for the Stabilisation of Wall Paintings in the Hadi Rani Mahal, Nagaur, India: Design, Testing and Implementation

    The design and evaluation of site-specific injection grouts for the stabilisation of delaminated wall paintings are often challenging to perform in situ, due to constraints such as time, availability of materials and reliable testing procedures. In this research, a rigorous design and testing methodology, including the development of a new adhesion test, was adopted on-site for the development of injection grouts to be used in water-sensitive situations. Water-reduced mixtures were obtained by partly substituting water with ethanol. Previous research by the authors had demonstrated in the laboratory the potential suitability of water–ethanol grouts. In the present paper, water-reduced grouts were designed, tested and applied on-site for the first time.

    On the loss of telemetry data in full-sky surveys from space

    In this paper we discuss the issue of losing telemetry (TM) data for various reasons (e.g. spacecraft–ground transmissions) while performing a full-sky survey with space-borne instrumentation. This is a particularly important issue for current and future space missions (like Planck from ESA and WMAP from NASA) operating from an orbit far from Earth with short periods of visibility from ground stations. We consider, as a working case, the Low Frequency Instrument (LFI) on board the Planck satellite, although the approach developed here can easily be applied to any kind of experiment whose observing (scanning) strategy assumes repeated pointings of the same region of the sky on different time scales. The issue is addressed by means of a Monte Carlo approach. Our analysis clearly shows that, under quite general conditions, it is better to cover the sky more times with a lower fraction of TM retained than fewer times with a higher guaranteed TM fraction. In the case of Planck, an extension of the mission time to allow a third sky coverage with 95% of the total TM guaranteed provides a significant reduction of the probability of losing scientific information with respect to an increase of the total guaranteed TM to 98% with the two nominal sky coverages.
    Comment: 17 pages, 6 figures, accepted for publication in New Astronomy
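    The intuition behind the Monte Carlo result can be sketched with a deliberately simplified model (not the paper's actual simulation): assume each sky coverage independently retains a fixed TM fraction for every pixel, so a pixel is lost only if it is dropped in all coverages.

```python
import random

def frac_sky_lost(n_coverages, tm_fraction, n_pixels=50_000, n_trials=20, seed=1):
    """Monte Carlo estimate of the fraction of sky pixels never observed,
    assuming each of n_coverages independently retains tm_fraction of the
    telemetry for every pixel (a toy model, not the paper's simulation)."""
    rng = random.Random(seed)
    lost = 0
    total = 0
    for _ in range(n_trials):
        for _ in range(n_pixels):
            # Pixel is lost only if dropped in every coverage.
            if all(rng.random() > tm_fraction for _ in range(n_coverages)):
                lost += 1
        total += n_pixels
    return lost / total

# Planck-like comparison from the abstract: three coverages at 95% TM
# versus two coverages at 98% TM.
loss_3x95 = frac_sky_lost(3, 0.95)   # analytically ~ 0.05**3 = 1.25e-4
loss_2x98 = frac_sky_lost(2, 0.98)   # analytically ~ 0.02**2 = 4.0e-4
print(loss_3x95, loss_2x98)
```

    Even in this crude independence model, three passes at 95% lose roughly a third as much sky as two passes at 98%, matching the direction of the paper's conclusion.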

    Organization of the Euclid Data Processing: Dealing with Complexity

    The data processing development and operations for the Euclid mission (part of the ESA Cosmic Vision 2015-2025 Plan) are distributed within a Consortium composed of 14 countries and 1300+ persons: this imposes a high degree of complexity on the design and implementation of the data processing facilities. The focus of this paper is on the efforts to define an organisational structure capable of handling such complexity in manageable terms.

    Imaging the first light: experimental challenges and future perspectives in the observation of the Cosmic Microwave Background Anisotropy

    Measurements of the cosmic microwave background (CMB) allow high-precision observation of the Last Scattering Surface at redshift z ~ 1100. After the success of the NASA satellite COBE, which in 1992 provided the first detection of the CMB anisotropy, results from many ground-based and balloon-borne experiments have shown a remarkable consistency between different results and provided quantitative estimates of fundamental cosmological properties. During 2003 the team of the NASA WMAP satellite released the first improved full-sky maps of the CMB since COBE, leading to a deeper insight into the origin and evolution of the Universe. The ESA satellite Planck, scheduled for launch in 2007, is designed to provide the ultimate measurement of the CMB temperature anisotropy over the full sky, with an accuracy that will be limited only by astrophysical foregrounds, and a robust detection of polarisation anisotropy. In this paper we review the experimental challenges in high-precision CMB experiments and discuss the future perspectives opened by second- and third-generation space missions like WMAP and Planck.
    Comment: To be published in "Recent Research Developments in Astronomy & Astrophysics" - Vol. I

    The Grid in INAF

    Abstract. This paper presents an overview of the Grid-related projects in which Institutes of INAF (Istituto Nazionale di Astrofisica) were involved, starting from the GRID.IT project up to the recent and currently ongoing participation in EGEE (Enabling Grids for E-sciencE), the main project for the setup of a Grid infrastructure for science in Europe. The paper gives an overview of these activities, putting particular emphasis on some key pilot projects, like the simulations of the Planck mission and the development of tools to widen the Grid capabilities to meet the needs of astrophysical applications.

    The Low Frequency Instrument in the ESA Planck mission

    Measurements of the cosmic microwave background (CMB) allow high-precision observation of the cosmic plasma at redshift z ~ 1100. After the success of the NASA satellite COBE, which in 1992 provided the first detection of the CMB anisotropy, results from many ground-based and balloon-borne experiments have shown a remarkable consistency between different results and provided quantitative estimates of fundamental cosmological properties. During the current year the team of the NASA WMAP satellite released the first improved full-sky maps of the CMB since COBE, leading to a deeper insight into the origin and evolution of the Universe. The ESA satellite Planck, scheduled for launch in 2007, is designed to provide the ultimate measurement of the CMB temperature anisotropy over the full sky, with an accuracy that will be limited only by astrophysical foregrounds, and a robust detection of polarisation anisotropy. Planck will observe the sky with two instruments over a wide spectral band (the Low Frequency Instrument, based on coherent radiometers, from 30 to 70 GHz, and the High Frequency Instrument, based on bolometric detectors, from 100 to 857 GHz). The mission performance will dramatically improve the scientific return compared to WMAP. Furthermore, the LFI radiometers (as well as some of the HFI bolometers) are intrinsically sensitive to polarisation, so that by combining the data from different receivers it will be possible to measure the E mode accurately and to detect the B mode of the polarisation power spectrum. Planck's sensitivity will also offer the possibility of detecting the non-Gaussianities imprinted in the CMB.
    Comment: 4 pages, 2 figures, to appear in "Proc. of International Symposium on Plasmas in the Laboratory and in the Universe: new insights and new challenges", September 16-19, 2003, Como, Italy

    A systematic approach to the Planck LFI end-to-end test and its application to the DPC Level 1 pipeline

    The Level 1 of the Planck LFI Data Processing Centre (DPC) is devoted to the handling of the scientific and housekeeping telemetry. It is a critical component of the Planck ground segment, which has to comply strictly with the project schedule to be ready for launch and flight operations. In order to guarantee the quality necessary to achieve the objectives of the Planck mission, the design and development of the Level 1 software have followed the ESA Software Engineering Standards. A fundamental step in the software life cycle is the verification and validation of the software. The purpose of this work is to show an example of procedures, test development and analysis successfully applied to a key software project of an ESA mission. We present the end-to-end validation tests performed on the Level 1 of the LFI-DPC, detailing the methods used and the results obtained. Different approaches have been used to test the scientific and housekeeping data processing. Scientific data processing has been tested by injecting signals with known properties directly into the acquisition electronics, in order to generate a test dataset of real telemetry data and reproduce nominal conditions as closely as possible. For the housekeeping telemetry processing, validation software has been developed to inject known parameter values into a set of real housekeeping packets and compare them with the corresponding timelines generated by the Level 1. With the proposed verification and validation procedure, in which the on-board and ground processing are viewed as a single pipeline, we demonstrate that the scientific and housekeeping processing of the Planck-LFI raw data is correct and meets the project requirements.
    Comment: 20 pages, 7 figures; this paper is part of the Prelaunch status LFI papers published on JINST: http://www.iop.org/EJ/journal/-page=extra.proc5/jins
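    The core of the housekeeping validation step described above is an inject-then-compare loop. A minimal sketch of such a comparator, with names and structure that are illustrative only (not the actual Planck/LFI software interface):

```python
def validate_timeline(injected, decoded, rtol=0.0):
    """Compare injected housekeeping parameter values with the timeline
    reconstructed by the ground pipeline. Returns a list of
    (index, injected_value, decoded_value) mismatches; an empty list
    means the round trip preserved every value within tolerance."""
    mismatches = []
    for i, (a, b) in enumerate(zip(injected, decoded)):
        tol = rtol * abs(a)  # relative tolerance, exact match when rtol=0
        if abs(a - b) > tol:
            mismatches.append((i, a, b))
    return mismatches

# Toy usage: a correct end-to-end round trip yields no mismatches,
# while any decoding error is reported with its sample index.
injected = [20.0, 20.5, 21.0, 20.8]
decoded = [20.0, 20.5, 21.0, 20.8]
assert validate_timeline(injected, decoded) == []
```

    In the real test campaign the "injected" side comes from known parameter values written into real housekeeping packets, and the "decoded" side from the Level 1 timelines, but the pass/fail criterion reduces to a per-sample comparison of this kind.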

    Euclid space mission: a cosmological challenge for the next 15 years

    Euclid is the next ESA mission devoted to cosmology. It aims at observing most of the extragalactic sky, studying both gravitational lensing and clustering over ~15,000 square degrees. The mission is expected to be launched in 2020 and to last six years. The sheer amount of data of different kinds, the variety of (un)known systematic effects and the complexity of the measurements require efforts both in sophisticated simulations and in techniques of data analysis. We review the mission's main characteristics and some aspects of the survey, and highlight some of the areas of interest to this meeting.
    Comment: to appear in Proceedings IAU Symposium No. 306, 2014, "Statistical Challenges in 21st Century Cosmology", A.F. Heavens, J.-L. Starck & A. Krone-Martins, eds.