
    A systematic approach to the Planck LFI end-to-end test and its application to the DPC Level 1 pipeline

    The Level 1 of the Planck LFI Data Processing Centre (DPC) is devoted to the handling of the scientific and housekeeping telemetry. It is a critical component of the Planck ground segment, which must strictly adhere to the project schedule in order to be ready for launch and flight operations. To guarantee the quality necessary to achieve the objectives of the Planck mission, the design and development of the Level 1 software has followed the ESA Software Engineering Standards. A fundamental step in the software life cycle is the verification and validation of the software. The purpose of this work is to show an example of procedures, test development and analysis successfully applied to a key software project of an ESA mission. We present the end-to-end validation tests performed on the Level 1 of the LFI-DPC, detailing the methods used and the results obtained. Different approaches have been used to test the scientific and the housekeeping data processing. The scientific data processing has been tested by injecting signals with known properties directly into the acquisition electronics, in order to generate a test dataset of real telemetry data and to reproduce nominal conditions as closely as possible. For the housekeeping telemetry processing, validation software has been developed to inject known parameter values into a set of real housekeeping packets and to compare them with the corresponding timelines generated by the Level 1. With the proposed verification and validation procedure, in which the on-board and ground processing are viewed as a single pipeline, we demonstrated that the scientific and housekeeping processing of the Planck-LFI raw data is correct and meets the project requirements.
    Comment: 20 pages, 7 figures; this paper is part of the Prelaunch status LFI papers published on JINST: http://www.iop.org/EJ/journal/-page=extra.proc5/jins
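    The injection-and-compare strategy described above can be sketched as follows. This is a minimal illustration, not the actual LFI-DPC validation software: the function name and the (timestamp, value) representation are assumptions made for the example.

    ```python
    def validate_hk_processing(injected, decoded, tolerance=0.0):
        """Compare injected housekeeping parameter values against the
        timeline produced by the Level 1 pipeline.

        injected: list of (timestamp, value) pairs written into the packets
        decoded:  list of (timestamp, value) pairs decoded by the pipeline
        Returns a list of mismatches; an empty list means the test passes.
        """
        mismatches = []
        for (t_in, v_in), (t_out, v_out) in zip(injected, decoded):
            if t_in != t_out or abs(v_in - v_out) > tolerance:
                mismatches.append((t_in, v_in, v_out))
        return mismatches

    # A passing end-to-end check on synthetic data
    injected = [(0, 1.0), (1, 2.0), (2, 3.0)]
    decoded = [(0, 1.0), (1, 2.0), (2, 3.0)]
    assert validate_hk_processing(injected, decoded) == []
    ```

    The key design point mirrors the abstract: because the injected values are known exactly, any non-empty mismatch list localises an error to a specific timestamp in the on-board-plus-ground pipeline.
    
    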

    Applicability of the langley method for non-geostationary in-orbit satellite effective isotropic radiated power estimation

    The Effective Isotropic Radiated Power (EIRP) is a crucial parameter characterizing the transmitting antennas of a radiofrequency satellite link. During the satellite commissioning phase, the compliance of the communication subsystems with the requirements is tested. One of the required tests concerns the EIRP of the satellite transmitting antenna. Ground-based power measurements of the satellite-emitted signal are collected to measure EIRP, provided that an estimate of the atmospheric losses is available from independent ancillary measurements or model data. This paper demonstrates the applicability of the so-called Langley method to infer EIRP and atmospheric attenuation simultaneously from ground-based power measurements, with no need for ancillary measurements. It is shown that the proposed method gives results similar to more traditional methods, without prior information on atmospheric attenuation. Thus, the proposed method can be applied to monitor EIRP throughout the satellite lifetime from ground-based power measurements alone.
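    The core of the Langley method can be sketched as a linear regression: received power in dB is modelled as linear in the airmass m = 1/sin(elevation), so the intercept at m = 0 estimates the attenuation-free received power (and hence EIRP, given the known link budget) while the slope estimates the zenith attenuation. The sketch below uses synthetic numbers and a plane-parallel airmass approximation; it is an illustration of the extrapolation principle, not the paper's estimation procedure.

    ```python
    import math

    def langley_fit(elevations_deg, power_db):
        """Least-squares fit of power_db = p0 - a_zenith * airmass,
        with airmass approximated as 1/sin(elevation)."""
        m = [1.0 / math.sin(math.radians(e)) for e in elevations_deg]
        n = len(m)
        mean_m = sum(m) / n
        mean_p = sum(power_db) / n
        slope = (sum((mi - mean_m) * (pi - mean_p)
                     for mi, pi in zip(m, power_db))
                 / sum((mi - mean_m) ** 2 for mi in m))
        p0 = mean_p - slope * mean_m
        # p0: extra-atmospheric received power [dB] (intercept at m = 0)
        # -slope: zenith attenuation [dB]
        return p0, -slope

    # Synthetic data: true p0 = -120 dB, zenith attenuation = 0.5 dB
    els = [10, 20, 30, 45, 60, 90]
    power = [-120 - 0.5 / math.sin(math.radians(e)) for e in els]
    p0, a_zen = langley_fit(els, power)
    # Recovers p0 ≈ -120.0 dB and a_zen ≈ 0.5 dB
    ```

    In practice one would sample a wide range of elevations as the non-geostationary satellite passes overhead, which is what makes the method applicable without ancillary attenuation data.
    
    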

    The Blue Straggler population in the globular cluster M53 (NGC5024): a combined HST, LBT, CFHT study

    We used a proper combination of multiband high-resolution and wide-field multi-wavelength observations collected at three different telescopes (HST, LBT and CFHT) to probe the Blue Straggler Star (BSS) population in the globular cluster M53. Almost 200 BSS have been identified over the entire cluster extension. The radial distribution of these stars has been found to be bimodal (similarly to that of several other clusters), with a prominent dip at ~60'' (~2 r_c) from the cluster center. This value turns out to be a factor of two smaller than the radius of avoidance (r_avoid, the radius within which all the stars of ~1.2 M_sun have sunk to the core because of dynamical friction effects in a Hubble time). While in most of the clusters with a bimodal BSS radial distribution r_avoid has been found to be located in the region of the observed minimum, this is the second case (after NGC6388) where this discrepancy is noted. This evidence suggests that in a few clusters dynamical friction is somehow less efficient than expected. We have also used this database to construct the radial star density profile of the cluster: this is the most extended and accurate radial profile ever published for this cluster, including detailed star counts in the very inner region. The star density profile is reproduced by a standard King model with an extended core (~25'') and a modest value of the concentration parameter (c=1.58). A deviation from the model is noted in the most external region of the cluster (at r>6.5' from the center). This feature needs to be further investigated in order to address the possible presence of a tidal tail in this cluster.
    Comment: 25 pages, 9 figures, accepted for publication on Ap
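    The King model quoted above is the standard King (1962) empirical surface-density profile; a minimal sketch of evaluating it with the quoted parameters (core radius r_c ~ 25'', concentration c = log10(r_t/r_c) = 1.58) is shown below. This is the textbook formula with arbitrary normalisation, not the authors' fitting code.

    ```python
    import math

    def king_profile(r, r_c=25.0, c=1.58):
        """King (1962) projected density profile, arbitrary normalisation.

        r and r_c in arcsec; tidal radius r_t = r_c * 10**c.
        Returns 0 beyond the tidal radius.
        """
        r_t = r_c * 10 ** c
        if r >= r_t:
            return 0.0
        term = 1.0 / math.sqrt(1.0 + (r / r_c) ** 2)
        cut = 1.0 / math.sqrt(1.0 + (r_t / r_c) ** 2)
        return (term - cut) ** 2

    # The profile peaks at the centre and vanishes at the tidal radius
    assert king_profile(0.0) > king_profile(25.0) > king_profile(500.0) >= 0.0
    ```

    With these parameters the tidal radius is r_t ≈ 950'', so the deviation reported at r > 6.5' (390'') lies well inside the nominal model boundary, which is why it is flagged as a possible tidal-tail signature rather than a simple truncation effect.
    
    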

    Euclid space mission: a cosmological challenge for the next 15 years

    Euclid is the next ESA mission devoted to cosmology. It aims at observing most of the extragalactic sky, studying both gravitational lensing and clustering over ~15,000 square degrees. The mission is expected to be launched in 2020 and to last six years. The sheer amount of data of different kinds, the variety of (un)known systematic effects and the complexity of the measurements require efforts both in sophisticated simulations and in techniques of data analysis. We review the mission's main characteristics and some aspects of the survey, and highlight some of the areas of interest to this meeting.
    Comment: to appear in Proceedings IAU Symposium No. 306, 2014, "Statistical Challenges in 21st Century Cosmology", A.F. Heavens, J.-L. Starck & A. Krone-Martins, ed

    Frequency Selective Surfaces for Extended Bandwidth Backing Reflector Functions


    The Low Frequency Instrument in the ESA Planck mission

    Measurements of the cosmic microwave background (CMB) allow high-precision observation of the cosmic plasma at redshift z~1100. After the success of the NASA satellite COBE, which in 1992 provided the first detection of the CMB anisotropy, results from many ground-based and balloon-borne experiments have shown a remarkable consistency between different results and provided quantitative estimates of fundamental cosmological properties. During the current year the team of the NASA WMAP satellite has released the first improved full-sky maps of the CMB since COBE, leading to a deeper insight into the origin and evolution of the Universe. The ESA satellite Planck, scheduled for launch in 2007, is designed to provide the ultimate measurement of the CMB temperature anisotropy over the full sky, with an accuracy that will be limited only by astrophysical foregrounds, and robust detection of polarisation anisotropy. Planck will observe the sky with two instruments over a wide spectral band: the Low Frequency Instrument (LFI), based on coherent radiometers, from 30 to 70 GHz, and the High Frequency Instrument (HFI), based on bolometric detectors, from 100 to 857 GHz. The mission performance will dramatically improve the scientific return compared to WMAP. Furthermore, the LFI radiometers (as well as some of the HFI bolometers) are intrinsically sensitive to polarisation, so that by combining the data from different receivers it will be possible to measure accurately the E mode and to detect the B mode of the polarisation power spectrum. Planck's sensitivity will also offer the possibility to detect the non-Gaussianities imprinted in the CMB.
    Comment: 4 pages, 2 figures, to appear in "Proc of International Symposium on Plasmas in the Laboratory and in the Universe: new insights and new challenges", September 16-19, 2003, Como, Ital

    Cloud access to interoperable IVOA-compliant VOSpace storage

    Handling, processing and archiving the huge amount of data produced by the new generation of experiments and instruments in Astronomy and Astrophysics are among the most exciting challenges to address in designing future data management infrastructures and computing services. We investigated the feasibility of a data management and computation infrastructure, available worldwide, with the aim of merging the FAIR data management provided by IVOA standards with the efficiency and reliability of a cloud approach. Our work involved the Canadian Advanced Network for Astronomy Research (CANFAR) infrastructure and the European EGI federated cloud (EFC). We designed and deployed a pilot data management and computation infrastructure that provides IVOA-compliant VOSpace storage resources and wide access to interoperable federated clouds. In this paper, we detail the main user requirements covered, the technical choices and the implemented solutions, and we describe the resulting hybrid-cloud worldwide infrastructure, its benefits and limitations.