Absolute Calibration of the Radio Astronomy Flux Density Scale at 22 to 43 GHz Using Planck
The Planck mission detected thousands of extragalactic radio sources at
frequencies from 28 to 857 GHz. Planck's calibration is absolute (in the sense
that it is based on the satellite's annual motion around the Sun and the
temperature of the cosmic microwave background), and its beams are well
characterized at sub-percent levels. Thus Planck's flux density measurements of
compact sources are absolute in the same sense. We have made coordinated VLA
and ATCA observations of 65 strong, unresolved Planck sources in order to
transfer Planck's calibration to ground-based instruments at 22, 28, and 43
GHz. The results are compared to microwave flux density scales currently based
on planetary observations. Despite the scatter introduced by the variability of
many of the sources, the flux density scales are determined to 1-2% accuracy.
At 28 GHz, the flux density scale used by the VLA runs 3.6% ± 1.0% below
Planck values; at 43 GHz, the discrepancy increases to 6.2% ± 1.4% for both
ATCA and the VLA.
Comment: 16 pages, 4 figures, and 4 tables
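As a simple illustration of what such a scale offset means in practice, the sketch below rescales a ground-based flux density onto the Planck scale using the quoted offsets. The function name and the purely multiplicative treatment of the offset are assumptions for illustration, not the procedure adopted in the paper.

```python
# Illustrative only: apply a multiplicative correction derived from the quoted
# scale offsets (3.6% at 28 GHz, 6.2% at 43 GHz, ground scale low relative to
# Planck). Function name and dictionary layout are hypothetical.

SCALE_OFFSET = {28: 0.036, 43: 0.062}          # fractional offset (ground below Planck)
OFFSET_UNCERTAINTY = {28: 0.010, 43: 0.014}    # quoted 1-sigma uncertainties

def to_planck_scale(flux_jy, freq_ghz):
    """Rescale a ground-based flux density (Jy) onto the Planck scale."""
    return flux_jy * (1.0 + SCALE_OFFSET[freq_ghz])

if __name__ == "__main__":
    # A 10 Jy source measured on the VLA 43 GHz scale would be ~10.62 Jy
    # on the Planck absolute scale under this simple treatment.
    print(to_planck_scale(10.0, 43))
```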
A systematic approach to the Planck LFI end-to-end test and its application to the DPC Level 1 pipeline
The Level 1 of the Planck LFI Data Processing Centre (DPC) is devoted to the
handling of the scientific and housekeeping telemetry. It is a critical
component of the Planck ground segment, which must adhere strictly to the
project schedule to be ready for launch and flight operations. In order to
guarantee the quality necessary to achieve the objectives of the Planck
mission, the design and development of the Level 1 software has followed the
ESA Software Engineering Standards. A fundamental step in the software life
cycle is the Verification and Validation of the software. The purpose of this
work is to show an example of procedures, test development and analysis
successfully applied to a key software project of an ESA mission. We present
the end-to-end validation tests performed on the Level 1 of the LFI-DPC, by
detailing the methods used and the results obtained. Different approaches have
been used to test the scientific and housekeeping data processing. Scientific
data processing has been tested by injecting signals with known properties
directly into the acquisition electronics, in order to generate a test dataset
of real telemetry data and reproduce nominal conditions as closely as possible.
For the HK telemetry processing, validation software has been developed to
inject known parameter values into a set of real housekeeping packets and
perform a comparison with the corresponding timelines generated by the Level 1.
With the proposed validation and verification procedure, where the on-board and
ground processing are viewed as a single pipeline, we demonstrated that the
scientific and housekeeping processing of the Planck-LFI raw data is correct
and meets the project requirements.
Comment: 20 pages, 7 figures; this paper is part of the Prelaunch status LFI
papers published on JINST:
http://www.iop.org/EJ/journal/-page=extra.proc5/jins
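The housekeeping validation step described above amounts to injecting known parameter values into real packets and checking that the Level 1 output timelines reproduce them. A minimal sketch of that comparison is given below; the function name, the data values, and the zero tolerance are illustrative assumptions and do not reflect the actual LFI housekeeping format or tools.

```python
# Illustrative sketch of an end-to-end housekeeping check: known values are
# injected upstream, the (hypothetical) Level 1 pipeline produces timelines,
# and the two are compared sample by sample.

def validate_hk_timeline(injected_values, level1_timeline, tolerance=0.0):
    """Return a list of (index, injected, decoded) mismatches."""
    mismatches = []
    for i, (expected, decoded) in enumerate(zip(injected_values, level1_timeline)):
        if abs(expected - decoded) > tolerance:
            mismatches.append((i, expected, decoded))
    return mismatches

if __name__ == "__main__":
    injected = [20.05, 20.05, 20.10, 20.10]   # e.g. a focal-plane temperature in K
    decoded  = [20.05, 20.05, 20.10, 20.11]   # values reconstructed by Level 1
    print(validate_hk_timeline(injected, decoded))  # -> [(3, 20.1, 20.11)]
```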
Off-line radiometric analysis of Planck/LFI data
The Planck Low Frequency Instrument (LFI) is an array of 22
pseudo-correlation radiometers on-board the Planck satellite to measure
temperature and polarization anisotropies in the Cosmic Microwave Background
(CMB) in three frequency bands (30, 44 and 70 GHz). To calibrate and verify the
performance of the LFI, a software suite named LIFE has been developed. Its
aim is to provide a common platform for analyzing the results of the
tests performed on the single components of the instrument (RCAs, Radiometric
Chain Assemblies) and on the integrated Radiometric Array Assembly (RAA).
Moreover, its analysis tools are designed to be used during flight as well,
to produce periodic reports on the status of the instrument. The LIFE suite has
been developed using a multi-layered, cross-platform approach. It implements a
number of analysis modules written in RSI IDL, each accessing the data through
a portable and heavily optimized library of functions written in C and C++. One
of the most important features of LIFE is its ability to run the same data
analysis codes both using ground test data and real flight data as input. The
LIFE software suite has been successfully used during the RCA/RAA tests and the
Planck Integrated System Tests. Moreover, the software has also passed the
verification for its in-flight use during the System Operations Verification
Tests, held in October 2008.
Comment: Planck LFI technical papers published by JINST:
http://www.iop.org/EJ/journal/-page=extra.proc5/1748-022
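The key design point above is that the same analysis code runs on both ground-test and flight data by going through a common data-access layer. A minimal sketch of that idea is shown below; the class and function names are hypothetical stand-ins for the IDL analysis modules and the optimized C/C++ access library mentioned in the abstract.

```python
# Illustrative sketch of a data-access layer shared by ground-test and flight
# analyses. Names are hypothetical; in LIFE the analysis modules are written
# in IDL and the access layer in C/C++.

class DataSource:
    """Uniform interface over ground-test files and flight telemetry."""
    def __init__(self, kind, path):
        self.kind = kind   # "ground" or "flight"
        self.path = path

    def load_timeline(self, channel):
        # A real implementation would dispatch to format-specific readers;
        # here we just return a placeholder timeline.
        return [0.0, 0.1, 0.05]

def mean_level(source, channel):
    """Analysis code written once, usable on either kind of data."""
    timeline = source.load_timeline(channel)
    return sum(timeline) / len(timeline)

if __name__ == "__main__":
    print(mean_level(DataSource("ground", "rca_test.fits"), "LFI27M"))
    print(mean_level(DataSource("flight", "daily_dump.fits"), "LFI27M"))
```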
THE PHYSIOLOGICAL DISPOSITION OF THE URICOSURIC-SALURETIC AGENT (6,7-DICHLORO-2-METHYL-1-OXO-2-PHENYL-5-INDANYLOXY)ACETIC ACID (MK-196) IN THE RAT, DOG, AND MONKEY
ABSTRACT The physiological disposition of a new saluretic-uricosuric agent, (6,7-dichloro-2-methyl-1-oxo-2-phenyl-5-indanyloxy)acetic acid (MK-196), was studied in the rat, dog, and monkey. MK-196 was well absorbed and showed minimal metabolism in these species. Peak plasma levels of radioactivity and drug occurred 0.5-2 hr after oral administration at a dose of 2.5 mg/kg.
Imaging the first light: experimental challenges and future perspectives in the observation of the Cosmic Microwave Background Anisotropy
Measurements of the cosmic microwave background (CMB) allow high precision
observation of the Last Scattering Surface at redshift 1100. After the
success of the NASA satellite COBE, which in 1992 provided the first detection
of the CMB anisotropy, results from many ground-based and balloon-borne
experiments have shown a remarkable consistency between different results and
provided quantitative estimates of fundamental cosmological properties. During
2003 the team of the NASA WMAP satellite released the first improved
full-sky maps of the CMB since COBE, leading to a deeper insight into the
origin and evolution of the Universe. The ESA satellite Planck, scheduled for
launch in 2007, is designed to provide the ultimate measurement of the CMB
temperature anisotropy over the full sky, with an accuracy that will be limited
only by astrophysical foregrounds, and robust detection of polarisation
anisotropy. In this paper we review the experimental challenges in high
precision CMB experiments and discuss the future perspectives opened by second-
and third-generation space missions like WMAP and Planck.
Comment: To be published in "Recent Research Developments in Astronomy &
Astrophysics" - Vol. I
CIWS-FW: a Customizable Instrument Workstation Software Framework for instrument-independent data handling
The CIWS-FW is aimed at providing a common and standard solution for the
storage, processing, and quick-look analysis of the data acquired from scientific
instruments for astrophysics. The target system is the instrument workstation
either in the context of the Electrical Ground Support Equipment for
space-borne experiments, or in the context of the data acquisition system for
instrumentation. The CIWS-FW core includes software developed by team members
for previous experiments and provides new components and tools that improve the
software's reusability, configurability, and extensibility. The CIWS-FW
mainly consists of two packages: the data processing system and the data access
system. The former provides the software components and libraries to support
the data acquisition, transformation, display and storage in near real time of
a data packet stream and/or a sequence of data files generated by the
instrument. The latter is a meta-data and data management system, providing a
reusable solution for the archiving and retrieval of the acquired data. A
built-in operator GUI allows control and configuration of the IW. In addition,
the framework provides mechanisms for error handling and logging. A web
portal provides access to the CIWS-FW documentation, software repository
and bug tracking tools for CIWS-FW developers. We will describe the CIWS-FW
architecture and summarize the project status.
Comment: Accepted for publication in the ADASS Conference Series
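As a rough illustration of the two-package split described above, the sketch below separates a data processing component (near-real-time handling of incoming packets) from a data access component (archiving and retrieval). All names and the dummy transform are hypothetical and are not taken from the CIWS-FW codebase.

```python
# Illustrative only: a toy separation between a data processing system
# (handles incoming packets in near real time) and a data access system
# (archives and retrieves the results). Names are hypothetical.

class DataAccessSystem:
    """Toy metadata/data store standing in for the archiving component."""
    def __init__(self):
        self._archive = []

    def store(self, record):
        self._archive.append(record)

    def query(self, predicate):
        return [r for r in self._archive if predicate(r)]

class DataProcessingSystem:
    """Transforms incoming packets and hands them to the access system."""
    def __init__(self, access):
        self.access = access

    def ingest(self, packet):
        record = {"apid": packet["apid"], "value": packet["value"] * 2}  # dummy transform
        self.access.store(record)
        return record

if __name__ == "__main__":
    access = DataAccessSystem()
    processing = DataProcessingSystem(access)
    processing.ingest({"apid": 100, "value": 21})
    print(access.query(lambda r: r["apid"] == 100))  # -> [{'apid': 100, 'value': 42}]
```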
Dynamic validation of the Planck/LFI thermal model
The Low Frequency Instrument (LFI) is an array of cryogenically cooled
radiometers on board the Planck satellite, designed to measure the temperature
and polarization anisotropies of the cosmic microwave background (CMB) at 30, 44
and 70 GHz. The thermal requirements of the LFI, and in particular the
stringent limits to acceptable thermal fluctuations in the 20 K focal plane,
are a critical element in achieving the instrument's scientific performance.
Thermal tests were carried out as part of the on-ground calibration campaign at
various stages of instrument integration. In this paper we describe the results
and analysis of the tests on the LFI flight model (FM) performed at Thales
Laboratories in Milan (Italy) during 2006, with the purpose of experimentally
sampling the thermal transfer functions and consequently validating the
numerical thermal model describing the dynamic response of the LFI focal plane.
This model has been used extensively to assess the ability of LFI to achieve
its scientific goals: its validation is therefore extremely important in the
context of the Planck mission. Our analysis shows that the instrument exhibits
a thermal damping level better than predicted, further reducing the expected
systematic effect induced in
the LFI maps. We then propose an explanation of the increased damping in terms
of non-ideal thermal contacts.
Comment: Planck LFI technical papers published by JINST:
http://www.iop.org/EJ/journal/-page=extra.proc5/1748-022
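For context, a thermal transfer function of the kind sampled in these tests can be modelled, to first order, as a low-pass response that damps temperature fluctuations across an interface; the sketch below evaluates such a single-pole model. The form of the model and the time constant are illustrative assumptions, not the LFI thermal model itself.

```python
# Illustrative single-pole model of thermal damping: a fluctuation at
# frequency f is attenuated by |H(f)| = 1 / sqrt(1 + (2*pi*f*tau)**2).
# The time constant below is an arbitrary example, not an LFI value.
import math

def damping_factor(freq_hz, tau_s):
    """Amplitude attenuation of a first-order thermal low-pass at freq_hz."""
    return 1.0 / math.sqrt(1.0 + (2.0 * math.pi * freq_hz * tau_s) ** 2)

if __name__ == "__main__":
    tau = 500.0  # s, hypothetical time constant of a thermal interface
    for f in (1e-4, 1e-3, 1e-2):
        print(f, damping_factor(f, tau))
```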
Optimization of Planck/LFI on-board data handling
To assess stability against 1/f noise, the Low Frequency Instrument (LFI)
onboard the Planck mission will acquire data at a rate much higher than the
data rate allowed by its telemetry bandwidth of 35.5 kbps. The data are
processed by an onboard pipeline, followed on ground by a reversing step. This
paper illustrates the LFI scientific onboard processing used to fit the allowed
data rate. This is a lossy process tuned using a set of five parameters (Naver,
r1, r2, q, O) for each of the 44 LFI detectors. The paper quantifies the level
of distortion introduced by the onboard processing, EpsilonQ, as a function of
these parameters. It describes the method of optimizing the onboard processing
chain. The tuning procedure is based on an optimization algorithm applied to
unprocessed and uncompressed raw data provided by simulations, prelaunch
tests, or data taken from the LFI operating in diagnostic mode. All the needed
optimization steps are performed by an automated tool, OCA2, which outputs the
optimized parameters and produces a set of statistical indicators, among them
the compression rate Cr and EpsilonQ. For Planck/LFI the requirements are Cr =
2.4 and EpsilonQ <= 10% of the rms of the instrumental white noise. To speed up
the process, an analytical model is developed that is able to extract most of
the relevant information on EpsilonQ and Cr as a function of the signal
statistics and the processing parameters. This model will be of interest for
the instrument data analysis. The method was applied during ground tests when
the instrument was operating in conditions representative of flight. Optimized
parameters were obtained and the performance was verified: the required
data rate of 35.5 kbps was achieved while keeping EpsilonQ at a level of
3.8% of the white noise rms, well within the requirements.
Comment: 51 pages, 13 figures, 3 tables, pdflatex, needs JINST.csl, graphicx,
txfonts, rotating; Issue 1.0, 10 Nov 2009; Sub. to JINST 23 Jun 2009, Accepted
10 Nov 2009, Pub. 29 Dec 2009; This is a preprint, not the final version
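To make the quantities above concrete, the sketch below computes the processing error EpsilonQ for a toy version of a lossy step: Naver-sample averaging followed by requantization with step q and offset O. This is a plausible reading of the parameter names, not the actual LFI algorithm, which also involves the mixing parameters r1 and r2 and a compression stage that sets Cr.

```python
# Toy model of a lossy onboard step: average Naver samples, requantize with
# step q and offset O, reconstruct on ground, and measure the residual error
# relative to the white-noise rms (EpsilonQ). Parameter roles here are an
# illustrative assumption; see the paper for the real LFI processing.
import random, statistics

def process_and_reconstruct(samples, naver, q, offset):
    # Onboard: block average, then integer requantization.
    averaged = [sum(samples[i:i + naver]) / naver
                for i in range(0, len(samples) - naver + 1, naver)]
    quantized = [round(x / q + offset) for x in averaged]
    # On ground: reverse the requantization.
    reconstructed = [(k - offset) * q for k in quantized]
    return averaged, reconstructed

if __name__ == "__main__":
    random.seed(0)
    white_rms = 1.0
    raw = [random.gauss(0.0, white_rms) for _ in range(8192)]
    avg, rec = process_and_reconstruct(raw, naver=3, q=0.1, offset=1000)
    residual_rms = statistics.pstdev(a - r for a, r in zip(avg, rec))
    print("EpsilonQ ~", residual_rms / white_rms)  # well below 10% for this q
```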