Calibration of Correlation Radiometers Using Pseudo-Random Noise Signals
The calibration of correlation radiometers, and particularly of aperture synthesis interferometric radiometers, is a critical issue in ensuring their performance. Current calibration techniques are based on measuring the cross-correlation of the receivers’ outputs when noise from a common source is injected, which requires a very stable distribution network. For large interferometric radiometers this centralized noise injection approach is very complex in terms of mass, volume, and phase/amplitude equalization. Distributed noise injection techniques have been proposed as a feasible alternative, but they cannot correct the so-called “baseline errors” associated with the particular pair of receivers forming each baseline. In this work, the use of centralized Pseudo-Random Noise (PRN) signals to calibrate correlation radiometers is proposed. PRNs are sequences of symbols with a long repetition period that have a flat spectrum over a bandwidth determined by the symbol rate. Since their spectrum resembles that of thermal noise, they can be used to calibrate correlation radiometers. At the same time, since these sequences are deterministic, new calibration schemes can be envisaged, such as correlating each receiver’s output with a baseband local replica of the PRN sequence, as well as new distribution schemes for calibration signals. This work analyzes the general requirements and performance of PRN sequences for the calibration of microwave correlation radiometers, and particularizes the study to a potential implementation in a large aperture synthesis radiometer using an optical distribution network.
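The local-replica correlation scheme described above can be sketched numerically. Below, a maximal-length PRN is generated with a linear-feedback shift register and correlated against a baseband replica to recover the gain and delay of a toy receiver chain; the sequence length, taps, gain, and noise level are all illustrative assumptions, not values from the paper.

```python
import numpy as np

def mls(nbits=7, taps=(7, 6)):
    """Maximal-length PRN sequence (+/-1) from a Fibonacci LFSR."""
    state = [1] * nbits
    seq = []
    for _ in range(2**nbits - 1):
        seq.append(1.0 if state[-1] else -1.0)
        fb = state[taps[0] - 1] ^ state[taps[1] - 1]
        state = [fb] + state[:-1]
    return np.array(seq)

prn = mls()                      # period 127, nearly flat spectrum over the chip bandwidth

# Hypothetical receiver: unknown gain and integer-sample delay, plus thermal noise
rng = np.random.default_rng(0)
gain, delay = 2.5, 11
rx = gain * np.roll(prn, delay) + 0.3 * rng.standard_normal(prn.size)

# Correlate the receiver output with a baseband local replica at every lag
corr = np.array([np.mean(rx * np.roll(prn, k)) for k in range(prn.size)])
k_hat = int(np.argmax(corr))     # delay estimate
gain_hat = corr[k_hat]           # PRN has unit power, so the peak estimates the gain

print(k_hat, round(gain_hat, 2))
```

Because the PRN is deterministic, the same replica can be regenerated at each receiver, which is what enables the new calibration and distribution schemes mentioned above.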
Optimization of Planck/LFI on--board data handling
To assess stability against 1/f noise, the Low Frequency Instrument (LFI)
onboard the Planck mission will acquire data at a rate much higher than that
allowed by its telemetry bandwidth of 35.5 kbps. The data are processed by an
onboard pipeline, followed on the ground by a reversing step. This paper
illustrates the LFI scientific onboard processing to fit the allowed data
rate. This is a lossy process tuned by a set of five parameters (Naver,
r1, r2, q, O) for each of the 44 LFI detectors. The paper quantifies the level
of distortion introduced by the onboard processing, EpsilonQ, as a function of
these parameters. It describes the method of optimizing the onboard processing
chain. The tuning procedure is based on an optimization algorithm applied to
unprocessed and uncompressed raw data provided by simulations, prelaunch
tests, or data taken from LFI operating in diagnostic mode. All the needed
optimization steps are performed by an automated tool, OCA2, which ends with
optimized parameters and produces a set of statistical indicators, among them
the compression rate Cr and EpsilonQ. For Planck/LFI the requirements are Cr =
2.4 and EpsilonQ <= 10% of the rms of the instrumental white noise. To speed up
the process an analytical model is developed that is able to extract most of
the relevant information on EpsilonQ and Cr as a function of the signal
statistics and the processing parameters. This model will be of interest for
the instrument data analysis. The method was applied during ground tests when
the instrument was operating in conditions representative of flight. Optimized
parameters were obtained and the performance was verified: the required
data rate of 35.5 kbps was achieved while keeping EpsilonQ at 3.8% of the
white noise rms, well within the requirements.
Comment: 51 pages, 13 figures, 3 tables, pdflatex, needs JINST.cls, graphicx,
txfonts, rotating; Issue 1.0, 10 Nov 2009; submitted to JINST 23 Jun 2009,
accepted 10 Nov 2009, published 29 Dec 2009. This is a preprint, not the final version.
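As a rough illustration of the kind of lossy onboard processing described (a simplified stand-in, not the actual OCA2 chain), the sketch below averages Naver consecutive samples and requantizes them with step q, then measures the resulting distortion as a fraction of the white-noise rms. The mixing parameters r1, r2 and the offset O are omitted, and all numerical values are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data stream: white noise of unit rms (stand-in for one detector output)
sigma = 1.0
x = sigma * rng.standard_normal(200_000)

# Stand-ins for two of the onboard parameters: averaging length and
# quantization step (Naver and q in the paper; r1, r2, O are not modeled)
Naver, q = 4, 0.3
x_avg = x.reshape(-1, Naver).mean(axis=1)          # onboard averaging
codes = np.round(x_avg / q).astype(int)            # lossy requantization (downlinked)
x_rec = codes * q                                  # on-ground reconstruction

# Quantization distortion, as a fraction of the white-noise rms
eps_q = np.std(x_rec - x_avg) / sigma
print(round(eps_q, 3))          # close to q / sqrt(12) / sigma ≈ 0.087
```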
New Passive Instruments Developed for Ocean Monitoring at the Remote Sensing Lab—Universitat Politècnica de Catalunya
Lack of frequent and global observations from space is currently a limiting factor in many Earth Observation (EO) missions. Two potential techniques have been proposed to address it: (1) the use of satellite constellations, and (2) the use of Global Navigation Satellite System (GNSS) signals as signals of opportunity (no dedicated transmitter required). Reflectometry using GNSS opportunity signals (GNSS-R) was originally proposed in 1993 by Martin-Neira (ESA-ESTEC) for altimetry applications; its use for wind speed determination was proposed later, and more recently it has been proposed to perform the sea state correction required in sea surface salinity retrievals by means of L-band microwave radiometry (TB). At present, two EO space-borne missions are planned for launch in the near future: (1) ESA's SMOS mission, using a Y-shaped synthetic aperture radiometer, launch date November 2, 2009, and (2) the NASA-CONAE AQUARIUS/SAC-D mission, using a three-beam push-broom radiometer. In the SMOS mission, the multi-angle observation capabilities make it possible to retrieve simultaneously not only the surface salinity, but also the surface temperature and an “effective” wind speed that minimizes the differences between observations and models. In AQUARIUS, an L-band scatterometer measuring the radar backscatter (σ0) will be used to perform the necessary sea state corrections. However, neither of these approaches is fully satisfactory: the effective wind speed captures some sea surface roughness effects at the expense of introducing another variable to be retrieved, while the (TB, σ0) plots exhibit large scatter. In 2003, the Passive Advanced Unit for ocean monitoring (PAU) project was proposed to the European Science Foundation in the frame of the EUropean Young Investigator Awards (EURYI) to test the feasibility of GNSS-R over the sea surface for sea state measurements and for correcting the L-band brightness temperature.
This paper: (1) provides an overview of the physics of L-band radiometric and GNSS reflectometric observations over the ocean, (2) describes the instrumentation that has been (or is being) developed in the frame of the EURYI-funded PAU project, and (3) summarizes the ground-based measurements carried out so far and their interpretation, in view of placing a GNSS reflectometer as a secondary payload in future SMOS follow-on missions.
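The multi-angle retrieval idea mentioned for SMOS can be illustrated with a toy least-squares inversion. The forward model below is entirely invented (linear in salinity and wind speed, with an angle-dependent roughness term) and serves only to show how several incidence angles allow salinity and an effective wind speed to be retrieved jointly.

```python
import numpy as np

# Toy linear forward model (all coefficients are invented for illustration):
# TB(theta) = 100 - 0.5*SSS + 0.2*WS*(1 + theta/60), theta in degrees
def tb_model(theta, sss, ws):
    return 100.0 - 0.5 * sss + 0.2 * ws * (1.0 + theta / 60.0)

angles = np.array([10.0, 25.0, 40.0, 55.0])        # multi-angle observations
true_sss, true_ws = 35.0, 7.0
rng = np.random.default_rng(2)
tb_obs = tb_model(angles, true_sss, true_ws) + 0.05 * rng.standard_normal(4)

# Joint grid search for the (SSS, effective WS) pair minimizing the misfit
sss_grid = np.arange(30.0, 40.01, 0.05)
ws_grid = np.arange(0.0, 15.01, 0.05)
S, W = np.meshgrid(sss_grid, ws_grid, indexing="ij")
cost = sum((tb_model(th, S, W) - tb) ** 2 for th, tb in zip(angles, tb_obs))
i, j = np.unravel_index(np.argmin(cost), cost.shape)
print(round(sss_grid[i], 2), round(ws_grid[j], 2))
```

The angle-dependent roughness term is what separates the two unknowns: a single incidence angle could not distinguish a salinity change from a wind-speed change.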
Planck-LFI: Design and Performance of the 4 Kelvin Reference Load Unit
The LFI radiometers use a pseudo-correlation design where the signal from the
sky is continuously compared with a stable reference signal, provided by a
cryogenic reference load system. The reference unit is composed of small
pyramidal horns, one for each radiometer, 22 in total, facing small absorbing
targets, made of a commercial resin ECCOSORB CR (TM), cooled to approximately
4.5 K. Horns and targets are separated by a small gap to allow thermal
decoupling. Target and horn design is optimized for each of the LFI bands,
centered at 70, 44 and 30 GHz. Pyramidal horns are either machined inside the
radiometer 20K module or connected via external electro-formed bent
waveguides. The requirement of high stability of the reference signal imposed a
careful design for the radiometric and thermal properties of the loads.
Materials used for the manufacturing have been characterized for thermal, RF
and mechanical properties. We describe in this paper the design and the
performance of the reference system.
Comment: This is an author-created, un-copyedited version of an article
accepted for publication in JINST. IOP Publishing Ltd is not responsible for
any errors or omissions in this version of the manuscript or any version
derived from it. The definitive publisher-authenticated version is available
online at [10.1088/1748-0221/4/12/T12006]. 14 pages, 34 figures.
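The pseudo-correlation idea, comparing the sky continuously against a stable 4 K reference, can be illustrated with a toy model: a common multiplicative gain drift affects both switch states and cancels to first order in the differenced output. All temperatures, drift amplitudes, and noise levels below are illustrative, not actual LFI values.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

# Antenna and reference-load temperatures (kelvin), plus receiver noise T_n
t_sky, t_ref, t_n = 2.73, 4.5, 20.0

# Slow multiplicative gain drift (a stand-in for 1/f gain fluctuations)
g = 1.0 + 0.05 * np.sin(np.linspace(0, 6 * np.pi, n))

# Total-power outputs of the two switch states, with radiometric noise
v_sky = g * (t_sky + t_n) * (1 + 0.01 * rng.standard_normal(n))
v_ref = g * (t_ref + t_n) * (1 + 0.01 * rng.standard_normal(n))

# A gain modulation factor r balances the two states; the differenced
# output is insensitive to the common gain drift to first order
r = v_sky.mean() / v_ref.mean()
diff = v_sky - r * v_ref

# Fractional fluctuation before and after differencing
print(round(np.std(v_sky) / v_sky.mean(), 4),
      round(np.std(diff) / v_sky.mean(), 4))
```

This is why the stability of the reference signal matters: any fluctuation of the load temperature enters the differenced output directly instead of cancelling.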
Five-Year Wilkinson Microwave Anisotropy Probe (WMAP) Observations: Data Processing, Sky Maps, and Basic Results
We present new full-sky temperature and polarization maps in five frequency
bands from 23 to 94 GHz, based on data from the first five years of the WMAP
sky survey. The five-year maps incorporate several improvements in data
processing made possible by the additional years of data and by a more complete
analysis of the instrument calibration and in-flight beam response. We present
several new tests for systematic errors in the polarization data and conclude
that Ka band data (33 GHz) is suitable for use in cosmological analysis, after
foreground cleaning. This significantly reduces the overall polarization
uncertainty. With the five-year WMAP data, we detect no convincing deviations from
the minimal 6-parameter LCDM model: a flat universe dominated by a cosmological
constant, with adiabatic and nearly scale-invariant Gaussian fluctuations.
Using WMAP data combined with measurements of Type Ia supernovae and Baryon
Acoustic Oscillations, we find (68% CL uncertainties): Omega_bh^2 = 0.02267 \pm
0.00059, Omega_ch^2 = 0.1131 \pm 0.0034, Omega_Lambda = 0.726 \pm 0.015, n_s =
0.960 \pm 0.013, tau = 0.084 \pm 0.016, and Delta_R^2 = (2.445 \pm 0.096) x
10^-9. From these we derive: sigma_8 = 0.812 \pm 0.026, H_0 = 70.5 \pm 1.3
km/s/Mpc, z_{reion} = 10.9 \pm 1.4, and t_0 = 13.72 \pm 0.12 Gyr. The new limit
on the tensor-to-scalar ratio is r < 0.22 (95% CL). We obtain tight,
simultaneous limits on the (constant) dark energy equation of state and spatial
curvature: -0.14 < 1+w < 0.12 and -0.0179 < Omega_k < 0.0081 (both 95% CL). The
number of relativistic degrees of freedom (e.g. neutrinos) is found to be
N_{eff} = 4.4 \pm 1.5, consistent with the standard value of 3.04. Models with
N_{eff} = 0 are disfavored at >99.5% confidence.
Comment: 46 pages, 13 figures, and 7 tables. Version accepted for publication,
ApJS, Feb 2009. Includes 5-year dipole results and additional references.
Also available at
http://lambda.gsfc.nasa.gov/product/map/dr3/map_bibliography.cf
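As a consistency check on the quoted numbers, the derived age t_0 follows from H_0, Omega_Lambda and flatness through the closed-form expression for a flat matter-plus-Lambda universe (radiation neglected):

```python
from math import asinh, sqrt

# t0 = (2 / (3 H0 sqrt(Omega_L))) * asinh(sqrt(Omega_L / Omega_m))
H0 = 70.5                      # km/s/Mpc, as quoted above
omega_l = 0.726
omega_m = 1.0 - omega_l        # flatness: Omega_m + Omega_L = 1

km_per_mpc = 3.0857e19         # kilometres in a megaparsec
gyr_in_s = 3.1557e16           # seconds in a gigayear
hubble_time_gyr = km_per_mpc / H0 / gyr_in_s   # 1/H0 ≈ 13.87 Gyr

t0 = (2.0 / (3.0 * sqrt(omega_l))) * asinh(sqrt(omega_l / omega_m)) * hubble_time_gyr
print(round(t0, 2))            # ≈ 13.7 Gyr, consistent with the quoted 13.72 ± 0.12
```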
Imaging the first light: experimental challenges and future perspectives in the observation of the Cosmic Microwave Background Anisotropy
Measurements of the cosmic microwave background (CMB) allow high precision
observation of the Last Scattering Surface at redshift 1100. After the
success of the NASA satellite COBE, which in 1992 provided the first detection
of the CMB anisotropy, results from many ground-based and balloon-borne
experiments have shown a remarkable consistency between different results and
provided quantitative estimates of fundamental cosmological properties. During
2003 the team of the NASA WMAP satellite has released the first improved
full-sky maps of the CMB since COBE, leading to a deeper insight into the
origin and evolution of the Universe. The ESA satellite Planck, scheduled for
launch in 2007, is designed to provide the ultimate measurement of the CMB
temperature anisotropy over the full sky, with an accuracy that will be limited
only by astrophysical foregrounds, and robust detection of polarisation
anisotropy. In this paper we review the experimental challenges in high
precision CMB experiments and discuss the future perspectives opened by second
and third generation space missions like WMAP and Planck.
Comment: To be published in "Recent Research Developments in Astronomy &
Astrophysics" - Vol. I
Satellite Emission Range Inferred Earth Survey (SERIES) project
The Global Positioning System (GPS) was developed by the Department of Defense primarily for navigation use by the United States Armed Forces. The system will consist of a constellation of 18 operational Navigation Satellite Timing and Ranging (NAVSTAR) satellites by the late 1980s. During the last four years, the Satellite Emission Range Inferred Earth Surveying (SERIES) team at the Jet Propulsion Laboratory (JPL) has developed a novel receiver which is the heart of the SERIES geodetic system designed to use signals broadcast from the GPS. This receiver does not require knowledge of the exact code sequence being transmitted. In addition, when two SERIES receivers are used differentially to determine a baseline, accuracies of a few centimeters can be obtained. The initial engineering test phase of the SERIES Project has been completed. Baseline lengths ranging from 150 meters to 171 kilometers have been measured with 0.3 cm to 7 cm accuracies. This technology, sponsored by the NASA Geodynamics Program, has been developed at JPL to meet the challenge of high-precision, cost-effective geodesy and to complement the mobile Very Long Baseline Interferometry (VLBI) system for Earth surveying.
Impact of signal quantization on the performance of RFI mitigation algorithms
Radio Frequency Interference (RFI) is currently a major problem in communications and Earth Observation, but it is even more dramatic in microwave radiometry because of the low power levels of the received signals. Its impact has been attested in several Earth Observation missions. On-board mitigation systems are becoming a requirement to detect and remove affected measurements, thus increasing radiometric accuracy and spatial coverage. However, RFI mitigation methods have not yet been tested in the context of some particular radiometer topologies, which rely on the use of coarsely quantized streams of data. In this study, the impact of quantization and sampling on the performance of several known RFI mitigation algorithms is studied under different conditions. It is demonstrated that in the presence of clipping, quantization fundamentally changes the time-frequency properties of the contaminated signal, strongly impairing the performance of most mitigation methods. Important design considerations that must be taken into account when defining the architecture of future instruments are derived from this analysis. In particular, the use of Automatic Gain Control (AGC) systems is proposed, and its limitations are discussed.
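The clipping effect described above is easy to reproduce: hard limiting a strong narrowband RFI tone spreads its power into harmonics across the band, destroying the spectral localization that most mitigation algorithms rely on. The signal parameters below are illustrative, with 1-bit quantization taken as the extreme case of clipping.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 4096
t = np.arange(n)
f0 = 451.0 / n                  # bin-aligned tone frequency (cycles/sample)

# Weak thermal noise plus a strong narrowband RFI tone
noise = rng.standard_normal(n)
rfi = 20.0 * np.sin(2 * np.pi * f0 * t)
x = noise + rfi

# 1-bit quantization (sign), the extreme case of clipping
x_1bit = np.sign(x)

def band_fraction(sig, half_width=0.01):
    """Fraction of total power within a narrow band around f0."""
    spec = np.abs(np.fft.rfft(sig)) ** 2
    f = np.fft.rfftfreq(sig.size)
    band = (f > f0 - half_width) & (f < f0 + half_width)
    return spec[band].sum() / spec.sum()

# The tone's power stays localized before quantization, but clipping
# spreads a substantial fraction of it into odd harmonics across the band
print(round(band_fraction(x), 3), round(band_fraction(x_1bit), 3))
```

A frequency-domain blanking algorithm that excised only the narrow band around f0 would therefore remove almost all the RFI power before quantization, but leave the harmonic residue behind after clipping.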