381 research outputs found
A General Analysis of the Impact of Digitization in Microwave Correlation Radiometers
This study provides a general framework to analyze the effects of a generic quantization scheme and sampling process on correlation radiometers. It reviews, unifies and expands several previous works that focused on these effects separately. In addition, it provides a general theoretical background that allows analyzing any digitization scheme, including any number of quantization levels, irregular quantization steps, gain compression, clipping, and jitter and skew effects of the sampling period.
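One classic special case covered by such a framework is 1-bit (sign) quantization, where the digitized correlation relates to the analogue one through the arcsine (Van Vleck) law. A minimal numerical sketch (all values are illustrative, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
rho_true = 0.3

# a pair of unit-variance Gaussian signals with known correlation
cov = [[1.0, rho_true], [rho_true, 1.0]]
x, y = rng.multivariate_normal([0.0, 0.0], cov, size=200_000).T

# 1-bit (sign) quantization, the coarsest digitization scheme
xq, yq = np.sign(x), np.sign(y)
rho_1bit = np.mean(xq * yq)

# the Van Vleck (arcsine) relation recovers the analogue correlation
rho_recovered = np.sin(np.pi / 2 * rho_1bit)
print(rho_1bit, rho_recovered)
```

The 1-bit correlation is biased low ((2/pi)*arcsin(rho) for Gaussian inputs), and the arcsine correction undoes the bias; multi-level and irregular-step schemes need the more general treatment the paper develops.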
Impact of signal quantization on the performance of RFI mitigation algorithms
Radio Frequency Interference (RFI) is currently a major problem in Communications and Earth Observation, but it is even more dramatic in Microwave Radiometry because of the low power levels of the received signals. Its impact has been attested in several Earth Observation missions. On-board mitigation systems are becoming a requirement to detect and remove affected measurements, thus increasing radiometric accuracy and spatial coverage. However, RFI mitigation methods have not yet been tested in the context of some particular radiometer topologies, which rely on the use of coarsely quantized streams of data. In this study, the impact of quantization and sampling on the performance of several known RFI mitigation algorithms is studied under different conditions. It is demonstrated that, in the presence of clipping, quantization fundamentally changes the time-frequency properties of the contaminated signal, strongly impairing the performance of most mitigation methods. This analysis yields important design considerations that must be taken into account when defining the architecture of future instruments. In particular, the use of Automatic Gain Control (AGC) systems is proposed, and its limitations are discussed.
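The clipping effect described above can be illustrated with a toy sketch (the tone frequency, ADC range, and noise level are illustrative assumptions, not values from the study): a strong narrowband RFI tone that saturates the ADC spreads power into odd harmonics, altering the time-frequency content that detection algorithms rely on.

```python
import numpy as np

fs = 1000.0
t = np.arange(0, 1.0, 1 / fs)

# weak noise plus a narrowband RFI tone at 100 Hz
signal = 2.0 * np.sin(2 * np.pi * 100 * t) \
    + 0.1 * np.random.default_rng(1).standard_normal(t.size)

# an ADC with full-scale range +/-1 clips the strong tone
clipped = np.clip(signal, -1.0, 1.0)

spec_in = np.abs(np.fft.rfft(signal)) ** 2
spec_out = np.abs(np.fft.rfft(clipped)) ** 2
freqs = np.fft.rfftfreq(t.size, 1 / fs)

# clipping generates odd harmonics: power appears at 300 Hz
# that was essentially absent in the analogue signal
h3 = np.argmin(np.abs(freqs - 300))
print(spec_in[h3], spec_out[h3])
```

An AGC that keeps the signal within the ADC range prevents this spectral spreading, which is the design consideration the study motivates.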
A review of RFI mitigation techniques in microwave radiometry
Radio frequency interference (RFI) is a well-known problem in microwave radiometry (MWR). Any undesired signal overlapping the MWR protected frequency bands introduces a bias in the measurements, which can corrupt the retrieved geophysical parameters. This paper presents a literature review of RFI detection and mitigation techniques for microwave radiometry from space. The reviewed techniques are divided between real aperture and aperture synthesis. A discussion and assessment of the application of RFI mitigation techniques is presented for each type of radiometer.
The Temperature of the CMB at 10 GHz
We report the results of an effort to measure the low frequency portion of
the spectrum of the Cosmic Microwave Background Radiation (CMB), using a
balloon-borne instrument called ARCADE (Absolute Radiometer for Cosmology,
Astrophysics, and Diffuse Emission). These measurements are to search for
deviations from a thermal spectrum that are expected to exist in the CMB due to
various processes in the early universe. The radiometric temperature was
measured at 10 and 30 GHz using a cryogenic open-aperture instrument with no
emissive windows. An external blackbody calibrator provides an in situ
reference. A linear model is used to compare the radiometer output to a set of
thermometers on the instrument. The unmodeled residuals are less than 50 mK
peak-to-peak with a weighted RMS of 6 mK. Small corrections are made for the
residual emission from the flight train, atmosphere, and foreground Galactic
emission. The measured radiometric temperature of the CMB is 2.721 +/- 0.010 K
at 10 GHz and 2.694 +/- 0.032 K at 30 GHz.
Comment: 8 pages including 5 figures. Submitted to The Astrophysical Journal.
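The linear-model comparison described in the abstract can be sketched as an ordinary least-squares fit; the simulated thermometer readings, coefficients, and noise level below are purely illustrative stand-ins, not ARCADE data:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500

# simulated readings (K) from three instrument thermometers near 2.7 K,
# and a radiometer output modeled as a linear combination of them plus noise
therm = np.column_stack(
    [2.7 + 0.01 * rng.standard_normal(n) for _ in range(3)]
)
true_coeffs = np.array([0.5, 0.3, 0.2])
radiometer = therm @ true_coeffs + 0.006 * rng.standard_normal(n)

# least-squares fit of the linear model
coeffs, *_ = np.linalg.lstsq(therm, radiometer, rcond=None)
residuals = radiometer - therm @ coeffs

# weighted RMS of the unmodeled residuals (uniform weights here)
weights = np.ones(n)
wrms = np.sqrt(np.sum(weights * residuals ** 2) / np.sum(weights))
print(wrms)
```

With the injected 6 mK noise, the residual weighted RMS comes out at the few-mK level, the same figure of merit the paper quotes for its unmodeled residuals.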
Dynamic validation of the Planck/LFI thermal model
The Low Frequency Instrument (LFI) is an array of cryogenically cooled
radiometers on board the Planck satellite, designed to measure the temperature
and polarization anisotropies of the cosmic microwave background (CMB) at 30, 44
and 70 GHz. The thermal requirements of the LFI, and in particular the
stringent limits to acceptable thermal fluctuations in the 20 K focal plane,
are a critical element to achieve the instrument scientific performance.
Thermal tests were carried out as part of the on-ground calibration campaign at
various stages of instrument integration. In this paper we describe the results
and analysis of the tests on the LFI flight model (FM) performed at Thales
Laboratories in Milan (Italy) during 2006, with the purpose of experimentally
sampling the thermal transfer functions and consequently validating the
numerical thermal model describing the dynamic response of the LFI focal plane.
This model has been used extensively to assess the ability of LFI to achieve
its scientific goals: its validation is therefore extremely important in the
context of the Planck mission. Our analysis shows that the measured thermal
properties of the instrument show a thermal damping level better than
predicted, therefore further reducing the expected systematic effect induced in
the LFI maps. We then propose an explanation of the increased damping in terms
of non-ideal thermal contacts.
Comment: Planck LFI technical papers published by JINST:
http://www.iop.org/EJ/journal/-page=extra.proc5/1748-022
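A toy illustration of the thermal damping discussed above, modeling a non-ideal thermal contact as a first-order low-pass stage (the time constant is an assumed, illustrative value, not a Planck/LFI figure):

```python
import numpy as np

# A non-ideal thermal contact behaves like a thermal RC stage:
# fluctuations at the cold stage are low-pass filtered before
# reaching the focal plane.
tau = 50.0  # assumed thermal time constant in seconds (illustrative)

def damping(f_hz, tau_s):
    """Amplitude of a first-order thermal transfer function at f_hz."""
    return 1.0 / np.sqrt(1.0 + (2 * np.pi * f_hz * tau_s) ** 2)

# slow drifts pass nearly unattenuated; faster fluctuations are
# strongly damped, reducing the systematic effect in the maps
for f in (1e-4, 1e-3, 1e-2):
    print(f, damping(f, tau))
```

A larger effective time constant (poorer contact) gives stronger damping at a fixed fluctuation frequency, consistent with the paper's explanation of the better-than-predicted damping.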
Literature review of the remote sensing of natural resources
Abstracts of 596 documents related to remote sensors or the remote sensing of natural resources by satellite, aircraft, or ground-based stations are presented. Topics covered include general theory, geology and hydrology, agriculture and forestry, marine sciences, urban land use, and instrumentation. Recent documents not yet cited in any of the seven information sources used for the compilation are summarized. An author/keyword index is provided.
Optimization of Planck/LFI on--board data handling
To assess stability against 1/f noise, the Low Frequency Instrument (LFI)
onboard the Planck mission will acquire data at a rate much higher than the
data rate allowed by its telemetry bandwidth of 35.5 kbps. The data are
processed by an onboard pipeline, followed on ground by a reversing step. This
paper illustrates the LFI scientific onboard processing used to fit the allowed
data rate. This is a lossy process tuned by a set of five parameters: Naver,
r1, r2, q, O for each of the 44 LFI detectors. The paper quantifies the level
of distortion introduced by the onboard processing, EpsilonQ, as a function of
these parameters. It describes the method of optimizing the onboard processing
chain. The tuning procedure is based on an optimization algorithm applied to
unprocessed and uncompressed raw data provided either by simulations, prelaunch
tests or data taken from LFI operating in diagnostic mode. All the needed
optimization steps are performed by an automated tool, OCA2, which ends with
optimized parameters and produces a set of statistical indicators, among them
the compression rate Cr and EpsilonQ. For Planck/LFI the requirements are Cr =
2.4 and EpsilonQ <= 10% of the rms of the instrumental white noise. To speed up
the process an analytical model is developed that is able to extract most of
the relevant information on EpsilonQ and Cr as a function of the signal
statistics and the processing parameters. This model will be of interest for
the instrument data analysis. The method was applied during ground tests when
the instrument was operating in conditions representative of flight. Optimized
parameters were obtained and the performance was verified: the required
data rate of 35.5 kbps was achieved while keeping EpsilonQ at a level of
3.8% of the white-noise rms, well within the requirements.
Comment: 51 pages, 13 figures, 3 tables, pdflatex, needs JINST.csl, graphicx, txfonts, rotating; Issue 1.0 10 Nov 2009; Sub. to JINST 23 Jun 09, Accepted 10 Nov 09, Pub.: 29 Dec 09; This is a preprint, not the final version.
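A highly simplified stand-in for the averaging and quantization steps of the chain described above (the parameter names echo the paper's Naver, q and O, but the values, and the omission of the r1/r2 mixing and lossless coding stages, make this an illustrative sketch only):

```python
import numpy as np

rng = np.random.default_rng(3)
sigma = 1.0                          # instrumental white-noise rms
raw = sigma * rng.standard_normal(2 ** 16)

# average Naver consecutive samples, then quantize with step q
# and offset O (the real LFI chain also mixes sky/reference data
# with r1, r2 and applies lossless coding afterwards)
Naver, q, O = 4, 0.1, 0.0
averaged = raw.reshape(-1, Naver).mean(axis=1)
quantized = np.round((averaged - O) / q)
reconstructed = quantized * q + O

# quantization distortion relative to the post-averaging noise rms:
# roughly q / sqrt(12) divided by sigma / sqrt(Naver)
eps_q = np.std(reconstructed - averaged) / np.std(averaged)
print(eps_q)
```

This shows the basic trade the paper optimizes: averaging and a coarser step q both reduce the data rate, but a larger q drives the distortion EpsilonQ up toward the 10% requirement.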
The effect of signal digitisation in CMB experiments
Signal digitisation may produce significant effects in balloon-borne or
space CMB experiments, since the limited bandwidth for data downlink
imposes a large quantisation step q applied on board by the instrument
acquisition chain. In this paper we present a study of the impact of the
quantization error in CMB experiments using, as a working case, simulated data
from the Planck/LFI. At TOD level, the effect of the quantization can be
approximated as a source of nearly normally distributed noise. At map level,
the data quantization alters the noise distribution and the expectation of some
higher order moments. Finally, at the level of power spectra, the quantization
introduces a power excess, that, although related to the instrument and mission
parameters, is weakly dependent on the multipole l at middle and large l and
can be quite accurately subtracted, leaving a residual uncertainty of a few % of
the RMS uncertainty. Only for l<30 is the quantization removal less accurate.
Comment: 15 pages, 5 figures, LaTeX2e, A&A style (aa.cls). Release 1, April 1st 2003. Submitted to A&A for publication, April 1st 2003. Contact
author: [email protected]
An assessment of NASA master directory/catalog interoperability for interdisciplinary study of the global water cycle
The most important issue facing science is understanding global change: its causes, the processes involved, and their consequences. The key to success in this massive Earth science research effort will depend on efficient identification of, and access to, the most relevant data available across the atmospheric, oceanographic, and land sciences. Current mechanisms used by Earth scientists for accessing these data fall far short of meeting this need. As a result, scientists must frequently rely on a priori knowledge and informal person-to-person networks to find relevant data. The Master Directory/Catalog Interoperability Program (MD/CI) undertaken by NASA is an important step in overcoming these problems. The stated goal of the MD project is to enable researchers to efficiently identify, locate, and obtain access to space and Earth science data.