2,707 research outputs found

    Basic data requirements for microwave radiometer systems

    Microwave radiometry has emerged over the last two decades to become an integral part of the field of environmental remote sensing. Numerous investigations have evaluated the use of microwave radiometry for atmospheric, oceanographic, hydrological, and geological applications. Remote sensing of the Earth using microwave radiometry began in 1968 with the Soviet satellite Cosmos 243, which carried four microwave radiometers (Ulaby, 1981). Since then, microwave radiometers have flown on many spacecraft and have been used to infer many physical parameters. Some of the basic concepts of radiometric emission and measurement are discussed. Several radiometer systems are presented, with an overview of their operation. From the description of radiometer operation, the data stream required from the radiometer and the general type of algorithm required for the measurement are discussed.
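In the Rayleigh–Jeans regime that underlies most microwave radiometry, the power a matched antenna delivers is linear in brightness temperature, P = k·T_A·B. A minimal illustrative sketch; the 300 K scene and 100 MHz bandwidth are assumed example values, not figures from the abstract:

```python
K_B = 1.380649e-23  # Boltzmann constant, J/K

def radiometer_power(t_antenna_k, bandwidth_hz):
    """Noise power (W) delivered by a matched antenna at brightness
    temperature T_A over predetection bandwidth B: P = k * T_A * B."""
    return K_B * t_antenna_k * bandwidth_hz

# Example: a 300 K scene observed with a 100 MHz bandwidth (assumed values)
p = radiometer_power(300.0, 100e6)  # ~4.14e-13 W
```

This linearity is what lets a radiometer report scene properties directly as an antenna temperature.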

    Ground data investigations Mt. Lassen, site 56-mission 76

    Microwave radiometry and infrared photography for meteorological data

    Microwave cryogenic thermal-noise standards

    A field-operational waveguide noise standard with a nominal noise temperature of 78.09 ± 0.12 K is calibrated more precisely than before. The calibration technique applies to various disciplines such as microwave radiometry, antenna temperature and loss measurement, and low-noise amplifier performance evaluation.
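Noise standards like this one are commonly used as the cold load in Y-factor measurements of receiver noise temperature. The sketch below applies the standard Y-factor relation; only the 78.09 K standard is taken from the abstract, while the ambient load temperature and measured power ratio are illustrative assumptions:

```python
def y_factor_noise_temperature(t_hot_k, t_cold_k, y):
    """Receiver noise temperature from a Y-factor measurement,
    where Y = P_hot / P_cold:  T_rx = (T_hot - Y*T_cold) / (Y - 1)."""
    return (t_hot_k - y * t_cold_k) / (y - 1.0)

# Example: ambient load at 296 K (assumed), the 78.09 K cryogenic
# standard as the cold load, and an assumed measured ratio Y = 1.8
t_rx = y_factor_noise_temperature(296.0, 78.09, 1.8)  # ~194.3 K
```

The quoted ±0.12 K standard uncertainty propagates almost directly into T_rx, which is why tighter calibration of the standard matters.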

    Performance assessment of time–frequency RFI mitigation techniques in microwave radiometry

    Radio-frequency interference (RFI) signals are a well-known threat to microwave radiometry (MWR) applications. To alleviate this problem, different approaches for RFI detection and mitigation are currently under development. Since RFI signals are man-made, their power tends to be more concentrated in the time–frequency (TF) space than that of naturally emitted noise. The aim of this paper is to assess different TF RFI mitigation techniques in terms of probability of detection, resolution loss (RL), and mitigation performance. Six kinds of RFI signals have been considered: a glitch, a burst of pulses, a wide-band chirp, a narrow-band chirp, a continuous wave, and a wide-band modulation. The results show that the best performance occurs when the transform basis has a shape similar to that of the RFI signal. For the best case, the maximum residual RFI temperature is 14.8 K, and the worst RL is 8.4%. Moreover, the multiresolution Fourier transform technique appears to be a good tradeoff among all the techniques, since it can mitigate all RFI signals under evaluation with a maximum residual RFI temperature of 21 K and a worst RL of 26.3%. Although the obtained results are still far from an acceptable bias of < 1 K for MWR applications, work remains to be done on a combined test using the information gathered simultaneously by all mitigation techniques, which could improve the overall performance of RFI mitigation.
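As a rough illustration of how TF-domain mitigation works in general (this is a generic threshold-blanking sketch, not any of the specific transforms evaluated in the paper), one can flag and zero time–frequency bins whose power greatly exceeds the noise-only level:

```python
import numpy as np

def tf_blank(samples, nfft=256, k_thresh=5.0):
    """Generic time-frequency RFI blanking sketch: FFT the stream in
    nfft-sample blocks, zero any TF bin whose power exceeds k_thresh
    times the median bin power (noise alone has no such outliers),
    then transform back."""
    n_blocks = len(samples) // nfft
    x = np.asarray(samples[: n_blocks * nfft], dtype=float).reshape(n_blocks, nfft)
    spec = np.fft.fft(x, axis=1)
    power = np.abs(spec) ** 2
    spec[power > k_thresh * np.median(power)] = 0.0  # blank flagged bins
    return np.fft.ifft(spec, axis=1).real.reshape(-1)

# Example: Gaussian noise plus a strong continuous-wave interferer
rng = np.random.default_rng(0)
n = rng.standard_normal(4096)
cw = 10.0 * np.cos(2 * np.pi * 0.125 * np.arange(4096))  # hypothetical RFI tone
cleaned = tf_blank(n + cw)
```

A continuous wave occupies a few narrow frequency bins, so this scheme removes it almost entirely; the paper's finding that the mitigation basis should match the RFI shape explains why the same fixed basis handles chirps or bursts less well.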

    Dielectric properties measurements of brown and white adipose tissue in rats from 0.5 to 10 GHz

    Brown adipose tissue (BAT) plays an important role in whole-body metabolism and, with appropriate stimulus, could potentially mediate weight gain and insulin sensitivity. Although imaging techniques are available to detect subsurface BAT, there are currently no viable methods for continuous acquisition of BAT energy expenditure. Microwave (MW) radiometry is an emerging technology that allows the quantification of tissue temperature variations at depths of several centimeters. Such temperature differentials may be correlated with variations in metabolic rate, thus providing a quantitative approach to monitor BAT metabolism. In order to optimize MW radiometry, numerical and experimental phantoms with accurate dielectric properties are required to develop and calibrate radiometric sensors. Thus, we present, for the first time, the characterization of the relative permittivity and electrical conductivity of brown (BAT) and white (WAT) adipose tissues in rats across the MW range 0.5–10 GHz. Measurements were carried out in situ and post mortem in six female rats of approximately 200 g. A Cole–Cole model was used to fit the experimental data into a parametric model that describes the variation of dielectric properties as a function of frequency. Measurements confirm that the dielectric properties of BAT (εr = 14.0–19.4, σ = 0.3–3.3 S/m) are significantly higher than those of WAT (εr = 9.1–11.9, σ = 0.1–1.9 S/m), consistent with the higher water content of BAT.
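For reference, a single-pole Cole–Cole model with a static-conductivity term has the standard form ε*(ω) = ε∞ + Δε/(1 + (jωτ)^(1−α)) + σs/(jωε0). The sketch below evaluates it; the parameter values are illustrative placeholders, not the fitted BAT/WAT parameters from the paper:

```python
import numpy as np

EPS0 = 8.8541878128e-12  # vacuum permittivity, F/m

def cole_cole(freq_hz, eps_inf, delta_eps, tau_s, alpha, sigma_s):
    """Single-pole Cole-Cole complex relative permittivity with a
    static-conductivity term:
    eps*(w) = eps_inf + delta_eps/(1 + (j*w*tau)^(1-alpha)) + sigma_s/(j*w*EPS0)
    """
    jw = 1j * 2 * np.pi * np.asarray(freq_hz, dtype=float)
    return eps_inf + delta_eps / (1 + (jw * tau_s) ** (1 - alpha)) + sigma_s / (jw * EPS0)

# Illustrative placeholder parameters (NOT the fitted BAT/WAT values)
f = np.linspace(0.5e9, 10e9, 5)
eps = cole_cole(f, eps_inf=4.0, delta_eps=12.0, tau_s=12e-12, alpha=0.1, sigma_s=0.05)
```

Fitting measured permittivity and conductivity to this form reduces a broadband sweep to a handful of parameters, which is what makes the model convenient for building numerical phantoms.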

    Feasibility study on application of microwave radiometry to monitor contamination level on insulator materials

    This paper introduces a novel method for monitoring contamination levels on high-voltage insulators based on microwave radiometry. Present contamination monitoring solutions for high-voltage insulators are only effective in predicting flashover risk when the contamination layer has been wetted by rain, fog, or condensation. The challenge arises when pollution accumulates during a dry period prior to a weather change. Under these conditions, flashover can often occur within a short time after wetting and is not predicted by measurements taken during the dry period. The microwave radiometer system described in this paper measures energy emitted from the contamination layer and could provide a safe, reliable, contactless monitoring method that is effective under dry conditions. The relationship between equivalent salt deposit density and radiometer output is described using a theoretical model and verified experimentally using a specially designed X-band radiometer. Results demonstrate that the output from the radiometer can clearly distinguish between different levels of contamination on insulator materials under dry conditions. This novel contamination monitoring method could potentially provide advance warning of the future failure of wet insulators in climates where insulators experience dry conditions for extended periods.

    The Use of Cryogenic HEMT Amplifiers in Wide Band Radiometers

    Advances in device fabrication, modelling, and design techniques have made wide-band, low-noise cryogenic amplifiers available at frequencies up to 106 GHz. Microwave radiometry applications such as those in radio astronomy capitalize on the low noise and large bandwidths of these amplifiers. Radiometers must be carefully designed to preclude sensitivity degradation caused by the small, low-frequency gain fluctuations inherent in these amplifiers.
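The sensitivity penalty from gain fluctuations is captured by the usual total-power radiometer equation, ΔT = T_sys·√(1/(Bτ) + (ΔG/G)²). A small sketch with assumed example numbers (not values from the abstract):

```python
import math

def radiometric_resolution(t_sys_k, bandwidth_hz, tau_s, dg_over_g=0.0):
    """Total-power radiometer sensitivity including a gain-fluctuation term:
    dT = T_sys * sqrt(1/(B*tau) + (dG/G)**2)."""
    return t_sys_k * math.sqrt(1.0 / (bandwidth_hz * tau_s) + dg_over_g ** 2)

# Assumed example: T_sys = 20 K, B = 4 GHz, tau = 1 s
ideal = radiometric_resolution(20.0, 4e9, 1.0)            # ~0.32 mK
with_gain = radiometric_resolution(20.0, 4e9, 1.0, 1e-4)  # ~2 mK: gain term dominates
```

With the large bandwidths a cryogenic HEMT provides, even a 10⁻⁴ fractional gain wander can swamp the ideal radiometric resolution, which is why Dicke switching or other gain-stabilization schemes are used.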

    A lake and sea ice experiment with Skylab microwave radiometry

    There are no author-identified significant results in this report.