Analysis of Suomi-NPP VIIRS Vignetting Functions Based on Yaw Maneuver Data
The Suomi-NPP Visible Infrared Imager Radiometer Suite (VIIRS) reflective bands are calibrated on-orbit via regular solar observations, transmitted through a solar attenuation screen (SAS) and diffusely reflected off a Spectralon® panel. The degradation of the Spectralon panel BRDF due to UV exposure is tracked with a ratioing radiometer, the Solar Diffuser Stability Monitor (SDSM), which compares near-simultaneous observations of the panel with direct observations of the sun (through a separate attenuation screen). On-orbit, the vignetting functions of both attenuation screens are most easily measured when the satellite performs a series of yaw maneuvers over a short period of time, thereby covering the yearly angular variation of solar observations in a couple of days. Because the SAS is fixed, only the product of the screen transmission and the panel BRDF was measured. Moreover, this product was measured by both the VIIRS detectors and the SDSM detectors (albeit at different reflectance angles off the Spectralon panel). The SDSM screen is also fixed; in this case, the screen transmission was measured directly. Corrections for instrument drift and degradation, solar geometry, and spectral effects were taken into consideration. The resulting vignetting functions were then compared to the pre-launch measurements as well as to models based on screen geometry.
Calibration of VIIRS F1 Sensor Fire Detection Band Using Lunar Observations
The Visible Infrared Imager Radiometer Suite (VIIRS) Flight 1 (F1) sensor includes a fire detection band at roughly 4 microns. This spectral band has two gain states; fire detection occurs in the low gain state, above approximately 345 K. The thermal bands normally rely on an on-board blackbody for on-orbit calibration. However, as the maximum temperature of this blackbody is 315 K, the low gain state of the 4 micron band cannot be calibrated in the same manner as the rest of the thermal bands. Regular observations of the moon provide an alternative calibration source. The lunar surface temperature has recently been mapped by the DIVINER sensor on the LRO platform. The periodic on-board high gain calibration, along with the DIVINER surface temperatures, was used to determine the emissivity and solar reflectance of the lunar surface at 4 microns; these factors and the lunar data were then used to fit the low gain calibration coefficients of the 4 micron band. Furthermore, the emissivity of the lunar surface is well known near 8.5 microns due to the Christiansen feature (an emissivity maximum associated with Si-O stretching vibrations), and the solar reflectance there is negligible. Thus, the 8.5 micron band is used for relative calibration against the 4 micron band to de-trend any temporal variations. In addition, the remaining thermal bands were analyzed in a similar fashion, with both calculated emissivities and solar reflectances produced.
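The two-component radiance model implied above — thermal emission plus reflected sunlight at 4 μm — can be sketched as follows. This is a minimal illustration, not the paper's retrieval: the emissivity, reflectance, and solar-irradiance values are placeholder assumptions, not the DIVINER-fitted quantities.

```python
import math

# Physical constants (SI)
H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s
K = 1.380649e-23     # Boltzmann constant, J/K

def planck_radiance(wl_um, temp_k):
    """Blackbody spectral radiance at wavelength wl_um (microns),
    returned in W m^-2 sr^-1 um^-1."""
    wl = wl_um * 1e-6  # microns -> metres
    b = (2.0 * H * C**2 / wl**5) / math.expm1(H * C / (wl * K * temp_k))
    return b * 1e-6    # per metre of wavelength -> per micron

def lunar_radiance_4um(temp_k, emissivity=0.95, reflectance=0.05,
                       solar_irr_4um=8.5):
    """Emitted plus reflected lunar radiance components at 4 um.

    emissivity, reflectance, and solar_irr_4um (solar spectral
    irradiance at 4 um and 1 AU, W m^-2 um^-1) are illustrative
    placeholder values. The reflected term uses a Lambertian
    approximation: rho * E_sun / pi.
    """
    thermal = emissivity * planck_radiance(4.0, temp_k)
    reflected = reflectance * solar_irr_4um / math.pi
    return thermal, reflected
```

Under these assumed values, the thermal term dominates by nearly two orders of magnitude at dayside temperatures near 390 K, which is why the solar-reflectance contribution mainly matters for cooler portions of the lunar disk.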
Comparison of Detector-Based and Source-Based Absolute Radiance Standards
A detailed comparison of two separate radiometric calibration standards was conducted with analysis of error sources for each. One is a detector based standard, with radiance traceable to fundamental units using the electrical substitution method. The other is based on a calibrated field emission lamp (FEL) generating blackbody radiation. This study was motivated by the discontinuance of the FEL lamps by the manufacturer and a desire to calibrate sensors using non-blackbody spectral profiles. Three spectrometers were calibrated simultaneously by both methods to separate spectrometer artifacts from differences in the radiometric standards and error in the irradiance to radiance conversion procedure needed for the FEL. Conducting this study now is important to provide continuity between the extensive prior FEL based calibration database with a replacement method while calibrated FEL lamps are still available.
This also begins a longer-term repeatability study of the spectrometers used in this work for evaluation as calibrated transfer standards, usable with both monochromatic and broad-spectrum radiance sources. Eliminating the tie to the FEL blackbody spectrum with a detector-based standard allows for flexibility in the illumination source. In particular, for Earth science sensors intended for use with sunlit scenes, augmenting traditional tungsten halogen lighting with blue and ultraviolet light emitting diodes allows for a better match to the solar spectrum during laboratory testing.
Landsat 9 Thermal Infrared Sensor 2 Subsystem-Level Spectral Test Results
Results from the Thermal Infrared Sensor 2 (TIRS-2) prelaunch spectral characterization at the telescope and detector subsystem level are presented. The derived relative spectral response (RSR) shape is expected to be very similar to the instrument-level spectral response and provides an initial estimate of the RSR and of its differences from the component-level RSR measurements. Such differences were observed on TIRS-1 and are likely a result of the angular dependence of the spectral response of the detector. The subsystem RSR measurements also provide an opportunity for a preliminary assessment of the spectral requirements. Final requirements verification will be performed during future thermal vacuum environmental testing with the fully assembled TIRS-2 instrument.
Extreme UV QSOs
We present a sample of spectroscopically confirmed QSOs with FUV-NUV color (as measured by GALEX photometry) bluer than canonical QSO templates and than the majority of known QSOs. We analyze their FUV to NIR colors, luminosities, and optical spectra. The sample includes a group of 150 objects at low redshift (z < 0.5), and a group of 21 objects with redshift 1.7 < z < 2.6. For the low-redshift objects, the "blue" FUV-NUV color may be caused by enhanced Lyα emission, since Lyα transits the GALEX FUV band from z = 0.1 to z = 0.47. Synthetic QSO templates constructed with Lyα up to 3 times stronger than in standard templates match the observed UV colors of our low-redshift sample. The Hα emission increases, and the optical spectra become bluer, with increasing absolute UV luminosity. The UV-blue QSOs at redshift about 2, where the GALEX bands sample restframe wavelengths of about 450-590 Å (FUV) and about 590-940 Å (NUV), are fainter in NUV than the average of UV-normal QSOs at similar redshift, while they have comparable luminosities in other bands. Therefore we speculate that their observed FUV-NUV color may be explained by a combination of a steep flux rise towards short wavelengths and dust absorption below the Lyman limit, such as from small grains or crystalline carbon. The ratio of Lyα to CIV could be measured in 10 objects; it is higher (30% on average) than for UV-normal QSOs, and close to the value expected for shock or collisional ionization. Full version available from the author's web site: http://dolomiti.pha.jhu.edu/papers/2009_AJ_Extreme_UV_QSOs.pdf
Comment: Astronomical Journal, in press
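The quoted redshift window for Lyα in the GALEX FUV band (z ≈ 0.1 to 0.47) follows directly from the band edges; a minimal sketch, assuming nominal FUV band limits of 1344-1786 Å (the range quoted for GALEX FUV elsewhere in this collection):

```python
LYA_REST = 1215.67  # Lyman-alpha rest wavelength, Angstrom
FUV_MIN, FUV_MAX = 1344.0, 1786.0  # assumed GALEX FUV band edges, Angstrom

def lya_redshift_window(band_min=FUV_MIN, band_max=FUV_MAX):
    """Redshift range over which redshifted Ly-alpha,
    lambda_obs = LYA_REST * (1 + z), falls inside the bandpass."""
    return band_min / LYA_REST - 1.0, band_max / LYA_REST - 1.0
```

With these band edges the window comes out to roughly z = 0.11-0.47, consistent with the range stated in the abstract.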
GOES-17 Advanced Baseline Imager Performance Recovery Summary
The 17th Geostationary Operational Environmental Satellite (GOES-17) was launched on 1 March 2018. The Advanced Baseline Imager (ABI) is the primary instrument on the GOES-R series for weather and environmental monitoring. The GOES-17 ABI (flight model 2) experienced a degradation in its thermal system that limits ABI's ability to shed solar heat load. This limitation resulted in a significant reduction in performance after initial turn-on, with only 3 of 16 spectral channels expected to be available for much of the year. A combined government/vendor team was tasked with optimizing the operation of ABI to recapture as much performance as possible. By modifying the operational configuration and sensor parameters, the team was able to regain over 97% imaging capability. This was accomplished by taking advantage of the considerably flexible nature of ABI's design to adapt its configuration to the new reality and improve capabilities for many of ABI's subsystems. The significant differences in operational configuration, sensor parameter optimization, and algorithm optimization will be discussed, as well as their impact on performance and data availability.
JPSS-1 VIIRS Radiometric Characterization and Calibration Based on Pre-Launch Testing
The Visible Infrared Imaging Radiometer Suite (VIIRS) on board the first Joint Polar Satellite System (JPSS) completed its sensor-level testing in December 2014. The JPSS-1 (J1) mission is scheduled to launch in December 2016 and will be very similar to the Suomi National Polar-orbiting Partnership (SNPP) mission. The VIIRS instrument has 22 spectral bands covering the spectrum between 0.4 and 12.6 μm. It is a cross-track scanning radiometer capable of providing global measurements twice daily, through observations at two spatial resolutions, 375 m and 750 m at nadir for the imaging and moderate bands, respectively. This paper briefly describes the J1 VIIRS characterization and calibration performance and the methodologies executed during the pre-launch testing phases by the government independent team to generate the at-launch baseline radiometric performance and the metrics needed to populate the sensor data record (SDR) Look-Up Tables (LUTs). This paper also provides an assessment of the sensor pre-launch radiometric performance, such as the sensor signal-to-noise ratios (SNRs), radiance dynamic range, reflective and emissive band calibration performance, polarization sensitivity, spectral performance, response-versus-scan (RVS), and scattered light response. A set of performance metrics generated during the pre-launch testing program is compared to both the VIIRS sensor specification and the SNPP VIIRS pre-launch performance.
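As an aside on the SNR metric mentioned above: sensor SNR is commonly modeled as signal electrons divided by the quadrature sum of photon shot noise, dark-current noise, and read noise. The sketch below is a generic illustration with entirely hypothetical parameter values; it does not reproduce the VIIRS test methodology or its numbers.

```python
import math

def snr(radiance, responsivity=1000.0, dark_e=50.0, read_noise_e=20.0):
    """Illustrative shot-noise-limited SNR model.

    radiance is converted to signal electrons via an assumed
    responsivity (e- per radiance unit); the noise combines photon
    shot noise (= sqrt of signal electrons), dark-current shot noise,
    and read noise in quadrature. All parameter values here are
    hypothetical, chosen only to demonstrate the model's shape.
    """
    signal = responsivity * radiance
    noise = math.sqrt(signal + dark_e + read_noise_e**2)
    return signal / noise
```

In this regime SNR grows roughly as the square root of the signal once photon noise dominates, which is why SNR requirements are typically specified at a particular reference ("typical") radiance level.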
LANDSAT 9 Thermal Infrared Sensor 2 Characterization Plan Overview
Landsat 9 will continue the Landsat data record into its fifth decade with a near-copy build of Landsat 8; launch is scheduled for December 2020. The two instruments on Landsat 9 are the Thermal Infrared Sensor-2 (TIRS-2) and the Operational Land Imager-2 (OLI-2). TIRS-2 is a two-channel pushbroom imager with a 15-degree field of view that will have a 16-day measurement cadence from its nominal 705-km orbit altitude. Its carefully developed instrument performance requirements and associated characterization plan will result in stable and well-understood science-quality imagery that will be used for environmental, economic, and legal applications. This paper presents a summary of the plan for TIRS-2 prelaunch characterization at the component, subsystem, and instrument levels.
Landsat 9 TIRS-2 Performance Results Based on Subsystem-Level Testing
Landsat 9 is the next in the series of Landsat satellites and has a complement of two pushbroom imagers: the Operational Land Imager-2 (OLI-2), which samples the solar reflective spectrum with nine channels, and the Thermal Infrared Sensor-2 (TIRS-2), which samples the thermal infrared spectrum with two channels. The first builds of these sensors, OLI and TIRS, were launched on Landsat 8 in 2013, and Landsat 9 is expected to launch in December 2020. TIRS-2 is designed and built to continue the Landsat data record and satisfy the needs of the remote sensing community. There are two sets of requirements considered for planning the component, subsystem, and instrument level tests for TIRS-2: performance requirements and Special Calibration Test Requirements (SCTRs). The performance requirements specify key spectral, spatial, radiometric, and operational parameters of TIRS-2, while the SCTRs specify how the instrument is tested. Several requirements can only be verified at the instrument level, but many performance metrics can be assessed earlier in prelaunch testing at the subsystem level. A test program called TIRS Imaging Performance and Cryoshell Evaluation (TIPCE) was developed to characterize TIRS-2 spectral, spatial, and scattered-light rejection performance at the telescope and detector subsystem level. There were three thermal vacuum campaigns in TIPCE, which occurred from November 2017 to March 2018. This work shows results of the TIPCE data analysis, which provide confidence that key requirements will be met at the instrument level with a few minor waivers. A full complement of performance testing will be done at the TIRS-2 instrument level for final verification in late 2018 through spring 2019.
The Recent Star Formation in NGC 6822: an Ultraviolet Study
We characterize the star formation in the low-metallicity galaxy NGC 6822 over the past few hundred million years, using GALEX far-UV (FUV, 1344-1786 Å) and near-UV (NUV, 1771-2831 Å) imaging, and ground-based Hα imaging. From the GALEX FUV image, we define 77 star-forming (SF) regions with area > 860 pc^2 and surface brightness <= 26.8 mag (AB) arcsec^-2, within 0.2 deg (1.7 kpc) of the center of the galaxy. We estimate the extinction by interstellar dust in each SF region from resolved photometry of the hot stars it contains: E(B-V) ranges from the minimum foreground value of 0.22 mag up to 0.66 ± 0.21 mag. The integrated FUV and NUV photometry, compared with stellar population models, yields ages of the SF complexes up to a few hundred Myr, and masses from 2x10^2 Msun to 1.5x10^6 Msun. The derived ages and masses strongly depend on the assumed type of interstellar selective extinction, which we find to vary across the galaxy. The total mass of the FUV-defined SF regions translates into an average star formation rate (SFR) of 1.4x10^-2 Msun/yr over the past 100 Myr, and SFR = 1.0x10^-2 Msun/yr in the most recent 10 Myr. The latter is in agreement with the value that we derive from the Hα luminosity, SFR = 0.008 Msun/yr. The SFR in the most recent epoch becomes higher if we add the SFR = 0.02 Msun/yr inferred from far-IR measurements, which trace star formation still embedded in dust (age <= a few Myr).
Comment: Accepted for publication in ApJ, 21 pages, 6 figures, 3 tables
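An Hα luminosity is conventionally converted to an SFR with the Kennicutt (1998) calibration, SFR [Msun/yr] = 7.9x10^-42 L(Hα) [erg/s]; the abstract does not state which calibration was used, so the sketch below is illustrative only. The input luminosity of about 1x10^39 erg/s is an inference from the quoted SFR = 0.008 Msun/yr, not a value taken from the paper.

```python
def sfr_from_halpha(l_halpha_erg_s):
    """Star formation rate (Msun/yr) from an H-alpha luminosity
    (erg/s), using the standard Kennicutt (1998) calibration:
    SFR = 7.9e-42 * L(Ha). Assumes no internal extinction
    correction beyond what is already applied to L(Ha)."""
    return 7.9e-42 * l_halpha_erg_s
```

For example, sfr_from_halpha(1.0e39) returns about 0.008 Msun/yr, matching the Hα-based value quoted in the abstract.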