Interannual Variation in Biomass Burning and Fire Seasonality Derived from Geostationary Satellite Data Across the Contiguous United States from 1995 to 2011
Wildfires exhibit a strong seasonality that is driven by climatic factors and human activities. Although fire seasonality is commonly determined using burned area and fire frequency, it can also be quantified using biomass consumption estimates that directly represent biomass loss (a combination of the area burned and the fuel loading). Therefore, in this study a long-term data set of biomass consumed was derived from geostationary satellite data to explore the interannual variation in fire seasonality and the possible impacts of climate change and land management practices across the Contiguous United States (CONUS). Specifically, daily biomass consumed was derived using the fire radiative power retrieved from the Geostationary Operational Environmental Satellite (GOES) series, with a pixel size of 4–10 km, from 1995 to 2011. Annual fire seasonality metrics, including the fire season duration and the timing of the start, peak, and end of the fire season, along with their interannual variation and trends, were derived from the 17-year biomass-consumed record. These metrics were associated with climatic factors to examine drivers and mediators of fire seasonality. The results indicate that biomass consumed significantly increased at a rate of 2.87 Tg/yr; however, the derived fire season duration exhibited a shortening trend in various states over the western CONUS and no significant trend in most other regions. This suggests that the frequency of extreme fire events has increased, perhaps in association with an observed increase in extreme weather conditions. Further, both the start and the end of the fire season shifted earlier (1.5–5 d/yr) in various eastern states, although a late shift occurred in Arizona and Oregon. The interannual variation and trend in fire seasonality were more strongly related to temperature in the western CONUS and to precipitation in the southeast.
The Palmer Drought Severity Index was found to effectively reflect interannual variations in total biomass consumed, although it was poorly correlated with the fire seasonality metrics. The results indicate that across the CONUS, the spatial patterns of the start, peak, and end of the fire season shift regularly across regions in response to latitudinal gradients of temperature variation.
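The abstract does not spell out how the seasonality metrics are computed from the daily biomass-consumed record; one common approach defines them from cumulative percentiles of the annual total. A minimal sketch of that idea (the 10%/90% thresholds and the metric definitions are assumptions, not taken from the paper):

```python
import numpy as np

def fire_season_metrics(daily_biomass, start_frac=0.1, end_frac=0.9):
    """Derive fire-season timing from one year of daily biomass consumed.

    Assumed percentile-based definition: the season starts when cumulative
    consumption first reaches `start_frac` of the annual total, ends at
    `end_frac`, and peaks on the day of maximum daily consumption.
    Returned day-of-year indices are 1-based.
    """
    daily = np.asarray(daily_biomass, dtype=float)
    cum = np.cumsum(daily)
    start = int(np.searchsorted(cum, start_frac * cum[-1])) + 1
    end = int(np.searchsorted(cum, end_frac * cum[-1])) + 1
    peak = int(np.argmax(daily)) + 1
    return {"start": start, "peak": peak, "end": end, "duration": end - start}
```

Applying this per year to a 17-year record, and regressing each metric on year, would yield the kind of start/end/duration trends the study reports.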
Evaluation of the Multi-Angle Implementation of Atmospheric Correction (MAIAC) Aerosol Algorithm through Intercomparison with VIIRS Aerosol Products and AERONET
The Multi-Angle Implementation of Atmospheric Correction (MAIAC) algorithm is under evaluation for use in conjunction with the Geostationary Coastal and Air Pollution Events (GEO-CAPE) mission. Column aerosol optical thickness (AOT) data from MAIAC are compared against corresponding data from the Visible Infrared Imaging Radiometer Suite (VIIRS) instrument over North America during 2013. Product coverage and retrieval strategy are reviewed, along with regional variations in AOT, through comparison of both matched and unmatched seasonally gridded data. MAIAC shows extended coverage over parts of the continent when compared to VIIRS, owing to its pixel selection process and its ability to retrieve aerosol information over brighter surfaces. To estimate data accuracy, both products are compared with AERONET Level 2 measurements to determine the amount of error present and to discover whether there is any dependency on viewing geometry and/or surface characteristics. Results suggest that MAIAC performs well over this region, with a relatively small bias of −0.01; however, there is a tendency toward greater negative biases over bright surfaces and at larger scattering angles. Additional analysis over an expanded area and a longer time period is likely needed for a comprehensive assessment of whether the product's capability over the Western Hemisphere meets the levels of accuracy needed for aerosol monitoring.
Spatio-temporal Associations Between GOES Aerosol Optical Depth Retrievals and Ground-Level PM2.5
We assess the strength of association between aerosol optical depth (AOD) retrievals from the GOES Aerosol/Smoke Product (GASP) and ground-level fine particulate matter (PM2.5) to evaluate AOD as a proxy for PM2.5 in the United States. GASP AOD is retrieved from a geostationary platform and therefore provides dense temporal coverage, with half-hourly observations every day, in contrast to once-per-day snapshots from polar-orbiting satellites. However, GASP AOD is based on a less sophisticated instrument and retrieval algorithm. We find that correlations between GASP AOD and PM2.5 over time at fixed locations are reasonably high, except in the winter and in the western U.S. Correlations over space at fixed times are lower. Simple averaging over time actually reduces correlations over space dramatically, but statistical calibration allows averaging over time that produces strong correlations. These results and the data density of GASP AOD highlight its potential to help improve exposure estimates for epidemiological analyses. On average, 40% of days in a month have a GASP AOD retrieval, compared to 14% for MODIS and 4% for MISR. Furthermore, GASP AOD has been retrieved since November 1994, providing the possibility of a long-term record that pre-dates the availability of most PM2.5 monitoring data and other satellite instruments.
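The "statistical calibration" step is not detailed in the abstract; one simple reading is an ordinary least-squares map from AOD to PM2.5, fitted on matched AOD–monitor pairs and applied before temporal averaging. A hypothetical sketch of that idea (the linear form and the function name are assumptions, not the method actually used with GASP):

```python
import numpy as np

def calibrated_average(aod_series, aod_matched, pm25_matched):
    """Fit a linear AOD -> PM2.5 map on matched pairs, then average the
    calibrated AOD series over time (instead of averaging raw AOD).

    Illustration only: a real calibration would likely vary by site,
    season, and meteorology.
    """
    slope, intercept = np.polyfit(np.asarray(aod_matched, dtype=float),
                                  np.asarray(pm25_matched, dtype=float), 1)
    calibrated = slope * np.asarray(aod_series, dtype=float) + intercept
    return float(np.mean(calibrated))
```

Averaging in calibrated (PM2.5-like) units rather than raw AOD units is one way the spatial correlation of time-averaged fields could be preserved.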
Near-Real-Time Global Biomass Burning Emissions Product from Geostationary Satellite Constellation
Near-real-time estimates of biomass burning emissions are crucial for air quality monitoring and forecasting. We present here the first near-real-time global biomass burning emissions product from geostationary satellites (GBBEP-Geo), produced from satellite-derived fire radiative power (FRP) for individual fire pixels. Specifically, the FRP is retrieved using WF_ABBA V65 (the Wildfire Automated Biomass Burning Algorithm) from a network of multiple geostationary satellites. The network consists of two Geostationary Operational Environmental Satellites (GOES) operated by the National Oceanic and Atmospheric Administration, the Meteosat Second Generation satellite (Meteosat-9) operated by the European Organisation for the Exploitation of Meteorological Satellites, and the Multifunctional Transport Satellite (MTSAT) operated by the Japan Meteorological Agency. These satellites observe wildfires at intervals of 15–30 min. Because of the impacts of sensor saturation, cloud cover, and background surface, the FRP values are generally not observed continuously. The missing observations are simulated by combining the available instantaneous FRP observations within a day with a set of representative climatological diurnal patterns of FRP for various ecosystems. Finally, the simulated diurnal variation in FRP is applied to quantify biomass combustion and emissions in individual fire pixels with a latency of 1 day. By analyzing global patterns in hourly biomass burning emissions in 2010, we find that peak fire season varied greatly and that annual wildfires burned 1.33 × 10¹² kg of dry mass and released 1.27 × 10¹⁰ kg of PM2.5 (particulate matter with diameter <2.5 μm) and 1.18 × 10¹¹ kg of CO globally (excluding most parts of boreal Asia, the Middle East, and India because of no coverage from geostationary satellites). The biomass burning emissions were mostly released from forest and savanna fires in Africa, South America, and North America.
Evaluation of the emission results reveals that the GBBEP-Geo estimates are comparable with other FRP-derived estimates in Africa, while they are generally smaller than most other global products derived from burned area and fuel loading. However, the daily emissions estimated from GOES FRP over the United States are generally consistent with those modeled from GOES burned area and MODIS (Moderate Resolution Imaging Spectroradiometer) fuel loading, with an overall bias of 5.7% and a correlation slope of 0.97 ± 0.2. It is expected that near-real-time hourly emissions from GBBEP-Geo can provide a crucial component for atmospheric and chemical transport modelers to forecast air quality and weather conditions.
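The gap-filling and conversion chain described above (scale a climatological diurnal FRP shape to the available observations, integrate to fire radiative energy, convert to dry mass burned and species emissions) can be sketched as follows. This is a simplified illustration, not the operational GBBEP-Geo code; the least-squares scaling and the Wooster et al. (2005) combustion rate of 0.368 kg dry mass per MJ are assumptions about the details.

```python
import numpy as np

COMBUSTION_RATE = 0.368  # kg dry mass per MJ of FRE (Wooster et al. 2005; assumed here)

def daily_fire_emissions(frp_obs_mw, obs_hours, diurnal_shape, ef_g_per_kg):
    """Fill unobserved hours by scaling a climatological 24-h FRP shape to
    the available observations, integrate to fire radiative energy (FRE),
    and convert to dry mass burned and a species emission.

    frp_obs_mw    : observed FRP values (MW) at the hours in `obs_hours`
    obs_hours     : hour-of-day indices (0-23) with valid observations
    diurnal_shape : 24 relative weights for the ecosystem's diurnal cycle
    ef_g_per_kg   : emission factor (g of species per kg dry mass burned)
    """
    shape = np.asarray(diurnal_shape, dtype=float)
    obs = np.asarray(frp_obs_mw, dtype=float)
    s = shape[np.asarray(obs_hours)]
    scale = float(s @ obs) / float(s @ s)        # least-squares scale factor
    frp_filled_mw = scale * shape                # simulated full diurnal FRP
    fre_mj = float(frp_filled_mw.sum()) * 3600.0  # 1 MW over 1 h = 3600 MJ
    dry_mass_kg = COMBUSTION_RATE * fre_mj
    species_kg = dry_mass_kg * ef_g_per_kg / 1000.0
    return dry_mass_kg, species_kg
```

Summing the per-pixel results over a grid would give the hourly-to-daily emission fields the product distributes.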
Sensitivity of Mesoscale Modeling of Smoke Direct Radiative Effect to the Emission Inventory: a Case Study in Northern Sub-Saharan African Region
An ensemble approach is used to examine the sensitivity of smoke loading and the smoke direct radiative effect in the atmosphere to uncertainties in smoke emission estimates. Seven different fire emission inventories are applied independently to the WRF-Chem model (v3.5) with the same model configuration (excluding dust and other emission sources) over the northern sub-Saharan African (NSSA) biomass-burning region. Results for November and February 2010 are analyzed, respectively representing the start and end of the biomass burning season in the study region. For February 2010, estimates of total smoke emission vary by a factor of 12, but differences of only a factor of 7 or less are found in the simulated regional (15°W–42°E, 13°S–17°N) and monthly averages of column PM2.5 loading, surface PM2.5 concentration, aerosol optical depth (AOD), smoke radiative forcing at the top of the atmosphere and at the surface, and air temperature at 2 m and at 700 hPa. The smaller differences in these simulated variables may reflect atmospheric diffusion and deposition effects that dampen the large differences in smoke emissions, which are highly concentrated in areas much smaller than the regional domain of the study. Indeed, at the local scale, large differences (up to a factor of 33) persist in simulated smoke-related variables and radiative effects, including the semi-direct effect. Similar results are also found for November 2010, despite differences in meteorology and fire activity. Hence, biomass burning emission uncertainties have a large influence on the reliability of model simulations of atmospheric aerosol loading, transport, and radiative impacts, and this influence is largest at local and hourly-to-daily scales. Accurate quantification of smoke effects on regional climate and air quality requires further reduction of emission uncertainties, particularly for regions of high fire concentrations such as NSSA.
An Evaluation of Advanced Baseline Imager Fire Radiative Power Based Wildfire Emissions Using Carbon Monoxide Observed by the Tropospheric Monitoring Instrument Across the Conterminous United States
Biomass-burning emissions (BBE) profoundly affect climate and air quality. BBE have been estimated using various methods, including satellite-based fire radiative power (FRP). However, BBE estimates show very large variability, and the accuracy of emissions estimation is poorly understood due to the lack of good reference data. We evaluated fire emissions estimated using FRP from the Advanced Baseline Imager (ABI) on GOES-R (Geostationary Operational Environmental Satellite-R) by comparing them with Sentinel-5 Precursor TROPOspheric Monitoring Instrument (TROPOMI) carbon monoxide (CO) observations over 41 wildfires across the United States from July 2018 to October 2019. The ABI FRP-based CO and TROPOMI CO emissions were significantly correlated and showed very good agreement, with a coefficient of determination of 0.94 and an accuracy of 13–18%. We further report a CO emission coefficient of 29.92 ± 2.39 g MJ⁻¹ based on ABI FRP and TROPOMI CO, which can be used to directly estimate BBE from satellite-observed FRP. Based on this CO emission coefficient and ABI FRP, we estimated a monthly mean CO emission of 596 Gg across the conterminous United States for June–September 2018.
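Applying the reported coefficient is a one-line conversion from fire radiative energy to CO mass. A minimal sketch (the function name and unit handling are mine; only the central coefficient value is used, ignoring its ±2.39 uncertainty):

```python
def co_emission_kg(fre_mj, coef_g_per_mj=29.92):
    """Estimate CO emission (kg) from fire radiative energy (MJ) using the
    reported ABI/TROPOMI CO emission coefficient of 29.92 g MJ^-1."""
    return fre_mj * coef_g_per_mj / 1000.0
```

For example, a fire releasing 10⁶ MJ of FRE would emit roughly 2.99 × 10⁴ kg of CO under this coefficient.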