
    A Predictive Algorithm For Wetlands In Deep Time Paleoclimate Models

    Methane is a powerful greenhouse gas produced in wetland environments via microbial action in anaerobic conditions. If the location and extent of wetlands are unknown, as for the Earth many millions of years in the past, a model of wetland fraction is required to calculate methane emissions and thus help reduce uncertainty in the understanding of past warm greenhouse climates. Here we present an algorithm for predicting inundated wetland fraction for use in calculating wetland methane emission fluxes in deep-time paleoclimate simulations. The algorithm determines, for each grid cell in a given paleoclimate simulation, the wetland fraction predicted by a nearest-neighbours search of modern-day data in a space described by a set of environmental, climate and vegetation variables. To explore this approach, we first test it for a modern-day climate with variables obtained from observations, and then for an Eocene climate with variables derived from a fully coupled global climate model (HadCM3BL-M2.2). Two independent dynamic vegetation models were used to provide two sets of equivalent vegetation variables, which yielded two different wetland predictions. As a first test, the method, using both vegetation models, satisfactorily reproduces modern-day wetland fraction at a coarse grid resolution similar to those used in paleoclimate simulations. We then applied the method to an early Eocene climate, testing its outputs against the locations of Eocene coal deposits. We predict a global mean monthly wetland fraction area for the early Eocene of 8 to 10 × 10⁶ km² with a corresponding total annual methane flux of 656 to 909 Tg, depending on which of the two dynamic global vegetation models is used to model wetland fraction and methane emission rates. Both values are significantly higher than modern-day estimates of 4 × 10⁶ km² and around 190 Tg (Poulter et al., 2017; Melton et al., 2013).
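    As a rough illustration of the nearest-neighbours idea described above, the following Python sketch predicts wetland fraction for paleoclimate grid cells from modern-day training cells. It is a minimal sketch, not the paper's implementation: the data are synthetic, and the predictor count, the choice of k, and the distance weighting are all assumptions.

    import numpy as np
    from sklearn.neighbors import KNeighborsRegressor

    # Hypothetical modern-day training data: rows are grid cells, columns are
    # environmental, climate and vegetation predictors. In practice these would
    # be standardised so that no variable dominates the distance metric.
    rng = np.random.default_rng(0)
    X_modern = rng.normal(size=(5000, 6))    # predictor variables per cell
    y_modern = rng.uniform(0, 1, size=5000)  # observed wetland fraction per cell

    # Nearest-neighbours regression: each prediction is a distance-weighted
    # average of the wetland fractions of the k closest modern-day cells.
    knn = KNeighborsRegressor(n_neighbors=10, weights="distance")
    knn.fit(X_modern, y_modern)

    # For each grid cell of a paleoclimate simulation (e.g. the Eocene run),
    # predict inundated wetland fraction from the same simulated variables.
    X_paleo = rng.normal(size=(2000, 6))
    wetland_fraction = knn.predict(X_paleo)  # one value in [0, 1] per cell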

    Progress in paleoclimate modeling

    This paper briefly surveys areas of paleoclimate modeling notable for recent progress. New ideas, including hypotheses giving a pivotal role to sea ice, have revitalized the low-order models used to simulate the time evolution of glacial cycles through the Pleistocene, a prohibitive length of time for comprehensive general circulation models (GCMs). In a recent breakthrough, however, GCMs have succeeded in simulating the onset of glaciations. This occurs at times (most recently, 115 kyr B.P.) when high northern latitudes are cold enough to maintain a snow cover and tropical latitudes are warm, enhancing the moisture source. More generally, the improvement in models has allowed simulations of key periods such as the Last Glacial Maximum and the mid-Holocene that compare more favorably and in more detail with paleoproxy data. These models now simulate ENSO cycles, and some of them have been shown to reproduce the reduction of ENSO activity observed in the early to middle Holocene. Modeling studies have demonstrated that the reduction is a response to the altered orbital configuration at that time. An urgent challenge for paleoclimate modeling is to explain and to simulate the abrupt changes observed during glacial epochs (i.e., Dansgaard-Oeschger cycles, Heinrich events, and the Younger Dryas). Efforts have begun to simulate the last millennium. Over this time the forcing due to orbital variations is less important than the radiance changes due to volcanic eruptions and variations in solar output. Simulations of these natural variations test the models relied on for future climate change projections. They provide better estimates of the internal and naturally forced variability at centennial time scales, elucidating how unusual the recent global temperature trends are.

    A Methodology for Robust Multiproxy Paleoclimate Reconstructions and Modeling of Temperature Conditional Quantiles

    Great strides have been made in the field of reconstructing past temperatures based on models relating temperature to temperature-sensitive paleoclimate proxies. One of the goals of such reconstructions is to assess whether the current climate is anomalous in a millennial context. These regression-based approaches model the conditional mean of the temperature distribution as a function of paleoclimate proxies (or vice versa). Some recent work in the area has considered methods that help reduce the uncertainty inherent in such statistical paleoclimate reconstructions, with the ultimate goal of improving the confidence that can be attached to such endeavors. A second important scientific focus is the development of forward models for proxies, the goal of which is to understand how paleoclimate proxies are driven by temperature and other environmental variables. In this paper we introduce novel statistical methodology for (1) quantile regression with autoregressive residual structure, (2) estimation of the corresponding model parameters, (3) development of a rigorous framework for specifying uncertainty estimates of quantities of interest, yielding (4) statistical byproducts that address the two scientific foci discussed above. Our statistical methodology demonstrably produces a more robust reconstruction than is possible with conditional-mean-fitting methods. Our reconstruction shares some common features with past reconstructions, but also yields useful new insights. More importantly, we are able to demonstrate a significantly smaller uncertainty than that from previous regression methods. In addition, the quantile regression component allows us to model, in a more complete and flexible way than least squares, the conditional distribution of temperature given proxies. This relationship can be used to inform forward models of how proxies are driven by temperature.
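    The flavor of the conditional-quantile modeling can be conveyed with a short Python sketch. It fits plain quantile regression of synthetic temperatures on a synthetic proxy via statsmodels; it deliberately omits the autoregressive residual structure that is the paper's novel contribution, and all data and names here are illustrative.

    import numpy as np
    import statsmodels.api as sm

    # Hypothetical proxy/temperature series with heavy-tailed noise, so the
    # conditional quantiles carry more information than the mean alone.
    rng = np.random.default_rng(1)
    proxy = rng.normal(size=500)
    temp = 0.8 * proxy + rng.standard_t(df=4, size=500)

    X = sm.add_constant(proxy)
    for q in (0.1, 0.5, 0.9):
        # Each fit models a different conditional quantile of temperature
        # given the proxy, rather than only the least-squares mean.
        fit = sm.QuantReg(temp, X).fit(q=q)
        print(f"q={q}: intercept={fit.params[0]:.3f}, slope={fit.params[1]:.3f}")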

    Anomaly Detection in Paleoclimate Records using Permutation Entropy

    Permutation entropy techniques can be useful in identifying anomalies in paleoclimate data records, including noise, outliers, and post-processing issues. We demonstrate this using weighted and unweighted permutation entropy of water-isotope records in a deep polar ice core. In one region of these isotope records, our previous calculations revealed an abrupt change in the complexity of the traces: specifically, in the amount of new information that appeared at every time step. We conjectured that this effect was due to noise introduced by an older laboratory instrument. In this paper, we validate that conjecture by re-analyzing a section of the ice core using a more advanced version of the laboratory instrument. The anomalous noise levels are absent from the permutation entropy traces of the new data. In other sections of the core, we show that permutation entropy techniques can be used to identify anomalies in the raw data that are not associated with climatic or glaciological processes, but rather with effects occurring during field work, laboratory analysis, or data post-processing. These examples make it clear that permutation entropy is a useful forensic tool for identifying sections of data that require targeted re-analysis, and can even be useful in guiding that analysis.
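    For readers unfamiliar with the measure, here is a minimal Python sketch of unweighted permutation entropy over a sliding window; the window length, word length m, and step size are illustrative assumptions, not the paper's settings.

    import math
    from collections import Counter

    import numpy as np

    def permutation_entropy(x, m=4, delay=1):
        # Count the ordinal pattern (ranking of values) of each length-m word,
        # then take the Shannon entropy of the pattern distribution,
        # normalised by log2(m!) so the result lies in [0, 1].
        n = len(x) - (m - 1) * delay
        patterns = Counter(
            tuple(np.argsort(x[i:i + m * delay:delay])) for i in range(n)
        )
        probs = np.array(list(patterns.values()), dtype=float) / n
        return float(-np.sum(probs * np.log2(probs)) / math.log2(math.factorial(m)))

    # Sliding-window entropy trace along a (here synthetic) isotope record:
    # instrument noise or processing artifacts show up as abrupt shifts.
    record = np.random.default_rng(2).normal(size=2000)
    trace = [permutation_entropy(record[i:i + 256])
             for i in range(0, len(record) - 256, 64)]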