geoChronR – an R package to model, analyze, and visualize age-uncertain data
Chronological uncertainty is a hallmark of the paleoenvironmental sciences and geosciences. While many tools have been made available to researchers to quantify age uncertainties suitable for various settings and assumptions, disparate tools and output formats often discourage integrative approaches. In addition, associated tasks like propagating age-model uncertainties to subsequent analyses, and visualizing the results, have received comparatively little attention in the literature and available software. Here, we describe geoChronR, an open-source R package to facilitate these tasks. geoChronR is built around an emerging data standard (Linked PaleoData, or LiPD) and offers access to four popular age-modeling techniques (Bacon, BChron, OxCal, BAM). The output of these models is used to conduct ensemble data analysis, quantifying the impact of chronological uncertainties on common analyses like correlation, regression, principal component, and spectral analyses by repeating the analysis across a large collection of plausible age models. We present five real-world use cases to illustrate how geoChronR may be used to facilitate these tasks, visualize the results in intuitive ways, and store the results for further analysis, promoting transparency and reusability.
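The core ensemble idea, repeating an analysis across many plausible age models, can be sketched outside the package. The following Python sketch is illustrative only (synthetic data, invented variable names; geoChronR itself is an R package with its own API): it propagates an ensemble of chronologies through a correlation analysis and reports the resulting spread.

```python
# Illustrative sketch of ensemble age-uncertain correlation, in the spirit
# of the approach described above; not the geoChronR API.
import numpy as np

rng = np.random.default_rng(0)

n_samples, n_ens = 50, 200
depths = np.arange(n_samples, dtype=float)

# Ensemble of plausible age models: each column is one realization of the
# age (in years) of every sampled depth, with uncertainty growing downcore.
age_ens = 1000.0 + 20.0 * depths[:, None] + \
    rng.normal(0, 1 + 0.5 * depths[:, None], (n_samples, n_ens)).cumsum(axis=0)

proxy = np.sin(depths / 5.0) + rng.normal(0, 0.2, n_samples)  # record on the depth axis
target_times = np.linspace(1000, 2000, 100)                   # reference time grid
target = np.sin((target_times - 1000) / 100.0)                # series to correlate against

# Repeat the analysis once per age-model realization: place the proxy on the
# target time grid under that chronology, then correlate.
corrs = np.empty(n_ens)
for k in range(n_ens):
    ages_k = np.sort(age_ens[:, k])
    proxy_on_grid = np.interp(target_times, ages_k, proxy)
    corrs[k] = np.corrcoef(proxy_on_grid, target)[0, 1]

# The spread of `corrs` quantifies how chronological uncertainty
# propagates into the correlation estimate.
print(f"ensemble correlation: median={np.median(corrs):.2f}, "
      f"95% range=({np.quantile(corrs, 0.025):.2f}, {np.quantile(corrs, 0.975):.2f})")
```

The histogram (or quantiles) of `corrs`, rather than a single correlation value, is what an ensemble analysis reports.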
A data assimilation approach to last millennium temperature field reconstruction using a limited high-sensitivity proxy network
The authors acknowledge support from the Climate Program Office of the National Oceanic and Atmospheric Administration (NOAA grants NA18OAR4310420 to KJA, NA18OAR4310426 to JEG and FZ, and NA18OAR4310422 to GJH). GJH also acknowledges support from the NSF through grant AGS-1702423. JMK was partially supported by NSF grant AGS-1803946. JET and JMK acknowledge support from NSF grant #AGS-1602301 and Heising-Simons Foundation grant #2016-05.

We use the Northern Hemisphere Tree-Ring Network Development (NTREND) tree-ring database to examine the effects of using a small, highly sensitive proxy network for paleotemperature data assimilation over the last millennium. We first evaluate our methods using pseudo-proxy experiments. These indicate that spatial assimilations using this network are skillful in the extratropical Northern Hemisphere and improve on previous NTREND reconstructions based on point-by-point regression. We also find our method is sensitive to climate model biases when the number of sites becomes small. Based on these experiments, we then assimilate the real NTREND network. To quantify model prior uncertainty, we produce 10 separate reconstructions, each assimilating a different climate model. These reconstructions are most dissimilar prior to 1100 CE, when the network becomes sparse, but show greater consistency as the network grows. Temporal variability is also underestimated before 1100 CE. Our assimilation method produces spatial uncertainty estimates, and these identify treeline North America and eastern Siberia as the regions that would most benefit from the development of new millennial-length temperature-sensitive tree-ring records. We compare our multi-model mean reconstruction to five existing paleo-temperature products to examine the range of reconstructed responses to radiative forcing.
We find substantial differences in the spatial patterns and magnitudes of reconstructed responses to volcanic eruptions and in the transition between the Medieval epoch and Little Ice Age. These extant uncertainties call for the development of a paleoclimate reconstruction intercomparison framework for systematically examining the consequences of proxy network composition and reconstruction methodology, and for continued expansion of tree-ring proxy networks.
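The assimilation step behind such reconstructions is typically an ensemble Kalman update, which blends a climate-model prior ensemble with proxy observations. A minimal sketch follows, with invented dimensions and a single synthetic "proxy" observation rather than the paper's NTREND network:

```python
# Minimal sketch of an ensemble Kalman update of the kind used in
# paleoclimate data assimilation; all numbers are illustrative.
import numpy as np

rng = np.random.default_rng(1)

n_grid, n_ens = 500, 100                     # temperature-field size, prior ensemble size
Xb = rng.normal(0, 1, (n_grid, n_ens))       # prior ensemble drawn from a climate model
H = np.zeros(n_grid); H[42] = 1.0            # observation operator: proxy "sees" one gridpoint
y, r = 0.8, 0.25                             # proxy observation and its error variance

# Kalman gain from ensemble statistics: K = cov(Xb, HXb) / (var(HXb) + r)
HXb = H @ Xb                                 # model-estimated proxy values, shape (n_ens,)
cov_xy = (Xb - Xb.mean(1, keepdims=True)) @ (HXb - HXb.mean()) / (n_ens - 1)
K = cov_xy / (HXb.var(ddof=1) + r)

# Shift every ensemble member toward the observation.
Xa = Xb + K[:, None] * (y - HXb)[None, :]

# The analysis-ensemble spread is the source of the spatial uncertainty
# estimates mentioned in the abstract.
print("prior vs posterior variance at the observed point:",
      Xb[42].var(ddof=1), Xa[42].var(ddof=1))
```

Because the update is applied to the whole ensemble, the posterior spread at and around the observed point shrinks, which is exactly the information used to decide where new proxy records would help most.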
Evaluating Climate Field Reconstruction Techniques Using Improved Emulations of Real-World Conditions
Pseudoproxy experiments (PPEs) have become an important framework for evaluating paleoclimate reconstruction methods. Most existing PPE studies assume constant proxy availability through time and uniform proxy quality across the pseudoproxy network. Real multiproxy networks are, however, marked by pronounced disparities in proxy quality, and a steep decline in proxy availability back in time, either of which may have large effects on reconstruction skill. A suite of PPEs constructed from a millennium-length general circulation model (GCM) simulation is thus designed to mimic these various real-world characteristics. The new pseudoproxy network is used to evaluate four climate field reconstruction (CFR) techniques: truncated total least squares embedded within the regularized EM (expectation-maximization) algorithm (RegEM-TTLS), the Mann et al. (2009) implementation of RegEM-TTLS (M09), canonical correlation analysis (CCA), and Gaussian graphical models embedded within RegEM (GraphEM). Each method's risk properties are also assessed via a 100-member noise ensemble.
Contrary to expectation, it is found that reconstruction skill does not vary monotonically with proxy availability, but is also a function of the type and amplitude of climate variability (forced events vs. internal variability). The use of realistic spatiotemporal pseudoproxy characteristics also exposes large inter-method differences. Despite comparable fidelity in reconstructing the global mean temperature, spatial skill varies considerably between CFR techniques. Both GraphEM and CCA efficiently exploit teleconnections, and produce consistent reconstructions across the ensemble. RegEM-TTLS and M09 appear advantageous for reconstructions from highly noisy data, but are subject to larger stochastic variations across different realizations of pseudoproxy noise. Results collectively highlight the importance of designing realistic pseudoproxy networks and implementing multiple noise realizations of PPEs. The results also underscore the difficulty in finding the proper bias-variance tradeoff for jointly optimizing the spatial skill of CFRs and the fidelity of the global mean reconstructions.
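The pseudoproxy construction described above, degrading model output with site-dependent noise and thinning the network back in time, can be sketched as follows (all noise levels, network sizes, and start dates are illustrative, not those of the study):

```python
# Hedged sketch of a realistic pseudoproxy network: non-uniform proxy
# quality plus declining availability back in time. Illustrative only.
import numpy as np

rng = np.random.default_rng(2)

n_years, n_proxies = 1000, 30
truth = rng.normal(0, 1, (n_years, n_proxies))  # "true" gridpoint series from a GCM run

# Non-uniform quality: each site gets its own signal-to-noise ratio.
snr = rng.uniform(0.25, 1.0, n_proxies)
noise_sd = 1.0 / snr
pseudo = truth + rng.normal(0, noise_sd, (n_years, n_proxies))

# Declining availability: each record starts at a random year, so the
# network thins toward the beginning of the millennium.
start = rng.integers(0, n_years // 2, n_proxies)
for j, s in enumerate(start):
    pseudo[:s, j] = np.nan

available = np.sum(~np.isnan(pseudo), axis=1)
print("sites available in year 0 vs year 999:", available[0], available[-1])
```

Running a CFR method on `pseudo` and scoring it against `truth` is the basic PPE loop; repeating the noise draw many times gives the 100-member noise ensemble used to assess each method's risk properties.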
A probabilistic model of chronological errors in layer-counted climate proxies: applications to annually banded coral archives
The ability to precisely date climate proxies is central to the reconstruction of past climate variations. To a degree, all climate proxies are affected by age uncertainties, which are seldom quantified. This article proposes a probabilistic age model for proxies based on layer-counted chronologies, and explores its use for annually banded coral archives. The model considers both missing and doubly counted growth increments (represented as independent processes), accommodates various assumptions about error rates, and allows one to quantify the impact of chronological uncertainties on different diagnostics of variability. In the case of a single coral record, we find that time uncertainties primarily affect high-frequency signals but also significantly bias the estimate of decadal signals. We further explore tuning to an independent, tree-ring-based chronology as a way to identify an optimal age model. A synthetic pseudocoral network is used as a testing ground to quantify uncertainties in the estimation of spatiotemporal patterns of variability. Even for small error rates, the amplitude of multidecadal variability is systematically overestimated at the expense of interannual variability (El Niño–Southern Oscillation, or ENSO, in this case), artificially flattening its spectrum at periods longer than 10 years. An optimization approach to correct chronological errors in coherent multivariate records is presented and validated in idealized cases, though it is found difficult to apply in practice due to the large number of solutions. We close with a discussion of possible extensions of this model and connections to existing strategies for modeling age uncertainties.
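The model's core ingredients, independent missing and doubly counted bands, can be illustrated with a simple simulation (the error rates are invented and this is not the paper's exact formulation):

```python
# Illustrative simulation of layer-counting errors in a banded archive:
# each annual band is independently missed or doubly counted with small
# probabilities, so the counted chronology drifts away from the true one.
import numpy as np

rng = np.random.default_rng(3)

n_years, n_ens = 300, 500
p_miss, p_double = 0.02, 0.02    # assumed per-band error rates (illustrative)

# For each true year, the number of counted layers is 0 (missed),
# 1 (correct), or 2 (doubly counted).
u = rng.random((n_ens, n_years))
layers = np.ones((n_ens, n_years), dtype=int)
layers[u < p_miss] = 0
layers[u > 1 - p_double] = 2

# Counted age at each true year; the drift between counted and true
# chronologies is what distorts decadal-scale and spectral estimates.
counted_age = layers.cumsum(axis=1)
true_age = np.arange(1, n_years + 1)
drift = counted_age - true_age

print("mean absolute age offset at the oldest band:",
      np.abs(drift[:, -1]).mean())
```

With symmetric error rates the drift averages to zero across the ensemble, but any single realization accumulates a random-walk offset, which is why even small error rates smear interannual signals in a multi-record stack.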
Last Millennium Hurricane Activity Linked to Endogenous Climate Variability
Despite increased Atlantic hurricane risk, projected trends in hurricane frequency in the warming climate are still highly uncertain, mainly due to the short instrumental record, which limits our understanding of hurricane activity and its relationship to climate. Here we extend the record to the last millennium using two independent estimates: a reconstruction from sedimentary paleohurricane records and a statistical model of hurricane activity using sea surface temperatures (SSTs). We find statistically significant agreement between the two estimates, and the late 20th century hurricane frequency is within the range seen over the past millennium. Numerical simulations using a hurricane-permitting climate model suggest that hurricane activity was likely driven by endogenous climate variability and linked to anomalous SSTs of a warm Atlantic and cold Pacific. Volcanic eruptions can induce peaks in hurricane activity, but such peaks would likely be too weak to be detected in the proxy record due to large endogenous variability.
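A statistical model of hurricane activity driven by SSTs is commonly a Poisson count model. The sketch below is hypothetical (the SST index, baseline rate, and sensitivity coefficient are invented, not taken from the paper) and only shows the forward structure of such a model:

```python
# Hypothetical Poisson count model of annual hurricane activity driven by
# an SST index; all coefficients are illustrative, not from the study.
import numpy as np

rng = np.random.default_rng(4)

years = np.arange(1000, 2001)
# Illustrative SST index (degC anomaly): a slow oscillation plus noise,
# standing in for a warm-Atlantic / cold-Pacific pattern.
sst_index = 0.3 * np.sin(2 * np.pi * (years - 1000) / 60.0) \
    + rng.normal(0, 0.1, years.size)

# Poisson rate model: log(lambda) = a + b * SST_index.
a, b = np.log(8.0), 0.8          # baseline ~8 storms/yr; invented sensitivity
lam = np.exp(a + b * sst_index)
counts = rng.poisson(lam)

# Compare simulated activity in warm vs cool SST years.
warm = counts[sst_index > 0].mean()
cool = counts[sst_index <= 0].mean()
print(f"mean annual count, warm vs cool SST years: {warm:.1f} vs {cool:.1f}")
```

Fitting such a model to the instrumental era and then driving it with reconstructed SSTs is one way to produce the millennium-length statistical estimate that the sediment-based reconstruction is compared against.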
Climate and society in long-term perspective: opportunities and pitfalls in the use of historical datasets
Recent advances in palaeoclimatology and the growing digital availability of large historical datasets on human activity have created new opportunities to investigate long-term interactions between climate and society. However, noncritical use of historical datasets can create pitfalls, resulting in misleading findings that may become entrenched as accepted knowledge. We demonstrate pitfalls in the content, use and interpretation of historical datasets in research into climate and society interaction through a systematic review of recent studies on the link between climate and (a) conflict incidence, (b) plague outbreaks and (c) agricultural productivity changes. We propose three sets of interventions to overcome these pitfalls, which involve a more critical and multidisciplinary collection and construction of historical datasets, increased specificity and transparency about uncertainty or biases, and replacing inductive with deductive approaches to causality. This will improve the validity and robustness of interpretations of the long-term relationship between climate and society.
PaCTS 1.0: A Crowdsourced Reporting Standard for Paleoclimate Data
The progress of science is tied to the standardization of measurements, instruments, and data. This is especially true in the Big Data age, where analyzing large data volumes critically hinges on the data being standardized. Accordingly, the lack of community-sanctioned data standards in paleoclimatology has largely precluded the benefits of Big Data advances in the field. Building upon recent efforts to standardize the format and terminology of paleoclimate data, this article describes the Paleoclimate Community reporTing Standard (PaCTS), a crowdsourced reporting standard for such data. PaCTS captures which information should be included when reporting paleoclimate data, with the goal of maximizing the reuse value of paleoclimate data sets, particularly for synthesis work and comparison to climate model simulations. Initiated by the LinkedEarth project, the process to elicit a reporting standard involved an international workshop in 2016, various forms of digital community engagement over the next few years, and grassroots working groups. Participants in this process identified important properties across paleoclimate archives, in addition to the reporting of uncertainties and chronologies; they also identified archive-specific properties and distinguished reporting standards for new versus legacy data sets. This work shows that at least 135 respondents overwhelmingly support a drastic increase in the amount of metadata accompanying paleoclimate data sets. Since such goals are at odds with present practices, we discuss a transparent path toward implementing or revising these recommendations in the near future, using both bottom-up and top-down approaches.