55 research outputs found

    ESD ideas: translating historical extreme weather events into a warmer world

    No full text
    A new reanalysis-based approach is proposed to examine how reconstructions of extreme weather events differ in warmer or cooler counterfactual worlds. This approach offers a novel way to develop plausible storylines for some types of extreme event for which other methods may not be suitable. As a proof of concept, a reanalysis of a severe windstorm that occurred in February 1903 is translated into a warmer world, where it produces higher wind speeds and increased rainfall, suggesting that this storm would be more damaging if it occurred today rather than 120 years ago. Whenever a severe weather event with harmful impacts occurs in a particular region, disaster responders, recovery planners, politicians, and journalists often ask whether climate change caused or affected the event. The harmful impacts are caused by the unusual weather, but climate change may have made the weather event more likely, more severe, or both. In those cases, the harmful impacts may be partly or even mostly due to the change in climate. In some cases, the worst consequences may be due to the vulnerability or exposure of the local population or ecosystems, or to a combination of many other factors (e.g. Otto et al., 2022). Many methodologies exist to understand how climate change has affected extreme events. These are broadly categorized into risk-based approaches and storyline approaches (Stott and Christidis, 2023). Risk-based approaches assess the change in likelihood and magnitude of a particular class of event (e.g. Stott et al., 2004), whereas storyline approaches consider how climate change may have affected a specific event (Trenberth et al., 2015; Shepherd et al., 2018). Other studies consider the related question of what a plausible worst-case event might look like in a particular climate (e.g. Thompson et al., 2017). Some of these methods are now regularly used to provide attribution statements soon after events occur (van Oldenborgh et al., 2021). Event storyline approaches attempt to quantify how an extreme event would differ in an altered climate. This can be achieved by producing reconstructions of the event as it occurred and in counterfactual cooler or warmer climates and comparing the consequences. Various approaches exist for such analyses, including statistical methods (e.g. Cattiaux et al., 2010), analogues (Ginesta et al., 2023; Faranda et al., 2022), nudging a weather or climate model (e.g. Meredith et al., 2015; van Garderen et al., 2021; Sánchez-Benítez et al., 2022), and forecast-based approaches (e.g. Wehner et al., 2019; Leach et al., 2021). Here, we propose a complementary reanalysis-based approach to translate extreme events into different climates.
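
    To make the "translation" idea concrete, here is a minimal Python sketch of one thermodynamically motivated variant: warming an event's temperature field by a fixed delta and scaling its moisture by roughly 7%/K (Clausius-Clapeyron). This is an illustrative assumption, not the paper's reanalysis-based method, and all field names are hypothetical.

```python
import numpy as np

# Minimal sketch of "translating" an event into a warmer climate by applying
# a thermodynamic offset: add a warming delta to temperature and scale
# specific humidity by ~7%/K (Clausius-Clapeyron). The paper's actual
# reanalysis-based method is more sophisticated; this only illustrates the idea.

CC_RATE = 0.07  # approximate Clausius-Clapeyron scaling, fraction per kelvin

def translate_event(temperature_k, specific_humidity, delta_k=1.5):
    """Return warmed temperature and moisture fields for a counterfactual world.

    temperature_k, specific_humidity: numpy arrays from an event reanalysis
    delta_k: assumed warming between the historical and counterfactual climate
    """
    warmed_t = temperature_k + delta_k
    warmed_q = specific_humidity * (1.0 + CC_RATE) ** delta_k
    return warmed_t, warmed_q

# Example: a 2 K warmer world holds ~14.5% more moisture under this scaling
t = np.full((10, 10), 280.0)   # K
q = np.full((10, 10), 5e-3)    # kg/kg
t2, q2 = translate_event(t, q, delta_k=2.0)
print(q2[0, 0] / q[0, 0])      # ~1.145
```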

    Evaluation of Total Column Water Vapour Products from Satellite Observations and Reanalyses within the GEWEX Water Vapor Assessment

    No full text
    Since 2011 the Global Energy and Water cycle Exchanges (GEWEX) Water Vapor Assessment (G-VAP) has provided performance analyses of state-of-the-art reanalysis and satellite water vapour products to the GEWEX Data and Analysis Panel (GDAP) and the user community in general. A significant component of the work undertaken by G-VAP is to characterise the quality and uncertainty of these water vapour records to i) ensure their full exploitation and ii) avoid incorrect use or interpretation of results. This study presents results from the second phase of G-VAP, in which we have extended and expanded our analysis of Total Column Water Vapour (TCWV) from phase 1, in conjunction with updating the G-VAP archive. For version 2 of the archive, we consider 28 freely available and mature satellite and reanalysis data products, remapped to a regular 2° × 2° longitude-latitude grid on monthly time steps between January 1979 and December 2019. We first analysed all records for a 'common' short period of five years (2005–2009), focusing on variability (spatial and seasonal) and deviation from the ensemble mean. We observed that clear-sky, daytime-only satellite products were generally drier than the ensemble mean, and that seasonal variability/disparity of up to 12 kg/m² in several regions was related to the original spatial resolution and temporal sampling of the products. For 11 of the 28 data records, further analysis was undertaken over 1988–2014. Within this 'long period', key results show i) trends between −1.18±0.68 and 3.82±3.94 kg/m²/decade over ice-free global oceans and between −0.39±0.27 and 1.24±0.85 kg/m²/decade over land surfaces, and ii) regression coefficients of TCWV against surface temperature of 6.17±0.24 to 27.02±0.51 %/K over oceans (using sea surface temperature) and 3.00±0.17 to 7.77±0.16 %/K over land (using surface air temperature). It is important to note that trends estimated within G-VAP are used to identify issues in the data records rather than to analyse climate change. Additionally, breakpoints have been identified and characterised for both land and ocean surfaces within this period. Finally, we present a spatial analysis of correlations to six climate indices within the 'long period', highlighting regional areas of significant positive and negative correlation and the level of agreement among records.
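
    The decadal trends and %/K regression coefficients quoted above can be illustrated with a short sketch. The synthetic series, coupling strength, and period below are placeholders rather than G-VAP data; the point is only the form of the two diagnostics.

```python
import numpy as np

# Sketch: estimate a decadal TCWV trend and a %/K regression coefficient of
# TCWV against surface temperature from monthly series, in the spirit of the
# G-VAP diagnostics. The input series are synthetic placeholders.

rng = np.random.default_rng(0)
months = np.arange(324)                                    # 1988-2014
tcwv = 25 + 0.02 * months / 12 + rng.normal(0, 0.5, months.size)  # kg/m^2
sst_anom = rng.normal(0, 0.3, months.size)                 # K
tcwv = tcwv + 1.8 * sst_anom                               # impose ~7%/K coupling

# i) linear trend via least squares, converted to kg/m^2 per decade
years = months / 12.0
slope_per_year = np.polyfit(years, tcwv, 1)[0]
print(f"trend: {slope_per_year * 10:.2f} kg/m^2/decade")

# ii) regression of TCWV anomalies on temperature anomalies, expressed in %/K
tcwv_anom = tcwv - tcwv.mean()
beta = np.polyfit(sst_anom, tcwv_anom, 1)[0]               # kg/m^2 per K
print(f"regression: {100 * beta / tcwv.mean():.1f} %/K")   # ~7 %/K
```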

    Assessing homogeneity of land surface air temperature observations using sparse‐input reanalyses

    Get PDF
    State-of-the-art homogenisation approaches for any test site rely upon the availability of a sufficient number of neighbouring sites with similar climatic conditions and a sufficient quantity of overlapping measurements. These conditions are not always met, particularly in poorly sampled regions and epochs. Modern sparse-input reanalysis products, which are constrained by observed sea surface temperatures, sea ice, and surface pressure observations, continue to improve, offering independently produced surface temperature estimates back to the early 19th century. This study undertakes an exploratory analysis of the applicability of sparse-input reanalyses to identify breakpoints in available basic station data. Adjustments are then applied using a variety of reanalysis- and neighbour-based approaches to produce four distinct estimates. The methodological independence of the approach may offer valuable insights into historical data quality issues. The resulting estimates are compared to Global Historical Climatology Network version 4 (GHCNMv4) at various aggregations. Comparisons are also made with five existing global land surface monthly time series. We find a lower rate of long-term warming, which principally arises from differences in estimated behaviour prior to the early 20th century. Differences depend upon the exact pair of estimates, varying between 15 and 40% for changes from 1850–1900 to 2005–2014. Differences are much smaller for metrics starting after 1900 and negligible after 1950. Initial efforts at quantifying parametric uncertainty suggest this would be substantial and may lead to overlap between these new estimates and existing estimates. Further work would be required to use these data products in an operational context. This would include better understanding the reasons for the apparent early-period divergence, including the impact of spatial infilling choices, quantification of parametric uncertainty, and a means to update the product post-2015, where the NOAA-CIRES-DOE 20CRv3 sparse-input reanalysis product, upon which these estimates are based, presently ends.
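
    The core of the approach can be sketched as follows: form the difference series between a station and the co-located reanalysis, locate the largest mean shift, and adjust the earlier segment. The single-break search and adjustment rule below are simplifying assumptions, not the paper's detection method.

```python
import numpy as np

# Sketch: look for a breakpoint in the difference between a station series
# and the co-located sparse-input reanalysis, then shift the earlier segment.
# A naive single-break mean-shift test stands in for the paper's actual method.

def find_breakpoint(station, reanalysis, min_seg=24):
    """Return the index maximising the mean shift in station - reanalysis."""
    diff = station - reanalysis
    best_k, best_score = None, 0.0
    for k in range(min_seg, diff.size - min_seg):
        shift = abs(diff[:k].mean() - diff[k:].mean())
        if shift > best_score:
            best_k, best_score = k, shift
    return best_k, best_score

def adjust(station, reanalysis, k):
    """Shift the pre-break segment so both segments agree with the reanalysis."""
    diff = station - reanalysis
    adjusted = station.copy()
    adjusted[:k] -= diff[:k].mean() - diff[k:].mean()
    return adjusted

# Synthetic example: a 0.8 K jump introduced at month 120 of a 240-month series
rng = np.random.default_rng(1)
truth = rng.normal(0, 0.3, 240)
station = truth.copy(); station[:120] += 0.8
k, score = find_breakpoint(station, truth)
print(k, round(score, 2))                                  # near 120, ~0.8
print(round(abs(adjust(station, truth, k) - truth).mean(), 2))  # ~0.0
```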

    Fallout from U.S. atmospheric nuclear tests in New Mexico and Nevada (1945-1962)

    Full text link
    One hundred and one atmospheric nuclear weapon tests were conducted between 1945 and 1962 in the United States, resulting in widespread dispersion of radioactive fallout and leading to environmental contamination and population exposures. Accurate assessment of the extent of fallout from nuclear weapon tests has been challenging in the United States and elsewhere, due to limited monitoring and data accessibility. Here we address this deficit by combining U.S. government data, high-resolution reanalyzed historical weather fields, and atmospheric transport modeling to reconstruct radionuclide deposition across the contiguous United States, with 10-kilometer spatial and one-hour temporal resolution for five days following detonation, from all 94 atmospheric tests detonated in New Mexico and Nevada with fission yields sufficient to generate mushroom clouds. Our analysis also includes deposition estimates for 10 days following the detonation of Trinity, the first ever nuclear weapon test, on July 16, 1945. We identify locations where radionuclide deposition significantly exceeded levels in areas covered by the U.S. Radiation Exposure Compensation Act (RECA). These findings include deposition in all 48 contiguous U.S. states. They provide an opportunity for re-evaluating the public health and environmental implications of atmospheric nuclear testing. Finally, our findings also speak to debates about marking the beginning of the Anthropocene with nuclear weapons fallout. Our deposition estimates indicate that direct fallout from Trinity, a plutonium device, reached Crawford Lake in Canada, the proposed "golden spike" site marking the beginning of the Anthropocene epoch, starting on July 20, 1945.
    Comment: 19 pages, 4 figures, 1 supplementary table, 3 supplementary figures
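
    As a rough illustration of transport-based deposition reconstruction, the toy sketch below advects particles with hourly wind components and sheds mass onto a 10 km grid at an assumed constant rate. Every rate, field, and simplification here is a placeholder assumption; the study's modelling chain is far more detailed.

```python
import numpy as np

# Toy sketch of Lagrangian transport with first-order deposition: particles
# released near a test site drift with hourly winds and deposit mass onto
# 10 km grid cells. Purely illustrative; not the study's modelling system.

DT = 3600.0        # one-hour time step, matching the study's output cadence
DEP_RATE = 1e-5    # assumed first-order deposition rate constant, 1/s

def step(pos_m, mass, u, v, deposited):
    """Advect particles one hour and accumulate deposited mass per grid cell."""
    pos_m[:, 0] += u * DT                     # metres east
    pos_m[:, 1] += v * DT                     # metres north
    lost = mass * (1.0 - np.exp(-DEP_RATE * DT))
    mass -= lost
    cells = (pos_m // 10_000).astype(int)     # 10 km cells
    for (i, j), m in zip(cells, lost):
        deposited[(i, j)] = deposited.get((i, j), 0.0) + m
    return pos_m, mass, deposited

# Example: 3 particles drifting for 5 days (120 hourly steps)
rng = np.random.default_rng(0)
pos, mass, dep = np.zeros((3, 2)), np.ones(3), {}
for _ in range(120):
    pos, mass, dep = step(pos, mass, rng.normal(5, 1, 3), rng.normal(1, 1, 3), dep)
print(len(dep), round(mass.sum(), 3))         # cells touched, mass remaining
```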

    Northern Hemisphere Extratropical Cyclone Activity in the Twentieth Century Reanalysis Version 3 (20CRv3) and Its Relationship with Continental Extreme Temperatures

    No full text
    In this study, we detect and track extratropical cyclones using 6-hourly mean sea level pressure data taken from the Twentieth Century Reanalysis version 3 (20CRv3) over the period 1951–2015 and compare them with those in the ECMWF interim (ERA-Interim) and fifth-generation (ERA5) reanalyses over the period 1979–2018. Three indices were employed to characterize cyclone activity: cyclone count, cyclone intensity, and a cyclone activity index (CAI) that combines the count and intensity. The results show that the cyclone indices in the three datasets have comparable annual climatologies and seasonal evolution over the northern extratropical land and ocean in recent decades. Based on the cyclone indices over the period 1951–2010 in 80 ensemble members of 20CRv3, cyclone count and intensity are negatively correlated in winter and tend to be positively and weakly correlated in summer. The interannual CAI variability is dominated by the cyclone count variability. Regional mean cyclone activity can be well represented using the ensemble-average cyclone index. We then examined the linkage between the cyclone activity in 20CRv3 and observed cold and warm extremes over Eurasia and North America over the period 1951–2010. In winter, the principal components of interannual cold and warm extreme anomalies are more strongly correlated with the regional mean cyclone count index over Eurasia, while they are more strongly correlated with the cyclone intensity index over North America. The temperature anomalies associated with the regional and ensemble mean cyclone count index explain about 10% (20%) of interannual cold (warm) extreme variances averaged over Eurasia. The temperature anomalies associated with the mean cyclone intensity explain about 10% of interannual cold and warm extreme variances over North America. Large-scale atmospheric circulation anomalies associated with cyclone activity, and the temperature advection they induce, drive temperature anomalies over Eurasia and North America. In summer, circulation and thermal advection anomalies associated with cyclone activity are weak over the two continents; hence, the relationship between cyclone activity and extreme temperature variability is weak in that season.
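
    A minimal sketch of how such indices can be built from a single MSLP snapshot: find local pressure minima, threshold them, and combine count and depth. The window size, 1000 hPa threshold, and CAI definition below are illustrative assumptions, not the study's tracking scheme.

```python
import numpy as np
from scipy.ndimage import minimum_filter

# Sketch of cyclone-index construction from one 6-hourly MSLP field: detect
# cyclone centres as thresholded local minima, then form count, mean
# intensity, and a combined cyclone activity index (CAI). All detection
# criteria here are illustrative choices.

def cyclone_indices(mslp_hpa, threshold=1000.0, window=5):
    """Return (count, intensity, cai) for one MSLP snapshot in hPa."""
    local_min = mslp_hpa == minimum_filter(mslp_hpa, size=window)
    centres = local_min & (mslp_hpa < threshold)
    depths = threshold - mslp_hpa[centres]   # hPa below threshold
    count = int(centres.sum())
    intensity = float(depths.mean()) if count else 0.0
    cai = float(depths.sum())                # combines count and intensity
    return count, intensity, cai

# Example on a synthetic field with one deep low
field = np.full((90, 180), 1012.0)
field[40, 90] = 970.0
print(cyclone_indices(field))                # (1, 30.0, 30.0)
```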

    Meteorological data rescue: citizen science lessons learned from Southern Weather Discovery

    Get PDF
    Daily weather reconstructions (called "reanalyses") can help improve our understanding of meteorology and long-term climate changes. Adding undigitized historical weather observations to the datasets that underpin reanalyses is desirable; however, the time required to capture those data from a range of archives is often a limiting factor. Southern Weather Discovery is a citizen science data rescue project that recovered tabulated handwritten meteorological observations from ship logbooks and land-based stations spanning New Zealand, the Southern Ocean, and Antarctica. We describe the Zooniverse-hosted Southern Weather Discovery campaign, highlight promotion tactics, and identify the replicate keying levels needed to obtain 100% complete transcribed datasets with minimal type 1 and type 2 transcription errors. Rescued weather observations can also augment optical character recognition (OCR) libraries. Closer links between citizen science data rescue and OCR-based scientific data capture will accelerate improvements to weather reconstructions, which can be harnessed to mitigate the impacts of weather extremes on communities and infrastructure.
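
    Replicate keying can be illustrated with a short consensus function: each logbook cell is transcribed several times and a value is accepted only once enough replicates agree. The threshold and tie handling below are illustrative choices, not the project's exact rules.

```python
from collections import Counter

# Sketch of replicate keying: accept a transcribed cell value only when a
# minimum number of independent volunteer transcriptions agree on it.

def consensus(transcriptions, min_agreement=3):
    """Return the agreed value for one cell, or None if no consensus yet."""
    if not transcriptions:
        return None
    value, votes = Counter(transcriptions).most_common(1)[0]
    return value if votes >= min_agreement else None

# A cell keyed five times: four volunteers agree, one mistyped a digit
print(consensus(["29.87", "29.87", "29.87", "28.87", "29.87"]))  # "29.87"
print(consensus(["29.87", "28.87"]))                             # None
```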

    Estimates of land surface air temperature changes homogenised using sparse input 20th Century reanalysis product 20CRv3

    No full text
    The data consist of four frozen gridded estimates as described in Gillespie et al. (2022). The data are presented as 5-degree-resolution global fields over 1850–2014. Station data arising from the International Surface Temperature Initiative (ISTI) global databank of monthly holdings have been homogenised using 20CRv3 to identify breakpoints. Four distinct approaches to adjustment have been undertaken, resulting in four separate estimates of the resulting series. The gridding is a simple gridbox average of any observations available within each 5-degree gridbox. Data are available for those global land areas for which underlying station data are available in the ISTI holdings.
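
    The gridding step described above reduces to a simple unweighted binning of station values into 5-degree boxes, as in this sketch (the station triples are hypothetical):

```python
import numpy as np

# Sketch of the gridding described above: an unweighted average of whatever
# station values fall in each 5-degree lat/lon gridbox for a given month.

def gridbox_average(stations, res=5.0):
    """Average (lat, lon, value) triples onto a res-degree grid; NaN if empty."""
    nlat, nlon = int(180 / res), int(360 / res)
    total = np.zeros((nlat, nlon))
    count = np.zeros((nlat, nlon))
    for lat, lon, value in stations:
        i = int((lat + 90) // res)    # row index from latitude
        j = int((lon + 180) // res)   # column index from longitude
        total[i, j] += value
        count[i, j] += 1
    return np.where(count > 0, total / np.maximum(count, 1), np.nan)

# Two stations share one gridbox; a third sits in another
grid = gridbox_average([(51.5, -0.1, 11.2), (52.2, -1.5, 10.8), (40.7, -74.0, 13.0)])
print(grid[28, 35])                   # (11.2 + 10.8) / 2 = 11.0
```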

    Influence of warming and atmospheric circulation changes on multidecadal European flood variability

    Get PDF
    European flood frequency and intensity change on a multidecadal scale. Floods were more frequent in the 19th (central Europe) and early 20th century (western Europe) than during the mid-20th century, and again more frequent since the 1970s. The causes of this variability are not well understood, and the relation to climate change is unclear. Palaeoclimate studies from the northern Alps suggest that past flood-rich periods coincided with cold periods. In contrast, some studies suggest that more floods might occur in a future, warming world. Here we address the contribution of atmospheric circulation and of warming to multidecadal flood variability. For this, we use long series of annual peak streamflow, daily weather data, reanalyses, and reconstructions. We show that both changes in atmospheric circulation and moisture content affected multidecadal changes of annual peak streamflow in central and western Europe over the past two centuries. We find that during the 19th and early 20th century, atmospheric circulation changes led to high peak values of moisture flux convergence. The circulation was more conducive to strong and long-lasting precipitation events than in the mid-20th century. These changes are also partly reflected in the seasonal mean circulation and reproduced in atmospheric model simulations, pointing to a possible role of oceanic variability. For the period after 1980, increasing moisture content in a warming atmosphere led to extremely high moisture flux convergence. Thus, the main atmospheric driver of flood variability changed from atmospheric circulation variability to water vapour increase.
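
    The moisture flux convergence diagnostic central to this analysis can be sketched with centred finite differences on a regular grid; the fields, grid spacing, and uniform moisture below are hypothetical placeholders.

```python
import numpy as np

# Sketch of moisture flux convergence, MFC = -div(q*u, q*v), estimated with
# centred finite differences on a regular grid. Inputs are placeholders.

def moisture_flux_convergence(q, u, v, dx, dy):
    """Return -d(qu)/dx - d(qv)/dy for 2-D fields (units follow the inputs)."""
    qu, qv = q * u, q * v
    dqu_dx = np.gradient(qu, dx, axis=1)   # x varies along columns
    dqv_dy = np.gradient(qv, dy, axis=0)   # y varies along rows
    return -(dqu_dx + dqv_dy)

# Converging flow over a moist patch yields positive MFC at the centre
ny, nx, dx = 41, 41, 25_000.0              # 25 km grid spacing
x = np.linspace(-1, 1, nx)[None, :]
y = np.linspace(-1, 1, ny)[:, None]
u = -x * 10.0                              # winds blowing toward x = 0
v = -y * 10.0
q = np.full((ny, nx), 0.01)                # kg/kg, uniform for simplicity
mfc = moisture_flux_convergence(q, u, v, dx, dx)
print(mfc[20, 20] > 0)                     # True
```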