
    Synoptic Analysis of the New York March 1888 Blizzard

    The meteorological circumstances that led to the Blizzard of March 1888 in New York are analysed in Version 2 of the “Twentieth Century Reanalysis” (20CR). The potential of this data set for studying historical extreme events has not yet been fully explored. A detailed analysis of 20CR data alongside other data sources (including historical instrumental data and weather maps) for historical extremes such as the March 1888 blizzard may give insights into the limitations of 20CR. We find that 20CR reproduces the circulation pattern as well as the temperature development very well. Regarding the absolute values of variables such as snowfall or minimum and maximum surface pressure, there is an underestimation of the observed extremes, which may be due to the low spatial resolution of 20CR and the fact that only the ensemble mean is considered. Despite this drawback, the dataset allows us to gain new information due to its complete spatial and temporal coverage.

    Historical weather extremes in the “Twentieth Century Reanalysis”

    Meteorological or climatological extremes are rare, and hence studying them requires long meteorological data sets. Moreover, addressing the underlying atmospheric processes requires detailed three-dimensional data. Until recently the two requirements were incompatible, as long meteorological series were only available for a few locations, whereas detailed three-dimensional data sets such as reanalyses were limited to the past few decades. In 2011, the “Twentieth Century Reanalysis” (20CR) was released, a 6-hourly global atmospheric data set covering the past 140 years, thus combining the two properties. The collection of short papers in this volume contains case studies of individual extreme events in the 20CR data set. In this overview paper we introduce the first six cases and summarise some common findings. All of the events are represented in 20CR in a physically consistent way, allowing further meteorological interpretations and process studies. However, for most of the events, the magnitudes are underestimated in the ensemble mean. Possible causes are addressed. For interpreting extremes it may be necessary to examine individual ensemble members. Also, the density of observations underlying 20CR should be considered. Finally, we point to problems in wind speeds over the Arctic and the northern North Pacific in 20CR prior to the 1950s.

    The 1872 Baltic Sea storm surge

    On 13 November 1872, the Baltic Sea coast from Denmark to Pomerania was devastated by an extreme storm surge caused by high winds. This is still the strongest surge on record, and understanding its development can contribute to improved risk assessment and protection. In this paper we trace this event in sea-level pressure and wind data from the “Twentieth Century Reanalysis” (20CR) and compare the results with other observation-based data sources. The analysis shows that, in the ensemble mean of 20CR, the general development is qualitatively well depicted, but with much reduced strength compared to other data sets. The same is true when selecting the ensemble member with maximum wind speeds.
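
    The contrast between the ensemble mean and an individual member, as used above, can be illustrated with a minimal sketch. It assumes a synthetic 20CR-like wind-speed ensemble built with numpy; the member count, grid size, and values are illustrative stand-ins, not real 20CR data.

```python
import numpy as np

# Hypothetical 20CR-like ensemble of 10 m wind speed (m/s):
# 56 members on a small lat x lon grid at one analysis time (synthetic values).
rng = np.random.default_rng(0)
ensemble = rng.gamma(shape=9.0, scale=2.0, size=(56, 4, 5))  # (member, lat, lon)

# Averaging over members smooths member-to-member variability, which is
# one reason extremes look weaker in the ensemble mean than in observations.
mean_wind = ensemble.mean(axis=0)

# The alternative used above: select the single member whose
# domain-maximum wind speed is largest and inspect it directly.
strongest_member = ensemble[ensemble.max(axis=(1, 2)).argmax()]

print("domain max, ensemble mean:    %.1f m/s" % mean_wind.max())
print("domain max, strongest member: %.1f m/s" % strongest_member.max())
```

    Even in this toy setting the maximum of the ensemble mean falls below that of the strongest member, mirroring the reduced strength reported for the 1872 surge.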

    Tambora 1815 as a test case for high impact volcanic eruptions: Earth system effects

    The eruption of Tambora (Indonesia) in April 1815 had substantial effects on global climate and led to the ‘Year Without a Summer’ of 1816 in Europe and North America. Although a tragic event (tens of thousands of people lost their lives), the eruption was also an ‘experiment of nature’ from which science continues to learn. The aim of this study is to summarize our current understanding of the Tambora eruption and its effects on climate as expressed in early instrumental observations, climate proxies and geological evidence, climate reconstructions, and model simulations. Progress has been made with respect to our understanding of the eruption process and the estimated amount of SO2 injected into the atmosphere, although large uncertainties still exist with respect to the altitude and hemispheric distribution of Tambora aerosols. With respect to climate effects, the global and Northern Hemispheric cooling are well constrained by proxies, whereas there is no strong signal in Southern Hemisphere proxies. Newly recovered early instrumental information for Western Europe and parts of North America, regions with particularly strong climate effects, allows Tambora's effect on the weather systems to be addressed. Climate models respond to prescribed Tambora-like forcing with a strengthening of the wintertime stratospheric polar vortex, global cooling and a slowdown of the water cycle, weakening of the summer monsoon circulations, a strengthening of the Atlantic Meridional Overturning Circulation, and a decrease of atmospheric CO2. Combining observations, climate proxies, and model simulations for the case of Tambora, a better understanding of climate processes has emerged.

    Biases in precipitation records found in parallel measurements

    Presentation given at the 10th EUMETNET Data Management Workshop, held in St. Gallen, Switzerland, 28–30 October 2015. In this work we investigate biases introduced by the transition from conventional to automatic precipitation measurements. This is another study in the framework of the Parallel Observations Scientific Team (POST, http://www.surfacetemperatures.org/databank/parallel_measurements), a newly created group of the International Surface Temperature Initiative (ISTI) supported by the World Meteorological Organization (WMO). The goals of POST are the study of climate data inhomogeneities at the daily and sub-daily level. Long instrumental climate records are usually affected by non-climatic changes due to various reasons such as relocations, changes in instrumentation, measurement schemes, etc. Such inhomogeneities may distort the climate signal and can influence the assessment of trends and variability. For studying climatic changes it is important to accurately distinguish non-climatic from climatic signals. This can be achieved by studying the differences between two parallel measurements, which need to be sufficiently close together to be well correlated. One important ongoing worldwide transition is that from manual to automated measurements. The impact of automated measurements needs to be studied urgently because sooner or later it will affect most of the stations in individual national networks. Similar to the temperature series, we study the transition from conventional manual measurements (CON) to Automatic Weather Stations (AWS), using several parallel datasets distributed over Europe and America. The ratio series AWS/CON are subjected to quality control, and obvious errors are removed before the analysis. Further, the series are inspected for internal inhomogeneities and, if necessary, the records are split into two or more homogeneous segments. Finally, each segment is studied to understand the biases introduced by the transition, their seasonality, and changes in the empirical distributions. When additional variables are available, an attempt is made to study the effects of other variables on the observed biases.
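
    As a rough illustration of the ratio-series approach described above, the following sketch computes an AWS/CON ratio series and its seasonality from synthetic parallel precipitation data. The station values, the 0.1 mm wet-day threshold, and the undercatch factor are assumptions for illustration, not figures from the study.

```python
import numpy as np
import pandas as pd

# Hypothetical parallel daily precipitation series (mm): a manual gauge
# (CON) next to an automatic station (AWS); all values are synthetic.
dates = pd.date_range("2010-01-01", "2011-12-31", freq="D")
rng = np.random.default_rng(1)
con = rng.gamma(shape=0.4, scale=8.0, size=len(dates))
aws = con * 0.95 + rng.normal(0.0, 0.2, size=len(dates))  # assumed slight undercatch

df = pd.DataFrame({"con": con, "aws": aws}, index=dates)

# Restrict to days where both gauges recorded rain, so the ratio is defined.
wet = df[(df["con"] > 0.1) & (df["aws"] > 0.1)]

# Ratio series AWS/CON and a simple monthly summary of the bias.
ratio = wet["aws"] / wet["con"]
print("overall median ratio:", round(ratio.median(), 3))
print(ratio.groupby(wet.index.month).median().round(3))
```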

    Description of the bias introduced by the transition from Conventional Manual Measurements to Automatic Weather Station through the analysis of European and American parallel datasets (+ Australia, Israel & Kyrgyzstan)

    Presentation given at the 10th EUMETNET Data Management Workshop, held in St. Gallen, Switzerland, 28–30 October 2015. In this work, we approach the description of the biases introduced by automation in temperature records. This is one of the first studies in the framework of the Parallel Observations Scientific Team (POST). POST is a newly created group of the International Surface Temperature Initiative (ISTI), with the support of the World Meteorological Organization (WMO). The goals of POST (http://www.surfacetemperatures.org/databank/parallel_measurements) are the study of climate data inhomogeneities at the daily and sub-daily level through the compilation and analysis of parallel measurements. Long instrumental climate records are usually affected by non-climatic changes due to, e.g., relocations and changes in instrumentation, instrument height, or data collection and manipulation procedures. These so-called inhomogeneities distort the climate signal and can hamper the assessment of trends and variability. Thus, to study climatic changes we need to accurately distinguish non-climatic from climatic signals. The most direct way to study the influence of non-climatic changes on the distribution, and to understand the reasons for these biases, is the analysis of parallel measurements. A parallel measurement is composed of two or more time series which measure a climatic variable with two different systems (for example, Montsouris and Stevenson screens) or in two different locations (for example, city centre and airport). They mimic the situation “before” and “after” a homogeneity break. Most parallel measurements are obtained from collocated or nearly collocated series and can help us to understand the size and shape of different typical sources of inhomogeneity that affect climate series. Here we study the transition from conventional manual temperature measurements (CON) to Automatic Weather Stations (AWS), using several parallel datasets distributed over Europe and America. The variables studied in the analysis presented here are daily maximum and minimum temperature. First, the metadata, when available, are gathered to gain knowledge of the exact setting of the parallel series. Second, the difference (temperature) series AWS-CON are submitted to quality control to remove obvious errors, inspected for internal inhomogeneities, and split if necessary. In a third step, each segment is studied to understand the bias introduced by the transition, its seasonality, and changes in the empirical distributions. When additional variables are available, an attempt is made to study the effects of other variables on the observed biases. With the support of Grant CGL2012-32193, Ministerio de Economía y Competitividad (MINECO), España, and FP7-SPACE-2013-1 grant 607193, Uncertainties in Ensembles of Regional Reanalyses (UERRA).
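
    A minimal sketch of the temperature case follows: it computes the AWS-CON difference series, applies a crude quality-control screen, and summarises the seasonality of the bias. All values, the assumed bias, and the 5 °C plausibility threshold are synthetic assumptions, not data or settings from the study.

```python
import numpy as np
import pandas as pd

# Hypothetical parallel daily maximum temperature series (deg C): a
# conventional manual reading (CON) next to an automatic station (AWS).
dates = pd.date_range("2008-01-01", "2012-12-31", freq="D")
rng = np.random.default_rng(2)
doy = np.asarray(dates.dayofyear, dtype=float)
season = 10.0 * np.sin(2 * np.pi * doy / 365.25)      # synthetic annual cycle
con = 15.0 + season + rng.normal(0.0, 3.0, size=len(dates))
aws = con - 0.2 + rng.normal(0.0, 0.3, size=len(dates))  # assumed small cold bias

diff = pd.Series(aws - con, index=dates)  # the AWS-CON difference series

# Crude quality control: discard physically implausible differences.
diff = diff[diff.abs() < 5.0]

# Mean bias of the transition and its seasonality (monthly mean difference).
print("mean bias AWS-CON: %.2f deg C" % diff.mean())
print(diff.groupby(diff.index.month).mean().round(2))
```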

    The International Surface Pressure Databank version 2

    The International Surface Pressure Databank (ISPD) is the world's largest collection of global surface and sea-level pressure observations. It was developed by extracting observations from established international archives, through international cooperation with data recovery facilitated by the Atmospheric Circulation Reconstructions over the Earth (ACRE) initiative, and directly by contributing universities, organizations, and countries. The dataset period is currently 1768–2012, and the databank consists of three data components: observations from land stations, marine observing systems, and tropical cyclone best track pressure reports. Version 2 of the ISPD (ISPDv2) was created as the observational input for the Twentieth Century Reanalysis Project (20CR) and contains the quality control and assimilation feedback metadata from the 20CR. Since then, it has been used for various general climate and weather studies, and an updated version 3 (ISPDv3) has been used in the ERA-20C reanalysis in connection with the European Reanalysis of Global Climate Observations project (ERA-CLIM). The focus of this paper is on the ISPDv2 and the inclusion of the 20CR feedback metadata. The Research Data Archive at the National Center for Atmospheric Research provides data collection and access for the ISPDv2, and will provide access to future versions.

    Simulating the temperature and precipitation signal in an Alpine ice core

    Accumulation and δ18O data from Alpine ice cores provide information on past temperature and precipitation. However, their correlation with seasonal or annual mean temperature and precipitation at nearby sites is often low. This is partly due to the irregular sampling of the atmosphere by the ice core (i.e. ice cores record almost only precipitation events, not dry periods) and the possible incongruity between annual layers and calendar years. Using daily meteorological data from a nearby station and reanalyses, we replicate the ice core from the Grenzgletscher (Switzerland, 4200 m a.s.l.) on a sample-by-sample basis by calculating precipitation-weighted temperature (PWT) over short intervals. Over the last 15 yr of the ice core record, accumulation and δ18O variations can be well reproduced on a sub-seasonal scale. This allows a wiggle-matching approach for defining quasi-annual layers, resulting in high correlations between measured quasi-annual δ18O and PWT. Further back in time, the agreement deteriorates. Nevertheless, we find significant correlations over the entire length of the record (1938-1993) of ice core δ18O with PWT, but not with annual mean temperature. This is due to the low correlations between PWT and annual mean temperature, a characteristic which in the ERA-Interim reanalysis is also found for many other continental mid-to-high-latitude regions. The fact that meteorologically very different years can lead to similar combinations of PWT and accumulation poses limitations on the use of δ18O from Alpine ice cores for temperature reconstructions. Rather than for reconstructing annual mean temperature, δ18O from Alpine ice cores should be used to reconstruct PWT over quasi-annual periods. This variable is reproducible in reanalysis or climate model data and could thus be assimilated into conventional climate models.
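
    The precipitation-weighted temperature described above reduces to a weighted average and can be sketched in a few lines. The daily series here are synthetic stand-ins for station data; the point is only that dry days receive zero weight, as they leave no trace in the ice core.

```python
import numpy as np

# Synthetic daily series standing in for the nearby-station data:
# daily mean temperature (deg C) and precipitation totals (mm).
rng = np.random.default_rng(3)
temp = rng.normal(-5.0, 6.0, size=365)
precip = rng.gamma(shape=0.5, scale=6.0, size=365)

# PWT weights each day's temperature by that day's precipitation, so dry
# days (which leave no trace in the ice core) contribute nothing.
pwt = np.average(temp, weights=precip)

print("PWT:         %.1f deg C" % pwt)
print("annual mean: %.1f deg C" % temp.mean())
```

    Because winter precipitation at high Alpine sites often differs sharply from the annual average conditions, PWT and the annual mean temperature can diverge, which is the low correlation the abstract refers to.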

    Break detection of annual Swiss temperature series

    Instrumental temperature series are often affected by artificial breaks (“break points”) due, e.g., to changes in station location, land use, or instrumentation. The Swiss climate observation network offers a high number and density of stations, many long and relatively complete daily to sub-daily temperature series, and well-documented station histories (i.e., metadata). However, for many climate observation networks outside of Switzerland, detailed station histories are missing, incomplete, or inaccessible. To correct these records, the use of reliable statistical break detection methods is necessary. Here, we apply three statistical break detection methods to high-quality Swiss temperature series and use the available metadata to assess the methods. Owing to the complex terrain in Switzerland, we are able to assess these methods under specific local conditions such as Foehn or crest situations. We find that the temperature series of all stations are affected by artificial breaks (on average, one break point per 48 years), with discrepancies in the abilities of the methods to detect breaks. However, by combining the three statistical methods, almost all of the detected break points are confirmed by metadata. In most cases, these break points can be ascribed to a combination of factors in the station history.
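
    As one example of the class of methods involved, the sketch below implements the single-break statistic of the Standard Normal Homogeneity Test (SNHT), a common break detection method; the abstract does not name the three methods used, so this is an illustration, not the study's procedure, and the input series is synthetic.

```python
import numpy as np

def snht_statistic(series):
    """Return (most likely break index, SNHT T value) for a single break.

    T(k) = k * mean(z[:k])**2 + (n - k) * mean(z[k:])**2, maximised over k,
    where z is the standardised series (Alexandersson-type statistic).
    """
    z = (series - series.mean()) / series.std(ddof=1)
    n = len(z)
    best_k, best_t = 0, -np.inf
    for k in range(1, n):                      # candidate break after index k
        t = k * z[:k].mean() ** 2 + (n - k) * z[k:].mean() ** 2
        if t > best_t:
            best_k, best_t = k, t
    return best_k, best_t

# Synthetic annual temperature anomalies with an artificial +0.6 deg C
# shift (e.g. a relocation) after year 30.
rng = np.random.default_rng(4)
anoms = rng.normal(0.0, 0.3, size=60)
anoms[30:] += 0.6

k, t = snht_statistic(anoms)
print(f"most likely break after index {k}, T = {t:.1f}")
```

    In practice such a statistic is applied to difference series against reference stations, and the significance of T is judged against critical values that depend on the series length.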

    Extreme climate, not extreme weather: the summer of 1816 in Geneva, Switzerland

    We analyze weather and climate during the "Year without Summer" 1816 using sub-daily data from Geneva, Switzerland, representing one of the climatically most severely affected regions. The record includes twice-daily measurements and observations of air temperature, pressure, cloud cover, wind speed, and wind direction, as well as daily measurements of precipitation. Comparing 1816 to a contemporary reference period (1799–1821) reveals that the coldness of the summer of 1816 was most prominent in the afternoon, with a shift of the entire distribution function of temperature anomalies by 3–4 °C. Early morning temperature anomalies show a smaller change in the mean, a significant decrease in variability, and no change in negative extremes. Analyzing cloudy and cloud-free conditions separately suggests that an increase in the number of cloudy days was to a significant extent responsible for these features. A daily weather type classification based on pressure, pressure tendency, and wind direction shows extremely anomalous frequencies in summer 1816, with only one day (compared to 20 in an average summer) classified as a high-pressure situation but a tripling of low-pressure situations. The afternoon temperature anomaly expected from the change in weather types alone was much more strongly negative in summer 1816 than in any other year. For precipitation, our analysis shows that the 80% increase in summer precipitation compared to the reference period can be explained by an 80% increase in the frequency of precipitation, while no change could be found in either the average intensity of precipitation or the frequency distribution of extreme precipitation. In all, the analysis shows that the regional circulation and local cloud cover played a dominant role. It also shows that the summer of 1816 was an example of extreme climate, not extreme weather.
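
    A daily weather type classification of the kind described above might look like the following sketch. The pressure thresholds, class names, and example days are illustrative assumptions only, not the scheme actually used in the paper.

```python
import pandas as pd

def classify(p_hpa, dp_hpa, wind_dir_deg):
    """Toy weather type from pressure, pressure tendency, wind direction.

    Thresholds and labels are hypothetical; a real scheme would be
    calibrated against the Geneva record.
    """
    if p_hpa >= 1025.0:
        return "high-pressure"
    if p_hpa <= 1005.0:
        return "low-pressure"
    sector = "W" if 180.0 <= wind_dir_deg < 360.0 else "E"
    trend = "rising" if dp_hpa > 0 else "falling"
    return f"advective-{sector}-{trend}"

# Four hypothetical days: pressure (hPa), 24 h tendency (hPa), wind dir (deg).
days = pd.DataFrame({
    "p":   [1030.0, 1002.0, 1012.0,  998.0],
    "dp":  [   1.2,   -3.5,    0.8,   -1.0],
    "dir": [  90.0,  250.0,  300.0,  200.0],
})
days["type"] = [classify(r.p, r.dp, r.dir) for r in days.itertuples()]
print(days["type"].value_counts())
```

    Counting the classified days per summer then yields frequency anomalies of the kind reported above, such as the near-absence of high-pressure days in 1816.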