
    Climate influences on flood probabilities across Europe

    The link between streamflow extremes and climatology has been widely studied in recent decades. However, a study investigating the effect of large-scale circulation variations on the distribution of seasonal discharge extremes at the European level has been missing. Here we fit a climate-informed generalized extreme value (GEV) distribution to about 600 streamflow records in Europe for each of the standard seasons, i.e. to winter, spring, summer and autumn maxima, and compare it with the classical GEV distribution with parameters invariant in time. The study adopts a Bayesian framework and covers the period 1950 to 2016. Five indices with proven influence on the European climate are examined independently as covariates, namely the North Atlantic Oscillation (NAO), the east Atlantic pattern (EA), the east Atlantic–western Russian pattern (EA/WR), the Scandinavia pattern (SCA) and the polar–Eurasian pattern (POL). It is found that for a high percentage of stations the climate-informed model is preferred to the classical model. Particularly for the NAO during winter, a strong influence on streamflow extremes is detected for large parts of Europe (the climate-informed model is preferred to the classical GEV distribution for 46 % of the stations). Climate-informed fits are characterized by spatial coherence and form patterns that resemble relations between the climate indices and seasonal precipitation, suggesting a prominent role of the considered circulation modes in flood generation. For certain regions, such as northwestern Scandinavia and the British Isles, year-to-year variations of the mean seasonal climate indices result in considerably different extreme value distributions and thus in highly different flood estimates for individual years, differences that can also persist over longer periods.
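    A minimal sketch of the model comparison described above, on synthetic data: a stationary GEV is fitted against a "climate-informed" GEV whose location parameter depends linearly on a NAO-like covariate. The paper uses a Bayesian framework; this sketch substitutes maximum likelihood and AIC for brevity, and all data, coefficients and seeds are hypothetical.

```python
# Sketch: stationary GEV vs. climate-informed GEV (location varies with a
# synthetic NAO-like index). Hypothetical data; ML + AIC stand in for the
# paper's Bayesian model comparison.
import numpy as np
from scipy.stats import genextreme
from scipy.optimize import minimize

rng = np.random.default_rng(42)
n = 67                                    # one value per season, 1950-2016
nao = rng.normal(0.0, 1.0, n)             # synthetic winter NAO index
# Synthetic winter maxima whose location shifts with NAO
x = genextreme.rvs(c=-0.1, loc=100 + 15 * nao, scale=20, random_state=rng)

def nll_stationary(p):
    c, loc, scale = p
    if scale <= 0:
        return np.inf
    return -genextreme.logpdf(x, c, loc, scale).sum()

def nll_informed(p):
    c, a, b, scale = p                    # location = a + b * NAO
    if scale <= 0:
        return np.inf
    return -genextreme.logpdf(x, c, a + b * nao, scale).sum()

fit0 = minimize(nll_stationary, [0.0, x.mean(), x.std()], method="Nelder-Mead")
fit1 = minimize(nll_informed, [0.0, x.mean(), 0.0, x.std()], method="Nelder-Mead")
aic0 = 2 * 3 + 2 * fit0.fun               # 3 parameters
aic1 = 2 * 4 + 2 * fit1.fun               # 4 parameters
print(f"stationary AIC={aic0:.1f}, climate-informed AIC={aic1:.1f}")
```

    With a real covariate effect present, the climate-informed model should be preferred (lower AIC), mirroring the "preferred to the classical model" comparison in the abstract.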

    Investigation and evaluation of methods for homogenisation of hydroclimatic data

    No full text
    149 p.

    This study focuses on the inhomogeneities of temperature time series covering roughly the last 100 years, on the methods used for their adjustment and on the results of that adjustment, and has three components.

    First, based on a systematic study of the scientific literature, observed inhomogeneities in historical and modern temperature time series and their adjustment methods were recorded, classified and evaluated. It was found that these methods are mainly statistical, not well documented by experiments and rarely supported by metadata. In many of the cases studied, the proposed corrections are not even statistically significant.

    Second, from the global database GHCN-Monthly Version 2, all stations containing both raw and adjusted data and satisfying certain criteria of temporal completeness were examined. For the United States of America, because of the large number of available stations, a subset was chosen by suitable sampling. In total, 181 stations worldwide were analysed, and for each one the difference between the adjusted and unadjusted linear 100-year trends was calculated. It was found that in about two thirds of the stations, homogenisation increased the positive trends or decreased the negative ones.

    Third, the most common homogenisation method, 'SNHT for single shifts', was applied to synthetic time series with selected statistical characteristics, with and without artificial offsets. The method was satisfactory when applied to independent data following the normal distribution, but it distorted homogeneous series exhibiting long-term persistence.

    The above results place some constraints on the use of homogenisation procedures and tend to indicate that the global temperature increase during the last century is between 0.4°C and 0.7°C, estimates derived from raw and adjusted data, respectively.

    Eva-Styliani E. Steirou
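    The trend comparison described in the abstract above can be sketched as follows: for one hypothetical station, the linear 100-year trends of the raw and homogenised series are computed and differenced. The step adjustment, its size and the noise level are illustrative assumptions, not values from the thesis.

```python
# Sketch: adjusted-minus-raw 100-year trend difference for one hypothetical
# station. The step "correction" applied before 1950 steepens the trend, as
# homogenisation reportedly did for roughly 2/3 of the 181 stations.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1900, 2000)
raw = 0.004 * (years - 1900) + rng.normal(0, 0.3, years.size)  # ~0.4 °C/century
# Hypothetical homogenisation: lower the pre-1950 segment by 0.15 °C
adjusted = raw - np.where(years < 1950, 0.15, 0.0)

def trend_per_century(t, y):
    slope = np.polyfit(t, y, 1)[0]        # degrees per year
    return 100.0 * slope

d = trend_per_century(years, adjusted) - trend_per_century(years, raw)
print(f"adjusted-minus-raw trend difference: {d:+.2f} °C/century")
```

    Because the least-squares slope is linear in the data, the noise cancels in the difference and only the step correction contributes, so a single downward pre-1950 step always inflates the century trend.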

    Investigation of homogenization errors in hydro-climatic time series with long-term persistence

    No full text
    161 p. National Technical University of Athens, master's thesis, Interdisciplinary-Interdepartmental Postgraduate Programme "Science and Technology of Materials".

    Hydroclimatic time series contain inhomogeneities, which are errors introduced by replacements and calibration of instruments, station relocations, changes in the environment of the stations, etc. The most common expression of inhomogeneities is shifts between two parts of a time series. The identification and correction of inhomogeneities is called homogenization and is usually done with statistical methods that compare a candidate station with one or more neighbouring reference stations, assuming that they belong to the same climatological region and reflect the same weather and climate variations. The homogenization of hydroclimatic data, mainly of temperature and precipitation time series, is a procedure of great importance and also a controversial subject because of its implications for estimates of climate change.

    This study focuses on an important characteristic of hydroclimatic time series that is generally ignored by the homogenization community, long-term persistence, and on its effects on homogenization. It has two components: (a) a literature review and (b) a computational approach.

    1. Literature review. A systematic study of the scientific literature was made in order to examine the types and causes of inhomogeneities, to identify and classify the existing homogenization methods, to understand their stochastic background and to evaluate their output. This review focused mainly on previous evaluation studies of homogenization methods with synthetic data. A main result is that existing homogenization methods generally ignore the long-term persistence of hydroclimatic data, expressed by the Hurst coefficient, and examine only first-order autoregressive series (AR(1)) or series of independently and identically distributed Gaussian errors. No systematic studies of the relationship between the Hurst coefficient and homogenization results were identified. It was also found that homogenization methods assume that the series of temperature differences or precipitation ratios between reference and candidate stations constitute series of random numbers (e.g. white noise). However, a basic stochastic analysis indicates that the difference and ratio series reproduce the autocorrelation function of the reference and candidate series, assuming that both have the same autocorrelation structure.

    2. Computational approach. A computational approach based on Monte Carlo simulations made it possible to understand and evaluate the behaviour of selected classical homogenization methods as a function of various parameters: (a) the Hurst coefficient of the time series (tested values H = {0.50, 0.55, ..., 0.90}); (b) the cross-correlation coefficient between candidate and reference time series (tested values ρXY = {0.5, 0.6, 0.7, 0.8, 0.9, 0.95}); (c) the number of reference time series and the way they were used to locate shifts (reference systems, shown in Table 1); (d) the length of the time series (50 and 100 years); and (e) the minimum distance between possible inhomogeneities, or between an inhomogeneity and the edge of the series (tested values 5 and 10 years). The percentage of time series with false alarms was regarded as the critical factor for this evaluation. For temperature, the homogenization method selected was SNHT for shifts (Alexandersson and Moberg, 1997) in combination with all systems of reference series summarized in Table 1 except 3/1 (see Figure 1). For multiple reference series, a pairwise comparison of the candidate series with all reference series was applied. The SNHT was applied using a cutting algorithm described by Domonkos (2011a) and a 95% confidence level. For precipitation, the Double Mass Curve (Kohler, 1949; Searcy and Hardison, 1960) was selected. The original method is subjective and involves identifying the main inhomogeneity of the time series on a graph; an objective (automated) version was developed using a piecewise linear algorithm based on the least-squares approach. The reference series systems used were the most commonly applied, 1/1 and 3/1. The synthetic series with long-term persistence were simulated using the multiple-time-scale fluctuation approach proposed by Koutsoyiannis (2002) and follow the normal distribution. Temperature data were generated with zero mean and unit standard deviation, and precipitation data with a mean of 1000 mm and a standard deviation of 300 mm. For every candidate series, one or more correlated reference series with the same characteristics were generated (see Table 1). All simulations and computations were based on original MATLAB code.

    3. Results and conclusions. Some main conclusions of this study are summarized in Figures 1, 2, 3 and 4: (a) for time series with H = 0.5 (i.e. white noise), the false-alarm rate is not significant (below 5%), as expected from the design of the homogenization methods, but the percentage of series with false alarms increases with H; (b) the number of reference series, and the minimum number required to locate a shift, greatly affects the percentage of series with false alarms.

    For the temperature data (see Figures 1 and 2) it can be concluded that: (a) for a Hurst coefficient H ≥ 0.85, common shifts located by all (at least three) reference time series correspond to a percentage of series with false alarms higher than 5% and tend to indicate a real inhomogeneity (e.g. systems 2/2 and 3/3 in Figure 1); (b) a common shift identified by only some of the reference series may merely be a false alarm (e.g. Figure 1), so a possible inhomogeneity must be confirmed by analysis of the reference time series; (c) the cross-correlation coefficient between reference and candidate series does not seem to influence the percentage of time series with false alarms; (d) the percentage of series with false alarms for time series of length 50 years was lower than that for 100 years (Figure 1); (e) the minimum distance between possible inhomogeneities, or between an inhomogeneity and the edge of the series, influences the percentage of series with false alarms, but not greatly; a minimum distance of 10 years leads to a lower percentage than a minimum distance of 5 years (Figure 1); (f) for the case of a single reference series, corrections of the located shifts were applied, and these led to similar percentages of series with increased and decreased trends after homogenization, so SNHT does not appear to introduce significant changes in temperature trends; (g) for H < 0.65 the percentage of series with an increased Hurst coefficient after homogenization is similar to that with a decreased one, whereas for H > 0.65 the percentage with an increased Hurst coefficient exceeds that with a decreased one, a difference that grows with the initial Hurst coefficient of the time series.

    For the precipitation data (see Figures 3 and 4) it can be concluded that: (a) for all values of the Hurst coefficient examined, the percentage of series with false alarms decreases as the ratio of the slopes of the two lines of the Double Mass Curve increases; (b) applying the Double Mass Curve with a reference time series produced from three time series (3/1) tends to decrease the percentage of false alarms compared with a single reference series (1/1); (c) for the system 1/1 and all parameters examined, a slope ratio of 1.5 corresponds to a percentage of series with false alarms lower than 5%; for the system 3/1 the corresponding ratio is 1.3. These values seem to be indicative of a real inhomogeneity.

    Eva-Styliani E. Steirou
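    The core Monte Carlo experiment above, applying SNHT to homogeneous synthetic series and counting false alarms as persistence increases, can be sketched as follows. The SNHT statistic is the standard single-shift form T(k) = k·z1² + (n−k)·z2²; the persistent series here are a crude stand-in (a sum of AR(1) components) for the multiple-time-scale simulation of Koutsoyiannis (2002), so the numbers are only indicative.

```python
# Sketch: SNHT false alarms on homogeneous synthetic series, white noise vs.
# a persistent process. The 95% critical value is estimated from white noise,
# so the white-noise false-alarm rate is ~5% by construction.
import numpy as np

rng = np.random.default_rng(1)
n, n_series = 100, 500

def snht_max(x):
    # Alexandersson's T(k) = k*z1^2 + (n-k)*z2^2 on the standardised series
    z = (x - x.mean()) / x.std()
    k = np.arange(5, x.size - 5)          # keep shifts away from the edges
    cs = np.cumsum(z)
    z1 = cs[k - 1] / k                    # mean of the first k values
    z2 = (cs[-1] - cs[k - 1]) / (x.size - k)
    return np.max(k * z1**2 + (x.size - k) * z2**2)

def white(n):
    return rng.normal(size=n)

def persistent(n):
    # crude long-term persistence: sum of AR(1) components at several scales
    y = np.zeros(n)
    for phi in (0.3, 0.7, 0.95):
        e = rng.normal(size=n)
        c = np.empty(n)
        c[0] = e[0]
        for t in range(1, n):
            c[t] = phi * c[t - 1] + e[t]
        y += c
    return y

crit = np.quantile([snht_max(white(n)) for _ in range(n_series)], 0.95)
false_alarms = np.mean([snht_max(persistent(n)) > crit for _ in range(n_series)])
print(f"false-alarm rate under persistence: {false_alarms:.0%} (white noise: ~5%)")
```

    Even without any reference series, this reproduces the thesis's central observation: series that are homogeneous but persistent trip the shift detector far more often than the nominal 5% design rate.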

    The impact of atmospheric teleconnections on the coastal aquifers of Ria Formosa (Algarve, Portugal)

    No full text
    Fluctuations in groundwater level in the Ria Formosa coastal aquifers, southern Portugal, owe 80% of their variability to climate-induced oscillations. Wavelet coherences computed between hydraulic heads and the North Atlantic Oscillation (NAO) and East Atlantic (EA) atmospheric teleconnections show nonstationary and spatially varying relationships. The NAO is the most important teleconnection and the main driver of long-term variability, inducing cycle periods of 6-10 years. The NAO fingerprint is ubiquitous and accounts for nearly 50% of the total variance of groundwater levels. The influence of the EA emerges coupled to the NAO and is mainly associated with oscillations in the 2-4-year band. These cycles contribute less than 5% of the variance in groundwater levels and are more evident further from the coast, in the northern part of the system near the main recharge area. Conversely, the power of the annual cycle increases towards the shoreline. The weight of the annual cycle (related to direct recharge) is greatest in the Campina de Faro aquifer, where it is responsible for 20-50% of the variance of piezometric levels. There, signals linked to atmospheric teleconnections (related to regional recharge) are low-pass filtered and have periods >8 years. This behavior (lack of power in the 2-8-year band) emphasizes the vulnerability of coastal groundwater levels to multi-year droughts, particularly in the already stressed Quinta do Lago region, where hydraulic heads are persistently below sea level.

    Funding: Fundação para a Ciência e a Tecnologia (FCT), Portuguese Foundation for Science and Technology [UID/GEO/50019/2019, SFRH/BD/131568/2017].
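    As an illustration of the kind of coherence analysis described above: the study uses wavelet coherence, which is time-localised; the sketch below uses ordinary magnitude-squared coherence from SciPy as a simpler, stationary stand-in, on synthetic monthly series in which a groundwater-head signal shares an 8-year, NAO-like cycle with a climate index. All series and parameters are hypothetical.

```python
# Sketch: spectral coherence between a synthetic climate index and a
# synthetic groundwater-head series sharing an 8-year cycle. Ordinary
# coherence stands in for the wavelet coherence used in the study.
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(7)
fs = 12.0                                   # monthly sampling: 12 per year
t = np.arange(0, 60, 1 / fs)                # 60 years of monthly data
nao = np.sin(2 * np.pi * t / 8.0) + 0.5 * rng.normal(size=t.size)
head = (2.0 * np.sin(2 * np.pi * t / 8.0)   # NAO-driven multi-year cycle
        + np.sin(2 * np.pi * t)             # annual cycle (direct recharge)
        + rng.normal(size=t.size))

f, Cxy = coherence(nao, head, fs=fs, nperseg=256)   # f in cycles per year
band = (f > 1 / 10) & (f < 1 / 6)           # the 6-10-year band from the study
print(f"mean coherence in the 6-10-yr band: {Cxy[band].mean():.2f}")
```

    The shared multi-year cycle shows up as high coherence in the 6-10-year band, while the annual cycle (present only in the head series) does not, which is the qualitative pattern the abstract describes.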

    Climate and Rivers

    Over the last few decades, as hydrologists have slowly raised their line of sight above the watershed boundary, it has become increasingly recognised that what happens in the atmosphere, as a major source of moisture for the terrestrial branch of the hydrological cycle, can strongly influence river dynamics at a range of spatial and temporal scales. Notwithstanding this, there is still a tendency for some in the river research community to restrict their gaze to the river channel or floodplain. However, Geoff Petts, the person to whom this special issue is dedicated, understood well and widely encouraged a holistic view of river catchment processes. This included an acknowledgment of the role of climate, in its broadest sense, in shaping what happens within and beyond the river channel. The purpose of this paper is therefore to offer a broad overview of the role of some aspects of climate science in advancing knowledge in river research. Topics addressed include the role of climate in influencing river flow regimes, the large-scale climate mechanisms that drive hydrological variability within river basins at inter-annual to decadal timescales, and atmospheric rivers and their link to surface hydrology. In reviewing these topics, a number of key knowledge gaps emerge, including the difficulty of attributing changes in river flow regimes to any one particular cause, the non-stationary and asymmetric forcing of river regimes by modes of climate variability, and the links between atmospheric rivers and terrestrial river channel processes, fluvial habitats and ecological change.

    Current European flood-rich period exceptional compared with past 500 years

    There are concerns that recent climate change is altering the frequency and magnitude of river floods in an unprecedented way [1]. Historical studies have identified flood-rich periods in the past half millennium in various regions of Europe [2]. However, because of the low temporal resolution of existing datasets and the relatively low number of series, it has remained unclear whether Europe is currently in a flood-rich period from a long-term perspective. Here we analyse how recent decades compare with the flood history of Europe, using a new database composed of more than 100 high-resolution (sub-annual) historical flood series based on documentary evidence covering all major regions of Europe. We show that the past three decades were among the most flood-rich periods in Europe in the past 500 years, and that this period differs from other flood-rich periods in terms of its extent, air temperatures and flood seasonality. We identified nine flood-rich periods and associated regions. Among the periods richest in floods are 1560-1580 (western and central Europe), 1760-1800 (most of Europe), 1840-1870 (western and southern Europe) and 1990-2016 (western and central Europe). In most parts of Europe, previous flood-rich periods occurred during cooler-than-usual phases, but the current flood-rich period has been much warmer. Flood seasonality is also more pronounced in the recent period. For example, during previous flood and interflood periods, 41 per cent and 42 per cent of central European floods occurred in summer, respectively, compared with 55 per cent of floods in the recent period.
The exceptional nature of the present-day flood-rich period calls for process-based tools for flood-risk assessment that capture the physical mechanisms involved, and for management strategies that can incorporate the recent changes in risk.

    Funding: Europäischer Forschungsrat (ERC); Fonds zur Förderung der wissenschaftlichen Forschung (FWF) [5605667]; Horizon 2020; DFG; Spanish Agency of Science; Ministry of Education, Youth and Sports of the Czech Republic.
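    One simple way to flag "flood-rich periods" in a multi-century record, in the spirit of the analysis above, is to count documented floods per year and mark multi-decadal windows whose counts fall in the upper tail. The window length, the 90th-percentile threshold and the synthetic flood counts below are illustrative choices, not the paper's method.

```python
# Sketch: detect flood-rich windows in 500+ years of synthetic annual flood
# counts, with one embedded flood-rich episode (1760-1800, echoing one of
# the periods named in the abstract).
import numpy as np

rng = np.random.default_rng(3)
years = np.arange(1500, 2017)
rate = np.full(years.size, 2.0)                  # background floods per year
rate[(years >= 1760) & (years <= 1800)] = 5.0    # embedded flood-rich episode
counts = rng.poisson(rate)

w = 31                                           # 31-year moving window
window_sums = np.convolve(counts, np.ones(w, dtype=int), mode="valid")
centers = years[w // 2 : w // 2 + window_sums.size]
threshold = np.percentile(window_sums, 90)       # upper-tail cut-off
rich = centers[window_sums > threshold]
print(f"flood-rich window centres: {rich.min()}-{rich.max()}")
```

    The flagged window centres cluster around the embedded episode; on real documentary series the same windowing idea needs care with reporting biases, which is why the paper relies on a curated high-resolution database.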
