13 research outputs found
Homogenisation of radiosonde temperature time series in Antarctica with ERA-Interim data
The RAOBCORE (Radiosonde Observation Correction using Reanalysis) method for homogenising radiosonde temperature time series, developed at the Institute of Meteorology and Geophysics of the University of Vienna, has so far used time series of the differences between observations and background forecasts from ERA-40 reanalyses (so-called innovation statistics) to detect and correct breaks. This worked least satisfactorily in Antarctica, because the Brewer-Dobson circulation in ERA-40 was too strong in the later years, which led to vertically inconsistent temperature trends over Antarctica. With ERA-Interim, new reanalyses are now available that are considerably better in this respect and therefore provide a better reference for homogenisation, at least from 1989 onwards.
The method for estimating break sizes was improved and compared with the original method. Breaks in Antarctic radiosonde temperature time series are now corrected only if at least five surrounding Antarctic stations are available as a reference for estimating the break size. Without neighbouring stations, breaks before 1989 are practically impossible to correct, because the ERA-40 background, which is still used before 1989, contains larger inhomogeneities over Antarctica caused by changes in the satellite data. The effects on the homogenisation of Antarctic radiosonde temperature time series of these improvements to the correction calculation, and of using ERA-Interim innovations, were investigated.
With the new corrections, the time series agree better with the mean over all Antarctic stations, and the spatial homogeneity of the radiosonde temperature trends is improved. Some breaks can no longer be corrected because too little data from surrounding stations is available. The influence of the break-size estimation method is considerably larger before 1989 than afterwards, when ERA-Interim data are available.
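The neighbour-based correction rule described above can be sketched as follows. This is a minimal illustration, not the RAOBCORE code: the composite reference, the five-station threshold and the mean-difference break size are simplified stand-ins for the actual adjustment procedure.

```python
import statistics

def break_size(candidate, neighbours, break_idx, min_neighbours=5):
    """Estimate the size of a break at break_idx in a candidate station's
    temperature series, using surrounding stations as a reference.
    Returns None if fewer than min_neighbours reference stations exist,
    mirroring the five-station rule described above (illustrative only)."""
    if len(neighbours) < min_neighbours:
        return None  # break is left uncorrected
    # composite reference: mean over neighbour stations at each time step
    composite = [statistics.fmean(vals) for vals in zip(*neighbours)]
    # candidate-minus-reference difference series
    diff = [c - r for c, r in zip(candidate, composite)]
    # break size: shift of the mean difference across the break date
    before = statistics.fmean(diff[:break_idx])
    after = statistics.fmean(diff[break_idx:])
    return after - before
```

A correction would then subtract the estimated size from the segment after the break before recomputing trends.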
Probabilistic downscaling with meteorological analogues: a study of spatial coherence
Studying past and present-day precipitation and its link to the large-scale circulation improves our understanding of precipitation characteristics and helps to anticipate their future behaviour. Downscaling techniques are being developed to bridge the gap between large-scale climate information from global reanalyses or GCM projections and the local meteorological information relevant for hydrology. The stepwise analogue downscaling method for hydrology (SANDHY) is extended to the whole of mainland France by optimising the geopotential predictor domains for 608 zones covering France, using a multiple growing rectangular domain algorithm that makes it possible to take equifinality into account. A high diversity of predictor domains was found. To increase spatial coherence, three ways of reducing the parameter space are explored: assessing the skill of predictor domains found for other zones, forming groups of zones using cluster algorithms, and using a less skewed predictand variable during optimisation. Using information from neighbouring zones partly counterbalances limitations of the optimisation algorithm. A feature-based spatial verification method (SAL) is adapted to the probabilistic precipitation simulations provided by SANDHY. Skill scores derived from the probabilistic SAL are used to assess different strategies for spatially coherent precipitation downscaling at catchment scale. Locally optimised predictor domains lead to better localisation of precipitation in the catchment and higher local skill, while uniform predictor domains for the whole catchment lead to a more realistic spatial structure of the simulated precipitation.
Streamflow simulations for the Durance catchment (Southern Alps) are most sensitive to the realistic localisation of precipitation, which highlights the interest of locally optimising predictor domains.
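The analogue step underlying SANDHY can be illustrated with a minimal sketch (not SANDHY itself): candidate days are ranked by a distance between their geopotential field and the target day's field over the predictor domain, and the precipitation observed on the best analogue days forms the probabilistic prediction. The function names and the RMSE distance are illustrative choices.

```python
import math

def analogue_days(target_field, archive_fields, k=3):
    """Rank archive days by RMSE between their (flattened) geopotential
    field and the target field over the predictor domain; return the
    indices of the k best analogues."""
    def rmse(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))
    ranked = sorted(range(len(archive_fields)),
                    key=lambda i: rmse(target_field, archive_fields[i]))
    return ranked[:k]

def downscaled_precip(analogues, archive_precip):
    """The probabilistic prediction is the empirical distribution of
    precipitation observed on the analogue days."""
    return [archive_precip[i] for i in analogues]
```

Optimising the predictor domain then amounts to choosing which grid points enter the fields above so that the resulting precipitation distributions score best.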
Evaluation of the HadGEM3-A simulations in view of detection and attribution of human influence on extreme events in Europe
A detailed analysis is carried out to assess the skill of the HadGEM3-A global atmospheric model in simulating extreme temperatures, precipitation and storm surges in Europe, in view of their attribution to human influence. The analysis is based on an ensemble of 15 atmospheric simulations forced with observed sea surface temperatures over the 54-year period 1960-2013. These simulations, together with dual simulations without human influence in the forcing, are intended to be used in weather and climate event attribution. The analysis investigates the main processes leading to extreme events, including atmospheric circulation patterns, their links with temperature extremes, and land-atmosphere and troposphere-stratosphere interactions. It also compares observed and simulated variability, trends and generalized extreme value theory parameters for temperature and precipitation. One of the most striking findings is the ability of the model to capture North Atlantic atmospheric weather regimes, as obtained from a cluster analysis of sea level pressure fields. The model also reproduces the main observed weather patterns responsible for temperature and precipitation extreme events. However, biases are found in many physical processes. Slightly excessive drying may be the cause of an overestimated summer interannual variability and too intense heat waves, especially in central and northern Europe. However, this does not seem to hinder proper simulation of summer temperature trends. Cold extremes appear well simulated, as do the underlying blocking frequency and stratosphere-troposphere interactions. Extreme precipitation amounts are overestimated and too variable. The atmospheric conditions leading to storm surges were also examined in the Baltic region. There, the simulated weather conditions do not appear to produce strong enough storm surges, but the winds were found to be in very good agreement with reanalyses.
The performance in reproducing atmospheric weather patterns indicates that biases mainly originate from local and regional physical processes. This makes local bias adjustment meaningful for climate change attribution
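As an illustration of the extreme value analysis mentioned above, the following sketch fits the Gumbel special case of the GEV (shape parameter fixed at zero) to block maxima by the method of moments. This is a simplified stand-in: the study itself would estimate all three GEV parameters, typically by maximum likelihood.

```python
import math
import statistics

EULER_GAMMA = 0.57721566490153286  # Euler-Mascheroni constant

def fit_gumbel(block_maxima):
    """Moment estimators for the Gumbel distribution (GEV with shape 0):
    the scale follows from the sample standard deviation via
    sd = scale * pi / sqrt(6), and the location from the mean via
    mean = loc + EULER_GAMMA * scale."""
    mean = statistics.fmean(block_maxima)
    sd = statistics.stdev(block_maxima)
    scale = sd * math.sqrt(6) / math.pi
    loc = mean - EULER_GAMMA * scale
    return loc, scale
```

Comparing such fitted parameters between observed and simulated annual maxima is one way to quantify the biases in extremes discussed above.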
Spatially coherent probabilistic precipitation downscaling with meteorological analogues
Studying past and present-day precipitation and its link to the large-scale circulation improves our understanding of precipitation characteristics and helps to anticipate their future behaviour. Downscaling techniques are being developed to bridge the gap between large-scale climate information from global reanalyses or GCM projections and the local meteorological information relevant for hydrology. The stepwise analogue downscaling method for hydrology (SANDHY) is extended to the whole of mainland France by optimising the geopotential predictor domains for 608 zones covering France, using a multiple growing rectangular domain algorithm that makes it possible to take equifinality into account. A high diversity of predictor domains was found. To increase spatial coherence, three ways of reducing the parameter space are explored: assessing the skill of predictor domains found for other zones, forming groups of zones using cluster algorithms, and using a less skewed predictand variable during optimisation. Using information from neighbouring zones partly counterbalances limitations of the optimisation algorithm. A feature-based spatial verification method (SAL) is adapted to the probabilistic precipitation simulations provided by SANDHY. Skill scores derived from the probabilistic SAL are used to assess different strategies for spatially coherent precipitation downscaling at catchment scale. Locally optimised predictor domains lead to better localisation of precipitation in the catchment and higher local skill, while uniform predictor domains for the whole catchment lead to a more realistic spatial structure of the simulated precipitation.
Streamflow simulations for the Durance catchment (Southern Alps) are most sensitive to the realistic localisation of precipitation, which highlights the interest of locally optimising predictor domains.
Spatial verification of ensemble precipitation: an ensemble version of SAL
Spatial verification methods able to handle high-resolution ensemble forecasts and analysis ensembles are increasingly required because of the growing development of such ensembles. An ensemble extension of the spatial verification method SAL (Structure, Amplitude, Location) is proposed here. The ensemble SAL (eSAL) allows ensemble forecasts to be verified against a deterministic or ensemble analysis. The eSAL components are equal to those of SAL in the deterministic case, thus allowing the comparison of deterministic and ensemble forecasts. The Mesoscale Verification Intercomparison over Complex Terrain (MesoVICT) project provides a dataset containing deterministic and ensemble precipitation forecasts, as well as a deterministic and an ensemble analysis, for case studies in summer 2007 over the greater Alpine region. These datasets allow testing the sensitivity of SAL and eSAL to analysis uncertainty and their suitability for the verification of ensemble forecasts. Their sensitivity to the main parameter of this feature-based method, the threshold for defining precipitation features, is furthermore tested for both the deterministic and ensemble forecasts. Our results stress the importance of using meaningful thresholds in order to limit any unstable behaviour of the threshold-dependent SAL components. The eSAL components are typically close to the median of the distribution of deterministic SAL components calculated for all combinations of ensemble members of the forecast and the analysis, at considerably lower computational cost. The eSAL extension can be considered a relevant summary measure that leads to more easily interpretable SAL diagrams.
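The amplitude component of SAL, and one simple way of extending it to an ensemble, can be sketched as follows. The pooling over members shown here is an illustrative assumption; the exact eSAL definitions are those given in the paper.

```python
import statistics

def amplitude(forecast_fields, observed_field):
    """Amplitude component of SAL: the normalised difference of
    domain-mean precipitation, bounded in [-2, 2]. For an ensemble
    forecast, the domain mean is here pooled over all members
    (an illustrative ensemble extension). Assumes non-zero mean
    precipitation in at least one of the two fields."""
    d_mod = statistics.fmean(statistics.fmean(f) for f in forecast_fields)
    d_obs = statistics.fmean(observed_field)
    return (d_mod - d_obs) / (0.5 * (d_mod + d_obs))
```

A perfect-amplitude forecast gives 0; a forecast with twice the observed domain mean gives +2/3, independently of where the rain falls, which is why SAL complements this with structure and location components.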
Role of circulation in European heatwaves using flow analogues
The intensity of European heatwaves is connected to specific synoptic atmospheric circulation patterns. Given the relatively small number of observations, estimating the connection between the circulation and temperature requires ad hoc statistical methods. This can be achieved with analogue methods, which make it possible to determine a distribution of temperature conditioned on the circulation. The computation of analogues depends on a few parameters. In this article, we evaluate the influence of the variable representing the circulation, the size of the computation domain, the length of the dataset, and the number of analogues on the reconstructed temperature anomalies. We test the sensitivity of the temperature reconstruction to these parameters for four emblematic recent heatwaves. The paper provides general guidelines for the use of flow analogues to investigate European summer heatwaves. We find that Z500 is better suited than SLP to simulate temperature anomalies, and that rather small domains lead to better reconstructions. The dataset length has an important influence on the uncertainty. We conclude with a set of recommendations for an optimal use of analogues to probe European heatwaves.
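The conditional-distribution idea can be sketched as follows: the n archive days with the most similar circulation supply an empirical distribution of temperature conditioned on the flow. The Euclidean distance and the parameter names are illustrative assumptions; the circulation variable (Z500 or SLP), the domain size, the dataset length and n are exactly the parameters whose influence the study evaluates.

```python
import math

def conditional_temperatures(target_z500, archive_z500, archive_temp, n=20):
    """Return the temperature anomalies of the n archive days whose
    (flattened) circulation field is closest to the target day's field,
    i.e. an empirical temperature distribution conditioned on the
    circulation."""
    best = sorted(range(len(archive_z500)),
                  key=lambda i: math.dist(target_z500, archive_z500[i]))[:n]
    return [archive_temp[i] for i in best]
```

The spread of the returned sample then quantifies how much of a heatwave's temperature anomaly the circulation alone can explain.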
Analysis of the exceptionally warm December 2015 in France using flow analogues
Capsule: December 2015 in France was an extreme of both circulation and temperature. Circulation and climate change each partly explain the 4°C anomaly; we found no link between climate change and the circulation. The event: The December 2015 average temperature broke a record in France, with an anomaly of +4.1°C (Fig. 1a) with respect to the 1949-2015 climatology. The linear trend of average December temperature (in red in Fig. 1a) is not significant (p-value > 0.05), as regional temperature variability is high in winter. Such a positive temperature anomaly has impacts on the vegetation cycle (the French press covered this topic in the daily newspaper Le Monde). It also affects local economies, e.g. tourism in ski resorts. The temperature anomaly was concomitant with a zonal atmospheric circulation over Western Europe (Fig. 1b), directing mild subtropical air masses towards France. We found that the mean monthly SLP (sea level pressure) anomaly over the black box of Fig. 1b is also a record high for the NCEP reanalysis. Such a circulation type generally leads to warm temperatures over France (Yiou and Nogaj, 2004). In this paper we seek to address three questions: How much does the circulation anomaly explain the temperature anomaly during December 2015 in France? What is the influence of climate change on the occurrence of the circulation anomaly? And how does the distribution of temperature conditional on the atmospheric circulation evolve with climate change?
Trends of atmospheric circulation during singular hot days in Europe
The influence of climate change on the mid-latitude atmospheric circulation is still very uncertain. The large internal variability makes it difficult to extract any statistically significant signal regarding the evolution of the circulation. Here we propose a methodology to compute dynamical trends tailored to the circulation of specific days, by tracking the evolution of the distances between the circulation of the day of interest and that of the other days in the time series. We compute these dynamical trends for two case studies of the hottest days recorded in two different European regions (corresponding to the heatwaves of summer 2003 and 2010). We use the NCEP reanalysis dataset, an ensemble of CMIP5 models, and a large ensemble of a single model (CESM) in order to account for different sources of uncertainty. While we find a positive trend in most models for 2003, we cannot draw a conclusion for 2010, since the models disagree on the trend estimates.
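One way to make the distance-based trend concrete (an illustrative sketch, not the paper's exact procedure) is to record, for each year, the minimum distance between the target day's circulation and all daily fields of that year, and then fit a linear trend to that series; a negative slope would suggest the target circulation is becoming more common.

```python
import math

def yearly_min_distances(target_field, fields_by_year):
    """For each year, the minimum Euclidean distance between the target
    day's (flattened) circulation field and the daily fields of that
    year: small values mean a close analogue occurred that year."""
    return {year: min(math.dist(target_field, f) for f in days)
            for year, days in fields_by_year.items()}

def linear_trend(series):
    """Ordinary least-squares slope of the values against the years."""
    xs, ys = zip(*sorted(series.items()))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))
```

Applying this to reanalyses and model ensembles, as the abstract describes, separates the trend signal from internal variability across data sources.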
Web processing service for climate impact and extreme weather event analyses. Flyingpigeon (Version 1.0)
Analyses of extreme weather events and their impacts often require big-data processing of ensembles of climate model simulations. Researchers generally proceed by downloading the data from the providers and processing the files "at home" with their own analysis tools. However, the growing amount of available climate model and observation data makes this procedure quite awkward. In addition, data processing knowledge is kept local, instead of being consolidated into a common resource of reusable code. These drawbacks can be mitigated by using a web processing service (WPS). A WPS hosts services, such as data analysis processes, that are accessible over the web and can be installed close to the data archives. We developed a WPS named 'flyingpigeon' that communicates over an HTTP network protocol based on standards defined by the Open Geospatial Consortium (OGC) [23], to be used by climatologists and impact modelers as a tool for analyzing large datasets remotely. Here, we present the processes currently available in flyingpigeon, covering commonly used operations (preprocessing steps; spatial subsets at continent, country or region level; and climate indices) as well as methods for specific climate data analyses (weather regimes, circulation analogues, segetal flora distribution, and species distribution models). We also developed a novel, browser-based interactive data visualization for circulation analogues, illustrating the flexibility of WPS in designing custom outputs. Bringing the software to the data instead of transferring the data to the code is becoming increasingly necessary, especially with the upcoming massive climate datasets.
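As a minimal illustration of how such a service is invoked, a WPS 1.0.0 Execute request can be encoded as HTTP key-value pairs following the OGC WPS specification. The endpoint URL and the process identifier below are hypothetical placeholders, not flyingpigeon's actual deployment.

```python
from urllib.parse import urlencode

# Hypothetical endpoint; a real deployment publishes its own service URL.
ENDPOINT = "http://example.org/wps"

def wps_execute_url(identifier, inputs):
    """Build a WPS 1.0.0 Execute request using key-value-pair (GET)
    encoding as defined by the OGC WPS specification. `inputs` maps
    process input names to values; they are joined with semicolons
    into the DataInputs parameter."""
    data_inputs = ";".join(f"{k}={v}" for k, v in inputs.items())
    query = {
        "service": "WPS",
        "version": "1.0.0",
        "request": "Execute",
        "identifier": identifier,
        "DataInputs": data_inputs,
    }
    return ENDPOINT + "?" + urlencode(query)
```

The server runs the named process next to the data archive and returns a status or result document, so only the (small) output travels over the network rather than the raw ensemble data.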