65 research outputs found

    Comparison of three downscaling methods in simulating the impact of climate change on the hydrology of Mediterranean basins

    Studies of the impact of climate change on water resources usually follow a top-down approach: an emissions scenario is used to run a GCM simulation, which is downscaled (with an RCM and/or statistical methods) and bias-corrected; the resulting data are then used to force a hydrological model. Impact studies seldom take all relevant uncertainties into account; in fact, many published studies use only one climate model and one downscaling technique. In this study, the outputs of an atmosphere-ocean regional climate model are downscaled and bias-corrected using three different techniques: a statistical method based on weather regimes, a quantile-mapping method, and an anomaly method. The resulting data are used to force a distributed hydrological model over the French Mediterranean basins. These basins are characterized by water scarcity and increasing human pressure, which create a demand for assessments of the impact of climate change on hydrological systems. The purpose of the study is mainly methodological: evaluating the uncertainty related to the downscaling and bias-correction step. The periods chosen to compare the changes are the end of the 20th century (1970-2000) and the middle of the 21st century (2035-2065). The study shows that the three methods produce similar anomalies of mean annual precipitation, but with important differences, mainly in the spatial patterns, and that there are important differences in the temperature anomalies. These uncertainties are amplified by the hydrological model. In some basins, the simulations do not agree on the sign of the anomalies, and in many others the differences in the amplitude of the anomaly are very large. Therefore, the uncertainty related to the downscaling and bias-correction of the climate simulation must be taken into account in order to better estimate the impact of climate change, with its uncertainty, on a specific basin.
The study also shows that, depending on the RCM simulation used and on the periods studied, there might be significant increases in winter precipitation over the Cévennes region of the Massif Central, which is already affected by flash floods, and significant decreases in summer precipitation in most of the region. This would cause a decrease in average discharge by the middle of the 21st century at most of the gauging stations studied, especially in summer. Winter, and possibly spring in some areas, are the exception, as discharge may increase in some basins.
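Of the three bias-correction techniques above, quantile mapping is the easiest to sketch. A minimal empirical version (assuming numpy; the study's exact implementation is not detailed here) maps each model value to the observed value at the same quantile of the reference-period distributions:

```python
import numpy as np

def quantile_mapping(obs_ref, model_ref, model_series, n_q=100):
    """Empirical quantile mapping: map each model value to the observed
    value sitting at the same quantile of the reference-period CDFs."""
    q = np.linspace(0.0, 1.0, n_q)
    model_q = np.quantile(model_ref, q)  # model reference-period quantiles
    obs_q = np.quantile(obs_ref, q)      # observed reference-period quantiles
    # Piecewise-linear transfer function from model space to observation space.
    return np.interp(model_series, model_q, obs_q)
```

Applied to future model output, the same transfer function carries the reference-period correction into the projection period; the divergence between this and the other two methods is precisely the uncertainty the study quantifies.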

    An updated assessment of past and future warming over France based on a regional observational constraint

    Building on CMIP6 climate simulations, updated global and regional observations, and recently introduced statistical methods, we provide an updated assessment of past and future warming over France. Following the IPCC AR6 and recent global-scale studies, we combine model results with observations to constrain climate change at the regional scale. Over mainland France, the forced warming in 2020 with respect to 1900–1930 is assessed to be 1.66 [1.41 to 1.90] °C, i.e., in the upper range of the CMIP6 estimates, and is almost entirely human-induced. A refined view of the seasonality of this past warming is provided through updated daily climate normals. Projected warming in response to an intermediate emission scenario is assessed to be 3.8 [2.9 to 4.8] °C in 2100 and rises up to 6.7 [5.2 to 8.2] °C in a very high emission scenario, i.e., substantially higher than in previous ensembles of global and regional simulations. Winter and summer warming are expected to be about 15 % lower and 30 % higher than the annual mean warming, respectively, for all scenarios and time periods. This work highlights the importance of combining various lines of evidence, including model and observed data, to deliver the most reliable climate information. This refined regional assessment can feed adaptation planning for a range of activities and provides additional rationale for urgent climate action. Code is made available to facilitate replication over other areas or political entities.

    Regional climate model emulator based on deep learning: concept and first evaluation of a novel hybrid downscaling approach

    Providing reliable information on climate change at the local scale remains a challenge of first importance for impact studies and policymakers. Here, we propose a novel hybrid downscaling method combining the strengths of empirical statistical downscaling methods and Regional Climate Models (RCMs). The aim of this tool is to enlarge the size of high-resolution RCM simulation ensembles at low cost. We build a statistical RCM-emulator by estimating the downscaling function included in the RCM. This framework allows us to learn the relationship between large-scale predictors and a local surface variable of interest over the RCM domain, in present and future climate. Furthermore, the emulator relies on a neural network architecture, which grants computational efficiency. The RCM-emulator developed in this study is trained to produce daily maps of near-surface temperature at the RCM resolution (12 km). The emulator demonstrates an excellent ability to reproduce the complex spatial structure and daily variability simulated by the RCM, and in particular the way the RCM locally refines the low-resolution climate patterns. Training in future climate appears to be a key feature of our emulator. Moreover, there is a huge computational benefit in running the emulator rather than the RCM: training the emulator takes about 2 hours on a GPU, and prediction is nearly instantaneous. However, further work is needed to improve the way the RCM-emulator reproduces some of the temperature extremes and the intensity of climate change, and to extend the proposed methodology to other regions, GCMs, RCMs, and variables of interest.
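As a toy illustration of the emulator concept (the actual emulator is a neural network; this linear least-squares stand-in, its dimensions, and all variable names are illustrative assumptions), one can estimate a downscaling function from paired coarse/fine-resolution data and then apply it at negligible cost:

```python
import numpy as np

rng = np.random.default_rng(42)
n_days, n_coarse, n_fine = 500, 16, 144

# Synthetic paired data standing in for GCM predictors and RCM output:
X = rng.standard_normal((n_days, n_coarse))       # coarse large-scale predictors
W_true = rng.standard_normal((n_coarse, n_fine))  # unknown "downscaling function"
Y = X @ W_true + 0.01 * rng.standard_normal((n_days, n_fine))  # fine-grid field

# "Train" the emulator: recover the mapping by least squares.
W_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)

# "Run" the emulator: prediction is a single matrix product, nearly instantaneous.
Y_pred = X @ W_hat
```

The real emulator replaces the linear map with a deep network precisely because the RCM's refinement of coarse patterns is nonlinear and spatially structured, but the train-once-predict-cheaply economics are the same.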

    Statistical methods in the detection and attribution of long-term climate changes

    Detection and attribution of climate change has been a growing activity since the 1990s, when the question of a possible human influence on the observed climate arose. I will first briefly introduce this theme, together with the standard definitions of detection and attribution. Second, I will review the statistical models that have been used over the last 20 years to deal with these questions. These models were predominantly linear regression models in which the observations are regressed onto the expected response patterns to different external forcings. Several levels of complexity have been proposed, from the usual linear regression models to sophisticated errors-in-variables models in which observational and climate-modelling uncertainties are accounted for. A recent, simple alternative proposes to avoid linear regression while making similar assumptions regarding uncertainties. Third, I will discuss a few statistical issues common to these models: the estimation of internal variability and its covariance matrix, which is required to carry out optimal inference; the estimation of climate-modelling uncertainty and its underlying assumptions; and the data preprocessing and dimension reduction needed to use these models. (Author affiliation: Météo France - CNRS)
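The regression framework described above can be sketched in a few lines. This is a generic generalized-least-squares version (assuming numpy; preprocessing, dimension reduction, and the errors-in-variables extensions mentioned in the abstract are omitted):

```python
import numpy as np

def gls_fingerprint(y, X, C):
    """Regress observations y onto expected response patterns X (one
    column per external forcing), weighting by the inverse covariance C
    of internal variability. Returns the scaling factors and their
    covariance; detection corresponds to a factor significantly > 0."""
    Ci = np.linalg.inv(C)
    A = X.T @ Ci @ X
    beta = np.linalg.solve(A, X.T @ Ci @ y)  # optimal scaling factors
    beta_cov = np.linalg.inv(A)              # their uncertainty
    return beta, beta_cov
```

In practice C is estimated from unforced control simulations, which is exactly why its estimation (the first issue listed above) matters: a poorly estimated C makes the "optimal" weighting unreliable.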

    Modélisation statistique des changements climatiques, détection, et attribution

    In detection and attribution studies, various statistical tools are used to study ongoing climate changes. Detection of a change consists in showing that a phenomenon is indeed a change: one shows, statistically, that the phenomenon is inconsistent with the internal variability of the climate system alone, considered as random. Attribution of a change to one or several causes consists in establishing a causal link between different physically plausible explanatory factors and the change under study. We present here a brief description of the statistical models and approach used in this framework, in which hypothesis testing plays an important role. We then introduce some of the statistical issues that can be encountered in carrying out these studies.

    Is future climate predictable with statistics?

    The purpose of this note is to briefly introduce the statistical models and methods used in climate science to estimate, from observations, the sensitivity of the Earth's climate to greenhouse gases. First, the context of climatology is described, with an explanation of how statistics can interact with the use of climate models. A description of the main models used, which are original variants of errors-in-variables models, follows. Then a few issues for which methodological progress would be helpful are mentioned, including the inference of large covariance matrices and uncertainty quantification.

    Détection statistique des changements climatiques

    According to the Intergovernmental Panel on Climate Change (IPCC), detection is the statistical demonstration that an observed change cannot be explained by natural internal variability alone. This PhD thesis deals with regional climate change detection and, in particular, with the statistical methods suited to it. Several statistical hypothesis-testing procedures are introduced and studied. The first method considered involves looking for a climate change signal in the observations, assuming that its spatial distribution is known. In this case, a new adaptation of the optimal fingerprint method is proposed, based on a well-conditioned estimate of the covariance matrix of internal climate variability. The second approach looks for a signal with a prescribed temporal pattern; this pattern can be evaluated from climate model runs by using smoothing splines. A third strategy involves the study of a climate change signal that is not specified in advance but satisfies a space-time separability assumption, with a time component that is required to be regular. A functional statistical framework can then be used to construct a significance test for the first smooth principal component, based on the penalised likelihood ratio. Applying these different methods to observed datasets covering France and the Mediterranean basin has led to new results regarding the current climate changes over these regions. Significant changes are found in the mean annual and seasonal temperatures, as well as in annual precipitation over France. These changes are not spatially uniform and modify the spatial distribution of the variable considered.
Finally, comparing the various methods proposed makes it possible to discuss the ability of numerical climate models to properly represent the spatial and temporal features of climate change.
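The "well-conditioned" covariance estimate mentioned in the abstract can be illustrated by shrinking the empirical covariance toward a scaled identity, in the spirit of a Ledoit-Wolf estimator (a sketch only: the fixed shrinkage weight `alpha` is an illustrative assumption, whereas the actual method derives it from the data):

```python
import numpy as np

def shrinkage_covariance(samples, alpha=0.1):
    """Shrink the empirical covariance toward a scaled identity so the
    estimate remains invertible even when the number of control-run
    samples is smaller than the spatial dimension."""
    S = np.cov(samples, rowvar=False)   # empirical covariance (may be singular)
    mu = np.trace(S) / S.shape[0]       # average variance across dimensions
    return (1.0 - alpha) * S + alpha * mu * np.eye(S.shape[0])
```

This matters for optimal fingerprinting because the method must invert this matrix: with few samples the raw estimate is singular, while the shrunk estimate always has strictly positive eigenvalues.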
