
    Regional climate downscaling with prior statistical correction of the global climate forcing

    A novel climate downscaling methodology that attempts to correct climate simulation biases is proposed. By combining an advanced statistical bias correction method with dynamical downscaling, it constitutes a hybrid technique that yields nearly unbiased, high-resolution, physically consistent, three-dimensional fields that can be used for climate impact studies. The method is based on a prior statistical distribution correction of the large-scale, three-dimensional global climate model (GCM) output fields that are taken as boundary forcing for a dynamical regional climate model (RCM). The GCM fields are corrected using meteorological reanalyses. We evaluate this methodology over a decadal experiment. The improvement in terms of spatial and temporal variability is assessed against observations for a past period. The biases of the downscaled fields are much lower with this hybrid technique: the mean temperature bias is reduced by up to a factor of 4 compared with dynamical downscaling alone, i.e. without prior bias correction. Precipitation biases are also improved, offering optimistic perspectives for climate impact studies.
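
    The abstract does not spell out the correction operator; a common choice for distribution-based correction against reanalyses is empirical quantile mapping, sketched below under that assumption. All series and parameter values here are hypothetical.

```python
import numpy as np

def quantile_map(gcm_hist, reanalysis, gcm_target):
    """Empirical quantile mapping: map each target value through the
    historical-GCM empirical CDF onto the reanalysis quantile function."""
    # Empirical non-exceedance probability of each target value
    probs = np.searchsorted(np.sort(gcm_hist), gcm_target) / len(gcm_hist)
    probs = np.clip(probs, 0.0, 1.0)
    # Corresponding value in the reanalysis distribution
    return np.quantile(reanalysis, probs)

# Hypothetical example: a warm-biased GCM temperature field corrected
# toward reanalysis before being used as RCM boundary forcing
rng = np.random.default_rng(0)
reanalysis = rng.normal(10.0, 3.0, 5000)   # "observed" large-scale field
gcm_hist = rng.normal(12.0, 4.0, 5000)     # biased GCM, calibration period
gcm_new = rng.normal(13.0, 4.0, 1000)      # field to correct
corrected = quantile_map(gcm_hist, reanalysis, gcm_new)
print(gcm_new.mean(), corrected.mean())    # mean bias largely removed
```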

    Modeling pairwise dependencies in precipitation intensities

    In statistics, extreme events are classically defined as maxima over a block length (e.g. annual maxima of daily precipitation) or as exceedances above a given large threshold. These definitions allow the hydrologist and the flood planner to apply univariate Extreme Value Theory (EVT) to their time series of interest. But these strategies have two main drawbacks. First, working with maxima or exceedances implies that many observations (those below the chosen threshold or the maximum) are completely disregarded. Second, this univariate modeling does not take spatial dependence into account: nearby weather stations are treated as independent, although their recordings can show otherwise. To start addressing these two issues, we propose a new statistical bivariate model that takes advantage of recent advances in multivariate EVT. Our model can be viewed as an extension of the non-homogeneous univariate mixture. The two strong points of this latter model are its capacity to model the entire range of precipitation (not only the largest values) and the absence of an arbitrarily fixed large threshold to define exceedances. Here, we adapt this mixture and broaden it to the joint modeling of bivariate precipitation recordings. The performance and flexibility of this new model are illustrated on simulated and real precipitation data.
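
    As a rough illustration of the threshold-free idea behind the univariate mixture, here is a minimal sketch of a dynamically weighted blend of a light-tailed body and a generalized Pareto tail. The weight function (a normal CDF here) and all parameter values are assumptions for illustration, not the paper's actual specification.

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

def mixture_pdf(x, g_shape, g_scale, gp_shape, gp_scale, w_loc, w_scale):
    """Unnormalised density: Gamma body for moderate intensities, GPD tail
    for large ones, blended by a smooth weight instead of a hard threshold."""
    w = stats.norm.cdf(x, loc=w_loc, scale=w_scale)   # smooth 0 -> 1 transition
    body = stats.gamma.pdf(x, g_shape, scale=g_scale)
    tail = stats.genpareto.pdf(x, gp_shape, scale=gp_scale)
    return (1 - w) * body + w * tail

# Hypothetical parameters; the blend must be renormalised to integrate to 1
params = dict(g_shape=1.5, g_scale=5.0, gp_shape=0.2,
              gp_scale=8.0, w_loc=15.0, w_scale=5.0)
Z, _ = quad(lambda x: mixture_pdf(x, **params), 0, np.inf)
print(mixture_pdf(30.0, **params) / Z)   # density of a 30 mm/day intensity
```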

    Comparison of GCM- and RCM-simulated precipitation following stochastic postprocessing

    In order to assess to what extent regional climate models (RCMs) yield better representations of climatic states than general circulation models (GCMs), the output of each is usually compared directly with observations. RCM output is often bias corrected, and in some cases correction methods can also be applied to GCMs. This leads to the question of whether bias-corrected RCMs perform better than bias-corrected GCMs. Here the first results from such a comparison are presented, followed by a discussion of the value added by RCMs in this setup. Stochastic postprocessing, based on Model Output Statistics (MOS), is used to estimate daily precipitation at 465 stations across the United Kingdom between 1961 and 2000, using simulated precipitation from two RCMs (RACMO2 and CCLM) and, for the first time, a GCM (ECHAM5) as predictors. The large-scale weather states in each simulation are forced toward observations. The MOS method uses logistic regression to model precipitation occurrence and a Gamma distribution for the wet-day amounts, and is cross validated based on Brier and quantile skill scores. A major outcome of the study is that the corrected GCM-simulated precipitation yields consistently higher validation scores than the corrected RCM-simulated precipitation. This seems to suggest that, in a setup with postprocessing, there is no clear added value by RCMs with respect to downscaling individual weather states. However, due to the different ways of controlling the atmospheric circulation in the RCM and the GCM simulations, such a strong conclusion cannot be drawn. Yet the study shows how challenging it is to demonstrate the value added by RCMs in this setup.
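
    A minimal sketch of the two-part MOS idea described above: logistic regression for wet-day occurrence plus a Gamma fit for wet-day amounts. For simplicity the Gamma parameters are fitted unconditionally here, whereas the actual method conditions the wet-day distribution on the simulated predictor; all data below are synthetic.

```python
import numpy as np
from scipy import stats
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
# Hypothetical predictor: simulated precipitation from the RCM/GCM (mm/day)
sim = rng.gamma(2.0, 2.0, 2000)
# Hypothetical observations loosely tied to the predictor
occ = (rng.uniform(size=2000) < 1 / (1 + np.exp(-(sim - 3.0)))).astype(int)
amount = np.where(occ == 1, rng.gamma(2.0, 1.0 + 0.3 * sim), 0.0)

# Occurrence model: probability of a wet day given simulated precipitation
occ_model = LogisticRegression().fit(sim.reshape(-1, 1), occ)

# Intensity model: Gamma distribution fitted to observed wet-day amounts
wet = amount[occ == 1]
shape, loc, scale = stats.gamma.fit(wet, floc=0)

# Downscaled estimate for a new simulated value
x_new = np.array([[5.0]])
p_wet = occ_model.predict_proba(x_new)[0, 1]        # P(wet day)
q90_wet = stats.gamma.ppf(0.90, shape, scale=scale) # wet-day 90th percentile
print(p_wet, q90_wet)
```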

    Non-linear statistical downscaling of present and LGM precipitation and temperatures over Europe

    Local-scale climate information is increasingly needed for the study of past, present and future climate changes. In this study we develop a non-linear statistical downscaling method to generate local temperature and precipitation values from large-scale variables of an Earth System Model of Intermediate Complexity (here CLIMBER). Our statistical downscaling scheme is based on the concept of Generalized Additive Models (GAMs), capturing non-linearities via non-parametric techniques. Our GAMs are calibrated on the present Western European climate. For this region, annual GAMs (i.e. models based on 12 monthly values per location) are fitted by combining two types of large-scale explanatory variables: geographical (e.g. topographical information) and physical (i.e. entirely simulated by the CLIMBER model). To evaluate the adequacy of the non-linear transfer functions fitted on the present Western European climate, they are applied to different spatial and temporal large-scale conditions. Local projections for the present North American and Northern European climates are obtained and compared to local observations. This partially addresses the issue of spatial robustness of our transfer functions by answering the question "does our statistical model remain valid when applied to large-scale climate conditions from a region different from the one used for calibration?". To assess their temporal performance, local projections for the Last Glacial Maximum period are derived and compared to local reconstructions and General Circulation Model outputs. Our downscaling methodology performs adequately for the Western European climate. Concerning the spatial and temporal evaluations, it does not behave as well for the North American and Northern European climates, because the calibration domain may be too different from the targeted regions. The physical explanatory variables alone are not sufficient to produce realistic downscaled values. However, the inclusion of geographical-type variables – such as altitude, advective continentality and mountain effects on wind (W–slope) – as GAM explanatory variables clearly improves our local projections.
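
    A minimal sketch of a GAM transfer function in the spirit described above, using the pygam library as a stand-in (the study's actual implementation and predictor set may differ); the predictors and the synthetic local response below are hypothetical.

```python
import numpy as np
from pygam import LinearGAM, s

rng = np.random.default_rng(2)
n = 1500
# Hypothetical explanatory variables for each (location, month) pair:
alt = rng.uniform(0, 2500, n)      # geographical: altitude (m)
cont = rng.uniform(0, 1, n)        # geographical: continentality index
t_large = rng.normal(8, 6, n)      # physical: large-scale model temperature (deg C)

# Synthetic "local" temperature combining smooth non-linear effects
t_local = (t_large - 0.0065 * alt + 3 * np.sin(np.pi * cont)
           + rng.normal(0, 1, n))

X = np.column_stack([alt, cont, t_large])
# One smooth term per predictor, as in a GAM transfer function
gam = LinearGAM(s(0) + s(1) + s(2)).fit(X, t_local)

# Apply the fitted transfer function to new large-scale conditions
print(gam.predict(np.array([[1200.0, 0.4, -5.0]])))
```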

    Higher probability of compound flooding from precipitation and storm surge in Europe under anthropogenic climate change

    In low-lying coastal areas, the co-occurrence of high sea level and precipitation resulting in large runoff may cause compound flooding (CF). When the two hazards interact, the resulting impact can be worse than when they occur individually. Both storm surges and heavy precipitation, as well as their interplay, are likely to change in response to global warming. Despite the relevance of CF, a comprehensive hazard assessment beyond individual locations is missing, and no studies have examined CF in the future. Analyzing co-occurring high sea levels and heavy precipitation in Europe, we show that the Mediterranean coasts are experiencing the highest CF probability in the present. However, future climate projections show high CF probability emerging along parts of the northern European coast. In several European regions, CF should be considered as a potential hazard aggravating the risk caused by mean sea level rise in the future.
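
    A minimal sketch of how the co-occurrence of the two hazards can be quantified empirically at one location: compare the joint exceedance frequency of the marginal 99th percentiles against the product expected under independence. All series, thresholds, and dependence levels below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 40 * 365
# Hypothetical daily series at one coastal location
surge = rng.gumbel(0.2, 0.15, n)    # skew surge (m)
precip = rng.gamma(0.7, 6.0, n)     # daily precipitation (mm)
# Inject dependence: storms raise both hazards on a subset of days
storm = rng.uniform(size=n) < 0.03
surge[storm] += rng.gamma(2.0, 0.2, storm.sum())
precip[storm] += rng.gamma(2.0, 10.0, storm.sum())

# "High" sea level and "heavy" precipitation as marginal 99th percentiles
s99, p99 = np.quantile(surge, 0.99), np.quantile(precip, 0.99)
joint = np.mean((surge > s99) & (precip > p99))
independent = 0.01 * 0.01
# A ratio much greater than 1 signals compound-flooding potential
print(joint, independent, joint / independent)
```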

    Coupling statistically downscaled GCM outputs with a basin-lake hydrological model in subtropical South America: evaluation of the influence of large-scale precipitation changes on regional hydroclimate variability

    We explore the reliability of large-scale climate variables, namely precipitation and temperature, as inputs for a basin-lake hydrological model in central Argentina. We used data from two regions in the NCEP/NCAR reanalyses and three regions from LMDZ model simulations forced with observed sea surface temperatures (HadISST) for the last 50 years. The reanalysis data cover part of the geographical area of the Sali-Dulce Basin (region A) and a zone at lower latitudes (region B). The selected LMDZ regions represent the geographical area of the Sali-Dulce Basin (box A) and two areas outside the basin at lower latitudes (boxes B and C). A statistical downscaling method is used to connect the large-scale climate variables inferred from LMDZ and the reanalyses with the hydrological Soil and Water Assessment Tool (SWAT) model in order to simulate the Rio Sali-Dulce discharge during 1950-2005. The SWAT simulations are then used to force the water balance of Laguna Mar Chiquita, which experienced an abrupt level rise in the 1970s attributed to the increase in Rio Sali-Dulce discharge. Although the 1970s lowstand is not well reproduced in either simulation, the key hydrological cycles in the lake level are accurately captured. Even though satisfactory results are obtained with the SWAT simulations using downscaled reanalyses, the lake level is more realistically simulated with the SWAT simulations using downscaled LMDZ data from boxes B and C, showing a strong climate influence from the tropics on lake-level fluctuations. Our results highlight the ability of downscaled climatic data to reproduce regional climate features. Laguna Mar Chiquita can therefore be considered an integrator of large-scale climate changes, since the forcing scenarios giving the best results are those relying on global climate simulations forced with observed sea surface temperatures. This climate-basin-lake model is a promising approach for understanding and simulating long-term lake-level variations.
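
    The lake water-balance component can be pictured as simple volume bookkeeping forced by simulated river discharge. The sketch below (fixed lake area, mean depth as a stand-in for level, synthetic forcing) is a toy assumption for illustration, not the study's actual balance model.

```python
import numpy as np

def lake_level(q_in, precip, evap, area, dt=86400.0, v0=5e9):
    """Toy water balance: dV/dt = Q_in + (P - E) * A, integrated daily,
    with a fixed area-volume relation for simplicity."""
    v = np.empty(len(q_in) + 1)
    v[0] = v0
    for t in range(len(q_in)):
        v[t + 1] = max(v[t] + dt * (q_in[t] + (precip[t] - evap[t]) * area), 0.0)
    return v / area   # mean depth (m) as a stand-in for lake level

rng = np.random.default_rng(4)
days = 365 * 10
q = rng.gamma(3.0, 30.0, days)        # river discharge (m3/s), e.g. from SWAT
p = rng.gamma(0.5, 4e-3, days) / 86400.0   # precipitation rate (m/s)
e = np.full(days, 3.5e-8)             # evaporation rate (m/s)
level = lake_level(q, p, e, area=2e9) # hypothetical ~2000 km2 lake area
print(level[-1])
```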

    Present and LGM permafrost from climate simulations: contribution of statistical downscaling

    We quantify the agreement between permafrost distributions from PMIP2 (Paleoclimate Modeling Intercomparison Project) climate models and permafrost data. We evaluate the ability of several climate models to represent permafrost and assess the variability between their results.

    Studying a heterogeneous variable such as permafrost implies conducting the analysis at a smaller spatial scale than the resolution of climate models. Our approach consists of applying statistical downscaling methods (SDMs) to large- or regional-scale atmospheric variables provided by climate models, leading to local-scale permafrost modelling. Among the SDMs, we first choose a transfer-function approach based on Generalized Additive Models (GAMs) to produce a high-resolution climatology of surface air temperature. We then define the permafrost distribution over Eurasia from air temperature conditions. In a first validation step on the present climate (CTRL period), this method shows some limitations, with non-systematic improvements compared with the large-scale fields.

    We therefore develop an alternative statistical downscaling method based on a Multinomial Logistic GAM (ML-GAM), which directly predicts the occurrence probabilities of local-scale permafrost. The resulting permafrost distributions are in better agreement with the CTRL data. On average over the nine PMIP2 models, the global agreement with CTRL permafrost data is better when using ML-GAM than when applying the GAM method with air temperature conditions. In both cases, the provided local information reduces the variability between climate model results. This also confirms that a simple relationship between permafrost and air temperature alone is not always sufficient to represent local-scale permafrost.

    Finally, we apply each method to a very different climate, the Last Glacial Maximum (LGM), in order to quantify the ability of climate models to represent LGM permafrost. The predictions of the SDMs (GAM and ML-GAM) are not in significantly better agreement with the LGM permafrost data than the large-scale fields. At the LGM, neither method reduces the variability between climate model results. We show that the LGM permafrost distribution from climate models strongly depends on large-scale surface air temperature. LGM simulations from climate models show larger differences with the LGM data than in the CTRL period, and these differences reduce the contribution of downscaling.
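
    A minimal sketch of the multinomial-logistic idea behind ML-GAM, using plain multinomial logistic regression (linear terms rather than GAM smooths) as a stand-in; the predictors, class definitions, and data below are hypothetical.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(5)
n = 3000
# Hypothetical large-scale predictors at each grid point
t_air = rng.normal(-2, 10, n)   # mean annual surface air temperature (deg C)
snow = rng.uniform(0, 300, n)   # snow depth proxy (mm)

# Synthetic permafrost classes: 0 = none, 1 = discontinuous, 2 = continuous
score = -t_air - 0.01 * snow + rng.normal(0, 2, n)
y = np.digitize(score, [2.0, 8.0])

X = np.column_stack([t_air, snow])
# The default solver fits a multinomial model for a multi-class target
clf = LogisticRegression(max_iter=1000).fit(X, y)

# Occurrence probabilities of each permafrost class at a new location
print(clf.predict_proba(np.array([[-8.0, 50.0]])))
```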

    Dealing with non-stationarity in sub-daily stochastic rainfall models

    Understanding the stationarity properties of rainfall is critical when using stochastic weather generators. Rainfall stationarity means that the statistics being accounted for remain constant over a given period, which is required both for inferring model parameters and for simulating synthetic rainfall. Despite its critical importance, the stationarity of precipitation statistics is often regarded as a subjective choice whose examination is left to the judgement of the modeller. It is therefore desirable to establish quantitative and objective criteria for defining stationary rain periods. To this end, we propose a methodology that automatically identifies rain types with homogeneous statistics. It is based on an unsupervised classification of the space–time–intensity structure of weather radar images. The transitions between rain types are interpreted as non-stationarities. Our method is particularly suited to dealing with non-stationarity in the context of sub-daily stochastic rainfall models. Results of a synthetic case study show that the proposed approach is able to reliably identify synthetically generated rain types. The application of rain typing to real data indicates that non-stationarity can be significant within meteorological seasons, and even within a single storm. This highlights the need for a careful examination of the temporal stationarity of precipitation statistics when modelling rainfall at high resolution.
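
    A minimal sketch of rain typing by unsupervised classification: cluster per-image summary statistics and read label changes as non-stationarities. The summary features and the k-means choice are assumptions for illustration; the paper's actual descriptors and algorithm may differ.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(6)
n_img = 500
# Hypothetical features summarising each radar image's
# space-time-intensity structure
feats = np.column_stack([
    rng.gamma(2.0, 1.5, n_img),   # mean rain rate (mm/h)
    rng.beta(2, 5, n_img),        # fraction of rainy pixels
    rng.uniform(2, 50, n_img),    # spatial correlation length (km)
])

# Unsupervised classification into rain types with homogeneous statistics
X = StandardScaler().fit_transform(feats)
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)

# A change of label between consecutive images marks a non-stationarity
transitions = np.flatnonzero(np.diff(labels) != 0) + 1
print(labels[:20], transitions[:5])
```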

    Comparison of spatial downscaling methods of general circulation model results to study climate variability during the last glacial maximum

    The extent to which climate conditions influenced the spatial distribution of hominin populations in the past is highly debated. General circulation models (GCMs) and archaeological data have been used to address this issue. However, most GCMs are not currently capable of simulating past surface climate conditions at a spatial resolution detailed enough to distinguish areas of potential hominin habitat. In this paper, we propose a statistical downscaling method (SDM) for increasing the resolution of climate model outputs in a computationally efficient way. Our method uses a generalised additive model (GAM), calibrated on present-day climatology data, to statistically downscale temperature and precipitation time series from the outputs of a GCM simulating the climate of the Last Glacial Maximum (19 000–23 000 BP) over western Europe. Once the SDM is calibrated, we first interpolate the coarse-scale GCM outputs to the final resolution and then use the GAM to compute surface air temperature and precipitation levels from these interpolated GCM outputs and fine-resolution geographical variables such as topography and distance from the ocean. The GAM acts as a transfer function, capturing non-linear relationships between variables at different spatial scales and correcting for GCM biases. We tested three different techniques for the initial interpolation of the GCM output: bilinear, bicubic and kriging. The resulting SDMs were evaluated by comparing the downscaled temperature and precipitation at local sites with paleoclimate reconstructions based on paleoclimate archives (archaeozoological and palynological data), and the impact of the interpolation technique on patterns of variability was explored. The SDM based on kriging interpolation, which provided the best accuracy, was then validated on present-day data outside of the calibration period. Our results show that the downscaled temperature and precipitation values are in good agreement with paleoclimate reconstructions at local sites, and that our method for producing fine-grained paleoclimate simulations is therefore suitable for conducting paleo-anthropological research. It is nonetheless important to calibrate the GAM on a range of data encompassing the data to be downscaled; otherwise, the SDM is likely to overcorrect the coarse-grained data. In addition, the bilinear and bicubic interpolation techniques were shown to distort either the temporal variability or the values of the response variables, while the kriging method offered the best compromise. Since climate variability is an aspect of the environment to which human populations may have responded in the past, the choice of interpolation technique is an important consideration.
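
    A minimal sketch comparing the three interpolation options named above, with a Gaussian-process regressor standing in for kriging (dedicated libraries such as pykrige implement kriging directly); the coarse grid, field values, and target point below are hypothetical.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Hypothetical coarse GCM grid (values: e.g. an LGM temperature field)
lon = np.arange(0, 20, 2.5)
lat = np.arange(40, 56, 2.5)
rng = np.random.default_rng(7)
field = rng.normal(-10, 3, (len(lat), len(lon)))

# Bilinear and bicubic interpolation to a fine-resolution target point
target = np.array([[48.3, 7.7]])   # (lat, lon)
bilinear = RegularGridInterpolator((lat, lon), field, method="linear")(target)
bicubic = RegularGridInterpolator((lat, lon), field, method="cubic")(target)

# Gaussian-process regression as a kriging stand-in
pts = np.array([[la, lo] for la in lat for lo in lon])
gp = GaussianProcessRegressor(kernel=RBF(length_scale=5.0)).fit(pts, field.ravel())
kriged = gp.predict(target)
print(bilinear, bicubic, kriged)
```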