
    Dominance of the mean sea level in the high-percentile sea levels time evolution with respect to large-scale climate variability: a Bayesian statistical approach

    Changes in mean sea level (MSL) are a major, but not the only, cause of changes in high-percentile sea levels (HSL), e.g. the annual 99.9th quantile of sea level; among other factors, large-scale climate variability may also have a strong influence. To unravel the respective influence of each contributor, we propose to use structural time series models considering six major climate indices (CI) (Arctic Oscillation, North Atlantic Oscillation, Atlantic Multidecadal Oscillation, Southern Oscillation Index, Niño 1+2 and Niño 3.4) as well as a reconstruction of MSL. The method is applied to eight century-long tide gauge records across the world: Brest (France), Newlyn (UK), Cuxhaven (Germany), Stockholm (Sweden), Gedser (Denmark), Halifax (Canada), San Francisco (US) and Honolulu (US). Treating the problem in a Bayesian setting makes it possible to derive an importance indicator, which measures how often the considered driver is included in the model. The application to the eight tide gauges shows that the MSL signal is a strong driver (except at Gedser), but not the only one. In particular, the influence of the Arctic Oscillation index at Cuxhaven, Stockholm and Halifax, and of the Niño 1+2 sea surface temperature index at San Francisco, appears to be very strong as well. A similar analysis was conducted by restricting the period of interest to the first part of the 20th century. Over this period, we show that the dominance of MSL is weaker, while an ensemble of CIs also contributes substantially to the HSL time evolution. The proposed setting is flexible and could be applied to incorporate any alternative predictive time series, such as river discharge, tidal constituents or vertical ground motions, where relevant.
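
    As a rough illustration of the "importance indicator" idea, the following Python sketch approximates posterior inclusion probabilities by BIC-weighted enumeration of all predictor subsets in an ordinary regression on synthetic data. This is a deliberate simplification: the study itself uses structural time series models within a full Bayesian treatment, and all coefficients and data below are hypothetical.

    # Simplified stand-in for the paper's Bayesian importance indicator:
    # BIC-weighted model averaging over all subsets of candidate drivers.
    import itertools
    import numpy as np

    rng = np.random.default_rng(0)
    n = 120
    names = ["MSL", "AO", "NAO", "AMO", "SOI", "Nino1+2", "Nino3.4"]
    X = rng.standard_normal((n, len(names)))                    # synthetic drivers
    y = 1.5 * X[:, 0] + 0.8 * X[:, 1] + rng.standard_normal(n)  # MSL, AO matter

    def bic(cols):
        """BIC of an OLS fit of y on the selected columns (plus intercept)."""
        A = np.column_stack([np.ones(n), X[:, cols]])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        rss = np.sum((y - A @ beta) ** 2)
        return n * np.log(rss / n) + A.shape[1] * np.log(n)

    subsets = [s for k in range(len(names) + 1)
               for s in itertools.combinations(range(len(names)), k)]
    b = np.array([bic(list(s)) for s in subsets])
    w = np.exp(-0.5 * (b - b.min()))
    w /= w.sum()                       # approximate posterior model weights
    for j, name in enumerate(names):   # P(driver j is included in the model)
        p = sum(w[m] for m, s in enumerate(subsets) if j in s)
        print(f"{name:8s}: inclusion probability ~ {p:.2f}")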

    Sensitivity analysis of parametric uncertainties in geotechnical hazard assessments

    Epistemic uncertainty can be reduced through additional laboratory or in situ measurements, or through additional numerical simulations. We focus here on parameter uncertainty, which corresponds to incomplete knowledge of the correct setting of the input parameters (such as the values of soil properties) of the model supporting the geo-hazard assessment. A possible way to manage it is sensitivity analysis, which aims to identify the contribution (i.e. the importance) of the different input parameters to the uncertainty in the final hazard outcome. Advanced techniques exist for this purpose, namely variance-based global sensitivity analysis. Yet their practical implementation faces three major limitations related to the specificities of the geo-hazard domain: 1. the large computational cost (several hours, if not days) of the numerical models; 2. the parameters are complex functions of time and space; 3. the data are often scarce, limited, if not vague. In this PhD thesis, statistical approaches were developed, tested and adapted to overcome these limits. Particular attention was paid to testing the feasibility of these statistical tools by confronting them with real cases (natural hazards related to earthquakes, cavities and landslides).
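
    The variance-based global sensitivity analysis named above can be illustrated with a minimal pick-freeze (Saltelli-type) estimator. The sketch below uses the classical Ishigami test function as a cheap stand-in for an expensive geotechnical model; it shows the general technique, not the specific adaptations developed in the thesis.

    # Sobol' first-order and total-order indices by pick-freeze Monte Carlo.
    import numpy as np

    def ishigami(x, a=7.0, b=0.1):
        """Standard Ishigami test function, a stand-in for a slow simulator."""
        return np.sin(x[:, 0]) + a * np.sin(x[:, 1]) ** 2 \
               + b * x[:, 2] ** 4 * np.sin(x[:, 0])

    rng = np.random.default_rng(1)
    N, d = 100_000, 3
    A = rng.uniform(-np.pi, np.pi, (N, d))   # two independent input samples
    B = rng.uniform(-np.pi, np.pi, (N, d))
    yA, yB = ishigami(A), ishigami(B)
    var = np.var(np.concatenate([yA, yB]))

    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                  # "freeze" all inputs except x_i
        yABi = ishigami(ABi)
        S_first = np.mean(yB * (yABi - yA)) / var        # Saltelli (2010)
        S_total = 0.5 * np.mean((yA - yABi) ** 2) / var  # Jansen (1999)
        print(f"x{i+1}: first-order ~ {S_first:.3f}, total ~ {S_total:.3f}")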

    Sensitivity analysis of rockfall trajectory simulations to material properties

    Many tools have been developed to manage rockfall risk. In particular, many software packages are designed to simulate rockfall trajectories. These packages require the definition of many parameters, especially those describing the mechanical properties of soils (rigidity, roughness, etc.). Choosing appropriate values for these parameters remains a difficult task and depends on expert know-how. Here, we propose a simple method that can be used routinely to evaluate the relative influence of these parameters (about 50 parameters in the examples below) on the simulation results. The objective is (1) to identify the parameters that play a key or predominant role in the simulations and that require additional characterization effort, and (2) to estimate the uncertainty that exists in the simulation results. The application cases for this sensitivity analysis are two busy roads on Réunion Island (France), considering the residual rockfall risk after a major rockfall event.
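
    The abstract does not spell out the screening method, so the sketch below illustrates one common low-cost way of ranking many parameters: a simplified, one-directional variant of Morris elementary-effects screening, applied here to a hypothetical toy function rather than an actual rockfall trajectory simulator.

    # Elementary-effects (Morris-style) screening of many input parameters.
    import numpy as np

    def model(x):                       # hypothetical stand-in simulator
        return 5 * x[0] + 2 * x[1] ** 2 + x[2] * x[3] + 0.1 * x[4:].sum()

    d, r, delta = 10, 50, 0.25          # 10 parameters, 50 random trajectories
    rng = np.random.default_rng(2)
    ee = np.zeros((r, d))
    for t in range(r):
        x = rng.uniform(0, 1 - delta, d)   # keep x + delta inside [0, 1]
        f0 = model(x)
        for i in rng.permutation(d):       # move one factor at a time
            x[i] += delta
            f1 = model(x)
            ee[t, i] = (f1 - f0) / delta   # elementary effect of factor i
            f0 = f1

    mu_star, sigma = np.abs(ee).mean(axis=0), ee.std(axis=0)
    for i in np.argsort(-mu_star):      # rank: high mu* = influential factor
        print(f"param {i}: mu* = {mu_star[i]:.2f}, sigma = {sigma[i]:.2f}")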

    Improving interpretation of sea-level projections through a machine-learning-based local explanation approach

    Process-based projections of the sea-level contribution from land ice components are often obtained from simulations using a complex chain of numerical models. Because of their importance in supporting the decision-making process for coastal risk assessment and adaptation, improving the interpretability of these projections is of great interest. To this end, we adopt the local attribution approach developed in the machine learning community by combining the game-theoretic approach known as 'SHAP' (SHapley Additive exPlanation) with machine-learning regression models. We apply our methodology to a subset of the multi-model ensemble study of the future contribution of the Greenland ice sheet to sea level, taking into account different modelling choices related to (1) the numerical implementation, (2) the initial conditions, and (3) the modelling of ice-sheet processes. This allows us to quantify the influence of particular modelling decisions, expressed directly in terms of the sea-level change contribution. This type of diagnosis can be performed on any member of the ensemble, and we show in the Greenland case how aggregating the local attribution analyses can help guide future model development as well as scientific interpretation, particularly with regard to spatial model resolution and retreat parametrisation.
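
    A minimal sketch of the SHAP-plus-regression-surrogate combination described above, assuming the Python shap and scikit-learn packages are installed; the features standing in for the modelling choices, their names, and the synthetic target are all hypothetical, not the ensemble's actual variables.

    # Local attribution of an ensemble member's output via SHAP on a surrogate.
    import numpy as np
    import shap
    from sklearn.ensemble import GradientBoostingRegressor

    rng = np.random.default_rng(3)
    n = 500
    X = np.column_stack([
        rng.choice([5.0, 10.0, 20.0], n),   # grid resolution (km), hypothetical
        rng.uniform(0, 1, n),               # initialisation choice score
        rng.uniform(0, 1, n),               # retreat-parametrisation strength
    ])
    y = -0.3 * X[:, 0] + 2.0 * X[:, 2] + 0.2 * rng.standard_normal(n)

    surrogate = GradientBoostingRegressor().fit(X, y)   # regression metamodel
    explainer = shap.TreeExplainer(surrogate)
    shap_values = explainer.shap_values(X)              # one row per member

    # Attribution for a single ensemble member, in the target's own units:
    member = 0
    for name, v in zip(["resolution", "init", "retreat"], shap_values[member]):
        print(f"{name:10s} contributes {v:+.2f}")
    print("baseline (expected value):", explainer.expected_value)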

    Partitioning the contributions of dependent offshore forcing conditions in the probabilistic assessment of future coastal flooding

    Gaining deep insight into the role of coastal flooding drivers is of great interest for planning adaptation strategies under future climate conditions. Using global sensitivity analysis, we aim to measure the contributions of the offshore forcing conditions (wave–wind characteristics, still water level and sea level rise (SLR) projected up to 2200) to the occurrence of a flooding event at the town of Gâvres on the French Atlantic coast, in a macrotidal environment. This procedure faces, however, two major difficulties, namely (1) the high computational cost of the hydrodynamic numerical simulations and (2) the statistical dependence between the forcing conditions. By applying a Monte Carlo-based approach combined with multivariate extreme value analysis, our study proposes a procedure that overcomes both difficulties by calculating sensitivity measures dedicated to dependent input variables (named Shapley effects) using Gaussian process (GP) metamodels. On this basis, our results show the increasing influence of SLR over time and a small-to-moderate contribution of wave–wind characteristics, which even becomes negligible in the very long term (beyond 2100). These results are discussed in relation to our modelling choices, in particular the climate change scenario, as well as the uncertainties of the estimation procedure (Monte Carlo sampling and GP error).
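
    A minimal sketch of Shapley effects for dependent inputs, assuming jointly Gaussian forcings (so that conditional sampling is exact) and a cheap linear function standing in for the GP metamodel; the variable names, correlation structure and double-loop sample sizes are illustrative assumptions, not the paper's actual setup.

    # Shapley effects for dependent inputs via nested Monte Carlo.
    import itertools
    import math
    import numpy as np

    rng = np.random.default_rng(4)
    names = ["SLR", "still water level", "wave-wind magnitude"]
    mu = np.zeros(3)
    cov = np.array([[1.0, 0.5, 0.2],        # dependence between forcings
                    [0.5, 1.0, 0.4],
                    [0.2, 0.4, 1.0]])
    beta = np.array([2.0, 1.0, 0.5])

    def model(X):                           # stand-in for the GP metamodel
        return X @ beta

    def cond_sample(xJ, J, n):
        """Sample X_K | X_J = xJ for the jointly Gaussian inputs (K = complement)."""
        K = [i for i in range(3) if i not in J]
        A = cov[np.ix_(K, J)] @ np.linalg.inv(cov[np.ix_(J, J)])
        m = mu[K] + A @ (xJ - mu[J])
        S = cov[np.ix_(K, K)] - A @ cov[np.ix_(J, K)]
        return rng.multivariate_normal(m, S, n)

    def cost(J, n_out=200, n_in=200):
        """c(J) = Var(E[Y | X_J]), estimated by a double Monte Carlo loop."""
        if not J:
            return 0.0
        if len(J) == 3:
            return model(rng.multivariate_normal(mu, cov, 20_000)).var()
        K = [i for i in range(3) if i not in J]
        XJ = rng.multivariate_normal(mu[J], cov[np.ix_(J, J)], n_out)
        g = np.empty(n_out)
        for t in range(n_out):
            full = np.empty((n_in, 3))
            full[:, J] = XJ[t]
            full[:, K] = cond_sample(XJ[t], J, n_in)
            g[t] = model(full).mean()
        return g.var()

    c = {J: cost(list(J)) for k in range(4)
         for J in itertools.combinations(range(3), k)}
    varY = c[(0, 1, 2)]
    for i, name in enumerate(names):
        others = [j for j in range(3) if j != i]
        sh = 0.0
        for k in range(3):                  # Shapley value over all coalitions
            for J in itertools.combinations(others, k):
                w = math.factorial(k) * math.factorial(2 - k) / math.factorial(3)
                sh += w * (c[tuple(sorted(J + (i,)))] - c[J])
        print(f"Shapley effect of {name}: {sh / varY:.2f}")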

    CO2 geological storage safety assessment: methodological developments

    Carbon dioxide capture and geological storage is seen as a promising technology to mitigate atmospheric greenhouse gas emissions. Its wide-scale implementation requires demonstrating its safety for humans and the environment. We have developed a generic approach to provide references for the safety assessment of CO2 storage. It is composed of a series of simple tools for identifying risk scenarios and for modelling risk events and exposure. It incorporates a rigorous management of uncertainty, distinguishing between variability and knowledge incompleteness. We applied this approach to a case study in the Paris Basin, demonstrating how it delivers safety conditions that mix qualitative and quantitative elements. The approach is flexible: it can be used for various sites and with various amounts of data, and it can be carried out in a time-efficient manner at various stages of a project. In particular, it provides an operator or an authority with safety indicators in an early phase or when reviewing a risk assessment. Though not a complete risk assessment workflow, it thus partly compensates for the current lack of commonly acknowledged assessment methods or safety standards for CO2 geological storage.
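
    The distinction between variability and knowledge incompleteness can be illustrated with a two-level ("double loop") Monte Carlo sketch: epistemic parameters are sampled in an outer loop and aleatory variability in an inner loop, so that each exceedance probability integrates only the natural variability. The leakage model, interval bounds and threshold below are hypothetical.

    # Two-level Monte Carlo separating epistemic and aleatory uncertainty.
    import numpy as np

    rng = np.random.default_rng(5)
    n_epistemic, n_aleatory = 50, 5_000
    threshold = 3.0                       # hypothetical safety threshold

    probs = np.empty(n_epistemic)
    for j in range(n_epistemic):
        # Epistemic: poorly known permeability scale, only an interval is known.
        k_scale = rng.uniform(0.5, 2.0)
        # Aleatory: natural variability of the leakage rate given that scale.
        leakage = k_scale * rng.lognormal(mean=0.0, sigma=0.8, size=n_aleatory)
        probs[j] = np.mean(leakage > threshold)

    # The spread across probs reflects epistemic uncertainty; each individual
    # probability already integrates the aleatory variability.
    print(f"P(leakage > {threshold}): min = {probs.min():.4f}, "
          f"median = {np.median(probs):.4f}, max = {probs.max():.4f}")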

    A global experience-sampling method study of well-being during times of crisis : The CoCo project

    We present a global experience-sampling method (ESM) study aimed at describing, predicting, and understanding individual differences in well-being during times of crisis such as the COVID-19 pandemic. This international ESM study is a collaborative effort of over 60 interdisciplinary researchers from around the world in the “Coping with Corona” (CoCo) project. The study comprises trait-, state-, and daily-level data of 7490 participants from over 20 countries (total ESM measurements = 207,263; total daily measurements = 73,295) collected between October 2021 and August 2022. We provide a brief overview of the theoretical background and aims of the study, present the applied methods (including a description of the study design, data collection procedures, data cleaning, and final sample), and discuss exemplary research questions to which these data can be applied. We end by inviting collaborations on the CoCo dataset.

    Short-term forecasting of saltwater occurrence at La Comté River (French Guiana) using a kernel-based support vector machine

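    No abstract is available for this record, so the sketch below only illustrates the technique named in the title: an RBF-kernel support vector machine classifying saltwater occurrence from lagged hydro-meteorological predictors. The features, the toy labelling rule, and the scikit-learn pipeline are assumptions for illustration, not the paper's actual setup.

    # RBF-kernel SVM forecasting a binary saltwater-occurrence indicator.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(6)
    n = 1_000
    discharge = rng.gamma(2.0, 50.0, n)      # lagged river discharge (m3/s)
    tide = rng.uniform(0.5, 3.5, n)          # tidal range (m), hypothetical
    X = np.column_stack([discharge, tide])
    # Toy rule: saltwater tends to occur with low discharge and large tides.
    y = ((tide / (discharge / 100.0 + 1.0)) > 1.8).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                              random_state=0)
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0,
                                              gamma="scale"))
    clf.fit(X_tr, y_tr)
    print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")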