
    Efficient treatment and quantification of uncertainty in probabilistic seismic hazard and risk analysis

    The main goals of this thesis are the development of a computationally efficient framework for the stochastic treatment of various important uncertainties in probabilistic seismic hazard and risk assessment, its application to a newly created seismic risk model of Indonesia, and the analysis and quantification of the impact of these uncertainties on the distribution of estimated seismic losses for a large number of synthetic portfolios modeled after real-world counterparts. The treatment and quantification of uncertainty in probabilistic seismic hazard and risk analysis has already been identified as an area that could benefit from increased research attention. Furthermore, it has become evident that the lack of research on the development and application of suitable sampling schemes to increase the computational efficiency of the stochastic simulation represents a bottleneck for applications where model runtime is an important factor. In this research study, the development and state of the art of probabilistic seismic hazard and risk analysis are first reviewed and opportunities for improved treatment of uncertainties are identified. A newly developed framework for the stochastic treatment of portfolio location uncertainty as well as ground motion and damage uncertainty is presented. The framework is then optimized with respect to computational efficiency. Amongst other techniques, a novel variance reduction scheme for portfolio location uncertainty is developed. Furthermore, in this thesis, several well-known variance reduction schemes such as Quasi Monte Carlo, Latin Hypercube Sampling, and MISER (locally adaptive recursive stratified sampling) are applied for the first time to seismic hazard and risk assessment. The effectiveness and applicability of all schemes used are analyzed. Several chapters of this monograph describe the theory, implementation, and some exemplary applications of the framework. To conduct these exemplary applications, a seismic hazard model for Indonesia was developed and used for the analysis and quantification of loss uncertainty for a large collection of synthetic portfolios. As part of this work, the new framework was integrated into a probabilistic seismic hazard and risk assessment software suite developed and used by Munich Reinsurance Group. Furthermore, those parts of the framework that deal with location and damage uncertainties are also used by the flood and storm natural catastrophe model development groups at Munich Reinsurance for their risk models.
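
    As an illustration of one of the variance reduction schemes named above, the following minimal Python sketch compares plain Monte Carlo with Latin Hypercube Sampling for estimating an expected portfolio loss; the loss function and the residual dimensionality are hypothetical stand-ins, not the thesis implementation.

```python
# A minimal sketch of Latin Hypercube Sampling (LHS) as a variance
# reduction scheme for Monte Carlo loss estimation. `portfolio_loss`
# is a hypothetical stand-in for a full hazard/vulnerability chain.
import numpy as np
from scipy.stats import norm, qmc

def portfolio_loss(eps):
    """Hypothetical loss given standard-normal ground-motion residuals."""
    return np.maximum(0.0, 1.0 + 0.5 * eps.sum(axis=1))

dim, n = 4, 1024  # residual dimensions, sample size

# Plain Monte Carlo estimate
rng = np.random.default_rng(0)
mc_est = portfolio_loss(rng.standard_normal((n, dim))).mean()

# LHS: stratified uniforms mapped to normals via the inverse CDF
lhs = qmc.LatinHypercube(d=dim, seed=0).random(n)
lhs_est = portfolio_loss(norm.ppf(lhs)).mean()

print(f"MC: {mc_est:.4f}  LHS: {lhs_est:.4f}")
```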

    Robust Estimation of Mahalanobis Distance in Hyperspectral Images

    This dissertation develops new estimation methods that fit Johnson distributions and generalized Pareto distributions to hyperspectral Mahalanobis distances. The Johnson distribution fit is optimized using a new method that monitors the second-derivative behavior of the exceedance probability to mitigate potential outlier effects. This univariate distribution is then used to derive an elliptically contoured multivariate density model for the pixel data. The generalized Pareto distribution models are optimized by a new two-pass method that estimates the tail-index parameter. This method minimizes the mean squared fitting error by correcting parameter values using data distance information from an initial pass. A unique method for estimating the posterior density of the tail-index parameter for generalized Pareto models is also developed. Both the Johnson and Pareto distribution models are shown to reduce fitting error and to increase computational efficiency compared to previous models.
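
    For illustration, the following is a minimal sketch of a peaks-over-threshold generalized Pareto fit to squared Mahalanobis distances, assuming synthetic Gaussian data in place of real pixel spectra; the dissertation's two-pass tail-index correction and Johnson fit are not reproduced here.

```python
# Fit a generalized Pareto distribution (GPD) to the tail of squared
# Mahalanobis distances. Synthetic Gaussian data stand in for pixels.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(1)
X = rng.standard_normal((5000, 10))            # stand-in for pixel spectra

mu = X.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
diff = X - mu
d2 = np.einsum("ij,jk,ik->i", diff, cov_inv, diff)  # squared Mahalanobis distances

u = np.quantile(d2, 0.95)                      # high threshold
exceedances = d2[d2 > u] - u
shape, _, scale = genpareto.fit(exceedances, floc=0)
print(f"GPD tail fit: shape={shape:.3f}, scale={scale:.3f}")
```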

    Spatial seismic hazard variation and adaptive sampling of portfolio location uncertainty in probabilistic seismic risk analysis

    Probabilistic seismic risk analysis is widely used in the insurance industry to model the likelihood and severity of losses to insured portfolios from earthquake events. The available ground motion data, especially for strong and infrequent earthquakes, are often limited to a few decades, resulting in incomplete earthquake catalogues and related uncertainties and assumptions. The situation is further aggravated by the sometimes poor data quality with regard to insured portfolios. For example, due to geocoding issues with address information, risk items are often only known to be located within an administrative geographical zone, while their precise coordinates remain unknown to the modeler. We analyze spatial seismic hazard and loss rate variation inside administrative geographical zones in western Indonesia. We find that the degree of hazard variation differs strongly between zones. The spatial variation in loss rate displays a pattern similar to the variation in hazard, independent of the return period. In a recent work, we introduced a framework for the stochastic treatment of portfolio location uncertainty. This creates the need to simulate ground motion at a large number of sampled geographical coordinates, which typically dominates the computational effort in probabilistic seismic risk analysis. We therefore propose a novel sampling scheme to improve the efficiency of stochastic portfolio location uncertainty treatment. Depending on risk item properties and measures of spatial loss rate variation, the scheme dynamically adapts the location sample size individually for each insured risk item. We analyze the convergence and variance reduction of the scheme empirically. The results show that the scheme can improve the efficiency of the estimation of loss frequency curves and may thereby help to broaden the treatment and communication of uncertainty in probabilistic seismic risk analysis.
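
    The rule below is a hypothetical sketch of the adaptive idea: allocate more location samples to risk items in zones whose loss rate varies strongly in space. The coefficient-of-variation measure and the linear size rule are illustrative assumptions, not the published scheme.

```python
# Adapt the location sample size per risk item to the spatial
# loss-rate variation of its administrative zone (illustrative rule).
import numpy as np

def adaptive_sample_size(zone_loss_rates, n_min=8, n_max=512):
    """More location samples where loss rates vary strongly within a zone."""
    cv = zone_loss_rates.std() / zone_loss_rates.mean()  # coefficient of variation
    frac = min(1.0, cv)                                  # clamp to [0, 1]
    return int(n_min + frac * (n_max - n_min))

rng = np.random.default_rng(2)
flat_zone = rng.uniform(0.9, 1.1, 100)     # nearly uniform loss rate
steep_zone = rng.lognormal(0.0, 1.0, 100)  # strong spatial variation
print(adaptive_sample_size(flat_zone), adaptive_sample_size(steep_zone))
```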

    Quantifying uncertainties on excursion sets under a Gaussian random field prior

    We focus on the problem of estimating and quantifying uncertainties on the excursion set of a function under a limited evaluation budget. We adopt a Bayesian approach in which the objective function is assumed to be a realization of a Gaussian random field. In this setting, the posterior distribution on the objective function gives rise to a posterior distribution on excursion sets. Several approaches exist to summarize the distribution of such sets based on random closed set theory. While the recently proposed Vorob'ev approach exploits analytical formulae, further notions of variability require Monte Carlo estimators relying on conditional simulations of the Gaussian random field. In the present work, we propose a method to choose Monte Carlo simulation points and obtain quasi-realizations of the conditional field at fine designs through affine predictors. The points are chosen optimally in the sense that they minimize the posterior expected distance in measure between the excursion set and its reconstruction. The proposed method reduces the computational cost of Monte Carlo simulations and enables the computation of quasi-realizations on fine designs in large dimensions. We apply this reconstruction approach to obtain realizations of an excursion set on a fine grid, which allow us to define a new measure of uncertainty based on the distance transform of the excursion set. Finally, we present a safety engineering test case in which the simulation method is employed to compute a Monte Carlo estimate of a contour line.
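
    The following sketch illustrates the basic setting on a toy 1-D function, assuming a scikit-learn Gaussian process surrogate: the pointwise posterior coverage function P[f(x) > T] is thresholded at 0.5 (the Vorob'ev median) to obtain a plug-in excursion set estimate. The paper's optimal choice of simulation points and the distance-transform uncertainty measure are not reproduced.

```python
# Posterior excursion probability under a Gaussian process prior,
# thresholded to a plug-in excursion set estimate (toy example).
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

f = lambda x: np.sin(3 * x) + 0.5 * x          # unknown objective (toy)
X_train = np.linspace(0, 2, 6)[:, None]
y_train = f(X_train).ravel()

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5)).fit(X_train, y_train)

X_grid = np.linspace(0, 2, 200)[:, None]
mean, std = gp.predict(X_grid, return_std=True)
std = np.maximum(std, 1e-9)                     # guard against zero variance

T = 0.8                                         # excursion threshold
p_exc = norm.cdf((mean - T) / std)              # posterior coverage function
vorobev_median_set = X_grid[p_exc >= 0.5]       # 0.5-level plug-in set
print(f"estimated excursion measure: {vorobev_median_set.size / X_grid.size:.2f}")
```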

    Flood Events: Extreme Value Problems and Efficient Estimation of Loss

    Widespread flood events have heavy consequences for society and the environment. Gaining insight into the occurrence and impact of these rare flood events is thus of interest to many parties, such as governments, environmental organisations and insurance companies. To assess flood risk, past events are studied and used to fit statistical models from which plausible flood events are simulated over large areas and long periods of time. These simulated extreme events then drive other models, such as models of loss for insurance purposes, to provide insight into the possible impact of future flood events. This thesis addresses problems in the analysis of extreme river flows which cause flooding, and the inefficiency of simulating yearly loss due to flooding. Firstly, many extreme value analyses are conducted in reaction to the occurrence of a large flooding event. This timing of the analysis introduces bias and poor coverage probabilities into the associated risk assessments, subsequently leading to over-designed flood protection schemes. These problems are explored through the study of stochastic stopping criteria, and new likelihood-based inferences are proposed that mitigate against these difficulties. Simulated extreme events are used along with geographical knowledge and property information to simulate losses at each property for each flood event over many years. These simulations are then aggregated to obtain total yearly losses and to estimate return levels of yearly loss. The large number of simulations needed makes this process computationally expensive. A new method is proposed, using novel concentration inequalities, which reduces the number of years that need to be simulated. Finally, modelling extreme flood events is complicated by temporal dependence and by spatial dependencies of river flows between multiple locations, with time lags between locations. The theory of multivariate temporally dependent extremes is explored, with a focus on measures of dependence, and areas of further research are highlighted.
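
    As a minimal sketch of the aggregation step described above, the following simulates yearly flood losses as a compound Poisson process and reads off empirical return levels. The event rate and the lognormal per-event losses are illustrative assumptions, and the thesis's concentration-inequality reduction of the number of simulated years is not shown.

```python
# Simulate yearly aggregate flood losses and estimate return levels
# empirically (compound Poisson sketch with illustrative parameters).
import numpy as np

rng = np.random.default_rng(3)
n_years, rate = 10_000, 2.0                   # simulated years, events/year

n_events = rng.poisson(rate, n_years)         # event count per year
yearly_loss = np.array([rng.lognormal(0.0, 1.5, k).sum() for k in n_events])

# Empirical T-year return level: the (1 - 1/T) quantile of yearly loss
for T in (10, 100, 1000):
    print(f"{T}-year loss: {np.quantile(yearly_loss, 1 - 1/T):.1f}")
```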

    Towards the Efficient Probabilistic Characterization of Tropical Cyclone-Generated Storm Surge Hazards Under Stationary and Nonstationary Conditions

    The scarcity of observations at any single location confounds the probabilistic characterization of tropical cyclone-generated storm surge hazards using annual maxima and peaks-over-threshold methods. In the first part of this work, the relative performance of the empirical simulation technique (EST; Borgman et al., 1992) and the joint probability method (JPM; Myers, 1970) is evaluated via stochastic simulation methods. Both are indirect approaches aimed at estimating the probability distribution of the response variable of interest (i.e. storm surge) using the probability distributions of predictor variables (e.g. storm size, storm intensity). It is shown that the JPM has greater predictive capability for the estimation of the frequency of tropical cyclone winds, an efficient proxy for storm surge. The traditional attractions of the EST have been its economy and ease of implementation; however, more efficient numerical approximation schemes such as Bayesian quadrature now exist, which allow for more cost-effective implementation of the JPM. In addition, typical enhancements of the original EST approach, such as the introduction of synthetic storms to complement the historical sample, are largely ineffective. These observations indicate that the EST should no longer be considered a practical approach for the robust and reliable estimation of the exceedance probabilities of storm surge levels, as required for actuarial purposes, engineering design and flood risk management in tropical cyclone-prone regions. The JPM is, however, not applicable to extratropical storm-prone regions or nonstationary phenomena. Additionally, the JPM requires the evaluation of a multidimensional integral composed of the product of marginal and conditional probability distributions of storm descriptors. This integral is typically approximated as a weighted summation of discrete function evaluations in each dimension, extended to D dimensions by tensor product rules. To adequately capture the dynamics of the underlying physical process (storm surge driven by tropical cyclone wind fields), one must retain a large number of explanatory variables in the integral. The complexity and cost of the joint probability problem, however, increase exponentially with dimension, precluding the inclusion of more than a few (≤4) stochastic variables. In the second part of the work, we extend stochastic simulation approaches to the classical joint probability problem. The successful implementation of stochastic simulation for the storm surge frequency problem requires the introduction of a new paradigm: the use of a regression function constructed by the careful selection of an optimal training set from the storm sample space, such that the number of support nodes required for efficient interpolation grows nonexponentially while preserving the performance of an equivalent product grid. Apart from retaining the predictive capability of the JPM, the stochastic simulation approach also allows nonstationary phenomena, such as the effects of climate change on tropical cyclone activity, to be efficiently modeled. A great utility of the stochastic approach is that the random sampling scheme is readily modified to conduct empirical simulation in place of parametric simulation if required. The enhanced empirical simulation technique attains predictive capability comparable with the JPM and the parametric simulation approach, while retaining the suitability of empirical methods for situations that confound parametric methods, such as application to extratropical cyclones and complexly distributed data. Together, the parametric and empirical simulation techniques will enable seamless flood hazard estimation for the entire coastline of the United States, with simple elaborations where needed to allow for the joint occurrence of tropical and extratropical storms as compound stochastic processes. The stochastic approaches proposed hold great promise for the efficient probabilistic modeling of other multi-parameter systems such as earthquakes and riverine floods.
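
    To make the structure of the JPM integral concrete, the following sketch approximates the annual exceedance rate of a surge level as a tensor-product Gauss-Hermite sum over two standardized storm descriptors. The response function, the model-error scale and the normal descriptor densities are illustrative assumptions, not a calibrated surge model.

```python
# JPM-style exceedance rate: storm rate times the integral of the
# conditional exceedance probability P[surge > eta | x] against the
# joint descriptor density, via a tensor-product Gauss-Hermite rule.
import numpy as np
from scipy.stats import norm

def surge_mean(intensity, size):
    """Hypothetical mean surge response to two standardized descriptors."""
    return 2.0 + 1.2 * intensity - 0.4 * size

nodes, weights = np.polynomial.hermite.hermgauss(9)  # Gauss-Hermite rule
z = np.sqrt(2.0) * nodes                             # standard-normal nodes
w = weights / np.sqrt(np.pi)                         # standard-normal weights

eta, rate, sigma_eps = 3.0, 0.5, 0.3                 # surge level, storms/yr, model error

# Tensor-product sum over the 2-D descriptor grid
p_exc = sum(wi * wj * norm.sf(eta, loc=surge_mean(zi, zj), scale=sigma_eps)
            for zi, wi in zip(z, w) for zj, wj in zip(z, w))
print(f"annual exceedance rate of surge level {eta}: {rate * p_exc:.4f}")
```

    Note how the number of grid evaluations grows as 9^D with the number of descriptors D, which is exactly the exponential cost that the stochastic simulation approach described above is designed to avoid.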