
    Implementing Loss Distribution Approach for Operational Risk

    To quantify the operational risk capital charge under the current regulatory framework for banking supervision, referred to as Basel II, many banks adopt the Loss Distribution Approach. Many modeling issues must be resolved to use the approach in practice. In this paper we review the quantitative methods suggested in the literature for implementing the approach. In particular, we discuss the use of Bayesian inference, which allows expert judgement and parameter uncertainty to be taken into account, the modeling of dependence, and the inclusion of insurance.
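    The core of the LDA is the distribution of the compound annual loss S = X_1 + ... + X_N, where N is a random loss count and the X_i are random severities; the capital charge is a high quantile of S. Below is a minimal Monte Carlo sketch, not the paper's Bayesian treatment, assuming a Poisson frequency and lognormal severity with invented parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters for a single risk cell (purely illustrative):
# annual loss count ~ Poisson(lam), individual severities ~ lognormal(mu, sigma).
lam, mu, sigma = 25.0, 9.0, 2.0
n_sims = 100_000

# Simulate the compound annual loss S = X_1 + ... + X_N for each year.
counts = rng.poisson(lam, size=n_sims)
annual_losses = np.array(
    [rng.lognormal(mu, sigma, size=n).sum() for n in counts]
)

# Basel II-style capital charge: the 99.9% quantile (VaR) of annual loss.
print(f"99.9% VaR: {np.quantile(annual_losses, 0.999):,.0f}")
```

    A Bayesian implementation, as reviewed in the paper, would place priors on lam, mu, and sigma informed by expert judgement and propagate the posterior uncertainty through the same simulation.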

    Informative Data Projections: A Framework and Two Examples

    Methods for Projection Pursuit aim to facilitate the visual exploration of high-dimensional data by identifying interesting low-dimensional projections. A major challenge is the design of a suitable quality metric for projections, commonly referred to as the projection index, to be maximized by the Projection Pursuit algorithm. In this paper, we introduce a new information-theoretic strategy for tackling this problem, based on quantifying the amount of information the projection conveys to a user given their prior beliefs about the data. The resulting projection index is a subjective quantity, explicitly dependent on the intended user. As a useful illustration, we develop this idea for two particular kinds of prior beliefs. The first kind leads to PCA (Principal Component Analysis), shedding new light on when PCA is (not) appropriate. The second kind leads to a novel projection index, the maximization of which can be regarded as a robust variant of PCA. We show how this projection index, though non-convex, can be effectively maximized using a modified power method as well as a semidefinite programming relaxation. The usefulness of this new projection index is demonstrated in comparative empirical experiments against PCA and a popular Projection Pursuit method.
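    To make the connection to PCA concrete, here is a minimal sketch of plain power iteration, which finds the variance-maximizing projection (the first principal component). The paper's modified power method maximizes its subjective projection index instead; nothing below is taken from the paper itself:

```python
import numpy as np

def power_method(C, n_iter=1000, tol=1e-10, seed=0):
    """Leading eigenvector of a symmetric PSD matrix C by power iteration."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(C.shape[0])
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        w_new = C @ w
        w_new /= np.linalg.norm(w_new)
        if np.linalg.norm(w_new - w) < tol:
            return w_new
        w = w_new
    return w

# Synthetic data: the variance-maximizing projection is the PCA direction.
rng = np.random.default_rng(1)
X = rng.standard_normal((500, 5)) @ np.diag([3.0, 1.0, 1.0, 0.5, 0.5])
X -= X.mean(axis=0)
w = power_method(X.T @ X / len(X))
print("leading projection direction:", np.round(w, 3))
```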

    Detecting spatial patterns with the cumulant function. Part I: The theory

    In climate studies, detecting spatial patterns that deviate strongly from the sample mean remains a statistical challenge. Although a Principal Component Analysis (PCA), or equivalently an Empirical Orthogonal Functions (EOF) decomposition, is often applied for this purpose, it can only provide meaningful results if the underlying multivariate distribution is Gaussian. Indeed, PCA is based on optimizing second-order moment quantities, and the covariance matrix can only capture the full dependence structure for multivariate Gaussian vectors. Whenever the application at hand cannot satisfy this normality hypothesis (e.g. precipitation data), alternatives and/or improvements to PCA have to be developed and studied. To go beyond the second-order constraint that limits the applicability of PCA, we take advantage of the cumulant function, which carries higher-order moment information. This cumulant function, well known in the statistical literature, allows us to propose a new, simple and fast procedure to identify spatial patterns for non-Gaussian data. Our algorithm consists in maximizing the cumulant function. To illustrate our approach, we implement it on three families of multivariate random vectors for which explicit computations are obtained. In addition, we show that our algorithm corresponds to selecting the directions along which projected data display the largest spread over the marginal probability density tails.
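    As a minimal sketch of the idea (with an invented tuning parameter t and a generic optimizer, not the authors' exact procedure), one can numerically maximize the empirical cumulant generating function K(t·w) = log mean exp(t⟨w, x⟩) over unit directions w:

```python
import numpy as np
from scipy.optimize import minimize

def empirical_cumulant(w, X, t=0.5):
    """Empirical cumulant generating function K(t w) = log mean exp(t <w, x>)."""
    z = t * (X @ w)
    # log-sum-exp shift for numerical stability
    return np.log(np.mean(np.exp(z - z.max()))) + z.max()

def max_cumulant_direction(X, t=0.5, seed=0):
    """Unit direction maximizing the empirical cumulant function."""
    rng = np.random.default_rng(seed)
    w0 = rng.standard_normal(X.shape[1])
    w0 /= np.linalg.norm(w0)
    # Maximize K by minimizing its negative, renormalizing w in the objective.
    obj = lambda w: -empirical_cumulant(w / np.linalg.norm(w), X, t)
    res = minimize(obj, w0)
    return res.x / np.linalg.norm(res.x)

# Skewed synthetic field: the first coordinate has the heaviest upper tail,
# so the selected direction should align with it.
rng = np.random.default_rng(2)
X = rng.standard_normal((2000, 4))
X[:, 0] = rng.gamma(shape=2.0, scale=1.5, size=2000) - 3.0  # centered, skewed
print(np.round(max_cumulant_direction(X, t=0.5), 3))
```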

    Among-site variability in the stochastic dynamics of East African coral reefs

    Coral reefs are dynamic systems whose composition is highly influenced by unpredictable biotic and abiotic factors. Understanding the spatial scale at which long-term predictions of reef composition can be made will be crucial for guiding conservation efforts. Using a 22-year time series of benthic composition data from 20 reefs on the Kenyan and Tanzanian coast, we studied the long-term behaviour of Bayesian vector autoregressive state-space models for reef dynamics, incorporating among-site variability. We estimate that if there were no among-site variability, the total long-term variability would be approximately one third of its current value. Thus among-site variability contributes more to long-term variability in reef composition than does temporal variability. Individual sites are more predictable than previously thought, and predictions based on current snapshots are informative about long-term properties. Our approach allowed us to identify a subset of possible climate refugia sites with high conservation value, where the long-term probability of coral cover ≤ 0.1 was very low. Analytical results show that this probability is most strongly influenced by among-site variability and by interactions among benthic components within sites. These findings suggest that conservation initiatives might be successful at the site scale as well as the regional scale.
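    A one-dimensional AR(1) stand-in for the paper's multivariate state-space model illustrates the variance decomposition; all parameters below are invented, and with these choices the among-site component makes total long-term variability roughly three times the purely temporal component, echoing the estimate above:

```python
import numpy as np

rng = np.random.default_rng(3)

n_sites, n_years = 20, 500
phi = 0.7          # within-site temporal autocorrelation (assumed)
sigma_eps = 0.05   # within-site process noise (assumed)
sigma_site = 0.10  # among-site spread of long-run means (assumed)

# Site-specific long-run means: the "among-site variability" component.
site_means = rng.normal(0.3, sigma_site, size=n_sites)

# Stationary AR(1) around each site's mean, started at the mean.
x = np.tile(site_means, (n_years, 1))
for t in range(1, n_years):
    x[t] = (site_means
            + phi * (x[t - 1] - site_means)
            + rng.normal(0.0, sigma_eps, size=n_sites))

within = sigma_eps**2 / (1 - phi**2)    # stationary temporal variance
total = x[n_years // 2:].ravel().var()  # empirical long-term variability
print(f"temporal-only variance: {within:.4f}, total: {total:.4f}")
```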

    Shrinkage Estimation in Multilevel Normal Models

    This review traces the evolution of theory that started when Charles Stein in 1955 [In Proc. 3rd Berkeley Sympos. Math. Statist. Probab. I (1956) 197-206, Univ. California Press] showed that using each separate sample mean from k ≥ 3 Normal populations to estimate its own population mean μ_i can be improved upon uniformly for every possible μ = (μ_1, ..., μ_k)′. The dominating estimators, referred to here as being "Model-I minimax," can be found by shrinking the sample means toward any constant vector. Admissible minimax shrinkage estimators were derived by Stein and others as posterior means based on a random effects model, "Model-II" here, wherein the μ_i values have their own distributions. Section 2 centers on Figure 2, which organizes a wide class of priors on the unknown Level-II hyperparameters that have been proved to yield admissible Model-I minimax shrinkage estimators in the "equal variance case." Putting a flat prior on the Level-II variance is unique in this class for its scale-invariance and for its conjugacy, and it induces Stein's harmonic prior (SHP) on μ_i. Published in Statistical Science (http://dx.doi.org/10.1214/11-STS363) by the Institute of Mathematical Statistics (http://www.imstat.org).
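    The classical Model-I minimax estimator is easy to state: shrink the vector of sample means toward a constant vector by a data-dependent factor. Here is a minimal sketch of the positive-part James-Stein form, shrinking toward the zero vector with known unit variances; these particular choices are illustrative and not taken from the review:

```python
import numpy as np

def james_stein(y, sigma2=1.0, target=None):
    """Positive-part James-Stein estimator shrinking k >= 3 sample means
    toward a constant target vector (default: the zero vector)."""
    y = np.asarray(y, dtype=float)
    k = y.size
    if target is None:
        target = np.zeros(k)
    resid = y - target
    # Data-dependent shrinkage factor; dominates the raw means for k >= 3.
    shrink = max(0.0, 1.0 - (k - 2) * sigma2 / np.sum(resid**2))
    return target + shrink * resid

# Toy example: 10 true means near zero, one unit-variance observation each.
rng = np.random.default_rng(4)
mu = rng.normal(0.0, 0.5, size=10)
y = rng.normal(mu, 1.0)
est = james_stein(y)
print("raw MSE:", np.mean((y - mu) ** 2))
print("JS  MSE:", np.mean((est - mu) ** 2))
```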

    Selection of proposal distributions for generalized importance sampling estimators

    The standard importance sampling (IS) estimator generally does not work well in examples involving simultaneous inference on several targets, as the importance weights can take arbitrarily large values, making the estimator highly unstable. In such situations, alternative generalized IS estimators involving samples from multiple proposal distributions are preferred. Just like standard IS, the success of these multiple IS estimators crucially depends on the choice of the proposal distributions, and their selection is the focus of this article. We propose three methods based on (i) a geometric space-filling coverage criterion, (ii) a minimax variance approach, and (iii) a maximum entropy approach. The first two methods are applicable to any multi-proposal IS estimator, whereas the third is described in the context of Doss's (2010) two-stage IS estimator. For the first method we propose a suitable measure of coverage based on the symmetric Kullback-Leibler divergence, while the second and third approaches use estimates of the asymptotic variances of Doss's (2010) IS estimator and Geyer's (1994) reverse logistic estimator, respectively. To this end, we provide consistent spectral variance estimators for these asymptotic variances. The proposed methods for selecting proposal densities are illustrated using various detailed examples.
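    For intuition, here is a minimal sketch of a simple mixture (balance-heuristic) multi-proposal IS estimator, not Doss's (2010) two-stage estimator or the selection methods of the article; the target, proposals, and tail probabilities are invented for illustration:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Target: standard normal. Estimate P(X > 2) and P(X < -2) simultaneously;
# a single proposal centered on one tail destabilizes the other estimate.
target = stats.norm(0, 1)
proposals = [stats.norm(2.0, 1.0), stats.norm(-2.0, 1.0)]  # assumed choices
n_per = 5000

xs = np.concatenate([q.rvs(n_per, random_state=rng) for q in proposals])
# Mixture (balance-heuristic) weights: f(x) / mean_j q_j(x).
mix_pdf = np.mean([q.pdf(xs) for q in proposals], axis=0)
w = target.pdf(xs) / mix_pdf

for label, h in [("P(X > 2) ~", xs > 2), ("P(X < -2) ~", xs < -2)]:
    print(label, np.mean(w * h))
```

    Because the mixture density appears in the denominator, the weights stay bounded wherever at least one proposal covers the target well, which is exactly the coverage property the article's selection criteria aim to guarantee.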