
    Loss Distribution Approach for Operational Risk Capital Modelling under Basel II: Combining Different Data Sources for Risk Estimation

    The management of operational risk in the banking industry has undergone significant changes over the last decade due to substantial changes in the operational risk environment. Globalization, deregulation, the use of complex financial products and changes in information technology have resulted in exposure to new risks very different from market and credit risks. In response, the Basel Committee on Banking Supervision has developed a regulatory framework, referred to as Basel II, that introduced an operational risk category and corresponding capital requirements. Over the past five years, major banks in most parts of the world have received accreditation under the Basel II Advanced Measurement Approach (AMA) by adopting the loss distribution approach (LDA), despite a number of unresolved methodological challenges in its implementation. Different approaches and methods are still under active debate. In this paper, we review methods proposed in the literature for combining different data sources (internal data, external data and scenario analysis), which is one of the regulatory requirements for AMA.
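
    To make the LDA concrete: a minimal sketch of the standard compound-loss simulation, assuming a Poisson frequency and lognormal severity with purely hypothetical parameters. The paper's subject, estimating such parameters by combining internal data, external data and scenario analysis, is not attempted here.

        import numpy as np

        rng = np.random.default_rng(42)

        def simulate_annual_losses(lam=25.0, mu=8.0, sigma=1.6, n_years=100_000):
            """Compound annual loss: Poisson frequency, lognormal severity."""
            counts = rng.poisson(lam, size=n_years)   # loss events per simulated year
            totals = np.empty(n_years)
            for i, n in enumerate(counts):
                totals[i] = rng.lognormal(mu, sigma, size=n).sum()  # aggregate severity
            return totals

        losses = simulate_annual_losses()
        var_999 = np.quantile(losses, 0.999)  # AMA capital is read off at the 99.9% quantile
        print(f"99.9% VaR of annual loss: {var_999:,.0f}")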

    Chain ladder method: Bayesian bootstrap versus classical bootstrap

    The intention of this paper is to estimate a Bayesian distribution-free chain ladder (DFCL) model using approximate Bayesian computation (ABC) methodology. We demonstrate how to estimate quantities of interest in claims reserving and compare the estimates to those obtained from classical and credibility approaches. In this context, a novel numerical procedure utilising Markov chain Monte Carlo (MCMC), ABC and a Bayesian bootstrap procedure is developed in a truly distribution-free setting. The ABC methodology arises because we work in a distribution-free setting in which we make no parametric assumptions, meaning we cannot evaluate the likelihood point-wise or, in this case, simulate directly from the likelihood model. The use of a bootstrap procedure allows us to generate samples from the intractable likelihood without requiring distributional assumptions, which is crucial to the ABC framework. The developed methodology is used to obtain the empirical distribution of the DFCL model parameters and the predictive distribution of the outstanding loss liabilities conditional on the observed claims. We then estimate predictive Bayesian capital estimates, the Value at Risk (VaR) and the mean square error of prediction (MSEP). The latter is compared with the classical bootstrap and credibility methods.
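
    For context, the classical chain ladder against which the Bayesian estimates are benchmarked fits in a few lines: volume-weighted development factors are estimated column by column and used to project the lower half of the run-off triangle. A minimal sketch with a hypothetical cumulative triangle (the paper's ABC/bootstrap machinery is not reproduced):

        import numpy as np

        # Cumulative claims run-off triangle (rows = accident years, cols = development
        # years); NaN marks future cells. All figures are hypothetical.
        tri = np.array([
            [100.0, 150.0, 170.0, 175.0],
            [110.0, 168.0, 190.0, np.nan],
            [120.0, 180.0, np.nan, np.nan],
            [130.0, np.nan, np.nan, np.nan],
        ])

        n = tri.shape[1]
        factors = []
        for j in range(n - 1):
            mask = ~np.isnan(tri[:, j + 1])
            # Volume-weighted development factor f_j = sum_i C_{i,j+1} / sum_i C_{i,j}
            factors.append(tri[mask, j + 1].sum() / tri[mask, j].sum())

        # Project the lower triangle with the chain ladder factors
        proj = tri.copy()
        for j in range(n - 1):
            missing = np.isnan(proj[:, j + 1])
            proj[missing, j + 1] = proj[missing, j] * factors[j]

        # Outstanding liability = projected ultimate minus latest observed diagonal
        latest = np.array([tri[i, (~np.isnan(tri[i])).sum() - 1] for i in range(n)])
        print("development factors:", np.round(factors, 3))
        print("reserves by accident year:", np.round(proj[:, -1] - latest, 1))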

    Is There More to International Diffusion than Culture? An Investigation of the Role of Marketing and Industry Variables

    Companies employ international diffusion models to assess the local market potential and local diffusion speed to support their decision making on market entry. After entering a country, they use the model forecasts to monitor their performance. To this end, empirical applications of international diffusion models aim to link differential diffusion patterns across countries to various exogenous drivers. In the literature, macro- and socioeconomic variables such as population characteristics, culture and economic development have been linked to differential penetration developments across countries. But as companies cannot influence these drivers, the marketing decisions that shape national diffusion patterns are ignored. Is this reasonable? What, then, is the role of marketing instruments in an international diffusion context? We address this issue and compare the influence of these prominent exogenous drivers of international diffusion with that of industry and marketing-mix variables. To account for all of these factors and simultaneously accommodate the influence of varying cross-country interactions, we develop a more flexible yet parsimonious model of international diffusion. Finally, to avoid technical issues in implementing spatially dependent error terms, we introduce the Moran's I test to international diffusion modelling. We demonstrate that the lead-lag effect in conjunction with spatial neighborhood effects captures most of the spatial autocorrelation. Using this combined approach we find that, for cellular phones, industry and marketing-mix variables explain international diffusion patterns better than macro- and socioeconomic drivers.
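
    Moran's I itself is a simple statistic: a spatially weighted correlation of a variable, here model residuals, with itself across neighbouring units. A minimal sketch, assuming a hypothetical binary contiguity matrix for four countries:

        import numpy as np

        def morans_i(x, w):
            """Moran's I spatial autocorrelation statistic.
            x : (n,) values (e.g., diffusion-model residuals per country)
            w : (n, n) spatial weight matrix with zero diagonal
            """
            z = np.asarray(x, dtype=float) - np.mean(x)
            n = len(z)
            num = (w * np.outer(z, z)).sum()   # sum_ij w_ij z_i z_j
            den = (z ** 2).sum()
            return (n / w.sum()) * num / den

        # Hypothetical example: 4 countries, binary contiguity weights
        w = np.array([[0, 1, 1, 0],
                      [1, 0, 0, 1],
                      [1, 0, 0, 1],
                      [0, 1, 1, 0]], dtype=float)
        resid = np.array([0.8, -0.5, 0.6, -0.9])
        # Values well above the null expectation E[I] = -1/(n-1) indicate that
        # similar residuals cluster among neighbouring countries.
        print("Moran's I:", round(morans_i(resid, w), 3))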

    Self-organized Criticality and Absorbing States: Lessons from the Ising Model

    We investigate a suggested path to self-organized criticality. Originally, this path was devised to "generate criticality" in systems displaying an absorbing-state phase transition, but closer examination of the mechanism reveals that it can be used for any continuous phase transition. We use the Ising model as well as the Manna model to demonstrate how the finite-size scaling exponents depend on the tuning of driving and dissipation rates with system size. Our findings limit the explanatory power of the mechanism to non-universal critical behavior. Comment: 5 pages, 2 figures, REVTeX
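
    For reference, the equilibrium Ising model that serves as one test bed can be simulated with standard Metropolis dynamics; the sketch below runs the 2D model at its exact critical temperature. Lattice size and sweep count are arbitrary choices, and the paper's self-organizing driving/dissipation scheme is not reproduced here.

        import numpy as np

        rng = np.random.default_rng(0)

        def metropolis_sweep(spins, beta):
            """One Metropolis sweep of the 2D Ising model, periodic boundaries."""
            L = spins.shape[0]
            for _ in range(L * L):
                i, j = rng.integers(0, L, size=2)
                nn = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
                      + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
                dE = 2.0 * spins[i, j] * nn   # energy cost of flipping spin (i, j)
                if dE <= 0 or rng.random() < np.exp(-beta * dE):
                    spins[i, j] *= -1
            return spins

        L = 32
        beta_c = np.log(1 + np.sqrt(2)) / 2   # exact critical point of the 2D model
        spins = rng.choice([-1, 1], size=(L, L))
        for sweep in range(200):
            metropolis_sweep(spins, beta_c)
        print("magnetisation per spin:", abs(spins.mean()))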

    Electroproduction of Soft Pions at Large Momentum Transfers

    We consider pion electroproduction on a proton target close to threshold for Q^2 in the region 1-10 GeV^2. The momentum transfer dependence of the S-wave multipoles at threshold, E_{0+} and L_{0+}, is calculated using light-cone sum rules. Comment: 8 pages, 3 figures; invited talk at the Workshop on Exclusive Reactions at High Momentum Transfer, 21-24 May 2007, Newport News, Virginia, U.S.A., and the International Conference on Hadron Physics TROIA'07, 30 Aug. - 3 Sept. 2007, Canakkale, Turkey

    A State-Space Estimation of the Lee-Carter Mortality Model and Implications for Annuity Pricing

    In this article we investigate a state-space representation of the Lee-Carter model, which is a benchmark stochastic mortality model for forecasting age-specific death rates. Existing relevant literature focuses mainly on mortality forecasting or the pricing of longevity derivatives, while the full implications and methods of using the state-space representation of the Lee-Carter model in pricing retirement income products are yet to be examined. The main contribution of this article is twofold. First, we provide a rigorous and detailed derivation of the posterior distributions of the parameters and the latent process of the Lee-Carter model via Gibbs sampling. Our assumption for the priors is slightly more general than in the current literature in this area. Moreover, we suggest a new form of identification constraint, not yet utilised in the actuarial literature, that proves to be a more convenient approach for estimating the model under the state-space framework. Second, by exploiting the posterior distribution of the latent process and parameters, we examine the pricing range of annuities, taking into account the stochastic nature of the dynamics of the mortality rates. In this way we aim to capture the impact of longevity risk on the pricing of annuities. The outcome of our study demonstrates that an annuity price can be more than 4% undervalued when different assumptions are made in determining the survival curve constructed from the distribution of the forecasted death rates. Given that a typical annuity portfolio consists of a large number of policies with maturities which span decades, we conclude that the impact of longevity risk on the accurate pricing of annuities is a significant issue to be further researched. In addition, we find that mispricing is increasingly pronounced for older ages as well as for annuity policies having a longer maturity. Comment: 9 pages; conference paper
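
    For reference, the Lee-Carter model and the random-walk-with-drift dynamics commonly assumed for its period effect can be written directly in state-space form; the Gaussian noise terms and the constraints shown are the conventional textbook choices (the article proposes an alternative identification constraint):

        \log m_{x,t} = a_x + b_x \kappa_t + \varepsilon_{x,t}, \qquad \varepsilon_{x,t} \sim \mathcal{N}(0, \sigma_\varepsilon^2)  % observation equation
        \kappa_t = \kappa_{t-1} + \theta + \omega_t, \qquad \omega_t \sim \mathcal{N}(0, \sigma_\omega^2)  % state equation
        \textstyle\sum_x b_x = 1, \qquad \sum_t \kappa_t = 0  % usual identification constraints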

    A unified approach to mortality modelling using a state-space framework: characterisation, identification, estimation and forecasting

    This paper explores and develops alternative statistical representations and estimation approaches for dynamic mortality models. The framework we adopt is to reinterpret popular mortality models, such as the Lee-Carter class of models, in a general state-space modelling methodology, which allows modelling, estimation and forecasting of mortality under a unified framework. Furthermore, we propose an alternative class of model identification constraints which is more suited to statistical inference in filtering and parameter estimation settings based on maximization of the marginalized likelihood, or in Bayesian inference. We then develop a novel class of Bayesian state-space models which incorporate a priori beliefs about the mortality model characteristics as well as more flexible and appropriate assumptions for the heteroscedasticity present in observed mortality data. We show that multiple period and cohort effects can be cast under a state-space structure. To study long term mortality dynamics, we introduce stochastic volatility to the period effect. The estimation of the resulting stochastic volatility model of mortality is performed using a recent class of Monte Carlo procedures specifically designed for state and parameter estimation in Bayesian state-space models, known as the class of particle Markov chain Monte Carlo methods. We illustrate the framework we have developed using Danish male mortality data, and show that incorporating heteroscedasticity and stochastic volatility markedly improves model fit despite the increase in model complexity. Forecasting properties of the enhanced models are examined with long term and short term calibration periods on the reconstruction of life tables. Comment: 46 pages
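
    The particle MCMC machinery rests on the bootstrap particle filter, which returns an unbiased estimate of the marginal likelihood that a Metropolis-Hastings chain over the static parameters can then use (PMMH). A minimal sketch for a toy random-walk-with-drift state-space model, with all data and parameters hypothetical:

        import numpy as np

        rng = np.random.default_rng(1)

        def bootstrap_pf_loglik(y, theta, n_part=500):
            """Bootstrap particle filter log-likelihood estimate for
            x_t = x_{t-1} + drift + N(0, sig_x^2),  y_t = x_t + N(0, sig_y^2)."""
            drift, sig_x, sig_y = theta
            x = rng.normal(0.0, 1.0, n_part)   # initial particle cloud
            loglik = 0.0
            for yt in y:
                x = x + drift + rng.normal(0.0, sig_x, n_part)   # propagate particles
                logw = (-0.5 * np.log(2 * np.pi) - np.log(sig_y)
                        - 0.5 * ((yt - x) / sig_y) ** 2)         # Gaussian obs. density
                m = logw.max()
                w = np.exp(logw - m)
                loglik += m + np.log(w.mean())   # log of average unnormalised weight
                x = x[rng.choice(n_part, n_part, p=w / w.sum())]  # multinomial resampling
            return loglik

        # Hypothetical data: a noisily observed drifting latent period effect
        true_x = np.cumsum(rng.normal(-0.2, 0.1, 50))
        y = true_x + rng.normal(0.0, 0.3, 50)
        print("log-likelihood estimate:", round(bootstrap_pf_loglik(y, (-0.2, 0.1, 0.3)), 2))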

    Why will rat go where rats will not?

    Experimental evidence indicates that regular plurals are nearly always omitted from English compounds (e.g., rats-eater) while irregular plurals may be included within such structures (e.g., mice-chaser). This phenomenon is considered good evidence for the dual-mechanism model of morphological processing (Pinker & Prince, 1992). However, evidence from neural net modelling has shown that a single-route, associative-memory-based account might provide an equally, if not more, valid explanation of the compounding phenomenon.