
    Calculation of aggregate loss distributions

    Estimation of the operational risk capital under the Loss Distribution Approach requires evaluation of aggregate (compound) loss distributions, which is one of the classic problems in risk theory. Closed-form solutions are not available for the distributions typically used in operational risk. However, with modern computer processing power, these distributions can be calculated virtually exactly using numerical methods. This paper reviews numerical algorithms that can be successfully used to calculate aggregate loss distributions. In particular, Monte Carlo, Panjer recursion and Fourier transformation methods are presented and compared. Several closed-form approximations based on moment matching and asymptotic results for heavy-tailed distributions are also reviewed.
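
    A minimal Monte Carlo sketch of the compound-loss calculation described above, assuming a Poisson frequency and lognormal severity; the parameters are illustrative, not taken from the paper.

        # Annual loss Z = X_1 + ... + X_N, with Poisson frequency N and lognormal
        # severities X_i.  Frequency/severity parameters here are assumptions.
        import numpy as np

        rng = np.random.default_rng(seed=1)
        lam, mu, sigma = 5.0, 1.0, 2.0        # Poisson rate, lognormal mu/sigma (assumed)
        n_sims = 100_000

        counts = rng.poisson(lam, size=n_sims)  # number of losses per simulated year
        agg = np.array([rng.lognormal(mu, sigma, size=n).sum() for n in counts])

        var_999 = np.quantile(agg, 0.999)       # 0.999 quantile of the aggregate loss
        print(f"VaR 0.999 ~ {var_999:,.0f}")

    Panjer recursion or FFT-based inversion would give the same distribution without sampling error, at the cost of discretising the severity.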

    Bayesian Model Choice of Grouped t-copula

    One of the most popular copulas for modeling dependence structures is the t-copula. Recently, the grouped t-copula was generalized to allow each group to have one member only, so that a priori grouping is not required and the dependence modeling is more flexible. This paper describes a Markov chain Monte Carlo (MCMC) method under the Bayesian inference framework for estimating and choosing t-copula models. Using historical data of foreign exchange (FX) rates as a case study, we found that Bayesian model choice criteria overwhelmingly favor the generalized t-copula. In addition, all the criteria agree on the second most likely model, and these inferences are consistent with classical likelihood ratio tests. Finally, we demonstrate the impact of model choice on the conditional Value-at-Risk for portfolios of six major FX rates.
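
    A hedged sketch of the last step only: simulating from a standard (single-group) t-copula and reading off portfolio VaR/CVaR. The grouped/generalized t-copula and the MCMC model choice are not reproduced; the correlation, degrees of freedom and margins below are assumptions.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        nu = 5.0                                    # degrees of freedom (assumed)
        corr = np.array([[1.0, 0.6], [0.6, 1.0]])   # FX log-return correlation (assumed)
        n = 100_000

        z = rng.multivariate_normal(np.zeros(2), corr, size=n)
        w = rng.chisquare(nu, size=n) / nu
        t_samples = z / np.sqrt(w)[:, None]         # multivariate t draws
        u = stats.t.cdf(t_samples, df=nu)           # copula (uniform) samples

        # Map to illustrative normal margins for two FX log-returns, equally weighted.
        returns = stats.norm.ppf(u, loc=0.0, scale=0.01)
        port_loss = -returns.mean(axis=1)

        var_99 = np.quantile(port_loss, 0.99)
        cvar_99 = port_loss[port_loss >= var_99].mean()
        print(var_99, cvar_99)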

    Modeling operational risk data reported above a time-varying threshold

    Typically, operational risk losses are reported above a threshold. Fitting data reported above a constant threshold is a well-known and studied problem. However, in practice, the losses are scaled for business and other factors before the fitting, and thus the threshold varies across the scaled data sample. A reporting level may also change when a bank changes its reporting policy. We present both the maximum likelihood and Bayesian Markov chain Monte Carlo approaches to fitting the frequency and severity loss distributions using data in the case of a time-varying threshold. Estimation of the annual loss distribution accounting for parameter uncertainty is also presented.
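
    A hedged sketch of the maximum likelihood side of this: each loss contributes a severity density truncated at its own reporting threshold. A lognormal severity is assumed, and the data and starting values are illustrative only.

        import numpy as np
        from scipy import stats, optimize

        losses = np.array([12.0, 35.0, 150.0, 80.0, 60.0])    # scaled losses (assumed)
        thresholds = np.array([10.0, 10.0, 25.0, 25.0, 25.0])  # per-loss reporting levels

        def neg_loglik(params):
            mu, log_sigma = params
            sigma = np.exp(log_sigma)
            # log f(x_i) - log(1 - F(L_i)): density conditional on exceeding the threshold
            logpdf = stats.lognorm.logpdf(losses, s=sigma, scale=np.exp(mu))
            logsf = stats.lognorm.logsf(thresholds, s=sigma, scale=np.exp(mu))
            return -(logpdf - logsf).sum()

        res = optimize.minimize(neg_loglik, x0=[3.0, 0.0], method="Nelder-Mead")
        mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
        print(mu_hat, sigma_hat)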

    A unified pricing of variable annuity guarantees under the optimal stochastic control framework

    In this paper, we review pricing of variable annuity living and death guarantees offered to retail investors in many countries. Investors purchase these products to take advantage of market growth and to protect savings. We present pricing of these products via an optimal stochastic control framework and review the existing numerical methods. For numerical valuation of these contracts, we develop a direct integration method based on Gauss-Hermite quadrature with a one-dimensional cubic spline for calculation of the expected contract value, and a bi-cubic spline interpolation for applying the jump conditions across the contract cashflow event times. This method is very efficient compared to partial differential equation methods when the transition density (or its moments) of the risky asset underlying the contract is known in closed form between the event times. We also present accurate numerical results for pricing of a Guaranteed Minimum Accumulation Benefit (GMAB) guarantee available on the market that can serve as a benchmark for practitioners and researchers developing pricing of variable annuity guarantees.
    Keywords: variable annuity, guaranteed living and death benefits, guaranteed minimum accumulation benefit, optimal stochastic control, direct integration method
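
    A hedged sketch of the expected-value step only: Gauss-Hermite quadrature of E[V(S_T) | S_t] under a lognormal (GBM) transition density between event times. The payoff, guarantee level and GBM parameters are assumptions; the paper combines this step with cubic-spline interpolation and the jump conditions, which are not shown.

        import numpy as np

        r, sigma, tau = 0.03, 0.2, 1.0       # rate, volatility, time between events (assumed)
        s_t, guarantee = 100.0, 110.0        # current account value and guarantee (assumed)

        def value_at_maturity(s):
            return np.maximum(s, guarantee)  # e.g. a GMAB-style "greater of" payoff

        nodes, weights = np.polynomial.hermite.hermgauss(32)
        z = np.sqrt(2.0) * nodes             # standard normal nodes
        s_T = s_t * np.exp((r - 0.5 * sigma**2) * tau + sigma * np.sqrt(tau) * z)

        expected = (weights @ value_at_maturity(s_T)) / np.sqrt(np.pi)
        price = np.exp(-r * tau) * expected  # discounted expected contract value
        print(price)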

    A Short Tale of Long Tail Integration

    Integration of the form $\int_a^\infty f(x)w(x)\,dx$, where $w(x)$ is either $\sin(\omega x)$ or $\cos(\omega x)$, is widely encountered in many engineering and scientific applications, such as those involving Fourier or Laplace transforms. Often such integrals are approximated by a numerical integration over a finite domain $(a, b)$, leaving a truncation error equal to the tail integration $\int_b^\infty f(x)w(x)\,dx$ in addition to the discretization error. This paper describes a very simple, perhaps the simplest, end-point correction to approximate the tail integration, which significantly reduces the truncation error and thus increases the overall accuracy of the numerical integration, with virtually no extra computational effort. Higher-order correction terms and error estimates for the end-point correction formula are also derived. The effectiveness of this one-point correction formula is demonstrated through several examples.
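
    A hedged illustration of the idea for $w(x) = \sin(\omega x)$: the leading integration-by-parts term $f(b)\cos(\omega b)/\omega$ is added to the finite-domain result as a one-point correction. This shows the principle; the paper's correction formula and error estimates may differ in detail, and the integrand below is illustrative.

        import numpy as np
        from scipy.integrate import quad

        f = lambda x: 1.0 / x**2               # slowly decaying integrand (illustrative)
        a, b, w = 1.0, 50.0, 3.0

        reference, _ = quad(f, a, np.inf, weight="sin", wvar=w)       # "exact" value
        truncated, _ = quad(lambda x: f(x) * np.sin(w * x), a, b, limit=500)
        corrected = truncated + f(b) * np.cos(w * b) / w              # one-point correction

        print(abs(truncated - reference), abs(corrected - reference))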

    Computing Tails of Compound Distributions Using Direct Numerical Integration

    An efficient adaptive direct numerical integration (DNI) algorithm is developed for computing high quantiles and conditional Value-at-Risk (CVaR) of compound distributions using characteristic functions. A key innovation of the numerical scheme is an effective tail integration approximation that reduces the truncation errors significantly with little extra effort. High-precision results for the 0.999 quantile and CVaR were obtained for compound losses with heavy tails and a very wide range of loss frequencies using the DNI, Fast Fourier Transform (FFT) and Monte Carlo (MC) methods. These results, particularly relevant to operational risk modelling, can serve as benchmarks for comparing different numerical methods. We found that the adaptive DNI can achieve high accuracy with relatively coarse grids. It is much faster than MC and competitive with FFT in computing high quantiles and CVaR of compound distributions in the case of moderate to high frequencies and heavy tails.
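
    A hedged sketch of the FFT comparison method mentioned above (not the adaptive DNI scheme itself) for a compound Poisson-lognormal distribution: discretise the severity, transform, compound via exp(lambda*(phi - 1)), invert, and read the 0.999 quantile from the cumulative sum. Grid size and step are assumptions, and no exponential tilting (aliasing control) is applied.

        import numpy as np
        from scipy import stats

        lam, mu, sigma = 5.0, 1.0, 1.0        # frequency and severity parameters (assumed)
        n, h = 2**20, 0.05                    # grid points and discretisation step

        grid = (np.arange(n) + 0.5) * h
        sev_pmf = np.diff(stats.lognorm.cdf(np.arange(n + 1) * h, s=sigma, scale=np.exp(mu)))

        phi = np.fft.fft(sev_pmf)             # discretised severity characteristic function
        agg_pmf = np.real(np.fft.ifft(np.exp(lam * (phi - 1.0))))

        cdf = np.cumsum(agg_pmf)
        q999 = grid[np.searchsorted(cdf, 0.999)]
        print(q999)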

    Holder-extendible European option: corrections and extensions

    Financial contracts with options that allow the holder to extend the contract maturity by paying an additional fixed amount have found many applications in finance. Closed-form solutions for the price of these options have appeared in the literature for the case when the contract underlying asset follows a geometric Brownian motion with constant interest rate, volatility and non-negative "dividend" yield. In this paper, the option price is derived for the case of an underlying asset that follows a geometric Brownian motion with time-dependent drift and volatility, which is important for using the solutions in real-life applications. The formulas are derived for a drift that may include a non-negative or negative "dividend" yield. The latter case results in a new solution type that has not been studied in the literature. Several typographical errors in the formula for the holder-extendible put, typically repeated in textbooks and software, are corrected.
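
    A hedged illustration of the time-dependent ingredient only: for deterministic r(t), q(t) and sigma(t), European prices follow from the constant-parameter formulas with time-averaged inputs. A vanilla Black-Scholes call is shown; the extendible-option formulas themselves (which involve bivariate-normal terms) are not reproduced, and all parameters below are assumptions.

        import numpy as np
        from scipy.integrate import quad
        from scipy.stats import norm

        r = lambda t: 0.02 + 0.01 * t            # time-dependent rate (illustrative)
        q = lambda t: 0.01                       # "dividend" yield, may also be negative
        sig = lambda t: 0.2 + 0.05 * np.sin(t)   # time-dependent volatility (illustrative)

        T = 2.0
        r_eff = quad(r, 0, T)[0] / T
        q_eff = quad(q, 0, T)[0] / T
        var_eff = quad(lambda t: sig(t)**2, 0, T)[0] / T
        sigma_eff = np.sqrt(var_eff)

        S0, K = 100.0, 105.0
        d1 = (np.log(S0 / K) + (r_eff - q_eff + 0.5 * var_eff) * T) / (sigma_eff * np.sqrt(T))
        d2 = d1 - sigma_eff * np.sqrt(T)
        call = S0 * np.exp(-q_eff * T) * norm.cdf(d1) - K * np.exp(-r_eff * T) * norm.cdf(d2)
        print(call)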

    Loss Distribution Approach for Operational Risk Capital Modelling under Basel II: Combining Different Data Sources for Risk Estimation

    The management of operational risk in the banking industry has undergone significant changes over the last decade due to substantial changes in the operational risk environment. Globalization, deregulation, the use of complex financial products and changes in information technology have resulted in exposure to new risks very different from market and credit risks. In response, the Basel Committee on Banking Supervision has developed a regulatory framework, referred to as Basel II, that introduced an operational risk category and corresponding capital requirements. Over the past five years, major banks in most parts of the world have received accreditation under the Basel II Advanced Measurement Approach (AMA) by adopting the loss distribution approach (LDA), despite there being a number of unresolved methodological challenges in its implementation. Different approaches and methods are still hotly debated. In this paper, we review methods proposed in the literature for combining different data sources (internal data, external data and scenario analysis), which is one of the regulatory requirements for the AMA.
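
    A hedged sketch of one combination idea from this literature: a conjugate Bayesian (credibility-style) update in which scenario analysis sets a Gamma prior for the Poisson annual loss frequency and internal data update it. The prior parameters and loss counts below are assumptions, and the paper reviews several other combination methods not shown here.

        import numpy as np

        # Scenario analysis: experts expect about 8 losses per year, encoded as a
        # Gamma(alpha, beta) prior with mean alpha / beta (assumed values).
        alpha_prior, beta_prior = 8.0, 1.0

        internal_counts = np.array([5, 7, 4, 6])  # observed annual loss counts (assumed)

        # Gamma-Poisson conjugacy: posterior is Gamma(alpha + sum(counts), beta + n_years).
        alpha_post = alpha_prior + internal_counts.sum()
        beta_post = beta_prior + len(internal_counts)

        posterior_mean = alpha_post / beta_post   # credibility-weighted frequency estimate
        weight_on_data = len(internal_counts) / beta_post
        print(posterior_mean, weight_on_data)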