
    There is a VaR beyond usual approximations

    Basel II and Solvency II both use Value-at-Risk (VaR) as the risk measure for computing capital requirements. In practice, the VaR is often calibrated with a normal approximation to the unknown distribution of the yearly log-returns of financial assets, a choice usually justified by the Central Limit Theorem (CLT) under the assumption that the portfolio aggregates independent and identically distributed (iid) observations. The 2008/2009 crisis showed that such light-tailed approximations are inadequate in the presence of extreme returns and lead to a gross underestimation of risk. The main objective of our study is to evaluate the distribution of aggregated risks and the associated risk measures as accurately as possible for financial or insurance data with heavy tails, and to provide practical solutions for estimating high quantiles of aggregated risks. We explore a new method, called Normex, which handles this problem both numerically and theoretically, based on properties of upper order statistics. Normex provides accurate results that depend only weakly on the sample size and the tail index. We compare it with existing methods. Comment: 33 pages, 5 figures
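    The contrast the abstract describes is easy to reproduce numerically. The sketch below (not the Normex method itself, which is built on upper order statistics) compares the CLT/normal approximation of a high quantile of an aggregate of heavy-tailed losses with the empirical quantile; the Pareto tail index, aggregation size, and 99.5% level are illustrative assumptions.

    ```python
    # A minimal sketch contrasting the CLT normal approximation of an
    # aggregated heavy-tailed risk with its empirical quantile.
    # Tail index, aggregation size, and confidence level are illustrative.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    alpha, n_agg, n_sim, level = 2.5, 52, 100_000, 0.995

    # Aggregate n_agg iid Pareto(alpha) losses, n_sim times.
    samples = stats.pareto.rvs(alpha, size=(n_sim, n_agg), random_state=rng)
    agg = samples.sum(axis=1)

    # CLT/normal approximation: match the first two moments of the aggregate.
    mu = n_agg * stats.pareto.mean(alpha)
    sigma = np.sqrt(n_agg * stats.pareto.var(alpha))
    var_normal = stats.norm.ppf(level, loc=mu, scale=sigma)

    var_empirical = np.quantile(agg, level)
    print(f"normal-approx VaR: {var_normal:.1f}")
    print(f"empirical VaR:     {var_empirical:.1f}")  # typically larger under heavy tails
    ```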

    Distortion risk measures for sums of dependent losses

    We discuss two distinct approaches for distorting risk measures of sums of dependent random variables, both of which preserve the property of coherence. The first, based on distorted expectations, operates on the survival function of the sum. The second applies the distortion simultaneously to the survival function of the sum and to the dependence structure of the risks, represented by copulas. Our goal is to propose risk measures that take into account the fluctuations of losses and possible correlations between risk components. Comment: Accepted 25 October 2010, Journal Afrika Statistika, Vol. 5, No. 9, 2010, page 260--26
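    As a rough illustration of the first approach, the sketch below computes a distorted expectation by applying a concave distortion g to the empirical survival function of the aggregate loss; the proportional-hazard distortion g(u) = u^0.7 and the Pareto loss sample are illustrative assumptions, not necessarily the paper's choices.

    ```python
    # Distorted expectation rho(S) = integral_0^inf g(P(S > t)) dt,
    # evaluated exactly on the empirical (step) survival function.
    import numpy as np

    def distorted_expectation(losses, g):
        """Distorted expectation of non-negative losses under distortion g."""
        x = np.sort(np.asarray(losses))
        n = len(x)
        widths = np.diff(np.concatenate(([0.0], x)))  # lengths of (x_{i-1}, x_i]
        levels = 1.0 - np.arange(n) / n               # P(S > t) on each interval
        return float(np.sum(g(levels) * widths))

    g_ph = lambda u: u ** 0.7          # proportional-hazard distortion (concave)
    rng = np.random.default_rng(1)
    losses = rng.pareto(3.0, 10_000) + 1.0
    print(distorted_expectation(losses, g_ph))  # loads the tail: >= plain mean
    print(losses.mean())
    ```

    With the identity distortion g(u) = u, the function returns the sample mean; a concave g inflates the weight of tail events, which is what makes the resulting measure risk-loaded.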

    The AEP algorithm for the fast computation of the distribution of the sum of dependent random variables

    We propose a new algorithm to compute numerically the distribution function of the sum of d dependent, non-negative random variables with given joint distribution. Comment: Published in at http://dx.doi.org/10.3150/10-BEJ284 the Bernoulli (http://isi.cbs.nl/bernoulli/) by the International Statistical Institute/Bernoulli Society (http://isi.cbs.nl/BS/bshome.htm)
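    The AEP algorithm itself is a deterministic geometric decomposition of the simplex {x : x_1 + ... + x_d <= s}. As a point of reference, the sketch below implements the plain Monte Carlo estimator that AEP is designed to improve on, for a joint law given here (as an illustrative assumption) by a Gaussian copula with Pareto margins.

    ```python
    # Monte Carlo baseline for P(X_1 + ... + X_d <= s) from a given joint law.
    # Gaussian copula and Pareto margins are illustrative assumptions.
    import numpy as np
    from scipy import stats

    def mc_cdf_of_sum(s, corr, alphas, n_sim=200_000, seed=0):
        """Estimate P(sum X_i <= s) under a Gaussian copula with correlation
        matrix `corr` and Pareto(alpha_i) margins."""
        rng = np.random.default_rng(seed)
        d = len(alphas)
        z = rng.multivariate_normal(np.zeros(d), corr, size=n_sim)
        u = stats.norm.cdf(z)  # copula sample in [0, 1]^d
        x = np.column_stack([stats.pareto.ppf(u[:, i], a)
                             for i, a in enumerate(alphas)])
        return float(np.mean(x.sum(axis=1) <= s))

    corr = np.array([[1.0, 0.5], [0.5, 1.0]])
    print(mc_cdf_of_sum(10.0, corr, alphas=[2.0, 3.0]))
    ```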

    Should the advanced measurement approach be replaced with the standardized measurement approach for operational risk?

    Recently, the Basel Committee on Banking Supervision proposed to replace all approaches for operational risk capital, including the Advanced Measurement Approach (AMA), with a simple formula referred to as the Standardised Measurement Approach (SMA). This paper discusses and studies the weaknesses and pitfalls of the SMA, such as instability, risk insensitivity, super-additivity, and the implicit relationship between the SMA capital model and systemic risk in the banking sector. We also discuss issues with the closely related operational risk Capital-at-Risk (OpCar) model proposed by the Basel Committee, which is the precursor to the SMA. In conclusion, we advocate maintaining the AMA internal model framework and suggest, as an alternative, a number of standardization recommendations that could be considered to unify internal modelling of operational risk. The findings and views presented in this paper have been discussed with, and are supported by, many OpRisk practitioners and academics in Australia, Europe, the UK, and the USA, and were recently presented at the OpRisk Europe 2016 conference in London.
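    A stylised numerical example makes the super-additivity point concrete: under a progressive, indicator-based formula, merging two banks pushes the combined Business Indicator into higher-coefficient buckets, so the merged charge exceeds the sum of the stand-alone charges. The bucket thresholds and coefficients below are illustrative assumptions, not the exact SMA calibration.

    ```python
    # Stylised sketch of super-additivity in a progressive indicator-based
    # capital formula. Thresholds and coefficients are illustrative only.
    def bi_component(bi):
        """Piecewise-linear charge with increasing marginal coefficients (EUR bn)."""
        buckets = [(1.0, 0.12), (30.0, 0.15), (float("inf"), 0.18)]
        charge, lower = 0.0, 0.0
        for upper, coeff in buckets:
            charge += coeff * max(0.0, min(bi, upper) - lower)
            if bi <= upper:
                break
            lower = upper
        return charge

    bank_a, bank_b = 20.0, 25.0  # Business Indicators, EUR bn (hypothetical)
    standalone = bi_component(bank_a) + bi_component(bank_b)
    merged = bi_component(bank_a + bank_b)
    print(f"stand-alone total: {standalone:.3f}")
    print(f"merged:            {merged:.3f}")  # larger: super-additive
    ```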