
    Should the advanced measurement approach be replaced with the standardized measurement approach for operational risk?

    Recently, the Basel Committee on Banking Supervision proposed to replace all approaches to operational risk capital, including the Advanced Measurement Approach (AMA), with a simple formula referred to as the Standardised Measurement Approach (SMA). This paper discusses and studies the weaknesses and pitfalls of the SMA, such as instability, risk insensitivity, super-additivity, and an implicit relationship between the SMA capital model and systemic risk in the banking sector. We also discuss issues with the closely related Operational risk Capital-at-Risk (OpCar) model proposed by the Basel Committee, which is the precursor to the SMA. In conclusion, we advocate maintaining the AMA internal model framework and suggest, as an alternative, a number of standardization recommendations that could be considered to unify internal modelling of operational risk. The findings and views presented in this paper have been discussed with and supported by many OpRisk practitioners and academics in Australia, Europe, the UK and the USA, and recently at the OpRisk Europe 2016 conference in London.
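    The super-additivity concern can be illustrated with a toy example: any bucketed capital formula whose marginal coefficient grows with a size indicator charges a merged bank more than the sum of the charges on its parts. The bucket thresholds and coefficients below are hypothetical, not the actual SMA calibration:

```python
# Illustrative sketch of super-additivity in a bucketed, piecewise-linear
# capital formula. Buckets and coefficients are made up for illustration;
# they are NOT the Basel Committee's calibration.

def capital(bi):
    """Capital charge with a marginal coefficient that rises with the
    business indicator (BI); thresholds/coefficients are hypothetical."""
    buckets = [(1.0, 0.11), (30.0, 0.15), (float("inf"), 0.19)]
    cap, lower = 0.0, 0.0
    for upper, coef in buckets:
        cap += coef * (min(bi, upper) - lower)
        if bi <= upper:
            break
        lower = upper
    return cap

bank_a, bank_b = 25.0, 25.0                      # BI of two equal banks
merged = capital(bank_a + bank_b)                # one large bank
separate = capital(bank_a) + capital(bank_b)     # two separate banks
print(merged > separate)                         # super-additive
```

    Because the marginal coefficient is non-decreasing in BI, merging two banks never lowers total capital under such a formula, which is the incentive effect the paper criticises.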

    Estimation of Economic Capital Concerning Operational Risk in a Brazilian Banking Industry Case

    The advance of globalization in the international financial market has made banks' portfolio risk more complex. Furthermore, several developments, such as the growth of e-banking and the increase in accounting irregularities, have drawn attention to operational risk. This article presents an analysis of the estimation of economic capital for operational risk in a Brazilian banking industry case, making use of Markov chains, extreme value theory, and peaks-over-threshold modelling. The findings indicate that some existing methods produce consistent results among institutions with similar loss-data characteristics. Moreover, even when well-fitting methods such as EVT-POT are applied, the capital estimates can vary widely and become unrealistic.
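    The peaks-over-threshold (POT) step can be sketched as follows: fit a generalized Pareto distribution (GPD) to loss excesses above a high threshold and read off an extreme severity quantile. The loss data here are synthetic lognormal draws, standing in for the confidential bank data used in the paper:

```python
import numpy as np
from scipy.stats import genpareto

# POT sketch on synthetic data (the paper's Brazilian loss data are not
# reproduced here): threshold at the 90th empirical percentile, GPD fit
# to the excesses, then the standard POT quantile formula.
rng = np.random.default_rng(42)
losses = rng.lognormal(mean=np.log(5e4), sigma=1.8, size=5000)

u = np.quantile(losses, 0.90)            # high threshold
excesses = losses[losses > u] - u
xi, _, beta = genpareto.fit(excesses, floc=0)   # shape xi, scale beta

# POT estimator of the severity quantile at level p:
# q_p = u + (beta/xi) * (((n/n_u) * (1 - p))**(-xi) - 1)
p, n, n_u = 0.999, len(losses), len(excesses)
q = u + (beta / xi) * (((n / n_u) * (1 - p)) ** (-xi) - 1)
print(f"threshold {u:,.0f}, 99.9% severity quantile {q:,.0f}")
```

    The threshold choice is the delicate step in practice; the paper's observation that estimates "can vary widely" is largely driven by this sensitivity.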

    Loss Severity Distribution Estimation Of Operational Risk Using Gaussian Mixture Model For Loss Distribution Approach

    Banks must be able to manage all banking risks, one of which is operational risk. Banks manage operational risk by estimating a capital requirement known as economic capital (EC). The Loss Distribution Approach (LDA) is a popular method for estimating economic capital. This paper proposes a Gaussian Mixture Model (GMM) for the severity distribution estimation step of the LDA. The result of this research is that the EC obtained from the LDA with a GMM severity model is 2%–2.8% smaller than the EC obtained from the LDA with existing distribution models.
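    The LDA with a mixture severity can be sketched with a small Monte Carlo: annual loss is a Poisson-distributed number of severities, each drawn (on the log scale) from a two-component Gaussian mixture, and EC is taken as the 99.9% quantile of annual loss minus its mean. All parameters below are hypothetical, not calibrated to the paper's data:

```python
import numpy as np

# Monte Carlo sketch of the LDA with a Gaussian-mixture severity model.
# Frequency: Poisson(lam). Severity: exp of a 2-component Gaussian
# mixture on the log scale. Parameters are illustrative only.
rng = np.random.default_rng(0)

def annual_losses(n_years=20_000, lam=25):
    counts = rng.poisson(lam, size=n_years)
    totals = np.empty(n_years)
    for i, n in enumerate(counts):
        body = rng.random(n) < 0.8                 # mixture weights 0.8/0.2
        logs = np.where(body,
                        rng.normal(8.0, 1.0, n),   # body component
                        rng.normal(10.5, 1.5, n))  # heavier tail component
        totals[i] = np.exp(logs).sum()
    return totals

S = annual_losses()
ec = np.quantile(S, 0.999) - S.mean()   # EC = 99.9% VaR minus expected loss
print(f"economic capital ~ {ec:,.0f}")
```

    In the paper the mixture parameters are estimated from loss data (e.g. by EM) rather than fixed by hand as they are in this sketch.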

    Kernel alternatives to approximate operational severity distribution: an empirical application

    The estimation of the severity loss distribution is one of the main topics in operational risk estimation. Numerous parametric estimators have been suggested, although very few work well for both high-frequency small losses and low-frequency big losses. In this paper several estimators are explored. The good performance of the double transformation kernel estimator in the context of operational risk severity is worthy of special mention. This method is based on the work of Bolancé and Guillén (2009); it was initially proposed in the context of insurance claim costs, and it represents an advance in operational risk research.
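    The idea behind transformation kernel estimation can be sketched with a single (log) transformation, a simplified stand-in for the paper's double transformation: estimate the density of the transformed losses with an ordinary Gaussian KDE, then map back with the change-of-variables Jacobian:

```python
import numpy as np
from scipy.stats import gaussian_kde

# Simplified transformation-kernel sketch: KDE on log-losses, mapped
# back with the Jacobian 1/x. The paper's double transformation method
# is more refined; this only illustrates the transform-then-smooth idea.
rng = np.random.default_rng(1)
losses = rng.lognormal(mean=9.0, sigma=1.5, size=2000)   # synthetic data

kde = gaussian_kde(np.log(losses))      # smooth on the transformed scale

def severity_density(x):
    """Back-transformed density estimate: f(x) = g(log x) / x."""
    x = np.asarray(x, dtype=float)
    return kde(np.log(x)) / x

grid = np.linspace(1e2, 1e6, 200)
dens = severity_density(grid)
print(f"density evaluated on {dens.size} grid points")
```

    Transforming first avoids the boundary and sparse-tail problems that plague a plain KDE on heavy-tailed, positive loss data.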

    Implementing Loss Distribution Approach for Operational Risk

    To quantify the operational risk capital charge under the current regulatory framework for banking supervision, referred to as Basel II, many banks adopt the Loss Distribution Approach. Many modelling issues must be resolved to use the approach in practice. In this paper we review the quantitative methods suggested in the literature for implementing the approach. In particular, we discuss the use of Bayesian inference, which allows expert judgement and parameter uncertainty to be taken into account, the modelling of dependence, and the inclusion of insurance.
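    One of the implementation issues mentioned, the inclusion of insurance, can be sketched in a compound-loss simulation: each individual loss is reduced by the recovery between a deductible and a cover limit, and the capital charge is compared gross and net of insurance on the same simulated paths. Frequency/severity parameters and policy terms below are illustrative:

```python
import numpy as np

# LDA sketch with per-event insurance: Poisson frequency, lognormal
# severity, recovery = clip(loss - deductible, 0, limit - deductible).
# All parameters are illustrative, not calibrated.
rng = np.random.default_rng(7)

def simulate(years=20_000, lam=20, deductible=5e4, limit=5e5):
    gross = np.empty(years)
    net = np.empty(years)
    for i in range(years):
        n = rng.poisson(lam)
        x = rng.lognormal(mean=10.0, sigma=2.0, size=n)
        recovery = np.clip(x - deductible, 0.0, limit - deductible)
        gross[i] = x.sum()
        net[i] = (x - recovery).sum()
    return gross, net

gross, net = simulate()
var_gross = np.quantile(gross, 0.999)   # capital proxy without insurance
var_net = np.quantile(net, 0.999)       # capital proxy with insurance
print(f"gross 99.9% VaR {var_gross:,.0f}, net {var_net:,.0f}")
```

    Comparing gross and net on the same paths avoids Monte Carlo noise swamping the (often modest) capital relief from insurance.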

    Analysis of operational risk of banks – catastrophe modelling

    Nowadays, due to regulation and internal motivations, financial institutions pay closer attention to their risks. Besides the previously dominant market and credit risk, the new trend is to handle operational risk systematically. Operational risk is the risk of loss resulting from inadequate or failed internal processes, people and systems, or from external events. First we present the basic features of operational risk, its modelling, and the regulatory approaches; then we analyse operational risk in a simulation model framework of our own development. Our approach is based on the analysis of the latent risk process instead of the manifest risk process that is widely used in the risk literature. In our model the latent risk process is a stochastic mean-reverting process, the so-called Ornstein–Uhlenbeck process. In the model framework we define a catastrophe as the breach of a critical barrier by the process. We analyse the distributions of catastrophe frequency, severity, and first time to hit, not only for a single process but for a dual process as well. Based on our first results we could not falsify the Poisson nature of the frequency or the long-tailed nature of the severity. The distribution of the first hitting time requires more sophisticated analysis. At the end of the paper we examine the advantages of simulation-based forecasting, and finally we conclude with possible further research directions.
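    The latent-process setup can be sketched with an Euler–Maruyama simulation of an Ornstein–Uhlenbeck process, counting a "catastrophe" each time the process newly breaches a critical barrier. Parameter values below are illustrative, not those of the paper's model:

```python
import numpy as np

# Euler–Maruyama simulation of a latent Ornstein–Uhlenbeck risk process
# dX = theta*(mu - X)dt + sigma dW, with a catastrophe recorded on each
# new breach of a critical barrier. Parameters are illustrative.
rng = np.random.default_rng(3)

def ou_breaches(theta=1.0, mu=0.0, sigma=0.5, barrier=1.0,
                T=100.0, dt=0.01, x0=0.0):
    n = int(T / dt)
    shocks = sigma * np.sqrt(dt) * rng.standard_normal(n)
    x, hits, above = x0, 0, False
    for z in shocks:
        x += theta * (mu - x) * dt + z      # Euler-Maruyama step
        if x > barrier and not above:
            hits += 1                        # a new breach = one catastrophe
        above = x > barrier
    return hits

print(f"catastrophes over horizon: {ou_breaches()}")
```

    Raising the barrier relative to the stationary standard deviation sigma/sqrt(2*theta) makes catastrophes rarer; repeating the simulation many times yields the frequency distribution whose Poisson character the paper tests.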

    Basel II and Operational Risk: Implications for risk measurement and management in the financial sector

    This paper proposes a methodology to analyze the implications of the Advanced Measurement Approach (AMA) for the assessment of operational risk put forward by the Basel II Accord. The methodology relies on an integrated procedure for the construction of the distribution of aggregate losses, using internal and external loss data. It is illustrated on a 2x2 matrix of two selected business lines and two event types, drawn from a database of 3000 losses obtained from a large European banking institution. For each cell, the method calibrates three truncated distribution functions for the body of internal data, the tail of internal data, and external data. When the dependence structure between aggregate losses and the non-linear adjustment of external data are explicitly taken into account, the regulatory capital computed with the AMA method proves to be substantially lower than with less sophisticated approaches allowed by the Basel II Accord, although the effect is not uniform for all business lines and event types. In a second phase, our models are used to estimate the effects of operational risk management actions on bank profitability, through a measure of RAROC adapted to operational risk. The results suggest that substantial savings can be achieved through active management techniques, although the estimated effect of a reduction of the number, frequency or severity of operational losses crucially depends on the calibration of the aggregate loss distributions.
    Keywords: operational risk management, Basel II, advanced measurement approach, copulae, external data, EVT, RAROC, cost-benefit analysis.
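    The cost-benefit logic of the second phase can be sketched with a back-of-the-envelope RAROC calculation: risk-adjusted return divided by the operational risk economic capital. All figures below are hypothetical, and the formula is a generic RAROC form, not necessarily the paper's exact specification:

```python
# Back-of-the-envelope sketch of a RAROC measure adapted to operational
# risk. The formula is a generic risk-adjusted-return-on-capital form;
# all figures are hypothetical.

def oprisk_raroc(revenue, expenses, expected_op_loss, op_capital):
    """(revenue - expenses - expected operational loss) / economic capital."""
    return (revenue - expenses - expected_op_loss) / op_capital

base = oprisk_raroc(revenue=120.0, expenses=70.0,
                    expected_op_loss=8.0, op_capital=60.0)

# A mitigation action costs 2 per year but halves the expected loss and
# shrinks the capital requirement: RAROC rises, so the action pays off.
managed = oprisk_raroc(revenue=120.0, expenses=72.0,
                       expected_op_loss=4.0, op_capital=50.0)
print(managed > base)
```

    Whether a given action raises RAROC hinges on how much it actually moves the 99.9% quantile of the aggregate loss distribution, which is why the paper stresses the calibration of those distributions.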

    Identifying Rare and Subtle Behaviors: A Weakly Supervised Joint Topic Model
