    Statistical Contributions to Operational Risk Modeling

    In this dissertation, we focus on statistical aspects of operational risk modeling. Specifically, we are interested in understanding the effects of model uncertainty on capital reserves due to data truncation and in developing better model selection tools for truncated and shifted parametric distributions. We first investigate the model uncertainty question, which has remained unanswered for many years because researchers, practitioners, and regulators could not agree on how to treat the data collection threshold in operational risk modeling. Several approaches are under consideration for fitting the loss severity distribution: the empirical approach, the “naive” approach, the shifted approach, and the truncated approach. Since each approach is based on a different set of assumptions, different probability models emerge, and model uncertainty arises. When possible, we investigate such model uncertainty analytically, using asymptotic theorems of mathematical statistics and several parametric distributions commonly used for operational risk modeling; otherwise, we rely on Monte Carlo simulations. The effect of model uncertainty on risk measurements is quantified by evaluating the probability of each approach producing conservative capital allocations based on the value-at-risk measure. These explorations are further illustrated using a real data set of legal losses in a business unit. After clarifying some prevailing misconceptions around the model uncertainty issue in operational risk modeling, we then employ standard model selection tools (the Akaike Information Criterion, AIC, and the Bayesian Information Criterion, BIC) and introduce new ones for truncated and shifted parametric models. We find that the new criteria, which are based on information complexity and the asymptotic mean curvature of the model likelihood, are more effective at distinguishing between the competing models than AIC and BIC.
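
    To make the contrast concrete, here is a minimal sketch (not the dissertation's code) of fitting a lognormal severity under the shifted and truncated approaches and comparing the implied value-at-risk; the threshold, sample, and distribution choice are illustrative assumptions.

```python
# Minimal sketch: shifted vs. truncated treatment of a data collection
# threshold for a lognormal severity model. All parameters are assumed.
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(0)
H = 10_000.0                                    # reporting threshold (assumed)
full = rng.lognormal(mean=9.0, sigma=2.0, size=50_000)
obs = full[full > H]                            # only losses above H are recorded

# Shifted approach: pretend X - H itself follows the parametric model.
shift_mu, shift_sigma = stats.norm.fit(np.log(obs - H))

# Truncated approach: maximize the conditional likelihood of X given X > H.
def neg_loglik(theta):
    mu, sigma = theta
    if sigma <= 0:
        return np.inf
    # lognormal log-density minus log of the tail probability above H
    ll = (stats.norm.logpdf(np.log(obs), mu, sigma) - np.log(obs)
          - stats.norm.logsf((np.log(H) - mu) / sigma))
    return -ll.sum()

res = optimize.minimize(neg_loglik, x0=[8.0, 1.0], method="Nelder-Mead")
trunc_mu, trunc_sigma = res.x

# Compare the 99.9% quantile of a single loss under each fitted model.
q = stats.norm.ppf(0.999)
var_shift = H + np.exp(shift_mu + shift_sigma * q)
var_trunc = np.exp(trunc_mu + trunc_sigma * q)
print(f"shifted VaR ~ {var_shift:,.0f}, truncated VaR ~ {var_trunc:,.0f}")
```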

    Operational risk management and new computational needs in banks

    Basel II banking regulation introduces new needs for computational schemes. They involve both optimal stochastic control and large-scale simulation of decision processes for preventing low-frequency, high-loss-impact events. This paper first states the problem and presents its parameters. It then spells out the equations that represent rational risk management behavior and link the variables together: Levy processes are used to model operational risk losses where calibration from historical loss databases is possible; where it is not, qualitative variables such as the quality of the business environment and internal controls can provide both cost-side and profit-side impacts. Other control variables include the business growth rate and the efficiency of risk mitigation. The economic value of a policy is maximized by solving the resulting Hamilton-Jacobi-Bellman (HJB) type equation. Computational complexity arises from embedded interactions between three levels:
    * programming a globally optimal dynamic expenditure budget in the Basel II context;
    * arbitraging between the cost of risk-reduction policies (as measured by organizational qualitative scorecards and insurance buying) and the impact of the incurred losses themselves, which implies modeling the efficiency of the process through which forward-looking measures of threat minimization can actually reduce stochastic losses;
    * and optimally allocating capital according to profitability across subsidiaries and business lines.
    The paper next reviews the different types of approaches that can be envisaged for deriving a sound budgetary policy solution for operational risk management based on this HJB equation. It is argued that while this complex, high-dimensional problem can be solved with the usual simplifications (a Galerkin approach, imposing Merton-form solutions, a viscosity approach, ad hoc utility functions that provide closed-form solutions, etc.), the main interest of the model lies in exploring the scenarios in an adaptive learning framework (MDP, partially observed MDP, Q-learning, neuro-dynamic programming, greedy algorithms, etc.). This makes more sense from a management point of view, and solutions are more easily communicated to, and accepted by, operational-level staff in banks through the explicit scenarios that can be derived. This kind of approach combines different computational techniques such as POMDPs, stochastic control theory, and learning algorithms under uncertainty and incomplete information. The paper concludes by presenting the benefits of such a consistent computational approach to managing budgets, as opposed to a policy of operational risk management made up of disconnected expenditures. Such consistency satisfies the qualifying criteria for banks applying for the Advanced Measurement Approach (AMA), which allows large economies of regulatory capital charge under the Basel II Accord.
    Keywords: operational risk management, HJB equation, Levy processes, budget optimization, capital allocation
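
    As a rough illustration of the kind of equation involved, the following LaTeX sketch writes down a schematic HJB equation for a single business line with compound-Poisson losses; the notation and simplifications are ours, not the paper's.

```latex
% Schematic only: a single-business-line, compound-Poisson version of the
% HJB equation described above; notation is illustrative, not the paper's.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
Let $X_t$ be capital, $u_t$ the risk-mitigation budget, $\pi(x)$ the profit
rate, and let losses arrive as a compound Poisson process with intensity
$\lambda(u)$ (decreasing in $u$) and loss-size distribution $\nu$. The value
of a budget policy,
\[
  V(x) = \sup_{u}\,
  \mathbb{E}\!\left[\int_0^\infty e^{-\rho t}\bigl(\pi(X_t)-u_t\bigr)\,dt
  \,\middle|\, X_0=x\right],
\]
formally satisfies the Hamilton--Jacobi--Bellman equation
\[
  \rho V(x) = \sup_{u\ge 0}\Bigl\{\pi(x)-u+\mu(x,u)\,V'(x)
  +\lambda(u)\int_0^\infty\bigl[V(x-z)-V(x)\bigr]\,\nu(dz)\Bigr\},
\]
where $\mu(x,u)$ is the drift of capital and the integral term is the jump
(L\'evy) part of the generator: raising $u$ lowers $\lambda(u)$ and hence
the expected cost of future loss jumps.
\end{document}
```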

    Reliability Analysis Approach For Operations Planning Of Hydropower Systems

    Many existing hydropower storage facilities were built decades ago, and components of these aging infrastructure facilities have a higher risk of failure. Insufficient capacity or forced outages of the spillway and other waterway passage facilities during a flooding incident could increase the probability of dam safety incidents, leading to public safety concerns. Approaches currently used to assess risk and uncertainty in operational decision making are based mainly on qualitative assessment and expert judgment; they can be significantly improved by a framework that formally incorporates both qualitative and quantitative reliability analysis methods. Event tree analysis and fault tree analysis have traditionally been used in dam safety risk analysis, with results subject to data adequacy and availability. Our research shows that other methods, such as nonparametric analysis and Monte Carlo simulation techniques, can yield good results as well. This study investigated the application of reliability analysis methods to existing hydropower storage facilities, with the objective of developing a new systems-engineering-based approach for risk and uncertainty analysis to assess and manage the risks of hydropower system operations. Our approach integrates reliability-based methods with hydro system optimization modeling to develop an operational reliability-based modeling framework and to treat risk and uncertainty formally in operations planning. The approach incorporates the different sources of uncertainty typically encountered in operations planning of these systems, including the failure probability of hydro system components such as non-power release structures and turbine facilities. This paper presents the framework we have developed and illustrates its application to a hydropower system facility in British Columbia, Canada.
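
    The fault-tree-plus-Monte-Carlo idea can be illustrated with a minimal sketch (not the study's model): a hypothetical top event in which release capacity is insufficient if a spillway gate fails or both turbines are on forced outage, with all failure probabilities assumed.

```python
# Minimal sketch of Monte Carlo evaluation of a two-gate fault tree.
# Component failure probabilities and the tree structure are assumptions.
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000

p_gate, p_turbine = 0.02, 0.05         # per-flood failure probabilities (assumed)
gate_fail = rng.random(n) < p_gate
t1_fail = rng.random(n) < p_turbine
t2_fail = rng.random(n) < p_turbine

# Top event: insufficient release capacity during the flood.
# OR-gate over the spillway gate and an AND-gate over the two turbines.
top = gate_fail | (t1_fail & t2_fail)
p_hat = top.mean()
se = np.sqrt(p_hat * (1 - p_hat) / n)
print(f"P(top event) ~ {p_hat:.4f} +/- {1.96*se:.4f} (95% CI)")

# Analytic check, assuming independent components.
p_exact = p_gate + p_turbine**2 - p_gate * p_turbine**2
print(f"exact: {p_exact:.4f}")
```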

    Implementing Loss Distribution Approach for Operational Risk

    To quantify the operational risk capital charge under the current regulatory framework for banking supervision, referred to as Basel II, many banks adopt the Loss Distribution Approach. Many modeling issues must be resolved to use the approach in practice. In this paper we review the quantitative methods suggested in the literature for implementing the approach. In particular, we discuss the use of Bayesian inference, which allows expert judgement and parameter uncertainty to be taken into account, the modeling of dependence, and the inclusion of insurance.
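
    A minimal sketch of the basic LDA calculation described here, assuming an illustrative Poisson frequency and lognormal severity: annual losses are aggregated by Monte Carlo and capital is read off as the 99.9% value-at-risk.

```python
# Minimal Loss Distribution Approach sketch: compound-Poisson frequency,
# lognormal severity, capital as 99.9% VaR of annual loss. Parameter
# values are illustrative assumptions, not calibrated to any bank.
import numpy as np

rng = np.random.default_rng(1)
n_years = 100_000
lam = 25.0                  # expected loss events per year (assumed)
mu, sigma = 10.0, 2.0       # lognormal severity parameters (assumed)

counts = rng.poisson(lam, size=n_years)
annual = np.array([rng.lognormal(mu, sigma, size=k).sum() for k in counts])

var_999 = np.quantile(annual, 0.999)
es_999 = annual[annual >= var_999].mean()   # expected shortfall beyond VaR
print(f"99.9% VaR ~ {var_999:,.0f}, ES ~ {es_999:,.0f}")
```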

    The Return of the Rogue

    The “rogue trader”—a famed figure of the 1990s—has recently returned to prominence, due largely to two phenomena. First, recent U.S. mortgage market volatility spilled over into stock, commodity, and derivative markets worldwide, causing large financial institution losses and revealing previously hidden unauthorized positions. Second, the rogue trader has gained importance as banks around the world have focused more attention on operational risk in response to regulatory changes prompted by the Basel II Capital Accord. This Article contends that of the many regulatory options available to the Basel Committee for addressing operational risk, it arguably chose the worst: an enforced self-regulatory regime unlikely to substantially alter financial institutions’ ability to manage operational risk successfully. That regime also poses the danger of high costs, a false sense of security, and perverse incentives. Particularly with respect to the low-frequency, high-impact events—including rogue trading—that may be the greatest threat to bank stability and soundness, attempts at enforced self-regulation are unlikely to significantly reduce operational risk, because the financial institutions with the highest operational risk are the least likely to credibly assess that risk and set aside adequate capital under a regime of enforced self-regulation.

    Consumer finance: challenges for operational research

    Consumer finance has become one of the most important areas of banking, both because of the amount of money being lent and the impact of such credit on the global economy, and because of the realisation that the credit crunch of 2008 was partly due to incorrect modelling of the risks in such lending. This paper reviews the development of credit scoring—the way of assessing risk in consumer finance—and what is meant by a credit score. It then outlines ten challenges for Operational Research to support modelling in consumer finance. Some of these involve developing more robust risk assessment systems, whereas others expand the use of such modelling to address the current objectives of lenders and the new decisions they have to make in consumer finance.
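
    To illustrate what is meant by a credit score, here is a minimal sketch of the standard points-to-double-the-odds scaling of a logistic model's log-odds; the coefficients and scaling constants are made-up examples, not from any real scorecard.

```python
# Minimal sketch: a credit score as rescaled log-odds of good repayment.
# Coefficients and scaling constants below are illustrative assumptions.
import numpy as np

# Fitted logistic-regression coefficients for two borrower attributes
# (hypothetical: [years_at_address, prior_delinquency_flag]).
intercept = 1.2
coef = np.array([0.8, -1.5])

def log_odds_good(x):
    return intercept + coef @ x

# Points-to-double-the-odds (PDO) scaling: 20 points doubles the odds,
# with a score of 600 anchored at 50:1 odds of repayment.
pdo, ref_score, ref_odds = 20.0, 600.0, 50.0
factor = pdo / np.log(2)
offset = ref_score - factor * np.log(ref_odds)

def score(x):
    return offset + factor * log_odds_good(x)

print(score(np.array([5.0, 0.0])))   # stable address, no delinquency -> higher
print(score(np.array([0.5, 1.0])))   # recent mover with delinquency -> lower
```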

    Loss Distribution Approach for Operational Risk Capital Modelling under Basel II: Combining Different Data Sources for Risk Estimation

    The management of operational risk in the banking industry has undergone significant changes over the last decade due to substantial changes in the operational risk environment. Globalization, deregulation, the use of complex financial products, and changes in information technology have resulted in exposure to new risks very different from market and credit risks. In response, the Basel Committee on Banking Supervision has developed a regulatory framework, referred to as Basel II, that introduced an operational risk category and corresponding capital requirements. Over the past five years, major banks in most parts of the world have received accreditation under the Basel II Advanced Measurement Approach (AMA) by adopting the loss distribution approach (LDA), despite a number of unresolved methodological challenges in its implementation. Different approaches and methods are still hotly debated. In this paper, we review methods proposed in the literature for combining different data sources (internal data, external data, and scenario analysis), which is one of the regulatory requirements for the AMA.
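
    One family of combination methods reviewed in this literature is Bayesian: a prior built from external data or scenario analysis is updated with internal losses. A minimal sketch, using a conjugate Poisson-Gamma model for loss frequency with illustrative numbers:

```python
# Minimal sketch: Bayesian combination of scenario judgement (prior) with
# internal loss counts (data) for a Poisson frequency. Numbers are assumed.
import numpy as np

# Expert/scenario prior: roughly 10 losses/year, fairly uncertain.
alpha0, beta0 = 4.0, 0.4            # Gamma(shape, rate): mean 10, sd 5

# Internal data: observed annual loss counts.
counts = np.array([6, 9, 7, 11, 8])
alpha_n = alpha0 + counts.sum()     # conjugate Poisson-Gamma update
beta_n = beta0 + len(counts)

post_mean = alpha_n / beta_n
print(f"posterior mean intensity ~ {post_mean:.2f} losses/year")

# The posterior mean is a credibility-weighted blend of prior and data.
w = len(counts) / beta_n
blend = w * counts.mean() + (1 - w) * (alpha0 / beta0)
print(f"credibility weight on internal data: {w:.2f}, blend: {blend:.2f}")
```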