
    Machine Learning-Based Elastic Cloud Resource Provisioning in the Solvency II Framework

    The Solvency II Directive (Directive 2009/138/EC) is a European directive issued in November 2009 and effective from January 2016, enacted by the European Union to regulate the insurance and reinsurance sector through the discipline of risk management. Solvency II requires European insurance companies to conduct consistent evaluation and continuous monitoring of risks, a process which is computationally complex and extremely resource-intensive. To this end, companies are required to equip themselves with adequate IT infrastructures, facing a significant outlay. In this paper we present the design and development of a machine learning-based approach to transparently deploy the most resource-intensive portion of the Solvency II-related computation on a cloud environment. Our proposal targets DISAR®, a Solvency II-oriented system initially designed to work on a grid of conventional computers. We show how our solution reduces the overall expenses associated with the computation without compromising the privacy of the companies’ data (making it suitable for conventional public cloud environments), while meeting the strict temporal requirements imposed by the Directive. Additionally, the system is organized as a self-optimizing loop that uses information gathered from actual (useful) computations, thus requiring a shorter training phase. We present an experimental study conducted on Amazon EC2 to assess the validity and efficiency of our proposal.
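    The self-optimizing loop described above pairs a learned runtime model with a scaling decision. A minimal sketch of that idea follows; the linear model, function names, and all numbers are illustrative assumptions, not taken from DISAR or the paper:

```python
# Hypothetical sketch: pick a cloud instance count from a learned runtime model.
# The model and all figures below are illustrative, not from the DISAR system.

def fit_linear(xs, ys):
    """Ordinary least-squares fit y ~ a*x + b for 1-D data."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    a = cov / var
    return a, my - a * mx

def instances_needed(model, scenarios, deadline_hours):
    """Predict single-instance runtime, then scale out to meet the deadline."""
    a, b = model
    predicted_hours = a * scenarios + b            # runtime on one instance
    return max(1, int(-(-predicted_hours // deadline_hours)))  # ceiling division

# Past (scenario count, hours on one instance) observations from real runs,
# i.e. the "actual (useful) computations" that feed the loop.
history_x = [1000, 2000, 4000, 8000]
history_y = [1.1, 2.0, 4.2, 8.1]
model = fit_linear(history_x, history_y)
print(instances_needed(model, 16000, 2.0))  # → 9
```

    In the paper's setting the loop would refit the model after every production run, so no separate training workload is needed.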

    Generating VaR scenarios with product beta distributions

    We propose a Monte Carlo simulation method to generate stress tests by VaR scenarios under Solvency II for dependent risks on the basis of observed data. This is of particular interest for the construction of Internal Models and for the requirements on evaluation processes formulated in the Commission Delegated Regulation. The approach builds on earlier work on partition-of-unity copulas, but with a direct scenario estimation of the joint density by product beta distributions after a suitable transformation of the original data.
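    The flavor of such a product-beta scenario generator can be sketched as a beta-kernel sampler on rank-transformed data. This is an illustrative assumption of how dependent scenarios might be drawn, not the paper's estimator; the kernel concentration `m` and the toy data are made up:

```python
import random

def rank_transform(column):
    """Map observations to (0,1) via empirical ranks."""
    n = len(column)
    order = sorted(range(n), key=lambda i: column[i])
    u = [0.0] * n
    for r, i in enumerate(order):
        u[i] = (r + 1) / (n + 1)
    return u

def sample_scenario(u_data, m=20.0):
    """Draw one joint scenario: pick a data point at random, then perturb
    each coordinate with a Beta kernel centred on it (a product of betas),
    so the dependence seen in the data is preserved."""
    i = random.randrange(len(u_data[0]))
    return [random.betavariate(u[i] * m + 1, (1 - u[i]) * m + 1)
            for u in u_data]

# Two dependent risks (toy data), e.g. equity and spread losses.
x = [1.2, 0.7, 2.5, 0.3, 1.9, 0.9]
y = [0.8, 0.5, 2.2, 0.4, 1.6, 1.1]
u_data = [rank_transform(x), rank_transform(y)]
scenarios = [sample_scenario(u_data) for _ in range(1000)]
```

    The generated unit-cube scenarios would then be mapped back through the marginal distributions to obtain stressed risk-factor values.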

    A machine learning approach to portfolio pricing and risk management for high-dimensional problems

    We present a general framework for portfolio risk management in discrete time, based on a replicating martingale. This martingale is learned from a finite sample in a supervised setting. The model learns the features necessary for an effective low-dimensional representation, overcoming the curse of dimensionality common to function approximation in high-dimensional spaces. We show results based on polynomial and neural network bases. Both offer superior results to naive Monte Carlo methods and to existing methods such as least-squares Monte Carlo and replicating portfolios.
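    The regression step underlying least-squares Monte Carlo (one of the benchmarks mentioned) can be sketched as follows: regress simulated terminal payoffs on a polynomial basis in the intermediate state to approximate the conditional value. The dynamics, basis choice, and sample sizes are illustrative assumptions:

```python
import math
import random

def features(s, degree=3):
    """Polynomial basis in the state variable (an illustrative choice)."""
    return [s ** d for d in range(degree + 1)]

def solve(A, b):
    """Gaussian elimination with partial pivoting for small linear systems."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def fit_regression(xs, ys, degree=3):
    """Least-squares fit of E[payoff | state] via the normal equations."""
    Phi = [features(x, degree) for x in xs]
    p = degree + 1
    XtX = [[sum(row[i] * row[j] for row in Phi) for j in range(p)] for i in range(p)]
    Xty = [sum(row[i] * y for row, y in zip(Phi, ys)) for i in range(p)]
    return solve(XtX, Xty)

random.seed(0)
# Simulate the state at t=1 and the terminal payoff of a call (toy lognormal step).
s1 = [math.exp(0.2 * random.gauss(0, 1)) for _ in range(5000)]
payoff = [max(s * math.exp(0.2 * random.gauss(0, 1)) - 1.0, 0.0) for s in s1]
beta = fit_regression(s1, payoff)
value = lambda s: sum(b * s ** d for d, b in enumerate(beta))
```

    The fitted `value` function plays the role of the time-1 portfolio value needed for risk measurement; the paper's replicating-martingale approach generalizes this idea with learned features.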

    Implementing Loss Distribution Approach for Operational Risk

    To quantify the operational risk capital charge under the current regulatory framework for banking supervision, referred to as Basel II, many banks adopt the Loss Distribution Approach. Many modeling issues must be resolved to use the approach in practice. In this paper we review the quantitative methods suggested in the literature for implementing the approach. In particular, we discuss the use of Bayesian inference, which allows expert judgement and parameter uncertainty to be taken into account, the modeling of dependence, and the inclusion of insurance.
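    The core of the Loss Distribution Approach can be illustrated with a minimal compound-loss simulation: Poisson annual frequency, lognormal severities, and a high-quantile VaR of the aggregate. All parameter values below are arbitrary illustrations, not calibrated figures:

```python
import math
import random

def poisson(lam):
    """Knuth's multiplication method, adequate for small lambda."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while p > limit:
        k += 1
        p *= random.random()
    return k - 1

def annual_loss(lam=25.0, mu=10.0, sigma=2.0):
    """One year's aggregate loss: Poisson frequency, lognormal severities."""
    return sum(random.lognormvariate(mu, sigma) for _ in range(poisson(lam)))

random.seed(1)
losses = sorted(annual_loss() for _ in range(20000))
var_999 = losses[int(0.999 * len(losses))]   # 99.9% VaR over one year (Basel II)
expected_loss = sum(losses) / len(losses)
capital_charge = var_999 - expected_loss     # unexpected-loss portion
```

    The modeling issues the paper surveys (parameter uncertainty, dependence between risk cells, insurance recoveries) all enter as refinements of this basic compound model.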

    Test for changes in the modeled solvency capital requirement of an internal risk model

    In the context of the Solvency II directive, the operation of an internal risk model is a possible way of assessing risk and determining the solvency capital requirement of an insurance company in the European Union. A Monte Carlo procedure is customary to generate a model output. To be compliant with the directive, validation of the internal risk model is conducted on the basis of the model output. For this purpose, we suggest a new test for checking whether there is a significant change in the modeled solvency capital requirement. Asymptotic properties of the test statistic are investigated and a bootstrap approximation is justified. A simulation study investigates the performance of the test in the finite-sample case and confirms the theoretical results. The internal risk model and the application of the test are illustrated in a simplified example. The method applies more generally to inference for a broad class of law-invariant, coherent risk measures on the basis of a paired sample.
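    A paired bootstrap check of this general kind can be sketched as follows. The SCR proxy (empirical 99.5% one-year VaR), the sample sizes, and the decision rule are illustrative assumptions, not the paper's test statistic:

```python
import random

def scr(sample, level=0.995):
    """SCR proxy: empirical Value-at-Risk of a loss sample (99.5% under Solvency II)."""
    s = sorted(sample)
    return s[min(int(level * len(s)), len(s) - 1)]

def bootstrap_change_test(old, new, n_boot=500, alpha=0.05):
    """Bootstrap the difference in modeled SCR between two model outputs;
    flag a change if zero lies outside the (1 - alpha) confidence interval."""
    n = len(old)
    observed = scr(new) - scr(old)
    diffs = []
    for _ in range(n_boot):
        idx = [random.randrange(n) for _ in range(n)]
        diffs.append(scr([new[i] for i in idx]) - scr([old[i] for i in idx]))
    diffs.sort()
    lo = diffs[int(alpha / 2 * n_boot)]
    hi = diffs[int((1 - alpha / 2) * n_boot) - 1]
    return observed, not (lo <= 0.0 <= hi)

random.seed(2)
old = [random.gauss(0, 1.0) for _ in range(2000)]   # last year's model output
new = [random.gauss(0, 1.5) for _ in range(2000)]   # inflated volatility this year
diff, changed = bootstrap_change_test(old, new)
```

    Here the volatility increase shifts the 99.5% quantile well outside the bootstrap interval, so the sketch flags a significant change in the modeled SCR.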

    VaR-implied tail-correlation matrices: [Version October 2013]

    Empirical evidence suggests that asset returns correlate more strongly in bear markets than conventional correlation estimates imply. We propose a method for determining complete tail-correlation matrices based on Value-at-Risk (VaR) estimates. We demonstrate how to obtain more efficient tail-correlation estimates by use of overidentification strategies and how to guarantee positive semidefiniteness, a property required for valid risk aggregation and Markowitz-type portfolio optimization. An empirical application to a 30-asset universe illustrates the practical applicability and relevance of the approach in portfolio management.
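    Positive semidefiniteness can be restored in several ways; a simple one (not necessarily the authors' strategy) is to shrink the estimated matrix toward the identity until a Cholesky factorization succeeds, which keeps unit diagonal and hence keeps the result a valid correlation matrix. The toy matrix below is an assumption for illustration:

```python
import math

def cholesky_ok(A):
    """Return True if A admits a Cholesky factorization (positive definite)."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                d = A[i][i] - s
                if d <= 1e-12:
                    return False
                L[i][i] = math.sqrt(d)
            else:
                L[i][j] = (A[i][j] - s) / L[j][j]
    return True

def repair_correlation(C, step=0.01):
    """Shrink toward the identity, C(t) = (1-t)*C + t*I, taking the
    smallest t on a grid that yields a positive definite matrix."""
    n = len(C)
    t = 0.0
    while t <= 1.0:
        A = [[(1 - t) * C[i][j] + (t if i == j else 0.0)
              for j in range(n)] for i in range(n)]
        if cholesky_ok(A):
            return A, t
        t += step
    raise ValueError("could not repair matrix")

# A VaR-implied "tail correlation" matrix that is not PSD (toy example):
# strong pairwise tail correlations that are jointly inconsistent.
C = [[1.0, 0.9, 0.2],
     [0.9, 1.0, 0.9],
     [0.2, 0.9, 1.0]]
fixed, t = repair_correlation(C)
```

    Shrinkage trades some fidelity to the pairwise estimates for validity of the whole matrix; eigenvalue clipping is a common alternative with the same goal.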