12,076 research outputs found

    Hybrid Historical Simulation VaR and ES: Performance in Developed and Emerging Markets

    Get PDF
    We introduce a new hybrid approach to the joint estimation of Value at Risk (VaR) and Expected Shortfall (ES) for high quantiles of return distributions. We investigate the relative performance of VaR and ES models using daily returns for sixteen stock market indices (eight from developed and eight from emerging markets) before and during the 2008 financial crisis. In addition to widely used VaR and ES models, we also study the behavior of conditional and unconditional extreme value (EV) models in generating 99 percent confidence level estimates, and we develop a new loss function that relates tail losses to ES forecasts. Backtesting results show that only our proposed hybrid and EV-based VaR models provide adequate protection in both developed and emerging markets, but that the hybrid approach does so at a significantly lower cost in capital reserves. In ES estimation the hybrid model yields the smallest error statistics, surpassing even the EV models, especially in the developed markets.
    Keywords: value at risk, expected shortfall, hybrid historical simulation, extreme value theory, bootstrapping
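    As a point of reference for the class of models compared above, the sketch below shows plain (non-hybrid) historical-simulation VaR and ES at the 99 percent level: VaR as the empirical quantile of losses and ES as the mean loss beyond it. It is a minimal illustration, not the paper's hybrid estimator, and the simulated Student-t returns are purely hypothetical.

```python
import numpy as np

def hs_var_es(returns, alpha=0.99):
    """Plain historical-simulation VaR and ES at confidence level alpha.

    Losses are the negative of returns; VaR is the empirical alpha-quantile
    of the losses and ES is the mean loss beyond that quantile.
    """
    losses = -np.asarray(returns)
    var = np.quantile(losses, alpha)
    es = losses[losses >= var].mean()
    return var, es

# Illustrative daily returns from a heavy-tailed distribution (hypothetical data)
rng = np.random.default_rng(0)
returns = 0.01 * rng.standard_t(df=4, size=2500)
var99, es99 = hs_var_es(returns)
print(f"99% VaR: {var99:.4f}  99% ES: {es99:.4f}")
```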

    Global sensitivity analysis for stochastic simulators based on generalized lambda surrogate models

    Full text link
    Global sensitivity analysis aims at quantifying the impact of input variability on the variation of the response of a computational model. It has been widely applied to deterministic simulators, for which a given set of input parameters has a unique corresponding output value. Stochastic simulators, however, have intrinsic randomness due to their use of (pseudo)random numbers, so two runs with the same input parameters but different random streams give different results. Due to this random nature, conventional Sobol' indices, used in global sensitivity analysis, can be extended to stochastic simulators in different ways. In this paper, we discuss three possible extensions and focus on those that depend only on the statistical dependence between input and output. This choice ignores the detailed data-generating process involving the internal randomness, and can thus be applied to a wider class of problems. We propose to use the generalized lambda model to emulate the response distribution of stochastic simulators. Such a surrogate can be constructed without the need for replications. The proposed method is applied to three examples, including two case studies in finance and epidemiology. The results confirm the convergence of the approach for estimating the sensitivity indices even in the presence of strong heteroskedasticity and a small signal-to-noise ratio.
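    For readers unfamiliar with Sobol' indices, the sketch below estimates first-order indices for a toy stochastic simulator with the standard pick-freeze Monte Carlo estimator, treating the simulator's internal noise simply as part of the output. This is not the paper's generalized lambda surrogate approach (which avoids brute-force sampling of this kind); the toy model and sample sizes are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def stochastic_simulator(x1, x2):
    """Toy stochastic simulator: deterministic trend plus internal Gaussian noise."""
    return np.sin(x1) + 0.5 * x2**2 + rng.normal(scale=0.3, size=np.shape(x1))

def first_order_sobol(simulator, n=20000):
    """Pick-freeze estimate of first-order Sobol' indices for two inputs,
    treating the simulator's internal randomness as part of the output."""
    a = rng.uniform(-np.pi, np.pi, size=(n, 2))
    b = rng.uniform(-np.pi, np.pi, size=(n, 2))
    ya = simulator(a[:, 0], a[:, 1])
    indices = []
    for i in range(2):
        ab = b.copy()
        ab[:, i] = a[:, i]  # freeze input i at the values already used for ya
        yab = simulator(ab[:, 0], ab[:, 1])
        indices.append(np.cov(ya, yab)[0, 1] / np.var(ya, ddof=1))
    return indices

print(first_order_sobol(stochastic_simulator))
```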

    Accounting for risk of non linear portfolios: a novel Fourier approach

    Full text link
    The presence of nonlinear instruments is responsible for the emergence of non-Gaussian features in the price changes distribution of realistic portfolios, even for Normally distributed risk factors. This is especially true for the benchmark Delta Gamma Normal model, which in general exhibits exponentially damped power-law tails. We show how knowledge of the model characteristic function leads to Fourier representations for two standard risk measures, the Value at Risk and the Expected Shortfall, and for their sensitivities with respect to the model parameters. We detail the numerical implementation of our formulae and we emphasize the reliability and efficiency of our results in comparison with Monte Carlo simulation.
    Comment: 10 pages, 12 figures. Final version accepted for publication in Eur. Phys. J.
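    The abstract's key ingredient is the characteristic function of the Delta Gamma Normal P&L. The sketch below builds that characteristic function after diagonalisation and recovers the loss distribution (and hence VaR) by Gil-Pelaez Fourier inversion; it is a minimal illustration under hypothetical two-factor parameters, not the paper's formulae or implementation.

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import brentq

# Delta Gamma Normal P&L: dP = delta'Z + 0.5 * Z'Gamma Z with Z ~ N(0, Sigma).
# After diagonalisation, dP = sum_j (b_j W_j + lam_j W_j^2), W_j iid standard normal.
def diagonalise(delta, gamma, sigma):
    c = np.linalg.cholesky(sigma)
    lam, u = np.linalg.eigh(0.5 * c.T @ gamma @ c)
    b = u.T @ c.T @ delta
    return b, lam

def char_fn(t, b, lam):
    """Characteristic function of sum_j (b_j W_j + lam_j W_j^2)."""
    d = 1.0 - 2j * lam * t
    return np.prod(d ** -0.5 * np.exp(-0.5 * (b * t) ** 2 / d))

def cdf(x, b, lam):
    """Gil-Pelaez inversion: F(x) = 1/2 - (1/pi) * int_0^inf Im[e^{-itx} phi(t)] / t dt."""
    integrand = lambda t: np.imag(np.exp(-1j * t * x) * char_fn(t, b, lam)) / t
    val, _ = quad(integrand, 1e-8, 100.0, limit=400)
    return 0.5 - val / np.pi

# Hypothetical two-factor portfolio (illustrative parameters only)
delta = np.array([1.0, -0.5])
gamma = np.array([[0.20, 0.05], [0.05, 0.30]])
sigma = np.array([[0.04, 0.01], [0.01, 0.09]])
b, lam = diagonalise(delta, gamma, sigma)

alpha = 0.99
var99 = -brentq(lambda x: cdf(x, b, lam) - (1 - alpha), -5.0, 5.0)
print(f"99% VaR of the Delta Gamma Normal P&L: {var99:.4f}")
```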

    Bayesian computation via empirical likelihood

    Full text link
    Approximate Bayesian computation (ABC) has become an essential tool for the analysis of complex stochastic models when the likelihood function is numerically unavailable. However, the well-established statistical method of empirical likelihood provides another route to such settings that bypasses simulations from the model and the choices of the ABC parameters (summary statistics, distance, tolerance), while being convergent in the number of observations. Furthermore, bypassing model simulations may lead to significant time savings in complex models, for instance those found in population genetics. The BCel algorithm we develop in this paper also provides an evaluation of its own performance through an associated effective sample size. The method is illustrated using several examples, including estimation of standard distributions, time series, and population genetics models.
    Comment: 21 pages, 12 figures; revised version of the previous version with a new title
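    For contrast with the empirical-likelihood route, the sketch below implements the standard ABC rejection scheme the abstract refers to, for a toy Normal-mean model: draw parameters from the prior, simulate data, and keep the draws whose summary statistic lands within a tolerance of the observed one. The model, prior, summary statistic, and tolerance are illustrative assumptions; this is not the BCel algorithm.

```python
import numpy as np

rng = np.random.default_rng(2)

def abc_rejection(data, n_draws=50000, tol=0.05):
    """Basic ABC rejection sampler for the mean of a Normal(mu, 1) model,
    using the sample mean as summary statistic and absolute distance."""
    s_obs = data.mean()
    mu = rng.normal(0.0, 5.0, size=n_draws)                             # prior draws
    sims = rng.normal(mu, 1.0, size=(len(data), n_draws)).mean(axis=0)  # simulated summaries
    return mu[np.abs(sims - s_obs) < tol]                               # accepted draws

data = rng.normal(1.5, 1.0, size=100)   # hypothetical observed sample
posterior = abc_rejection(data)
print(posterior.mean(), posterior.std())
```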

    HMM based scenario generation for an investment optimisation problem

    Get PDF
    This is the post-print version of the article. The official published version can be accessed from the link below. Copyright @ 2012 Springer-Verlag.
    The Geometric Brownian motion (GBM) is a standard method for modelling financial time series. An important criticism of this method is that the parameters of the GBM are assumed to be constants; due to this fact, important features of the time series, like extreme behaviour or volatility clustering, cannot be captured. We propose an approach by which the parameters of the GBM are able to switch between regimes; more precisely, they are governed by a hidden Markov chain. Thus, we model the financial time series via a hidden Markov model (HMM) with a GBM in each state. Using this approach, we generate scenarios for a financial portfolio optimisation problem in which the portfolio CVaR is minimised. Numerical results are presented.
    This study was funded by NET ACE at OptiRisk Systems.
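    A minimal sketch of the scenario-generation step described above: log-returns are drawn from a regime-switching GBM whose regime follows a Markov chain, and the scenarios are then summarised by an empirical CVaR. The regime parameters and transition matrix are hypothetical placeholders for what a fitted HMM would supply, and the CVaR-minimising portfolio optimisation itself is not shown.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical two-regime parameters (calm vs. turbulent); a fitted HMM
# would normally supply these together with the transition matrix.
mu    = np.array([0.0004, -0.0010])   # daily drift per regime
sigma = np.array([0.008,  0.025])     # daily volatility per regime
trans = np.array([[0.98, 0.02],
                  [0.05, 0.95]])      # regime transition probabilities

def generate_scenarios(n_scenarios=1000, horizon=20):
    """Simulate log-return scenarios from a regime-switching GBM:
    the regime follows a Markov chain, returns are Gaussian within a regime."""
    scenarios = np.empty((n_scenarios, horizon))
    for s in range(n_scenarios):
        state = 0
        for t in range(horizon):
            scenarios[s, t] = rng.normal(mu[state] - 0.5 * sigma[state] ** 2,
                                         sigma[state])
            state = rng.choice(2, p=trans[state])
    return scenarios

def empirical_cvar(losses, alpha=0.95):
    """Empirical CVaR: mean loss in the worst (1 - alpha) tail."""
    var = np.quantile(losses, alpha)
    return losses[losses >= var].mean()

scen = generate_scenarios()
portfolio_loss = -scen.sum(axis=1)   # loss over the horizon for a single asset
print("95% CVaR:", empirical_cvar(portfolio_loss))
```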

    Credit Portfolio Losses

    Get PDF

    The safety case and the lessons learned for the reliability and maintainability case

    Get PDF
    This paper examines the safety case and the lessons learned for the reliability and maintainability case.