
    The asymptotic loss distribution in a fat-tailed factor model of portfolio credit risk

    This paper extends the standard asymptotic results concerning the percentage loss distribution in the Vasicek uniform model to a setup where the systematic risk factor is non-normally distributed. We show that the asymptotic density in this new setup can still be obtained in closed form; in particular, we derive the return distributions, the densities and the quantile functions when the common factor follows two types of normal mixture distributions (a two-population scale mixture and a jump mixture) and the Student's t distribution. Finally, we present a real-data application of the technique to data from the Intesa - San Paolo credit portfolio. The numerical experiments show that the asymptotic loss density is highly flexible and provides the analyst with a VaR that takes into account the event risk incorporated in the fat-tailed distribution of the common factor.
    Keywords: Factor model, asymptotic loss, Value at Risk.
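
    As a rough illustration of the mechanics described above, the sketch below computes the asymptotic (large-portfolio) loss quantile of a one-factor model with a Gaussian versus a Student's t systematic factor. The parameters (PD, correlation, degrees of freedom, confidence level) are arbitrary placeholders, and the default threshold is kept at its Gaussian calibration purely for simplicity; the paper's closed-form derivations and calibration may differ.

```python
# Minimal sketch: asymptotic loss quantile in a one-factor credit model with a
# fat-tailed systematic factor. All parameter values are illustrative, not the
# paper's; the threshold norm.ppf(pd) is kept at its Gaussian calibration.
import numpy as np
from scipy.stats import norm, t


def asymptotic_loss_quantile(alpha, pd, rho, factor_ppf):
    """alpha-quantile of the limiting portfolio loss
    L = Phi((Phi^-1(pd) - sqrt(rho) * Y) / sqrt(1 - rho)).
    Since the loss is decreasing in the factor Y, the alpha-quantile of L
    corresponds to the (1 - alpha)-quantile of Y."""
    y = factor_ppf(1.0 - alpha)
    return norm.cdf((norm.ppf(pd) - np.sqrt(rho) * y) / np.sqrt(1.0 - rho))


# Illustrative parameters: PD, asset correlation, t degrees of freedom, level.
pd_, rho, nu, alpha = 0.02, 0.15, 4, 0.999
print("Gaussian factor :", asymptotic_loss_quantile(alpha, pd_, rho, norm.ppf))
print("Student-t factor:", asymptotic_loss_quantile(alpha, pd_, rho,
                                                    lambda q: t.ppf(q, df=nu)))
```

    With the heavier-tailed factor the same coverage level typically yields a larger loss quantile, which is the event-risk effect the abstract describes.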

    Dynamic VaR models and the Peaks over Threshold method for market risk measurement: an empirical investigation during a financial crisis

    This paper presents a backtesting exercise involving several VaR models for measuring market risk in a dynamic context. The focus is on the comparison of standard dynamic VaR models, ad hoc fat-tailed models and the dynamic Peaks over Threshold (POT) procedure for VaR estimation with different volatility specifications. We introduce three different stochastic processes for the losses: two of them are of the GARCH type and one is of the EWMA type. In order to assess the performance of the models, we implement a backtesting procedure using the log-losses of a diversified sample of 15 financial assets. The backtesting analysis covers the period March 2004 - May 2009, thus including the turmoil period corresponding to the subprime crisis. The results show that the POT approach and a Dynamic Historical Simulation method, both combined with the EWMA volatility specification, are particularly effective at high VaR coverage probabilities and outperform the other models under consideration. Moreover, VaR measures estimated with these models react quickly to the turmoil of the last part of the backtesting period, so they appear to be effective in high-risk periods as well.
    Keywords: Market risk, Extreme Value Theory, Peaks over Threshold, Value at Risk, Fat tails
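
    The following sketch outlines one way to combine an EWMA volatility filter with the Peaks-over-Threshold quantile formula, in the spirit of the dynamic POT procedure discussed above. The smoothing constant, threshold quantile, coverage level and the synthetic loss series are assumptions for illustration, not the paper's specification.

```python
# Sketch: dynamic Peaks-over-Threshold VaR with an EWMA volatility filter.
import numpy as np
from scipy.stats import genpareto


def ewma_volatility(losses, lam=0.94):
    """RiskMetrics-style EWMA conditional standard deviation of the losses."""
    var = np.empty_like(losses, dtype=float)
    var[0] = losses.var()
    for i in range(1, len(losses)):
        var[i] = lam * var[i - 1] + (1.0 - lam) * losses[i - 1] ** 2
    return np.sqrt(var)


def pot_var(losses, coverage=0.99, threshold_q=0.90, lam=0.94):
    """One-step-ahead VaR: fit a GPD to the exceedances of the standardized
    losses over a high threshold, then rescale the unconditional quantile by
    tomorrow's EWMA volatility. Assumes a non-zero fitted GPD shape."""
    sigma = ewma_volatility(losses, lam)
    z = losses / sigma                           # volatility-standardized losses
    u = np.quantile(z, threshold_q)              # POT threshold
    exceedances = z[z > u] - u
    xi, _, beta = genpareto.fit(exceedances, floc=0.0)
    n, n_u = len(z), len(exceedances)
    z_var = u + beta / xi * ((n / n_u * (1.0 - coverage)) ** (-xi) - 1.0)
    sigma_next = np.sqrt(lam * sigma[-1] ** 2 + (1.0 - lam) * losses[-1] ** 2)
    return sigma_next * z_var


rng = np.random.default_rng(0)
log_losses = 0.01 * rng.standard_t(df=5, size=1500)   # synthetic daily log-losses
print(f"99% dynamic POT VaR: {pot_var(log_losses):.4f}")
```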

    A Monte Carlo EM Algorithm for the Estimation of a Logistic Auto-logistic Model with Missing Data

    This paper proposes an algorithm for the estimation of the parameters of a Logistic Auto-logistic Model when some values of the target variable are missing at random but the auxiliary information is known for the same areas. First, we derive a Monte Carlo EM algorithm in the setup of maximum pseudo-likelihood estimation; given the analytical intractability of the conditional expectation of the complete pseudo-likelihood function, we implement the E-step by means of Monte Carlo simulation. Second, we give an example using a simulated dataset. Finally, a comparison with the standard non-missing data case shows that the algorithm gives consistent results.
    Keywords: Spatial Missing Data, Monte Carlo EM Algorithm, Logistic Auto-logistic Model, Pseudo-Likelihood.
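
    A compact sketch of the kind of Monte Carlo EM iteration described above follows: the E-step redraws the missing responses from their full conditionals with a Gibbs sampler, and the M-step maximizes the Monte Carlo average of the complete-data pseudo-log-likelihood. The variable names, the neighbourhood matrix W and the numbers of sweeps and draws are illustrative assumptions, not the paper's implementation.

```python
# Sketch of a Monte Carlo EM loop for a Logistic Auto-logistic (LAM) model
# with responses missing at random. Placeholder inputs: X (covariates),
# W (adjacency matrix), y (binary responses, missing entries pre-filled with 0),
# missing (indices of the missing entries).
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit


def neg_pseudo_loglik(theta, X, W, y):
    """Negative log pseudo-likelihood: product of the full conditionals
    p(y_i = 1 | rest) = expit(X_i beta + gamma * sum_j W_ij y_j)."""
    beta, gamma = theta[:-1], theta[-1]
    eta = X @ beta + gamma * (W @ y)
    return -np.sum(y * eta - np.logaddexp(0.0, eta))


def gibbs_impute(y, missing, X, W, theta, sweeps=20, rng=None):
    """Monte Carlo E-step: redraw each missing y_i from its full conditional."""
    rng = rng or np.random.default_rng()
    beta, gamma = theta[:-1], theta[-1]
    y = y.copy()
    for _ in range(sweeps):
        for i in missing:
            p = expit(X[i] @ beta + gamma * (W[i] @ y))
            y[i] = rng.binomial(1, p)
    return y


def mcem_lam(y, missing, X, W, n_iter=20, n_draws=30, seed=0):
    """Alternate Monte Carlo E-steps and pseudo-likelihood M-steps."""
    rng = np.random.default_rng(seed)
    theta = np.zeros(X.shape[1] + 1)          # (beta, gamma) starting values
    for _ in range(n_iter):
        draws = [gibbs_impute(y, missing, X, W, theta, rng=rng)
                 for _ in range(n_draws)]
        # M-step: maximize the Monte Carlo average of the complete-data
        # pseudo-log-likelihood over the imputed datasets.
        obj = lambda th: np.mean([neg_pseudo_loglik(th, X, W, d) for d in draws])
        theta = minimize(obj, theta, method="BFGS").x
    return theta
```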

    Testing the Profitability of Simple Technical Trading Rules: A Bootstrap Analysis of the Italian Stock Market.

    This paper tests the profitability of simple technical trading rules in the Italian stock market. By means of a recently developed bootstrap methodology, we assess whether technical rules based on moving averages can produce excess returns with respect to the Buy-and-Hold strategy. We find that in most cases the rules are profitable and the excess return is statistically significant. However, the well-known problem of data snooping, which our analysis seems to confirm, calls for some caution in the application of these methods.
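
    To make the testing idea concrete, the sketch below computes the excess mean return of a long-only moving-average rule over Buy-and-Hold and a bootstrap p-value. A plain i.i.d. resampling of returns is used only for illustration and is not the dependence-aware bootstrap methodology the paper relies on; the window length and synthetic returns are assumptions.

```python
# Sketch: bootstrap p-value for the excess return of a moving-average rule
# over Buy-and-Hold (i.i.d. resampling as a simplification).
import numpy as np


def ma_rule_excess_return(returns, window=50):
    """Mean daily return of a long-only moving-average rule (long when the
    price is above its moving average), in excess of Buy-and-Hold."""
    prices = np.cumprod(1.0 + returns)
    ma = np.convolve(prices, np.ones(window) / window, mode="valid")
    signal = (prices[window - 1:-1] > ma[:-1]).astype(float)   # known one day ahead
    strategy = signal * returns[window:]
    return strategy.mean() - returns[window:].mean()


def bootstrap_pvalue(returns, window=50, n_boot=999, seed=0):
    """One-sided bootstrap p-value for the observed excess return."""
    rng = np.random.default_rng(seed)
    observed = ma_rule_excess_return(returns, window)
    boot = np.array([ma_rule_excess_return(
        rng.choice(returns, size=len(returns), replace=True), window)
        for _ in range(n_boot)])
    return observed, np.mean(boot >= observed)


rng = np.random.default_rng(1)
daily_returns = rng.normal(0.0003, 0.012, size=2500)   # synthetic daily returns
excess, pval = bootstrap_pvalue(daily_returns)
print(f"excess return: {excess:.5f}  bootstrap p-value: {pval:.3f}")
```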

    Spatial models for flood risk assessment

    The problem of computing risk measures associated with flood events is extremely important, not only from the point of view of civil protection systems but also because municipalities need to insure against the damages. In this work we propose, in the framework of an integrated strategy, an operating solution that merges, in a conditional approach, the information usually available in this setup. First, we use a Logistic Auto-Logistic (LAM) model to estimate the univariate conditional probabilities of flood events. This approach has two fundamental advantages: it incorporates auxiliary information and does not require the target variables to be independent. Then we simulate the joint distribution of flood events by means of the Gibbs Sampler. Finally, we propose an algorithm to increase ex post the spatial autocorrelation of the simulated events. The methodology is shown to be effective by means of an application to the estimation of the flood probability of Italian hydrographic regions.
    Keywords: Flood Risk, Conditional Approach, LAM Model, Pseudo-Maximum Likelihood Estimation, Spatial Autocorrelation, Gibbs Sampler.
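
    As a sketch of the simulation step only, the code below draws joint configurations of binary flood indicators from fitted LAM full conditionals with a Gibbs sampler, from which joint risk measures can be estimated by Monte Carlo. The fitted coefficients, covariates and adjacency matrix are placeholders, and the ex post autocorrelation adjustment mentioned in the abstract is not reproduced.

```python
# Sketch: Gibbs simulation of the joint distribution of binary flood indicators
# from fitted LAM conditionals. X, W, beta_hat and gamma_hat are placeholders.
import numpy as np
from scipy.special import expit


def gibbs_flood_field(X, W, beta_hat, gamma_hat, n_samples=1000, burn_in=500, seed=0):
    """Draw joint configurations of the flood indicators y from the fitted LAM
    full conditionals p(y_i = 1 | rest) = expit(X_i beta + gamma * W_i y)."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    y = rng.binomial(1, 0.5, size=n)          # arbitrary starting configuration
    draws = []
    for it in range(burn_in + n_samples):
        for i in range(n):
            p = expit(X[i] @ beta_hat + gamma_hat * (W[i] @ y))
            y[i] = rng.binomial(1, p)
        if it >= burn_in:
            draws.append(y.copy())
    return np.array(draws)                    # one joint flood scenario per row


# Example summary (X, W, beta_hat, gamma_hat and k are hypothetical inputs):
# draws = gibbs_flood_field(X, W, beta_hat, gamma_hat)
# p_joint = np.mean(draws.sum(axis=1) >= k)   # P(at least k regions flood together)
```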

    Unsupervised Mixture Estimation via Approximate Maximum Likelihood based on the Cramér-von Mises distance

    Mixture distributions with dynamic weights are an efficient way of modeling loss data characterized by heavy tails. However, maximum likelihood estimation of this family of models is difficult, mostly because of the need to evaluate numerically an intractable normalizing constant. In such a setup, simulation-based estimation methods are an appealing alternative. The approximate maximum likelihood estimation (AMLE) approach is employed. It is a general method that can be applied to mixtures with any component densities, as long as simulation is feasible. The focus is on the dynamic lognormal-generalized Pareto distribution, and the Cramér-von Mises distance is used to measure the discrepancy between observed and simulated samples. After deriving the theoretical properties of the estimators, a hybrid procedure is developed, where standard maximum likelihood is first employed to determine the bounds of the uniform priors required as input for AMLE. Simulation experiments and two real-data applications suggest that this approach yields a major improvement with respect to standard maximum likelihood estimation.
    Comment: 31 pages, 7 figures, 14 tables
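
    The sketch below illustrates the AMLE idea in its simplest (ABC-like) form: draw parameter vectors from uniform priors, simulate a sample for each draw, and average the draws whose simulated samples are closest to the data in the two-sample Cramér-von Mises distance. For brevity the simulated model is a static lognormal / generalized Pareto mixture rather than the paper's dynamic-weight mixture, and the prior bounds and retention fraction are assumptions.

```python
# Sketch: approximate maximum likelihood via simulation and the two-sample
# Cramer-von Mises distance (static mixture used as a stand-in model).
import numpy as np
from scipy.stats import lognorm, genpareto, cramervonmises_2samp


def simulate_mixture(theta, n, rng):
    """Draw n losses from a static mixture: w * lognormal + (1 - w) * GPD."""
    w, mu, sigma, xi, beta = theta
    body = lognorm.rvs(s=sigma, scale=np.exp(mu), size=n, random_state=rng)
    tail = genpareto.rvs(c=xi, scale=beta, size=n, random_state=rng)
    return np.where(rng.random(n) < w, body, tail)


def amle(data, prior_lo, prior_hi, n_draws=5000, keep_frac=0.01, seed=0):
    """Draw parameters from uniform priors, simulate a sample per draw, and
    average the draws whose samples are closest to the observed data."""
    rng = np.random.default_rng(seed)
    thetas = rng.uniform(prior_lo, prior_hi, size=(n_draws, len(prior_lo)))
    dist = np.array([
        cramervonmises_2samp(data, simulate_mixture(th, len(data), rng)).statistic
        for th in thetas])
    keep = dist.argsort()[: max(1, int(keep_frac * n_draws))]
    return thetas[keep].mean(axis=0)      # AMLE point estimate


# Example call with hypothetical prior bounds (in the paper's hybrid procedure
# these bounds would come from a preliminary standard maximum likelihood fit):
# theta_hat = amle(losses, prior_lo=[0.5, -1.0, 0.1, 0.01, 0.5],
#                  prior_hi=[0.99, 2.0, 1.5, 1.0, 5.0])
```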

    A Maximum Entropy Approach to Loss Distribution Analysis

    In this paper we propose an approach to the estimation and simulation of loss distributions based on Maximum Entropy (ME), a non-parametric technique that maximizes the Shannon entropy of the data under moment constraints. Special cases of the ME density correspond to standard distributions.
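
    A minimal sketch of the ME construction follows: the density maximizing the Shannon entropy subject to the first few moment constraints has the exponential-family form f(x) proportional to exp(sum_j lambda_j x^j), and the multipliers can be found by minimizing the convex dual on a grid over the data range. The number of moments, the grid and the internal rescaling to [0, 1] are illustrative choices, not the paper's exact procedure.

```python
# Sketch: Maximum Entropy density under sample moment constraints, obtained by
# minimizing the convex dual of the entropy problem on a grid.
import numpy as np
from scipy.optimize import minimize
from scipy.integrate import trapezoid


def fit_maxent_density(data, n_moments=4, grid_size=2000):
    """Return a grid over the data range and the ME density whose first
    n_moments moments match the sample moments (data rescaled to [0, 1]
    internally for numerical stability)."""
    lo, span = data.min(), data.max() - data.min()
    u_data = (data - lo) / span
    u = np.linspace(0.0, 1.0, grid_size)
    powers = np.vstack([u**j for j in range(1, n_moments + 1)])
    sample_moments = np.array([np.mean(u_data**j) for j in range(1, n_moments + 1)])

    def dual(lam):
        # Convex dual: log-partition function minus the linear moment term.
        log_unnorm = lam @ powers
        shift = log_unnorm.max()
        return shift + np.log(trapezoid(np.exp(log_unnorm - shift), u)) \
            - lam @ sample_moments

    lam = minimize(dual, np.zeros(n_moments), method="Nelder-Mead").x
    dens_u = np.exp(lam @ powers - (lam @ powers).max())
    dens_u /= trapezoid(dens_u, u)
    return lo + span * u, dens_u / span    # grid and density on the original scale


losses = np.random.default_rng(0).lognormal(mean=0.0, sigma=0.6, size=2000)
x_grid, f_hat = fit_maxent_density(losses)   # synthetic data for illustration
```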

    A note on maximum likelihood estimation of a Pareto mixture

    In this paper we study Maximum Likelihood Estimation of the parameters of a Pareto mixture. Applying standard techniques to a mixture of Pareto distributions is problematic, so we develop two alternative algorithms: the first is based on Simulated Annealing and the second on Cross-Entropy minimization. The Pareto distribution is a commonly used model for heavy-tailed data. It is a two-parameter distribution whose shape parameter determines the degree of heaviness of the tail, so that it can be adapted to data with different features. This work is motivated by an application in the operational risk measurement field: we fit a Pareto mixture to operational losses recorded by a bank in two different business lines. Losses below an unknown threshold, which differs between the two business lines, are discarded, so the observed data are truncated. Thus, under the assumption that each population follows a Pareto distribution, the appropriate model is a mixture of Pareto distributions in which all the parameters have to be estimated.
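
    The sketch below shows the Simulated Annealing route on a two-component Pareto mixture, using scipy's dual_annealing as a stand-in annealer on the negative log-likelihood; the Cross-Entropy alternative is not reproduced. The synthetic data, the parameter bounds and the omission of the truncation induced by the unknown collection thresholds are simplifications for illustration, not the paper's implementation.

```python
# Sketch: two-component Pareto mixture fitted by simulated annealing on the
# negative log-likelihood (truncation below the collection thresholds ignored).
import numpy as np
from scipy.optimize import dual_annealing


def pareto_pdf(x, alpha, xm):
    """Classical Pareto density with shape alpha and scale (threshold) xm."""
    return np.where(x >= xm, alpha * xm**alpha / x**(alpha + 1.0), 0.0)


def neg_loglik(theta, x):
    w, a1, m1, a2, m2 = theta
    dens = w * pareto_pdf(x, a1, m1) + (1.0 - w) * pareto_pdf(x, a2, m2)
    if np.any(dens <= 0.0):
        return 1e10                          # parameters incompatible with the sample
    return -np.log(dens).sum()


rng = np.random.default_rng(0)
# Synthetic losses from two Pareto populations (thresholds 1 and 5).
x = np.concatenate([1.0 * (1.0 + rng.pareto(2.5, 600)),
                    5.0 * (1.0 + rng.pareto(1.2, 400))])

bounds = [(0.05, 0.95),                       # mixing weight
          (0.5, 5.0), (0.5, float(x.min())),  # shape / threshold, component 1
          (0.5, 5.0), (0.5, 10.0)]            # shape / threshold, component 2
result = dual_annealing(neg_loglik, bounds, args=(x,), seed=0)
print("estimated (w, a1, m1, a2, m2):", np.round(result.x, 3))
```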