    Heavy-tailed distributions in VaR calculations

    The essence of the Value-at-Risk (VaR) and Expected Shortfall (ES) computations is the estimation of low quantiles in the portfolio return distributions. Hence, the performance of market risk measurement methods depends on the quality of the distributional assumptions on the underlying risk factors. This chapter is intended as a guide to heavy-tailed models for VaR-type calculations. We first describe stable laws and their lighter-tailed generalizations, the so-called truncated and tempered stable distributions. Next we study the class of generalized hyperbolic laws, which – like tempered stable distributions – can be classified somewhere between infinite-variance stable laws and the Gaussian distribution. Then we discuss copulas, which enable us to construct a multivariate distribution function from the (possibly different) marginal distribution functions of n individual asset returns in a way that takes their dependence structure into account. This dependence structure may no longer be measured by correlation, but by other adequate functions such as rank correlation, comonotonicity or tail dependence. Finally, we provide numerical examples.
    Keywords: Heavy-tailed distribution; Stable distribution; Tempered stable distribution; Generalized hyperbolic distribution; Parameter estimation; Value-at-Risk (VaR); Expected Shortfall (ES); Copula; Filtered historical simulation (FHS)
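
    As a point of reference for the quantile view of risk measures described above, here is a minimal sketch, assuming a Student-t sample as an illustrative stand-in for the stable, tempered stable and generalized hyperbolic models of the chapter (the variable names and parameter values are hypothetical): VaR is estimated as a negated low empirical quantile and ES as the mean loss beyond it.

```python
# Minimal sketch (not the chapter's code): VaR and ES as low-quantile
# estimates of a simulated heavy-tailed return distribution. The Student-t
# sample is an illustrative stand-in for the heavy-tailed models above.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
returns = stats.t.rvs(df=3, scale=0.01, size=100_000, random_state=rng)

alpha = 0.01                             # 99% confidence level
var = -np.quantile(returns, alpha)       # Value-at-Risk: negated low quantile
es = -returns[returns <= -var].mean()    # Expected Shortfall: mean loss beyond VaR
print(f"VaR(99%) = {var:.4f}, ES(99%) = {es:.4f}")
```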

    Computationally intensive Value at Risk calculations

    Market risks are the prospect of financial losses (or gains) due to unexpected changes in market prices and rates. Evaluating the exposure to such risks is nowadays of primary concern to risk managers in financial and non-financial institutions alike. Until the late 1980s market risks were estimated through gap and duration analysis (interest rates), portfolio theory (securities), sensitivity analysis (derivatives) or "what-if" scenarios. However, all these methods either could be applied only to very specific assets or relied on subjective reasoning. …

    Stochastic foundations of undulatory transport phenomena: Generalized Poisson-Kac processes - Part I Basic theory

    This article introduces the notion of Generalized Poisson-Kac (GPK) processes, which generalize the class of "telegrapher's noise dynamics" introduced by Marc Kac in 1974 using Poissonian stochastic perturbations. In GPK processes the stochastic perturbation acts as a switching amongst a set of stochastic velocity vectors controlled by a Markov-chain dynamics. GPK processes possess trajectory regularity (almost everywhere) and an asymptotic Kac limit, namely convergence towards Brownian motion (and towards stochastic dynamics driven by Wiener perturbations), which also characterizes the long-term/long-distance properties of these processes. In this article we introduce the structural properties of GPK processes, leaving all the physical implications to Part II and Part III.
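
    The simplest member of this family is Kac's one-dimensional telegraph process; the sketch below (not the paper's formalism, with an assumed switching rate lam and speed c) simulates the Poissonian switching between velocity states that GPK processes generalize to Markov-chain-driven sets of velocity vectors.

```python
# Minimal sketch: a one-dimensional Poisson-Kac (telegraph) process, the
# special case GPK processes generalize. The velocity flips sign at the
# events of a Poisson process of rate lam; in the Kac limit (lam, c -> inf
# with c**2/lam fixed) the paths converge to Brownian motion.
import numpy as np

rng = np.random.default_rng(1)

def telegraph_path(c=1.0, lam=10.0, T=10.0, dt=1e-3):
    n = int(T / dt)
    x = np.empty(n + 1)
    x[0] = 0.0
    v = c * rng.choice([-1.0, 1.0])      # initial velocity state
    for k in range(n):
        x[k + 1] = x[k] + v * dt         # ballistic motion between switches
        if rng.random() < lam * dt:      # Poissonian switching event
            v = -v
    return x

path = telegraph_path()
print("final position:", path[-1])
```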

    Global sensitivity analysis for stochastic simulators based on generalized lambda surrogate models

    Global sensitivity analysis aims at quantifying the impact of input variability on the variation of the response of a computational model. It has been widely applied to deterministic simulators, for which a set of input parameters has a unique corresponding output value. Stochastic simulators, however, have intrinsic randomness due to their use of (pseudo)random numbers, so they give different results when run twice with the same input parameters but non-common random numbers. Due to this random nature, conventional Sobol' indices, used in global sensitivity analysis, can be extended to stochastic simulators in different ways. In this paper, we discuss three possible extensions and focus on those that depend only on the statistical dependence between input and output. This choice ignores the detailed data-generating process involving the internal randomness, and can thus be applied to a wider class of problems. We propose to use the generalized lambda model to emulate the response distribution of stochastic simulators. Such a surrogate can be constructed without the need for replications. The proposed method is applied to three examples, including two case studies in finance and epidemiology. The results confirm the convergence of the approach for estimating the sensitivity indices even in the presence of strong heteroskedasticity and a small signal-to-noise ratio.
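
    As background for the Sobol' indices mentioned above, here is a minimal sketch assuming a toy stochastic simulator and a plain binning estimator rather than the paper's generalized lambda surrogate; it estimates first-order indices from the statistical dependence between inputs and output only, with the simulator's internal noise simply marginalized out.

```python
# Minimal sketch: first-order Sobol' indices for a toy stochastic simulator,
# estimated by binning each input and comparing the variance of the binned
# conditional means of the output to the total output variance.
import numpy as np

rng = np.random.default_rng(2)

def simulator(x1, x2):
    """Toy stochastic simulator: intrinsic noise makes repeated runs differ."""
    return x1 + 0.5 * x2**2 + 0.3 * rng.standard_normal(x1.shape)

n, n_bins = 50_000, 40
x1 = rng.uniform(-1, 1, n)
x2 = rng.uniform(-1, 1, n)
y = simulator(x1, x2)

def first_order_sobol(x, y, n_bins):
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1))
    idx = np.digitize(x, edges[1:-1])                      # bin index 0..n_bins-1
    cond_means = np.array([y[idx == b].mean() for b in range(n_bins)])
    return cond_means.var() / y.var()                      # Var(E[Y|X_i]) / Var(Y)

print("S1 ~", first_order_sobol(x1, y, n_bins))
print("S2 ~", first_order_sobol(x2, y, n_bins))
```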

    Self-Similar Anisotropic Texture Analysis: the Hyperbolic Wavelet Transform Contribution

    Textures in images can often be well modeled using self-similar processes, while they may at the same time display anisotropy. The present contribution thus aims at studying self-similarity and anisotropy jointly, by focusing on a specific classical class of Gaussian anisotropic self-similar processes. It is first shown that accurate joint estimates of the anisotropy and self-similarity parameters are obtained by replacing the standard 2D discrete wavelet transform with the hyperbolic wavelet transform, which permits the use of different dilation factors along the horizontal and vertical axes. Defining anisotropy requires a reference direction that need not a priori match the horizontal and vertical axes along which the images are digitized; this discrepancy defines a rotation angle. Second, we show that this rotation angle can be jointly estimated. Third, a nonparametric bootstrap-based procedure is described that provides confidence intervals in addition to the estimates themselves and enables the construction of an isotropy test that can be applied to a single texture image. Fourth, the robustness and versatility of the proposed analysis are illustrated by applying it to a large variety of different isotropic and anisotropic self-similar fields. As an illustration, we show that true anisotropy built into the self-similarity can be disentangled from isotropic self-similarity onto which an anisotropic trend has been superimposed.
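
    To make the role of independent dilation factors concrete, the following sketch uses an undecimated Haar tensor-product transform as a simplified stand-in for the hyperbolic wavelet transform; the filters, the scale range and the white-noise placeholder image are assumptions for illustration, not the authors' estimator.

```python
# Minimal sketch: tensor-product detail coefficients with independent dyadic
# scales (2**j1, 2**j2) along the two image axes, the key ingredient that
# allows anisotropy in the scaling to be measured.
import numpy as np
from scipy.ndimage import convolve1d

def haar_filter(j):
    s = 2**j                              # dyadic scale
    return np.concatenate([np.ones(s // 2), -np.ones(s // 2)]) / np.sqrt(s)

def hyperbolic_coeffs(image, j1, j2):
    """Detail coefficients at the anisotropic scale pair (2**j1, 2**j2)."""
    out = convolve1d(image, haar_filter(j1), axis=0, mode="wrap")
    out = convolve1d(out, haar_filter(j2), axis=1, mode="wrap")
    return out

rng = np.random.default_rng(3)
img = rng.standard_normal((256, 256))     # placeholder texture
for j1 in range(1, 4):
    for j2 in range(1, 4):
        c = hyperbolic_coeffs(img, j1, j2)
        print(j1, j2, np.log2(np.mean(c**2)))   # log-variance vs. scale pair
```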

    Building a path-integral calculus: a covariant discretization approach

    Path integrals are a central tool when it comes to describing quantum or thermal fluctuations of particles or fields. Their success dates back to Feynman, who showed how to use them within the framework of quantum mechanics. Since then, path integrals have pervaded all areas of physics where fluctuation effects, quantum and/or thermal, are of paramount importance. Their appeal is based on the fact that one converts a problem formulated in terms of operators into one of sampling classical paths with a given weight. Path integrals are the mirror image of our conventional Riemann integrals, with functions replacing the real numbers one usually sums over. However, unlike conventional integrals, path integration suffers from a serious drawback: in general, one cannot make non-linear changes of variables without committing an error of some sort. Thus, no path-integral based calculus is possible. Here we identify the deep mathematical reasons behind this important caveat, and we come up with cures for systems described by one degree of freedom. Our main result is a construction of path integration free of this longstanding problem, through a direct time-discretization procedure.
    Comment: 22 pages, 2 figures, 1 table. Typos corrected
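
    The discretization sensitivity alluded to above can be illustrated numerically with a classic example from stochastic calculus (background only, not the paper's covariant construction): the value of the discretized integral of W dW depends on where within each time step the integrand is evaluated, which is exactly the kind of ambiguity that makes naive changes of variables delicate.

```python
# Numerical illustration: the left-point (Ito) and midpoint (Stratonovich)
# discretizations of the integral of W dW converge to different values.
import numpy as np

rng = np.random.default_rng(4)
T, n = 1.0, 200_000
dt = T / n
dW = np.sqrt(dt) * rng.standard_normal(n)
W = np.concatenate([[0.0], np.cumsum(dW)])

ito = np.sum(W[:-1] * dW)                    # left-point rule
strat = np.sum(0.5 * (W[:-1] + W[1:]) * dW)  # midpoint rule

print("Ito:         ", ito, " theory (W_T^2 - T)/2 =", 0.5 * (W[-1]**2 - T))
print("Stratonovich:", strat, " theory  W_T^2 / 2   =", 0.5 * W[-1]**2)
```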

    A pedestrian's view on interacting particle systems, KPZ universality, and random matrices

    These notes are based on lectures delivered by the authors at a Langeoog seminar of SFB/TR12 "Symmetries and universality in mesoscopic systems" to a mixed audience of mathematicians and theoretical physicists. After a brief outline of the basic physical concepts of equilibrium and nonequilibrium states, the one-dimensional simple exclusion process is introduced as a paradigmatic nonequilibrium interacting particle system. The stationary measure on the ring is derived and the idea of the hydrodynamic limit is sketched. We then introduce the phenomenological Kardar-Parisi-Zhang (KPZ) equation and explain the associated universality conjecture for surface fluctuations in growth models. This is followed by a detailed exposition of a seminal paper of Johansson that relates the current fluctuations of the totally asymmetric simple exclusion process (TASEP) to the Tracy-Widom distribution of random matrix theory. The implications of this result are discussed within the framework of the KPZ conjecture.
    Comment: 52 pages, 4 figures; to appear in J. Phys. A: Math. Theor.
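
    As a concrete companion to the TASEP discussion, the following minimal sketch (an assumed random-sequential update scheme, not Johansson's analysis) simulates the exclusion dynamics on a ring and measures the stationary current.

```python
# Minimal sketch: random-sequential simulation of the totally asymmetric
# simple exclusion process (TASEP) on a ring. Each attempted move picks a
# site uniformly; the particle hops to its right neighbour only if empty.
import numpy as np

rng = np.random.default_rng(5)
L, density, steps = 200, 0.5, 100_000
occ = rng.permutation(np.arange(L) < int(density * L)).astype(int)

hops = 0
for _ in range(steps):
    i = rng.integers(L)                  # pick a site at random
    j = (i + 1) % L                      # its right neighbour on the ring
    if occ[i] == 1 and occ[j] == 0:      # exclusion rule
        occ[i], occ[j] = 0, 1
        hops += 1                        # count completed hops

print("hops per attempted move:", hops / steps)   # ~ rho*(1-rho) = 0.25
```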