
    Systemic Risk and Default Clustering for Large Financial Systems

    As is known in the financial risk and macroeconomics literature, risk-sharing in large portfolios may increase the probability of default clustering and of systemic risk. We review recent developments on mathematical and computational tools for the quantification of such phenomena. Limiting analysis, such as laws of large numbers and central limit theorems, allows us to approximate the distribution in large systems and to study quantities such as the loss distribution in large portfolios. Large deviations analysis allows us to study the tail of the loss distribution and to identify pathways to default clustering. Sensitivity analysis allows us to understand the most likely ways in which different effects, such as contagion and systematic risk, combine to lead to large default rates. Such results could give useful insights into how to optimally safeguard against such events.
    Comment: in Large Deviations and Asymptotic Methods in Finance (Editors: P. Friz, J. Gatheral, A. Gulisashvili, A. Jacquier, J. Teichmann), Springer Proceedings in Mathematics and Statistics, Vol. 110, 2015
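
    As a concrete illustration of the law-of-large-numbers approximation, here is a minimal Monte Carlo sketch (ours, not from the paper; it assumes a one-factor Gaussian copula portfolio, and the parameters p and rho are arbitrary choices): conditional on the systematic factor, the loss fraction of a large portfolio concentrates at a deterministic limit, so the tail of the loss distribution is driven by the common factor.

        # Minimal sketch (not from the paper): loss fraction of a large exchangeable
        # portfolio in a one-factor Gaussian copula model. Conditional on the
        # systematic factor Z, defaults are independent, so the loss fraction
        # concentrates at its conditional mean as the number of names grows.
        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(0)
        n_names, n_sims = 1000, 5000
        p, rho = 0.02, 0.3           # marginal default probability, factor loading
        c = norm.ppf(p)              # default threshold

        Z = rng.standard_normal(n_sims)                  # shared systematic factor
        eps = rng.standard_normal((n_sims, n_names))     # idiosyncratic noise
        X = np.sqrt(rho) * Z[:, None] + np.sqrt(1 - rho) * eps
        loss_frac = (X < c).mean(axis=1)                 # realized loss fraction

        # LLN limit: given Z, the loss fraction tends to
        # Phi((c - sqrt(rho) * Z) / sqrt(1 - rho)).
        lln = norm.cdf((c - np.sqrt(rho) * Z) / np.sqrt(1 - rho))
        print("max |simulated - LLN limit|:", np.abs(loss_frac - lln).max())
        print("P(loss fraction > 10%):", (loss_frac > 0.10).mean())

    The printed maximum deviation shrinks as n_names grows, which is the law-of-large-numbers effect the abstract refers to; the tail probability, by contrast, is controlled by the shared factor Z.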

    Two quantum analogues of Fisher information from a large deviation viewpoint of quantum estimation

    We discuss two quantum analogues of Fisher information, the symmetric logarithmic derivative (SLD) Fisher information and the Kubo-Mori-Bogoljubov (KMB) Fisher information, from a large deviation viewpoint of quantum estimation, and prove that the former gives the true bound while the latter gives the bound attained by consistent superefficient estimators. In another comparison, we show that the difference between them is characterized by a change in the order of limits.
    Comment: LaTeX with iopart.cls, iopart12.clo, iopams.sty
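
    For reference, the two quantities compared here admit the following standard definitions (our notation, following the usual textbook conventions rather than necessarily the paper's):

        % SLD Fisher information: the SLD L_theta solves the Lyapunov equation
        \partial_\theta \rho_\theta
          = \tfrac{1}{2}\left( L_\theta \rho_\theta + \rho_\theta L_\theta \right),
        \qquad
        J^{\mathrm{SLD}}_\theta = \operatorname{Tr}\left( \rho_\theta L_\theta^{2} \right),

        % KMB Fisher information, via the Kubo-Mori inner product
        J^{\mathrm{KMB}}_\theta
          = \operatorname{Tr}\left( \partial_\theta \rho_\theta \,
              \partial_\theta \log \rho_\theta \right)
          = \int_0^1 \operatorname{Tr}\left( \rho_\theta^{s}\,
              \partial_\theta \log \rho_\theta \,
              \rho_\theta^{1-s}\, \partial_\theta \log \rho_\theta \right) ds .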

    Quadratic optimal functional quantization of stochastic processes and numerical applications

    In this paper, we present an overview of recent developments in the functional quantization of stochastic processes, with an emphasis on the quadratic case. Functional quantization is a way to approximate a process, viewed as a Hilbert-space-valued random variable, using a nearest-neighbour projection onto a finite codebook. Special emphasis is placed on computational aspects and numerical applications, in particular the pricing of some path-dependent European options.
    Comment: 41 pages
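
    The nearest-neighbour projection is easy to prototype. The sketch below (ours; it discretizes Brownian motion on a uniform grid and fits the codebook with Lloyd's algorithm via scipy.cluster.vq.kmeans2, a stand-in for the optimized quadratic codebooks discussed in the paper) quantizes paths and evaluates a toy path-dependent functional as a finite weighted sum over codebook paths.

        # Sketch (ours): quadratic functional quantization of Brownian motion.
        # Paths are vectors in R^d under a discretized L^2([0,1]) norm; the
        # codebook is fitted by Lloyd's algorithm, and quantizing a path is a
        # nearest-neighbour projection onto the codebook.
        import numpy as np
        from scipy.cluster.vq import kmeans2

        rng = np.random.default_rng(1)
        d, n_paths, n_codes = 100, 20000, 16
        dt = 1.0 / d
        paths = np.cumsum(np.sqrt(dt) * rng.standard_normal((n_paths, d)), axis=1)

        codebook, labels = kmeans2(paths, n_codes, minit="++")

        # Quadratic distortion E||W - q(W)||^2 in the discretized L^2 norm.
        distortion = (dt * ((paths - codebook[labels]) ** 2).sum(axis=1)).mean()
        print("codebook size:", n_codes, "quadratic distortion:", distortion)

        # A path-dependent functional is then approximated by a finite
        # weighted sum over the codebook paths (toy Asian-type payoff here).
        weights = np.bincount(labels, minlength=n_codes) / n_paths
        payoff = np.maximum(codebook.mean(axis=1), 0.0)
        print("quantized estimate:", np.dot(weights, payoff))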

    Effective bandwidth of non-Markovian packet traffic

    We demonstrate the application of recent advances in statistical mechanics to a problem in telecommunication engineering: the assessment of the quality of a communication channel in terms of rare and extreme events. In particular, we discuss non-Markovian models for telecommunication traffic in continuous time and deploy the "cloning" procedure of non-equilibrium statistical mechanics to efficiently compute their effective bandwidths. The cloning method allows us to evaluate the performance of a traffic protocol even in the absence of analytical results, which are often hard to obtain when the dynamics are non-Markovian.
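
    A minimal sketch of the cloning idea (ours, not the paper's implementation): estimate the scaled cumulant generating function (SCGF) of the arrival process by evolving a population of clones, reweighting each by exp(theta * arrivals) per step, and resampling; the effective bandwidth is then SCGF(theta)/theta. A two-state Markov on/off source is used here purely to keep the example short, the paper's point being that the same cloning step applies to non-Markovian traffic.

        # Sketch (ours): cloning / population dynamics for the SCGF of a
        # discrete-time two-state on/off traffic source.
        import numpy as np

        rng = np.random.default_rng(2)
        P = np.array([[0.9, 0.1],      # transition matrix: state 0 = off, 1 = on
                      [0.2, 0.8]])
        arrivals = np.array([0.0, 1.0])    # packets emitted per step in each state

        def scgf_cloning(theta, n_clones=5000, n_steps=2000):
            states = rng.integers(0, 2, size=n_clones)
            log_factor = 0.0
            for _ in range(n_steps):
                # evolve each clone one step of the source dynamics
                u = rng.random(n_clones)
                states = np.where(u < P[states, 0], 0, 1)
                # weight clones by exp(theta * arrivals), then resample the
                # population in proportion to the weights
                w = np.exp(theta * arrivals[states])
                log_factor += np.log(w.mean())
                idx = rng.choice(n_clones, size=n_clones, p=w / w.sum())
                states = states[idx]
            return log_factor / n_steps

        theta = 0.5
        lam = scgf_cloning(theta)
        print("SCGF:", lam, "effective bandwidth:", lam / theta)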

    Efficient rare event simulation by optimal nonequilibrium forcing

    Rare event simulation and estimation for systems in equilibrium are among the most challenging topics in molecular dynamics. As was shown by Jarzynski and others, nonequilibrium forcing can theoretically be used to obtain equilibrium rare event statistics. The advantage seems to be that the external force can speed up the sampling of rare events by biasing the equilibrium distribution towards a distribution under which the rare events are no longer rare. Yet algorithmic methods based on Jarzynski's and related results often fail to be efficient because they are based on sampling in path space. We present a new method that replaces the path sampling problem by the minimization of a cross-entropy-like functional, which boils down to finding the optimal nonequilibrium forcing. We show how to solve the related optimization problem in an efficient way by using an iterative strategy based on milestoning.
    Comment: 15 pages, 7 figures
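
    The cross-entropy idea can be illustrated in one dimension (a toy sketch, ours; the paper works in path space with milestoning, which this does not attempt): iteratively shift the sampling distribution so that the rare event becomes typical, then estimate its probability by importance sampling under the optimized shift.

        # Toy sketch (ours, not the paper's scheme): cross-entropy optimization
        # of a mean-shift "forcing" for the rare event {X > a} under N(0,1),
        # followed by an importance-sampling estimate of its probability.
        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(3)
        a, n = 5.0, 100000
        mu = 0.0                       # forcing: mean shift of the sampler
        for _ in range(20):
            x = mu + rng.standard_normal(n)
            gamma = min(a, np.quantile(x, 0.95))     # adaptive intermediate level
            elite = x >= gamma
            # cross-entropy update: likelihood-ratio-weighted mean of elites
            w = norm.pdf(x[elite]) / norm.pdf(x[elite], loc=mu)
            mu = np.average(x[elite], weights=w)
            if gamma >= a:
                break

        x = mu + rng.standard_normal(n)
        w = norm.pdf(x) / norm.pdf(x, loc=mu)        # density ratio N(0,1)/N(mu,1)
        print("optimized forcing (mean shift):", mu)
        print("IS estimate:", np.mean(w * (x > a)), "exact:", norm.sf(a))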

    Variance Reduction Techniques for Estimating Value-at-Risk

    This paper describes, analyzes, and evaluates an algorithm for estimating portfolio loss probabilities using Monte Carlo simulation. Obtaining accurate estimates of such loss probabilities is essential to calculating value-at-risk, which is a quantile of the loss distribution. The method employs a quadratic ("delta-gamma") approximation to the change in portfolio value to guide the selection of effective variance reduction techniques, specifically importance sampling and stratified sampling. If the approximation is exact, then the importance sampling is shown to be asymptotically optimal. Numerical results indicate that an appropriate combination of importance sampling and stratified sampling can result in large variance reductions when estimating the probability of large portfolio losses.
    Keywords: value-at-risk, Monte Carlo simulation, variance reduction techniques, importance sampling, stratified sampling, rare events
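
    A minimal sketch in the spirit of the method (ours; delta, Gamma, the threshold x_thr, and the standard-normal risk factors are toy stand-ins, and the stratified-sampling component is omitted): the delta-gamma approximation is used to locate the most likely loss point, which then serves as the mean shift for importance sampling.

        # Sketch (ours): delta-gamma-guided importance sampling for
        # P(loss > x). The quadratic approximation locates the most likely
        # point of the loss region under N(0, I); sampling is then shifted
        # there, and estimates are reweighted by the density ratio.
        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(4)
        d, x_thr = 10, 8.0
        delta = np.full(d, 0.3)                       # toy portfolio delta
        Gamma = np.diag(np.linspace(-0.5, 0.5, d))    # toy portfolio gamma

        def dg_loss(z):
            # quadratic (delta-gamma) approximation to the portfolio loss
            return -(delta @ z + 0.5 * z @ Gamma @ z)

        # most likely point: minimize ||z||^2 subject to dg_loss(z) >= x_thr
        res = minimize(lambda z: z @ z, x0=-3.0 * np.ones(d),
                       constraints={"type": "ineq",
                                    "fun": lambda z: dg_loss(z) - x_thr})
        z_star = res.x

        n = 200000
        z = z_star + rng.standard_normal((n, d))           # sample from N(z*, I)
        losses = -(z @ delta + 0.5 * np.einsum("ni,ij,nj->n", z, Gamma, z))
        lr = np.exp(-z @ z_star + 0.5 * z_star @ z_star)   # N(0,I)/N(z*,I) ratio
        print("IS estimate:", np.mean(lr * (losses > x_thr)))

        z0 = rng.standard_normal((n, d))
        losses0 = -(z0 @ delta + 0.5 * np.einsum("ni,ij,nj->n", z0, Gamma, z0))
        print("plain MC estimate:", np.mean(losses0 > x_thr))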