
    Application of importance sampling to the computation of large deviations in non-equilibrium processes

    We present an algorithm for finding the probabilities of rare events in nonequilibrium processes. The algorithm consists of evolving the system with a modified dynamics under which the required event occurs more frequently. By keeping track of the relative weight of phase-space trajectories generated by the modified and the original dynamics, one can obtain the required probabilities. The algorithm is tested on two model systems of steady-state particle and heat transport, where we find a huge improvement over direct simulation methods.
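    The reweighting idea can be illustrated on a toy static example (not the paper's dynamical algorithm): estimate the tail probability P(X > a) of a standard normal by sampling from a shifted "modified" distribution under which the event is typical, then correcting each sample by the likelihood ratio of the original to the modified law.

```python
import numpy as np
from math import erfc, sqrt

rng = np.random.default_rng(0)
a = 4.0          # event threshold: P(X > 4) is about 3.2e-5
n = 100_000

# Sample from the modified distribution N(a, 1), under which {x > a}
# happens about half the time instead of almost never.
x = rng.normal(loc=a, scale=1.0, size=n)

# Relative weight of each sample: density of N(0,1) over density of N(a,1).
weights = np.exp(-a * x + 0.5 * a**2)

# Importance-sampling estimate of the rare-event probability.
estimate = np.mean((x > a) * weights)

# Exact tail probability of the standard normal, for comparison.
exact = 0.5 * erfc(a / sqrt(2.0))
print(estimate, exact)
```

    A direct simulation with the same budget would see only a handful of events (or none), while the reweighted estimate has small relative error.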

    Rare behavior of growth processes via umbrella sampling of trajectories

    We compute probability distributions of trajectory observables for reversible and irreversible growth processes. These results reveal a correspondence between reversible and irreversible processes, at particular points in parameter space, in terms of their typical and atypical trajectories. Thus key features of growth processes can be insensitive to the precise form of the rate constants used to generate them, recalling the insensitivity to microscopic details of certain equilibrium behavior. We obtained these results using a sampling method, inspired by the “s-ensemble” large-deviation formalism, that amounts to umbrella sampling in trajectory space. The method is a simple variant of existing approaches, and applies to ensembles of trajectories controlled by the total number of events. It can be used to determine large-deviation rate functions for trajectory observables in or out of equilibrium.
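    A minimal sketch of umbrella sampling in trajectory space (an illustration of the general s-ensemble idea, not the authors' growth-model implementation): trajectories of N independent Bernoulli(p) events are sampled under the tilted weight exp(-s*A), where A is the total number of events, using Metropolis moves that flip one event at a time.

```python
import numpy as np

rng = np.random.default_rng(1)
N, p, s = 50, 0.3, 1.0              # trajectory length, event prob, tilt

traj = rng.random(N) < p            # initial trajectory of events
samples = []
for sweep in range(4000):
    for _ in range(N):
        i = rng.integers(N)
        # Flipping event i changes the path weight p^A (1-p)^(N-A) e^{-sA}
        # by a simple factor; accept the flip with the Metropolis rule.
        if traj[i]:
            ratio = ((1 - p) / p) * np.exp(s)     # flip 1 -> 0
        else:
            ratio = (p / (1 - p)) * np.exp(-s)    # flip 0 -> 1
        if rng.random() < min(1.0, ratio):
            traj[i] = not traj[i]
    if sweep >= 500:                # discard burn-in sweeps
        samples.append(traj.sum())

mean_A = np.mean(samples)
# For independent events the tilted mean is available in closed form.
exact = N * p * np.exp(-s) / (p * np.exp(-s) + 1 - p)
print(mean_A, exact)
```

    With s > 0 the biased ensemble concentrates on trajectories with atypically few events, which direct sampling would visit only rarely.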

    Two quantum analogues of Fisher information from a large deviation viewpoint of quantum estimation

    We discuss two quantum analogues of Fisher information, the symmetric logarithmic derivative (SLD) Fisher information and the Kubo-Mori-Bogoljubov (KMB) Fisher information, from a large deviation viewpoint of quantum estimation, and prove that the former gives the true bound while the latter gives the bound for consistent superefficient estimators. In another comparison, it is shown that the difference between them is characterized by a change of the order of limits.
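    As a reminder, the two quantities compared are the standard ones: the SLD operator L_theta is defined implicitly through the symmetrized derivative of the state, and the two Fisher informations for a family of density operators rho_theta read

```latex
\partial_\theta \rho_\theta \;=\; \tfrac{1}{2}\bigl( L_\theta \rho_\theta + \rho_\theta L_\theta \bigr),
\qquad
J^{\mathrm{SLD}}(\theta) \;=\; \operatorname{Tr}\!\bigl[ \rho_\theta L_\theta^{2} \bigr],
\qquad
J^{\mathrm{KMB}}(\theta) \;=\; \operatorname{Tr}\!\bigl[ (\partial_\theta \rho_\theta)\, \partial_\theta \log \rho_\theta \bigr].
```

    The two expressions coincide when the family commutes, i.e. in the classical case, and differ in general.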

    Systemic Risk and Default Clustering for Large Financial Systems

    As is known in the finance, risk, and macroeconomics literature, risk-sharing in large portfolios may increase the probability of the creation of default clusters and of systemic risk. We review recent developments in mathematical and computational tools for the quantification of such phenomena. Limiting analysis, such as laws of large numbers and central limit theorems, allows us to approximate the distribution in large systems and to study quantities such as the loss distribution in large portfolios. Large deviations analysis allows us to study the tail of the loss distribution and to identify pathways to default clustering. Sensitivity analysis allows us to understand the most likely ways in which different effects, such as contagion and systematic risk, combine to lead to large default rates. Such results could give useful insights into how to optimally safeguard against such events. Published in: Large Deviations and Asymptotic Methods in Finance (eds. P. Friz, J. Gatheral, A. Gulisashvili, A. Jacquier, J. Teichmann), Springer Proceedings in Mathematics and Statistics, Vol. 110, 2015.
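    A hypothetical one-factor illustration of default clustering (not a model from the review itself): when defaults of n names are partly driven by a shared systematic factor, losses cluster together and the tail of the loss distribution becomes far heavier than under independent defaults.

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(2)
n, trials, pd_ = 500, 10_000, 0.02          # names, scenarios, default prob
thresh = NormalDist().inv_cdf(pd_)          # default threshold Phi^{-1}(0.02)

def loss_fraction(corr):
    """Fraction of the portfolio defaulting in each simulated scenario."""
    z = rng.standard_normal((trials, 1))    # shared systematic factor
    eps = rng.standard_normal((trials, n))  # idiosyncratic noise per name
    x = np.sqrt(corr) * z + np.sqrt(1 - corr) * eps
    return (x < thresh).mean(axis=1)

# Probability that more than 10% of the portfolio defaults.
tail_indep = np.mean(loss_fraction(0.0) > 0.10)   # independent defaults
tail_corr = np.mean(loss_fraction(0.3) > 0.10)    # correlated defaults
print(tail_indep, tail_corr)
```

    With zero correlation the 10% loss level is essentially unreachable; with moderate factor correlation it occurs in a few percent of scenarios, which is the clustering effect the review quantifies with large deviations tools.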

    Minimum memory for generating rare events

    We classify the rare events of structured, memoryful stochastic processes and use this to analyze sequential and parallel generators for these events. Given a stochastic process, we introduce a method to construct a process whose typical realizations are a given process's rare events. This leads to an expression for the minimum memory required to generate rare events. We then show that the recently discovered classical-quantum ambiguity of simplicity also occurs when comparing the structure of process fluctuations.
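    The generic version of this construction is the exponential tilting (Doob transform) of a Markov chain; the sketch below (an illustration, with details differing from the paper) builds, for a two-state chain and the time spent in state 0 as observable, the "driven" chain whose typical trajectories realize the original chain's rare behavior.

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.2, 0.8]])          # original transition matrix
f = np.array([1.0, 0.0])            # observable: indicator of state 0
s = 1.0                             # tilting parameter

# Tilt each transition by exp(s * f(destination)).
tilted = P * np.exp(s * f)[None, :]
eigvals, eigvecs = np.linalg.eig(tilted)
k = np.argmax(eigvals.real)
lam = eigvals.real[k]                       # dominant eigenvalue
r = np.abs(eigvecs[:, k].real)              # dominant right eigenvector

# Doob transform: a proper stochastic matrix whose typical trajectories
# are the s-tilted (rare) trajectories of the original chain.
driven = tilted * r[None, :] / (lam * r[:, None])

def stationary(M):
    """Stationary distribution of a stochastic matrix."""
    w, v = np.linalg.eig(M.T)
    pi = np.abs(v[:, np.argmax(w.real)].real)
    return pi / pi.sum()

print(stationary(P)[0], stationary(driven)[0])
```

    Under the original dynamics the chain spends about two thirds of its time in state 0; under the driven dynamics that atypically large occupation becomes the typical behavior.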

    Quadratic optimal functional quantization of stochastic processes and numerical applications

    In this paper, we present an overview of recent developments in the functional quantization of stochastic processes, with an emphasis on the quadratic case. Functional quantization is a way to approximate a process, viewed as a Hilbert-space-valued random variable, using a nearest-neighbour projection onto a finite codebook. Special emphasis is placed on computational aspects and numerical applications, in particular the pricing of some path-dependent European options.
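    A toy sketch of the idea (illustrative only; the paper's codebooks are built more carefully, e.g. from the Karhunen-Loeve expansion): approximate Brownian paths on [0,1] by their nearest neighbour, in the discretized L2 norm, among a small codebook of K representative paths fitted by Lloyd/k-means iterations.

```python
import numpy as np

rng = np.random.default_rng(3)
m, n, K = 64, 4000, 8                # time steps, sample paths, codebook size
dt = 1.0 / m
# Simulate n discretized Brownian paths on [0, 1].
paths = np.cumsum(rng.normal(0.0, np.sqrt(dt), (n, m)), axis=1)

# Initialize the codebook with K sample paths, then run Lloyd iterations.
codebook = paths[rng.choice(n, K, replace=False)].copy()
for _ in range(30):
    d2 = ((paths[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
    nearest = d2.argmin(axis=1)              # nearest-neighbour projection
    for k in range(K):
        if np.any(nearest == k):
            codebook[k] = paths[nearest == k].mean(axis=0)

# Mean squared L2 distortion of the quantized approximation.
d2 = ((paths[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
distortion = dt * d2.min(axis=1).mean()
print(distortion)
```

    For reference, a one-point codebook (the zero path) gives distortion E||W||^2 = 1/2; even this crude K = 8 quantizer does substantially better.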

    Effective bandwidth of non-Markovian packet traffic

    We demonstrate the application of recent advances in statistical mechanics to a problem in telecommunication engineering: the assessment of the quality of a communication channel in terms of rare and extreme events. In particular, we discuss non-Markovian models for telecommunication traffic in continuous time and deploy the "cloning" procedure of non-equilibrium statistical mechanics to efficiently compute their effective bandwidths. The cloning method allows us to evaluate the performance of a traffic protocol even in the absence of analytical results, which are often hard to obtain when the dynamics are non-Markovian.
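    A minimal discrete-time cloning sketch (illustrative, and deliberately simpler than the paper's continuous-time non-Markovian traffic models): a population of clones of a Markov-modulated arrival process is evolved and resampled with weights exp(theta * arrival); the population growth rate estimates the scaled cumulant generating function Lambda(theta), and Lambda(theta)/theta is the effective bandwidth.

```python
import numpy as np

rng = np.random.default_rng(4)
P = np.array([[0.95, 0.05],         # modulating chain: state 0 quiet,
              [0.10, 0.90]])        # state 1 bursty
rate = np.array([0.1, 0.9])         # per-step arrival probability by state
theta, n_clones, T = 0.5, 1000, 2000

states = np.zeros(n_clones, dtype=int)
log_growth = 0.0
for _ in range(T):
    # Advance every clone's modulating state, then draw its arrival.
    states = (rng.random(n_clones) < P[states, 1]).astype(int)
    arrivals = rng.random(n_clones) < rate[states]
    w = np.exp(theta * arrivals)
    log_growth += np.log(w.mean())
    # Resample clones proportionally to their weights.
    idx = rng.choice(n_clones, size=n_clones, p=w / w.sum())
    states = states[idx]

scgf = log_growth / T               # cloning estimate of Lambda(theta)
# Exact SCGF from the dominant eigenvalue of the tilted transition matrix,
# available here because this toy source is Markovian.
tilt = P * (1.0 - rate + rate * np.exp(theta))[None, :]
exact = float(np.log(np.max(np.linalg.eigvals(tilt).real)))
print(scgf, exact, scgf / theta)    # last value: effective bandwidth
```

    The point of the cloning procedure is that the population estimate remains available when no tilted-matrix (or any analytical) solution exists, as for the non-Markovian traffic models considered in the paper.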