41 research outputs found

    The History of the Quantitative Methods in Finance Conference Series. 1992-2007

    Get PDF
    This report charts the history of the Quantitative Methods in Finance (QMF) conference series from its beginning in 1993 to the 15th conference in 2007. It lists, in alphabetical order, the 1037 speakers who presented across the 15 conferences, together with the titles of their papers.

    Importance sampling for metastable dynamical systems in molecular dynamics

    Get PDF
    The behaviour of molecules is determined by rare events, and these rare events matter. For example, a large conformational change can lead to completely different behaviour of a molecule. Rare events also affect the numerical simulation of molecules, because they can cause a high variance in certain estimators. It is therefore important to develop effective and reliable numerical tools for sampling these rare events. The difficulties that rare events create for the effective sampling of the quantities of interest stem from the stochastic behaviour of the dynamical system and a phenomenon called metastability. Metastability means that a dynamical system remains in a certain region for a comparatively long time before hopping rapidly into another metastable region, and it is one of the most challenging problems for effective sampling. This thesis is about importance sampling strategies for metastable dynamical systems. The main idea is to decrease the metastability in order to obtain estimators with lower variance and to reduce the sampling effort. After an introduction and a presentation of the relevant theory, Chapter 3 explores an idea from global optimization for decreasing the metastability of the dynamical system. We show how the approach can be used for sampling thermodynamic and dynamic quantities and support the results with numerical experiments. In Chapter 4 we use a local approach to decrease the metastability and thus build an importance sampling scheme for dynamic quantities. We draw on the experience of well-known MD algorithms to build good local perturbations; for the importance sampling scheme these algorithms have to be adapted and combined with a result from stochastic analysis. The resulting algorithm is tested in different numerical settings. In Chapter 5 we consider two methods (gradient descent and the cross-entropy method) which have been proposed for finding the optimal perturbation in terms of variance reduction. For gradient descent we develop different gradient estimators, and for the cross-entropy method we develop a non-parametric representation of the optimal perturbation. The results are supported by numerical examples. The thesis finishes with a summary of our findings and an outlook on future research.
German abstract (translated): The behaviour of molecules is determined by rare events. For example, a conformational change can completely alter the functionality of a molecule. Moreover, these rare events have a large influence on numerical simulations of molecules, so it is important to have effective and reliable numerical methods for predicting them. The problems caused by rare events arise mainly from the stochastic behaviour of the dynamical system and a resulting phenomenon called metastability. Metastability means that the dynamical system remains in a certain metastable state for a long time before passing very quickly into another metastable state, which makes it one of the biggest obstacles to the effective estimation of the quantities of interest. In molecular dynamics there are two different kinds of quantities, and the estimation of both is affected by rare events. For thermodynamic quantities many different methods have been developed, but they do not carry over directly to the estimation of dynamic quantities. This thesis is concerned with improving estimation methods for these quantities. The underlying idea is to modify the metastability of the system in order to reduce the simulation effort and to achieve a variance reduction for the estimator. After an introduction and a summary of the relevant theory, Chapter 3 takes up an idea from global optimization to reduce the metastability. We show that the approach can be used both for thermodynamic and for dynamic quantities. In Chapter 4 local approaches are used to develop an importance sampling scheme for dynamic quantities. We use the expertise of well-established MD methods to construct a good local perturbation; for the importance sampling scheme these algorithms have to be adapted and combined with results from stochastic analysis. The method is tested on different examples. The final chapter deals with two methods that can find an optimal perturbation in the sense of variance reduction (gradient descent and the cross-entropy method). For gradient descent, different estimators of the gradient are developed, and with the cross-entropy method a non-parametric approximation of the optimal perturbation is derived. The thesis concludes with a summary and discussion of the results and ideas for future research.
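
As a minimal illustration of the strategy described above (not the thesis's own algorithms), the sketch below applies importance sampling to an overdamped Langevin dynamics in a double-well potential: the drift is perturbed by a bias that flattens the barrier, trajectories are simulated under the perturbed dynamics, and the estimator of a dynamic quantity is reweighted with the discretised Girsanov likelihood ratio. The potential, the bias and all parameters are illustrative assumptions.

```python
import numpy as np

# Overdamped Langevin dynamics dX = -V'(X) dt + sqrt(2/beta) dW in a
# double-well potential. We estimate the probability of reaching the right
# well before time T (a prototypical dynamic quantity), with and without a
# biasing potential, reweighting the biased paths by the Girsanov factor.

rng = np.random.default_rng(0)
beta, T, dt = 4.0, 2.0, 1e-3
n_steps, n_paths = int(T / dt), 5000
sigma = np.sqrt(2.0 / beta)

def grad_V(x):
    return 4.0 * x * (x**2 - 1.0)        # V(x) = (x^2 - 1)^2

def grad_U(x):
    return -3.2 * x * (x**2 - 1.0)       # bias U = -0.8 V flattens the barrier (assumption)

def estimate(biased):
    x = np.full(n_paths, -1.0)           # all paths start in the left well
    log_w = np.zeros(n_paths)            # running log of dP/dQ along each path
    hit = np.zeros(n_paths, dtype=bool)
    for _ in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt), n_paths)
        drift = -grad_V(x)
        if biased:
            c = -grad_U(x)               # extra drift contributed by the bias
            u = c / sigma
            log_w += -u * dW - 0.5 * u**2 * dt
            drift = drift + c
        x = x + drift * dt + sigma * dW
        hit |= x > 0.9                   # path has reached the right well
    vals = hit * np.exp(log_w)
    return vals.mean(), vals.std() / np.sqrt(n_paths)

for biased in (False, True):
    est, se = estimate(biased)
    label = "importance sampling" if biased else "plain Monte Carlo  "
    print(f"{label}: {est:.4f} +/- {se:.4f}")
```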

    On the probability density function of baskets

    Full text link
    The state price density of a basket, even under uncorrelated Black-Scholes dynamics, does not admit a closed-form density. (This may be rephrased as a statement about the sum of lognormals, and is especially annoying because such sums are used very frequently in Financial and Actuarial Mathematics.) In this note we discuss short-time and small-volatility expansions, respectively. The method works for general multi-factor models with correlations and leads to the analysis of a system of ordinary (Hamiltonian) differential equations. Surprisingly perhaps, even in the two-asset Black-Scholes situation (with its flat geometry), the expansion can degenerate at a critical (basket) strike level; a phenomenon which seems to have gone unnoticed in the literature to date. Explicit computations relate this to a phase transition from a unique to more than one "most-likely" path (along which the diffusion, if suitably conditioned, concentrates in the aforementioned regimes). This also provides a (quantifiable) understanding of how precisely a presently out-of-the-money basket option may still end up in-the-money. (Comment: appeared in Large Deviations and Asymptotic Methods in Finance, Springer Proceedings in Mathematics & Statistics, editors: Friz, P.K., Gatheral, J., Gulisashvili, A., Jacquier, A., Teichmann, J., 2015, with minor typos removed.)
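
For readers who want to see the underlying difficulty concretely, the following sketch (an illustration, not the expansion method of the paper) estimates the density of a two-asset Black-Scholes basket, i.e. a weighted sum of lognormals, by plain Monte Carlo and a histogram; all parameters are assumptions.

```python
import numpy as np

# Density of B_T = w1*S1_T + w2*S2_T under uncorrelated Black-Scholes
# dynamics: a sum of lognormals, for which no closed-form density exists,
# estimated here from a histogram of Monte Carlo samples.

rng = np.random.default_rng(1)
S0 = np.array([100.0, 100.0])
sigma = np.array([0.2, 0.3])
w = np.array([0.5, 0.5])
r, T, n = 0.01, 1.0, 200_000

Z = rng.standard_normal((n, 2))
ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * Z)
basket = ST @ w

hist, edges = np.histogram(basket, bins=200, density=True)
centres = 0.5 * (edges[:-1] + edges[1:])
k = np.argmax(hist)
print(f"approximate mode of basket density {centres[k]:.2f}, "
      f"sample mean {basket.mean():.2f}")
```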

    Molecular Dynamics Simulation

    Get PDF
    Condensed matter systems, ranging from simple fluids and solids to complex multicomponent materials and even biological matter, are governed by well-understood laws of physics, within the formal theoretical framework of quantum theory and statistical mechanics. On the relevant scales of length and time, the appropriate ‘first-principles’ description needs only the Schroedinger equation together with Gibbs averaging over the relevant statistical ensemble. However, this program cannot be carried out straightforwardly: dealing with electron correlations is still a challenge for the methods of quantum chemistry. Similarly, standard statistical mechanics makes precise explicit statements only on the properties of systems for which the many-body problem can be effectively reduced to one of independent particles or quasi-particles. [...]
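
As background for the classical simulation route that the chapter title refers to, here is a minimal velocity Verlet integration of a Lennard-Jones dimer in reduced units; this is a generic textbook building block, not code from the chapter, and all parameters are assumptions.

```python
import numpy as np

# Velocity Verlet integration of a Lennard-Jones dimer in reduced units
# (epsilon = sigma = m = 1): the basic time-stepping scheme of classical MD.

def lj_force(r_vec):
    """Force on particle 1 for V(r) = 4*(r^-12 - r^-6), r_vec = x1 - x2."""
    r2 = np.dot(r_vec, r_vec)
    inv6 = 1.0 / r2**3
    return 24.0 * (2.0 * inv6**2 - inv6) * r_vec / r2

def pair_forces(x):
    f1 = lj_force(x[0] - x[1])
    return np.array([f1, -f1])           # Newton's third law

dt, n_steps = 1e-3, 10_000
x = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0]])   # positions
v = np.zeros_like(x)                                # velocities
f = pair_forces(x)

for _ in range(n_steps):
    v += 0.5 * dt * f                    # half kick
    x += dt * v                          # drift
    f = pair_forces(x)
    v += 0.5 * dt * f                    # second half kick

r = np.linalg.norm(x[0] - x[1])
print(f"final separation {r:.3f}; the dimer oscillates about the "
      f"potential minimum at 2^(1/6) ~ 1.122")
```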

    Numerical schemes and Monte Carlo techniques for Greeks in stochastic volatility models

    Get PDF
    The main objective of this thesis is to propose approximations to option sensitivities in stochastic volatility models. The first part explores sequential Monte Carlo techniques for approximating the latent state in a Hidden Markov Model. These techniques are applied to the computation of Greeks by adapting the likelihood ratio method. Convergence of the Greek estimates is proved, and tracking of option prices is performed in a stochastic volatility model. The second part defines a class of approximate Greek weights and provides high-order approximations and justification for extrapolation techniques. Under certain regularity assumptions on the value function of the problem, Greek approximations are proved for a fully implementable Monte Carlo framework using weak Taylor discretisation schemes. The variance and bias are studied for the Delta and Gamma when such discrete-time approximations are used. The final part of the thesis introduces a modified explicit Euler scheme for stochastic differential equations with non-Lipschitz-continuous drift or diffusion, and a strong rate of convergence is proved. The literature on discretisation techniques for stochastic differential equations has motivated the development of techniques that preserve the explicitness of the algorithm. Stochastic differential equations from the mathematical finance literature, including the Cox-Ingersoll-Ross, 3/2 and Ait-Sahalia models, can be discretised with a proven strong rate of convergence, which is a requirement for multilevel Monte Carlo techniques.
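
The likelihood ratio idea mentioned in the first part can be illustrated in the simplest possible setting, a plain Black-Scholes call rather than the stochastic volatility and sequential Monte Carlo framework of the thesis; the sketch below estimates the Delta with the standard score-function weight and compares it with the closed form. All parameters are assumptions.

```python
import numpy as np
from scipy.stats import norm

# Likelihood ratio (score function) estimator of the Black-Scholes Delta:
#   Delta = E[ e^{-rT} (S_T - K)^+ * Z / (S0 * sigma * sqrt(T)) ],
# compared against the closed-form Delta N(d1).

rng = np.random.default_rng(2)
S0, K, r, sigma, T, n = 100.0, 110.0, 0.02, 0.25, 1.0, 1_000_000

Z = rng.standard_normal(n)
ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * Z)
payoff = np.exp(-r * T) * np.maximum(ST - K, 0.0)

weight = Z / (S0 * sigma * np.sqrt(T))   # score of the lognormal density w.r.t. S0
delta_lr = np.mean(payoff * weight)
stderr = np.std(payoff * weight) / np.sqrt(n)

d1 = (np.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
print(f"LR Delta {delta_lr:.4f} +/- {stderr:.4f}, closed form {norm.cdf(d1):.4f}")
```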

    Financial cycles, credit networks and macroeconomic fluctuations: multi-scale stochastic models and wavelet analysis

    Get PDF
    This project focuses on the macroeconomics of financial cycles. Financial cycles are usually defined in terms of self-reinforcing interactions between perceptions of value and risk, attitudes towards risk and financing constraints, which translate into booms followed by busts. The recent empirical literature has relied on two approaches, turning-point analysis and frequency-based filters, applied to measures of credit and asset prices to establish a number of stylized facts. First, financial cycles tend to display a greater amplitude and a lower frequency than business cycles, with peaks associated with systemic crises. Second, financial cycles depend on policy regimes and on the pace of financial innovation, leading to wide cross-country heterogeneity and a time-varying degree of global synchronization. The latter point is clearly related to the structural transformations that have occurred in financial systems over the last three decades, such as the cumulative integration of traditional banking with capital market developments and the increasing degree of interconnection among financial institutions. However, to date very little is known about the determinants and mechanisms behind financial cycles, or about how they interact with business cycles and with medium-to-long-run macroeconomic performance. In this project we plan to research along three dimensions: i) measurement issues, in order to provide a comprehensive assessment of the evolution of co-movements between financial and real variables across a sample of financially developed countries, both over time and at different frequencies; ii) theoretical issues, aimed at exploring under what circumstances the network of interconnections among financial intermediaries, and between intermediaries and non-financial borrowers, might evolve cyclically, thereby helping to regulate the incentives agents have to take risks and to determine the importance of credit and financial frictions in accounting for time-varying misallocations of resources; iii) policy issues, given the role assigned by international supervisory bodies to a proper characterization and knowledge of the financial cycle as a prerequisite for the macro-prudential regulation of banks, and the scope for monetary policy to promote financial stability in addition to its typical mandate of price stability. Our task requires the employment of a new approach to macroeconomic analysis, diverse analytical tools and one unifying economic principle. As regards the latter, our focal point is the notion of risk externalities, across financial institutions and between the financial sector and the real economy. The set of tools we plan to employ spans from wavelet methods to multi-scale models in continuous time, and from strategic network formation to agent-based computational techniques. All these tools are instrumental in building and estimating macroeconomic models characterized by interrelated markets operating at different time scales.
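
As a toy illustration of the frequency-based view of cycles (not part of the project's own toolkit, which relies on wavelet methods and multi-scale models), the sketch below separates a synthetic quarterly series into a shorter 'business-cycle' band and a longer 'financial-cycle' band with a crude FFT band-pass filter; the series and the band edges are assumptions.

```python
import numpy as np

# Toy frequency-based decomposition of a synthetic quarterly series into a
# business-cycle band (2-8 years) and a financial-cycle band (8-30 years)
# using a simple FFT band-pass filter.

rng = np.random.default_rng(3)
n_quarters = 240                                  # 60 years of quarterly data
t = np.arange(n_quarters)
series = (np.sin(2 * np.pi * t / 24)              # 6-year "business" cycle
          + 2.0 * np.sin(2 * np.pi * t / 64)      # 16-year "financial" cycle
          + 0.5 * rng.standard_normal(n_quarters))

def band_pass(x, p_min, p_max):
    """Keep only Fourier components with period (in quarters) in [p_min, p_max]."""
    freqs = np.fft.rfftfreq(len(x), d=1.0)        # cycles per quarter
    spec = np.fft.rfft(x)
    with np.errstate(divide="ignore"):
        period = np.where(freqs > 0, 1.0 / freqs, np.inf)
    spec[(period < p_min) | (period > p_max)] = 0.0
    return np.fft.irfft(spec, n=len(x))

business = band_pass(series, 8, 32)               # 2-8 year periods
financial = band_pass(series, 32, 120)            # 8-30 year periods
print("std of business-cycle component :", round(business.std(), 3))
print("std of financial-cycle component:", round(financial.std(), 3))
```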

    Properties and advances of probabilistic and statistical algorithms with applications in finance

    Get PDF
    This thesis is concerned with the construction and enhancement of algorithms involving probability and statistics. The main motivation comes from problems that appear in finance and, more generally, in applied science. We consider three distinct areas, namely credit risk modelling, numerics for McKean-Vlasov stochastic differential equations, and stochastic representations of partial differential equations (PDEs); the thesis is therefore split into three parts.
Firstly, we consider the problem of estimating a continuous-time Markov chain (CTMC) generator from discrete-time observations, which is essentially a missing-data problem in statistics. These generators give rise to transition probabilities (in particular, probabilities of default) over any time horizon, hence the estimation of such generators is a key problem in banking, where the regulator requires banks to calculate risk over different time horizons. Several algorithms have been proposed for this problem; through a combination of theoretical and numerical results we show the Expectation Maximisation (EM) algorithm to be the superior choice. Furthermore, we derive closed-form expressions for the associated Wald confidence intervals (errors) estimated by the EM algorithm; previous attempts to calculate such intervals relied on numerical schemes which were slower and less stable. We further provide a closed-form expression (via the Delta method) to transfer these errors to the level of the transition probabilities, which are more intuitive. Although one can establish more precise mathematical results under the Markov assumption, there is empirical evidence suggesting that this assumption is not valid. We finish this part by carrying out empirical research on non-Markov phenomena and propose a model to capture the so-called rating momentum. This model has many appealing features and is a natural extension of the Markov set-up.
The second part concerns McKean-Vlasov stochastic differential equations (MV-SDEs). These stochastic differential equations (SDEs) arise from taking the limit as the number of weakly interacting particles (e.g. gas particles) tends to infinity; the resulting SDE has coefficients which can depend on its own law, making it theoretically more involved. Although MV-SDEs arise from statistical physics, there has recently been an explosion of interest in using MV-SDEs in models for economics. We first derive an explicit approximation scheme for MV-SDEs with one-sided Lipschitz growth in the drift. Such a condition was already observed to be an issue for standard SDEs and to require more sophisticated schemes; there are implicit and explicit schemes one can use, and we develop both types in the setting of MV-SDEs. Another main issue for MV-SDEs is that, owing to the dependence on their own law, they are extremely expensive to simulate compared with standard SDEs, hence techniques that reduce the computational cost are in demand. The final result in this part is an importance sampling algorithm for MV-SDEs, where the measure change is obtained through the theory of large deviation principles. Although importance sampling results for standard SDEs are reasonably well understood, several difficulties must be overcome to apply a good importance sampling change of measure in this setting. Importance sampling is used here as a variance reduction technique, although our results hint that it may also be usable to reduce the propagation-of-chaos error.
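
The object discretised in the second part can be illustrated with the standard interacting-particle Euler-Maruyama scheme, shown here for a toy MV-SDE whose drift depends on the law only through its mean; this is a globally Lipschitz example for illustration, not the one-sided Lipschitz setting or the schemes developed in the thesis.

```python
import numpy as np

# Interacting-particle Euler-Maruyama scheme for the toy McKean-Vlasov SDE
#   dX_t = -(X_t - E[X_t]) dt + sigma dW_t,
# where the unknown law enters only through its mean and is replaced by the
# empirical mean of N particles.

rng = np.random.default_rng(4)
N, T, n_steps, sigma = 2000, 1.0, 200, 0.5
dt = T / n_steps

X = rng.normal(2.0, 1.0, size=N)      # initial particles ~ N(2, 1)
for _ in range(n_steps):
    m = X.mean()                      # empirical substitute for E[X_t]
    dW = rng.normal(0.0, np.sqrt(dt), size=N)
    X = X - (X - m) * dt + sigma * dW

# For this linear example E[X_T] = E[X_0] = 2, which the particle mean
# should match up to O(1/sqrt(N)) and discretisation error.
print(f"particle mean at T: {X.mean():.3f} (exact mean 2.000)")
```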
Finally, we consider stochastic algorithms for solving PDEs. It is known that one can achieve numerical advantages by using probabilistic methods to solve PDEs, through the so-called probabilistic domain decomposition method. The main result of this part is an unbiased stochastic representation for a first-order PDE, based on the theory of branching diffusions and regime switching. This is a very interesting result, since previously such (ItĂŽ-based) stochastic representations applied only to second-order PDEs. There are multiple issues one must overcome in order to obtain an algorithm that is numerically stable and solves such a PDE. We conclude by showing the algorithm’s potential on a more general first-order PDE.
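
The branching-diffusion and regime-switching representation for first-order PDEs is the thesis's own contribution and is not reproduced here; as background for what a stochastic representation buys numerically, the sketch below evaluates the solution of a second-order (heat) equation at a single point via the classical ItĂŽ-based Feynman-Kac formula, without assembling any spatial grid. The terminal condition and parameters are assumptions.

```python
import numpy as np

# Feynman-Kac Monte Carlo evaluation of u(t, x) solving the backward heat
# equation  u_t + 0.5 * sigma^2 * u_xx = 0,  u(T, x) = g(x):
#     u(t, x) = E[ g(x + sigma * sqrt(T - t) * Z) ],  Z ~ N(0, 1).
# The solution is obtained at a single point without any spatial grid, which
# is the kind of advantage probabilistic domain decomposition exploits.

rng = np.random.default_rng(5)
sigma, T, t, x, n = 1.0, 1.0, 0.0, 0.3, 1_000_000
g = np.cos                                     # terminal condition (assumption)

Z = rng.standard_normal(n)
samples = g(x + sigma * np.sqrt(T - t) * Z)
u_mc = samples.mean()
stderr = samples.std() / np.sqrt(n)

u_exact = np.cos(x) * np.exp(-0.5 * sigma**2 * (T - t))   # closed form for g = cos
print(f"u({t}, {x}) ~ {u_mc:.4f} +/- {stderr:.4f}, exact {u_exact:.4f}")
```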