    Langevin and Hamiltonian based Sequential MCMC for Efficient Bayesian Filtering in High-dimensional Spaces

    Nonlinear non-Gaussian state-space models arise in numerous applications in statistics and signal processing. In this context, one of the most successful and popular approximation techniques is the Sequential Monte Carlo (SMC) algorithm, also known as particle filtering. Nevertheless, this method tends to be inefficient when applied to high-dimensional problems. In this paper, we focus on another class of sequential inference methods, namely Sequential Markov Chain Monte Carlo (SMCMC) techniques, which represent a promising alternative to SMC methods. After providing a unifying framework for the class of SMCMC approaches, we propose novel efficient strategies based on the principles of Langevin diffusion and Hamiltonian dynamics in order to cope with the increasing number of high-dimensional applications. Simulation results show that the proposed algorithms achieve significantly better performance than existing algorithms.
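The Langevin-diffusion idea behind these SMCMC moves can be illustrated with a single Metropolis-adjusted Langevin (MALA) step. This is a minimal sketch, not the paper's filtering algorithm: the target density, step size, and dimension are stand-ins, and the full SMCMC scheme would apply such moves within a sequential framework.

```python
import numpy as np

def mala_step(x, log_target_grad, log_target, step=0.1, rng=None):
    """One Metropolis-adjusted Langevin (MALA) move: a gradient-informed
    proposal followed by a Metropolis-Hastings accept/reject correction."""
    if rng is None:
        rng = np.random.default_rng()
    # Langevin proposal: drift along the gradient of the log-target plus noise
    noise = rng.standard_normal(x.shape)
    x_prop = x + 0.5 * step**2 * log_target_grad(x) + step * noise

    def log_q(a, b):
        # Log density (up to a constant) of proposing a from b
        mu = b + 0.5 * step**2 * log_target_grad(b)
        return -np.sum((a - mu) ** 2) / (2 * step**2)

    log_alpha = (log_target(x_prop) - log_target(x)
                 + log_q(x, x_prop) - log_q(x_prop, x))
    if np.log(rng.uniform()) < log_alpha:
        return x_prop
    return x
```

Running many such steps on a standard Gaussian target (an assumed toy example) produces a chain whose samples follow the target; a Hamiltonian variant would replace the one-step Langevin drift with a simulated leapfrog trajectory.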

    Loss Distribution Approach for Operational Risk Capital Modelling under Basel II: Combining Different Data Sources for Risk Estimation

    The management of operational risk in the banking industry has undergone significant changes over the last decade due to substantial changes in the operational risk environment. Globalization, deregulation, the use of complex financial products and changes in information technology have resulted in exposure to new risks very different from market and credit risks. In response, the Basel Committee on Banking Supervision has developed a regulatory framework, referred to as Basel II, that introduced an operational risk category and corresponding capital requirements. Over the past five years, major banks in most parts of the world have received accreditation under the Basel II Advanced Measurement Approach (AMA) by adopting the loss distribution approach (LDA), despite a number of unresolved methodological challenges in its implementation. Different approaches and methods are still under active debate. In this paper, we review methods proposed in the literature for combining different data sources (internal data, external data and scenario analysis), which is one of the regulatory requirements for the AMA.
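The core of the LDA is a compound frequency-severity model: annual loss counts (e.g. Poisson) combined with a severity distribution (e.g. lognormal), with capital read off as a high quantile of the aggregate annual loss. The sketch below is a generic Monte Carlo illustration with assumed parameter values, not the paper's method for combining data sources.

```python
import numpy as np

def annual_loss_quantile(lam=25.0, mu=10.0, sigma=2.0, q=0.999,
                         n_years=200_000, seed=0):
    """Monte Carlo quantile of a compound Poisson-lognormal annual loss:
    each simulated year draws a Poisson(lam) number of losses, each with
    lognormal(mu, sigma) severity; capital is the q-quantile of the total."""
    rng = np.random.default_rng(seed)
    counts = rng.poisson(lam, size=n_years)
    totals = np.array([rng.lognormal(mu, sigma, size=n).sum()
                       for n in counts])
    return np.quantile(totals, q)
```

Under Basel II the capital charge corresponds to q = 0.999 (a one-in-a-thousand-year annual loss); combining internal data, external data and scenario analysis amounts to informing the frequency and severity parameters from several sources.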

    Multiple barrier-crossings of an Ornstein-Uhlenbeck diffusion in consecutive periods

    We investigate the joint distribution and the multivariate survival functions for the maxima of an Ornstein-Uhlenbeck (OU) process in consecutive time intervals. A PDE method, alongside an eigenfunction expansion, is adopted, with which we first calculate the distribution and the survival functions for the maximum of a homogeneous OU process in a single interval. By a deterministic time-change and a parameter translation, this result can be extended to an inhomogeneous OU process. Next, we derive a general formula for the joint distribution and the survival functions for the maxima of a continuous Markov process in consecutive periods. With these results, one can obtain semi-analytical expressions for the joint distribution and the multivariate survival functions for the maxima of an OU process, with piecewise constant parameter functions, in consecutive time periods. The joint distribution and the survival functions can be evaluated numerically by an iterated quadrature scheme, which can be implemented efficiently by matrix multiplications. Moreover, we show that the computation can be further simplified to the product of single quadratures if the filtration is enlarged. Such results may be used for the modelling of heatwaves and related risk management challenges.
    Comment: 38 pages, 10 figures, 2 tables
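The quantity of interest, the joint survival function of interval maxima, can also be checked by plain Monte Carlo. The sketch below simulates a homogeneous OU process with Euler-Maruyama and estimates the probability that the running maximum stays below a barrier in each of two consecutive unit intervals; all parameter and barrier values are illustrative, and this replaces the paper's semi-analytical quadrature scheme with brute-force simulation.

```python
import numpy as np

def ou_paths(theta=1.0, mu=0.0, sigma=1.0, x0=0.0, T=2.0,
             n_steps=400, n_paths=5000, seed=0):
    """Euler-Maruyama paths of dX = theta*(mu - X) dt + sigma dW."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    X = np.full((n_paths, n_steps + 1), x0)
    for k in range(n_steps):
        dW = rng.standard_normal(n_paths) * np.sqrt(dt)
        X[:, k + 1] = X[:, k] + theta * (mu - X[:, k]) * dt + sigma * dW
    return X

# Joint survival of the maxima over two consecutive unit intervals:
# P( max_{[0,1]} X < b1  and  max_{[1,2]} X < b2 )
X = ou_paths()
mid = X.shape[1] // 2
m1 = X[:, :mid + 1].max(axis=1)
m2 = X[:, mid:].max(axis=1)
b1 = b2 = 1.5
joint = np.mean((m1 < b1) & (m2 < b2))
```

The joint estimate is necessarily no larger than either marginal survival probability, which gives a quick sanity check on any semi-analytical evaluation.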

    Trends in crypto-currencies and blockchain technologies: A monetary theory and regulation perspective

    The internet era has generated a requirement for low-cost, anonymous and rapidly verifiable transactions to be used for online barter, and fast-settling money has emerged as a consequence. For the most part, e-money has fulfilled this role, but the last few years have seen two new types of money emerge: centralised virtual currencies, usually for the purpose of transacting in social and gaming economies, and crypto-currencies, which aim to eliminate the need for financial intermediaries by offering direct peer-to-peer online payments. We describe the historical context which led to the development of these currencies and some modern and recent trends in their uptake, in terms of both usage in the real economy and as investment products. As these currencies are purely digital constructs, with no government or local authority backing, we then discuss them in the context of monetary theory, in order to determine how they may have value under each. Finally, we provide an overview of the state of regulatory readiness in terms of dealing with transactions in these currencies in various regions of the world.

    Heavy-Tailed Features and Empirical Analysis of the Limit Order Book Volume Profiles in Futures Markets

    This paper poses a few fundamental questions regarding the attributes of the volume profile of a Limit Order Book's stochastic structure by taking into consideration aspects of intraday and interday statistical features, the impact of different exchange features and the impact of market participants in different asset sectors. This paper aims to address the following questions: 1. Is there statistical evidence that heavy-tailed sub-exponential volume profiles occur at different levels of the Limit Order Book on the bid and ask, and if so, does this happen on intra- or interday time scales? 2. In futures exchanges, are heavy-tail features exchange (CBOT, CME, EUREX, SGX and COMEX) or asset class (government bonds, equities and precious metals) dependent, and do they happen on ultra-high (<1 sec) or mid-range (1 sec - 10 min) high-frequency data? 3. Does the presence of stochastic heavy-tailed volume profile features evolve in a manner that would inform or be indicative of market participant behaviors, such as high-frequency algorithmic trading, quote stuffing and price discovery, intra-daily? 4. Is there statistical evidence for a need to consider dynamic behavior of the parameters of models for Limit Order Book volume profiles on an intra-daily time scale? Progress on aspects of each question is obtained via statistically rigorous results to verify the empirical findings for an unprecedentedly large set of futures market LOB data. The data comprise several exchanges, several futures asset classes and all trading days of 2010, using market depth (Type II) order book data to 5 levels on the bid and ask.
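A standard diagnostic for the heavy-tailed (sub-exponential) behavior asked about in question 1 is the Hill estimator of the tail index, computed from the largest order statistics of a sample. The sketch below applies it to synthetic Pareto draws as a stand-in; it is a generic diagnostic, not the paper's statistical testing methodology, and real LOB volume data would replace the simulated sample.

```python
import numpy as np

def hill_estimator(sample, k):
    """Hill estimate of the tail index alpha from the k largest order
    statistics: alpha_hat = 1 / mean(log X_(i) - log X_(k+1), i=1..k).
    Smaller alpha means a heavier tail."""
    x = np.sort(np.asarray(sample))[::-1]   # descending order statistics
    logs = np.log(x[:k + 1])
    return 1.0 / np.mean(logs[:k] - logs[k])
```

In practice one inspects the estimate across a range of k (a Hill plot); a stable plateau at a finite alpha is evidence of power-law-type tails in the volume profile at a given book level.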

    A State-Space Estimation of the Lee-Carter Mortality Model and Implications for Annuity Pricing

    In this article we investigate a state-space representation of the Lee-Carter model, which is a benchmark stochastic mortality model for forecasting age-specific death rates. Existing relevant literature focuses mainly on mortality forecasting or pricing of longevity derivatives, while the full implications and methods of using the state-space representation of the Lee-Carter model in pricing retirement income products are yet to be examined. The main contribution of this article is twofold. First, we provide a rigorous and detailed derivation of the posterior distributions of the parameters and the latent process of the Lee-Carter model via Gibbs sampling. Our assumption for priors is slightly more general than the current literature in this area. Moreover, we suggest a new form of identification constraint not yet utilised in the actuarial literature that proves to be a more convenient approach for estimating the model under the state-space framework. Second, by exploiting the posterior distribution of the latent process and parameters, we examine the pricing range of annuities, taking into account the stochastic nature of the dynamics of the mortality rates. In this way we aim to capture the impact of longevity risk on the pricing of annuities. The outcome of our study demonstrates that an annuity price can be more than 4% under-valued when different assumptions are made on determining the survival curve constructed from the distribution of the forecasted death rates. Given that a typical annuity portfolio consists of a large number of policies with maturities which span decades, we conclude that the impact of longevity risk on the accurate pricing of annuities is a significant issue to be further researched. In addition, we find that mispricing is increasingly pronounced for older ages as well as for annuity policies with longer maturities.
    Comment: 9 pages; conference paper
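The link from a survival curve to an annuity price is the basic actuarial identity: a life annuity paying 1 per year is worth the sum of discounted survival probabilities. The sketch below uses a toy exponential survival curve and an assumed interest rate, not a Lee-Carter forecast, to show how a shift in the survival curve translates directly into the kind of mis-valuation the abstract quantifies.

```python
import numpy as np

def annuity_price(survival_probs, rate=0.03):
    """Present value of a life annuity-immediate paying 1 per year:
    sum over t of S(t) / (1 + rate)^t, where S(t) is the probability
    of surviving to payment date t."""
    t = np.arange(1, len(survival_probs) + 1)
    return np.sum(survival_probs / (1.0 + rate) ** t)

# Toy survival curve over 35 years with an assumed constant hazard of 0.05
S = np.exp(-0.05 * np.arange(1, 36))
price = annuity_price(S)
```

Because the price is linear in the survival probabilities, scaling the whole curve up by 4% raises the price by exactly 4%, which is why the choice of survival curve drawn from the posterior of the forecasted death rates matters at the magnitudes reported.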