Sequential Inference Methods for Non-Homogeneous Poisson Processes with State-Space Prior
The non-homogeneous Poisson process provides a generalised framework for modelling random point data by allowing the intensity of point generation to vary across its domain of interest (time or space). Non-homogeneous Poisson processes have found use in many areas of signal processing and machine learning, but their application is still largely limited by the intractable likelihood function and the lack of computationally efficient inference schemes, although some methods do exist for the batch-data case. In this paper, we propose for the first time a sequential framework for intensity inference which combines the non-homogeneous model of Poisson data with continuous-time state-space models for the time-varying intensity. This approach enables us to design efficient online inference schemes, for which we propose a novel sequential Markov chain Monte Carlo (SMCMC) algorithm, as demanded by many applications where point data arrive sequentially and decisions must be made with low latency. The proposed approach is compared with competing methods on synthetic datasets and tested on high-frequency financial order book data, showing in general improved performance and better computational efficiency than the main batch-based competitor algorithm, and better performance than a simple baseline kernel estimation scheme.
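As a rough illustration of the modelling setup described above, the following minimal sketch simulates a non-homogeneous Poisson process whose log-intensity follows an Ornstein-Uhlenbeck state-space prior, using Lewis-Shedler thinning. The OU parameterisation and the thinning-based generator are illustrative assumptions; the paper's SMCMC inference scheme is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_ou_log_intensity(T, dt, theta=1.0, mu=np.log(5.0), sigma=0.5):
    """Euler-Maruyama path of an OU process used as the log-intensity (illustrative prior)."""
    n = int(T / dt)
    x = np.empty(n)
    x[0] = mu
    for k in range(1, n):
        x[k] = x[k-1] + theta * (mu - x[k-1]) * dt + sigma * np.sqrt(dt) * rng.standard_normal()
    return np.exp(x)  # intensity lambda(t) on the time grid

def simulate_nhpp(intensity, dt):
    """Lewis-Shedler thinning: propose events at the max rate, keep each with prob lambda(t)/lambda_max."""
    T = len(intensity) * dt
    lam_max = intensity.max()
    n_prop = rng.poisson(lam_max * T)
    t_prop = np.sort(rng.uniform(0.0, T, size=n_prop))
    idx = np.minimum((t_prop / dt).astype(int), len(intensity) - 1)
    keep = rng.uniform(size=n_prop) < intensity[idx] / lam_max
    return t_prop[keep]

lam = simulate_ou_log_intensity(T=10.0, dt=0.01)
events = simulate_nhpp(lam, dt=0.01)
print(f"{len(events)} events generated on [0, 10]")
```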
Sequential Modelling and Inference of High-frequency Limit Order Book with State-space Models and Monte Carlo Algorithms
The high-frequency limit order book (LOB) market has recently attracted increasing research attention from both industry and academia as a result of expanding algorithmic trading. However, the massive data throughput and the inherent complexity of high-frequency market dynamics also present challenges to some classic statistical modelling approaches. By adopting powerful state-space models from the field of signal processing, together with a number of Bayesian inference algorithms such as particle filtering, Markov chain Monte Carlo and variational inference, this thesis presents my research into the high-frequency limit order book across a wide range of topics.
Chapter 2 presents a novel construction of the non-homogeneous Poisson process to allow online intensity inference of limit order transactions arriving at a central exchange as point data. Chapter 3 extends a baseline jump-diffusion model for the market fair-price process to include three additional model features taken from real-world market intuitions. In Chapter 4, another price model is developed to account for both the long-term and short-term diffusion behaviours of the price process; this is achieved by incorporating multiple jump-diffusion processes, each exhibiting a unique characteristic. Chapter 5 observes the multi-regime nature of price diffusion processes as well as the non-Markovian switching behaviour between regimes. As such, a novel model is proposed which combines the continuous-time state-space model, the hidden semi-Markov switching model and the non-parametric Dirichlet process model. Additionally, building upon the general structure of the particle Markov chain Monte Carlo algorithm, I further propose an algorithm which achieves sequential state inference, regime identification and regime-parameter learning while requiring minimal prior assumptions. Chapter 6 focuses on the development of efficient parameter-learning algorithms for state-space models and presents three algorithms, each demonstrating promising results in comparison to some well-established methods.
The models and algorithms proposed in this thesis are not only practical tools for analysing high-frequency LOB markets, but can also be applied in various areas and disciplines beyond finance.
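As a hedged illustration of the baseline price dynamics that Chapters 3 and 4 build on, the sketch below simulates a simple jump-diffusion log-price with a compound-Poisson jump component. The parameters and the Gaussian jump sizes are assumptions made for illustration, not the thesis's calibrated models.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_jump_diffusion(p0=100.0, mu=0.0, sigma=0.2, jump_rate=0.5,
                            jump_scale=0.01, T=1.0, n_steps=1_000):
    """Euler scheme for d log P = mu dt + sigma dW + jumps (compound Poisson, Gaussian jump sizes)."""
    dt = T / n_steps
    logp = np.empty(n_steps + 1)
    logp[0] = np.log(p0)
    for k in range(n_steps):
        diffusion = mu * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        n_jumps = rng.poisson(jump_rate * dt)
        jumps = rng.normal(0.0, jump_scale, size=n_jumps).sum()
        logp[k + 1] = logp[k] + diffusion + jumps
    return np.exp(logp)

path = simulate_jump_diffusion()
print(path[:5])
```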
Importance Sampling: Intrinsic Dimension and Computational Cost
The basic idea of importance sampling is to use independent samples from a proposal measure in order to approximate expectations with respect to a target measure. It is key to understand how many samples are required in order to guarantee accurate approximations. Intuitively, some notion of distance between the target and the proposal should determine the computational cost of the method. A major challenge is to quantify this distance in terms of parameters or statistics that are pertinent for the practitioner. The subject has attracted substantial interest from within a variety of communities. The objective of this paper is to overview and unify the resulting literature by creating an overarching framework. A general theory is presented, with a focus on the use of importance sampling in Bayesian inverse problems and filtering.
Comment: Statistical Science
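The basic idea summarised in the abstract can be illustrated with a few lines of self-normalised importance sampling. The Gaussian target/proposal pair and the effective-sample-size diagnostic below are illustrative choices, not the paper's framework; the ESS gives a crude sense of how the required number of samples degrades as target and proposal drift apart.

```python
import numpy as np

rng = np.random.default_rng(2)

# Target: N(1, 0.5^2); proposal: N(0, 1). We estimate E_target[X^2] (exact value 1.25).
def log_target(x):
    return -0.5 * ((x - 1.0) / 0.5) ** 2 - np.log(0.5 * np.sqrt(2 * np.pi))

def log_proposal(x):
    return -0.5 * x ** 2 - 0.5 * np.log(2 * np.pi)

n = 10_000
x = rng.standard_normal(n)                      # draws from the proposal
logw = log_target(x) - log_proposal(x)          # importance log-weights
w = np.exp(logw - logw.max())
w /= w.sum()                                    # self-normalised weights

estimate = np.sum(w * x ** 2)                   # importance-sampling estimate of E[X^2]
ess = 1.0 / np.sum(w ** 2)                      # effective sample size diagnostic
print(f"E[X^2] ~ {estimate:.3f} (exact 1.25), ESS = {ess:.0f} of {n}")
```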
Stochastic volatility
Given the importance of return volatility on a number of practical financial management decisions, the efforts to provide good real-time estimates and forecasts of current and future volatility have been extensive. The main framework used in this context involves stochastic volatility models. In a broad sense, this model class includes GARCH, but we focus on a narrower set of specifications in which volatility follows its own random process, as is common in models originating within financial economics. The distinguishing feature of these specifications is that volatility, being inherently unobservable and subject to independent random shocks, is not measurable with respect to observable information. In what follows, we refer to these models as genuine stochastic volatility models. Much modern asset pricing theory is built on continuous-time models. The natural concept of volatility within this setting is that of genuine stochastic volatility. For example, stochastic-volatility (jump-)diffusions have provided a useful tool for a wide range of applications, including the pricing of options and other derivatives, the modeling of the term structure of risk-free interest rates, and the pricing of foreign currencies and defaultable bonds. The increased use of intraday transaction data for construction of so-called realized volatility measures provides additional impetus for considering genuine stochastic volatility models. As we demonstrate below, the realized volatility approach is closely associated with the continuous-time stochastic volatility framework of financial economics. There are some unique challenges in dealing with genuine stochastic volatility models. For example, volatility is truly latent and this feature complicates estimation and inference. Further, the presence of an additional state variable - volatility - renders the model less tractable from an analytic perspective. We examine how such challenges have been addressed through development of new estimation methods and imposition of model restrictions allowing for closed-form solutions while remaining consistent with the dominant empirical features of the data.
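A minimal sketch of a genuine stochastic volatility model, in the spirit of the class described above: log-volatility follows its own AR(1) process, independent of the return shocks, so volatility is latent; a crude block-wise realized-volatility estimate is computed from the simulated returns. Parameter values are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_sv(n=2_000, mu=-9.0, phi=0.98, sigma_eta=0.15):
    """Canonical log-AR(1) stochastic volatility: h_t has its own latent dynamics,
    independent of the return shocks, so volatility is not observable from returns alone."""
    h = np.empty(n)
    h[0] = mu
    for t in range(1, n):
        h[t] = mu + phi * (h[t - 1] - mu) + sigma_eta * rng.standard_normal()
    returns = np.exp(h / 2) * rng.standard_normal(n)
    return returns, np.exp(h / 2)

returns, true_vol = simulate_sv()
# Crude realized-volatility-style estimate: root mean square of returns over non-overlapping blocks.
block = 20
trimmed = returns[: len(returns) // block * block]
rv = np.sqrt((trimmed.reshape(-1, block) ** 2).mean(axis=1))
print(f"mean true vol {true_vol.mean():.4f}, mean block RV {rv.mean():.4f}")
```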
Self-Evaluation Applied Mathematics 2003-2008 University of Twente
This report contains the self-study for the research assessment of the Department of Applied Mathematics (AM) of the Faculty of Electrical Engineering, Mathematics and Computer Science (EEMCS) at the University of Twente (UT). The report provides the information for the Research Assessment Committee for Applied Mathematics, dealing with mathematical sciences at the three universities of technology in the Netherlands. It describes the state of affairs pertaining to the period 1 January 2003 to 31 December 2008.
Maximum likelihood estimation of time series models: the Kalman filter and beyond
The purpose of this chapter is to provide a comprehensive treatment of likelihood inference for state space models. These are a class of time series models relating an observable time series to quantities called states, which are characterized by a simple temporal dependence structure, typically a first-order Markov process. The states sometimes have a substantive interpretation. Key estimation problems in economics concern latent variables, such as the output gap, potential output, the non-accelerating-inflation rate of unemployment (NAIRU), core inflation, and so forth. Time-varying volatility, which is quintessential in finance, is also an important feature in macroeconomics. In the multivariate framework, relevant features can be common to different series, meaning that the driving forces of a particular feature and/or the transmission mechanism are the same. The objective of this chapter is to review the Kalman filter and discuss maximum likelihood inference, starting from the linear Gaussian case and covering the extensions to a nonlinear and non-Gaussian framework.
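A minimal sketch of the likelihood evaluation the chapter discusses: a Kalman filter for the local level (random walk plus noise) model, returning the Gaussian log-likelihood via the prediction-error decomposition, which a maximum likelihood routine would hand to a numerical optimiser. The model choice, initialisation and parameterisation are illustrative, not the chapter's own code.

```python
import numpy as np

def local_level_loglik(params, y):
    """Kalman filter log-likelihood for the local level model
       y_t = alpha_t + eps_t,  alpha_{t+1} = alpha_t + eta_t,
    computed via the prediction-error decomposition."""
    sigma2_eps, sigma2_eta = np.exp(params)   # optimise in log-space to keep variances positive
    a, p = y[0], 1e7                          # diffuse-ish initialisation of state mean and variance
    loglik = 0.0
    for t in range(1, len(y)):
        p = p + sigma2_eta                    # prediction step (state mean carries over for a random walk)
        v = y[t] - a                          # one-step prediction error
        f = p + sigma2_eps                    # prediction-error variance
        loglik += -0.5 * (np.log(2 * np.pi * f) + v ** 2 / f)
        k = p / f                             # Kalman gain
        a = a + k * v                         # update step
        p = (1 - k) * p
    return loglik

# Usage: simulate data and evaluate the log-likelihood at the true parameter values.
rng = np.random.default_rng(4)
alpha = np.cumsum(0.1 * rng.standard_normal(500))
y = alpha + 0.3 * rng.standard_normal(500)
print(local_level_loglik(np.log([0.09, 0.01]), y))
```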