Modelling empirical features and liquidity resilience in the Limit Order Book


The contribution of this body of work is in developing new methods for modelling interactions in modern financial markets and understanding the origins of pervasive features of trading data. The advent of electronic trading and improvements in trading technology have brought about vast changes in individual trading behaviours, and thus in the overall dynamics of trading interactions. The increased sophistication of market venues has led to a diminished role for specialists in making markets, more direct interaction between trading parties, and the emergence of the Limit Order Book (LOB) as the pre-eminent trading system. However, this has also been accompanied by increased fluctuation in the liquidity available for immediate execution, as market makers try to balance the provision of liquidity against the probability of an adverse price move, while liquidity traders, increasingly aware of this, search for optimal placement strategies to reduce execution costs.

Varying intra-day liquidity levels in the LOB are one of the main issues examined here. The thesis proposes a new measure for the resilience of liquidity, based on the duration of intra-day liquidity droughts. The flexible survival regression framework employed can accommodate any liquidity measure and any threshold liquidity level of choice to model these durations, and relate them to covariates summarising the state of the LOB. Of these covariates, the frequency of the droughts and the value of the liquidity measure are found to have substantial power in explaining the variation in the new resilience metric. The model is also shown to have substantial predictive power for the duration of these liquidity droughts, and could thus be of use in estimating the time between subsequent tranches of a large order in an optimal execution setting.

A number of recent studies have uncovered a commonality in liquidity that extends across markets and across countries.
We outline the implications of using the PCA regression approaches that have been employed in recent studies through synthetic examples, and demonstrate that using such an approach for the study of European stocks can be misleading regarding the level of liquidity commonality. We also propose a method for measuring commonality in liquidity resilience, using an extension of the resilience metric identified earlier. This involves the first use of functional data analysis in this setting, both as a way of summarising resilience data and as a means of measuring commonality via functional principal components analysis regression.

Trading interactions are considered using a form of agent-based modelling in the LOB, where the activity is assumed to arise from the interaction of liquidity providers, liquidity demanders and noise traders. The highly detailed nature of the model means that one can quantify the dependence between order arrival rates at different prices, as well as market orders and cancellations. In this context, we demonstrate the value of indirect inference and simulation-based estimation methods (multi-objective optimisation in particular) for models for which direct estimation through maximum likelihood is difficult (for example, when the likelihood cannot be obtained in closed form). Besides being a novel contribution to the area of agent-based modelling, we demonstrate how the model can be used in a regulation setting, to quantify the effect of the introduction of new financial regulation.
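The PCA regression approach to liquidity commonality mentioned above can be sketched on synthetic data: build a panel of stock liquidity series driven by one common factor, extract the first principal component of the panel, and regress each stock's liquidity on it, with the resulting R² serving as a commonality measure. The panel construction and all parameter values are illustrative assumptions, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(1)
n_obs, n_stocks = 1000, 10

# Synthetic liquidity panel: one market-wide factor plus idiosyncratic noise
common = rng.normal(size=n_obs)
loadings = rng.uniform(0.5, 1.5, size=n_stocks)
liq = common[:, None] * loadings + rng.normal(size=(n_obs, n_stocks))

# First principal component of the demeaned panel via SVD
Z = liq - liq.mean(axis=0)
U, S, Vt = np.linalg.svd(Z, full_matrices=False)
pc1 = U[:, 0] * S[0]  # PC1 scores (proxy for market-wide liquidity)

# Commonality: R^2 of each stock's liquidity regressed on PC1
r2 = []
for j in range(n_stocks):
    y = Z[:, j]
    b = (pc1 @ y) / (pc1 @ pc1)       # OLS slope on PC1
    resid = y - b * pc1
    r2.append(1.0 - resid.var() / y.var())
mean_r2 = float(np.mean(r2))          # average commonality across stocks
```

The synthetic examples in the work show how this pipeline can overstate or misstate commonality depending on the panel's structure; the functional PCA extension applies the same regression idea to curves of resilience over the trading day rather than to scalar liquidity series.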

This paper was published in UCL Discovery.
