    Regularised inference for changepoint and dependency analysis in non-stationary processes

    Multivariate correlated time series are found in many modern socio-scientific domains such as neurology, cyber-security, genetics and economics. The focus of this thesis is on efficiently modelling and inferring dependency structure, both between data-streams and across points in time. In particular, it is considered that generating processes may vary over time and are thus non-stationary. For example, patterns of brain activity are expected to change when performing different tasks or thought processes. Models that can describe such behaviour must be adaptable over time. However, such adaptability creates challenges for model identification: in order to perform learning or estimation, one must control how model complexity grows in relation to the volume of data. To this end, one of the main themes of this work is to investigate both the implementation and effect of assumptions on sparsity, relating to model parsimony at an individual time-point, and smoothness, governing how quickly a model may change over time.

    Throughout this thesis two basic classes of non-stationary model are studied. Firstly, a class of piecewise constant Gaussian graphical models (GGM) is introduced that can encode graphical dependencies between data-streams. In particular, a group-fused regulariser is examined that allows for the estimation of changepoints across graphical models. The second part of the thesis focuses on extending a class of locally-stationary wavelet (LSW) models. Unlike the raw GGM, this enables one to encode dependencies not only between data-streams, but also across time. A set of sparsity-aware estimators is developed for estimation of the spectral parameters of such models, which are then compared to previous works in the domain.
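    As an informal sketch of the first model class only (the notation $\Theta_t$, $S_t$, $\lambda_1$, $\lambda_2$ is illustrative and not taken from the thesis), a piecewise constant GGM with a group-fused regulariser can be estimated by solving an objective of roughly the following form:

    \[
    \{\hat{\Theta}_t\}_{t=1}^{T} \;=\; \arg\min_{\Theta_1,\dots,\Theta_T \succ 0}\;
    \sum_{t=1}^{T} \Big( \operatorname{tr}(S_t \Theta_t) - \log\det \Theta_t \Big)
    \;+\; \lambda_1 \sum_{t=1}^{T} \|\Theta_t\|_{1,\mathrm{off}}
    \;+\; \lambda_2 \sum_{t=2}^{T} \|\Theta_t - \Theta_{t-1}\|_F ,
    \]

    where $S_t$ is a local empirical covariance, the $\ell_1$ penalty on off-diagonal entries of the precision matrices $\Theta_t$ encourages sparse graphs (parsimony at each time-point), and the group-fused Frobenius-norm differences encourage the sequence of graphical models to be piecewise constant, so that estimated changepoints occur where $\hat{\Theta}_t \neq \hat{\Theta}_{t-1}$. The tuning parameters $\lambda_1$ and $\lambda_2$ control sparsity and smoothness respectively, mirroring the two assumptions highlighted in the abstract.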