1,144 research outputs found

    Adaptive weighted least squares algorithm for Volterra signal modeling


    Time-varying model identification for time-frequency feature extraction from EEG data

    A novel modelling scheme that can be used to estimate and track time-varying properties of nonstationary signals is investigated. This scheme is based on a class of time-varying AutoRegressive with eXogenous input (ARX) models in which the associated time-varying parameters are represented by multi-wavelet basis functions. The orthogonal least squares (OLS) algorithm is then applied to refine the parameter estimates of the time-varying ARX model. The main feature of the multi-wavelet approach is that it enables smooth trends to be tracked while also capturing sharp changes in the time-varying process parameters. Simulation studies and applications to real EEG data show that the proposed algorithm can provide important transient information on the inherent dynamics of nonstationary processes.
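    The basis-expansion idea can be sketched roughly as follows, using a simple polynomial basis and plain least squares in place of the paper's multi-wavelet basis and OLS refinement; all signal dimensions, basis order, and noise levels below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a time-varying ARX-type process:
#   y[t] = a(t) * y[t-1] + b(t) * u[t] + e[t]
N = 500
t = np.linspace(0.0, 1.0, N)
a_true = 0.3 + 0.4 * t          # drifting AR coefficient (illustrative)
b_true = 1.0 - 0.5 * t          # drifting input gain (illustrative)
u = rng.standard_normal(N)
y = np.zeros(N)
for k in range(1, N):
    y[k] = a_true[k] * y[k - 1] + b_true[k] * u[k] + 0.05 * rng.standard_normal()

# Expand each time-varying coefficient over a fixed basis in normalized time
# (the paper uses multi-wavelet basis functions; polynomials stand in here).
P = 4                                      # basis order (assumption)
basis = np.vander(t, P, increasing=True)   # columns: 1, t, t^2, t^3

# Regression matrix: columns are basis_j(t)*y[t-1] and basis_j(t)*u[t],
# so the time-varying problem becomes an ordinary least-squares problem.
Phi = np.hstack([basis[1:] * y[:-1, None], basis[1:] * u[1:, None]])
theta, *_ = np.linalg.lstsq(Phi, y[1:], rcond=None)

# Recover the estimated coefficient trajectories
a_hat = basis @ theta[:P]
b_hat = basis @ theta[P:]
print("max |a_hat - a_true|:", np.max(np.abs(a_hat - a_true)))
```

    Each time-varying coefficient is thus reduced to a handful of constant basis weights, which is what makes both smooth trends and (with a suitable wavelet basis) sharp changes recoverable from a single batch fit.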

    Dynamic modeling of mean-reverting spreads for statistical arbitrage

    Statistical arbitrage strategies, such as pairs trading and its generalizations, rely on the construction of mean-reverting spreads enjoying a certain degree of predictability. Gaussian linear state-space processes have recently been proposed as a model for such spreads under the assumption that the observed process is a noisy realization of some hidden states. Real-time estimation of the unobserved spread process can reveal temporary market inefficiencies which can then be exploited to generate excess returns. Building on previous work, we embrace the state-space framework for modeling spread processes and extend this methodology along three different directions. First, we introduce time-dependency in the model parameters, which allows for quick adaptation to changes in the data generating process. Second, we provide an on-line estimation algorithm that can be constantly run in real-time. Being computationally fast, the algorithm is particularly suitable for building aggressive trading strategies based on high-frequency data and may be used as a monitoring device for mean-reversion. Finally, our framework naturally provides informative uncertainty measures of all the estimated parameters. Experimental results based on Monte Carlo simulations and historical equity data are discussed, including a co-integration relationship involving two exchange-traded funds.
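    A minimal scalar sketch of the underlying idea: real-time Kalman filtering of a hidden mean-reverting (AR(1)) spread observed in noise. All parameter values below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hidden mean-reverting spread observed in noise:
#   s[t] = mu + phi * (s[t-1] - mu) + w[t],   y[t] = s[t] + v[t]
mu, phi, q, r = 0.0, 0.95, 0.02**2, 0.1**2   # illustrative values
N = 1000
s = np.zeros(N)
y = np.zeros(N)
for k in range(1, N):
    s[k] = mu + phi * (s[k - 1] - mu) + np.sqrt(q) * rng.standard_normal()
    y[k] = s[k] + np.sqrt(r) * rng.standard_normal()

# Scalar Kalman filter: on-line estimate of the hidden spread
s_hat = np.zeros(N)
P = 1.0                                  # initial state variance
for k in range(1, N):
    # predict
    s_pred = mu + phi * (s_hat[k - 1] - mu)
    P_pred = phi**2 * P + q
    # update
    K = P_pred / (P_pred + r)            # Kalman gain
    s_hat[k] = s_pred + K * (y[k] - s_pred)
    P = (1 - K) * P_pred

rmse_raw = np.sqrt(np.mean((y - s) ** 2))
rmse_kf = np.sqrt(np.mean((s_hat - s) ** 2))
print(f"raw obs RMSE {rmse_raw:.3f}  vs  filtered RMSE {rmse_kf:.3f}")
```

    The filtered estimate tracks the hidden spread far more closely than the raw observation, and the running variance P is exactly the kind of uncertainty measure the abstract refers to. The paper's extensions (time-dependent parameters, on-line parameter estimation) build on top of this basic recursion.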

    LQG Online Learning

    Optimal control theory and machine learning techniques are combined to formulate and solve in closed form an optimal control formulation of online learning from supervised examples with regularization of the updates. The connections with the classical Linear Quadratic Gaussian (LQG) optimal control problem, of which the proposed learning paradigm is a non-trivial variation as it involves random matrices, are investigated. The optimal solutions obtained are compared with the Kalman-filter estimate of the parameter vector to be learned. It is shown that the proposed algorithm is less sensitive to outliers than the Kalman estimate (thanks to the presence of the regularization term), thus providing estimates that are smoother over time. The basic formulation of the proposed online-learning framework refers to a discrete-time setting with a finite learning horizon and a linear model. Various extensions are investigated, including the infinite learning horizon and, via the so-called "kernel trick", the case of nonlinear models.
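    The Kalman-filter baseline against which the paper compares can be sketched as follows: the unknown parameter vector is treated as a constant hidden state and updated recursively as supervised examples stream in. Dimensions and noise levels are illustrative, and the paper's own algorithm additionally regularizes these updates:

```python
import numpy as np

rng = np.random.default_rng(2)

# Stream of supervised examples y[k] = x[k]^T w_true + noise
d, N = 3, 300                              # illustrative dimensions
w_true = np.array([1.0, -2.0, 0.5])
X = rng.standard_normal((N, d))
y = X @ w_true + 0.1 * rng.standard_normal(N)

# Kalman-filter estimate of a constant parameter vector
# (equivalent to recursive least squares)
w_hat = np.zeros(d)
P = 10.0 * np.eye(d)                       # initial parameter covariance
r = 0.1**2                                 # measurement noise variance
for k in range(N):
    xk = X[k]
    K = P @ xk / (xk @ P @ xk + r)         # Kalman gain
    w_hat = w_hat + K * (y[k] - xk @ w_hat)
    P = P - np.outer(K, xk) @ P            # covariance update

print("parameter error:", np.linalg.norm(w_hat - w_true))
```

    Each example tightens the covariance P, so later (possibly outlying) examples move the estimate less; the paper's regularized optimal-control formulation smooths this trajectory further.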

    Performance analysis of the generalised projection identification for time-varying systems

    © The Institution of Engineering and Technology 2016. The least mean square methods include two typical parameter estimation algorithms: the projection algorithm and the stochastic gradient algorithm. The former is sensitive to noise, and the latter is not capable of tracking time-varying parameters. On the basis of these two typical algorithms, this study presents a generalised projection identification algorithm (or a finite data window stochastic gradient identification algorithm) for time-varying systems and studies its convergence using stochastic process theory. The analysis indicates that the generalised projection algorithm can track time-varying parameters and requires less computational effort than the forgetting factor recursive least squares algorithm. The way of choosing the data window length is stated so that the minimum upper bound on the parameter estimation error can be obtained. Numerical examples are provided.
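    One schematic reading of a finite-data-window projection-type update is sketched below: the normalized gradient is taken over the last p samples, so the window length trades noise sensitivity (small p, projection-like) against tracking speed (large p, stochastic-gradient-like). This is an illustration, not the paper's exact formulation:

```python
import numpy as np

rng = np.random.default_rng(3)

# Scalar system y[k] = theta(k) * phi[k] + noise with a drifting parameter
N, p = 400, 10                             # p = data window length (tuning choice)
theta_true = np.linspace(1.0, 2.0, N)      # slowly drifting true parameter
phi = rng.standard_normal(N)               # regressor
y = theta_true * phi + 0.05 * rng.standard_normal(N)

theta = np.zeros(N)
c = 1e-6                                   # small constant avoids division by zero
for k in range(1, N):
    lo = max(0, k - p + 1)
    # normalized gradient of the squared error over the data window
    num = np.sum(phi[lo:k + 1] * (y[lo:k + 1] - phi[lo:k + 1] * theta[k - 1]))
    den = c + np.sum(phi[lo:k + 1] ** 2)
    theta[k] = theta[k - 1] + num / den

err = np.mean(np.abs(theta[N // 2:] - theta_true[N // 2:]))
print(f"mean tracking error over second half: {err:.3f}")
```

    With p = 1 this reduces to the projection algorithm; letting the window grow without bound recovers stochastic-gradient-like averaging that can no longer follow the drift, which is the trade-off the paper's window-length analysis addresses.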

    Countering the Effects of Measurement Noise during the Identification of Dynamical Systems

    Sensor noise is an unavoidable fact of life when it comes to measurements on physical systems, as is the case in feedback control. Therefore, it must be properly addressed during dynamic system identification. In this work, a novel approach is developed for the treatment of measurement noise in dynamical systems. This approach hinges on proper stochastic modeling, and it can be adapted easily to many different scenarios, where it yields consistently good parameter estimates. The Generalized Minimum Variance algorithm developed and used in this work is based on the theory behind the minimum variance identification process, and the estimate produced is a fixed point of a mapping based on the minimum variance solution. Additionally, the algorithm yields an accurate prediction of the estimation error. This algorithm is applied to many different noise models associated with three basic identification problems. First, continuous-time systems are identified using frequency domain measurements. Next, a discrete-time plant is identified using discrete-time measurements. Finally, the physical parameters of a continuous-time plant are identified using sampled measurements of the continuous-time input and output. The estimates are validated, and the results are compared with other, more common identification algorithms.
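    A textbook illustration of why measurement noise must be modeled during identification: naive least squares on noisy measurements of an AR(1) plant is biased toward zero, and compensating for a known noise variance removes the bias. This illustrates the problem being addressed, not the thesis's Generalized Minimum Variance algorithm itself:

```python
import numpy as np

rng = np.random.default_rng(4)

# Identify a = 0.9 in  x[k] = a * x[k-1] + w[k]
# from noisy measurements  y[k] = x[k] + v[k].
a_true, q, r = 0.9, 1.0, 0.5               # illustrative variances
N = 20000
x = np.zeros(N)
for k in range(1, N):
    x[k] = a_true * x[k - 1] + np.sqrt(q) * rng.standard_normal()
y = x + np.sqrt(r) * rng.standard_normal(N)

# Naive least squares: biased, because the regressor y[k-1] is itself noisy
a_ls = np.sum(y[1:] * y[:-1]) / np.sum(y[:-1] ** 2)

# Bias-compensated estimate: subtract the noise contribution (N-1)*r from
# the denominator (assumes the measurement noise variance r is known)
a_bc = np.sum(y[1:] * y[:-1]) / (np.sum(y[:-1] ** 2) - (N - 1) * r)

print(f"true {a_true}, naive LS {a_ls:.3f}, bias-compensated {a_bc:.3f}")
```

    The naive estimate converges to a value strictly below the true coefficient no matter how much data is collected, which is why identification schemes that model the measurement noise stochastically, as this work does, are needed.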