
    Underdetermined-order recursive least-squares adaptive filtering: The concept and algorithms


    A Stieltjes transform approach for analyzing the RLS adaptive filter

    Although the RLS filter is well known and various algorithms have been developed for its implementation, analyzing its performance when the regressors are random, as is often the case, has proven to be a formidable task. The reason is that the Riccati recursion, which propagates the error covariance matrix, becomes a random recursion. The existing results are approximations based on assumptions that are often not very realistic. In this paper we use ideas from the theory of large random matrices to find the asymptotic (in time) eigendistribution of the error covariance matrix of the RLS filter. Under the assumption of a large-dimensional state vector (in most cases n = 10-20 is large enough to give quite accurate predictions), we find the asymptotic eigendistribution of the error covariance for temporally white regressors, for shift-structured regressors, and for the RLS filter with intermittent observations.
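
    A minimal sketch (not from the paper) of the exponentially weighted RLS recursion the abstract refers to, showing the Riccati-type propagation of the matrix P whose eigendistribution the authors analyze; the function name, parameter values, and toy data are illustrative assumptions.

        import numpy as np

        def rls_identify(X, d, lam=0.99, delta=1e2):
            """Exponentially weighted RLS for FIR filter identification.

            X : (T, n) regressor vectors, d : (T,) desired responses,
            lam : forgetting factor, delta : initial scale of P.
            """
            T, n = X.shape
            w = np.zeros(n)              # weight estimate
            P = delta * np.eye(n)        # inverse-correlation / error-covariance matrix
            for t in range(T):
                x = X[t]
                Px = P @ x
                k = Px / (lam + x @ Px)          # gain vector
                e = d[t] - w @ x                 # a priori error
                w = w + k * e                    # weight update
                P = (P - np.outer(k, Px)) / lam  # Riccati-type recursion for P
            return w, P

        # toy usage: identify a 10-tap filter from temporally white regressors
        rng = np.random.default_rng(0)
        n, T = 10, 2000
        w_true = rng.standard_normal(n)
        X = rng.standard_normal((T, n))
        d = X @ w_true + 0.01 * rng.standard_normal(T)
        w_hat, P = rls_identify(X, d)
        print(np.linalg.norm(w_hat - w_true))    # should be small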

    On line power spectra identification and whitening for the noise in interferometric gravitational wave detectors

    In this paper we address both the problem of identifying the noise power spectral density of interferometric detectors by parametric techniques and the problem of whitening the data sequence. We concentrate on a power spectral density like that of the Italian-French detector VIRGO and show that, with a reasonable finite number of parameters, we succeed in modeling a spectrum like the theoretical one of VIRGO, reproducing all its features. We also propose the use of adaptive techniques to identify and whiten the data of interferometric detectors on line, and we analyze the behavior of these adaptive techniques in both the stochastic-gradient and least-squares families. Comment: 28 pages, 21 figures, uses iopart.cls; accepted for publication in Classical and Quantum Gravity
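
    A minimal sketch (not the authors' procedure) of on-line whitening with a stochastic-gradient adaptive predictor, in the spirit of the adaptive techniques discussed above: the prediction error of a normalized-LMS linear predictor is the whitened output. All names, parameter values, and the toy coloured-noise model are illustrative.

        import numpy as np

        def nlms_whiten(x, order=4, mu=0.1):
            """On-line whitening via a normalized-LMS linear predictor.

            Once the predictor converges, the prediction error
            e[t] = x[t] - a^T [x[t-1], ..., x[t-order]] is approximately white.
            """
            a = np.zeros(order)                  # predictor coefficients, adapted on line
            e = np.zeros_like(x)
            for t in range(order, len(x)):
                past = x[t - order:t][::-1]      # most recent sample first
                e[t] = x[t] - a @ past           # prediction error = whitened sample
                a += mu * e[t] * past / (past @ past + 1e-8)   # NLMS update
            return e, a

        # toy usage: whiten a coloured AR(2) noise sequence
        rng = np.random.default_rng(1)
        T = 50000
        w = rng.standard_normal(T)
        x = np.zeros(T)
        for t in range(2, T):
            x[t] = 1.5 * x[t - 1] - 0.7 * x[t - 2] + w[t]
        e, a = nlms_whiten(x)
        print(np.var(e[T // 2:]) / np.var(w))    # settles near 1 once converged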

    A Stochastic Majorize-Minimize Subspace Algorithm for Online Penalized Least Squares Estimation

    Stochastic approximation techniques play an important role in solving many problems encountered in machine learning or adaptive signal processing. In these contexts, the statistics of the data are often unknown a priori, or their direct computation is too intensive, so they have to be estimated online from the observed signals. For batch optimization of an objective function that is the sum of a data-fidelity term and a penalization (e.g. a sparsity-promoting function), Majorize-Minimize (MM) methods have recently attracted much interest since they are fast, highly flexible, and effective in ensuring convergence. The goal of this paper is to show how these methods can be successfully extended to the case where the data-fidelity term is a least squares criterion and the cost function is replaced by a sequence of stochastic approximations of it. In this context, we propose an online version of an MM subspace algorithm and study its convergence using suitable probabilistic tools. Simulation results illustrate the good practical performance of the proposed algorithm, associated with a memory gradient subspace, when applied to both non-adaptive and adaptive filter identification problems.
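
    A hedged sketch of online penalized least squares solved by plain Majorize-Minimize steps: running second-order statistics are updated recursively and, at each step, a quadratic majorant of a smoothed-l1 penalized criterion is minimized exactly. This is a simplified stand-in, not the paper's memory-gradient subspace algorithm; all names and parameter values are assumptions.

        import numpy as np

        def online_mm_penalized_ls(stream, n, lam=0.1, delta=1e-3, n_steps=1):
            """Online penalized least squares via Majorize-Minimize steps.

            Criterion at time t: mean_k (d_k - x_k^T w)^2 + lam * sum_i sqrt(w_i^2 + delta^2).
            The hyperbolic penalty is majorized by the classical half-quadratic bound,
            and the resulting quadratic majorant is minimized exactly.
            """
            R = np.zeros((n, n))        # running estimate of E[x x^T]
            r = np.zeros(n)             # running estimate of E[d x]
            w = np.zeros(n)
            for t, (x, d) in enumerate(stream, start=1):
                R += (np.outer(x, x) - R) / t        # recursive sample statistics
                r += (d * x - r) / t
                for _ in range(n_steps):             # MM iteration(s) on the current criterion
                    D = np.diag(1.0 / np.sqrt(w ** 2 + delta ** 2))
                    w = np.linalg.solve(R + 0.5 * lam * D + 1e-9 * np.eye(n), r)
            return w

        # toy usage: recover a sparse 20-tap filter from a stream of noisy observations
        rng = np.random.default_rng(2)
        n, T = 20, 5000
        w_true = np.zeros(n)
        w_true[:3] = [1.0, -0.5, 0.25]
        stream = ((x, x @ w_true + 0.1 * rng.standard_normal())
                  for x in (rng.standard_normal(n) for _ in range(T)))
        w_hat = online_mm_penalized_ls(stream, n, lam=0.05)
        print(np.round(w_hat[:5], 2))    # first three entries near the true coefficients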

    Robust Kalman tracking and smoothing with propagating and non-propagating outliers

    A common situation in filtering where classical Kalman filtering does not perform particularly well is tracking in the presence of propagating outliers. This calls for robustness understood in a distributional sense, i.e., we enlarge the distributional assumptions made in the ideal model by suitable neighborhoods. Based on optimality results for distributionally robust Kalman filtering from Ruckdeschel [01,10], we propose new robust recursive filters and smoothers designed for this purpose, as well as specialized versions for non-propagating outliers. We apply these procedures in the context of a GPS problem arising in the car industry. To better understand these filters, we study their behavior on stylized outlier patterns (for which they are not designed) and compare them to other approaches to the tracking problem. Finally, in a simulation study we discuss the efficiency of our procedures in comparison to competitors. Comment: 27 pages, 12 figures, 2 tables
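
    A simplified stand-in (not the filters derived in the paper) for a robust recursive filter of this kind: a standard Kalman filter whose correction step is clipped in norm, so that a single outlying observation cannot move the state estimate arbitrarily far. Function names, the clipping threshold, and the toy tracking model are assumptions.

        import numpy as np

        def robust_kalman_filter(y, F, H, Q, R, x0, P0, clip=2.5):
            """Kalman filter with a norm-clipped correction step."""
            n = len(x0)
            x, P = x0.astype(float).copy(), P0.astype(float).copy()
            estimates = []
            for yt in y:
                x = F @ x                         # prediction
                P = F @ P @ F.T + Q
                S = H @ P @ H.T + R               # innovation covariance
                K = P @ H.T @ np.linalg.inv(S)    # Kalman gain
                dx = K @ (yt - H @ x)             # proposed correction
                norm = np.linalg.norm(dx)
                if norm > clip:                   # robustification: clip the correction
                    dx *= clip / norm
                x = x + dx
                P = (np.eye(n) - K @ H) @ P
                estimates.append(x.copy())
            return np.array(estimates)

        # toy usage: constant-velocity tracking with occasional observation outliers
        rng = np.random.default_rng(3)
        F = np.array([[1.0, 1.0], [0.0, 1.0]])
        H = np.array([[1.0, 0.0]])
        Q, R = 0.01 * np.eye(2), np.array([[1.0]])
        truth, obs, x = [], [], np.zeros(2)
        for t in range(200):
            x = F @ x + rng.multivariate_normal([0.0, 0.0], Q)
            y = H @ x + rng.normal(0.0, 1.0, 1)
            if rng.random() < 0.05:
                y = y + 25.0                      # additive, non-propagating outlier
            truth.append(x.copy())
            obs.append(y)
        est = robust_kalman_filter(np.array(obs), F, H, Q, R, np.zeros(2), np.eye(2))
        print(np.mean((est[:, 0] - np.array(truth)[:, 0]) ** 2))   # position MSE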

    Forecasting Time Series with VARMA Recursions on Graphs

    Graph-based techniques have emerged as a choice for dealing with the dimensionality issues in modeling multivariate time series. However, there is as yet no complete understanding of how the underlying structure can be exploited to ease this task. This work provides contributions in this direction by considering the forecasting of a process evolving over a graph. We make use of the (approximate) time-vertex stationarity assumption, i.e., time-varying graph signals whose first- and second-order statistical moments are invariant over time and correlated to a known graph topology. The latter is combined with VAR and VARMA models to tackle the dimensionality issues present in predicting the temporal evolution of multivariate time series. We find that by projecting the data onto the graph spectral domain: (i) the multivariate model estimation reduces to fitting a number of uncorrelated univariate ARMA models, and (ii) an optimal low-rank data representation can be exploited to further reduce the estimation costs. In the case where the multivariate process can be observed at a subset of nodes, the proposed models extend naturally to Kalman filtering on graphs, allowing for optimal tracking. Numerical experiments with both synthetic and real data validate the proposed approach and highlight its benefits over state-of-the-art alternatives. Comment: submitted to the IEEE Transactions on Signal Processing
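
    A minimal sketch of the graph-spectral idea described above: project the multivariate series onto the Laplacian eigenbasis, fit an independent univariate AR model per graph frequency, forecast, and project back. This is a simplified AR-only stand-in for the paper's VAR/VARMA recursions; names, the ring-graph example, and parameters are illustrative.

        import numpy as np

        def graph_spectral_ar_forecast(X, L, p=2, steps=1):
            """Forecast a time-vertex signal with per-graph-frequency AR models.

            X : (T, N) signal (rows = time steps, columns = graph nodes)
            L : (N, N) graph Laplacian
            """
            _, U = np.linalg.eigh(L)              # graph Fourier basis
            Xf = X @ U                            # graph Fourier transform per snapshot
            T, N = Xf.shape
            preds = np.zeros((steps, N))
            for i in range(N):                    # independent AR(p) per graph frequency
                s = Xf[:, i]
                lags = np.column_stack([s[p - k - 1:T - k - 1] for k in range(p)])
                a, *_ = np.linalg.lstsq(lags, s[p:], rcond=None)   # LS fit of AR coefficients
                hist = list(s[-p:])
                for h in range(steps):
                    nxt = sum(a[k] * hist[-1 - k] for k in range(p))
                    preds[h, i] = nxt
                    hist.append(nxt)
            return preds @ U.T                    # back to the vertex domain

        # toy usage: ring graph carrying a slowly rotating smooth signal
        N, T = 12, 300
        Adj = np.roll(np.eye(N), 1, axis=1) + np.roll(np.eye(N), -1, axis=1)
        L = np.diag(Adj.sum(axis=1)) - Adj
        t = np.arange(T)[:, None]
        X = (np.cos(2 * np.pi * 0.01 * t + 2 * np.pi * np.arange(N) / N)
             + 0.05 * np.random.default_rng(4).standard_normal((T, N)))
        print(graph_spectral_ar_forecast(X, L, p=2, steps=3).shape)   # (3, 12)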