11,330 research outputs found

    Underdetermined-order recursive least-squares adaptive filtering: The concept and algorithms

    No full text
    Published version

    Distributed Recursive Least-Squares: Stability and Performance Analysis

    Full text link
    The recursive least-squares (RLS) algorithm has well-documented merits for reducing complexity and storage requirements in online estimation of stationary signals, as well as in tracking slowly-varying nonstationary processes. In this paper, a distributed recursive least-squares (D-RLS) algorithm is developed for cooperative estimation using ad hoc wireless sensor networks. Distributed iterations are obtained by minimizing a separable reformulation of the exponentially-weighted least-squares cost, using the alternating-minimization algorithm. Sensors carry out reduced-complexity tasks locally, and exchange messages with one-hop neighbors to consent on the network-wide estimates adaptively. A steady-state mean-square error (MSE) performance analysis of D-RLS is conducted by studying a stochastically-driven 'averaged' system that approximates the D-RLS dynamics asymptotically in time. For sensor observations that are linearly related to the time-invariant parameter vector sought, the simplifying independence setting assumptions facilitate deriving accurate closed-form expressions for the MSE steady-state values. The problems of mean- and MSE-sense stability of D-RLS are also investigated, and easily-checkable sufficient conditions are derived under which a steady state is attained. Without resorting to diminishing step-sizes, which would compromise the tracking ability of D-RLS, stability ensures that per-sensor estimates hover inside a ball of finite radius centered at the true parameter vector, with high probability, even when inter-sensor communication links are noisy. Interestingly, computer simulations demonstrate that the theoretical findings remain accurate in the pragmatic setting where sensors acquire temporally-correlated data. Comment: 30 pages, 4 figures, submitted to IEEE Transactions on Signal Processing.
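    As background for the recursion that D-RLS distributes across the network, the sketch below shows a single-node exponentially-weighted RLS update in Python. It is a minimal illustration of the classical algorithm only, not the paper's consensus-based D-RLS; the names rls_update, theta, P and the forgetting factor lam are ours.

        import numpy as np

        def rls_update(theta, P, x, y, lam=0.99):
            """One exponentially-weighted RLS step (minimal sketch).
            theta: parameter estimate, P: inverse correlation matrix,
            x: regressor vector, y: scalar observation, lam: forgetting factor."""
            Px = P @ x
            k = Px / (lam + x @ Px)          # gain vector
            e = y - x @ theta                # a priori estimation error
            theta = theta + k * e            # parameter update
            P = (P - np.outer(k, Px)) / lam  # inverse-correlation update
            return theta, P

    In the distributed setting described above, each sensor would run a reduced-complexity local recursion of this kind and exchange messages with its one-hop neighbors so that the network reaches consensus on the common estimate.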

    Optimal control of partially observable linear quadratic systems with asymmetric observation errors

    Get PDF
    This paper deals with the optimal quadratic control problem for non-Gaussian discrete-time stochastic systems. Our main result gives explicit solutions for the optimal quadratic control problem for partially observable dynamic linear systems with asymmetric observation errors. For this purpose, an asymmetric version of the Kalman filter based on asymmetric least-squares estimation is used. We illustrate the applicability of our approach with numerical results.
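    The asymmetric least-squares (expectile) criterion mentioned above replaces the squared error e^2 by |tau - 1{e < 0}| * e^2, so positive and negative residuals are weighted differently. The sketch below fits that criterion by iteratively reweighted least squares; it illustrates only the estimation principle, not the paper's asymmetric Kalman filter, and the function name and the value of tau are hypothetical.

        import numpy as np

        def asymmetric_least_squares(X, y, tau=0.7, n_iter=50):
            """Expectile regression: minimize sum_i |tau - 1{e_i < 0}| * e_i**2 (sketch)."""
            beta = np.linalg.lstsq(X, y, rcond=None)[0]    # symmetric LS initialization
            for _ in range(n_iter):
                e = y - X @ beta
                w = np.where(e < 0.0, 1.0 - tau, tau)      # asymmetric residual weights
                beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
            return beta

    With tau = 0.5 this reduces to ordinary least squares; tau above (below) 0.5 penalizes under-prediction (over-prediction) more heavily, which is how asymmetric observation errors enter the estimation.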

    A Stochastic Majorize-Minimize Subspace Algorithm for Online Penalized Least Squares Estimation

    Full text link
    Stochastic approximation techniques play an important role in solving many problems encountered in machine learning and adaptive signal processing. In these contexts, the statistics of the data are often unknown a priori, or their direct computation is too costly, so they have to be estimated online from the observed signals. For batch optimization of an objective function formed as the sum of a data-fidelity term and a penalty (e.g., a sparsity-promoting function), Majorize-Minimize (MM) methods have recently attracted much interest since they are fast, highly flexible, and effective in ensuring convergence. The goal of this paper is to show how these methods can be successfully extended to the case when the data-fidelity term corresponds to a least-squares criterion and the cost function is replaced by a sequence of stochastic approximations of it. In this context, we propose an online version of an MM subspace algorithm and study its convergence using suitable probabilistic tools. Simulation results illustrate the good practical performance of the proposed algorithm, associated with a memory-gradient subspace, when applied to both non-adaptive and adaptive filter identification problems.
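    To make the construction concrete, the sketch below shows one plausible instance under stated assumptions: exponentially-weighted running estimates of the unknown second-order statistics, a quadratic majorant of a smooth penalty lam * sum_i sqrt(theta_i^2 + delta^2), and a search restricted to the memory-gradient subspace spanned by the current negative gradient and the previous step. This is a sketch of the general stochastic MM subspace idea, not the authors' exact algorithm or its convergence analysis; all names (smm_mg_step, online_penalized_ls, gamma, lam, delta) are ours.

        import numpy as np

        def smm_mg_step(theta, theta_prev, R, r, lam=0.1, delta=1e-3):
            """One stochastic MM step in a memory-gradient subspace for the cost
            0.5*theta'R theta - r'theta + lam*sum(sqrt(theta**2 + delta**2)) (sketch)."""
            w = lam / np.sqrt(theta**2 + delta**2)         # half-quadratic penalty weights
            g = R @ theta - r + w * theta                  # gradient of the stochastic cost
            A = R + np.diag(w)                             # curvature of the quadratic majorant
            # Memory-gradient subspace: negative gradient and previous step
            D = -g[:, None] if theta_prev is None else np.column_stack([-g, theta - theta_prev])
            u = -np.linalg.pinv(D.T @ A @ D) @ (D.T @ g)   # minimize the majorant over the subspace
            return theta + D @ u

        def online_penalized_ls(stream, dim, gamma=0.05, lam=0.1):
            """Drive the step over a stream of (x, y) pairs (hypothetical driver)."""
            R, r = 1e-3 * np.eye(dim), np.zeros(dim)
            theta, theta_prev = np.zeros(dim), None
            for x, y in stream:
                R = (1 - gamma) * R + gamma * np.outer(x, x)   # running input correlation
                r = (1 - gamma) * r + gamma * y * x            # running cross-correlation
                theta, theta_prev = smm_mg_step(theta, theta_prev, R, r, lam=lam), theta
            return theta

    Restricting the search to a memory-gradient subspace means the inner minimization of the majorant reduces to a system with at most two unknowns, regardless of the filter length, which is what makes the MM subspace construction attractive for adaptive filtering.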