1,350 research outputs found

    On the regularization of forgetting recursive least square


    Underdetermined-order recursive least-squares adaptive filtering: The concept and algorithms


    A new regularized QRD recursive least M-estimate algorithm: Performance analysis and applications

    Proceedings of the International Conference on Green Circuits and Systems, 2010, pp. 190-195

    This paper proposes a new regularized QR-decomposition-based recursive least M-estimate (R-QRRLM) adaptive filtering algorithm and studies its mean and mean square convergence performance and its application to acoustic echo cancellation (AEC). The proposed algorithm extends the conventional RLM algorithm by imposing a weighted L2 regularization term on the coefficients to reduce the variance of the estimator. Moreover, a QRD-based algorithm is employed for efficient recursive implementation and improved numerical properties. The mean convergence analysis shows that the regularization introduces a bias with respect to the classical Wiener solution. The steady-state excess mean square error (EMSE) is derived; it shows that as the regularization parameter grows, the variance decreases while the bias increases, so regularization trades bias for variance. In this study, the regularization parameter can be selected adaptively, and the resulting variable-regularization QRRLM (VR-QRRLM) algorithm achieves both high immunity to input-level variation and low steady-state EMSE. The theoretical results are in good agreement with simulation results. Computer simulations of AEC show that the R-QRRLM and VR-QRRLM algorithms considerably outperform the traditional RLS algorithm when the input signal level is low or during double talk. © 2010 IEEE.
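    To make the idea in this abstract concrete, here is a minimal sketch of an exponentially weighted recursive least M-estimate filter with a weighted L2 (ridge) term on the coefficients. It is illustrative only: the paper's algorithm propagates a QR decomposition recursively for numerical robustness, whereas this sketch re-solves a small ridge system at each step; the function name, the Huber threshold `xi`, and the regularization weight `kappa` are assumptions, not the paper's notation.

```python
import numpy as np

def r_rlm_sketch(x_sig, d_sig, order=8, lam=0.99, kappa=1e-2, xi=1.345):
    """Illustrative regularized recursive least M-estimate filter.

    Exponentially weighted least squares with a Huber-style M-estimate
    weight on each error and a fixed L2 term kappa on the coefficients.
    (A sketch of the abstract's idea, not the paper's QRD recursion.)
    """
    n = len(d_sig)
    w = np.zeros(order)
    R = np.zeros((order, order))   # weighted input autocorrelation estimate
    p = np.zeros(order)            # weighted cross-correlation estimate
    errors = np.empty(n)
    for i in range(n):
        # most recent `order` input samples, newest first, zero-padded
        u = x_sig[max(0, i - order + 1): i + 1][::-1]
        u = np.pad(u, (0, order - len(u)))
        e = d_sig[i] - w @ u                        # a priori error
        q = 1.0 if abs(e) <= xi else xi / abs(e)    # Huber M-estimate weight
        R = lam * R + q * np.outer(u, u)
        p = lam * p + q * d_sig[i] * u
        # weighted L2 regularization: solve (R + kappa I) w = p
        w = np.linalg.solve(R + kappa * np.eye(order), p)
        errors[i] = e
    return w, errors

# toy system-identification run with one impulsive outlier
rng = np.random.default_rng(0)
h = rng.standard_normal(8)
x = rng.standard_normal(2000)
d = np.convolve(x, h)[:2000] + 0.01 * rng.standard_normal(2000)
d[500] += 50.0                                      # impulsive disturbance
w_hat, _ = r_rlm_sketch(x, d, order=8)
print(np.max(np.abs(w_hat - h)))                    # small residual error
```

    The Huber weight caps the influence of the outlier at sample 500, which is the M-estimate part of the abstract; the ridge term illustrates how the weighted L2 regularization lowers estimator variance at the cost of a small bias toward zero.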

    A Stochastic Majorize-Minimize Subspace Algorithm for Online Penalized Least Squares Estimation

    Stochastic approximation techniques play an important role in many problems in machine learning and adaptive signal processing. In these contexts, the statistics of the data are often unknown a priori, or their direct computation is too costly, so they must be estimated online from the observed signals. For batch optimization of an objective function that is the sum of a data-fidelity term and a penalty (e.g. a sparsity-promoting function), Majorize-Minimize (MM) methods have recently attracted much interest, since they are fast, highly flexible, and effective at ensuring convergence. The goal of this paper is to show how these methods extend to the case where the data-fidelity term is a least-squares criterion and the cost function is replaced by a sequence of stochastic approximations of it. In this context, we propose an online version of an MM subspace algorithm and study its convergence using suitable probabilistic tools. Simulation results illustrate the good practical performance of the proposed algorithm, combined with a memory-gradient subspace, on both non-adaptive and adaptive filter identification problems.
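    A minimal sketch of one such online MM subspace step, under assumptions the abstract does not fix: the penalty is taken to be the smooth sparsity-promoting function lam * sqrt(w**2 + delta**2), which is majorized by a quadratic (by concavity of the square root), and each update minimizes that majorant over the two-dimensional memory-gradient subspace spanned by the current gradient and the previous step. All names and parameters below are hypothetical, not the paper's.

```python
import numpy as np

def online_mm_memory_gradient(X, d, lam=0.05, delta=1e-2):
    """Sketch of an online MM memory-gradient subspace algorithm for
    penalized least squares (illustrative, not the paper's exact method).

    The second-order statistics (R, p) are averaged online; each step
    minimizes a quadratic majorant of the penalized criterion over
    span{-gradient, previous step}.
    """
    n, m = X.shape
    w = np.zeros(m)
    R = np.zeros((m, m))       # running estimate of E[x x^T]
    p = np.zeros(m)            # running estimate of E[d x]
    s_prev = None              # previous step (memory direction)
    for k in range(n):
        x, dk = X[k], d[k]
        R += (np.outer(x, x) - R) / (k + 1)
        p += (dk * x - p) / (k + 1)
        # gradient of the stochastic penalized least-squares criterion
        pen_curv = lam / np.sqrt(w**2 + delta**2)
        g = R @ w - p + pen_curv * w
        # subspace directions: gradient, plus previous step if available
        D = np.column_stack([-g] if s_prev is None else [-g, s_prev])
        # quadratic majorant curvature: data term + diagonal penalty majorant
        A = R + np.diag(pen_curv)
        B = D.T @ A @ D
        u = np.linalg.solve(B + 1e-12 * np.eye(B.shape[0]), -D.T @ g)
        s_prev = D @ u          # minimizer of the majorant over the subspace
        w = w + s_prev
    return w

# toy sparse filter identification
rng = np.random.default_rng(1)
w_true = np.zeros(20)
w_true[[2, 7, 15]] = [1.0, -0.5, 2.0]
X = rng.standard_normal((5000, 20))
d = X @ w_true + 0.05 * rng.standard_normal(5000)
print(np.round(online_mm_memory_gradient(X, d), 2))
```

    Because the majorant is quadratic, the subspace minimization reduces to a 2x2 linear solve per sample, which is what makes MM subspace schemes cheap enough to run online.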